Science.gov

Sample records for evaluation software tool

  1. SUSTAINABLE REMEDIATION SOFTWARE TOOL EXERCISE AND EVALUATION

    SciTech Connect

    Kohn, J.; Nichols, R.; Looney, B.

    2011-05-12

    The goal of this study was to examine two different software tools designed to account for the environmental impacts of remediation projects. Three case studies from the Savannah River Site (SRS) near Aiken, SC were used to exercise SiteWise (SW) and Sustainable Remediation Tool (SRT) by including both traditional and novel remediation techniques, contaminants, and contaminated media. This study combined retrospective analysis of implemented projects with prospective analysis of options that were not implemented. Input data were derived from engineering plans, project reports, and planning documents with a few factors supplied from calculations based on Life Cycle Assessment (LCA). Conclusions drawn from software output were generally consistent within a tool; both tools identified the same remediation options as the 'best' for a given site. Magnitudes of impacts varied between the two tools, and it was not always possible to identify the source of the disagreement. The tools differed in their quantitative approaches: SRT based impacts on specific contaminants, media, and site geometry and modeled contaminant removal. SW based impacts on processes and equipment instead of chemical modeling. While SW was able to handle greater variety in remediation scenarios, it did not include a measure of the effectiveness of the scenario.

  2. Evaluation of free non-diagnostic DICOM software tools

    NASA Astrophysics Data System (ADS)

    Liao, Wei; Deserno, Thomas M.; Spitzer, Klaus

    2008-03-01

    A variety of software exists to interpret files or directories compliant with the Digital Imaging and Communications in Medicine (DICOM) standard and display them as individual images or volume-rendered objects. Some of them offer further processing and analysis features. The surveys published so far are partly outdated, and they provide neither a detailed description of software functions nor a comprehensive comparison. This paper aims at evaluating and comparing freely available, non-diagnostic DICOM software with respect to the following aspects: (i) data import; (ii) data export; (iii) header viewing; (iv) 2D image viewing; (v) 3D volume viewing; (vi) support; (vii) portability; (viii) workability; and (ix) usability. In total, 21 tools were included: 3D Slicer, AMIDE, BioImage Suite, DicomWorks, EViewBox, ezDICOM, FPImage, ImageJ, JiveX, Julius, MedImaView, MedINRIA, MicroView, MIPAV, MRIcron, Osiris, PMSDView, Syngo FastView, TomoVision, UniViewer, and XMedCon. Our results in table form can ease the selection of appropriate DICOM software tools. In particular, we discuss use cases for the inexperienced user, data conversion, and volume rendering, and suggest Syngo FastView or PMSDView, DicomWorks or XMedCon, and ImageJ or UniViewer, respectively.
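
    As an aside for readers comparing tools today: the record's criteria (header viewing, 2D image viewing, data export) can also be exercised from a script. Below is a minimal sketch using pydicom, an open-source library that is not among the 21 GUI tools evaluated above; the file name is a hypothetical placeholder.

```python
# Minimal sketch of DICOM header and pixel access, assuming a local file
# "image.dcm" (hypothetical path). Uses pydicom, which is not one of the
# 21 GUI tools evaluated in the paper.
import pydicom

ds = pydicom.dcmread("image.dcm")        # parse the DICOM file
print(ds.PatientName, ds.Modality)       # header viewing: named attributes
print(ds[0x0008, 0x0060].value)          # ...or by (group, element) tag
pixels = ds.pixel_array                  # 2D image viewing: numpy array
print(pixels.shape, pixels.dtype)
```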

  3. Measuring the development process: A tool for software design evaluation

    NASA Technical Reports Server (NTRS)

    Moy, S. S.

    1980-01-01

    The design metrics evaluator (DME), a component of an automated software design analysis system, is described. The DME quantitatively evaluates software design attributes. Its use directs attention to areas of a procedure, module, or complete program having a high potential for error.

  4. Methods and software tools for design evaluation in population pharmacokinetics–pharmacodynamics studies

    PubMed Central

    Nyberg, Joakim; Bazzoli, Caroline; Ogungbenro, Kay; Aliev, Alexander; Leonov, Sergei; Duffull, Stephen; Hooker, Andrew C; Mentré, France

    2015-01-01

    Population pharmacokinetic (PK)–pharmacodynamic (PKPD) models are increasingly used in drug development and in academic research; hence, designing efficient studies is an important task. Following the first theoretical work on optimal design for nonlinear mixed-effects models, this research theme has grown rapidly. There are now several different software tools that implement an evaluation of the Fisher information matrix for population PKPD. We compared and evaluated the following five software tools: PFIM, PkStaMp, PopDes, PopED and POPT. The comparisons were performed using two models, a simple one-compartment warfarin PK model and a more complex PKPD model for pegylated interferon, with data on both concentration and response of viral load of hepatitis C virus. The results of the software were compared in terms of the standard error (SE) values of the parameters predicted from the software and the empirical SE values obtained via replicated clinical trial simulation and estimation. For the warfarin PK model and the pegylated interferon PKPD model, all software gave similar results. Interestingly, it was seen, for all software, that the simpler approximation to the Fisher information matrix, using the block diagonal matrix, provided predicted SE values that were closer to the empirical SE values than when the more complicated approximation was used (the full matrix). For most PKPD models, using any of the available software tools will provide meaningful results, avoiding cumbersome simulation and allowing design optimization. PMID:24548174
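
    To make the abstract's central object concrete, the sketch below predicts parameter standard errors from a Fisher information matrix for a one-compartment IV-bolus PK model with additive residual error. It covers fixed effects only, omitting the random-effects blocks that PFIM, PopED, and the other tools handle; all values and sampling times are illustrative, not from the paper.

```python
# Toy design evaluation via the Fisher information matrix (FIM):
# predict parameter standard errors for a one-compartment IV-bolus model
# C(t) = (dose/V) * exp(-(CL/V) * t) with additive residual error sigma.
# Fixed effects only; mixed-effects blocks are omitted. Values are made up.
import numpy as np

dose, CL, V, sigma = 100.0, 0.134, 8.0, 0.5     # hypothetical values
times = np.array([1.0, 4.0, 8.0, 24.0, 48.0])   # candidate sampling design

def conc(theta, t):
    CL, V = theta
    return (dose / V) * np.exp(-(CL / V) * t)

# Sensitivities of the model to (CL, V) by central finite differences.
theta = np.array([CL, V])
J = np.empty((len(times), 2))
for j in range(2):
    h = 1e-6 * theta[j]
    tp, tm = theta.copy(), theta.copy()
    tp[j] += h
    tm[j] -= h
    J[:, j] = (conc(tp, times) - conc(tm, times)) / (2 * h)

fim = J.T @ J / sigma**2                # FIM for the fixed effects
se = np.sqrt(np.diag(np.linalg.inv(fim)))
print("predicted SE(CL), SE(V):", se)   # smaller SE = more informative design
```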

  6. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  7. Software engineering tools.

    PubMed

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development. PMID:10131419

  8. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has only increased one to two percent a year in the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their own job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for a general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.

  9. Machine Tool Software

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A NASA-developed software package has played a part in the technical education of students who major in Mechanical Engineering Technology at William Rainey Harper College. Professor Hack has been using Automatically Programmed Tool (APT) software since 1969 in his Computer Aided Design and Manufacturing (CAD/CAM) curriculum. Professor Hack teaches the use of APT programming languages for control of metal-cutting machines. Machine tool instructions are geometry definitions written in APT language to constitute a "part program." The part program is processed by the machine tool. CAD/CAM students go from writing a program to cutting steel in the course of a semester.

  10. Focus: Design and Evaluation of a Software Tool for Collecting Reader Feedback.

    ERIC Educational Resources Information Center

    de Jong, Menno; Lentz, Leo

    2001-01-01

    Describes "Focus," a software tool for collecting reader comments more efficiently. Discusses the design and rationale of the software. Notes that results obtained using Focus were compared to the reader feedback collected under the plus-minus method. Concludes that Focus participants appeared to comment more from a reviewer's and less from a…

  11. An evaluation of software tools for the design and development of cockpit displays

    NASA Technical Reports Server (NTRS)

    Ellis, Thomas D., Jr.

    1993-01-01

    The use of all-glass cockpits at the NASA Langley Research Center (LaRC) simulation facility has changed the means of design, development, and maintenance of instrument displays. The human-machine interface has evolved from a physical hardware device to a software-generated electronic display system. This has subsequently caused an increased workload at the facility. As computer processing power increases and the glass cockpit becomes predominant in facilities, software tools used in the design and development of cockpit displays are becoming both feasible and necessary for a more productive simulation environment. This paper defines LaRC requirements of a display software development tool and compares two available applications against these requirements. As a part of the software engineering process, these tools reduce development time, provide a common platform for display development, and produce exceptional real-time results.

  12. [Software CMAP TOOLS™ to build concept maps: an evaluation by nursing students].

    PubMed

    Ferreira, Paula Barreto; Cohrs, Cibelli Rizzo; De Domenico, Edvane Birelo Lopes

    2012-08-01

    Concept mapping (CM) is a teaching strategy that can be used to solve clinical cases, but the maps are difficult to write. The objective of this study was to describe the challenges and contributions of the Cmap Tools® software in building concept maps to solve clinical cases. To do this, a descriptive and qualitative method was used with junior nursing students from the Federal University of São Paulo. The teaching strategy was applied and the data were collected using the focus group technique. The results showed that the software facilitates and guarantees the organization, visualization, and correlation of the data, but there are initial difficulties in handling its tools. In conclusion, the formatting and auto-formatting resources of Cmap Tools® facilitated the construction of concept maps; however, orientation strategies should be implemented for the initial stage of software utilization. PMID:23018409

  13. CSAM Metrology Software Tool

    NASA Technical Reports Server (NTRS)

    Vu, Duc; Sandor, Michael; Agarwal, Shri

    2005-01-01

    CSAM Metrology Software Tool (CMeST) is a computer program for analysis of false-color CSAM images of plastic-encapsulated microcircuits. (CSAM signifies C-mode scanning acoustic microscopy.) The colors in the images indicate areas of delamination within the plastic packages. Heretofore, the images have been interpreted by human examiners. Hence, interpretations have not been entirely consistent and objective. CMeST processes the color information in image-data files to detect areas of delamination without incurring inconsistencies of subjective judgement. CMeST can be used to create a database of baseline images of packages acquired at given times for comparison with images of the same packages acquired at later times. Any area within an image can be selected for analysis, which can include examination of different delamination types by location. CMeST can also be used to perform statistical analyses of image data. Results of analyses are available in a spreadsheet format for further processing. The results can be exported to any data-base-processing software.

  14. The Comprehensive Evaluation of Electronic Learning Tools and Educational Software (CEELTES)

    ERIC Educational Resources Information Center

    Karolcík, Štefan; Cipková, Elena; Hrušecký, Roman; Veselský, Milan

    2015-01-01

    Despite the fact that digital technologies are increasingly used in learning and education, there is still a lack of professional evaluation tools capable of assessing the quality of digital teaching aids in a comprehensive and objective manner. Construction of the Comprehensive Evaluation of Electronic Learning Tools and…

  15. NASA Software Estimating Tool (N-SET)

    NASA Technical Reports Server (NTRS)

    Stukes, Sherry

    2006-01-01

    The goals of this project are to: (1) develop an early lifecycle software cost estimation tool leveraging existing data and capabilities; (2) collect additional software data from a) Jet Propulsion Laboratory, b) Goddard Space Flight Center, and c) Marshall Space Flight Center; (3) analyze, normalize, evaluate, stratify, and validate the data; and (4) create a calibrated, validated, and documented tool initially using available data and subsequently using newly collected data.

  16. Development of a software tool and criteria evaluation for efficient design of small interfering RNA.

    PubMed

    Chaudhary, Aparna; Srivastava, Sonam; Garg, Sanjeev

    2011-01-01

    RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, immunostimulatory motifs were emphasized and implemented in the siRNA design tool. The tool provides thermodynamic stability score, GC content and a total score based on other design criteria in the output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic score and positional properties. Comparable thermodynamic scores and better total scores were observed with the existing tools. Moreover, the results generated had comparable off-target silencing effect. Criteria evaluations with additional criteria were achieved in WEKA. PMID:21145307
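
    To illustrate the kind of scoring the tool implements, here is a toy scorer combining GC content with a small subset of Reynolds-style positional rules. The rule subset, weights, and example sequence are illustrative assumptions, not the authors' implementation.

```python
# Toy siRNA scorer in the spirit of the criteria the paper implements:
# GC content plus a few Reynolds-style positional rules. Rule subset,
# weights, and the example sequence are illustrative, not the authors' tool.
def score_sirna(sense):
    """Score a 19-nt siRNA sense strand (5'->3', RNA alphabet)."""
    s = sense.upper()
    assert len(s) == 19 and set(s) <= set("AUGC")
    score = 0
    gc = (s.count("G") + s.count("C")) / 19
    if 0.30 <= gc <= 0.52:                      # moderate GC content
        score += 1
    score += sum(b in "AU" for b in s[14:19])   # A/U richness at positions 15-19
    if s[18] == "A":                            # A at position 19
        score += 1
    if s[2] == "A":                             # A at position 3
        score += 1
    if s[9] == "U":                             # U at position 10
        score += 1
    if s[18] not in "GC":                       # no G/C at position 19
        score += 1
    return score, gc

print(score_sirna("GAUCAUACGUGCGAUCAGA"))       # hypothetical sequence
```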

  17. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA has a need for science platform vehicles to autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light-time is one significant factor motivating autonomy capability; another factor is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced Verification and Validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian Rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this

  18. Development of a software tool and criteria evaluation for efficient design of small interfering RNA

    SciTech Connect

    Chaudhary, Aparna; Srivastava, Sonam; Garg, Sanjeev

    2011-01-07

    Research highlights: (i) The developed tool predicted siRNA constructs with better thermodynamic stability and total score based on positional and other criteria. (ii) Off-target silencing below score 30 was observed for the best siRNA constructs for different genes. (iii) Immunostimulation and cytotoxicity motifs are considered and penalized in the developed tool. (iv) Both positional and compositional criteria were observed to be important. Abstract: RNA interference can be used as a tool for gene silencing mediated by small interfering RNAs (siRNA). The critical step in effective and specific RNAi processing is the selection of suitable constructs. Major design criteria, i.e., Reynolds's design rules, thermodynamic stability, internal repeats, immunostimulatory motifs, were emphasized and implemented in the siRNA design tool. The tool provides thermodynamic stability score, GC content and a total score based on other design criteria in the output. The viability of the tool was established with different datasets. In general, the siRNA constructs produced by the tool had better thermodynamic score and positional properties. Comparable thermodynamic scores and better total scores were observed with the existing tools. Moreover, the results generated had comparable off-target silencing effect. Criteria evaluations with additional criteria were achieved in WEKA.

  19. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareani, Corina; Venet, Arnaud; Visser, Willem; Washington, Rich

    2003-01-01

    We report on a study to determine the maturity of different verification and validation (V&V) technologies on a representative example of NASA flight software. The study consisted of a controlled experiment where three technologies (static analysis, runtime analysis and model checking) were compared to traditional testing with respect to their ability to find seeded errors in a prototype Mars Rover. What makes this study unique is that it is the first (to the best of our knowledge) to run a controlled experiment comparing formal-methods-based tools to testing on a realistic, industrial-size example, where the emphasis was on collecting as much data as possible on the performance of the tools and the participants. The paper includes a description of the Rover code that was analyzed and the tools used, as well as a detailed description of the experimental setup and the results. Due to the complexity of setting up the experiment, our results cannot be generalized, but we believe the study can still serve as a valuable point of reference for future studies of this kind. It did confirm our belief that advanced tools can outperform testing when trying to locate concurrency errors. Furthermore, the results of the experiment inspired a novel framework for testing the next generation of the Rover.

  20. Biological Imaging Software Tools

    PubMed Central

    Eliceiri, Kevin W.; Berthold, Michael R.; Goldberg, Ilya G.; Ibáñez, Luis; Manjunath, B.S.; Martone, Maryann E.; Murphy, Robert F.; Peng, Hanchuan; Plant, Anne L.; Roysam, Badrinath; Stuurman, Nico; Swedlow, Jason R.; Tomancak, Pavel; Carpenter, Anne E.

    2013-01-01

    Few technologies are more widespread in modern biological laboratories than imaging. Recent advances in optical technologies and instrumentation are providing hitherto unimagined capabilities. Almost all these advances have required the development of software to enable the acquisition, management, analysis, and visualization of the imaging data. We review each computational step that biologists encounter when dealing with digital images, the challenges in that domain, and the overall status of available software for bioimage informatics, focusing on open source options. PMID:22743775

  1. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    SciTech Connect

    Chen, H; Tan, J; Kavanaugh, J; Dolly, S; Gay, H; Thorstad, W; Anastasio, M; Altman, M; Mutic, S; Li, H

    2014-06-15

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces such as the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.Net scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models from critical RT structures in real time with 360° visualization. Principal component analysis (PCA) was used so the system could self-update the geometry variations of normal structures, using physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used to automatically evaluate contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Form Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module in avoiding
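
    A minimal sketch of the supervised PCA idea the abstract describes: fit PCA on physician-approved contours, then flag a new contour whose reconstruction error is large. The data layout (each contour flattened to a fixed-length vector) and the threshold are assumptions for illustration.

```python
# Sketch of supervised PCA-based contour checking: fit PCA on approved
# contours (training set), flag a new contour with a large reconstruction
# error. Data layout and threshold are hypothetical.
import numpy as np

def fit_pca(X, k):
    """X: (n_contours, n_features) matrix of approved contour vectors."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]                  # mean and top-k principal axes

def abnormality(x, mu, W):
    """Reconstruction error of one contour vector under the PCA model."""
    z = W @ (x - mu)                   # project onto principal subspace
    x_hat = mu + W.T @ z               # reconstruct
    return np.linalg.norm(x - x_hat)

rng = np.random.default_rng(0)
train = rng.normal(size=(40, 60))      # 40 approved contours (toy data)
mu, W = fit_pca(train, k=5)
new = rng.normal(size=60)
print("flag" if abnormality(new, mu, W) > 7.0 else "ok")  # illustrative threshold
```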

  2. Software Tools: EPICUR.

    ERIC Educational Resources Information Center

    Abreu, Jose Luis; And Others

    EPICUR (Integrated Programming Environment for the Development of Educational Software) is a set of programming modules ranging from low-level interfaces to high-level algorithms aimed at the development of computer-assisted instruction (CAI) applications. The emphasis is on user-friendly interfaces and on multiplying productivity without loss of…

  3. Modern Tools for Modern Software

    SciTech Connect

    Kumfert, G; Epperly, T

    2001-10-31

    This is a proposal for a new software configure/build tool for building, maintaining, deploying, and installing software. At its completion, this new tool will replace current standard tool suites such as 'autoconf', 'automake', 'libtool', and the de facto standard build tool, 'make'. This ambitious project is born out of the realization that as scientific software has grown in size and complexity over the years, the difficulty of configuring and building software has increased as well. For high performance scientific software, additional complexities often arise from the need for portability to multiple platforms (including many one-of-a-kind platforms), multilanguage implementations, use of third-party libraries, and a need to adapt algorithms to the specific features of the hardware. Development of scientific software is being hampered by the quality of configuration and build tools commonly available. Inordinate amounts of time and expertise are required to develop and maintain the configure and build system for a moderately complex project. Better build and configure tools will increase developer productivity. This proposal is a first step in a process of shoring up the foundation upon which DOE software is created and used.

  4. GUM2DFT—a software tool for uncertainty evaluation of transient signals in the frequency domain

    NASA Astrophysics Data System (ADS)

    Eichstädt, S.; Wilkens, V.

    2016-05-01

    The Fourier transform and its counterpart for discrete time signals, the discrete Fourier transform (DFT), are common tools in measurement science and application. Although almost every scientific software package offers ready-to-use implementations of the DFT, the propagation of uncertainties in line with the guide to the expression of uncertainty in measurement (GUM) is typically neglected. This is of particular importance in dynamic metrology, when input estimation is carried out by deconvolution in the frequency domain. To this end, we present the new open-source software tool GUM2DFT, which utilizes closed formulas for the efficient propagation of uncertainties for the application of the DFT, inverse DFT and input estimation in the frequency domain. It handles different frequency domain representations, accounts for autocorrelation and takes advantage of the symmetry inherent in the DFT result for real-valued time domain signals. All tools are presented in terms of examples which form part of the software package. GUM2DFT will foster GUM-compliant evaluation of uncertainty in a DFT-based analysis and enable metrologists to include uncertainty evaluations in their routine work.
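
    The underlying propagation is linear: writing the stacked real and imaginary parts of the DFT as X = Cx, the input covariance U_x maps to U_X = C U_x C^T. GUM2DFT uses efficient closed formulas; the sketch below forms the O(N^2) matrices explicitly, for illustration only.

```python
# GUM-type uncertainty propagation through the DFT, explicit-matrix form.
# GUM2DFT uses efficient closed formulas instead of these O(N^2) matrices;
# the signal and input uncertainty here are toy values.
import numpy as np

N = 8
x = np.sin(2 * np.pi * np.arange(N) / N)       # toy time-domain signal
Ux = 0.01 * np.eye(N)                          # uncorrelated input uncertainty

n = np.arange(N)
F = np.exp(-2j * np.pi * np.outer(n, n) / N)   # DFT matrix
C = np.vstack([F.real, F.imag])                # Jacobian of (Re X, Im X) w.r.t. x

X_ri = C @ x                                   # stacked Re/Im parts of the DFT
UX = C @ Ux @ C.T                              # propagated covariance (2N x 2N)
print(np.sqrt(np.diag(UX))[:4])                # std. uncertainties of Re X[0..3]
```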

  5. Development and evaluation of an open source software tool for deidentification of pathology reports

    PubMed Central

    Beckwith, Bruce A; Mahaadevan, Rajeshwarri; Balis, Ulysses J; Kuo, Frank

    2006-01-01

    Background: Electronic medical records, including pathology reports, are often used for research purposes. Currently, there are few programs freely available to remove identifiers while leaving the remainder of the pathology report text intact. Our goal was to produce an open source, Health Insurance Portability and Accountability Act (HIPAA) compliant, deidentification tool tailored for pathology reports. We designed a three-step process for removing potential identifiers. The first step is to look for identifiers known to be associated with the patient, such as name, medical record number, pathology accession number, etc. Next, a series of pattern matches look for predictable patterns likely to represent identifying data, such as dates, accession numbers, and addresses, as well as patient, institution, and physician names. Finally, individual words are compared with a database of proper names and geographic locations. Pathology reports from three institutions were used to design and test the algorithms. The software was improved iteratively on training sets until it exhibited good performance. 1800 new pathology reports were then processed. Each report was reviewed manually before and after deidentification to catalog all identifiers and note those that were not removed. Results: 1254 (69.7%) of 1800 pathology reports contained identifiers in the body of the report. 3439 (98.3%) of 3499 unique identifiers in the test set were removed. Only 19 HIPAA-specified identifiers (mainly consult accession numbers and misspelled names) were missed. Of 41 non-HIPAA identifiers missed, the majority were partial institutional addresses and ages. Outside consultation case reports typically contain numerous identifiers and were the most challenging to deidentify comprehensively. There was variation in performance among reports from the three institutions, highlighting the need for site-specific customization, which is easily accomplished with our tool. Conclusion: We have
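
    A minimal sketch of the second step described above, pattern matches for predictable identifier formats; the regexes and replacement tokens are illustrative and far from HIPAA-complete.

```python
# Sketch of regex-based pattern matching for predictable identifiers
# (step two of the paper's three-step process). Patterns and tokens are
# illustrative only, not the published tool.
import re

PATTERNS = [
    (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),       # dates
    (re.compile(r"\b[A-Z]{1,2}\d{2}-\d{4,6}\b"), "[ACCESSION]"),  # accession numbers
    (re.compile(r"\bMRN[:#]?\s*\d{6,10}\b"), "[MRN]"),            # record numbers
    (re.compile(r"\bDr\.\s+[A-Z][a-z]+\b"), "[PHYSICIAN]"),       # titled names
]

def deidentify(text):
    for pattern, token in PATTERNS:
        text = pattern.sub(token, text)
    return text

print(deidentify("Seen by Dr. Smith on 3/14/2005, MRN 12345678, case S05-12345."))
```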

  6. Comparison of a Web-Based Dietary Assessment Tool with Software for the Evaluation of Dietary Records

    PubMed Central

    BENEDIK, Evgen; KOROUŠIĆ SELJAK, Barbara; HRIBAR, Maša; ROGELJ, Irena; BRATANIČ, Borut; OREL, Rok; FIDLER MIS, Nataša

    2015-01-01

    Background Dietary assessment in clinical practice is performed by means of computer support, either in the form of a web-based tool or software. The aim of the paper is to present the results of the comparison of a Slovenian web-based tool with German software for the evaluation of four-day weighted paper-and-pencil-based dietary records (paper-DRs) in pregnant women. Methods A volunteer group of pregnant women (n=63) completed paper-DRs. These records were entered by an experienced research dietitian into a web-based application (Open Platform for Clinical Nutrition, OPEN, http://opkp.si/en, Ljubljana, Slovenia) and software application (Prodi 5.7 Expert plus, Nutri-Science, Stuttgart, Germany, 2011). The results for calculated energy intake, as well as 45 macro- and micronutrient intakes, were statistically compared by using the non-parametric Spearman’s rank correlation coefficient. The cut-off for Spearman’s rho was set at >0.600. Results 12 nutritional parameters (energy, carbohydrates, fat, protein, water, potassium, calcium, phosphorus, dietary fiber, vitamin C, folic acid, and stearic acid) were in high correlation (>0.800), 18 in moderate (0.600–0.799), 11 in weak correlation (0.400–0.599), while 5 (arachidonic acid, niacin, alpha-linolenic acid, fluoride, total sugars) did not show any statistical correlation. Conclusion Comparison of the results of the evaluation of dietary records using a web-based dietary assessment tool with those using software shows that there is a high correlation for energy and macronutrient content.
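
    The agreement analysis is straightforward to reproduce in outline: per nutrient, compute Spearman's rank correlation between the two systems and classify it against the paper's cut-offs. The sketch below uses made-up energy intakes, not the study data.

```python
# Per-nutrient agreement check in the style of the paper: Spearman's rho
# between the two systems, classified against the stated cut-offs.
# The intake values below are made-up placeholders.
from scipy.stats import spearmanr

open_platform = [1870, 1920, 2010, 1750, 2200]   # kcal per subject (OPEN)
prodi         = [1850, 1990, 2050, 1700, 2180]   # kcal per subject (Prodi)

rho, p = spearmanr(open_platform, prodi)
label = ("high" if rho > 0.8 else
         "moderate" if rho >= 0.6 else
         "weak" if rho >= 0.4 else "none")
print(f"rho={rho:.3f} ({label} correlation), p={p:.3f}")
```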

  7. The evaluation of Computed Tomography hard- and software tools for micropaleontologic studies on foraminifera

    NASA Astrophysics Data System (ADS)

    van Loo, D.; Speijer, R.; Masschaele, B.; Dierick, M.; Cnudde, V.; Boone, M.; de Witte, Y.; Dewanckele, J.; van Hoorebeke, L.; Jacobs, P.

    2009-04-01

    Foraminifera (forams) are single-celled amoeba-like organisms in the sea which build a tiny calcareous multi-chambered shell for protection. Their enormous abundance, great variation of shape through time, and presence in all marine deposits have made these tiny microfossils the oil companies' best friend by facilitating the detection of new oil wells. Besides their success in the oil and gas industry, forams are also a most powerful tool for reconstructing past climate change. The shell of a foraminifer is a tiny gold mine of information, both geometrical and chemical. However, until recently the best information on this architecture was obtained only by imaging the outside of a shell with Scanning Electron Microscopy (SEM), giving no clues to internal structures other than single snapshots obtained by breaking a specimen apart. With X-ray computed tomography (CT) it is possible to overcome this problem and uncover a huge amount of geometrical information without destroying the samples. Using the latest generation of micro-CTs, called nano-CTs because of their sub-micron resolution, it is now possible to perform adequate imaging even on these tiny samples without needing huge facilities. In this research, a comparison is made between different X-ray sources and X-ray detectors and the resulting image resolution. Sharpness, noise, and contrast are all important parameters that have significant effects on the accuracy of the results and on the speed of data processing. Combining this tomography technique with specific image-processing software, in a step called segmentation, it is possible to obtain a 3D virtual representation of the entire foram shell. This 3D virtual object can then be used for many purposes, of which automatic measurement of chamber size is one of the most important. The segmentation process is a combination of several algorithms that are often used in CT evaluation; in this work an evaluation of those algorithms is

  9. Tools for quantitative form description; an evaluation of different software packages for semi-landmark analysis.

    PubMed

    Botton-Divet, Léo; Houssaye, Alexandra; Herrel, Anthony; Fabre, Anne-Claire; Cornette, Raphael

    2015-01-01

    The challenging complexity of biological structures has led to the development of several methods for quantitative analyses of form. Bones are shaped by the interaction of historical (phylogenetic), structural, and functional constraints. Consequently, bone shape has been investigated intensively in an evolutionary context. Geometric morphometric approaches allow the description of the shape of an object in all of its biological complexity. However, when biological objects present only a few anatomical landmarks, sliding semi-landmarks may provide good descriptors of shape. The sliding procedure, mandatory for sliding semi-landmarks, requires several steps that may be time-consuming. We here compare the time required by two different software packages ('Edgewarp' and 'Morpho') for the same sliding task, and investigate potential differences in the results and biological interpretation. 'Morpho' is much faster than 'Edgewarp,' notably as a result of the greater computational power of the 'Morpho' software routines and the complexity of the 'Edgewarp' workflow. Morphospaces obtained using both software packages are similar and provide a consistent description of the biological variability. The principal differences between the two software packages are observed in areas characterized by abrupt changes in the bone topography. In summary, both software packages perform equally well in terms of the description of biological structures, yet differ in the simplicity of the workflow and time needed to perform the analyses. PMID:26618086

  10. Fermilab Software Tools Program: Fermitools

    SciTech Connect

    Pordes, R.

    1995-10-01

    The Fermilab Software Tools Program (Fermitools) was established in 1994 as an initiative under which Fermilab provides software it has developed to outside collaborators. During the year and a half since its start, ten software products have been packaged and made available on the official Fermilab anonymous ftp site, and backup support and information services have been made available for them. During the past decade, institutions outside the Fermilab physics experiment user community have in general only been able to obtain and use Fermilab-developed software on an ad hoc or informal basis. With the Fermitools program the Fermilab Computing Division has instituted an umbrella under which software that is regarded by its internal user community as useful and of high quality can be provided to users outside of High Energy Physics experiments. The main thrust of the Fermitools program is stimulating collaborative use and further development of the software. Establishing minimal umbrella bureaucracy makes collaborative development and support easier. The published caveat given to people who take the software includes the statement 'Provision of the software implies no commitment of support by Fermilab. The Fermilab Computing Division is open to discussing other levels of support for use of the software with responsible and committed users and collaborators.' There have been no negative comments in response to this, and the policy has not given rise to any questions or complaints. In this paper we present the goals and strategy of the program and introduce some of the software made available through it. We discuss our experiences to date and mention the perceived benefits of the program.

  11. Evaluating Difficulty Levels of Dynamic Geometry Software Tools to Enhance Teachers' Professional Development

    ERIC Educational Resources Information Center

    Hohenwarter, Judith; Hohenwarter, Markus; Lavicza, Zsolt

    2010-01-01

    This paper describes a study aimed to identify commonly emerging impediments related to the introduction of dynamic mathematics software. We report on the analysis of data collected during a three-week professional development programme organised for middle and high school teachers in Florida. The study identified challenges that participants face…

  12. Data and software tools for gamma radiation spectral threat detection and nuclide identification algorithm development and evaluation

    NASA Astrophysics Data System (ADS)

    Portnoy, David; Fisher, Brian; Phifer, Daniel

    2015-06-01

    The detection of radiological and nuclear threats is extremely important to national security. The federal government is spending significant resources developing new detection systems and attempting to increase the performance of existing ones. The detection of illicit radionuclides that may pose a radiological or nuclear threat is a challenging problem complicated by benign radiation sources (e.g., cat litter and medical treatments), shielding, and large variations in background radiation. Although there is a growing acceptance within the community that concentrating efforts on algorithm development (independent of the specifics of fully assembled systems) has the potential for significant overall system performance gains, there are two major hindrances to advancements in gamma spectral analysis algorithms under the current paradigm: access to data and common performance metrics along with baseline performance measures. Because many of the signatures collected during performance measurement campaigns are classified, dissemination to algorithm developers is extremely limited. This leaves developers no choice but to collect their own data if they are lucky enough to have access to material and sensors. This is often combined with their own definition of metrics for measuring performance. These two conditions make it all but impossible for developers and external reviewers to make meaningful comparisons between algorithms. Without meaningful comparisons, performance advancements become very hard to achieve and (more importantly) recognize. The objective of this work is to overcome these obstacles by developing and freely distributing real and synthetically generated gamma-spectra data sets as well as software tools for performance evaluation with associated performance baselines to national labs, academic institutions, government agencies, and industry. At present, datasets for two tracks, or application domains, have been developed: one that includes temporal
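
    A sketch of what such synthetic benchmark data can look like: a smooth background continuum plus a Gaussian photopeak, Poisson-sampled, with a naive excess-counts detection metric. Energies, rates, and the detector response are simplified placeholders, not the distributed datasets.

```python
# Toy synthetic gamma-spectrum generation for algorithm benchmarking:
# background continuum plus a source line, Poisson-sampled. All rates,
# channel positions, and the response model are simplified placeholders.
import numpy as np

rng = np.random.default_rng(42)
channels = np.arange(1024)

background = 50.0 * np.exp(-channels / 300.0)                 # smooth continuum
peak = 40.0 * np.exp(-0.5 * ((channels - 662) / 8) ** 2)      # "Cs-137-like" line

expected = background + peak
spectrum = rng.poisson(expected)                              # one measured spectrum

# Trivial detection metric: excess counts in a window around the peak.
window = slice(662 - 24, 662 + 24)
excess = spectrum[window].sum() - background[window].sum()
sigma = np.sqrt(background[window].sum())
print(f"significance ~ {excess / sigma:.1f} sigma")
```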

  13. Component Modeling Approach Software Tool

    Energy Science and Technology Software Center (ESTSC)

    2010-08-23

    The Component Modeling Approach Software Tool (CMAST) establishes a set of performance libraries of approved components (frames, glass, and spacers) which can be accessed for configuring fenestration products for a project, and obtaining a U-factor, Solar Heat Gain Coefficient (SHGC), and Visible Transmittance (VT) rating for those products, which can then be reflected in a CMA Label Certificate for code compliance. CMAST is web-based as well as client-based. The completed CMA program and software tool will be useful in several ways for a vast array of stakeholders in the industry: generating performance ratings for bidding projects; ascertaining credible and accurate performance data; and obtaining third-party certification of overall product performance for code compliance.
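
    The component-based idea can be illustrated with the standard area-weighted combination of frame, edge-of-glass, and center-of-glass performance used in whole-product ratings; this simplified sketch is not the CMAST engine, and all numbers are hypothetical.

```python
# Area-weighted combination of component performance, the basic idea
# behind component-based fenestration ratings. Simplified; not the CMAST
# engine, and all values are hypothetical.
def weighted(values_and_areas):
    """Area-weighted average over (value, area) pairs."""
    total_area = sum(a for _, a in values_and_areas)
    return sum(v * a for v, a in values_and_areas) / total_area

# (value, area in m^2) for frame, edge-of-glass, center-of-glass
u_factor = weighted([(2.00, 0.30), (1.90, 0.15), (1.30, 1.05)])
shgc     = weighted([(0.05, 0.30), (0.40, 0.15), (0.45, 1.05)])
print(f"U = {u_factor:.2f} W/m2K, SHGC = {shgc:.2f}")
```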

  14. Commercial Expert-System-Building Software Tools

    NASA Technical Reports Server (NTRS)

    Gevarter, William B.

    1989-01-01

    Report evaluates commercially available expert-system-building tools in terms of structures, representations of knowledge, inference mechanisms, interfaces with developers and end users, and capabilities of performing such functions as diagnosis and design. The software tools are commercialized derivatives of artificial-intelligence systems developed by researchers at universities and research organizations. They reduce the time to develop an expert system by an order of magnitude compared with such traditional artificial-intelligence development languages as LISP. A table lists 20 such tools, rating their attributes as strong, fair, programmable by user, or having no capability against various criteria.

  15. Sandia software guidelines: Volume 5, Tools, techniques, and methodologies

    SciTech Connect

    Not Available

    1989-07-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. This volume describes software tools and methodologies available to Sandia personnel for the development of software, and outlines techniques that have proven useful within the Laboratories and elsewhere. References and evaluations by Sandia personnel are included. 6 figs.

  16. Evaluation of Visualization Software

    NASA Technical Reports Server (NTRS)

    Globus, Al; Uselton, Sam

    1995-01-01

    Visualization software is widely used in scientific and engineering research. But computed visualizations can be very misleading, and the errors are easy to miss. We feel that the software producing the visualizations must be thoroughly evaluated and the evaluation process as well as the results must be made available. Testing and evaluation of visualization software is not a trivial problem. Several methods used in testing other software are helpful, but these methods are (apparently) often not used. When they are used, the description and results are generally not available to the end user. Additional evaluation methods specific to visualization must also be developed. We present several useful approaches to evaluation, ranging from numerical analysis of mathematical portions of algorithms to measurement of human performance while using visualization systems. Along with this brief survey, we present arguments for the importance of evaluations and discussions of appropriate use of some methods.

  17. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  18. Software Tools Streamline Project Management

    NASA Technical Reports Server (NTRS)

    2009-01-01

    Three innovative software inventions from Ames Research Center (NETMARK, Program Management Tool, and Query-Based Document Management) are finding their way into NASA missions as well as industry applications. The first, NETMARK, is a program that enables integrated searching of data stored in a variety of databases and documents, meaning that users no longer have to look in several places for related information. NETMARK allows users to search and query information across all of these sources in one step. This cross-cutting capability in information analysis has exponentially reduced the amount of time needed to mine data from days or weeks to mere seconds. NETMARK has been used widely throughout NASA, enabling this automatic integration of information across many documents and databases. NASA projects that use NETMARK include the internal reporting system and project performance dashboard; Erasmus, NASA's enterprise management tool, which enhances organizational collaboration and information sharing through document routing and review; the Integrated Financial Management Program; International Space Station Knowledge Management; Mishap and Anomaly Information Reporting System; and management of the Mars Exploration Rovers. Approximately $1 billion worth of NASA's projects are currently managed using Program Management Tool (PMT), which is based on NETMARK. PMT is a comprehensive, Web-enabled application tool used to assist program and project managers within NASA enterprises in monitoring, disseminating, and tracking the progress of program and project milestones and other relevant resources. The PMT consists of an integrated knowledge repository built upon advanced enterprise-wide database integration techniques and the latest Web-enabled technologies. The current system is in a pilot operational mode allowing users to automatically manage, track, define, update, and view customizable milestone objectives and goals. The third software invention is Query-Based Document Management.

  19. Evaluation of three methods for retrospective correction of vignetting on medical microscopy images utilizing two open source software tools.

    PubMed

    Babaloukas, Georgios; Tentolouris, Nicholas; Liatis, Stavros; Sklavounou, Alexandra; Perrea, Despoina

    2011-12-01

    Correction of vignetting on images obtained by a digital camera mounted on a microscope is essential before applying image analysis. The aim of this study is to evaluate three methods for retrospective correction of vignetting on medical microscopy images and compare them with a prospective correction method. One digital image from each of four different tissues was used, and a vignetting effect was applied to each of these images. The resulting vignetted image was replicated four times, and in each replica a different method for vignetting correction was applied with the Fiji and GIMP software tools. The highest peak signal-to-noise ratio from the comparison of each method to the original image was obtained from the prospective method in all tissues. The morphological filtering method provided the highest peak signal-to-noise ratio value amongst the retrospective methods. The prospective method is suggested as the method of choice for correction of vignetting; if it is not applicable, then morphological filtering may be suggested as the retrospective alternative. PMID:21950542
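
    A sketch of the retrospective morphological-filtering approach the study favors: estimate the slowly varying illumination field with a large grey-scale morphological filter, divide it out, and compare by peak signal-to-noise ratio (PSNR). The kernel size and synthetic test image are assumptions for illustration.

```python
# Retrospective vignetting correction via morphological filtering:
# estimate the illumination field with a large grey opening, divide it out,
# and score the result with PSNR. Kernel size and test image are illustrative.
import numpy as np
from scipy import ndimage

def correct_vignetting(img, size=101):
    illumination = ndimage.grey_opening(img, size=(size, size))
    illumination = ndimage.uniform_filter(illumination, size)  # smooth estimate
    flat = img / np.maximum(illumination, 1e-6)
    return flat * illumination.mean()           # restore the intensity scale

def psnr(reference, test):
    mse = np.mean((reference.astype(float) - test.astype(float)) ** 2)
    return 10 * np.log10(reference.max() ** 2 / mse)

rng = np.random.default_rng(1)
truth = rng.uniform(100, 200, (512, 512))        # synthetic "tissue" image
yy, xx = np.mgrid[-1:1:512j, -1:1:512j]
vignetted = truth * (1 - 0.4 * (xx**2 + yy**2))  # radial intensity fall-off
print(f"PSNR after correction: {psnr(truth, correct_vignetting(vignetted)):.1f} dB")
```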

  20. Evaluating Instructional Software.

    ERIC Educational Resources Information Center

    Hoffman, Joseph L.; Lyons, David J.

    1997-01-01

    Presents an evaluation instrument for evaluating instructional multimedia programs. Highlights include the need to evaluate; an illustrated example; and instructions for filling out the instrument that includes compatibility for hardware and software; instructional design issues, including content and audience definition; and interface, including…

  1. Software tools for optical interferometry

    NASA Astrophysics Data System (ADS)

    Thureau, Nathalie D.; Ireland, Michael; Monnier, John D.; Pedretti, Ettore

    2006-06-01

    We describe a set of general-purpose utilities for visualizing and manipulating optical interferometry data stored in the FITS-based OIFITS data format. This class of routines contains code like the OiPlot navigation/visualization tool, which allows the user to extract visibility, closure phase, and UV-coverage information from OIFITS files and to display the information in various ways. OiPlot also has basic data model fitting capabilities which can be used for a rapid first analysis of the scientific data. More advanced image reconstruction techniques are part of a dedicated utility. In addition, these routines allow data from multiple interferometers to be combined and used together. Part of our work also aims at developing software specific to the Michigan InfraRed Combiner (MIRC). Our experience designing a flexible and robust graphical user interface based on sockets and Python libraries has wide applicability, and this paper will discuss practicalities.
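
    Since OIFITS files are ordinary FITS binary tables, the kind of extraction OiPlot performs can be sketched with astropy; the file name is a placeholder, while OI_VIS2 and its columns (VIS2DATA, VIS2ERR, UCOORD, VCOORD) come from the OIFITS standard.

```python
# Sketch of OIFITS data extraction with astropy. The file name is a
# hypothetical placeholder; the OI_VIS2 table and its column names are
# defined by the OIFITS standard.
from astropy.io import fits
import numpy as np

with fits.open("observation.oifits") as hdul:
    vis2 = hdul["OI_VIS2"].data
    v2, v2err = vis2["VIS2DATA"], vis2["VIS2ERR"]
    u, v = vis2["UCOORD"], vis2["VCOORD"]        # projected baselines (m)

baseline = np.hypot(u, v)
for b, s, e in zip(baseline, v2, v2err):
    print(f"B={b:7.2f} m  V^2={s}  +/- {e}")
```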

  2. AUTOSIM: An automated repetitive software testing tool

    NASA Technical Reports Server (NTRS)

    Dunham, J. R.; Mcbride, S. E.

    1985-01-01

    AUTOSIM is a software tool which automates repetitive run testing of software. This tool executes programming tasks previously performed by a programmer with one year of programming experience. Use of the AUTOSIM tool requires a knowledge base containing information about known faults, code fixes, and the fault diagnosis-correction process. AUTOSIM can be considered an expert system which replaces a low level of programming expertise. Reference information about the design and implementation of the AUTOSIM software test tool includes flowcharts to assist in maintaining the software code and a description of how to use the tool.

  3. Evaluation of high-performance computing software

    SciTech Connect

    Browne, S.; Dongarra, J.; Rowan, T.

    1996-12-31

    The absence of unbiased and up-to-date comparative evaluations of high-performance computing software complicates a user's search for the appropriate software package. The National HPCC Software Exchange (NHSE) is attacking this problem using an approach that includes independent evaluations of software, incorporation of author and user feedback into the evaluations, and Web access to the evaluations. We are applying this approach to the Parallel Tools Library (PTLIB), a new software repository for parallel systems software and tools, and HPC-Netlib, a high-performance branch of the Netlib mathematical software repository. Updating the evaluations with feedback and making them available via the Web helps ensure accuracy and timeliness, and using independent reviewers produces unbiased comparative evaluations difficult to find elsewhere.

  4. A software technology evaluation program

    NASA Technical Reports Server (NTRS)

    Novaes-Card, David N.

    1985-01-01

    A set of quantitative approaches is presented for evaluating software development methods and tools. The basic idea is to generate a set of goals which are refined into quantifiable questions which specify metrics to be collected on the software development and maintenance process and product. These metrics can be used to characterize, evaluate, predict, and motivate. They can be used in an active as well as a passive way: by learning from analyzing the data and improving the methods and tools based upon what is learned from that analysis. Several examples were given representing each of the different approaches to evaluation. The cost of the approaches varied inversely with the level of confidence in the interpretation of the results.

  5. Intelligent Software Tools for Advanced Computing

    SciTech Connect

    Baumgart, C.W.

    2001-04-03

    Feature extraction and evaluation are two procedures common to the development of any pattern recognition application. These features are the primary pieces of information which are used to train the pattern recognition tool, whether that tool is a neural network, a fuzzy logic rulebase, or a genetic algorithm. Careful selection of the features to be used by the pattern recognition tool can significantly streamline the overall development and training of the solution for the pattern recognition application. This report summarizes the development of an integrated, computer-based software package called the Feature Extraction Toolbox (FET), which can be used for the development and deployment of solutions to generic pattern recognition problems. This toolbox integrates a number of software techniques for signal processing, feature extraction and evaluation, and pattern recognition, all under a single, user-friendly development environment. The toolbox has been developed to run on a laptop computer, so that it may be taken to a site and used to develop pattern recognition applications in the field. A prototype version of this toolbox has been completed and is currently being used for applications development on several projects in support of the Department of Energy.

  6. Software engineering environment tool set integration

    NASA Technical Reports Server (NTRS)

    Selfridge, William P.

    1986-01-01

    Space Transportation System Division (STSD) Engineering has a program to promote excellence within the engineering function. This program resulted in a capital-funded facility based on a VAX cluster called the Rockwell Operational Engineering System (ROSES). The second phase of a three-phase plan to establish an integrated software engineering environment for ROSES is examined. Phase one, discussed briefly, establishes the basic capability for a modern software development environment, including a tool set, training, and standards. Phase two is tool set integration. The tool set consists primarily of off-the-shelf tools acquired through vendors or government agencies (public domain). These tools were placed into categories of software development: requirements, design, and construction support; verification and validation support; and software management support. The integration of the tool set is being performed through concept prototyping and development of tools specifically designed to support the life cycle and provide transition from one phase to the next.

  7. Modeling and MBL: Software Tools for Science.

    ERIC Educational Resources Information Center

    Tinker, Robert F.

    Recent technological advances and new software packages put unprecedented power for experimenting and theory-building in the hands of students at all levels. Microcomputer-based laboratory (MBL) and model-solving tools illustrate the educational potential of the technology. These tools include modeling software and three MBL packages (which are…

  8. Software management tools: Lessons learned from use

    NASA Technical Reports Server (NTRS)

    Reifer, D. J.; Valett, J.; Knight, J.; Wenneson, G.

    1985-01-01

    Experience in inserting software project planning tools into more than 100 projects producing mission-critical software is discussed. The problems the software project manager faces are listed, along with methods and tools available to handle them. Experience is reported with the Project Manager's Workstation (PMW) and the SoftCost-R cost estimating package. Finally, the results of a survey, which looked at what could be done in the future to overcome the problems experienced and build a set of truly useful tools, are presented.

  9. Software tool for portal dosimetry research.

    PubMed

    Vial, P; Hunt, P; Greer, P B; Oliver, L; Baldock, C

    2008-09-01

    This paper describes a software tool developed for research into the use of an electronic portal imaging device (EPID) to verify dose for intensity modulated radiation therapy (IMRT) beams. A portal dose image prediction (PDIP) model that predicts the EPID response to IMRT beams has been implemented into a commercially available treatment planning system (TPS). The software tool described in this work was developed to modify the TPS PDIP model by incorporating correction factors into the predicted EPID image to account for the difference in EPID response to open beam radiation and multileaf collimator (MLC) transmitted radiation. The processes performed by the software tool include: i) read the MLC file and the PDIP from the TPS; ii) calculate the fraction of beam-on time that each point in the IMRT beam is shielded by MLC leaves; iii) interpolate correction factors from look-up tables; iv) create a corrected PDIP image from the product of the original PDIP and the correction factors and write the corrected image to file; v) display, analyse, and export various image datasets. The software tool was developed using the Microsoft Visual Studio.NET framework with the C# compiler. The operation of the software tool was validated. This software provided useful tools for EPID dosimetry research, and it is being utilised and further developed in ongoing EPID dosimetry and IMRT dosimetry projects. PMID:18946980
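
    As a rough illustration of steps ii) to iv), the following minimal Python/NumPy sketch interpolates correction factors from a look-up table as a function of the MLC-shielded beam-on fraction and applies them to a small predicted image; all values are invented placeholders, not the published tool's data.

```python
# Sketch of the correction described in the abstract (steps iii-iv):
# interpolate a correction factor from a look-up table as a function of
# the MLC-shielded beam-on fraction, then scale the predicted image.
import numpy as np

shielded_fraction = np.array([[0.0, 0.2], [0.5, 1.0]])  # step ii output per pixel
pdip = np.array([[1.00, 0.95], [0.80, 0.60]])           # predicted EPID image from the TPS

# Hypothetical look-up table: correction factor vs. shielded fraction
lut_fraction = np.array([0.0, 0.5, 1.0])
lut_correction = np.array([1.00, 0.97, 0.92])

correction = np.interp(shielded_fraction, lut_fraction, lut_correction)  # step iii
corrected_pdip = pdip * correction                                       # step iv
print(corrected_pdip)
```

    The published tool performs these steps on full EPID images and MLC delivery files; the sketch only mirrors the interpolate-and-multiply structure.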

  10. Parallel software tools at Langley Research Center

    NASA Technical Reports Server (NTRS)

    Moitra, Stuti; Tennille, Geoffrey M.; Lakeotes, Christopher D.; Randall, Donald P.; Arthur, Jarvis J.; Hammond, Dana P.; Mall, Gerald H.

    1993-01-01

    This document gives a brief overview of parallel software tools available on the Intel iPSC/860 parallel computer at Langley Research Center. It is intended to provide a source of information that is somewhat more concise than vendor-supplied material on the purpose and use of various tools. Each of the chapters on tools is organized in a similar manner covering an overview of the functionality, access information, how to effectively use the tool, observations about the tool and how it compares to similar software, known problems or shortfalls with the software, and reference documentation. It is primarily intended for users of the iPSC/860 at Langley Research Center and is appropriate for both the experienced and novice user.

  11. NASA PC software evaluation project

    NASA Technical Reports Server (NTRS)

    Dominick, Wayne D. (Editor); Kuan, Julie C.

    1986-01-01

    The USL NASA PC software evaluation project is intended to provide a structured framework for facilitating the development of quality NASA PC software products. The project will assist NASA PC development staff to understand the characteristics and functions of NASA PC software products. Based on the results of the project teams' evaluations and recommendations, users can judge the reliability, usability, acceptability, maintainability and customizability of all the PC software products. The objective here is to provide initial, high-level specifications and guidelines for NASA PC software evaluation. The primary tasks to be addressed in this project are as follows: to gain a strong understanding of what software evaluation entails and how to organize a structured software evaluation process; to define a structured methodology for conducting the software evaluation process; to develop a set of PC software evaluation criteria and evaluation rating scales; and to conduct PC software evaluations in accordance with the identified methodology. The software categories addressed include Communication Packages, Network System Software, Graphics Support Software, Environment Management Software, and General Utilities. This report represents one of the 72 attachment reports to the University of Southwestern Louisiana's Final Report on NASA Grant NGT-19-010-900. Accordingly, appropriate care should be taken in using this report out of context of the full Final Report.

  12. Tool support for software lookup table optimization

    PubMed Central

    Strout, Michelle Mills; Bieman, James M.

    2012-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0 × and 6.9 × for two molecular biology algorithms, 1.4 × for a molecular dynamics program, 2.1 × to 2.8 × for a neural network application, and 4.6 × for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches. PMID:24532963
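
    As a rough illustration of the lookup-table idea that Mesa automates, the following minimal Python sketch precomputes a costly elementary function over a profiled domain and reuses the stored values, with a simple error check; the domain, table size, and function are assumptions for illustration, not Mesa's output.

```python
# Minimal illustration of lookup-table (LUT) optimization for a costly
# elementary function: precompute values over a profiled domain, then
# reuse them via nearest-entry lookup. Not Mesa itself, just the idea.
import math

LO, HI, N = 0.0, math.pi, 1024                       # domain assumed found by profiling
STEP = (HI - LO) / (N - 1)
TABLE = [math.sin(LO + i * STEP) for i in range(N)]  # one-time precomputation cost

def sin_lut(x: float) -> float:
    """Approximate sin(x) for x in [LO, HI] with one table lookup."""
    i = int((x - LO) / STEP + 0.5)
    return TABLE[i]

# Error analysis: maximum error observed on a dense sample of the domain
err = max(abs(sin_lut(LO + k * 1e-4) - math.sin(LO + k * 1e-4))
          for k in range(int((HI - LO) / 1e-4)))
print(f"max abs error ~ {err:.2e}")
```

    Growing or shrinking N trades memory and accuracy against speed, which is exactly the performance/accuracy tradeoff the paper says is hard to control by hand.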

  13. Tool support for software lookup table optimization.

    PubMed

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M

    2011-12-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0 × and 6.9 × for two molecular biology algorithms, 1.4 × for a molecular dynamics program, 2.1 × to 2.8 × for a neural network application, and 4.6 × for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches. PMID:24532963

  14. Tool Support for Software Lookup Table Optimization

    DOE PAGES Beta

    Wilcox, Chris; Strout, Michelle Mills; Bieman, James M.

    2011-01-01

    A number of scientific applications are performance-limited by expressions that repeatedly call costly elementary functions. Lookup table (LUT) optimization accelerates the evaluation of such functions by reusing previously computed results. LUT methods can speed up applications that tolerate an approximation of function results, thereby achieving a high level of fuzzy reuse. One problem with LUT optimization is the difficulty of controlling the tradeoff between performance and accuracy. The current practice of manual LUT optimization adds programming effort by requiring extensive experimentation to make this tradeoff, and such hand tuning can obfuscate algorithms. In this paper we describe a methodology and tool implementation to improve the application of software LUT optimization. Our Mesa tool implements source-to-source transformations for C or C++ code to automate the tedious and error-prone aspects of LUT generation such as domain profiling, error analysis, and code generation. We evaluate Mesa with five scientific applications. Our results show a performance improvement of 3.0× and 6.9× for two molecular biology algorithms, 1.4× for a molecular dynamics program, 2.1× to 2.8× for a neural network application, and 4.6× for a hydrology calculation. We find that Mesa enables LUT optimization with more control over accuracy and less effort than manual approaches.

  15. Evaluation Software in Counseling.

    ERIC Educational Resources Information Center

    Sabella, Russell A.

    Counselors today are presented with a number of different software applications. This article intends to advance the counselor's knowledge and considerations of the various aspects of application software. Included is a discussion of the software applications typically of help to counselors in (a) managing their work (computer managed counseling);…

  16. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  17. Evaluating Digital Authoring Tools

    ERIC Educational Resources Information Center

    Wilde, Russ

    2004-01-01

    As the quality of authoring software increases, online course developers become less reliant on proprietary learning management systems, and develop skills in the design of original, in-house materials and the delivery platforms for them. This report examines the capabilities of digital authoring software tools for the development of learning…

  18. Verifying nuclear fuel assemblies in wet storages on a partial defect level: A software simulation tool for evaluating the capabilities of the Digital Cherenkov Viewing Device

    NASA Astrophysics Data System (ADS)

    Grape, Sophie; Jacobsson Svärd, Staffan; Lindberg, Bo

    2013-01-01

    The Digital Cherenkov Viewing Device (DCVD) is an instrument that records the Cherenkov light emitted from irradiated nuclear fuels in wet storages. The presence, intensity and pattern of the Cherenkov light can be used by International Atomic Energy Agency (IAEA) inspectors to verify that the fuel properties comply with declarations. The DCVD has for several years been approved by the IAEA for gross defect verification, i.e. to control whether an item in a storage pool is a nuclear fuel assembly or a non-fuel item [1]. Recently, it has also been endorsed as a tool for partial defect verification, i.e. to identify whether a fraction of the fuel rods in an assembly has been removed or replaced. The latter recognition was based on investigations of experimental studies on authentic fuel assemblies and of simulation studies on hypothetical cases of partial defects [2]. This paper describes the simulation methodology and software used in the partial defect capability evaluations. The developed simulation procedure uses three stand-alone software packages: the ORIGEN-ARP code [3], used to obtain the gamma-ray spectrum from the fission products in the fuel; the Monte Carlo toolkit Geant4 [4], for simulating the gamma-ray transport in and around the fuel and the emission of Cherenkov light; and the ray-tracing programme Zemax [5], used to model the light transport through the assembly geometry to the DCVD and to mimic the behaviour of its lens system. Furthermore, the software allows detailed information from the plant operator on power and/or burnup distributions to be taken into account to enhance the authenticity of the simulated images. To demonstrate the results of the combined software packages, simulated and measured DCVD images are presented. A short discussion on the usefulness of the simulation tool is also included.

  19. Evaluating Student Records Management Software.

    ERIC Educational Resources Information Center

    Vecchioli, Lisa

    This book establishes a framework that can be used to evaluate software for tracking and analyzing student records. First, it examines the characteristics the user should look for in a student-records management software package. Users should be aware of the dangers and costs of replacing a system or integrating new software into an existing…

  20. Software development tools: A bibliography, appendix C.

    NASA Technical Reports Server (NTRS)

    Riddle, W. E.

    1980-01-01

    A bibliography is given containing approximately 200 citations on tools which help software developers perform some development task (such as text manipulation or testing) and which would not necessarily be found as part of a computing facility. The bibliography comes from a relatively random sampling of the literature and is not complete, but it is indicative of the nature and range of tools currently being prepared or currently available.

  1. Software Tools for Empowering Instructional Developers.

    ERIC Educational Resources Information Center

    Gayeski, Diane M.

    1991-01-01

    Software systems are being created to assist both novice and expert instructional technologists in response to perceived need of organizations to increase their training. Underlying philosophies and goals of instructional developer automation tools and their potential effects upon the organizations who adopt them must be examined so they will help…

  2. Treatment Deployment Evaluation Tool

    SciTech Connect

    M. A. Rynearson; M. M. Plum

    1999-08-01

    The U.S. Department of Energy (DOE) is responsible for the final disposition of legacy spent nuclear fuel (SNF). As a response, DOE's National Spent Nuclear Fuel Program (NSNFP) has been given the responsibility for the disposition of DOE-owned SNF. Many treatment technologies have been identified to treat some forms of SNF so that the resulting treated product is acceptable to the disposition site. One of these promising treatment processes is the electrometallurgical treatment (EMT) currently in development; a second is an Acid Wash Decladding process. The NSNFP has been tasked with identifying possible strategies for the deployment of these treatment processes in the event that a treatment path is deemed necessary. To support the siting studies of these strategies, economic evaluations are being performed to identify the least-cost deployment path. This model (tool) was developed to consider the full scope of costs, technical feasibility, process material disposition, and schedule attributes over the life of each deployment alternative. Using standard personal computer (PC) software, the model was developed as a comprehensive technology economic assessment tool using a Life-Cycle Cost (LCC) analysis methodology. Model development was planned as a systematic, iterative process of identifying and bounding the required activities to dispose of SNF. To support the evaluation process, activities are decomposed into lower-level, easier-to-estimate activities. Sensitivity studies can then be performed on these activities, defining cost issues and testing results against the originally stated problem.
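
    As a rough illustration of the life-cycle cost methodology described above, the following minimal Python sketch discounts the cost of a few decomposed activities and varies one of them as a simple sensitivity study; the discount rate, activities, and figures are all invented for illustration.

```python
# Hypothetical life-cycle cost (LCC) sketch: decompose a deployment
# alternative into lower-level activities, discount each activity's
# cost to present value, and vary one activity as a sensitivity study.
RATE = 0.05  # assumed discount rate

def lcc(activities):
    """Sum of discounted activity costs: cost / (1 + r)^year."""
    return sum(cost / (1.0 + RATE) ** year for _, cost, year in activities)

alternative = [
    ("site preparation",   2.0e6, 1),   # (activity, cost in $, year incurred)
    ("treatment facility", 8.0e6, 3),
    ("operations",         1.5e6, 5),
]
base = lcc(alternative)
high = lcc([(n, c * 1.2 if n == "operations" else c, y) for n, c, y in alternative])
print(f"base LCC {base:,.0f}; +20% operations cost -> {high:,.0f}")
```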

  3. Treatment Deployment Evaluation Tool

    SciTech Connect

    Rynearson, Michael Ardel; Plum, Martin Michael

    1999-08-01

    The U.S. Department of Energy (DOE) is responsible for the final disposition of legacy spent nuclear fuel (SNF). As a response, DOE's National Spent Nuclear Fuel Program (NSNFP) has been given the responsibility for the disposition of DOE-owned SNF. Many treatment technologies have been identified to treat some forms of SNF so that the resulting treated product is acceptable to the disposition site. One of these promising treatment processes is the electrometallurgical treatment (EMT) currently in development; a second is an Acid Wash Decladding process. The NSNFP has been tasked with identifying possible strategies for the deployment of these treatment processes in the event that a treatment path is deemed necessary. To support the siting studies of these strategies, economic evaluations are being performed to identify the least-cost deployment path. This model (tool) was developed to consider the full scope of costs, technical feasibility, process material disposition, and schedule attributes over the life of each deployment alternative. Using standard personal computer (PC) software, the model was developed as a comprehensive technology economic assessment tool using a Life-Cycle Cost (LCC) analysis methodology. Model development was planned as a systematic, iterative process of identifying and bounding the required activities to dispose of SNF. To support the evaluation process, activities are decomposed into lower-level, easier-to-estimate activities. Sensitivity studies can then be performed on these activities, defining cost issues and testing results against the originally stated problem.

  4. New Software Framework to Share Research Tools

    NASA Astrophysics Data System (ADS)

    Milner, Kevin; Becker, Thorsten W.; Boschi, Lapo; Sain, Jared; Schorlemmer, Danijel; Waterhouse, Hannah

    2009-03-01

    Solid Earth Teaching and Research Environment (SEATREE) is modular and user-friendly software designed to facilitate the use of solid Earth research tools in the classroom and for interdisciplinary research collaboration. The software provides a stand-alone open-source package that allows users to operate in a “black box” mode, which hides implementation details, while also allowing them to dig deeper into the underlying source code. The overlying user interfaces are written in the Python programming language using a modern, object-oriented design, including graphical user interactions. SEATREE, which provides an interface to a range of new and existing lower-level programs that can be written in any computer programming language, may in the long run contribute to new ways of sharing scientific research. By sharing both data and modeling tools in a consistent framework, published (numerical) experiments can be made truly reproducible again.

  5. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  6. A software tool for dataflow graph scheduling

    NASA Technical Reports Server (NTRS)

    Jones, Robert L., III

    1994-01-01

    A graph-theoretic design process and software tool is presented for selecting a multiprocessing scheduling solution for a class of computational problems. The problems of interest are those that can be described using a dataflow graph and are intended to be executed repetitively on multiple processors. The dataflow paradigm is very useful in exposing the parallelism inherent in algorithms. It provides a graphical and mathematical model which describes a partial ordering of algorithm tasks based on data precedence.
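
    As a rough illustration of the partial ordering a dataflow graph imposes, the following minimal Python sketch derives one valid execution order with Kahn's algorithm: a task becomes ready once all of its data inputs have been produced. The graph is an invented example, not taken from the cited tool.

```python
# Sketch of dataflow scheduling: tasks become ready once all of their
# data inputs are produced (Kahn's topological-sort algorithm).
from collections import deque

edges = {"A": ["C"], "B": ["C"], "C": ["D"], "D": []}  # A,B feed C; C feeds D
indegree = {t: 0 for t in edges}
for t in edges:
    for succ in edges[t]:
        indegree[succ] += 1

ready = deque(t for t, d in indegree.items() if d == 0)  # A and B can run in parallel
schedule = []
while ready:
    t = ready.popleft()
    schedule.append(t)
    for succ in edges[t]:
        indegree[succ] -= 1
        if indegree[succ] == 0:
            ready.append(succ)

print(schedule)  # one valid execution order, e.g. ['A', 'B', 'C', 'D']
```

    Tasks that appear in the ready queue together (here A and B) have no data dependence on each other, which is the parallelism the dataflow paradigm exposes.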

  7. Software Tools: A One-Semester Secondary School Computer Course.

    ERIC Educational Resources Information Center

    Bromley, John; Lakatos, John

    1985-01-01

    Provides a course outline, describes equipment and teacher requirements, discusses student evaluation and course outcomes, and details the computer programs used in a high school course. The course is designed to teach students use of the microcomputer as a tool through hands-on experience with a variety of commercial software programs. (MBR)

  8. Management of Astronomical Software Projects with Open Source Tools

    NASA Astrophysics Data System (ADS)

    Briegel, F.; Bertram, T.; Berwein, J.; Kittmann, F.

    2010-12-01

    In this paper we offer an innovative approach to managing the software development process with free open source tools: a build and automated-testing system that automates the compile/test cycle on a variety of platforms to validate code changes, using virtualization to compile in parallel on various operating system platforms; version control and change management; an enhanced wiki and issue tracking system for online documentation and reporting; and groupware tools such as a blog, discussion forums, and a calendar. Initially, starting with the Linc-Nirvana instrument, a new project and configuration management tool for developing astronomical software was sought. After evaluating various systems of this kind, we are satisfied with the selection we are using now. Following the lead of Linc-Nirvana, most of the other software projects at the MPIA now use it.

  9. A Guide to Software Evaluation.

    ERIC Educational Resources Information Center

    Leonard, Rex; LeCroy, Barbara

    Arguing that software evaluation is crucial to the quality of courseware available in a school, this paper begins by discussing reasons why microcomputers are making such a tremendous impact on education, and notes that, although the quality of software has improved over the years, the challenge for teachers to integrate computing into the…

  10. Evaluating and Choosing ESL Software.

    ERIC Educational Resources Information Center

    Egbert, Joy; Petrie, Gina

    2002-01-01

    Outlines steps for selecting software for use in English-as-a-Second-Language instruction. Steps include the following: 1) determine the goals and needs of the program, faculty, and learners; 2) narrow down the search among different options; 3) take time to evaluate the software. Provides two examples of how the process can work in different…

  11. Structure and software tools of AIDA.

    PubMed

    Duisterhout, J S; Franken, B; Witte, F

    1987-01-01

    AIDA consists of a set of software tools to allow for fast development of easy-to-maintain Medical Information Systems. AIDA supports all aspects of such a system both during development and operation. It contains tools to build and maintain forms for interactive data entry and on-line input validation, a database management system including a data dictionary and a set of run-time routines for database access, and routines for querying the database and output formatting. Unlike an application generator, the user of AIDA may select parts of the tools to fulfill his needs and program other subsystems not developed with AIDA. The AIDA software uses as its host language the ANSI-standard programming language MUMPS, an interpreted language embedded in an integrated database and programming environment. This greatly facilitates the portability of AIDA applications. The database facilities supported by AIDA are based on a relational data model. This data model is built on top of the MUMPS database, the so-called global structure. This relational model overcomes the restrictions of the global structure regarding string length. The global structure is especially powerful for sorting purposes. Using MUMPS as a host language allows the user an easy interface between user-defined data validation checks or other user-defined code and the AIDA tools. AIDA has been designed primarily for prototyping and for the construction of Medical Information Systems in a research environment which requires a flexible approach. The prototyping facility of AIDA operates independently of terminal type and is, to a great extent, multilingual. Most of these features are table-driven; this allows on-line changes in the use of terminal type and language, but also causes overhead. AIDA has a set of optimizing tools by which it is possible to build faster, but (of course) less flexible code from these table definitions. By separating the AIDA software in a source and a run-time version, one is able to write

  12. Authoring tool evaluation

    SciTech Connect

    Wilson, A.L.; Klenk, K.S.; Coday, A.C.; McGee, J.P.; Rivenburgh, R.R.; Gonzales, D.M.; Mniszewski, S.M.

    1994-09-15

    This paper discusses and evaluates a number of authoring tools currently on the market. The tools evaluated are Visix Galaxy, NeuronData Open Interface Elements, Sybase Gain Momentum, XVT Power++, Aimtech IconAuthor, Liant C++/Views, and Inmark Technology zApp. Also discussed is the LIST project and how this evaluation is being used to fit an authoring tool to the project.

  13. Evaluating software testing strategies

    NASA Technical Reports Server (NTRS)

    Selby, R. W., Jr.; Basili, V. R.; Page, J.; Mcgarry, F. E.

    1984-01-01

    The strategies of code reading, functional testing, and structural testing are compared in three aspects of software testing: fault detection effectiveness, fault detection cost, and classes of faults detected. The major results are the following: (1) Code readers detected more faults than did those using the other techniques, while functional testers detected more faults than did structural testers; (2) Code readers had a higher fault detection rate than did those using the other methods, while there was no difference between functional testers and structural testers; (3) Subjects testing the abstract data type detected the most faults and had the highest fault detection rate, while individuals testing the database maintainer found the fewest faults and spent the most effort testing; (4) Subjects of intermediate and junior expertise were not different in number or percentage of faults found, fault detection rate, or fault detection effort; (5) Subjects of advanced expertise found a greater number of faults than did the others, found a greater percentage of faults than did just those of junior expertise, and were not different from the others in either fault detection rate or effort; and (6) Code readers and functional testers both detected more omission faults and more control faults than did structural testers, while code readers detected more interface faults than did those using the other methods.

  14. ATLAS software configuration and build tool optimisation

    NASA Astrophysics Data System (ADS)

    Rybkin, Grigory; Atlas Collaboration

    2014-06-01

    ATLAS software code base is over 6 million lines organised in about 2000 packages. It makes use of some 100 external software packages, is developed by more than 400 developers and used by more than 2500 physicists from over 200 universities and laboratories on 6 continents. To meet the challenge of configuring and building this software, the Configuration Management Tool (CMT) is used. CMT expects each package to describe its build targets, build and environment setup parameters, and dependencies on other packages in a text file called requirements, and each project (group of packages) to describe its policies and dependencies on other projects in a text project file. Based on the effective set of configuration parameters read from the requirements files of dependent packages and project files, CMT commands build the packages, generate the environment for their use, or query the packages. The main focus was on build-time performance, which was optimised via several approaches: reduction of the number of reads of requirements files, which are now read once per package by a CMT build command that generates cached requirements files for subsequent CMT build commands; introduction of more fine-grained build parallelism at the package task level, i.e., dependent applications and libraries are compiled in parallel; code optimisation of CMT commands used for the build; and introduction of package-level build parallelism, i.e., parallelising the build of independent packages. By default, CMT launches NUMBER-OF-PROCESSORS build commands in parallel. The other focus was on CMT command optimisation in general, which made the commands approximately 2 times faster. CMT can generate a cached requirements file for the environment setup command, which is especially useful for deployment on distributed file systems like AFS or CERN VMFS. The use of parallelism, caching and code optimisation significantly (by several times) reduced software build time and environment setup time, and increased the efficiency of
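
    As a rough illustration of the caching and package-level parallelism described above, the following minimal Python sketch parses each (placeholder) requirements file only once and builds independent packages in parallel; the package names and structure are invented, and this is not CMT's actual code.

```python
# Sketch of two of the optimisations described: cache each package's
# parsed configuration so it is read once, and build independent
# packages in parallel. Structure is illustrative only.
from concurrent.futures import ThreadPoolExecutor
from functools import lru_cache
import os

@lru_cache(maxsize=None)              # each requirements file parsed only once
def read_requirements(package: str) -> dict:
    return {"package": package, "uses": []}   # placeholder for a real parse

def build(package: str) -> str:
    read_requirements(package)        # cached after the first call
    return f"built {package}"

packages = ["CoreUtils", "EventFormat", "TrackingTools"]  # invented names
with ThreadPoolExecutor(max_workers=os.cpu_count()) as pool:
    print(list(pool.map(build, packages)))
```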

  15. Software Tools for Stochastic Simulations of Turbulence

    NASA Astrophysics Data System (ADS)

    Kaufman, Ryan

    We present two software tools useful for the analysis of mesh-based physics application data, and specifically for turbulent mixing simulations. Each has a broader but separate scope, as we describe. Both features play a key role as we push computational science to its limits, and thus the present work contributes to the frontier of research. The first tool is Wstar, a weak* comparison tool, which addresses the stochastic nature of turbulent flow. The goal is to compare underresolved turbulent data in convergence, parameter dependence, or validation studies. This is achieved by separating space-time data from state data (e.g. density, pressure, momentum, etc.) through coarsening and sampling. The collection of fine-grained data in a single coarse cell is treated as a random sample in state space, whose cumulative distribution function defines a measure within that cell. This set of measures, with the spatial dependence defined by the coarse grid, defines a Young measure solution to the PDE. The second tool is a front tracking application programming interface (API) called FTI. It has the capability to generate geometric surfaces (e.g. the location of interspecies boundaries) of high complexity, and track them dynamically. FTI also includes the ghost fluid method, which enables mesh-based fluid codes to maintain sharpness at interspecies boundaries by modifying solution stencils that cross such a boundary. FTI outlines and standardizes the methods involved in this model. FronTier, as developed here, is a software package which implements this standard. The client must implement the physics and grid interpolation routines outlined in the client interface to FTI. Specific client programs using this interface include the weather forecasting code WRF; the high energy physics code FLASH; and two locally constructed fluid codes, cFluid and iFluid, for compressible and incompressible flow respectively.
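
    As a rough illustration of the weak* comparison idea, the following minimal Python/NumPy sketch coarsens two 1-D fields into cells, treats the fine-grained values in each cell as a random sample, and compares the resulting empirical CDFs cell by cell with a Kolmogorov-Smirnov-style sup distance; the data and cell count are invented, and this is not Wstar itself.

```python
# Sketch of the weak* comparison idea: per coarse cell, summarise the
# fine-grid values by their empirical CDF and compare two datasets.
import numpy as np

def cell_samples(field, n_cells):
    """Split a 1-D fine-grid field into n_cells equal cells of samples."""
    return np.array_split(field, n_cells)

def ks_distance(a, b):
    """Sup distance between two empirical CDFs evaluated on a common grid."""
    grid = np.sort(np.concatenate([a, b]))
    cdf = lambda s: np.searchsorted(np.sort(s), grid, side="right") / len(s)
    return np.max(np.abs(cdf(a) - cdf(b)))

run1 = np.random.default_rng(0).normal(size=1000)   # stand-ins for two
run2 = np.random.default_rng(1).normal(size=1000)   # under-resolved runs
dists = [ks_distance(a, b) for a, b in zip(cell_samples(run1, 10),
                                           cell_samples(run2, 10))]
print(f"max per-cell CDF distance: {max(dists):.3f}")
```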

  16. BoBB, software to assess soil erosion risk - introduction of the tool and its use to evaluate appropriate crops and farming practices on endangered field plots

    NASA Astrophysics Data System (ADS)

    Devátý, Jan; Dostál, Tomáš; Hösl, Rosemarie; Strauss, Peter; Novotný, Ivan

    2013-04-01

    BoBB (Bodenerosion, Beratung, Berechnung) is a simple software tool to support instant assessment of the soil erosion hazard on agricultural fields. The program is profile-oriented, implementing the RUSLE model with slight changes that allow it to assess and compare different farming practices, especially soil-conserving field management. The input parameter datasets are supplied with the necessary data for the territory and natural conditions of Upper Austria but are generally usable for Central Europe. The software was developed at the Federal Agency for Water Management, Petzenkirchen, Austria, in 2011-2012. BAW and CTU in Prague have recently been cooperating on validation and approval of the practical applicability of the model. Basic validation was done by comparing the outputs of the BoBB software with outputs of the original RUSLE model calculated by the RUSLE1 (USDA, 1998) and RUSLE2 (USDA, 2005) software. Further evaluation was performed to test the ability of BoBB to reveal field plots endangered by soil erosion. First, testing areas were selected from a map of soil erosion risk, which had been calculated for the whole territory of the Czech Republic using a combination of the USLE approach and a GIS approach referring to the best available dataset. This map, at 10x10 m resolution, is used as the basic source for assessment of soil erosion hazard and for necessity of GAEC requirements (good agricultural practice assessment for agricultural subsidy policy) and is therefore accepted as a standard at the state level. Characteristic profiles were selected within the defined testing areas, and the soil erosion hazard determined by the USLE approach and by BoBB was then compared. Second, a comparison of BoBB outputs with the database of soil erosion events (http://me.vumop.cz) was carried out. The database is created and maintained by the Czech Institute of Soil Conservation as a unique tool for soil erosion mapping and documentation. It was launched in 2010 and currently contains approximately

  17. Knickpoint finder: A software tool that improves neotectonic analysis

    NASA Astrophysics Data System (ADS)

    Queiroz, G. L.; Salamuni, E.; Nascimento, E. R.

    2015-03-01

    This work presents a new software tool for morphometric analysis of drainage networks based on the methods of Hack (1973) and Etchebehere et al. (2004). This tool is applicable to studies of morphotectonics and neotectonics. The software uses a digital elevation model (DEM) to identify the relief breakpoints along drainage profiles (knickpoints). The program was coded in Python for use on the ArcGIS platform and is called Knickpoint Finder. A study area was selected to test and evaluate the software's ability to analyze and identify neotectonic morphostructures based on the morphology of the terrain. For an assessment of its validity, we chose an area of the James River basin, which covers most of the Piedmont area of Virginia (USA), a region of constant intraplate seismicity and non-orogenic active tectonics that exhibits a relatively homogeneous geodesic surface currently being altered by the seismogenic features of the region. After using the tool in the chosen area, we found that the knickpoint locations are associated with the geologic structures, epicenters of recent earthquakes, and drainages with rectilinear anomalies. The regional analysis demanded the use of a spatial representation of the data after processing with Knickpoint Finder. The results were satisfactory in terms of the correlation of dense areas of knickpoints with active lineaments and the rapidity of the identification of deformed areas. Therefore, this software tool may be considered useful in neotectonic analyses of large areas and may be applied to any area where there is DEM coverage.
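
    As a rough illustration of knickpoint detection on a DEM-derived longitudinal profile, the following minimal Python/NumPy sketch flags points where the channel gradient changes abruptly; the synthetic profile and threshold are invented, and this is not the Knickpoint Finder code.

```python
# Sketch of knickpoint detection: flag abrupt changes in the local
# channel gradient along a longitudinal drainage profile.
import numpy as np

distance = np.linspace(0, 10_000, 101)           # m along the channel
elevation = 500 - 0.02 * distance                # gentle background slope
elevation[60:] -= 15                             # artificial 15 m step (a knick)

slope = np.gradient(elevation, distance)
slope_change = np.abs(np.gradient(slope, distance))
knicks = np.where(slope_change > 5e-5)[0]        # assumed detection threshold
print("knickpoint candidates at m:", distance[knicks])
```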

  18. Software Tools to Support the Assessment of System Health

    NASA Technical Reports Server (NTRS)

    Melcher, Kevin J.

    2013-01-01

    This presentation provides an overview of three software tools that were developed by the NASA Glenn Research Center to support the assessment of system health: the Propulsion Diagnostic Method Evaluation Strategy (ProDiMES), the Systematic Sensor Selection Strategy (S4), and the Extended Testability Analysis (ETA) tool. Originally developed to support specific NASA projects in aeronautics and space, these software tools are currently available to U.S. citizens through the NASA Glenn Software Catalog. The ProDiMES software tool was developed to support a uniform comparison of propulsion gas path diagnostic methods. Methods published in the open literature are typically applied to dissimilar platforms with different levels of complexity. They often address different diagnostic problems and use inconsistent metrics for evaluating performance. As a result, it is difficult to perform a one-to-one comparison of the various diagnostic methods. ProDiMES solves this problem by serving as a theme problem to aid in propulsion gas path diagnostic technology development and evaluation. The overall goal is to provide a tool that will serve as an industry standard and will truly facilitate the development and evaluation of significant Engine Health Management (EHM) capabilities. ProDiMES has been developed under a collaborative project of The Technical Cooperation Program (TTCP) based on feedback provided by individuals within the aircraft engine health management community. The S4 software tool provides a framework that supports the optimal selection of sensors for health management assessments. S4 is structured to accommodate user-defined applications, diagnostic systems, search techniques, and system requirements/constraints. It identifies one or more sensor suites that maximize diagnostic performance while meeting other user-defined system requirements. S4 provides a systematic approach for evaluating combinations of sensors to determine the set or sets of

  19. STAYSL PNNL Suite of Software Tools.

    SciTech Connect

    GREENWOOD, LARRY R.

    2013-07-19

    Version: 00 The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
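
    As a rough illustration of the generalized least-squares adjustment described above, the following minimal Python/NumPy sketch updates a prior flux spectrum and its covariance using measured reaction rates; all matrices are tiny invented stand-ins, not STAYSL PNNL data.

```python
# Sketch of generalized least-squares spectral adjustment: a prior flux
# spectrum phi0 with covariance Cp is adjusted so that the response
# A @ phi matches measured reaction rates r (covariance Cr).
import numpy as np

A = np.array([[1.0, 0.2], [0.3, 1.5]])      # activation cross sections (response)
phi0 = np.array([1.0, 2.0])                 # prior neutron flux spectrum
Cp = np.diag([0.04, 0.09])                  # prior spectrum covariance
r = np.array([1.5, 3.4])                    # measured reaction rates
Cr = np.diag([0.01, 0.01])                  # measurement covariance

K = Cp @ A.T @ np.linalg.inv(A @ Cp @ A.T + Cr)   # least-squares gain
phi = phi0 + K @ (r - A @ phi0)                   # adjusted spectrum
Cpost = (np.eye(2) - K @ A) @ Cp                  # adjusted covariance
print(phi, np.diag(Cpost))
```

    The adjusted covariance Cpost is what makes the result directly usable for the dosimetry and radiation damage uncertainty estimates the abstract mentions.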

  20. STAYSL PNNL Suite of Software Tools.

    Energy Science and Technology Software Center (ESTSC)

    2013-07-19

    Version: 00 The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.

  1. Choosing CALL Software: Beginning the Evaluation Process.

    ERIC Educational Resources Information Center

    Bader, Melissa J.

    2000-01-01

    Synthesizes information that is available on software evaluation and provides a software evaluation checklist to help educators examine software based on linguistic and pedagogical criteria. The checklist allows educators to compare and contrast software products, enabling them to select software that is best suited to their classrooms.…

  2. A Software Communication Tool for the Tele-ICU

    PubMed Central

    Pimintel, Denise M.; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high-volume, high-acuity population of patients. There is a high volume of incoming and outgoing calls through the tele-ICU hubs, especially during the evening and night hours. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve the time to intervention over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances from which information can be mined for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider. PMID:24551398

  3. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  4. Data Analysis with Graphical Models: Software Tools

    NASA Technical Reports Server (NTRS)

    Buntine, Wray L.

    1994-01-01

    Probabilistic graphical models (directed and undirected Markov fields, and combined in chain graphs) are used widely in expert systems, image processing and other areas as a framework for representing and reasoning with probabilities. They come with corresponding algorithms for performing probabilistic inference. This paper discusses an extension to these models by Spiegelhalter and Gilks, plates, used to graphically model the notion of a sample. This offers a graphical specification language for representing data analysis problems. When combined with general methods for statistical inference, this also offers a unifying framework for prototyping and/or generating data analysis algorithms from graphical specifications. This paper outlines the framework and then presents some basic tools for the task: a graphical version of the Pitman-Koopman Theorem for the exponential family, problem decomposition, and the calculation of exact Bayes factors. Other tools already developed, such as automatic differentiation, Gibbs sampling, and use of the EM algorithm, make this a broad basis for the generation of data analysis software.

  5. Tools and Behavioral Abstraction: A Direction for Software Engineering

    NASA Astrophysics Data System (ADS)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  6. Cyber Security Evaluation Tool

    SciTech Connect

    2009-08-03

    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  7. Cyber Security Evaluation Tool

    Energy Science and Technology Software Center (ESTSC)

    2009-08-03

    CSET is a desktop software tool that guides users through a step-by-step process to assess their control system network security practices against recognized industry standards. The output from CSET is a prioritized list of recommendations for improving the cyber security posture of your organization’s ICS or enterprise network. CSET derives the recommendations from a database of cybersecurity standards, guidelines, and practices. Each recommendation is linked to a set of actions that can be applied to enhance cybersecurity controls.

  8. Software Tools to Support Research on Airport Departure Planning

    NASA Technical Reports Server (NTRS)

    Carr, Francis; Evans, Antony; Feron, Eric; Clarke, John-Paul

    2003-01-01

    A simple, portable and useful collection of software tools has been developed for the analysis of airport surface traffic. The tools are based on a flexible and robust traffic-flow model, and include calibration, validation and simulation functionality for this model. Several different interfaces have been developed to help promote usage of these tools, including a portable Matlab(TM) implementation of the basic algorithms; a web-based interface which provides online access to automated analyses of airport traffic based on a database of real-world operations data which covers over 250 U.S. airports over a 5-year period; and an interactive simulation-based tool currently in use as part of a college-level educational module. More advanced applications for airport departure traffic include taxi-time prediction and evaluation of "windowing" congestion control.

  9. Multimedia Software Evaluation Form for Teachers

    ERIC Educational Resources Information Center

    Herring, Donna F.; Notar, Charles E.; Wilson, Janell D.

    2005-01-01

    Schools are currently receiving increased funds for multimedia software for classrooms. There is a need for good software in the schools, and there is a need to know how to evaluate software and not naively rely on advertisements. Evaluators of multimedia software for education must have the skills to critically evaluate and make decisions not…

  10. SAPHIRE models and software for ASP evaluations

    SciTech Connect

    Sattison, M.B.; Schroeder, J.A.; Russell, K.D.

    1995-04-01

    The Idaho National Engineering Laboratory (INEL) over the past year has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of conditional core damage probability (CCDP) evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both NRR and AEOD. This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response with a unique set of event trees for each plant class, (2) plant-specific fault trees using supercomponents, (3) generation and retention of all system and sequence cutsets, (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results, and (5) user interface for streamlined evaluation of ASP events.
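
    As a rough illustration of quantifying a sequence from retained cutsets, the following minimal Python sketch applies the standard minimal cut set upper bound, 1 - prod(1 - P(cutset)); the basic events and probabilities are invented, and this is not SAPHIRE's implementation.

```python
# Sketch of sequence quantification from retained cutsets: each cutset's
# probability is the product of its basic-event probabilities, and the
# sequence is bounded by the minimal cut set upper bound.
from math import prod

basic_events = {"pump_fails": 1e-3, "valve_stuck": 5e-4, "power_lost": 2e-4}
cutsets = [("pump_fails", "valve_stuck"), ("power_lost",)]

def cutset_prob(cs):
    return prod(basic_events[e] for e in cs)

p_seq = 1.0 - prod(1.0 - cutset_prob(cs) for cs in cutsets)
print(f"sequence probability ~ {p_seq:.3e}")
```

    Retaining all cutsets, as the abstract describes, is what allows logic changes to be followed by requantification without rebuilding the models.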

  11. Integration of case tools for software project management

    SciTech Connect

    Paul, R.; Shinagawa, Y.; Khan, M.F.

    1996-12-31

    Building and maintenance of high quality large software projects is a complex and difficult process. Tools employing software metrics are becoming an effective aid for management of such large projects. In this paper, we briefly trace the evolution of such tools from their beginnings up until the current trends of integrated CASE tools. We present a generic integrated CASE environment incorporating a formal set of software metrics with a suite of advanced analytic techniques. The proposed integrated CASE environment is an enhancement of currently used tools, and can enable more efficient and cost-effective management of large and complex software projects.

  12. VTGRAPH - GRAPHIC SOFTWARE TOOL FOR VT TERMINALS

    NASA Technical Reports Server (NTRS)

    Wang, C.

    1994-01-01

    VTGRAPH is a graphics software tool for DEC/VT or VT-compatible terminals, which are widely used by government and industry. It is a FORTRAN- or C-callable library designed to allow the user to deal with many computer environments which use VT terminals for window management and graphic systems. It also provides a PLOT10-like package plus color or shade capability for VT240, VT241, and VT300 terminals. The program is transportable to many different computers which use VT terminals. With this graphics package, the user can easily design more user-friendly interface programs and design PLOT10 programs on VT terminals with different computer systems. VTGRAPH was developed using the ReGIS graphics set, which provides a full range of graphics capabilities. The basic VTGRAPH capabilities are as follows: window management, PLOT10-compatible drawing, generic program routines for two- and three-dimensional plotting, and color graphics or shaded graphics capability. The program was developed in VAX FORTRAN in 1988. VTGRAPH requires a ReGIS graphics set terminal and a FORTRAN compiler. The program has been run on a DEC MicroVAX 3600 series computer operating under VMS 5.0, and has a virtual memory requirement of 5KB.

  13. CASRE - An Easy-to-Use Software Reliability Measurement Tool

    NASA Technical Reports Server (NTRS)

    Nikora, A.; Lyu, M.; Farr, W.

    1993-01-01

    This paper describes the implementation of a software reliability measurement tool, CASRE, that incorporates the mathematical modeling capabilities of the public domain tool SMERFS and is being implemented in a Microsoft Windows environment.

  14. The Value of Open Source Software Tools in Qualitative Research

    ERIC Educational Resources Information Center

    Greenberg, Gary

    2011-01-01

    In an era of global networks, researchers using qualitative methods must consider the impact of any software they use on the sharing of data and findings. In this essay, I identify researchers' main areas of concern regarding the use of qualitative software packages for research. I then examine how open source software tools, wherein the publisher…

  15. A Tool for Managing Software Architecture Knowledge

    SciTech Connect

    Babar, Muhammad A.; Gorton, Ian

    2007-08-01

    This paper describes a tool for managing architectural knowledge and rationale. The tool has been developed to support a framework for capturing and using architectural knowledge to improve the architecture process. This paper describes the main architectural components and features of the tool. The paper also provides examples of using the tool for supporting well-known architecture design and analysis methods.

  16. An evaluation of the Interactive Software Invocation System (ISIS) for software development applications. [flight software

    NASA Technical Reports Server (NTRS)

    Noland, M. S.

    1981-01-01

    The Interactive Software Invocation System (ISIS), which allows a user to build, modify, control, and process a total flight software system without direct communications with the host computer, is described. This interactive data management system provides the user with a file manager, a text editor, a tool invoker, and an Interactive Programming Language (IPL). The basic file design of ISIS is a five-level hierarchical structure. The file manager controls this hierarchical file structure and permits the user to create, save, access, and purge pages of information. The text editor is used to manipulate pages of text to be modified, and the tool invoker allows the user to communicate with the host computer through a RUN file created by the user. The IPL is based on PASCAL and contains most of the statements found in a high-level programming language. In order to evaluate the effectiveness of the system as applied to a flight project, the collection of software components required to support the Annular Suspension and Pointing System (ASPS) flight project was integrated using ISIS. The ASPS software system and its integration into ISIS are described.

  17. Free software tools for atlas-based volumetric neuroimage analysis

    NASA Astrophysics Data System (ADS)

    Bazin, Pierre-Louis; Pham, Dzung L.; Gandler, William; McAuliffe, Matthew

    2005-04-01

    We describe new and freely available software tools for measuring volumes in subregions of the brain. The method is fast, flexible, and employs well-studied techniques based on the Talairach-Tournoux atlas. The software tools are released as plug-ins for MIPAV, a freely available and user-friendly image analysis software package developed by the National Institutes of Health. Our software tools include a digital Talairach atlas that consists of labels for 148 different substructures of the brain at various scales.
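
    As a rough illustration of atlas-based volumetry, the following minimal Python/NumPy sketch counts the voxels carrying one atlas label and converts the count to a volume; the label image, label value, and voxel spacing are invented, not MIPAV's actual data structures.

```python
# Sketch of atlas-based volume measurement: count the voxels carrying a
# given atlas label and multiply by the physical voxel volume.
import numpy as np

labels = np.zeros((10, 10, 10), dtype=np.int16)
labels[2:5, 2:5, 2:5] = 42                 # pretend label 42 marks one substructure
voxel_mm3 = 1.0 * 1.0 * 1.5                # assumed voxel spacing in mm

volume_mm3 = np.count_nonzero(labels == 42) * voxel_mm3
print(f"structure 42 volume: {volume_mm3:.1f} mm^3")
```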

  18. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  19. Estimation of toxicity using a Java based software tool

    EPA Science Inventory

    A software tool has been developed that will allow a user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser (or alternatively downloaded and run as a stand-alone application).

  20. ToxPredictor: a Toxicity Estimation Software Tool

    EPA Science Inventory

    The Computational Toxicology Team within the National Risk Management Research Laboratory has developed a software tool that will allow the user to estimate the toxicity for a variety of endpoints (such as acute aquatic toxicity). The software tool is coded in Java and can be accessed using a web browser or downloaded and run as a stand-alone application.

  1. EISA 432 Energy Audits Best Practices: Software Tools

    SciTech Connect

    Maryl Fisher

    2014-11-01

    Five whole building analysis software tools that can aid an energy manager with fulfilling energy audit and commissioning/retro-commissioning requirements were selected for review in this best practices study. A description of each software tool is provided as well as a discussion of the user interface and level of expertise required for each tool, a review of how to use the tool for analyzing energy conservation opportunities, the format and content of reports generated by the tool, and a discussion on the applicability of the tool for commissioning.

  2. Tools Ensure Reliability of Critical Software

    NASA Technical Reports Server (NTRS)

    2012-01-01

    In November 2006, after attempting to make a routine maneuver, NASA's Mars Global Surveyor (MGS) reported unexpected errors. The onboard software switched to backup resources, and a 2-day lapse in communication took place between the spacecraft and Earth. When a signal was finally received, it indicated that MGS had entered safe mode, a state of restricted activity in which the computer awaits instructions from Earth. After more than 9 years of successful operation gathering data and snapping pictures of Mars to characterize the planet's land and weather, communication between MGS and Earth had suddenly stopped. Months later, a report from NASA's internal review board found the spacecraft's battery failed due to an unfortunate sequence of events. Updates to the spacecraft's software, which had taken place months earlier, were written to the wrong memory address in the spacecraft's computer. In short, the mission ended because of a software defect. Over the last decade, spacecraft have become increasingly reliant on software to carry out mission operations. In fact, the next mission to Mars, the Mars Science Laboratory, will rely on more software than all earlier missions to Mars combined. According to Gerard Holzmann, manager at the Laboratory for Reliable Software (LaRS) at NASA's Jet Propulsion Laboratory (JPL), even the fault protection systems on a spacecraft are mostly software-based. For reasons like these, well-functioning software is critical for NASA. In the same year as the failure of MGS, Holzmann presented a new approach to critical software development to help reduce risk and provide consistency. He proposed The Power of 10: Rules for Developing Safety-Critical Code, a small set of rules that can easily be remembered, clearly relate to risk, and allow compliance to be verified. The reaction at JPL was positive, and developers in the private sector embraced Holzmann's ideas.
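
    Holzmann's rules target flight code written in C, but their spirit carries over to other languages. As a loose illustration only, the sketch below shows two of the rules in miniature, a fixed bound on every loop and liberal use of assertions; the sensor object and retry bound are hypothetical, not taken from the Power of 10 paper.

        MAX_RETRIES = 10  # fixed, statically known loop bound

        def read_sensor(sensor, timeout_s=1.0):
            """Poll a sensor with a bounded retry loop (no unbounded while True)."""
            assert sensor is not None, "precondition: sensor must exist"
            assert timeout_s > 0, "precondition: timeout must be positive"
            for _attempt in range(MAX_RETRIES):  # rule: every loop has a fixed bound
                value = sensor.poll(timeout_s)   # hypothetical device API
                if value is not None:
                    assert isinstance(value, float), "postcondition: numeric reading"
                    return value
            raise RuntimeError("sensor did not respond within %d attempts" % MAX_RETRIES)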

  3. Software Users Manual (SUM): Extended Testability Analysis (ETA) Tool

    NASA Technical Reports Server (NTRS)

    Maul, William A.; Fulton, Christopher E.

    2011-01-01

    This software user manual describes the implementation and use of the Extended Testability Analysis (ETA) Tool. The ETA Tool is a software program that augments the analysis and reporting capabilities of a commercial-off-the-shelf (COTS) testability analysis software package called the Testability Engineering And Maintenance System (TEAMS) Designer. An initial diagnostic assessment is performed by the TEAMS Designer software using a qualitative, directed-graph model of the system being analyzed. The ETA Tool utilizes system design information captured within the diagnostic model and testability analysis output from the TEAMS Designer software to create a series of six reports for various system engineering needs. The ETA Tool allows the user to perform additional studies on the testability analysis results by determining the detection sensitivity to the loss of certain sensors or tests. The ETA Tool was developed to support design and development of the NASA Ares I Crew Launch Vehicle. The diagnostic analysis provided by the ETA Tool was proven to be valuable system engineering output that provided consistency in the verification of system engineering requirements. This software user manual provides a description of each output report generated by the ETA Tool. The manual also describes the example diagnostic model and supporting documentation - also provided with the ETA Tool software release package - that were used to generate the reports presented in the manual.

  4. Managing Digital Archives Using Open Source Software Tools

    NASA Astrophysics Data System (ADS)

    Barve, S.; Dongare, S.

    2007-10-01

    This paper describes the use of open source software tools such as MySQL and PHP for creating database-backed websites. Such websites offer many advantages over ones built from static HTML pages. This paper discusses how OSS tools are used and their benefits, and how, after the successful implementation of these tools, the library took the initiative in implementing an institutional repository using the DSpace open source software.

  5. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas that limit our ability to transfer high-assurance software engineering tools into practice.

  6. Software Selection, Evaluation and Organization [and] Software Reviews. Article Reprints.

    ERIC Educational Resources Information Center

    Computing Teacher, 1985

    1985-01-01

    This collection of reprints from The Computing Teacher contains 11 articles on the selection, evaluation, and organization of software published between August 1983 and March 1986, as well as more than 20 reviews of educational software packages published between December 1982 and June 1986. The articles are: (1) "The New Wave of Educational…

  7. NASA Approach to HPCCP Support Software and Tools

    NASA Technical Reports Server (NTRS)

    Blaylock, Bruce; Chancellor, Marisa K. (Technical Monitor)

    1996-01-01

    The NASA HPCC Program, together with other agencies participating in the Federal HPCC Program, intends to advance technologies to enable the execution of grand challenge applications at sustained rates up to TeraFLOPS. During 1995-6 NASA undertook two major systems software efforts to improve the state of high performance support software and tools. The first of these activities was a replanning of support software and tools activities internal to the Agency. In replanning the software activities, emphasis was placed on meeting the needs of Grand Challenge users, a small number of projects, and near-term useful results. The revised NASA plan calls for support software and tools activities in four areas: application creation process support; application usage/operations support; advanced support software and tools concepts; and metrics-based monitoring and management. The second major activity undertaken was participation in a multiagency Task Force resulting from the Second Pasadena Workshop on System Software and Tools. The task force developed the Guidelines for Writing System Software and Tools Requirements for Parallel and Clustered Computers.

  8. How to Use the Software Evaluation Form.

    ERIC Educational Resources Information Center

    Reynolds, Karen E.

    1985-01-01

    Provides a form for evaluating software and software design. The form, which emphasizes instructional qualities, considers science processes, hardware requirements, program mechanics, student reaction, and other areas. Guidelines for using the form are included. (DH)

  9. Innovative Software Tools Measure Behavioral Alertness

    NASA Technical Reports Server (NTRS)

    2014-01-01

    To monitor astronaut behavioral alertness in space, Johnson Space Center awarded Philadelphia-based Pulsar Informatics Inc. SBIR funding to develop software to be used onboard the International Space Station. Now used by the government and private companies, the technology has increased revenues for the firm by an average of 75 percent every year.

  10. Saphire models and software for ASP evaluations

    SciTech Connect

    Sattison, M.B.

    1997-02-01

    Over the past three years, the Idaho National Engineering Laboratory (INEL) has created 75 plant-specific Accident Sequence Precursor (ASP) models using the SAPHIRE suite of PRA codes. Along with the new models, the INEL has also developed a new module for SAPHIRE which is tailored specifically to the unique needs of ASP evaluations. These models and software will be the next generation of risk tools for the evaluation of accident precursors by both the U.S. Nuclear Regulatory Commission's (NRC's) Office of Nuclear Reactor Regulation (NRR) and the Office for Analysis and Evaluation of Operational Data (AEOD). This paper presents an overview of the models and software. Key characteristics include: (1) classification of the plant models according to plant response, with a unique set of event trees for each plant class; (2) plant-specific fault trees using supercomponents; (3) generation and retention of all system and sequence cutsets; (4) full flexibility in modifying logic, regenerating cutsets, and requantifying results; and (5) a user interface for streamlined evaluation of ASP events. Future plans for the ASP models are also presented.
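
    The cut set generation and retention mentioned in item (3) is a standard fault-tree computation. The following is a minimal sketch of minimal cut set generation, not SAPHIRE's actual algorithm; the two-pump fault tree is invented for the example.

        from itertools import product

        # Toy fault tree: gates are ("AND" | "OR", children); leaves are basic events.
        TREE = ("OR", [
            ("AND", ["pump_A_fails", "pump_B_fails"]),
            "power_bus_fails",
        ])

        def cut_sets(node):
            """Return the cut sets (frozensets of basic events) for a node."""
            if isinstance(node, str):  # basic event
                return {frozenset([node])}
            op, children = node
            child_sets = [cut_sets(c) for c in children]
            if op == "OR":             # union of the children's cut sets
                return set().union(*child_sets)
            # AND: merge every combination of one cut set per child
            return {frozenset().union(*combo) for combo in product(*child_sets)}

        def minimal(sets):
            """Drop any cut set that has a proper subset in the collection."""
            return {s for s in sets if not any(t < s for t in sets)}

        print(minimal(cut_sets(TREE)))
        # {frozenset({'power_bus_fails'}), frozenset({'pump_A_fails', 'pump_B_fails'})}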

  11. Software Construction and Analysis Tools for Future Space Missions

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Clancy, Daniel (Technical Monitor)

    2002-01-01

    NASA and its international partners will increasingly depend on software-based systems to implement advanced functions for future space missions, such as Martian rovers that autonomously navigate long distances exploring geographic features formed by surface water early in the planet's history. The software-based functions for these missions will need to be robust and highly reliable, raising significant challenges in the context of recent Mars mission failures attributed to software faults. After reviewing these challenges, this paper describes tools that have been developed at NASA Ames that could contribute to meeting them: 1) program synthesis tools based on automated inference that generate documentation for manual review and annotations for automated certification; 2) model-checking tools for concurrent object-oriented software that achieve scalability through synergy with program abstraction and static analysis tools.

  12. ISWHM: Tools and Techniques for Software and System Health Management

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole J.; Darwiche, Adnan

    2010-01-01

    This presentation gives the status and results of research on Software Health Management done within the NRA "ISWHM: Tools and Techniques for Software and System Health Management." Topics include: ingredients of a Guidance, Navigation, and Control (GN&C) system; a selected GN&C testbed example; health management of major ingredients; ISWHM testbed architecture; and conclusions and next steps.

  13. Developing a Decision Support System: The Software and Hardware Tools.

    ERIC Educational Resources Information Center

    Clark, Phillip M.

    1989-01-01

    Describes some of the available software and hardware tools that can be used to develop a decision support system implemented on microcomputers. Activities that should be supported by software are discussed, including data entry, data coding, finding and combining data, and data compatibility. Hardware considerations include speed, storage…

  14. Some Interactive Aspects of a Software Design Schema Acquisition Tool

    NASA Technical Reports Server (NTRS)

    Lee, Hing-Yan; Harandi, Mehdi T.

    1991-01-01

    This paper describes a design schema acquisition tool which forms an important component of a hybrid software design system for reuse. The hybrid system incorporates both schema-based approaches in supporting software design reuse activities and is realized by extensions to the IDeA system. The paper also examines some of the interactive aspects that the tool requires with the domain analyst to accomplish its acquisition task.

  15. Software tool for xenon gamma-ray spectrometer control

    NASA Astrophysics Data System (ADS)

    Chernysheva, I. V.; Novikov, A. S.; Shustov, A. E.; Dmitrenko, V. V.; Pyae Nyein, Sone; Petrenko, D.; Ulin, S. E.; Uteshev, Z. M.; Vlasik, K. F.

    2016-02-01

    Software tool "Acquisition and processing of gamma-ray spectra" for xenon gamma-ray spectrometers control was developed. It supports the multi-windows interface. Software tool has the possibilities for acquisition of gamma-ray spectra from xenon gamma-ray detector via USB or RS-485 interfaces, directly or via TCP-IP protocol, energy calibration of gamma-ray spectra, saving gamma-ray spectra on a disk.

  16. iPhone examination with modern forensic software tools

    NASA Astrophysics Data System (ADS)

    Höne, Thomas; Kröger, Knut; Luttenberger, Silas; Creutzburg, Reiner

    2012-06-01

    The aim of the paper is to show the usefulness of modern forensic software tools for iPhone examination. In particular, we focus on the new version of Elcomsoft iOS Forensic Toolkit and compare it with Oxygen Forensics Suite 2012 regarding functionality, usability and capabilities. It is shown how these software tools work and how capable they are at examining non-jailbroken and jailbroken iPhones.

  17. Generating DEM from LIDAR data - comparison of available software tools

    NASA Astrophysics Data System (ADS)

    Korzeniowska, K.; Lacka, M.

    2011-12-01

    In recent years many software tools and applications have appeared that offer procedures, scripts and algorithms to process and visualize ALS data. This variety of software tools and of "point cloud" processing methods contributed to the aim of this study: to assess algorithms available in various software tools that are used to classify LIDAR "point cloud" data, through a careful examination of Digital Elevation Models (DEMs) generated from LIDAR data on the basis of these algorithms. The work focused on the most important available software tools: both commercial and open source ones. Two sites in a mountain area were selected for the study. The area of each site is 0.645 sq km. DEMs generated with the analysed software tools were compared with a reference dataset, generated using manual methods to eliminate non-ground points. Surfaces were analysed using raster analysis. Minimum, maximum and mean differences between the reference DEM and the DEMs generated with the analysed software tools were calculated, together with the Root Mean Square Error. Differences between DEMs were also examined visually using transects along the grid axes in the test sites.
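
    The reported statistics reduce to cell-by-cell differencing of aligned rasters. A minimal sketch of that computation, with small synthetic arrays standing in for real DEM grids:

        import numpy as np

        def dem_difference_stats(reference_dem, test_dem):
            """Min/max/mean difference and RMSE between two aligned DEM rasters."""
            diff = test_dem - reference_dem  # cell-by-cell elevation difference
            return {
                "min": float(np.nanmin(diff)),
                "max": float(np.nanmax(diff)),
                "mean": float(np.nanmean(diff)),
                "rmse": float(np.sqrt(np.nanmean(diff ** 2))),
            }

        # Synthetic 3 x 3 rasters standing in for real DEM grids.
        ref = np.array([[10.0, 10.5, 11.0],
                        [10.2, 10.8, 11.3],
                        [10.4, 11.0, 11.6]])
        test = ref + np.array([[0.1, -0.2, 0.0],
                               [0.3, 0.0, -0.1],
                               [0.0, 0.2, 0.1]])
        print(dem_difference_stats(ref, test))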

  18. The Validation of a Software Evaluation Instrument.

    ERIC Educational Resources Information Center

    Schmitt, Dorren Rafael

    This study, conducted at six southern universities, analyzed the validity and reliability of a researcher developed instrument designed to evaluate educational software in secondary mathematics. The instrument called the Instrument for Software Evaluation for Educators uses measurement scales, presents a summary section of the evaluation, and…

  19. Software Tools for Weed Seed Germination Modeling

    Technology Transfer Automated Retrieval System (TEKTRAN)

    The next generation of weed seed germination models will need to account for variable soil microclimate conditions. In order to predict this microclimate environment we have developed a suite of individual tools (models) that can be used in conjunction with the next generation of weed seed germination models.

  20. Lessons learned in deploying software estimation technology and tools

    NASA Technical Reports Server (NTRS)

    Panlilio-Yap, Nikki; Ho, Danny

    1994-01-01

    Developing a software product involves estimating various project parameters. This is typically done in the planning stages of the project when there is much uncertainty and very little information. Coming up with accurate estimates of effort, cost, schedule, and reliability is a critical problem faced by all software project managers. The use of estimation models and commercially available tools in conjunction with the best bottom-up estimates of software-development experts enhances the ability of a product development group to derive reasonable estimates of important project parameters. This paper describes the experience of the IBM Software Solutions (SWS) Toronto Laboratory in selecting software estimation models and tools and deploying their use to the laboratory's product development groups. It introduces the SLIM and COSTAR products, the software estimation tools selected for deployment to the product areas, and discusses the rationale for their selection. The paper also describes the mechanisms used for technology injection and tool deployment, and concludes with a discussion of important lessons learned in the technology and tool insertion process.
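
    The paper's central practice, using tool output in conjunction with expert bottom-up estimates, can be pictured as a simple blend. The weighting scheme below is a hypothetical illustration, not the IBM SWS procedure.

        def reconcile(model_pm, expert_pms, model_weight=0.5):
            """Blend a tool's estimate with the mean of expert estimates (person-months)."""
            assert 0.0 <= model_weight <= 1.0
            expert_mean = sum(expert_pms) / len(expert_pms)
            return model_weight * model_pm + (1.0 - model_weight) * expert_mean

        # Tool says 120 PM; three component owners estimate their pieces bottom-up.
        print(reconcile(120.0, [95.0, 110.0, 105.0]))  # about 111.7 person-months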

  1. MOSS, an evaluation of software engineering techniques

    NASA Technical Reports Server (NTRS)

    Bounds, J. R.; Pruitt, J. L.

    1976-01-01

    An evaluation of the software engineering techniques used for the development of a Modular Operating System (MOSS) was described. MOSS is a general purpose real time operating system which was developed for the Concept Verification Test (CVT) program. Each of the software engineering techniques was described and evaluated based on the experience of the MOSS project. Recommendations for the use of these techniques on future software projects were also given.

  2. Programming software for usability evaluation

    SciTech Connect

    Edwards, T.L.; Allen, H.W.

    1997-01-01

    This report provides an overview of the work completed for a portion of the User Interface Testbed for Technology Packaging (UseIT) project. The authors present software methods for programming systems to record and view interactions with a graphical user interface. A brief description of the human factors design process is presented. The software methods exploit features available in the X Window System and the operating system for Windows 95 and Windows NT.

  3. A software tool to analyze clinical workflows from direct observations.

    PubMed

    Schweitzer, Marco; Lasierra, Nelia; Hoerbst, Alexander

    2015-01-01

    Observational data of clinical processes need to be managed in a convenient way, so that process information is reliable, valid and viable for further analysis. However, existing tools for allocating observations fail in systematic data collection of specific workflow recordings. We present a software tool which was developed to facilitate the analysis of clinical process observations. The tool was successfully used in the project OntoHealth, to build, store and analyze observations of diabetes routine consultations. PMID:26262417

  4. Educational Software Evaluation Form for Teachers

    ERIC Educational Resources Information Center

    Kara, Yilmaz

    2007-01-01

    The purpose of the study was to develop an educational software evaluation form to provide an evaluation and selection instrument of educational software that met the requirements of some balance between mechanics, content and pedagogy that is user friendly. The subjects for the study comprised a group of 32 biology teachers working in secondary…

  5. Evaluating Interactive Video: Software and Hardware.

    ERIC Educational Resources Information Center

    Sorge, Dennis H.; And Others

    1993-01-01

    Discusses selection criteria for evaluating software and hardware used in interactive video based on experiences from the Purdue Academic Learning Opportunity System Project at Purdue University. Highlights include checklists for evaluating software and selecting hardware, including peripheral equipment; videodisc players; hardware compatibility;…

  6. Development of Fuel Accounting Software Tool

    NASA Astrophysics Data System (ADS)

    Eun, Jong Won; Suk, Juil

    1996-12-01

    A successful spacecraft mission depends on the proper maintenance of the orbit and attitude. One important requirement for the orbit and attitude planning is the accurate estimation of the propellant remaining onboard the spacecraft. For a GEO communications satellite, a precise estimate of the remaining fuel is of particular importance. This paper focuses on the bookkeeping method that was developed for calculating the propellant budget by recording fuel consumption history. In general, the bookkeeping method includes detailed observation of spacecraft maneuver operations throughout the whole mission life. Application of this method is illustrated using a communications satellite. In this fuel accounting software tool, a PC-based spreadsheet is utilized to provide an overall view of input/output elements, and to provide strong numerical and graphical merits for analyses.
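
    In its simplest form, the bookkeeping method is a running ledger decremented by the propellant consumed in each maneuver. A minimal sketch, with all maneuver data and flow rates invented for illustration:

        # mass used per maneuver = flow rate * burn duration * thrusters firing
        INITIAL_PROPELLANT_KG = 450.0

        maneuvers = [
            # (label, flow rate kg/s per thruster, burn seconds, thrusters)
            ("station-keeping N/S", 0.002, 380.0, 2),
            ("station-keeping E/W", 0.002, 120.0, 2),
            ("momentum dump",       0.001,  45.0, 4),
        ]

        remaining = INITIAL_PROPELLANT_KG
        for label, mdot, burn_s, n in maneuvers:
            used = mdot * burn_s * n
            remaining -= used
            print("%-22s used %6.3f kg, remaining %8.3f kg" % (label, used, remaining))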

  7. A Dynamic MPI Software Correctness Checking Tool

    Energy Science and Technology Software Center (ESTSC)

    2005-10-31

    Umpire is a prototype tool developed at LLNL by Bronis R. de Supinski, J. M. May, Martin Schulz and Jeffery Vetter as part of the ASDE TRTS project for detecting programming errors at runtime in message passing applications. Umpire monitors the MPI operations of an application by interposing itself between the application and the MPI runtime system using the MPI profiling layer. Umpire then checks the application's MPI behavior for specific errors. Umpire detects errors that are local to individual MPI tasks, including resource errors (e.g., leaks of MPI datatypes and other opaque objects) and overwrites of non-blocking send buffers. It also detects distributed errors, including deadlocks involving any MPI-1 constructs and datatype mismatches between matching communication operations.

  8. Microcomputer Software Evaluation Instrument Version 1983.

    ERIC Educational Resources Information Center

    Klopfer, Leopold E.; And Others

    1984-01-01

    Gives guidelines for using a microcomputer software evaluation instrument which focuses on policy issues, instructional quality, science subject-matter standards, and technical quality. The complete evaluation instrument is included. (JM)

  9. Class diagram based evaluation of software performance

    NASA Astrophysics Data System (ADS)

    Pham, Huong V.; Nguyen, Binh N.

    2013-03-01

    The evaluation of software performance in the early stages of the software life cycle is important, and it has been widely studied. In software model specification, the class diagram is the key object-oriented specification model. Measures based on a class diagram have been widely studied to evaluate software qualities such as complexity, maintainability, and reuse capability. However, software performance evaluation based on the class model has not been widely studied, especially for object-oriented design of embedded software. Therefore, in this paper we propose a new approach to evaluate software performance directly from class diagrams. From a class diagram, we determine the parameters used to evaluate performance and build formulas for measures such as Size of Class Variables, Size of Class Methods, Size of Instance Variables, and Size of Instance Methods. Then we analyze the dependence of performance on these measures and build the performance evaluation function from the class diagram. Thereby we can choose the best class diagram based on this evaluation function.
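
    The size measures named in the abstract are counts over the class model. A minimal sketch of computing them follows; the dictionary encoding of a class diagram is invented here, not the paper's notation.

        # Toy class-diagram encoding: per class, its class-level and
        # instance-level variables and methods.
        classes = {
            "Sensor": {"class_vars": ["count"], "inst_vars": ["id", "rate"],
                       "class_methods": ["reset_all"], "inst_methods": ["read", "calibrate"]},
            "Controller": {"class_vars": [], "inst_vars": ["sensors", "mode"],
                           "class_methods": [], "inst_methods": ["step", "tune"]},
        }

        def size_measures(model):
            """Sum the four size measures over all classes in the diagram."""
            return {
                "size_of_class_variables": sum(len(c["class_vars"]) for c in model.values()),
                "size_of_class_methods": sum(len(c["class_methods"]) for c in model.values()),
                "size_of_instance_variables": sum(len(c["inst_vars"]) for c in model.values()),
                "size_of_instance_methods": sum(len(c["inst_methods"]) for c in model.values()),
            }

        print(size_measures(classes))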

  10. Meta-tools for software development and knowledge acquisition

    NASA Technical Reports Server (NTRS)

    Eriksson, Henrik; Musen, Mark A.

    1992-01-01

    The effectiveness of tools that provide support for software development is highly dependent on the match between the tools and their task. Knowledge-acquisition (KA) tools constitute a class of development tools targeted at knowledge-based systems. Generally, KA tools that are custom-tailored for particular application domains are more effective than general KA tools that cover a large class of domains. The high cost of custom-tailoring KA tools manually has encouraged researchers to develop meta-tools for KA tools. Current research issues in meta-tools for knowledge acquisition are the specification styles, or meta-views, used for target KA tools, and the relationships between the specification entered in the meta-tool and other specifications for the target program under development. We examine different types of meta-views and meta-tools. Our current project is to provide meta-tools that produce KA tools from multiple specification sources--for instance, from a task analysis of the target application.

  11. Software tool for data mining and its applications

    NASA Astrophysics Data System (ADS)

    Yang, Jie; Ye, Chenzhou; Chen, Nianyi

    2002-03-01

    A software tool for data mining is introduced, which integrates pattern recognition (PCA, Fisher, clustering, hyperenvelop, regression), artificial intelligence (knowledge representation, decision trees), statistical learning (rough set, support vector machine), and computational intelligence (neural network, genetic algorithm, fuzzy systems). It consists of nine function models: pattern recognition, decision trees, association rule, fuzzy rule, neural network, genetic algorithm, Hyper Envelop, support vector machine, and visualization. The principle and knowledge representation of some function models of data mining are described. The software tool was realized in Visual C++ under Windows 2000. Non-monotonicity in data mining is dealt with by concept hierarchy and layered mining. The software tool has been applied satisfactorily to the prediction of regularities of the formation of ternary intermetallic compounds in alloy systems and to the diagnosis of brain glioma.

  12. DEVICE CONTROL TOOL FOR CEBAF BEAM DIAGNOSTICS SOFTWARE

    SciTech Connect

    Pavel Chevtsov

    2008-02-11

    By continuously monitoring the beam quality in the CEBAF accelerator, a variety of beam diagnostics software created at Jefferson Lab makes a significant contribution to the very high availability of the machine for nuclear physics experiments. The interface between this software and beam instrumentation hardware components is provided by a device control tool optimized for beam diagnostics tasks. As part of the device/driver development framework at Jefferson Lab, this tool is very easy to support and extend to integrate new beam instrumentation components. All device control functions are based on configuration (ASCII text) files that completely define the hardware interface standards used (CAMAC, VME, RS-232, GPIB, etc.) and the communication protocols. The paper presents the main elements of the device control tool for beam diagnostics software at Jefferson Lab.
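
    The configuration-file approach can be pictured with a toy device file and reader; the section and key names below are invented, not Jefferson Lab's actual format.

        import configparser
        import textwrap

        # Hypothetical ASCII device configuration in the spirit of the abstract.
        CFG = textwrap.dedent("""
            [bpm_1]
            bus = VME
            address = 0x3F00

            [scope_2]
            bus = RS-232
            port = /dev/ttyS1
            baud = 9600
        """)

        parser = configparser.ConfigParser()
        parser.read_string(CFG)
        for device in parser.sections():
            print(device, dict(parser[device]))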

  13. Cumulative Aggregate Risk Evaluation Software

    EPA Science Inventory

    CARES is a state-of-the-art software program designed to conduct complex exposure and risk assessments for pesticides, such as the assessments required under the 1996 Food Quality Protection Act (FQPA). CARES was originally developed under the auspices of CropLife America (CLA),...

  14. HALOE test and evaluation software

    NASA Technical Reports Server (NTRS)

    Edmonds, W.; Natarajan, S.

    1987-01-01

    Computer programming, system development and analysis efforts during this contract were carried out in support of the Halogen Occultation Experiment (HALOE) at NASA/Langley. Support in the major areas of data acquisition and monitoring, data reduction and system development are described along with a brief explanation of the HALOE project. Documented listings of major software are located in the appendix.

  15. Concepts and tools for the software life cycle

    NASA Astrophysics Data System (ADS)

    Tausworthe, Robert C.

    1985-10-01

    The life cycle process for large software-intensive systems is an extremely intricate and complex process involving many people performing amid a very large base of evolving computer programs, documentation and data. To be successful, the process must be well conceived, planned and conducted; however, the nature of scientific and other high-technology projects involving large-scale software is such that conceptualization, planning and implementation to the degree of detail required is so labor-intensive and unmotivating as to be counter-productive and seldom cost-effective. The tools, techniques and aids needed to engineer, manage and administrate a large software-intensive task are themselves parts of a large software base, and are incurred only at great expense. This paper focuses on the needs of the software life cycle in terms of supporting tools and methodologies. The concept of a distributed network for engineering, management and administrative functions is outlined, and the key characteristics of localized subnets in high-communications-traffic areas of software activity are discussed. A formal, deliberate, structured, systems-engineered approach toward the construction of uniform, coordinated tools is proposed as a means to reduce development and maintenance costs, foster creativity, enhance reliability, promote standardization and sustain human motivation.

  16. TRAVIT: software tool to simulate dry etch in maskmaking

    NASA Astrophysics Data System (ADS)

    Babin, S.; Bay, K.; Okulovsky, S.

    2005-06-01

    A software tool, TRAVIT, has been developed to simulate dry etch in maskmaking. The software predicts the etch profile, etched critical dimensions (CDs), and CD variation for any pattern of interest. The software also takes into account the microloading effect, which is pattern dependent and contributes to CD variation. Once the CD variation is known, it can be applied to correct the CD error. Examples of simulations, including variable ICP power, physical and chemical etch components, and optimization of a bias and CD variation, are presented. Incorporating simulation into the maskmaking process can save cost and shorten the time to production.

  17. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  18. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  19. Knowledge engineering software: A demonstration of a high end tool

    SciTech Connect

    Salzman, G.C.; Krall, R.B.; Marinuzzi, J.G.

    1987-01-01

    Many investigators wanting to apply knowledge-based systems (KBS) as consultants for cancer diagnosis have turned to tools running on personal computers. While some of these tools serve well for small tasks, they lack the power available with the high end KBS tools such as KEE (Knowledge Engineering Environment) and ART (Automated Reasoning Tool). These tools were originally developed on Lisp machines and have the full functionality of the Lisp language as well as many additional features. They provide a rich and highly productive environment for the software developer. To illustrate the capability of one of these high end tools we have converted a table showing the classification of benign soft tissue tumors into a KEE knowledge base. We have used the tools available in Kee to identify the tumor type for a hypothetical patient. 10 figs.

  20. Automated software development tools in the MIS (Management Information Systems) environment

    SciTech Connect

    Arrowood, L.F.; Emrich, M.L.

    1987-09-11

    Quantitative and qualitative benefits can be obtained through the use of automated software development tools. Such tools are best utilized when they complement existing procedures and standards. They can assist systems analysts and programmers with project specification, design, implementation, testing, and documentation. Commercial products have been evaluated to determine their efficacy. User comments have been included to illustrate actual benefits derived from introducing these tools into MIS organizations.

  1. Computer software management, evaluation, and dissemination

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The activities of the Computer Software Management and Information Center involving the collection, processing, and distribution of software developed under the auspices of NASA and certain other federal agencies are reported. Program checkout and evaluation, inventory control, customer services and marketing, dissemination, program maintenance, and special development tasks are discussed.

  2. Evaluation as a Learning Tool

    ERIC Educational Resources Information Center

    Feinstein, Osvaldo Nestor

    2012-01-01

    Evaluation of programs or projects is often perceived as a threat. This is to a great extent related to the anticipated use of evaluation for accountability, which is often prioritized at the expense of using evaluation as a learning tool. Frequently it is argued that there is a trade-off between these two evaluation functions. An alternative…

  3. Design and implementation of the mobility assessment tool: software description

    PubMed Central

    2013-01-01

    Background: In previous work, we described the development of an 81-item video-animated tool for assessing mobility. In response to criticism levied during a pilot study of this tool, we sought to develop a new version built upon a flexible framework for designing and administering the instrument. Results: Rather than constructing a self-contained software application with a hard-coded instrument, we designed an XML schema capable of describing a variety of psychometric instruments. The new version of our video-animated assessment tool was then defined fully within the context of a compliant XML document. Two software applications—one built in Java, the other in Objective-C for the Apple iPad—were then built that could present the instrument described in the XML document and collect participants’ responses. Separating the instrument’s definition from the software application implementing it allowed for rapid iteration and easy, reliable definition of variations. Conclusions: Defining instruments in a software-independent XML document simplifies the process of defining instruments and variations and allows a single instrument to be deployed on as many platforms as there are software applications capable of interpreting the instrument, thereby broadening the potential target audience for the instrument. Continued work will be done to further specify and refine this type of instrument specification with a focus on spurring adoption by researchers in gerontology and geriatric medicine. PMID:23879716
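
    The approach can be pictured with a toy instrument document and reader. The element and attribute names below are invented, not the authors' actual XML schema.

        import xml.etree.ElementTree as ET

        # Hypothetical instrument document: the XML, not the application,
        # defines the items and their response options.
        DOC = """
        <instrument name="mobility-demo">
          <item id="1" video="climb_stairs.mp4">
            <prompt>Can you do what the animated figure is doing?</prompt>
            <option value="0">No</option>
            <option value="1">Yes, with difficulty</option>
            <option value="2">Yes, easily</option>
          </item>
        </instrument>
        """

        root = ET.fromstring(DOC)
        for item in root.findall("item"):
            print(item.get("id"), item.find("prompt").text)
            for opt in item.findall("option"):
                print("  ", opt.get("value"), opt.text)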

  4. Proposing a Mathematical Software Tool in Physics Secondary Education

    ERIC Educational Resources Information Center

    Baltzis, Konstantinos B.

    2009-01-01

    MathCad® is a very popular software tool for mathematical and statistical analysis in science and engineering. Its low cost, ease of use, extensive function library, and worksheet-like user interface distinguish it among other commercial packages. Its features are also well suited to educational process. The use of natural mathematical notation…

  5. Using Software Tools to Automate the Assessment of Student Programs.

    ERIC Educational Resources Information Center

    Jackson, David

    1991-01-01

    Argues that advent of computer-aided instruction (CAI) systems for teaching introductory computer programing makes it imperative that software be developed to automate assessment and grading of student programs. Examples of typical student programing problems are given, and application of the Unix tools Lex and Yacc to the automatic assessment of…

  6. Understanding Computation of Impulse Response in Microwave Software Tools

    ERIC Educational Resources Information Center

    Potrebic, Milka M.; Tosic, Dejan V.; Pejovic, Predrag V.

    2010-01-01

    In modern microwave engineering curricula, the introduction of the many new topics in microwave industrial development, or of software tools for design and simulation, sometimes results in students having an inadequate understanding of the fundamental theory. The terminology for and the explanation of algorithms for calculating impulse response in…

  7. Role of Social Software Tools in Education: A Literature Review

    ERIC Educational Resources Information Center

    Minocha, Shailey

    2009-01-01

    Purpose: The purpose of this paper is to provide a review of literature on the role of Web 2.0 or social software tools in education. Design/methodology/approach: This paper is a critical and comprehensive review of a range of literature sources (until January 2009) addressing the various issues related to the educator's perspective of pedagogical…

  8. Simple tools and software for precision weed mapping

    Technology Transfer Automated Retrieval System (TEKTRAN)

    If you have a color digital camera and a handheld GPS unit, you can map weed problems in your fields. German researchers are perfecting technology to map weed species and density with digital cameras for precision herbicide application. ...

  9. GenePRIMP: A software quality control tool

    SciTech Connect

    Amrita Pati

    2010-05-05

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  10. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2010-09-01

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  11. New generation of exploration tools: interactive modeling software and microcomputers

    SciTech Connect

    Krajewski, S.A.

    1986-08-01

    Software packages offering interactive modeling techniques are now available for use on microcomputer hardware systems. These packages are reasonably priced for both company and independent explorationists; they do not require users to have high levels of computer literacy; they are capable of rapidly completing complex ranges of sophisticated geologic and geophysical modeling tasks; and they can produce presentation-quality output for comparison with real-world data. For example, interactive packages are available for mapping, log analysis, seismic modeling, reservoir studies, and financial projects as well as for applying a variety of statistical and geostatistical techniques to analysis of exploration data. More importantly, these packages enable explorationists to directly apply their geologic expertise when developing and fine-tuning models for identifying new prospects and for extending producing fields. As a result of these features, microcomputers and interactive modeling software are becoming common tools in many exploration offices. Gravity and magnetics software programs illustrate some of the capabilities of such exploration tools.

  12. Evaluation of a Human Modeling Software Tool in the Prediction of Extra Vehicular Activity Tasks for an International Space Station Assembly Mission

    NASA Technical Reports Server (NTRS)

    Dischinger, H. Charles; Loughead, Tomas E.

    1997-01-01

    The difficulty of accomplishing work in extravehicular activity (EVA) is well documented. It arises as a result of motion constraints imposed by a pressurized spacesuit in a near-vacuum and of the frictionless environment induced in microgravity. The appropriate placement of foot restraints is crucial to ensuring that astronauts can remove and drive bolts, mate and demate connectors, and actuate levers. The location on structural members of the foot restraint sockets, to which the portable foot restraint is attached, must provide for an orientation of the restraint that affords the astronaut adequate visual and reach envelopes. Previously, the initial location of these sockets was dependent upon the experienced designer's ability to estimate placement. The design was tested in a simulated zero-gravity environment; spacesuited astronauts performed the tasks with mockups while submerged in water. Crew evaluation of the tasks based on these designs often indicated the bolt or other structure to which force needed to be applied was not within an acceptable work envelope, resulting in redesign. The development of improved methods for location of crew aids prior to testing would result in savings to the design effort for EVA hardware. Such an effort to streamline EVA design is especially relevant to International Space Station construction and maintenance. Assembly operations alone are expected to require in excess of four hundred hours of EVA. Thus, techniques which conserve design resources for assembly missions can have significant impact. We describe an effort to implement a human modelling application in the design effort for an International Space Station Assembly Mission. On Assembly Flight 6A, the Canadian-built Space Station Remote Manipulator System will be delivered to the U.S. Laboratory. It will be released from its launch restraints by astronauts in EVA. The design of the placement of foot restraint sockets was carried out using the human model Jack, and

  13. Computerized nursing staffing: a software evaluation.

    PubMed

    Pereira, Irene Mari; Gaidzinski, Raquel Rapone; Fugulin, Fernanda Maria Togeiro; Peres, Heloísa Helena Ciqueto; Lima, Antônio Fernandes Costa; Castilho, Valéria; Mira, Vera Lúcia; Massarollo, Maria Cristina Komatsu Braga

    2011-12-01

    The complexity involved in operationalizing the method for nursing staffing, in view of the countless variables related to identifying the workload, the effective working time of the staff, and the Technical Security Index (TSI), revealed the need to develop a software program named Computerized Nursing Staffing (DIPE, in its Portuguese acronym). This exploratory, descriptive study was performed with the objective of evaluating the technical quality and functional performance of DIPE. Participants were eighteen evaluators, ten of whom were nurse faculty or nurse hospital unit managers, and eight health informatics experts. The software evaluation was performed according to norm NBR ISO/IEC 9126-1, considering the features functionality, reliability, usability, efficiency, and maintainability. The software evaluation reached positive results, with agreement among the evaluators for all the evaluated features. The reported suggestions are important for further improving and enhancing DIPE. PMID:22282068

  14. A Software Tool for Integrated Optical Design Analysis

    NASA Technical Reports Server (NTRS)

    Moore, Jim; Troy, Ed; DePlachett, Charles; Montgomery, Edward (Technical Monitor)

    2001-01-01

    Design of large precision optical systems requires multi-disciplinary analysis, modeling, and design. Thermal, structural and optical characteristics of the hardware must be accurately understood in order to design a system capable of accomplishing the performance requirements. The interactions between each of the disciplines become stronger as systems are designed lighter weight for space applications. This coupling dictates a concurrent engineering design approach. In the past, integrated modeling tools have been developed that attempt to integrate all of the complex analysis within the framework of a single model. This often results in modeling simplifications, and it requires engineering specialists to learn new applications. The software described in this presentation addresses the concurrent engineering task using a different approach. The software tool, Integrated Optical Design Analysis (IODA), uses data fusion technology to enable a cross-discipline team of engineering experts to concurrently design an optical system using their standard validated engineering design tools.

  15. EPA's evaluation of utility emissions data submitted under Title IV: Application of software tools in an operational data quality assurance program

    SciTech Connect

    Hillock, C.S.; Wockenfuss, M.E.

    1995-12-31

    Title IV (Acid Deposition Control) of the Clean Air Act Amendments of 1990 requires annual reductions of 10 million tons of sulfur dioxide and substantial reductions of nitrogen oxides from electric utilities. These reductions will occur in two phases, with Phase 1 beginning in 1995. To ensure the reduction goals are met, affected utilities must monitor their emissions, perform quality assurance and quality control tests, and report the data to EPA as required by 40 CFR Part 75. EPA's Emissions Tracking System (ETS) was developed to analyze all submitted data reports and to provide the annual emissions data needed to determine whether utilities comply with their allowable SO2 emissions. EPA received the first quarterly reports from Phase 1 utilities at the end of January 1994. A substantial number of these initial reports exhibited major problems, and EPA required many utilities to improve and resubmit their reports. Data reports for the following three quarters of 1994 showed continual improvement as utilities gained experience in operating and improving their monitoring hardware and software and as EPA clarified published guidance. Helpful utility comments enabled EPA to correct and refine ETS. Phase 2 will affect approximately 700 additional plants. These plants will begin reporting their emissions data to EPA at the beginning of May 1995. EPA is improving data handling and processing procedures and expanding the ETS software system capabilities to receive and process the substantial increase in submitted data during 1995.

  16. Software Platform Evaluation - Verifiable Fuel Cycle Simulation (VISION) Model

    SciTech Connect

    J. J. Jacobson; D. E. Shropshire; W. B. West

    2005-11-01

    The purpose of this Software Platform Evaluation (SPE) is to document the top-level evaluation of potential software platforms on which to construct a simulation model that satisfies the requirements for a Verifiable Fuel Cycle Simulation Model (VISION) of the Advanced Fuel Cycle (AFC). See the Software Requirements Specification for Verifiable Fuel Cycle Simulation (VISION) Model (INEEL/EXT-05-02643, Rev. 0) for a discussion of the objective and scope of the VISION model. VISION is intended to serve as a broad systems analysis and study tool applicable to work conducted as part of the AFCI (including costs estimates) and Generation IV reactor development studies. This document will serve as a guide for selecting the most appropriate software platform for VISION. This is a “living document” that will be modified over the course of the execution of this work.

  17. Development of a software tool for an internal dosimetry using MIRD method

    NASA Astrophysics Data System (ADS)

    Chaichana, A.; Tocharoenchai, C.

    2016-03-01

    Currently, many software packages for internal radiation dosimetry have been developed, but many of them do not provide sufficient tools to perform all of the necessary steps from nuclear medicine image analysis to dose calculation. For this reason, we developed CALRADDOSE, a software tool that performs internal dosimetry using the MIRD method within a single environment. MATLAB version 2015a was used as the development tool. The calculation process of this software proceeds from collecting time-activity data from image data, followed by residence time calculation and absorbed dose calculation using the MIRD method. To evaluate the accuracy of this software, we calculated residence times and absorbed doses of 5 Ga-67 studies and 5 I-131 MIBG studies and then compared the results with those obtained from the OLINDA/EXM software. The results showed no statistically significant differences between the two software packages. The CALRADDOSE software is a user-friendly, graphical user interface-based software for internal dosimetry. It provides fast and accurate results, which may be useful for routine work.
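
    In MIRD terms, the pipeline described amounts to integrating the organ time-activity curve to obtain cumulated activity and residence time, then multiplying by an S value to get absorbed dose. A minimal sketch with invented numbers (not data from the study), ignoring the extrapolated tail beyond the last sample:

        import numpy as np

        # Hypothetical organ time-activity samples: times (h), activity (MBq).
        t = np.array([1.0, 4.0, 24.0, 48.0, 72.0])
        a = np.array([12.0, 10.5, 6.2, 3.1, 1.4])

        A0 = 100.0                       # administered activity, MBq
        cumulated = np.trapz(a, t)       # MBq*h, area under the curve
        residence_time = cumulated / A0  # MIRD residence time tau, hours

        S = 2.0e-4                       # hypothetical S value, mGy per MBq*h
        dose = cumulated * S             # absorbed dose in the target organ, mGy
        print("tau = %.2f h, dose = %.2f mGy" % (residence_time, dose))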

  18. COSTMODL: An automated software development cost estimation tool

    NASA Technical Reports Server (NTRS)

    Roush, George B.

    1991-01-01

    The cost of developing computer software continues to consume an increasing portion of many organizations' total budgets, both in the public and private sector. As this trend develops, the capability to produce reliable estimates of the effort and schedule required to develop a candidate software product takes on increasing importance. The COSTMODL program was developed to provide an in-house capability to perform development cost estimates for NASA software projects. COSTMODL is an automated software development cost estimation tool which incorporates five cost estimation algorithms, including the latest models for the Ada language and incrementally developed products. The principal characteristic which sets COSTMODL apart from other software cost estimation programs is its capacity to be completely customized to a particular environment. The estimation equations can be recalibrated to reflect the programmer productivity characteristics demonstrated by the user's organization, and the set of significant factors which affect software development costs can be customized to reflect any unique properties of the user's development environment. Careful use of a capability such as COSTMODL can significantly reduce the risk of cost overruns and failed projects.
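
    The recalibration capability that sets COSTMODL apart can be illustrated by refitting the coefficients of a COCOMO-style power law, effort = a * KLOC**b, to an organization's own completed projects. The historical data below are invented, and this is a generic sketch rather than COSTMODL's actual procedure.

        import numpy as np

        # Hypothetical completed projects: size (KLOC), actual effort (person-months).
        kloc = np.array([12.0, 25.0, 48.0, 90.0])
        effort = np.array([38.0, 86.0, 178.0, 360.0])

        # Fit effort = a * KLOC**b by least squares in log space.
        b, log_a = np.polyfit(np.log(kloc), np.log(effort), deg=1)
        a = np.exp(log_a)

        def estimate(size_kloc):
            """Effort estimate (person-months) from the recalibrated power law."""
            return a * size_kloc ** b

        print("a = %.2f, b = %.2f, estimate(60 KLOC) = %.0f PM" % (a, b, estimate(60.0)))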

  19. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
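
    The Petri-net side rests on the standard token-firing rule: a transition is enabled when all of its input places hold tokens, and firing consumes those tokens and produces tokens in the output places. A minimal sketch of that rule (the net and its names are invented, not DFPN's internal representation):

        # Places hold token counts; transitions name their input and output places.
        marking = {"request_received": 1, "worker_idle": 1, "response_sent": 0}
        transitions = {
            "handle_request": {"in": ["request_received", "worker_idle"],
                               "out": ["response_sent"]},
        }

        def enabled(name):
            return all(marking[p] > 0 for p in transitions[name]["in"])

        def fire(name):
            assert enabled(name), name + " is not enabled"
            for p in transitions[name]["in"]:
                marking[p] -= 1
            for p in transitions[name]["out"]:
                marking[p] += 1

        fire("handle_request")
        print(marking)  # {'request_received': 0, 'worker_idle': 0, 'response_sent': 1}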

  20. An expert system based software sizing tool, phase 2

    NASA Technical Reports Server (NTRS)

    Friedlander, David

    1990-01-01

    A software tool was developed for predicting the size of a future computer program at an early stage in its development. The system is intended to enable a user who is not expert in Software Engineering to estimate software size in lines of source code with an accuracy similar to that of an expert, based on the program's functional specifications. The project was planned as a knowledge based system with a field prototype as the goal of Phase 2 and a commercial system planned for Phase 3. The researchers used techniques from Artificial Intelligence and knowledge from human experts and existing software from NASA's COSMIC database. They devised a classification scheme for the software specifications, and a small set of generic software components that represent complexity and apply to large classes of programs. The specifications are converted to generic components by a set of rules and the generic components are input to a nonlinear sizing function which makes the final prediction. The system developed for this project predicted code sizes from the database with a bias factor of 1.06 and a fluctuation factor of 1.77, an accuracy similar to that of human experts but without their significant optimistic bias.

  1. The MineTool Software Suite: A Novel Data Mining Palette of Tools for Automated Modeling of Space Physics Data

    NASA Astrophysics Data System (ADS)

    Sipes, T.; Karimabadi, H.; Roberts, A.

    2009-12-01

    We present a new data mining software tool called MineTool for the analysis and modeling of space physics data. MineTool is a graphical user interface implementation that merges two data mining algorithms into an easy-to-use software tool: an algorithm for analysis and modeling of static data [Karimabadi et al, 2007] and MineTool-TS, an algorithm for data mining of time series data [Karimabadi et al, 2009]. By virtue of automating the modeling process and model evaluations, MineTool makes data mining and predictive modeling more accessible to non-experts. The software is written entirely in Java and is freeware. By ranking all inputs as predictors of the outcome before constructing a model, MineTool also enables inclusion of only relevant variables. The technique aggregates the various stages of model building into a four-step process consisting of (i) data segmentation and sampling, (ii) variable pre-selection and transform generation, (iii) predictive model estimation and validation, and (iv) final model selection. Optimal strategies are chosen for each modeling step. A notable feature of the technique is that the final model is always in closed analytical form rather than the “black box” form characteristic of some other techniques. Having the analytical model enables deciphering the importance of various variables in affecting the outcome. The MineTool suite also provides capabilities for preparing data for data mining as well as for visualization of the datasets. MineTool has successfully been used to develop models for automated detection of flux transfer events (FTEs) at Earth’s magnetopause in the Cluster spacecraft time series data and for 3D magnetopause modeling. In this presentation, we demonstrate the ease of use of the software through examples, including how it was used in the FTE problem.
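
    The four-step process maps onto a familiar supervised-learning workflow. The sketch below uses scikit-learn purely as a stand-in for MineTool's own algorithms; it is not MineTool's implementation, and unlike MineTool it does not emit a closed-form analytical model.

        from sklearn.datasets import make_classification
        from sklearn.feature_selection import SelectKBest, f_classif
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score, train_test_split
        from sklearn.pipeline import Pipeline

        # (i) data segmentation and sampling
        X, y = make_classification(n_samples=500, n_features=20, random_state=0)
        X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

        # (ii) variable pre-selection: rank inputs, keep the most predictive
        # (iii) predictive model estimation and validation
        model = Pipeline([
            ("select", SelectKBest(f_classif, k=8)),
            ("clf", LogisticRegression(max_iter=1000)),
        ])
        scores = cross_val_score(model, X_train, y_train, cv=5)

        # (iv) final model selection against held-out data
        model.fit(X_train, y_train)
        print("cv %.2f, test %.2f" % (scores.mean(), model.score(X_test, y_test)))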

  2. Software Selection: A Primer on Source and Evaluation.

    ERIC Educational Resources Information Center

    Burston, Jack

    2003-01-01

    Provides guidance on making decisions regarding the selection of foreign language instructional software. Identifies sources of foreign language software, indicates sources of foreign language software reviews, and outlines essential procedures of software evaluation. (Author/VWL)

  3. Evaluating Instructional Software for the Microcomputer.

    ERIC Educational Resources Information Center

    Cohen, Vicki L. Blum

    In order to develop a systematic procedure for the evaluation and revision of educational software for microcomputers, a study was undertaken by the Educational Products Information Exchange (EPIE) Institute and the Microcomputer Resource Center at Columbia University to define the criteria that are needed to evaluate instructional microcomputer…

  4. Construction of an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Friedland, Peter; Keller, Richard M.; Mckay, Christopher P.; Sims, Michael H.; Thompson, David E.

    1992-01-01

    Scientific model-building can be a time intensive and painstaking process, often involving the development of large complex computer programs. Despite the effort involved, scientific models cannot be distributed easily and shared with other scientists. In general, implemented scientific models are complicated, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing, using and sharing models. The proposed tool will include an interactive intelligent graphical interface and a high-level domain-specific modeling language. As a test bed for this research, we propose to develop a software prototype in the domain of planetary atmospheric modeling.

  5. Constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael; Podolak, Ester; Mckay, Christopher

    1990-01-01

    Scientific model building can be an intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot be easily distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. In this paper, we describe a prototype for a scientific modeling software tool that serves as an aid to the scientist in developing and using models. This tool includes an interactive intelligent graphical interface, a high level domain specific modeling language, a library of physics equations and experimental datasets, and a suite of data display facilities. Our prototype has been developed in the domain of planetary atmospheric modeling, and is being used to construct models of Titan's atmosphere.

  6. Westinghouse Waste Simulation and Optimization Software Tool - 13493

    SciTech Connect

    Mennicken, Kim; Aign, Joerg

    2013-07-01

    Radioactive waste is produced during NPP operation and NPP D&D. Different kinds of waste, with different volumes and properties, have to be treated. Finding a technically and commercially optimized waste treatment concept is a difficult and time-consuming process. The Westinghouse waste simulation and optimization software tool is an approach to study the total life cycle cost of any waste management facility. The tool enables the user of the simulation and optimization software to plan processes and storage buildings and to identify bottlenecks in the overall waste management design before starting detailed planning activities. Furthermore, application of the software enables the user to optimize the number of treatment systems, to determine the minimum design capacity for onsite storage facilities, to identify bottlenecks in the overall design, and to identify the most cost-effective treatment paths by maintaining optimal waste treatment technologies. In combination with proven waste treatment equipment and integrated waste management solutions, the waste simulation and optimization software provides reliable qualitative results that lead to effective planning and minimization of the total project planning risk of any waste management activity. (authors)

  7. Software Certification for Temporal Properties With Affordable Tool Qualification

    NASA Technical Reports Server (NTRS)

    Xia, Songtao; DiVito, Benedetto L.

    2005-01-01

    It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.

  8. Management of an affiliated Physics Residency Program using a commercial software tool.

    PubMed

    Zacarias, Albert S; Mills, Michael D

    2010-01-01

    A review of commercially available allied health educational management software tools was performed to evaluate their capacity to manage program data associated with a CAMPEP-accredited Therapy Physics Residency Program. Features of these software tools include: a) didactic course reporting and organization, b) competency reporting by topic, category and didactic course, c) student time management and accounting, and d) student patient case reporting by topic, category and course. The software package includes features for recording school administrative information; setting up lists of courses, faculty, clinical sites, categories, competencies, and time logs; and the inclusion of standardized external documents. There are provisions for developing evaluation and survey instruments. The mentors and program may be evaluated by residents, and residents may be evaluated by faculty members using this feature. Competency documentation includes the time spent on the problem or with the patient, time spent with the mentor, date of the competency, and approval by the mentor and program director. Course documentation includes course and lecture title, lecturer, topic information, date of lecture and approval by the Program Director. These software tools have the facility to include multiple clinical sites, with local subadministrators having the ability to approve competencies and attendance at clinical conferences. In total, these software tools have the capability of managing all components of a CAMPEP-accredited residency program. The application database lends the software to the support of multiple affiliated clinical sites within a single residency program. Such tools are a critical and necessary component if the medical physics profession is to meet the projected needs for qualified medical physicists in future years. PMID:20717075

  9. NTRFinder: a software tool to find nested tandem repeats.

    PubMed

    Matroud, Atheer A; Hendy, M D; Tuffley, C P

    2012-02-01

    We introduce the software tool NTRFinder to search for a complex repetitive structure in DNA we call a nested tandem repeat (NTR). An NTR is a recurrence of two or more distinct tandem motifs interspersed with each other. We propose that NTRs can be used as phylogenetic and population markers. We have tested our algorithm on both real and simulated data, and present some real NTRs of interest. NTRFinder can be downloaded from http://www.maths.otago.ac.nz/~aamatroud/. PMID:22121222
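
    For contrast with the nested case, a plain tandem repeat of a single motif can be found with a short regular-expression scan like the toy below; NTRFinder's actual algorithm, which handles interleaved motifs, is substantially more involved.

      # Toy scan for simple tandem repeats (two or more adjacent copies
      # of a short motif); this is not the NTRFinder algorithm.
      import re

      def tandem_repeats(seq, min_len=2, max_len=4):
          for k in range(min_len, max_len + 1):
              # (.{k}) captures a motif, \1+ requires adjacent repeats
              for m in re.finditer(r"(.{%d})\1+" % k, seq):
                  yield m.start(), m.group(1), len(m.group(0)) // k

      dna = "GGATCATCATCGGTTAGTTAGCC"
      for pos, motif, copies in tandem_repeats(dna):
          print(f"motif {motif} x{copies} at {pos}")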

  10. An Open Source Software Tool for Hydrologic Climate Change Assessment

    NASA Astrophysics Data System (ADS)

    Park, Dong Kwan; Shin, Mun-Ju; Kim, Young-Oh

    2015-04-01

    With the Intergovernmental Panel on Climate Change (IPCC) regularly publishing Climate Change Assessment Reports containing updated forecasts and scenarios, it is necessary to periodically perform hydrologic assessment studies on these scenarios. Practical users, including scientists and government agencies, need convenient tools that carry climate input data, from historical observations and climate change scenarios, through rainfall-runoff simulation and assessment. We propose HydroCAT (Hydrologic Climate change Assessment Tool), a flexible software tool designed to simplify and streamline hydrologic climate change assessment studies by incorporating: climate input values from general circulation models using the latest climate change scenarios; simulation of downscaled values using statistical downscaling methods; calibration and simulation of well-known lumped conceptual hydrologic models; and assessment of results using statistical methods. HydroCAT is an open-source, R-based package with an operating framework that supports a wide range of data formats, hydrologic models, and climate change scenarios. Its use is demonstrated in a case study of the Geum River basin in the Republic of Korea.
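
    One of the building blocks named above, a lumped conceptual rainfall-runoff model, can be caricatured as a single bucket; the sketch below is an invented toy, not one of the models HydroCAT wraps.

      # Toy single-bucket lumped rainfall-runoff model (illustrative only).
      def bucket_model(precip, pet, capacity=150.0, k=0.05):
          """Daily water balance: storage fills with rain, loses ET,
          spills when full, and drains linearly at rate k."""
          storage, runoff = 0.0, []
          for p, e in zip(precip, pet):
              storage = max(0.0, storage + p - e)
              spill = max(0.0, storage - capacity)
              storage -= spill
              baseflow = k * storage
              storage -= baseflow
              runoff.append(spill + baseflow)
          return runoff

      print(bucket_model([10, 80, 120, 0, 0], [3, 3, 3, 3, 3]))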

  11. Using Verbal Protocol Methodology in the Evaluation of Software and Hardware.

    ERIC Educational Resources Information Center

    Mathison, Sandra; Meyer, Tricia R.; Vargas, Juan D.

    1999-01-01

    Describes verbal protocols as a useful tool for evaluating computer hardware and software, especially if informed by activity theory. Such protocols cannot, however, stand alone in a thorough evaluation design. (Author/SLD)

  12. Tool Support for Parametric Analysis of Large Software Simulation Systems

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Gundy-Burlet, Karen; Pasareanu, Corina; Menzies, Tim; Barrett, Tony

    2008-01-01

    The analysis of large and complex parameterized software systems, e.g., systems simulation in aerospace, is very complicated and time-consuming due to the large parameter space, and the complex, highly coupled nonlinear nature of the different system components. Thus, such systems are generally validated only in regions local to anticipated operating points rather than through characterization of the entire feasible operational envelope of the system. We have addressed the factors deterring such an analysis with a tool to support envelope assessment: we utilize a combination of advanced Monte Carlo generation with n-factor combinatorial parameter variations to limit the number of cases, but still explore important interactions in the parameter space in a systematic fashion. Additional test-cases, automatically generated from models (e.g., UML, Simulink, Stateflow) improve the coverage. The distributed test runs of the software system produce vast amounts of data, making manual analysis impossible. Our tool automatically analyzes the generated data through a combination of unsupervised Bayesian clustering techniques (AutoBayes) and supervised learning of critical parameter ranges using the treatment learner TAR3. The tool has been developed around the Trick simulation environment, which is widely used within NASA. We will present this tool with a GN&C (Guidance, Navigation and Control) simulation of a small satellite system.
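
    The n-factor combinatorial idea can be illustrated for n = 2 (pairwise coverage): rather than running the full cross product of parameter values, keep only test cases that cover a not-yet-seen pair. The greedy cover below is a simplified stand-in for the tool's generator, with invented parameters.

      # Greedy pairwise (n=2) combinatorial test-case selection sketch.
      from itertools import combinations, product

      params = {"mass": [1.0, 2.0], "gain": [0.1, 0.5, 0.9], "mode": ["A", "B"]}
      names = list(params)

      # every (parameter-pair, value-pair) combination that must appear
      uncovered = {((a, va), (b, vb))
                   for a, b in combinations(names, 2)
                   for va, vb in product(params[a], params[b])}

      cases = []
      for case in product(*params.values()):      # candidate test cases
          assign = dict(zip(names, case))
          newly = {pair for pair in uncovered
                   if all(assign[p] == v for p, v in pair)}
          if newly:                               # keep only cases that
              cases.append(assign)                # cover something new
              uncovered -= newly

      print(len(cases), "cases cover all pairs")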

  13. CAD/CAM software for an industrial laser manufacturing tool

    NASA Astrophysics Data System (ADS)

    Stassen Boehlen, Ines; Fieret, Jim; Holmes, Andrew S.; Lee, Kin W.

    2003-07-01

    A facility for rapid prototyping of MEMS devices is crucial for the development of novel miniaturized components in all sectors of high-tech industry, e.g. telecommunications, information technology, micro-optics and aerospace. To overcome the disadvantages of existing techniques in terms of cost and flexibility, a new approach has been taken to provide a tool for rapid prototyping and small-scale production: Complex CAD/CAM software has been developed that automatically generates the tool paths according to a CAD drawing of the MEMS device. As laser ablation is a much more complicated process than mechanical machining, for which such software has already been in use for many years, the generation of these tool paths relies not only on geometric considerations, but also on a sophisticated simulation module taking into account various material and laser parameters and micro-effects. The following laser machining options have been implemented: cutting, hole drilling, slot cutting, 2D area clearing, pocketing and 2½D surface machining. Once the tool paths are available, a post processor translates this information into CNC commands that control a scanner head. This scanner head then guides the beam of a UV solid-state laser to machine the desired structure by direct laser ablation.

  14. Classroom Live: a software-assisted gamification tool

    NASA Astrophysics Data System (ADS)

    de Freitas, Adrian A.; de Freitas, Michelle M.

    2013-06-01

    Teachers have come to rely on a variety of approaches in order to elicit and sustain student interest in the classroom. One particular approach, known as gamification, seeks to improve student engagement by transforming the traditional classroom experience into a competitive multiplayer game. Initial attempts at classroom gamification relied on the teacher manually tracking student progress. At the US Air Force Academy, we wanted to experiment with a software gamification tool. Our client/server suite, dubbed Classroom Live, streamlines the gamification process for the teacher by simplifying common tasks. Simultaneously, the tool provides students with an esthetically pleasing user interface that offers in-game rewards in exchange for their participation. Classroom Live is still in development, but our initial experience using the tool has been extremely positive and confirms our belief that students respond positively to gamification, even at the undergraduate level.

  15. MSP-Tool: a VBA-based software tool for the analysis of multispecimen paleointensity data

    NASA Astrophysics Data System (ADS)

    Monster, Marilyn; de Groot, Lennart; Dekkers, Mark

    2015-12-01

    The multispecimen protocol (MSP) is a method to estimate the Earth's magnetic field's past strength from volcanic rocks or archeological materials. By reducing the amount of heating steps and aligning the specimens parallel to the applied field, thermochemical alteration and multi-domain effects are minimized. We present a new software tool, written for Microsoft Excel 2010 in Visual Basic for Applications (VBA), that evaluates paleointensity data acquired using this protocol. In addition to the three ratios (standard, fraction-corrected and domain-state-corrected) calculated following Dekkers and Böhnel (2006) and Fabian and Leonhardt (2010) and a number of other parameters proposed by Fabian and Leonhardt (2010), it also provides several reliability criteria. These include an alteration criterion, whether or not the linear regression intersects the y axis within the theoretically prescribed range, and two directional checks. Overprints and misalignment are detected by isolating the remaining natural remanent magnetization (NRM) and the partial thermoremanent magnetization (pTRM) gained and comparing their declinations and inclinations. The NRM remaining and pTRM gained are then used to calculate alignment-corrected multispecimen plots. Data are analyzed using bootstrap statistics. The program was tested on lava samples that were given a full TRM and that acquired their pTRMs at angles of 0, 15, 30 and 90° with respect to their NRMs. MSP-Tool adequately detected and largely corrected these artificial alignment errors.
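
    The regression-plus-bootstrap portion can be sketched generically as below: fit a line to multispecimen ratios versus the applied laboratory field and bootstrap the field value at which the fitted ratio crosses zero. The data, sign convention, and acceptance criteria are invented placeholders, not MSP-Tool's VBA implementation.

      # Generic sketch: paleointensity as the zero crossing of a fitted
      # line, with a bootstrap confidence interval (invented data).
      import numpy as np

      lab_fields = np.array([10.0, 20.0, 30.0, 40.0, 50.0])   # microtesla
      ratios = np.array([-0.52, -0.23, 0.02, 0.24, 0.51])     # placeholder

      def paleointensity(h, q):
          slope, intercept = np.polyfit(h, q, 1)
          return -intercept / slope     # field at which the ratio is zero

      rng = np.random.default_rng(1)
      boot = []
      while len(boot) < 2000:
          idx = rng.integers(0, len(lab_fields), len(lab_fields))
          if np.unique(lab_fields[idx]).size > 1:  # need 2 distinct fields
              boot.append(paleointensity(lab_fields[idx], ratios[idx]))
      lo, hi = np.percentile(boot, [2.5, 97.5])
      print(f"estimate {paleointensity(lab_fields, ratios):.1f} uT,"
            f" 95% CI [{lo:.1f}, {hi:.1f}]")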

  16. COSTMODL - AN AUTOMATED SOFTWARE DEVELOPMENT COST ESTIMATION TOOL

    NASA Technical Reports Server (NTRS)

    Roush, G. B.

    1994-01-01

    The cost of developing computer software consumes an increasing portion of many organizations' budgets. As this trend continues, the capability to estimate the effort and schedule required to develop a candidate software product becomes increasingly important. COSTMODL is an automated software development estimation tool which fulfills this need. Adapting COSTMODL to any organization's particular environment can yield a significant reduction in the risk of cost overruns and failed projects. This user-customization capability is unmatched by any other available estimation tool. COSTMODL accepts a description of a software product to be developed and computes estimates of the effort required to produce it, the calendar schedule required, and the distribution of effort and staffing as a function of the defined set of development life-cycle phases. This is accomplished by the five cost estimation algorithms incorporated into COSTMODL: the NASA-developed KISS model; the Basic, Intermediate, and Ada COCOMO models; and the Incremental Development model. This choice affords the user the ability to handle project complexities ranging from small, relatively simple projects to very large projects. Unique to COSTMODL is the ability to redefine the life-cycle phases of development and the capability to display a graphic representation of the optimum organizational structure required to develop the subject project, along with required staffing levels and skills. The program is menu-driven and mouse sensitive with an extensive context-sensitive help system that makes it possible for a new user to easily install and operate the program and to learn the fundamentals of cost estimation without having prior training or separate documentation. The implementation of these functions, along with the customization feature, into one program makes COSTMODL unique within the industry. COSTMODL was written for IBM PC compatibles, and it requires Turbo Pascal 5.0 or later and Turbo
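
    As an illustration of one of the listed model families, the Basic COCOMO effort and schedule equations take the following form; the coefficients are Boehm's published values for the three development modes.

      # Basic COCOMO (Boehm): effort (person-months) = a * KLOC^b,
      # schedule (calendar months) = c * effort^d.
      COEFFS = {  # mode: (a, b, c, d)
          "organic":       (2.4, 1.05, 2.5, 0.38),
          "semi-detached": (3.0, 1.12, 2.5, 0.35),
          "embedded":      (3.6, 1.20, 2.5, 0.32),
      }

      def basic_cocomo(kloc, mode="organic"):
          a, b, c, d = COEFFS[mode]
          effort = a * kloc ** b        # person-months
          schedule = c * effort ** d    # calendar months
          return effort, schedule

      effort, months = basic_cocomo(32, "semi-detached")
      print(f"{effort:.0f} person-months over {months:.0f} months")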

  17. A NEO population generation and observation simulation software tool

    NASA Astrophysics Data System (ADS)

    Müller, Sven; Gelhaus, Johannes; Hahn, Gerhard; Franco, Raffaella

    One of the main targets of ESA's Space Situational Awareness (SSA) program is to build a wide knowledge base about objects that can potentially harm Earth (Near-Earth Objects, NEOs). An important part of this effort is to create the Small Bodies Data Centre (SBDC), which is going to aggregate measurement data from a fully integrated NEO observation sensor network. Until this network is developed, artificial NEO measurement data is needed in order to validate SBDC algorithms. Moreover, to establish a functioning NEO observation sensor network, it has to be determined where to place sensors, what technical requirements have to be met in order to be able to detect NEOs, and which observation strategies work best. Because of this, sensor simulation software was needed. This paper presents a software tool which allows users to create and analyse NEO populations and to simulate and analyse population observations. It is a console program written in Fortran and comes with a Graphical User Interface (GUI) written in Java and C. The tool can be divided into the components "Population Generator" and "Observation Simulator". The Population Generator component is responsible for generating and analysing a NEO population. Users can choose between creating fictitious (random) and synthetic populations. The latter are based on one of two models describing the orbital and size distribution of observed NEOs: the existing so-called "Bottke Model" (Bottke et al. 2000, 2002) and the new "Granvik Model" (Granvik et al. 2014, in preparation), which has been developed in parallel to the tool. Generated populations can be analysed by defining 2D, 3D and scatter plots using various NEO attributes. As a result, the tool creates the appropriate files for the plotting tool "gnuplot". The tool's Observation Simulator component provides the Observation Simulation and Observation Analysis functions. Users can define sensor systems using ground- or space-based locations as well as
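
    The "fictitious (random) population" option plus the gnuplot hand-off might look like the toy below; the element distributions are arbitrary placeholders, not the Bottke or Granvik models.

      # Toy random NEO population written as a gnuplot-ready data file.
      import random

      random.seed(42)
      with open("neo_population.dat", "w") as f:
          f.write("# a_AU  e  i_deg  H_mag\n")
          for _ in range(1000):
              a = random.uniform(0.5, 3.0)     # semi-major axis (AU)
              e = random.uniform(0.0, 0.8)     # eccentricity
              i = random.uniform(0.0, 30.0)    # inclination (deg)
              h = random.uniform(15.0, 25.0)   # absolute magnitude
              f.write(f"{a:.3f} {e:.3f} {i:.2f} {h:.2f}\n")

      # gnuplot> plot "neo_population.dat" using 1:2   # a-e scatter plot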

  18. Software Tools for Developing and Simulating the NASA LaRC CMF Motion Base

    NASA Technical Reports Server (NTRS)

    Bryant, Richard B., Jr.; Carrelli, David J.

    2006-01-01

    The NASA Langley Research Center (LaRC) Cockpit Motion Facility (CMF) motion base has provided many design and analysis challenges. In the process of addressing these challenges, a comprehensive suite of software tools was developed. The software tools development began with a detailed MATLAB/Simulink model of the motion base, which was used primarily for safety loads prediction, design of the closed-loop compensator, and development of the motion base safety systems. A Simulink model of the digital control law, from which a portion of the embedded code is directly generated, was later added to this model to form a closed-loop system model. Concurrently, software that runs on a PC was created to display and record motion base parameters. It includes a user interface for controlling time history displays, strip chart displays, data storage, and initializing of function generators used during motion base testing. Finally, a software tool was developed for kinematic analysis and prediction of mechanical clearances for the motion system. These tools work together in an integrated package to support normal operations of the motion base and to simulate the end-to-end operation of the motion base system, providing facilities for software-in-the-loop testing, mechanical geometry and sensor data visualizations, and function generator setup and evaluation.

  19. Empirical Software Evaluation: A Practical Alternative.

    ERIC Educational Resources Information Center

    Hedbring, Charles

    1987-01-01

    The article presents a software evaluation checklist developed by a teaching-research laboratory for severely handicapped students in New York City. In an introductory section, the use of laptop microcomputers in helping handicapped learners acquire, maintain, and generalize functional skills is described as the fifth ingredient of an integrated…

  20. A software tool for analyzing multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Bögli, Hans; Dillier, Norbert

    2003-10-01

    A useful and convenient means to analyze the radio frequency (RF) signals being sent by a speech processor to a cochlear implant would be to actually capture and display them with appropriate software. This is particularly useful for development or diagnostic purposes. sCILab (Swiss Cochlear Implant Laboratory) is such a PC-based software tool intended for the Nucleus family of Multichannel Cochlear Implants. Its graphical user interface provides a convenient and intuitive means for visualizing and analyzing the signals encoding speech information. Both numerical and graphic displays are available for detailed examination of the captured CI signals, as well as an acoustic simulation of these CI signals. sCILab has been used in the design and verification of new speech coding strategies, and has also been applied as an analytical tool in studies of how different parameter settings of existing speech coding strategies affect speech perception. As a diagnostic tool, it is also useful for troubleshooting problems with the external equipment of the cochlear implant systems. PMID:14534409

  1. Northwestern University Schizophrenia Data and Software Tool (NUSDAST)

    PubMed Central

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I.; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high-resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as lack of local organization and standard descriptions. PMID:24223551

  2. Northwestern University Schizophrenia Data and Software Tool (NUSDAST).

    PubMed

    Wang, Lei; Kogan, Alex; Cobia, Derin; Alpert, Kathryn; Kolasny, Anthony; Miller, Michael I; Marcus, Daniel

    2013-01-01

    The schizophrenia research community has invested substantial resources in collecting, managing and sharing large neuroimaging datasets. As part of this effort, our group has collected high-resolution magnetic resonance (MR) datasets from individuals with schizophrenia, their non-psychotic siblings, healthy controls and their siblings. This effort has resulted in a growing resource, the Northwestern University Schizophrenia Data and Software Tool (NUSDAST), an NIH-funded data sharing project to stimulate new research. This resource resides on XNAT Central, and it contains neuroimaging (MR scans, landmarks and surface maps for deep subcortical structures, and FreeSurfer cortical parcellation and measurement data), cognitive (cognitive domain scores for crystallized intelligence, working memory, episodic memory, and executive function), clinical (demographic, sibling relationship, SAPS and SANS psychopathology), and genetic (20 polymorphisms) data, collected from more than 450 subjects, most with 2-year longitudinal follow-up. A neuroimaging mapping, analysis and visualization software tool, CAWorks, is also part of this resource. Moreover, in making our existing neuroimaging data along with the associated meta-data and computational tools publicly accessible, we have established a web-based information retrieval portal that allows the user to efficiently search the collection. This research-ready dataset meaningfully combines neuroimaging data with other relevant information, and it can be used to help facilitate advancing neuroimaging research. It is our hope that this effort will help to overcome some of the commonly recognized technical barriers in advancing neuroimaging research, such as lack of local organization and standard descriptions. PMID:24223551

  3. EVALUATING ENVIRONMENTAL DECISION SUPPORT TOOLS.

    SciTech Connect

    SULLIVAN, T.

    2004-10-01

    Effective contaminated land management requires a number of decisions addressing a suite of technical, economic, and social concerns. These concerns include human health risks, ecological risks, economic costs, technical feasibility of proposed remedial actions, and the value society places on clean-up and re-use of formerly contaminated lands. Decision making, in the face of uncertainty and multiple and often conflicting objectives, is a vital and challenging task in environmental management that affects a significant economic activity. Although each environmental remediation problem is unique and requires a site-specific analysis, many of the key decisions are similar in structure. This has led many to attempt to develop standard approaches. As part of the standardization process, attempts have been made to codify specialist expertise into decision support tools. This activity is intended to facilitate reproducible and transparent decision making. The process of codifying procedures has also been found to be a useful activity for establishing and rationalizing management processes. This study has two primary objectives. The first is to develop a taxonomy for Decision Support Tools (DSTs) to provide a framework for understanding the different tools and what they are designed to address in the context of environmental remediation problems. The taxonomy will have a series of subject areas for the DSTs. From these subjects, a few key areas will be selected for further study, and software in these areas will be identified. The second objective is to review the existing DSTs in the selected areas and develop a screening matrix for each software product.
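
    A screening matrix of the kind the second objective calls for is essentially weighted scoring; the sketch below uses invented tools, criteria, and weights.

      # Illustrative weighted screening matrix for DSTs (all values invented).
      criteria = {"usability": 0.3, "transparency": 0.4, "cost": 0.3}
      scores = {                          # 1 (poor) .. 5 (good)
          "Tool A": {"usability": 4, "transparency": 3, "cost": 4},
          "Tool B": {"usability": 3, "transparency": 5, "cost": 2},
      }

      ranking = sorted(
          ((sum(w * scores[t][c] for c, w in criteria.items()), t)
           for t in scores),
          reverse=True)
      for total, tool in ranking:
          print(f"{tool}: {total:.2f}")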

  4. Review of free software tools for image analysis of fluorescence cell micrographs.

    PubMed

    Wiesmann, V; Franz, D; Held, C; Münzenmayer, C; Palmisano, R; Wittenberg, T

    2015-01-01

    An increasing number of free software tools have been made available for the evaluation of fluorescence cell micrographs. The main users are biologists and related life scientists with no or little knowledge of image processing. In this review, we give an overview of available tools and guidelines about which tools the users should use to segment fluorescence micrographs. We selected 15 free tools and divided them into stand-alone, Matlab-based, ImageJ-based, free demo versions of commercial tools and data sharing tools. The review consists of two parts: First, we developed a criteria catalogue and rated the tools regarding structural requirements, functionality (flexibility, segmentation and image processing filters) and usability (documentation, data management, usability and visualization). Second, we performed an image processing case study with four representative fluorescence micrograph segmentation tasks with figure-ground and cell separation. The tools display a wide range of functionality and usability. In the image processing case study, we were able to perform figure-ground separation in all micrographs using mainly thresholding. Cell separation was not possible with most of the tools, because cell separation methods are provided only by a subset of the tools and are difficult to parametrize and to use. Most important is that the usability matches the functionality of a tool. To be usable, specialized tools with less functionality need to fulfill less usability criteria, whereas multipurpose tools need a well-structured menu and intuitive graphical user interface. PMID:25359577
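
    Since the case study's figure-ground separation relied mainly on thresholding, a self-contained illustration is Otsu's automatic threshold, implemented here with NumPy on synthetic intensities (the reviewed tools ship their own implementations).

      # Otsu's method: choose the threshold that maximizes between-class
      # variance of the intensity histogram.
      import numpy as np

      def otsu_threshold(image, bins=256):
          hist, edges = np.histogram(image, bins=bins)
          p = hist.astype(float) / hist.sum()
          centers = (edges[:-1] + edges[1:]) / 2
          w0 = np.cumsum(p)                  # class probability
          m = np.cumsum(p * centers)         # cumulative mean
          mg = m[-1]                         # global mean
          with np.errstate(divide="ignore", invalid="ignore"):
              sigma_b = (mg * w0 - m) ** 2 / (w0 * (1 - w0))
          return centers[np.nanargmax(sigma_b)]

      img = np.concatenate([np.random.normal(50, 10, 5000),    # background
                            np.random.normal(180, 15, 2000)])  # foreground
      t = otsu_threshold(img)
      print(f"threshold ~{t:.0f}; foreground = pixels > {t:.0f}")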

  5. Static Analysis Tools, a Practical Approach for Safety-Critical Software Verification

    NASA Astrophysics Data System (ADS)

    Lopes, R.; Vicente, D.; Silva, N.

    2009-05-01

    Static code analysis tools available today range from Lint-based syntax parsers to standards-compliance checkers to tools using more formal methods for verification. As safety-critical software complexity increases, these tools provide a means to ensure code quality, safety and dependability attributes. They also provide a means to introduce further automation into code analysis activities. The features presented by static code analysis tools are particularly interesting for V&V activities. In the scope of Independent Code Verification (IVE), two different static analysis tools have been used during Code Verification activities of the LISA Pathfinder onboard software in order to assess their contribution to the efficiency of the process and the quality of the results. The Polyspace (The MathWorks) and FlexeLint (Gimpel) tools have been used as examples of high-budget and low-budget tools, respectively. Several aspects have been addressed: effort has been categorised for closer analysis (e.g. setup and configuration time, execution time, analysis of the results, etc.), reported issues have been categorised according to their type, and the coverage of traditional IVE tasks by the static code analysis tools has been evaluated. Final observations have been made by analysing the previously referred subjects, namely regarding cost effectiveness, quality of results, complementarities between the results of different static code analysis tools, and the relation between automated code analysis and manual code inspection.

  6. Identification and evaluation of software measures

    NASA Technical Reports Server (NTRS)

    Card, D. N.

    1981-01-01

    A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.

  7. Designing a Software Tool for Fuzzy Logic Programming

    NASA Astrophysics Data System (ADS)

    Abietar, José M.; Morcillo, Pedro J.; Moreno, Ginés

    2007-12-01

    Fuzzy Logic Programming is an interesting and still growing research area that agglutinates the efforts to introduce fuzzy logic into logic programming (LP), in order to incorporate more expressive resources into such languages for dealing with uncertainty and approximate reasoning. The multi-adjoint logic programming approach is a recent and extremely flexible fuzzy logic paradigm for which, unfortunately, we have not found practical implemented tools so far. In this work, we describe a prototype system which is able to directly translate fuzzy logic programs into Prolog code, in order to safely execute these residual programs inside any standard Prolog interpreter in a way that is completely transparent to the final user. We think that the development of such fuzzy languages and programming tools might play an important role in the design of advanced software applications for computational physics, chemistry, mathematics, medicine, industrial control and so on.
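
    The flavor of fuzzy rule evaluation, truth values in [0, 1] combined under a t-norm and attenuated by a rule weight, can be shown in a few lines; this toy is not the multi-adjoint translation to Prolog itself.

      # Toy fuzzy rule firing under three common t-norms.
      T_NORMS = {
          "product": lambda a, b: a * b,
          "godel": min,
          "lukasiewicz": lambda a, b: max(0.0, a + b - 1.0),
      }

      def fire_rule(body_truths, weight, tnorm="product"):
          """Head truth = weight combined with the t-norm of the body."""
          t = T_NORMS[tnorm]
          acc = 1.0
          for v in body_truths:
              acc = t(acc, v)
          return weight * acc

      # "good_weather <- warm, not_rainy  with weight 0.9"
      print(fire_rule([0.8, 0.7], 0.9))            # product: 0.504
      print(fire_rule([0.8, 0.7], 0.9, "godel"))   # min:     0.63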

  8. Splash: A Software Tool for Stereotactic Planning of Recording Chamber Placement and Electrode Trajectories

    PubMed Central

    Sperka, Daniel J.; Ditterich, Jochen

    2011-01-01

    While computer-aided planning of human neurosurgeries is becoming more and more common, animal researchers still largely rely on paper atlases for planning their approach before implanting recording chambers to perform invasive recordings of neural activity, which makes this planning process tedious and error-prone. Here we present SPLASh (Stereotactic PLAnning Software), an interactive software tool for the stereotactic planning of recording chamber placement and electrode trajectories. SPLASh has been developed for monkey cortical recordings and relies on a combination of structural MRIs and electronic brain atlases. Since SPLASh is based on the neuroanatomy software Caret, it should also be possible to use it for other parts of the brain or other species for which Caret atlases are available. The tool allows the user to interactively evaluate different possible placements of recording chambers and to simulate electrode trajectories. PMID:21472085

  9. SU-E-T-27: A Tool for Routine Quality Assurance of Radiotherapy Dose Calculation Software

    SciTech Connect

    Popple, R; Cardan, R; Duan, J; Wu, X; Shen, S; Brezovich, I

    2014-06-01

    Purpose: Dose calculation software is thoroughly evaluated when it is commissioned; however, evaluation of periodic software updates is typically limited in scope due to staffing constraints and the need to quickly return the treatment planning system to clinical service. We developed a tool for quickly and comprehensively testing and documenting dose calculation software against measured data. Methods: A tool was developed using MatLab (The MathWorks, Natick, MA) for evaluation of dose calculation algorithms against measured data. Inputs to the tool are measured data, reference DICOM RT PLAN files describing the measurements, and dose calculations in DICOM format. The tool consists of a collection of extensible modules that can perform analysis of point dose, depth dose curves, and profiles using dose difference, distance-to-agreement, and the gamma-index. Each module generates a report subsection that is incorporated into a master template, which is converted to final form in portable document format (PDF). Results: After each change to the treatment planning system, a report can be generated in approximately 90 minutes. The tool has been in use for more than 5 years, spanning 5 versions of the eMC and 4 versions of the AAA. We have detected changes to the algorithms that affected clinical practice once during this period. Conclusion: Our tool provides an efficient method for quality assurance of dose calculation software, providing a complete set of tests for an update. Future work includes the addition of plan level tests, allowing incorporation of, for example, the TG-119 test suite for IMRT, and integration with the treatment planning system via an application programming interface. Integration with the planning system will permit fully-automated testing and reporting at scheduled intervals.
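
    The gamma-index comparison mentioned above can be sketched in one dimension: each evaluated point searches the reference profile for the best combined distance and dose agreement. The tolerances and profiles below are synthetic; clinical tools use optimized 2D/3D searches.

      # 1-D gamma index: gamma <= 1 means the point passes the combined
      # distance-to-agreement (dta, mm) and dose-difference (dd) test.
      import numpy as np

      def gamma_1d(x, dose_ref, dose_eval, dta=3.0, dd=0.03):
          dmax = dose_ref.max()
          gammas = []
          for xe, de in zip(x, dose_eval):
              # combined terms against every reference point, take the min
              g2 = ((x - xe) / dta) ** 2 + ((dose_ref - de) / (dd * dmax)) ** 2
              gammas.append(np.sqrt(g2.min()))
          return np.array(gammas)

      x = np.linspace(0, 100, 101)              # positions in mm
      ref = np.exp(-((x - 50) / 20) ** 2)       # synthetic profile
      ev = np.exp(-((x - 51) / 20) ** 2)        # same profile, 1 mm shift
      g = gamma_1d(x, ref, ev)
      print(f"pass rate (gamma <= 1): {(g <= 1).mean():.1%}")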

  10. Software Tools for In-Situ Documentation of Built Heritage

    NASA Astrophysics Data System (ADS)

    Smars, P.

    2013-07-01

    The paper presents open source software tools developed by the author to facilitate in-situ documentation of architectural and archæological heritage. The design choices are exposed and related to a general issue in conservation and documentation: taking decisions about a valuable object under threat. The question of the level of objectivity is central to the three steps of this process. It is our belief that in-situ documentation has to be favoured in this demanding context, full of potential discoveries. The very powerful surveying techniques in rapid development nowadays enhance our vision but often tend to move a critical part of the documentation process back to the office. The software presented facilitates direct treatment of the data on site. Emphasis is given to flexibility, interoperability and simplicity. Key features of the software are listed and illustrated with examples (3D model of Gothic vaults, analysis of the shape of a column, deformation of a wall, direct interaction with AutoCAD).

  11. A software tool for graphically assembling damage identification algorithms

    NASA Astrophysics Data System (ADS)

    Allen, David W.; Clough, Joshua A.; Sohn, Hoon; Farrar, Charles R.

    2003-08-01

    At Los Alamos National Laboratory (LANL), various algorithms for structural health monitoring problems have been explored in the last 5 to 6 years. The original DIAMOND (Damage Identification And MOdal aNalysis of Data) software was developed as a package of modal analysis tools with some frequency domain damage identification algorithms included. Since the conception of DIAMOND, the Structural Health Monitoring (SHM) paradigm at LANL has been cast in the framework of statistical pattern recognition, promoting data driven damage detection approaches. To reflect this shift and to allow user-friendly analyses of data, a new piece of software, DIAMOND II is under development. The Graphical User Interface (GUI) of the DIAMOND II software is based on the idea of GLASS (Graphical Linking and Assembly of Syntax Structure) technology, which is currently being implemented at LANL. GLASS is a Java based GUI that allows drag and drop construction of algorithms from various categories of existing functions. In the platform of the underlying GLASS technology, DIAMOND II is simply a module specifically targeting damage identification applications. Users can assemble various routines, building their own algorithms or benchmark testing different damage identification approaches without writing a single line of code.

  12. Energy efficiency assessment methods and tools evaluation

    SciTech Connect

    McMordie, K.L.; Richman, E.E.; Keller, J.M.; Dixon, D.R.

    1994-08-01

    Many different methods of assessing the energy savings potential at federal installations and identifying attractive projects for capital investment have been used by the different federal agencies. These methods range from high-level estimating tools to detailed design tools, both manual and software-assisted. These methods have different purposes and provide results that are used for different parts of the project identification and implementation process. Seven different assessment methods are evaluated in this study. These methods were selected by the program managers at the DoD Energy Policy Office and the DOE Federal Energy Management Program (FEMP). Each of the methods was applied to similar buildings at Bolling Air Force Base (AFB), unless it was inappropriate or the method was designed to make an installation-wide analysis rather than focusing on particular buildings. Staff at Bolling AFB controlled the collection of data.

  13. Applying a Framework to Evaluate Assignment Marking Software: A Case Study on Lightwork

    ERIC Educational Resources Information Center

    Heinrich, Eva; Milne, John

    2012-01-01

    This article presents the findings of a qualitative evaluation on the effect of a specialised software tool on the efficiency and quality of assignment marking. The software, Lightwork, combines with the Moodle learning management system and provides support through marking rubrics and marker allocations. To enable the evaluation a framework has…

  14. Seismic Software Evaluation at the Swiss Seismological Service

    NASA Astrophysics Data System (ADS)

    Clinton, John; Olivieri, Marco; Kaestli, Philipp

    2010-05-01

    The Swiss Seismological Service (SED) has an ongoing responsibility to improve the seismic monitoring capability for Switzerland. This is a crucial issue for a country with a low background seismicity but where a large M6+ earthquake is expected in the next decades. With over 30 stations and a station spacing of ~25 km, the SED operates one of the densest broadband networks in the world, which is complemented by a similar number of real-time strong-motion stations. An existing in-house processing software package has been operational for the last 15 years, and though well suited to the Swiss setting, including the ability to (1) automatically locate and alert on local events and (2) manually relocate events with a nonlinear location algorithm using a 3-D velocity model, the software does not satisfactorily accommodate integration of standard community software tools, nor does it provide a modern database interface for either station metadata or event parameters. To take advantage of major improvements in software architecture and community tools, we wish to migrate to a community standard solution for data acquisition, automatic and manual processing, and archival. We have been evaluating in detail SeisComp3, a state-of-the-art monitoring system developed by GFZ, as well as the Nanometrics Apollo Suite (which uses USGS Hydra at its core for event processing). We present our analysis of the capabilities of each software package. In particular, we focus on the capability of each package to detect and identify small local (Ml > 1) as well as large regional events. We discuss our results in terms of location and magnitude accuracy, with particular attention to the specific improvements needed from monitoring systems for improved monitoring of small regions with high-quality seismic networks.

  15. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools are discussed.

  16. Evaluation of Computer Software for Use in the Classroom.

    ERIC Educational Resources Information Center

    Johnson, William E.

    To help teachers cope with the proliferation of software and software sources, a number of resources are available to aid in the evaluation and selection of educational software. For instance, both the "Educator's Handbook and Software Directory" and "Swift's Directory of Educational Software, Apple II Edition" provide listings of educational…

  17. Learning Photogrammetry with Interactive Software Tool PhoX

    NASA Astrophysics Data System (ADS)

    Luhmann, T.

    2016-06-01

    Photogrammetry is a complex topic in high-level university teaching, especially in the fields of geodesy, geoinformatics and metrology, where high-quality results are demanded. In addition, more and more black-box solutions for 3D image processing and point cloud generation are available that easily generate nice results, e.g. by structure-from-motion approaches. Within this context, the classical approach to teaching photogrammetry (e.g. focusing on aerial stereophotogrammetry) has to be reformed in order to educate students and professionals in new topics and provide them with more information about what happens behind the scenes. For around 20 years, photogrammetry courses at the Jade University of Applied Sciences in Oldenburg, Germany, have included the use of digital photogrammetry software that provides individual exercises, deep analysis of calculation results and a wide range of visualization tools for almost all standard tasks in photogrammetry. In recent years, the software package PhoX has been developed as part of a new didactic concept in photogrammetry and related subjects. It also serves as an analysis tool in recent research projects. PhoX consists of a project-oriented data structure for images, image data, measured points and features, and 3D objects. It provides almost all basic photogrammetric measurement tools, image processing, calculation methods, graphical analysis functions, simulations and much more. Students use the program to conduct predefined exercises where they have the opportunity to analyse results in a high level of detail. This includes the analysis of statistical quality parameters but also the meaning of transformation parameters, rotation matrices, calibration and orientation data. As one specific advantage, PhoX allows for the interactive modification of single parameters and a direct view of the resulting effect in image or object space.
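
    Two of the quantities students inspect, a rotation matrix built from three angles and the projection of an object point to image coordinates, can be sketched as below. The omega-phi-kappa rotation order and the simple pinhole sign conventions are assumptions made here, not PhoX's internals.

      # Rotation matrix from three Euler angles and a pinhole projection.
      import numpy as np

      def rotation(omega, phi, kappa):
          """R = Rx(omega) @ Ry(phi) @ Rz(kappa); convention assumed."""
          co, so = np.cos(omega), np.sin(omega)
          cp, sp = np.cos(phi), np.sin(phi)
          ck, sk = np.cos(kappa), np.sin(kappa)
          Rx = np.array([[1, 0, 0], [0, co, -so], [0, so, co]])
          Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
          Rz = np.array([[ck, -sk, 0], [sk, ck, 0], [0, 0, 1]])
          return Rx @ Ry @ Rz

      def project(point, cam_pos, R, focal=0.05):
          """Pinhole projection of a world point to image coords (m)."""
          pc = R.T @ (point - cam_pos)    # world -> camera coordinates
          return focal * pc[:2] / pc[2]

      R = rotation(0.02, -0.01, 0.1)      # small angles in radians
      print(project(np.array([3.0, 2.0, 40.0]), np.zeros(3), R))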

  18. Comparisons of Kinematics and Dynamics Simulation Software Tools

    NASA Technical Reports Server (NTRS)

    Shiue, Yeu-Sheng Paul

    2002-01-01

    Kinematic and dynamic analyses of moving bodies are essential to system engineers and designers in the process of design and validation. 3D visualization and motion simulation plus finite element analysis (FEA) give engineers a better way to present ideas and results. Marshall Space Flight Center (MSFC) system engineering researchers are currently using IGRIP from DELMIA Inc. as a kinematic simulation tool for discrete-body motion simulations. Although IGRIP is an excellent tool for kinematic simulation with some dynamic analysis capabilities in robotic control, exploration of other alternatives with more powerful dynamic analysis and FEA capabilities is necessary. Kinematic analysis only examines the displacement, velocity, and acceleration of the mechanism, without considering effects from the masses of components. With dynamic analysis and FEA, effects such as the forces or torques at a joint due to the mass and inertia of components can be identified. Given keen market competition, ALGOR Mechanical Event Simulation (MES), MSC visualNastran 4D, Unigraphics Motion+, and Pro/MECHANICA were chosen for exploration. In this study, comparisons between software tools are presented in terms of the following categories: graphical user interface (GUI), import capability, tutorial availability, ease of use, kinematic simulation capability, dynamic simulation capability, FEA capability, graphical output, technical support, and cost. The Propulsion Test Article (PTA) with Fastrac engine model exported from IGRIP and an office chair mechanism were used as examples for simulations.
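
    The kinematics/dynamics distinction drawn above fits in a few lines for a single rotating link: kinematics yields motion from accelerations alone, while dynamics adds mass properties to recover the joint torque.

      # Kinematics vs. dynamics for one rotating link (illustrative).
      def link_kinematics(theta0, omega0, alpha, t):
          """Angle and angular velocity under constant angular accel."""
          return theta0 + omega0 * t + 0.5 * alpha * t**2, omega0 + alpha * t

      def joint_torque(mass, length, alpha):
          """Uniform slender rod pivoted at one end: I = m * L^2 / 3."""
          inertia = mass * length**2 / 3.0
          return inertia * alpha          # torque = I * alpha

      theta, omega = link_kinematics(0.0, 0.0, 2.0, 0.5)
      print(f"theta={theta:.2f} rad, omega={omega:.2f} rad/s, "
            f"torque={joint_torque(5.0, 1.2, 2.0):.2f} N*m")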

  19. Evaluation of commercially available lighting design software

    SciTech Connect

    McConnell, D.

    1989-01-01

    For years the lighting industry has manually entered and performed calculations on the photometric data necessary for lighting designs. In the past few years, many lighting manufacturers and private lighting design software companies have published computer programs to enter and perform these calculations. Sandia National Laboratories (SNL), and other interested organizations, are involved in outdoor lighting designs for Closed Circuit Television (CCTV) that require lighting design software programs. During the period when no commercial lighting design software programs existed, SNL first used a government agency's program and then developed an in-house program. The in-house program is very powerful but has limitations, so it is not feasible to distribute it to interested organizations. This program has been used extensively for many high-security outdoor lighting design projects. There is still a demand for lighting design programs, so SNL has ordered several that are commercially available. These programs are being evaluated for two reasons: (1) to determine whether their features are adequate to aid the user in lighting designs, and (2) to provide that information to SNL and other organizations. The information obtained in this paper is to be used to help an end user decide if a program is needed and, if so, to choose one. This paper presents the results of the evaluations performed. 5 refs., 6 figs., 3 tabs.

  20. A survey on open source software testing tools: a preliminary study in 2011

    NASA Astrophysics Data System (ADS)

    Emami, Seyed Amir; Sim, Jason Chin Lung; Sim, Kwan Yong

    2011-12-01

    Software testing is a costly and time-consuming process in software development. Therefore, software testing tools are often deployed to automate the process in order to reduce cost and improve efficiency. However, many of them are proprietary and expensive. Hence, open source software testing tools can be an appealing alternative. In this paper, we survey the current state of open source software testing tools from three aspects, namely, their availability for different programming platforms and types of testing activities, the maintenance of the tools, and license limitations. From the 152 tools surveyed, we found that open source software testing tools not only are widely available for popular programming platforms, but also support a wide range of testing activities. Furthermore, we also found that more than half of the tools surveyed have been actively maintained and updated by the open source communities. Finally, these tools have very few licensing limitations for commercial use, customization and redistribution.

  1. User Guide for the STAYSL PNNL Suite of Software Tools

    SciTech Connect

    Greenwood, Lawrence R.; Johnson, Christian D.

    2013-02-27

    The STAYSL PNNL software suite provides a set of tools for working with neutron activation rates measured in a nuclear fission reactor, an accelerator-based neutron source, or any neutron field to determine the neutron flux spectrum through a generalized least-squares approach. This process is referred to as neutron spectral adjustment since the preferred approach is to use measured data to adjust neutron spectra provided by neutron physics calculations. The input data consist of the reaction rates based on measured activities, an initial estimate of the neutron flux spectrum, neutron activation cross sections and their associated uncertainties (covariances), and relevant correction factors. The output consists of the adjusted neutron flux spectrum and associated covariance matrix, which is useful for neutron dosimetry and radiation damage calculations.
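
    A generic linear generalized-least-squares adjustment of the kind described has the familiar gain-matrix form sketched below; STAYSL PNNL's actual formulation includes corrections and covariance handling beyond this toy.

      # Generic GLS spectral adjustment sketch. x0: prior flux spectrum,
      # P: its covariance, A: reaction-rate sensitivities (cross
      # sections), y: measured rates, V: measurement covariance.
      import numpy as np

      def gls_adjust(x0, P, A, y, V):
          S = A @ P @ A.T + V               # covariance of predicted rates
          K = P @ A.T @ np.linalg.inv(S)    # gain matrix
          x = x0 + K @ (y - A @ x0)         # adjusted spectrum
          P_adj = P - K @ A @ P             # adjusted covariance
          return x, P_adj

      x0 = np.array([1.0, 2.0, 0.5])                    # toy 3-group flux
      P = np.diag([0.04, 0.16, 0.01])
      A = np.array([[0.3, 0.1, 0.0], [0.0, 0.2, 0.4]])  # two reactions
      y = np.array([0.75, 0.65])
      V = np.diag([0.01, 0.01])
      x, P_adj = gls_adjust(x0, P, A, y, V)
      print(np.round(x, 3))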

  2. Software Development Of XML Parser Based On Algebraic Tools

    NASA Astrophysics Data System (ADS)

    Georgiev, Bozhidar; Georgieva, Adriana

    2011-12-01

    This paper presents the development and implementation of software implementing an algebraic method for XML data processing, which accelerates the XML parsing process. The nontraditional approach proposed here, fast XML navigation with algebraic tools, contributes to ongoing efforts toward an easier, user-friendly API for XML transformations. The proposed software for XML document processing (a parser) is easy to use and can manage files with a strictly defined data structure. The purpose of the presented algorithm is to offer a new approach for searching and restructuring hierarchical XML data. This approach permits fast XML document processing, using an algebraic model developed in detail in previous works by the same authors. The proposed parsing mechanism is easily accessible to the web consumer, who is able to control XML file processing, to search for different elements (tags) in it, and to delete and add new XML content. Various tests show higher speed and lower resource consumption in comparison with some existing commercial parsers.
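
    The operations described, searching tags, deleting elements and adding new content, look like the following with Python's standard ElementTree; the authors' parser realizes them through its algebraic model instead.

      # Search, delete, and add operations on an XML tree (ElementTree).
      import xml.etree.ElementTree as ET

      doc = ET.fromstring(
          "<library><book id='1'><title>A</title></book>"
          "<book id='2'><title>B</title></book></library>")

      for title in doc.iter("title"):            # search elements (tags)
          print(title.text)

      doc.remove(doc.find("book[@id='1']"))      # delete an element

      new = ET.SubElement(doc, "book", id="3")   # add new XML content
      ET.SubElement(new, "title").text = "C"

      print(ET.tostring(doc, encoding="unicode"))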

  3. Talkoot: software tool to create collaboratories for earth science

    SciTech Connect

    Movva, Sunil; Ramachandran, Rahul; Maskey, Manil; Kulkarni, Ajinkya; Conover, Helen; Nair, U.S.

    2012-01-01

    Open science, where researchers share and publish every element of their research process in addition to the final results, can foster novel ways of collaboration among researchers and has the potential to spontaneously create new virtual research collaborations. Based on scientific interest, these new virtual research collaborations can cut across traditional boundaries such as institutions and organizations. Advances in technology allow for software tools that can be used by different research groups and institutions to build and support virtual collaborations and infuse open science. This paper describes Talkoot, a software toolkit designed and developed by the authors to provide Earth Science researchers a ready-to-use knowledge management environment and an online platform for collaboration. Talkoot allows Earth Science researchers a means to systematically gather, tag and share their data, analysis workflows and research notes. These Talkoot features are designed to foster rapid knowledge sharing within a virtual community. Talkoot can be utilized by small to medium sized groups and research centers, as well as large enterprises such a national laboratories and federal agencies.

  4. FACET: Future ATM Concepts Evaluation Tool

    NASA Technical Reports Server (NTRS)

    Bilmoria, Karl D.; Banavar, Sridhar; Chatterji, Gano B.; Sheth, Kapil S.; Grabbe, Shon

    2000-01-01

    FACET (Future ATM Concepts Evaluation Tool) is an Air Traffic Management research tool being developed at the NASA Ames Research Center. This paper describes the design, architecture and functionalities of FACET. The purpose of FACET is to provide a simulation environment for exploration, development and evaluation of advanced ATM concepts. Examples of these concepts include new ATM paradigms such as Distributed Air-Ground Traffic Management, airspace redesign and new Decision Support Tools (DSTs) for controllers working within the operational procedures of the existing air traffic control system. FACET is currently capable of modeling system-wide en route airspace operations over the contiguous United States. Airspace models (e.g., Center/sector boundaries, airways, locations of navigation aids and airports) are available from databases. A core capability of FACET is the modeling of aircraft trajectories. Using round-earth kinematic equations, aircraft can be flown along flight plan routes or great circle routes as they climb, cruise and descend according to their individual aircraft-type performance models. Performance parameters (e.g., climb/descent rates and speeds, cruise speeds) are obtained from data table lookups. Heading, airspeed and altitude-rate dynamics are also modeled. Additional functionalities will be added as necessary for specific applications. FACET software is written in the Java and C programming languages. It is platform-independent and can be run on a variety of computers. FACET has been designed with a modular software architecture to enable rapid integration of research prototype implementations of new ATM concepts. Several advanced ATM concepts are currently being implemented in FACET: airborne separation assurance, dynamic density predictions, airspace redesign (re-sectorization), benefits of a controller DST for direct routing, and the integration of commercial space transportation system operations into the U.S. National
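
    The round-earth, great-circle trajectory propagation mentioned above follows standard spherical formulas; the sketch below advances one aircraft position by one time step (a simplification of FACET's performance-table-driven modeling).

      # Advance a position along a great circle on a spherical Earth.
      import math

      def propagate(lat, lon, course_deg, speed_kts, dt_hr, R_nm=3440.065):
          d = speed_kts * dt_hr / R_nm          # angular distance (rad)
          th = math.radians(course_deg)
          la1, lo1 = math.radians(lat), math.radians(lon)
          la2 = math.asin(math.sin(la1) * math.cos(d) +
                          math.cos(la1) * math.sin(d) * math.cos(th))
          lo2 = lo1 + math.atan2(math.sin(th) * math.sin(d) * math.cos(la1),
                                 math.cos(d) - math.sin(la1) * math.sin(la2))
          return math.degrees(la2), math.degrees(lo2)

      # one 6-minute step of a 450-knot cruise heading due east
      print(propagate(37.62, -122.38, 90.0, 450.0, 0.1))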

  5. A software tool for removing patient identifying information from clinical documents.

    PubMed

    Friedlin, F Jeff; McDonald, Clement J

    2008-01-01

    We created a software tool that accurately removes all patient identifying information from various kinds of clinical data documents, including laboratory and narrative reports. We created the Medical De-identification System (MeDS), a software tool that de-identifies clinical documents, and performed 2 evaluations. Our first evaluation used 2,400 Health Level Seven (HL7) messages from 10 different HL7 message producers. After modifying the software based on the results of this first evaluation, we performed a second evaluation using 7,190 pathology report HL7 messages. We compared the results of the MeDS de-identification process to a gold standard of human review to find identifying strings. For both evaluations, we calculated the number of successful scrubs, missed identifiers, and over-scrubs committed by MeDS and evaluated the readability and interpretability of the scrubbed messages. We categorized all missed identifiers into 3 groups: (1) complete HIPAA-specified identifiers, (2) HIPAA-specified identifier fragments, and (3) non-HIPAA-specified identifiers (such as provider names and addresses). In the first-pass evaluation, MeDS scrubbed 11,273 (99.06%) of the 11,380 HIPAA-specified identifiers and 38,095 (98.26%) of the 38,768 non-HIPAA-specified identifiers. In our second evaluation (after modification of the software), MeDS scrubbed 79,993 (99.47%) of the 80,418 HIPAA-specified identifiers and 12,689 (96.93%) of the 13,091 non-HIPAA-specified identifiers. Approximately 95% of scrubbed messages were both readable and interpretable. We conclude that MeDS successfully de-identified a wide range of medical documents from numerous sources and creates scrubbed reports that retain their interpretability, thereby maintaining their usefulness for research. PMID:18579831
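
    As an illustration of the general approach such scrubbers take, the sketch below replaces identifiers matched by simple regular-expression patterns with category tokens, and computes the recall figure reported above (11,273 of 11,380). The patterns are hypothetical examples, not MeDS's actual rules.

    ```python
    # Pattern-based de-identification sketch (hypothetical rules, not MeDS).
    import re

    PATTERNS = [
        (re.compile(r"\b\d{3}-\d{2}-\d{4}\b"), "[SSN]"),                 # social security numbers
        (re.compile(r"\b\d{1,2}/\d{1,2}/\d{2,4}\b"), "[DATE]"),          # simple dates
        (re.compile(r"\b(?:Dr|Mr|Mrs|Ms)\.\s+[A-Z][a-z]+\b"), "[NAME]"), # titled names
    ]

    def scrub(text):
        """Replace every matched identifier with its category token."""
        for pattern, token in PATTERNS:
            text = pattern.sub(token, text)
        return text

    def recall(scrubbed_count, total_identifiers):
        """Fraction of known identifiers successfully removed."""
        return scrubbed_count / total_identifiers

    print(scrub("Seen by Dr. Smith on 3/14/2007, SSN 123-45-6789."))
    print(f"{recall(11273, 11380):.2%}")  # first-pass HIPAA-identifier recall from the paper
    ```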

  6. MUST - An integrated system of support tools for research flight software engineering. [Multipurpose User-oriented Software Technology

    NASA Technical Reports Server (NTRS)

    Straeter, T. A.; Foudriat, E. C.; Will, R. W.

    1977-01-01

    The objectives of NASA's MUST (Multipurpose User-oriented Software Technology) program at Langley Research Center are to cut the cost of producing software which effectively utilizes digital systems for flight research. These objectives will be accomplished by providing an integrated system of support software tools for use throughout the research flight software development process. A description of the overall MUST program and its progress toward the release of a first MUST system will be presented. This release includes: a special interactive user interface, a library of subroutines, assemblers, a compiler, automatic documentation tools, and a test and simulation system.

  7. The Software Line-up: What Reviewers Look for When Evaluating Software.

    ERIC Educational Resources Information Center

    Electronic Learning, 1982

    1982-01-01

    Contains a check list to aid teachers in evaluating software used in computer-assisted instruction on microcomputers. The evaluation form contains three sections: program description, program evaluation, and overall evaluation. A brief description of a software evaluation program in use at the Granite School District in Utah is included. (JJD)

  8. ELER software - a new tool for urban earthquake loss assessment

    NASA Astrophysics Data System (ADS)

    Hancilar, U.; Tuzun, C.; Yenidogan, C.; Erdik, M.

    2010-12-01

    ATC-55 (Yang, 2005). An urban loss assessment exercise for a scenario earthquake for the city of Istanbul is conducted, and physical and social losses are presented. Damage to the urban environment is compared to the results obtained from similar software, i.e. KOERILoss (KOERI, 2002) and DBELA (Crowley et al., 2004). The European rapid loss estimation tool is expected to enable effective emergency response, at both the local and global levels, as well as public information.

  9. Software-Based Pyrogram® Evaluation.

    PubMed

    Chen, Guoli; Olson, Matthew T; Eshleman, James R

    2015-01-01

    Pyrosequencing(®) is a widely used technology to detect gene mutations in a molecular research or diagnostics laboratory. Compared to Sanger sequencing, it is inherently more quantitative with a superior limit of detection, although it has a shorter read length and has difficulty with homopolymeric sequences. Results of Pyrosequencing experiments are typically presented as traces with sequential peaks, called Pyrograms(®). For the majority of clinical diagnostic cases, Pyrograms are straightforward to read. However, there are occasionally complex results that are uninterpretable or difficult to interpret. In this chapter, we demonstrate a software program, named Pyromaker, that has been developed to help with the analysis of Pyrograms. Pyromaker is a freely and publicly available software program that assists in the recognition of mutation patterns, the interpretation of difficult or ambiguous testing results, and the design of an optimal strategy to detect potential mutations by generating simulated Pyrograms. In addition to supporting diagnostic activities, Pyromaker can also be used as a virtual and user-friendly educational tool to teach newcomers the fundamental mechanism of Pyrosequencing and the correct interpretation of actual Pyrosequencing data. PMID:26103889
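
    The core of a simulated Pyrogram can be captured in a few lines: for each dispensed nucleotide, the peak height is proportional to the length of the matching homopolymer run at the current template position. The sketch below is a simplified illustration of this principle, not Pyromaker's actual algorithm.

    ```python
    # Simplified pyrogram simulation: peak height = homopolymer run length.
    def simulate_pyrogram(sequence, dispensation_order):
        peaks = []
        pos = 0
        for nucleotide in dispensation_order:
            run = 0
            # Consume every consecutive template base matching the dispensed nucleotide.
            while pos < len(sequence) and sequence[pos] == nucleotide:
                run += 1
                pos += 1
            peaks.append((nucleotide, run))
        return peaks

    # A homopolymer run ("AA") yields one double-height peak rather than two peaks.
    for base, height in simulate_pyrogram("GGATCAA", "GATCGATC"):
        print(base, "#" * height)
    ```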

  10. The Educational Software Design and Evaluation for K-8: Oral and Dental Health Software

    ERIC Educational Resources Information Center

    Kabakci, Isil; Birinci, Gurkay; Izmirli, Serkan

    2007-01-01

    The aim of this study is to describe the development of the software "Oral and Dental Health," which supplements the course of Science and Technology for K-8 students in the primary school curriculum, and to carry out an evaluation study of the software. This software has been prepared for educational purposes. In relation to the evaluation of…

  11. Technical Data Exchange Software Tools Adapted to Distributed Microsatellite Design

    NASA Astrophysics Data System (ADS)

    Pache, Charly

    2002-01-01

    One critical issue concerning the distributed design of satellites is the collaborative work it requires. In particular, the exchange of data between each group responsible for each subsystem can be complex and very time-consuming. The goal of this paper is to present a collaborative design tool, the SSETI Design Model (SDM), specifically developed for enabling distributed satellite design. SDM is actually used in the ongoing Student Space Exploration & Technology Initiative (SSETI) (www.sseti.net). SSETI is led by the European Space Agency (ESA) outreach office (http://www.estec.esa.nl/outreach), involving student groups from all over Europe in the design, construction and launch of a microsatellite. The first part of this paper presents the current version of the SDM tool, a collection of linked Microsoft Excel worksheets, one for each subsystem. An overview of the project framework/structure is given, explaining the different actors, the flows between them, as well as the different types of data and the links - formulas - between data sets. Unified Modeling Language (UML) diagrams give an overview of the different parts. Then the SDM's functionalities, developed in VBA scripts (Visual Basic for Applications), are introduced, as well as the interactive features, user interfaces and administration tools. The second part discusses the capabilities and limitations of the current version of SDM. Taking these capabilities and limitations into account, the third part outlines the next version of SDM, a web-oriented, database-driven evolution of the current version. This new approach will enable real-time data exchange and processing between the different actors of the mission. Comprehensive UML diagrams will guide the audience through the entire modeling process of such a system. Trade-off simulation capabilities, security, reliability, hardware and software issues will also be thoroughly discussed.

  12. A software tool for rapid flood inundation mapping

    USGS Publications Warehouse

    Verdin, James; Verdin, Kristine; Mathis, Melissa; Magadzire, Tamuka; Kabuchanga, Eric; Woodbury, Mark; Gadain, Hussein

    2016-01-01

    The GIS Flood Tool (GFT) was developed by the U.S. Geological Survey with support from the U.S. Agency for International Development’s Office of U.S. Foreign Disaster Assistance to provide a means for production of reconnaissance-level flood inundation mapping for data-sparse and resource-limited areas of the world. The GFT has also attracted interest as a tool for rapid assessment flood inundation mapping for the Flood Inundation Mapping Program of the U.S. Geological Survey. The GFT can fill an important gap for communities that lack flood inundation mapping by providing a first estimate of inundation zones, pending availability of resources to complete an engineering study. The tool can also help identify priority areas for application of scarce flood inundation mapping resources. The technical basis of the GFT is an application of the Manning equation for steady flow in an open channel, operating on specially processed digital elevation data. The GFT is implemented as a software extension in ArcGIS. Output maps from the GFT were validated at 11 sites with inundation maps produced previously by the Flood Inundation Mapping Program using standard one-dimensional hydraulic modeling techniques. In 80 percent of the cases, the GFT inundation patterns matched 75 percent or more of the one-dimensional hydraulic model inundation patterns. Lower rates of pattern agreement were seen at sites with low relief and subtle surface water divides. Although the GFT is simple to use, it should be applied with the oversight or review of a qualified hydraulic engineer who understands the simplifying assumptions of the approach.
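
    The tool's technical basis, the Manning equation Q = (1/n) A R^(2/3) S^(1/2), is sketched below for an idealized rectangular channel; the GFT itself applies the equation to specially processed elevation data rather than a fixed geometry.

    ```python
    # Manning's equation for steady open-channel flow (idealized rectangular channel).
    def manning_discharge(n, width_m, depth_m, slope):
        """Discharge Q (m^3/s): Q = (1/n) * A * R^(2/3) * S^(1/2)."""
        area = width_m * depth_m                    # flow cross-section A
        wetted_perimeter = width_m + 2 * depth_m    # bed plus both banks
        hydraulic_radius = area / wetted_perimeter  # R = A / P
        return (1.0 / n) * area * hydraulic_radius ** (2.0 / 3.0) * slope ** 0.5

    # Example: 20 m wide channel, 2 m deep, bed slope 0.001, n = 0.035 (natural stream)
    print(f"{manning_discharge(0.035, 20.0, 2.0, 0.001):.1f} m^3/s")  # about 51 m^3/s
    ```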

  13. TINA manual landmarking tool: software for the precise digitization of 3D landmarks

    PubMed Central

    2012-01-01

    Background Interest in the placing of landmarks and subsequent morphometric analyses of shape for 3D data has increased with the increasing accessibility of computed tomography (CT) scanners. However, current computer programs for this task suffer from various practical drawbacks. We present here a free software tool that overcomes many of these problems. Results The TINA Manual Landmarking Tool was developed for the digitization of 3D data sets. It enables the generation of a modifiable 3D volume rendering display plus matching orthogonal 2D cross-sections from DICOM files. The object can be rotated and axes defined and fixed. Predefined lists of landmarks can be loaded and the landmarks identified within any of the representations. Output files are stored in various established formats, depending on the preferred evaluation software. Conclusions The software tool presented here provides several options facilitating the placing of landmarks on 3D objects, including volume rendering from DICOM files, definition and fixation of meaningful axes, easy import, placement, control, and export of landmarks, and handling of large datasets. The TINA Manual Landmarking Tool runs under Linux and can be obtained for free from http://www.tina-vision.net/tarballs/. PMID:22480150

  14. Evaluation of distributed computing tools

    SciTech Connect

    Stanberry, L.

    1992-10-28

    The original goal stated in the collaboration agreement from LCC's perspective was "to show that networking tools available in UNICOS perform well enough to meet the requirements of LCC customers." This translated into evaluating how easy it was to port ELROS over CRI's ISO 2.0, which itself is a port of ISODE to the Cray. In addition we tested the interoperability of ELROS and ISO 2.0 programs running on the Cray, and communicating with each other, and with servers or clients running on other machines. To achieve these goals from LCC's side, we ported ELROS to the Cray, and also obtained and installed a copy of the ISO 2.0 distribution from CRI. CRI's goal for the collaboration was to evaluate the usability of ELROS. In particular, we were interested in their potential feedback on the use of ELROS in implementing ISO protocols--whether ELROS would be easier to use and perform better than other tools that form part of the standard ISODE system. To help achieve these goals for CRI, we provided them with a distribution tar file containing the ELROS system, once we had completed our port of ELROS to the Cray.

  15. Oxygen octahedra picker: A software tool to extract quantitative information from STEM images.

    PubMed

    Wang, Yi; Salzberger, Ute; Sigle, Wilfried; Eren Suyolcu, Y; van Aken, Peter A

    2016-09-01

    In perovskite oxide based materials and hetero-structures there are often strong correlations between oxygen octahedral distortions and functionality. Thus, atomistic understanding of the octahedral distortion, which requires accurate measurements of atomic column positions, will greatly help to engineer their properties. Here, we report the development of a software tool to extract quantitative information of the lattice and of BO6 octahedral distortions from STEM images. Center-of-mass and 2D Gaussian fitting methods are implemented to locate positions of individual atom columns. The precision of atomic column distance measurements is evaluated on both simulated and experimental images. The application of the software tool is demonstrated using practical examples. PMID:27344044
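
    Of the two localization methods mentioned, center-of-mass is the simpler; the sketch below applies an intensity-weighted centroid to a synthetic atom-column patch. It illustrates the principle only and is not the published tool's code.

    ```python
    # Center-of-mass refinement of an atomic-column position in an image patch.
    import numpy as np

    def center_of_mass(patch):
        """Intensity-weighted centroid (row, col) of a small image patch."""
        rows, cols = np.indices(patch.shape)
        total = patch.sum()
        return (rows * patch).sum() / total, (cols * patch).sum() / total

    # Synthetic 2D Gaussian "atom column" centred at (5.3, 4.7) in an 11x11 patch
    r, c = np.indices((11, 11))
    patch = np.exp(-((r - 5.3) ** 2 + (c - 4.7) ** 2) / (2 * 1.5 ** 2))
    print(center_of_mass(patch))  # close to (5.3, 4.7)
    ```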

  16. NASA software specification and evaluation system design, part 2

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A survey and analysis of the existing methods, tools and techniques employed in the development of software are presented, along with recommendations for the construction of reliable software. Functional designs for the software specification language and the database verifier are presented.

  17. Dental students' evaluations of an interactive histology software.

    PubMed

    Rosas, Cristian; Rubí, Rafael; Donoso, Manuel; Uribe, Sergio

    2012-11-01

    This study assessed dental students' evaluations of a new Interactive Histology Software (IHS) developed by the authors and compared students' assessment of the extent to which this new software, as well as other histology teaching methods, supported their learning. The IHS is a computer-based tool for histology learning that presents high-resolution images of histology basics as well as specific oral histologies at different magnifications and with text labels. Survey data were collected from 204 first-year dental students at the Universidad Austral de Chile. The survey consisted of questions for the respondents to evaluate the characteristics of the IHS and the contribution of various teaching methods to their histology learning. The response rate was 85 percent. Student evaluations were positive for the design, usability, and theoretical-practical integration of the IHS, and the students reported they would recommend the method to future students. The students continued to value traditional teaching methods for histological lab work and did not think this new technology would replace traditional methods. With respect to the contribution of each teaching method to students' learning, no statistically significant differences (p>0.05) were found among the evaluations of the IHS, light microscopy, and slide presentations. However, these student assessments were significantly more positive than the evaluations of other digital or printed materials. Overall, the students evaluated the IHS very positively in terms of method quality and contribution to their learning; they also evaluated the use of light microscopy and teacher slide presentations positively. PMID:23144485

  18. NEXT GENERATION ANALYSIS SOFTWARE FOR COMPONENT EVALUATION - Results of Rotational Seismometer Evaluation

    NASA Astrophysics Data System (ADS)

    Hart, D. M.; Merchant, B. J.; Abbott, R. E.

    2012-12-01

    The Component Evaluation project at Sandia National Laboratories supports the Ground-based Nuclear Explosion Monitoring program by performing testing and evaluation of the components that are used in seismic and infrasound monitoring systems. In order to perform this work, Component Evaluation maintains a testing facility called the FACT (Facility for Acceptance, Calibration, and Testing) site, a variety of test bed equipment, and a suite of software tools for analyzing test data. Recently, Component Evaluation has successfully integrated several improvements to its software analysis tools and test bed equipment that have substantially improved our ability to test and evaluate components. The software tool that is used to analyze test data is called TALENT: Test and AnaLysis EvaluatioN Tool. TALENT is designed to be a single, standard interface to all test configuration, metadata, parameters, waveforms, and results that are generated in the course of testing monitoring systems. It provides traceability by capturing everything about a test in a relational database that is required to reproduce the results of that test. TALENT provides a simple, yet powerful, user interface to quickly acquire, process, and analyze waveform test data. The software tool has also been expanded recently to handle sensors whose output is proportional to rotation angle or rotation rate. As an example of this new processing capability, we show results from testing the new ATA ARS-16 rotational seismometer. The test data was collected at the USGS ASL. Four datasets were processed: 1) 1 Hz with increasing amplitude, 2) 4 Hz with increasing amplitude, 3) 16 Hz with increasing amplitude, and 4) twenty-six discrete frequencies between 0.353 Hz and 64 Hz. The results are compared to manufacturer-supplied data sheets.

  19. Evaluation of Agricultural Accounting Software. Improved Decision Making. Third Edition.

    ERIC Educational Resources Information Center

    Lovell, Ashley C., Comp.

    Following a discussion of the evaluation criteria for choosing accounting software, this guide contains reviews of 27 accounting software programs that could be used by farm or ranch business managers. The information in the reviews was provided by the software vendors and covers the following points for each software package: general features,…

  1. Evaluation of Computer Software for Teaching Statistics.

    ERIC Educational Resources Information Center

    Webster, Elaine

    1992-01-01

    Examines the strengths and weaknesses of five selected statistical software packages with respect to how well the software enhances business statistics instructors' ability to teach traditionally difficult topics. Ranks the software on technical and pedagogical benefits. Results indicate (1) no advantage to using textbook-related software over…

  2. Use of software tools in the development of real time software systems

    NASA Technical Reports Server (NTRS)

    Garvey, R. C.

    1981-01-01

    The transformation of a preexisting software system into a larger and more versatile system with different mission requirements is discussed. The history of this transformation is used to illustrate the use of structured real time programming techniques and tools to produce maintainable and somewhat transportable systems. The predecessor system is a single ground diagnostic system; its purpose is to exercise a computer controlled hardware set prior to its deployment in its functional environment, as well as to test the equipment set by supplying certain well-known stimuli. The successor system (FTF) is required to perform certain testing and control functions while this hardware set is in its functional environment. Both systems must deal with heavy user input/output loads, and a new I/O requirement is included in the design of the FTF system. Human factors are enhanced by adding an improved console interface and a special function keyboard handler. The additional features require the addition of much new software to the original set from which FTF was developed. As a result, it is necessary to split the system into a dual programming configuration with high rates of inter-program communication. A generalized information routing mechanism is used to support this configuration.

  3. Air traffic management evaluation tool

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar (Inventor); Sheth, Kapil S. (Inventor); Chatterji, Gano Broto (Inventor); Bilimoria, Karl D. (Inventor); Grabbe, Shon (Inventor); Schipper, John F. (Inventor)

    2010-01-01

    Method and system for evaluating and implementing air traffic management tools and approaches for managing and avoiding an air traffic incident before the incident occurs. The invention provides flight plan routing and direct routing or wind optimal routing, using great circle navigation and spherical Earth geometry. The invention provides for aircraft dynamics effects, such as wind effects at each altitude, altitude changes, airspeed changes and aircraft turns to provide predictions of aircraft trajectory (and, optionally, aircraft fuel use). A second system provides several aviation applications using the first system. These applications include conflict detection and resolution, miles-in-trail or minutes-in-trail aircraft separation, flight arrival management, flight re-routing, weather prediction and analysis and interpolation of weather variables based upon sparse measurements.

  4. A software tool to design thermal barrier coatings

    NASA Technical Reports Server (NTRS)

    Petrus, Gregory; Ferguson, B. Lynn

    1995-01-01

    This paper summarizes work completed for a NASA Phase 1 SBIR program which demonstrated the feasibility of developing a software tool to aid in the design of thermal barrier coating (TBC) systems. Toward this goal, three tasks were undertaken and completed. Task 1 involved the development of a database containing the pertinent thermal and mechanical property data for the top coat, bond coat and substrate materials that comprise a TBC system. Task 2 involved the development of an automated set-up program for generating two-dimensional (2D) finite element models of TBC systems. Most importantly, Task 3 involved the generation of a rule base to aid in the design of a TBC system. These rules were based on a factorial design of experiments involving FEM results, and were generated using a Yates analysis. A previous study had indicated the suitability and benefit of applying finite element analysis to perform computer-based experiments to decrease, but not eliminate, physical experiments on TBCs. This program proved feasibility by expanding on these findings, developing a larger knowledge base and a procedure to extract rules to aid in TBC design.

  5. A software tool for the analysis of neuronal morphology data

    PubMed Central

    2014-01-01

    Anatomy plays a fundamental role in supporting and shaping nervous system activity. The remarkable progress of computer processing power within the last two decades has enabled the generation of electronic databases of complete three-dimensional (3D) dendritic and axonal morphology for neuroanatomical studies. Several laboratories freely post their reconstructions online after result publication, e.g. NeuroMorpho.Org (Nat Rev Neurosci 7:318–324, 2006). These neuroanatomical archives represent a crucial resource to explore the relationship between structure and function in the brain (Front Neurosci 6:49, 2012). However, such 'Cartesian' descriptions bear little intuitive information for neuroscientists. Here, we developed a simple prototype of a MATLAB-based software tool to quantitatively describe the 3D neuronal structures from public repositories. The program imports neuronal reconstructions and quantifies statistical distributions of basic morphological parameters such as branch length, tortuosity, branch genealogy and bifurcation angles. Using these morphological distributions, our algorithm can generate a set of virtual neurons readily usable for network simulations. PMID:24529393
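
    Two of the basic morphological parameters named above, branch length and tortuosity, can be computed directly from a polyline of 3D points, as the following sketch illustrates (a simplified Python rendering; the published tool itself is MATLAB-based).

    ```python
    # Branch length and tortuosity from a sequence of 3D branch points.
    import numpy as np

    def branch_length(points):
        """Sum of segment lengths along an (N, 3) array of 3D branch points."""
        return np.linalg.norm(np.diff(points, axis=0), axis=1).sum()

    def tortuosity(points):
        """Path length divided by endpoint-to-endpoint distance (always >= 1)."""
        return branch_length(points) / np.linalg.norm(points[-1] - points[0])

    branch = np.array([[0, 0, 0], [1, 0.5, 0], [2, 0, 0.5], [3, 0.2, 0]])
    print(branch_length(branch), tortuosity(branch))
    ```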

  6. SHMTools: a general-purpose software tool for SHM applications

    SciTech Connect

    Harvey, Dustin; Farrar, Charles; Taylor, Stuart; Park, Gyuhae; Flynn, Eric B; Kpotufe, Samory; Dondi, Denis; Mollov, Todor; Todd, Michael D; Rosin, Tajana S; Figueiredo, Eloi

    2010-11-30

    This paper describes a new software package for various structural health monitoring (SHM) applications. The software is a set of standardized MATLAB routines covering three main stages of SHM: data acquisition, feature extraction, and feature classification for damage identification. A subset of SHMTools is embeddable, consisting of MATLAB functions that can be cross-compiled into generic C programs to run on target hardware. The software is also designed to accommodate multiple sensing modalities, including piezoelectric active-sensing, which has been widely used in SHM practice. The software package, including standardized datasets, is publicly available for use by the SHM community. The details of this embeddable software will be discussed, along with several example processes that can serve as guidelines for future use of the software.

  7. DSC: software tool for simulation-based design of control strategies applied to wastewater treatment plants.

    PubMed

    Ruano, M V; Ribes, J; Seco, A; Ferrer, J

    2011-01-01

    This paper presents a computer tool called DSC (Simulation based Controllers Design) that enables an easy design of control systems and strategies applied to wastewater treatment plants. Although the control systems are developed and evaluated by simulation, this tool aims to facilitate the direct implementation of the designed control system on the PC of the full-scale WWTP (wastewater treatment plant). The designed control system can be programmed in a dedicated control application and can be connected to either the simulation software or the SCADA of the plant. To this end, the developed DSC incorporates an OPC server (OLE for Process Control) which facilitates an open-standard communication protocol for different industrial process applications. The potential capabilities of the DSC tool are illustrated through the example of a full-scale application. An aeration control system applied to a nutrient-removing WWTP was designed, tuned and evaluated with the DSC tool before its implementation in the full-scale plant. The control parameters obtained by simulation were suitable for the full-scale plant with only a few modifications to improve the control performance. With the DSC tool, the performance of control systems can be easily evaluated by simulation. Once developed and tuned by simulation, the control systems can be directly applied to the full-scale WWTP. PMID:21330730
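
    As a rough illustration of the kind of aeration controller such a tool lets one design and tune in simulation, the sketch below implements a discrete PI loop that adjusts airflow to hold a dissolved-oxygen setpoint. The gains and limits are hypothetical placeholders, not the paper's tuned parameters.

    ```python
    # Discrete PI control loop for dissolved oxygen (illustrative parameters).
    def make_pi_controller(kp, ki, dt, u_min=0.0, u_max=100.0):
        """Return a stateful PI controller as a closure."""
        integral = 0.0
        def step(setpoint, measurement):
            nonlocal integral
            error = setpoint - measurement
            integral += error * dt
            u = kp * error + ki * integral
            return min(max(u, u_min), u_max)  # clamp to actuator limits
        return step

    # Hold dissolved oxygen at 2.0 mg/L by adjusting airflow (% of blower capacity)
    controller = make_pi_controller(kp=20.0, ki=0.05, dt=60.0)
    print(controller(2.0, 1.4))  # one control step with DO measured at 1.4 mg/L
    ```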

  8. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  9. A Probabilistic Software System Attribute Acceptance Paradigm for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2005-01-01

    Standard software requirement formats are written from top-down perspectives only, that is, from an ideal notion of a client's needs. Despite the exactness of the standard format, software and system errors in designed systems have abounded. Bad and inadequate requirements have resulted in cost overruns, schedule slips and lost profitability. Commercial off-the-shelf (COTS) software components are even more troublesome than designed systems because they are often provided "as is" and subsequently delivered with unsubstantiated validation of described capabilities. For COTS software, there needs to be a way to express the client's software needs in a consistent and formal manner using software system attributes derived from software quality standards. Additionally, the format needs to be amenable to software evaluation processes that integrate observable evidence garnered from historical data. This paper presents a paradigm that effectively bridges the gap between what a client desires (top-down) and what has been demonstrated (bottom-up) for COTS software evaluation. The paradigm addresses the specification of needs before the software evaluation is performed and can be used to increase the shared understanding between clients and software evaluators about what is required and what is technically possible.

  10. KidTools: Self-Management, Problem-Solving, Organizational, and Planning Software for Children and Teachers

    ERIC Educational Resources Information Center

    Miller, Kevin J.; Fitzgerald, Gail E.; Koury, Kevin A.; Mitchem, Herine J.; Hollingsead, Candice

    2007-01-01

    This article provides an overview of KidTools, an electronic performance software system designed for elementary and middle school children to use independently on classroom or home computers. The software system contains 30 computerized research-based strategy tools that can be implemented in a classroom or home environment. Through the…

  11. Technology Pedagogy: Software Tools for Teaching and Learning

    ERIC Educational Resources Information Center

    Berry, James; Staub, Nancy

    2011-01-01

    Adoption of technology for teaching and learning is not as significant as the adoption and use of software that is used as a pedagogical extension of a teacher's approach to classroom instruction. It is the dynamic and integrated use of software that extends the pedagogical role of the teacher beyond the traditional lecture and discussion format.…

  12. Using Commercial Off-the-Shelf Software Tools for Space Shuttle Scientific Software

    NASA Technical Reports Server (NTRS)

    Groleau, Nicolas; Friedland, Peter (Technical Monitor)

    1994-01-01

    In October 1993, the Astronaut Science Advisor (ASA) was on board the STS-58 flight of the space shuttle. ASA is an interactive system providing data acquisition and analysis, experiment step re-scheduling, and various other forms of reasoning. As fielded, the system runs on a single Macintosh PowerBook 170, which hosts the six ASA modules. There is one other piece of hardware, an external analog-to-digital converter (GW Instruments, Somerville, Massachusetts) connected to the PowerBook's SCSI port. Three main software tools were used: LabVIEW, CLIPS, and HyperCard. First, a module written in LabVIEW (National Instruments, Austin, Texas) controls the A/D conversion and stores the resulting data in appropriate arrays. This module also analyzes the numerical data to produce a small set of characteristic numbers or symbols describing the results of an experiment trial. Second, a forward-chaining inference system written in CLIPS (NASA) uses the symbolic information provided by the first stage with a static rule base to infer decisions about the experiment. This expert system shell is used by the system for diagnosis. The third component of the system is the user interface, written in HyperCard (Claris Inc. and Apple Inc., both in Cupertino, California).

  13. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-01-01

    Digital Tomosynthesis (DTS) is an image modality for reconstructing tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose in data acquisition. It is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, unlike conventional CBCT, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type of DTS is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS), which represents the real patient position in the treatment room. The other type of DTS is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS), which represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are reconstructed from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the matching between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the matching in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented on GPU environments for acceleration purposes. A comprehensive evaluation of this software tool was performed, including geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS generated by GPU-based algorithm
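
    The registration steps described above amount to estimating 2D translations between RDTS and ODTS views. One common way to do this, shown in the sketch below, is phase correlation: the translation appears as the peak of the inverse-transformed, magnitude-normalized cross-power spectrum. This is a generic technique offered for illustration, not necessarily the paper's actual matching method.

    ```python
    # Phase correlation: recover the 2D translation between two images.
    import numpy as np

    def estimate_shift(reference, onboard):
        """Translation (rows, cols) of `onboard` relative to `reference`."""
        cross = np.fft.fft2(onboard) * np.conj(np.fft.fft2(reference))
        cross /= np.abs(cross) + 1e-12           # keep phase only
        corr = np.fft.ifft2(cross).real          # delta-like peak at the shift
        peak = np.unravel_index(np.argmax(corr), corr.shape)
        # Map peaks in the upper half of each axis to negative shifts.
        return tuple(p - s if p > s // 2 else p for p, s in zip(peak, corr.shape))

    rng = np.random.default_rng(seed=1)
    ref = rng.random((64, 64))
    onboard = np.roll(ref, shift=(3, -5), axis=(0, 1))
    print(estimate_shift(ref, onboard))  # -> (3, -5)
    ```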

  14. Evaluation of the DSN software methodology

    NASA Technical Reports Server (NTRS)

    Irvine, A. P.; Mckenzie, M.

    1978-01-01

    The effects of the DSN software methodology, as implemented under the DSN Programming System, on the DSN Mark 3 Data Subsystems Implementation Project (MDS) are described. The software methodology is found to provide markedly increased visibility to management and to produce software of greater reliability at a small decrease in implementation cost. It is also projected that additional savings will result during the maintenance phase. Documentation support is identified as an area that is receiving further attention.

  15. Development and applications of a software tool for diarthrodial joint analysis.

    PubMed

    Martelli, Sandra; Lopomo, Nicola; Greggio, Samuele; Ferretti, Emil; Visani, Andrea

    2006-07-01

    This paper describes a new software environment for advanced analysis of diarthrodial joints. The new tool provides a number of elaboration functions to investigate joint kinematics, bone anatomy, and ligament and tendon properties. In particular, the shapes and the contact points of the articulating surfaces can be displayed and analysed through 2D user-defined sections and fittings (lines or conics). Ligament behaviour can be evaluated during joint movement through the computation of elongations, orientations, and fiber strain. Motion trajectories can also be analysed through the calculation of helical axes, instantaneous rotations, and displacements in specific user-chosen coordinate reference frames. The software has a user-friendly graphical interface to display four-dimensional data (time-space data) obtained from medical images, navigation systems, spatial linkages or digitizers, and can also generate printable reports and multiple graphs, as well as ASCII files that can be imported into spreadsheet programs such as Microsoft Excel. PMID:16777259

  16. Early Childhood Teacher Candidates Evaluate Computer Software for Young Children

    ERIC Educational Resources Information Center

    Aldrich, Jennifer

    2002-01-01

    The proliferation of computer software for young children necessitates that early childhood teachers have the knowledge and skills needed to evaluate and select developmentally appropriate software. This article describes the manner in which one university prepares early childhood teacher candidates to analyze software for the young children in…

  17. Air traffic management evaluation tool

    NASA Technical Reports Server (NTRS)

    Sridhar, Banavar (Inventor); Sheth, Kapil S. (Inventor); Chatterji, Gano Broto (Inventor); Bilimoria, Karl D. (Inventor); Grabbe, Shon (Inventor); Schipper, John F. (Inventor)

    2012-01-01

    Methods for evaluating and implementing air traffic management tools and approaches for managing and avoiding an air traffic incident before the incident occurs. A first system receives parameters for flight plan configurations (e.g., initial fuel carried, flight route, flight route segments followed, flight altitude for a given flight route segment, aircraft velocity for each flight route segment, flight route ascent rate, flight route descent rate, flight departure site, flight departure time, flight arrival time, flight destination site and/or alternate flight destination site), flight plan schedule, expected weather along each flight route segment, aircraft specifics, airspace (altitude) bounds for each flight route segment, and navigational aids available. The invention provides flight plan routing and direct routing or wind optimal routing, using great circle navigation and spherical Earth geometry. The invention provides for aircraft dynamics effects, such as wind effects at each altitude, altitude changes, airspeed changes and aircraft turns to provide predictions of aircraft trajectory (and, optionally, aircraft fuel use). A second system provides several aviation applications using the first system. Several classes of potential incidents are analyzed and averted by appropriate change en route of one or more parameters in the flight plan configuration, as provided by a conflict detection and resolution module and/or traffic flow management modules. These applications include conflict detection and resolution, miles-in-trail or minutes-in-trail aircraft separation, flight arrival management, flight re-routing, weather prediction and analysis and interpolation of weather variables based upon sparse measurements. The invention combines these features to provide an aircraft monitoring system and an aircraft user system that interact and negotiate changes with each other.
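
    The wind-effect step in such trajectory prediction is, at bottom, vector addition: the ground-speed vector is the air-speed vector plus the wind vector at the aircraft's altitude. A minimal sketch of this generic kinematics follows (not the patented system's code).

    ```python
    # Ground speed and track from airspeed, heading, and the wind vector.
    import math

    def ground_speed(airspeed_kt, heading_deg, wind_speed_kt, wind_to_deg):
        """Return (ground speed in kt, track in degrees from north)."""
        hdg = math.radians(heading_deg)
        wnd = math.radians(wind_to_deg)  # direction the wind blows TOWARD
        east = airspeed_kt * math.sin(hdg) + wind_speed_kt * math.sin(wnd)
        north = airspeed_kt * math.cos(hdg) + wind_speed_kt * math.cos(wnd)
        return math.hypot(east, north), math.degrees(math.atan2(east, north)) % 360

    # 450 kt cruise heading due east with a 60 kt direct tailwind
    print(ground_speed(450, 90, 60, 90))  # approximately (510.0, 90.0)
    ```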

  18. GMFilter and SXTestPlate: software tools for improving the SNPlex™ genotyping system

    PubMed Central

    Teuber, Markus; Wenz, Michael H; Schreiber, Stefan; Franke, Andre

    2009-01-01

    Background Genotyping of single-nucleotide polymorphisms (SNPs) is a fundamental technology in modern genetics. The SNPlex™ mid-throughput genotyping system (Applied Biosystems, Foster City, CA, USA) enables the multiplexed genotyping of up to 48 SNPs simultaneously in a single DNA sample. The high level of automation and the large amount of data produced in a high-throughput laboratory require advanced software tools for quality control and workflow management. Results We have developed two programs, which address two main aspects of quality control in a SNPlex™ genotyping environment: GMFilter improves the analysis of SNPlex™ plates by removing wells with a low overall signal intensity. It enables scientists to automatically process the raw data in a standardized way before analyzing a plate with the proprietary GeneMapper software from Applied Biosystems. SXTestPlate examines the genotype concordance of a SNPlex™ test plate, which was typed with a control SNP set. This program allows for regular quality control checks of a SNPlex™ genotyping platform. It is compatible with other genotyping methods as well. Conclusion GMFilter and SXTestPlate provide a valuable tool set for laboratories engaged in genotyping based on the SNPlex™ system. The programs enhance the analysis of SNPlex™ plates with the GeneMapper software and enable scientists to evaluate the performance of their genotyping platform. PMID:19267942
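
    GMFilter's core step, removing wells whose overall signal intensity is too low before genotype calling, can be conveyed with the sketch below; the data layout and threshold are hypothetical, not the program's actual file format.

    ```python
    # Drop low-intensity plate wells before downstream genotype analysis.
    def filter_wells(wells, min_intensity):
        """wells: dict mapping well ID -> summed signal intensity."""
        kept = {w: s for w, s in wells.items() if s >= min_intensity}
        removed = sorted(set(wells) - set(kept))
        return kept, removed

    plate = {"A01": 5400.0, "A02": 310.0, "A03": 4900.0, "B01": 120.0}
    kept, removed = filter_wells(plate, min_intensity=1000.0)
    print("kept:", sorted(kept), "removed:", removed)
    ```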

  19. Training Software Developers and Designers to Conduct Usability Evaluations

    ERIC Educational Resources Information Center

    Skov, Mikael Brasholt; Stage, Jan

    2012-01-01

    Many efforts to improve the interplay between usability evaluation and software development rely either on better methods for conducting usability evaluations or on better formats for presenting evaluation results in ways that are useful for software designers and developers. Both of these approaches depend on a complete division of work between…

  1. Using Colored Stochastic Petri Net (CS-PN) software for protocol specification, validation, and evaluation

    NASA Technical Reports Server (NTRS)

    Zenie, Alexandre; Luguern, Jean-Pierre

    1987-01-01

    The specification, verification, validation, and evaluation steps of the CS-PN software are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request or couple (transaction, granule) treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to deduce a verification, validation and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules and messages.

  2. Validation, Verification and Evaluation of Visualization Software: Position Statement

    NASA Technical Reports Server (NTRS)

    Globus, Al; Kutler, Paul (Technical Monitor)

    1998-01-01

    Visualization software needs rigorous verification in the form of much better testing, and experiments with human subjects are essential to scientifically validate and evaluate visualization techniques.

  3. An Approach to Building a Traceability Tool for Software Development

    NASA Technical Reports Server (NTRS)

    Delgado, Nelly; Watson, Tom

    1997-01-01

    It is difficult in a large, complex computer program to ensure that it meets the specified requirements. As the program evolves over time, all program constraints originally elicited during the requirements phase must be maintained. In addition, during the life cycle of the program, requirements typically change and the program must consistently reflect those changes. Imagine the following scenario. Company X wants to develop a system to automate its assembly line. With such a large system, there are many different stakeholders, e.g., managers, experts such as industrial and mechanical engineers, and end-users. Requirements would be elicited from all of the stakeholders involved in the system, with each stakeholder contributing their point of view to the requirements. For example, some of the requirements provided by an industrial engineer may concern the movement of parts through the assembly line. A point of view provided by the electrical engineer may be reflected in constraints concerning maximum power usage. End-users may be concerned with comfort and safety issues, whereas managers are concerned with the efficiency of the operation. With so many points of view affecting the requirements, it is difficult to manage them and communicate information to relevant stakeholders, and it is likely that conflicts in the requirements will arise. In the coding process, the implementors will make additional assumptions and interpretations on the design and the requirements of the system. During any stage of development, stakeholders may request that a requirement be added or changed. In such a dynamic environment, it is difficult to guarantee that the system will preserve the current set of requirements. Tracing, the mapping between objects in the artifacts of the system being developed, addresses this issue. Artifacts encompass documents such as the system definition, interview transcripts, memoranda, the software requirements specification, user's manuals, the functional

  4. BYMUR software: a free and open source tool for quantifying and visualizing multi-risk analyses

    NASA Astrophysics Data System (ADS)

    Tonini, Roberto; Selva, Jacopo

    2013-04-01

    instance, the global standards defined in the frame of the GEM project for seismic hazard and risk) will grant interoperability with other FOSS software and tools and, at the same time, keep it readily available to the geo-scientific community. An already available example of connection is represented by the BET_VH(**) tool, whose probabilistic volcanic hazard outputs will be used as input for BYMUR. Finally, the prototype version of BYMUR will be applied to the case study of the municipality of Naples, considering three different natural hazards (volcanic eruptions, earthquakes and tsunamis) and assessing the consequent long-term risk. (**)BET_VH (Bayesian Event Tree for Volcanic Hazard) is a probabilistic tool for long-term volcanic hazard assessment, recently re-designed and adjusted to run on the Vhub cyber-infrastructure, a free web-based collaborative tool in volcanology research (see http://vhub.org/resources/betvh).

  5. Open source tools for ATR development and performance evaluation

    NASA Astrophysics Data System (ADS)

    Baumann, James M.; Dilsavor, Ronald L.; Stubbles, James; Mossing, John C.

    2002-07-01

    Early in almost every engineering project, a decision must be made about tools: should I buy off-the-shelf tools or should I develop my own? Either choice can involve significant cost and risk. Off-the-shelf tools may be readily available, but they can be expensive to purchase and to license, and may not be flexible enough to satisfy all project requirements. On the other hand, developing new tools permits great flexibility, but it can be time- (and budget-) consuming, and the end product still may not work as intended. Open source software has the advantages of both approaches without many of the pitfalls. This paper examines the concept of open source software, including its history, unique culture, and informal yet closely followed conventions. These characteristics influence the quality and quantity of software available, and ultimately its suitability for serious ATR development work. We give an example where Python, an open source scripting language, and OpenEV, a viewing and analysis tool for geospatial data, have been incorporated into ATR performance evaluation projects. While this case highlights the successful use of open source tools, we also offer important insight into risks associated with this approach.

  6. Fuzzy cognitive map software tool for treatment management of uncomplicated urinary tract infection.

    PubMed

    Papageorgiou, Elpiniki I

    2012-03-01

    Uncomplicated urinary tract infection (uUTI) is a bacterial infection that affects individuals whose urinary tracts are normal from both a structural and a functional perspective. Suggesting appropriate antibiotics and treatment to individuals suffering from uUTI is an important and complex task that demands special attention. How to decrease the unsafe use of antibiotics and their consumption is an important issue in medical treatment. Aiming to model medical decision making for uUTI treatment, an innovative and flexible approach called fuzzy cognitive maps (FCMs) is proposed to handle uncertainty and missing information. The FCM is a promising technique for modeling knowledge and/or medical guidelines/treatment suggestions and reasoning with them. A software tool, namely FCM-uUTI DSS, is investigated in this work to produce a decision support module for uUTI treatment management. The software tool was tested (evaluated) on 38 patient cases, showing its functionality and demonstrating that the use of FCMs as dynamic models is reliable. The results have shown that the suggested FCM-uUTI tool gives a front-end decision on antibiotic suggestions for uUTI treatment, and its outputs are considered helpful references for physicians and patients. Due to its easy graphical representation and simulation process, the proposed FCM formalization could be used to make medical knowledge widely available through computer consultation systems. PMID:22001398
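
    FCM inference itself is compact: concept activations are propagated through a causal weight matrix and squashed by a sigmoid until the map settles. The sketch below shows the generic update rule on a toy three-concept map; the weights are illustrative, not the paper's uUTI model.

    ```python
    # Generic fuzzy cognitive map inference with a sigmoid transfer function.
    import numpy as np

    def fcm_infer(weights, state, steps=50, lam=1.0):
        """Iterate A <- sigmoid(A + A @ W); weights[i, j] is the influence of concept i on j."""
        for _ in range(steps):
            state = 1.0 / (1.0 + np.exp(-lam * (state + state @ weights)))
        return state

    # Toy three-concept map: symptom -> infection severity -> antibiotic need
    W = np.array([[0.0, 0.7, 0.0],
                  [0.0, 0.0, 0.8],
                  [0.0, 0.0, 0.0]])
    print(fcm_infer(W, np.array([0.9, 0.0, 0.0])))
    ```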

  7. Development of Automatic Testing Tool for `Design & Coding Standard' for Railway Signaling Software

    NASA Astrophysics Data System (ADS)

    Hwang, Jong-gyu; Jo, Hyun-jeong

    2009-08-01

    In accordance with the development of recent computer technology, the dependency of railway signaling systems on computer software continues to increase, and accordingly, testing for the safety and reliability of railway signaling system software has become more important. This paper suggests an automated testing tool for coding rules for railway signaling system software and presents its implementation results. The testing items in the implemented tool refer to the international standards for railway system software and the MISRA-C standard. This automated testing tool can also be utilized at the assessment stage for railway signaling systems, and it is anticipated that it will be useful at the software development stage as well.
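
    The flavor of such an automated coding-rule check can be conveyed with a toy example: scanning C source lines for library calls that MISRA-C discourages. The rule subset below is a hypothetical illustration, far simpler than the actual tool.

    ```python
    # Toy coding-rule checker: flag discouraged C library calls.
    import re

    BANNED_CALLS = {"gets", "malloc", "free", "longjmp"}  # illustrative subset only
    CALL_RE = re.compile(r"\b([A-Za-z_]\w*)\s*\(")

    def check_source(lines):
        """Yield (line number, name) for every banned call found."""
        for lineno, line in enumerate(lines, start=1):
            for name in CALL_RE.findall(line):
                if name in BANNED_CALLS:
                    yield lineno, name

    code = ["char buf[32];", "gets(buf);", "int *p = malloc(8);"]
    for lineno, name in check_source(code):
        print(f"line {lineno}: banned call '{name}'")
    ```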

  8. Evaluation of air pollution modelling tools as environmental engineering courseware.

    PubMed

    Souto González, J A; Bello Bugallo, P M; Casares Long, J J

    2004-01-01

    The study of phenomena related to the dispersion of pollutants usually takes advantage of mathematical models based on the description of the different processes involved. This educational approach is especially important in air pollution dispersion, where the processes follow non-linear behaviour, making it difficult to understand the relationships between inputs and outputs, and in a 3D context where it becomes hard to analyze alphanumeric results. In this work, three different software tools, serving as computer solvers for typical air pollution dispersion phenomena, are presented. Each software tool, developed to be implemented on PCs, follows an approach that represents one of three generations of programming languages (Fortran 77, Visual Basic and Java), applied over three different environments: MS-DOS, MS Windows and the world wide web. The software tools were tested by students of environmental engineering (undergraduate) and chemical engineering (postgraduate), in order to evaluate the ability of these software tools to improve both theoretical and practical knowledge of the air pollution dispersion problem, and the impact of the different environments on the learning process in terms of content, ease of use and visualization of results. PMID:15193095
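
    The canonical model behind such courseware is the Gaussian plume equation, C = Q/(2π u σy σz) · exp(−y²/2σy²) · [exp(−(z−H)²/2σz²) + exp(−(z+H)²/2σz²)]. A sketch with hypothetical dispersion coefficients follows; real courseware would derive σy and σz from atmospheric stability class and downwind distance rather than take them as inputs.

    ```python
    # Gaussian plume concentration with total ground reflection (textbook form).
    import math

    def plume_concentration(q_g_s, u_m_s, sigma_y, sigma_z, y_m, z_m, h_m):
        """Concentration (g/m^3) at crosswind offset y and height z, for a source
        of strength Q (g/s) at effective stack height H, wind speed u (m/s)."""
        lateral = math.exp(-y_m ** 2 / (2 * sigma_y ** 2))
        vertical = (math.exp(-(z_m - h_m) ** 2 / (2 * sigma_z ** 2))
                    + math.exp(-(z_m + h_m) ** 2 / (2 * sigma_z ** 2)))  # ground reflection
        return q_g_s / (2 * math.pi * u_m_s * sigma_y * sigma_z) * lateral * vertical

    # 100 g/s source, 5 m/s wind, sigma_y = 80 m and sigma_z = 40 m (assumed values)
    print(plume_concentration(100, 5.0, 80, 40, y_m=0, z_m=0, h_m=50))
    ```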

  9. A new microcomputer software system evaluation paradigm: the medical perspective.

    PubMed

    Kokol, P

    1991-08-01

    New fourth-generation software tools have enormously eased the burden of computing for users, but they have also created a confusing, difficult problem in software evaluation and selection. It is therefore argued that a sound and complete evaluation paradigm is a key element in an efficient and effective software system design and use process. While much has been written about software evaluation in general, in the medical field the guidance and recommendations previously provided are too general to be of practical use. Considering also the other weaknesses of conventional evaluation paradigms, we have decided to develop a more adequate one with a solid theoretical framework, specific guidance, a strict and well-defined taxonomic space, and a fair ranking approach. In the present paper we therefore introduce our new evaluation paradigm and show its applicability to the evaluation and selection of medical software systems according to their usability. PMID:1800598

  10. OpenROCS: a software tool to control robotic observatories

    NASA Astrophysics Data System (ADS)

    Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere

    2012-09-01

    We present the Open Robotic Observatory Control System (OpenROCS), an open source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the necessary processes to implement responses to the system events that appear in the routine and non-routine operations associated with data-flow and housekeeping control. The OpenROCS software design and implementation provide a high flexibility to be adapted to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).

  11. Software Tool Support to Specify and Verify Scientific Sensor Data Properties to Improve Anomaly Detection

    NASA Astrophysics Data System (ADS)

    Gallegos, I.; Gates, A. Q.; Tweedie, C.; Cybershare

    2010-12-01

    Advancements in scientific sensor data acquisition technologies, such as wireless sensor networks and robotic trams equipped with sensors, are increasing the amount of data being collected at field sites. This elevates the challenges of verifying the quality of streamed data and monitoring the correct operation of the instrumentation. Without the ability to evaluate the data collection process in near real time, scientists can lose valuable time and data. In addition, scientists have to rely on their knowledge and experience in the field to evaluate data quality. Such knowledge is rarely shared or reused by other scientists, mostly because of the lack of a well-defined methodology and tool support. Numerous scientific projects address anomaly detection, mostly as part of the verification system's source code; however, anomaly detection properties, which are often embedded or hard-coded in the source code, are difficult to refine. In addition, a software developer is required to modify the source code every time a new anomaly detection property or a modification to an existing property is needed. This poster describes the tool support that has been developed, based on software engineering techniques, to address these challenges. The overall tool support allows scientists to specify and reuse anomaly detection properties generated using the specification tool and to use the specified properties to conduct automated anomaly detection in near real time. The anomaly-detection mechanism is independent of the system used to collect the sensor data. With guidance provided by a classification and categorization of anomaly-detection properties, the user specifies properties on scientific sensor data. The properties, which can be associated with particular field sites or instrumentation, document knowledge about data anomalies that would otherwise have limited availability to the scientific community.
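
    A minimal sketch of what declarative anomaly-detection properties over a sensor stream can look like, assuming two invented property kinds (a physical range check and a maximum-step, i.e. spike, check); it illustrates the general idea only, not the specification language the poster describes.

    ```python
    # Minimal declarative anomaly-detection properties over a sensor stream;
    # property kinds, thresholds, and readings are invented for illustration.
    from dataclasses import dataclass

    @dataclass
    class Property:
        sensor: str
        kind: str                 # "range" or "max_step"
        low: float = float("-inf")
        high: float = float("inf")
        max_step: float = float("inf")

    def check(stream, props):
        """Yield (index, property, value) for each reading violating a property."""
        last = {}
        for i, (sensor, value) in enumerate(stream):
            for p in (p for p in props if p.sensor == sensor):
                if p.kind == "range" and not (p.low <= value <= p.high):
                    yield i, p, value
                elif (p.kind == "max_step" and sensor in last
                      and abs(value - last[sensor]) > p.max_step):
                    yield i, p, value
            last[sensor] = value

    props = [
        Property("air_temp", "range", low=-40.0, high=55.0),  # physical limits
        Property("air_temp", "max_step", max_step=5.0),       # spike detector
    ]
    stream = [("air_temp", 21.3), ("air_temp", 21.5), ("air_temp", 63.0)]
    for i, p, v in check(stream, props):
        print(f"reading {i}: {p.sensor}={v} violates {p.kind}")
    ```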

  12. Klonos: A Similarity Analysis Based Tool for Software Porting

    Energy Science and Technology Software Center (ESTSC)

    2014-07-30

    Klonos is a compiler-based tool that helps users port scientific applications. The tool is based on similarity analysis performed with the help of the OpenUH compiler (a branch of the Open64 compiler). It combines syntactic metrics and cost-model-provided metrics to cluster subroutines that are similar enough to be ported in the same way. The generated porting plan allows programmers and compilers to reuse porting experience as much as possible during the porting process.
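
    The following sketch illustrates the general similarity-clustering idea in Python under invented metrics (loop-nest count, call-site count, and an estimated cost standing in for Klonos's compiler-derived syntactic and cost-model metrics); it is not the tool's own algorithm.

    ```python
    # Cluster subroutines by feature similarity so that members of a cluster
    # can share one porting strategy; the features are invented stand-ins.
    import numpy as np
    from scipy.cluster.hierarchy import fcluster, linkage
    from scipy.spatial.distance import pdist

    routines = ["solve", "flux", "io_dump", "halo_exchange"]
    features = np.array([      # columns: [loop_nests, callsites, est_cost]
        [3, 5, 9.1],
        [3, 4, 8.7],
        [0, 12, 1.2],
        [1, 8, 3.5],
    ])

    z = (features - features.mean(0)) / features.std(0)   # normalize columns
    labels = fcluster(linkage(pdist(z), method="average"),
                      t=1.5, criterion="distance")
    for name, lab in zip(routines, labels):
        print(f"{name}: cluster {lab}")   # similar routines -> same plan
    ```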

  13. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  14. RFcap: a software analysis tool for multichannel cochlear implant signals.

    PubMed

    Lai, Wai Kong; Dillier, Norbert

    2013-03-01

    Being able to display and analyse the output of a speech processor that encodes the parameters of complex stimuli to be presented by a cochlear implant (CI) is useful for software and hardware development as well as for diagnostic purposes. This requires, first, appropriate hardware able to receive and decode the radio frequency (RF)-coded signals and, second, suitable software to process the decoded data. The PCI-IF6 clinical hardware for the Nucleus CI system, together with the Nucleus Implant Communicator and Nucleus Matlab Toolbox research software libraries, provides the necessary functionality. RFcap is a standalone Matlab application that encapsulates the relevant functions to capture, display, and analyse the RF-coded signals intended for the Nucleus CI24M/R, CI24RE, and CI500 multichannel CIs. PMID:21762546

  15. Software engineering capability for Ada (GRASP/Ada Tool)

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif-compliant graphical user interface has been developed for the GRASP/Ada prototype.

  16. Software Validation, Verification, and Testing Technique and Tool Reference Guide. Final Report.

    ERIC Educational Resources Information Center

    Powell, Patricia B., Ed.

    Intended as an aid in the selection of software techniques and tools, this document contains three sections: (1) a suggested methodology for the selection of validation, verification, and testing (VVT) techniques and tools; (2) summary matrices by development phase usage, a table of techniques and tools with associated keywords, and an…

  17. Toward Evaluating Software According to Principles of Learning and Teaching.

    ERIC Educational Resources Information Center

    Shuell, Thomas J.; Schueckler, Linda M.

    1989-01-01

    Describes study that evaluated 16 software packages representing all grade levels against 19 criteria based on principles of effective learning and teaching. Implications of the results for the development of effective instructional software and its use are discussed, and the evaluation form used is included in the appendix. (21 references) (LRW)

  18. Computer Software for Teaching Basic Skills to Adults. An Evaluation.

    ERIC Educational Resources Information Center

    Montana State Univ., Bozeman. Center for Community Education.

    This color-coded guide/catalog was prepared as a resource for adult educators through a Montana project that evaluated computer software for teaching basic skills to adults. The guide is divided into three parts. Part I consists of the results of the assessment and evaluation of 119 pieces of software currently being used at 16 adult basic…

  19. A Model for Evaluating and Acquiring Educational Software in Psychology.

    ERIC Educational Resources Information Center

    Brown, Stephen W.; And Others

    This paper describes a model for evaluating and acquiring instructionally effective and cost effective educational computer software in university psychology departments. Four stages in evaluating the software are developed: (1) establishing departmental goals and objectives for educational use of computers; (2) inventorying and evaluating…

  20. Learning Content and Software Evaluation and Personalisation Problems

    ERIC Educational Resources Information Center

    Kurilovas, Eugenijus; Serikoviene, Silvija

    2010-01-01

    The paper aims to analyse several scientific approaches how to evaluate, implement or choose learning content and software suitable for personalised users/learners needs. Learning objects metadata customisation method as well as the Method of multiple criteria evaluation and optimisation of learning software represented by the experts' additive…

  1. Using CONFIG for Simulation of Operation of Water Recovery Subsystems for Advanced Control Software Evaluation

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Flores, Luis; Fleming, Land; Throop, Daiv

    2002-01-01

    A hybrid discrete/continuous simulation tool, CONFIG, has been developed to support evaluation of the operability of life support systems. CONFIG simulates operations scenarios in which flows and pressures change continuously while system reconfigurations occur as discrete events. In simulations, intelligent control software can interact dynamically with hardware system models. CONFIG simulations have been used to evaluate control software and intelligent agents for automating life support system operations. A CONFIG model of an advanced biological water recovery system has been developed to interact with intelligent control software that is being used in a water system test at NASA Johnson Space Center.
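
    A toy Python sketch of the hybrid discrete/continuous style of simulation described above: a tank level evolves continuously while a controller toggles a valve as a discrete reconfiguration event. All dynamics, rates, and thresholds are invented; CONFIG itself is far richer.

    ```python
    # Toy hybrid simulation: continuous tank dynamics + discrete valve events.
    def simulate(t_end=120.0, dt=0.5):
        level, valve_open = 50.0, True
        inflow, outflow = 2.0, 1.2            # rates per unit time (invented)
        t = 0.0
        while t < t_end:
            # continuous update of the flow state
            level += ((inflow if valve_open else 0.0) - outflow) * dt
            level = max(level, 0.0)
            # discrete reconfiguration events with a hysteresis band
            if valve_open and level >= 80.0:
                valve_open = False
                print(f"t={t:6.1f}: valve CLOSED at level {level:.1f}")
            elif not valve_open and level <= 40.0:
                valve_open = True
                print(f"t={t:6.1f}: valve OPEN   at level {level:.1f}")
            t += dt

    simulate()
    ```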

  2. Evaluating a Multimedia Authoring Tool.

    ERIC Educational Resources Information Center

    John, Bonnie E.; Mashyna, Matthew M.

    1997-01-01

    Presents a case study of a computer scientist learning and using the Cognitive Walkthrough (CW) technique to assess a multimedia authoring tool. Compares predictions by the analysis to the usability problems found in empirical usability tests. Presents several hypotheses about the cause of low effectiveness, which suggest that additional…

  3. Calico: An Early-Phase Software Design Tool

    ERIC Educational Resources Information Center

    Mangano, Nicolas Francisco

    2013-01-01

    When developers are faced with a design challenge, they often turn to the whiteboard. This is typical during the conceptual stages of software design, when no code is in existence yet. It may also happen when a significant code base has already been developed, for instance, to plan new functionality or discuss optimizing a key component. While…

  4. A Study of Collaborative Software Development Using Groupware Tools

    ERIC Educational Resources Information Center

    Defranco-Tommarello, Joanna; Deek, Fadi P.

    2005-01-01

    The experimental results of a collaborative problem solving and program development model that takes into consideration the cognitive and social activities that occur during software development is presented in this paper. This collaborative model is based on the Dual Common Model that focuses on individual cognitive aspects of problem solving and…

  5. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks were conducted, some observations were obtained, and several possible suggestions were contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering, along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors is also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  6. AWG-Parameters: new software tool to design arrayed waveguide gratings

    NASA Astrophysics Data System (ADS)

    Seyringer, D.; Bielik, M.

    2013-03-01

    A new software tool and its application in the design of optical multiplexers/demultiplexers based on arrayed waveguide gratings is presented. The motivation for this work is the fact that when designing arrayed waveguide gratings, a set of geometrical parameters must first be calculated. These parameters are the input for the AWG layout that will be created and simulated using commercial photonic design tools. It is important to point out that these parameters strongly influence correct AWG demultiplexing properties and therefore have to be calculated very carefully. However, most commercial photonic design tools do not support this fundamental calculation. To be able to design any AWG with any software tool, and particularly to save the time needed for AWG design, a new software tool was developed. The tool has already been applied in various AWG designs and is also technologically well proven.

  7. A software tool for automatic classification and segmentation of 2D/3D medical images

    NASA Astrophysics Data System (ADS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-02-01

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of the acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aiming at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.
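
    As a generic illustration of texture quantification of the kind MaZda automates (not MaZda's own feature set), the sketch below computes a few gray-level co-occurrence features with scikit-image, assuming a recent version that provides graycomatrix/graycoprops.

    ```python
    # Generic gray-level co-occurrence texture features for an image ROI.
    import numpy as np
    from skimage.feature import graycomatrix, graycoprops

    rng = np.random.default_rng(0)
    roi = rng.integers(0, 32, size=(64, 64), dtype=np.uint8)  # stand-in ROI

    glcm = graycomatrix(roi, distances=[1], angles=[0, np.pi / 2],
                        levels=32, symmetric=True, normed=True)
    for prop in ("contrast", "homogeneity", "energy", "correlation"):
        print(prop, float(graycoprops(glcm, prop).mean()))  # mean over angles
    ```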

  8. A Software Tool for Processing the Displacement Time Series Extracted from Raw Radar Data

    SciTech Connect

    Coppi, Francesco; Paolo Ricci, Pier; Gentile, Carmelo

    2010-05-28

    The application of high-resolution radar waveforms and interferometric principles recently led to the development of a microwave interferometer suitable for simultaneously measuring the (static or dynamic) deflection of several points on a large structure. From the technical standpoint, the sensor is a Stepped Frequency Continuous Wave (SF-CW) coherent radar operating in the Ku frequency band. In the paper, the main procedures adopted to extract the deflection time series from raw radar data and to assess the quality of the data are addressed, and the MATLAB toolbox developed is described. Subsequently, other functions implemented in the software tool (e.g. evaluation of the spectral matrix of the deflection time histories, identification of natural frequencies, and evaluation of operational mode shapes) are described, and the application to data recorded on full-scale bridges is exemplified.
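
    One of the post-processing steps named above, identification of natural frequencies, can be illustrated with a generic spectral peak-picking sketch; the synthetic two-mode signal stands in for a measured deflection time history, and none of this reflects the toolbox's actual code.

    ```python
    # Estimate natural frequencies by peak-picking the Welch spectrum of a
    # deflection time history; the two-mode synthetic signal is invented.
    import numpy as np
    from scipy.signal import find_peaks, welch

    fs = 100.0                                   # sampling rate, Hz (assumed)
    t = np.arange(0, 60, 1 / fs)
    y = (0.8 * np.sin(2 * np.pi * 2.1 * t)      # 2.1 Hz mode
         + 0.3 * np.sin(2 * np.pi * 5.4 * t)    # 5.4 Hz mode
         + 0.05 * np.random.default_rng(1).standard_normal(t.size))

    f, pxx = welch(y, fs=fs, nperseg=2048)
    peaks, _ = find_peaks(pxx, height=0.01 * pxx.max())
    print("estimated natural frequencies (Hz):", np.round(f[peaks], 2))
    ```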

  9. Planning Tool for Strategic Evaluation of Facility Plans - 13570

    SciTech Connect

    Magoulas, Virginia; Cercy, Michael; Hall, Irin

    2013-07-01

    Savannah River National Laboratory (SRNL) has developed a strategic planning tool for the evaluation of the utilization of its unique resources for processing and research and development of nuclear materials. The Planning Tool is a strategic level tool for assessing multiple missions that could be conducted utilizing the SRNL facilities and showcasing the plan. Traditional approaches using standard scheduling tools and laying out a strategy on paper tended to be labor intensive and offered either a limited or cluttered view for visualizing and communicating results. A tool that can assess the process throughput, duration, and utilization of the facility was needed. SRNL teamed with Newport News Shipbuilding (NNS), a division of Huntington Ingalls Industries, to create the next generation Planning Tool. The goal of this collaboration was to create a simulation based tool that allows for quick evaluation of strategies with respect to new or changing missions, and clearly communicates results to the decision makers. This tool has been built upon a mature modeling and simulation software previously developed by NNS. The Planning Tool provides a forum for capturing dependencies, constraints, activity flows, and variable factors. It is also a platform for quickly evaluating multiple mission scenarios, dynamically adding/updating scenarios, generating multiple views for evaluating/communicating results, and understanding where there are areas of risks and opportunities with respect to capacity. The Planning Tool that has been developed is useful in that it presents a clear visual plan for the missions at the Savannah River Site (SRS). It not only assists in communicating the plans to SRS corporate management, but also allows the area stakeholders a visual look at the future plans for SRS. The design of this tool makes it easily deployable to other facility and mission planning endeavors. (authors)

  10. Software development tools for the CDF MX scanner

    SciTech Connect

    Stuermer, W.; Turner, K.; Littleton-Sestini, S.

    1991-11-01

    This paper discusses the design of the high-level assembler and diagnostic control program developed for the MX, a high-speed, custom-designed computer used in the CDF data acquisition system at Fermilab. These programs provide a friendly, productive environment for the development of software on the MX. Details of their implementation and special features, and some of the lessons learned during their development, are included.

  11. Initial evaluation of automated treatment planning software.

    PubMed

    Gintz, Dawn; Latifi, Kujtim; Caudell, Jimmy; Nelms, Benjamin; Zhang, Geoffrey; Moros, Eduardo; Feygelman, Vladimir

    2016-01-01

    Even with advanced inverse-planning techniques, radiation treatment plan optimization remains a very time-consuming task with great output variability, which prompted the development of more automated approaches. One commercially available technique mimics the actions of experienced human operators to progressively guide the traditional optimization process with automatically created regions of interest and associated dose-volume objectives. We report on the initial evaluation of this algorithm on 10 challenging cases of locoregionally advanced head and neck cancer. All patients were treated with VMAT to 70 Gy to the gross disease and 56 Gy to the elective bilateral nodes. The results of post-treatment autoplanning (AP) were compared to the original human-driven plans (HDP). We used an objective scoring system based on defining a collection of specific dosimetric metrics and corresponding numeric score functions for each. Five AP techniques with different input dose goals were applied to all patients. The best of them averaged a composite score 8% lower than the HDP across the patient population. The difference in median values was statistically significant at the 95% confidence level (Wilcoxon paired signed-rank test p = 0.027). This result reflects the premium the institution places on dose homogeneity, which was consistently higher with the HDPs. The OAR sparing was consistently better with the APs, the differences reaching statistical significance for the mean doses to the parotid glands (p < 0.001) and the inferior pharyngeal constrictor (p = 0.016), as well as for the maximum doses to the spinal cord (p = 0.018) and brainstem (p = 0.040). If one is prepared to accept the less stringent dose homogeneity criteria from the RTOG 1016 protocol, nine APs would comply with the protocol while providing lower OAR doses than the HDPs. Overall, AP is a promising clinical tool, but it could benefit from a better process for shifting the balance between target dose homogeneity and OAR sparing.
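
    The paired statistical comparison reported above can be reproduced in outline with SciPy's Wilcoxon signed-rank test; the composite scores below are fabricated, and only the test procedure mirrors the abstract.

    ```python
    # Paired signed-rank comparison of composite plan-quality scores
    # (fabricated numbers; one score per patient for each planning method).
    import numpy as np
    from scipy.stats import wilcoxon

    hdp = np.array([142, 138, 150, 129, 144, 137, 148, 131, 140, 135])  # human
    ap = np.array([131, 129, 139, 120, 133, 128, 136, 119, 130, 124])   # auto

    stat, p = wilcoxon(hdp, ap)        # paired test across the 10 patients
    print(f"W = {stat}, p = {p:.3f}")  # p < 0.05 -> significant median shift
    ```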

  12. Computer Aided Learning of Mathematics: Software Evaluation

    ERIC Educational Resources Information Center

    Yushau, B.; Bokhari, M. A.; Wessels, D. C. J.

    2004-01-01

    Computer Aided Learning of Mathematics (CALM) has been in use for some time in the Prep-Year Mathematics Program at King Fahd University of Petroleum & Minerals. Different kinds of software (both locally designed and imported) have been used in the quest of optimizing the recitation/problem session hour of the mathematics classes. This paper…

  13. An Overview of Public Access Computer Software Management Tools for Libraries

    ERIC Educational Resources Information Center

    Wayne, Richard

    2004-01-01

    An IT decision maker gives an overview of public access PC software that's useful in controlling session length and scheduling, Internet access, print output, security, and the latest headaches: spyware and adware. In this article, the author describes a representative sample of software tools in several important categories such as setup…

  14. Slower Algebra Students Meet Faster Tools: Solving Algebra Word Problems with Graphing Software

    ERIC Educational Resources Information Center

    Yerushalmy, Michal

    2006-01-01

    The article discusses the ways that less successful mathematics students used graphing software with capabilities similar to a basic graphing calculator to solve algebra problems in context. The study is based on interviewing students who learned algebra for 3 years in an environment where software tools were always present. We found differences…

  15. Designing and Using Software Tools for Educational Purposes: FLAT, a Case Study

    ERIC Educational Resources Information Center

    Castro-Schez, J. J.; del Castillo, E.; Hortolano, J.; Rodriguez, A.

    2009-01-01

    Educational software tools are considered to enrich teaching strategies, providing a more compelling means of exploration and feedback than traditional blackboard methods. Moreover, software simulators provide a more motivating link between theory and practice than pencil-paper methods, encouraging active and discovery learning in the students.…

  16. Evaluating School Counseling Websites: An Evaluation Tool

    ERIC Educational Resources Information Center

    Reynolds, Glenda P.; Kitchens, Helen

    2007-01-01

    The purpose of this paper is to describe the use of a webpage evaluation for embedding technology in classes for teaching school counseling and counseling program development. The instructors created the Website Evaluation Form to help students recognize qualities of webpages that would enhance the school counseling program, broaden their…

  17. Evaluating uncertainty in integrated environmental models: A review of concepts and tools

    NASA Astrophysics Data System (ADS)

    Matott, L. Shawn; Babendreier, Justin E.; Purucker, S. Thomas

    2009-06-01

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with "standard" definitions are provided in the context of integrated approaches to environmental assessment. These constructs provide a reference point for cataloging 65 different model evaluation tools. Each tool is described briefly (in the auxiliary material) and is categorized for applicability across seven thematic model evaluation methods. Ratings for citation count and software availability are also provided, and a companion Web site containing download links for tool software is introduced. The paper concludes by reviewing strategies for tool interoperability and offers guidance for both practitioners and tool developers.

  18. Evaluation as a Tool for Improvement

    NASA Astrophysics Data System (ADS)

    Eisenhamer, B.; Donahue, M.

    1998-05-01

    The standard tools of evaluation will be applied to one long-running but not yet complete IDEAS program, the Women's Science Forum. We will present the methods of evaluation and show how the Principal Investigator would change her program based on input from the evaluation.

  19. Software tools for the analysis of video meteors emission spectra

    NASA Astrophysics Data System (ADS)

    Madiedo, J. M.; Toscano, F. M.; Trigo-Rodriguez, J. M.

    2011-10-01

    One of the goals of the SPanish Meteor Network (SPMN) is related to the study of the chemical composition of meteoroids by analyzing the emission spectra resulting from the ablation of these particles of interplanetary matter in the atmosphere. With this aim, some of the CCD video devices we employ to observe the night sky are endowed with holographic diffraction gratings, and a continuous monitoring of meteor activity is performed. We have recently developed new software to analyze these spectra. A description of this computer program is given, and some of the results obtained so far are presented here.

  20. Evaluation of selected environmental decision support software

    SciTech Connect

    Sullivan, T.M.; Moskowitz, P.D.; Gitten, M.

    1997-06-01

    Decision Support Software (DSS) continues to be developed to support analysis of decisions pertaining to environmental management. Decision support systems are computer-based systems that facilitate the use of data, models, and structured decision processes in decision making. The optimal DSS should attempt to integrate, analyze, and present environmental information to remediation project managers in order to select cost-effective cleanup strategies. The optimal system should have a balance between the sophistication needed to address the wide range of complicated sites and site conditions present at DOE facilities, and ease of use (e.g., the system should not require data that is typically unknown and should have robust error checking of problem definition through input, etc.). In the first phase of this study, an extensive review of the literature, the Internet, and discussions with sponsors and developers of DSS led to identification of approximately fifty software packages that met the preceding definition.

  1. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  2. Evaluation of competing software reliability predictions

    NASA Technical Reports Server (NTRS)

    Abdel-Ghaly, A. A.; Chan, P. Y.; Littlewood, B.

    1986-01-01

    Different software reliability models can produce very different answers when called upon to predict future reliability in a reliability growth context. Users need to know which, if any, of the competing predictions are trustworthy. Some techniques are presented which form the basis of a partial solution to this problem. Rather than attempting to decide which model is generally best, the approach adopted here allows a user to decide upon the most appropriate model for each application.
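
    As a much-simplified illustration of evaluating competing predictions (the paper's actual techniques, such as u-plots and prequential measures, are richer), the sketch below compares two naive one-step-ahead predictors of interfailure times by their mean absolute prediction error.

    ```python
    # Compare two naive one-step-ahead predictors of interfailure times by
    # mean absolute prediction error; the failure data are fabricated.
    import numpy as np

    gaps = np.array([3, 5, 4, 8, 7, 12, 10, 15, 14, 20], dtype=float)

    def pred_last(history):            # model 1: next gap = last observed gap
        return history[-1]

    def pred_trend(history):           # model 2: linear-trend extrapolation
        x = np.arange(len(history))
        slope, intercept = np.polyfit(x, history, 1)
        return slope * len(history) + intercept

    for name, model in [("last-value", pred_last), ("trend", pred_trend)]:
        errs = [abs(model(gaps[:i]) - gaps[i]) for i in range(3, len(gaps))]
        print(f"{name}: mean abs one-step error = {np.mean(errs):.2f}")
    ```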

  3. Screening and Evaluation Tool (SET) Users Guide

    SciTech Connect

    Layne Pincock

    2014-10-01

    This document is the users guide to using the Screening and Evaluation Tool (SET). SET is a tool for comparing multiple fuel cycle options against a common set of criteria and metrics. It does this using standard multi-attribute utility decision analysis methods.
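
    A bare-bones example of the standard multi-attribute utility scoring that SET automates: raw criterion values are normalized to [0, 1] utilities and combined with stakeholder weights. All criteria, weights, and values below are invented.

    ```python
    # Weighted additive multi-attribute utility scoring of candidate options;
    # criteria, weights, and raw values are invented (lower raw value = better).
    import numpy as np

    criteria = ["cost", "waste_volume", "proliferation_risk"]
    weights = np.array([0.5, 0.3, 0.2])          # stakeholder weights, sum to 1
    raw = np.array([
        [4.0, 2.5, 1.0],   # option A
        [3.2, 4.0, 2.0],   # option B
        [3.6, 3.0, 1.5],   # option C
    ])

    lo, hi = raw.min(0), raw.max(0)
    utility = (hi - raw) / (hi - lo)             # best -> 1, worst -> 0
    scores = utility @ weights
    for name, s in zip("ABC", scores):
        print(f"option {name}: utility {s:.2f}")
    ```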

  4. Automated Geospatial Watershed Assessment (AGWA) 3.0 Software Tool

    EPA Science Inventory

    The Automated Geospatial Watershed Assessment (AGWA) tool has been developed under an interagency research agreement between the U.S. Environmental Protection Agency, Office of Research and Development, and the U.S. Department of Agriculture, Agricultural Research Service. AGWA i...

  5. DECONV-TOOL: An IDL based deconvolution software package

    NASA Technical Reports Server (NTRS)

    Varosi, F.; Landsman, W. B.

    1992-01-01

    There are a variety of algorithms for deconvolution of blurred images, each having its own criteria or statistic to be optimized in order to estimate the original image data. Using the Interactive Data Language (IDL), we have implemented the Maximum Likelihood, Maximum Entropy, Maximum Residual Likelihood, and sigma-CLEAN algorithms in a unified environment called DeConv_Tool. Most of the algorithms have as their goal the optimization of statistics such as standard deviation and mean of residuals. Shannon entropy, log-likelihood, and chi-square of the residual auto-correlation are computed by DeConv_Tool for the purpose of determining the performance and convergence of any particular method and comparisons between methods. DeConv_Tool allows interactive monitoring of the statistics and the deconvolved image during computation. The final results, and optionally, the intermediate results, are stored in a structure convenient for comparison between methods and review of the deconvolution computation. The routines comprising DeConv_Tool are available via anonymous FTP through the IDL Astronomy User's Library.
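
    For orientation, the Maximum Likelihood method listed above is, for Poisson noise, commonly implemented as the textbook Richardson-Lucy iteration; a compact Python rendering (not DeConv_Tool's IDL code) follows, demonstrated on a blurred point source.

    ```python
    # Textbook Richardson-Lucy iteration, demonstrated on a blurred point source.
    import numpy as np
    from scipy.signal import fftconvolve

    def richardson_lucy(image, psf, iterations=25, eps=1e-12):
        estimate = np.full_like(image, image.mean(), dtype=float)
        psf_mirror = psf[::-1, ::-1]
        for _ in range(iterations):
            blurred = fftconvolve(estimate, psf, mode="same")
            ratio = image / np.maximum(blurred, eps)
            estimate *= fftconvolve(ratio, psf_mirror, mode="same")
        return estimate

    x, y = np.meshgrid(np.arange(33) - 16, np.arange(33) - 16)
    psf = np.exp(-(x**2 + y**2) / 8.0)
    psf /= psf.sum()                                  # normalized Gaussian PSF
    truth = np.zeros((33, 33))
    truth[16, 16] = 100.0                             # point source
    data = fftconvolve(truth, psf, mode="same")       # simulated blurred image
    restored = richardson_lucy(data, psf)
    print(f"peak: blurred {data.max():.2f} -> restored {restored.max():.2f}")
    ```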

  6. Evaluating software development characteristics: Assessment of software measures in the Software Engineering Laboratory. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1981-01-01

    Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics is discussed in terms of criteria achievements, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.

  7. Development of ShakeAlert Performance Evaluation Software

    NASA Astrophysics Data System (ADS)

    Maechling, P. J.; Liukis, M.; Jordan, T. H.; CISN EEW Team

    2011-12-01

    The CISN Testing Center (CTC) is designed to provide automated and interactive evaluations of ShakeAlert earthquake early warning (EEW) system performance. The CTC software consists of two main parts: (1) software programs that input ShakeAlert EEW performance reports, match ShakeAlert forecasts to observational data, and generate a variety of EEW system performance summaries, and (2) an automated testing framework that can input ShakeAlert EEW performance reports, retrieve ANSS observational data, and produce performance summaries on a daily, or per-event, basis. The interactive capabilities of the CTC software may be useful for offline testing of the ShakeAlert system. The automated capabilities of the CTC software are designed to support ongoing ShakeAlert performance evaluations. The CTC software implements a number of standard EEW performance summaries, including magnitude forecast error and location forecast error; evaluation of ShakeAlert ground motion forecasts, such as peak velocity, is under development. The CTC software is distributed as open-source scientific software to support transparency in evaluation processing and to support testing software re-use within ShakeAlert development groups.
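
    Two of the standard summaries named above, magnitude forecast error and location (epicentral) error, reduce to simple arithmetic once forecasts are matched to observed events; a sketch with invented catalog values, using a haversine great-circle distance:

    ```python
    # Magnitude and epicentral errors for matched forecast/observation pairs;
    # catalog values are invented. Haversine distance on a spherical Earth.
    import math

    def epicentral_km(lat1, lon1, lat2, lon2):
        r = 6371.0                                   # mean Earth radius, km
        p1, p2 = math.radians(lat1), math.radians(lat2)
        dp, dl = math.radians(lat2 - lat1), math.radians(lon2 - lon1)
        a = (math.sin(dp / 2) ** 2
             + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
        return 2 * r * math.asin(math.sqrt(a))

    # (forecast_mag, f_lat, f_lon, observed_mag, o_lat, o_lon)
    pairs = [(5.1, 34.05, -118.25, 5.4, 34.10, -118.30),
             (4.2, 37.77, -122.42, 4.1, 37.80, -122.40)]
    for fm, fla, flo, om, ola, olo in pairs:
        print(f"dM = {fm - om:+.1f}, "
              f"dR = {epicentral_km(fla, flo, ola, olo):.1f} km")
    ```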

  8. A Framework for the Evaluation of CASE Tool Learnability in Educational Environments

    ERIC Educational Resources Information Center

    Senapathi, Mali

    2005-01-01

    The aim of the research is to derive a framework for the evaluation of Computer Aided Software Engineering (CASE) tool learnability in educational environments. Drawing from the literature of Human Computer Interaction and educational research, a framework for evaluating CASE tool learnability in educational environments is derived. The two main…

  9. Development and Evaluation of Computer-Based Laboratory Practical Learning Tool

    ERIC Educational Resources Information Center

    Gandole, Y. B.

    2006-01-01

    Effective evaluation of educational software is a key issue for successful introduction of advanced tools into the curriculum. This paper details the development and evaluation of a tool for computer-assisted learning in science laboratory courses. The process was based on the generic instructional system design model. Various categories of educational…

  10. Evaluating modeling tools for the EDOS

    NASA Technical Reports Server (NTRS)

    Knoble, Gordon; Mccaleb, Frederick; Aslam, Tanweer; Nester, Paul

    1994-01-01

    The Earth Observing System (EOS) Data and Operations System (EDOS) Project is developing a functional, system performance model to support the system implementation phase of the EDOS, which is being designed and built by the Goddard Space Flight Center (GSFC). The EDOS Project will use modeling to meet two key objectives: (1) manage system design impacts introduced by unplanned changes in mission requirements; and (2) evaluate evolutionary technology insertions throughout the development of the EDOS. To select a suitable modeling tool, the EDOS modeling team developed an approach for evaluating modeling tools and languages by deriving evaluation criteria from both the EDOS modeling requirements and the development plan. Essential and optional features for an appropriate modeling tool were identified and compared with known capabilities of several modeling tools. Vendors were also provided the opportunity to model a representative EDOS processing function to demonstrate the applicability of their modeling tool to the EDOS modeling requirements. This paper emphasizes the importance of using a well-defined approach for evaluating tools to model complex systems like the EDOS. The results of this evaluation study do not in any way signify the superiority of any one modeling tool, since the results will vary with the specific modeling requirements of each project.

  11. An internet-based software tool for submitting crime information to forensic laboratories

    NASA Astrophysics Data System (ADS)

    Ahluwalia, Rashpal S.; Govindarajulu, Sriram

    2004-11-01

    This paper describes an internet-based software tool developed for the West Virginia State Police Forensics Laboratory. The software enables law enforcement agents to submit crime information to the Forensic Laboratory via a secure Internet connection. Online electronic forms were created to mirror the existing paper-based forms, making the transition easier. The process of submitting case information was standardized and streamlined, thereby minimizing information inconsistency. The crime information, once gathered, is automatically stored in a database and can be viewed and queried by authorized law enforcement officers. The software tool will be deployed in all counties of WV.

  12. The Web Interface Template System (WITS), a software developer`s tool

    SciTech Connect

    Lauer, L.J.; Lynam, M.; Muniz, T.

    1995-11-01

    The Web Interface Template System (WITS) is a tool for software developers. WITS is a three-tiered, object-oriented system operating in a Client/Server environment. This tool can be used to create software applications that have a Web browser as the user interface and access a Sybase database. Development, modification, and implementation are greatly simplified because the developer can change and test definitions immediately, without writing or compiling any code. This document explains WITS functionality, the system structure and components of WITS, and how to obtain, install, and use the software system.

  13. JULIDE: a software tool for 3D reconstruction and statistical analysis of autoradiographic mouse brain sections.

    PubMed

    Ribes, Delphine; Parafita, Julia; Charrier, Rémi; Magara, Fulvio; Magistretti, Pierre J; Thiran, Jean-Philippe

    2010-01-01

    In this article we introduce JULIDE, a software toolkit developed to perform the 3D reconstruction, intensity normalization, volume standardization by 3D image registration and voxel-wise statistical analysis of autoradiographs of mouse brain sections. This software tool has been developed in the open-source ITK software framework and is freely available under a GPL license. The article presents the complete image processing chain from raw data acquisition to 3D statistical group analysis. Results of the group comparison in the context of a study on spatial learning are shown as an illustration of the data that can be obtained with this tool. PMID:21124830
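
    The voxel-wise statistical group analysis step can be illustrated generically with a two-sample t-test at every voxel of small synthetic volumes (no multiple-comparison correction, and not JULIDE's ITK implementation):

    ```python
    # Voxel-wise two-sample t-test between groups of small synthetic volumes.
    import numpy as np
    from scipy.stats import ttest_ind

    rng = np.random.default_rng(3)
    group_a = rng.normal(1.0, 0.1, size=(6, 8, 8, 8))  # 6 control animals
    group_b = rng.normal(1.0, 0.1, size=(6, 8, 8, 8))  # 6 treated animals
    group_b[:, 4, 4, 4] += 0.5                         # implanted group effect

    t, p = ttest_ind(group_a, group_b, axis=0)         # test at every voxel
    print("voxels with p < 0.001:\n", np.argwhere(p < 0.001))
    ```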

  14. RadicalLocator: A software tool for identifying the radicals in Chinese characters.

    PubMed

    Yu, Lili; Reichle, Erik D; Jones, Mathew; Liversedge, Simon P

    2015-09-01

    This article describes a new software tool called RadicalLocator that can be used to automatically identify (e.g., for visual inspection) individual target radicals (i.e., groups of strokes) in written Chinese characters. We first briefly clarify why this software is useful for research purposes and discuss the factors that make this pattern recognition task so difficult. We then describe how the software can be downloaded and installed, and used to identify the radicals in characters for the purposes of, for example, selecting materials for psycholinguistic experiments. Finally, we discuss several known limitations of the software and heuristics for addressing them. PMID:25169830

  15. Collaborative Software Development in Support of Fast Adaptive AeroSpace Tools (FAAST)

    NASA Technical Reports Server (NTRS)

    Kleb, William L.; Nielsen, Eric J.; Gnoffo, Peter A.; Park, Michael A.; Wood, William A.

    2003-01-01

    A collaborative software development approach is described. The software product is an adaptation of proven computational capabilities combined with new capabilities to form the Agency's next generation aerothermodynamic and aerodynamic analysis and design tools. To efficiently produce a cohesive, robust, and extensible software suite, the approach uses agile software development techniques; specifically, project retrospectives, the Scrum status meeting format, and a subset of Extreme Programming's coding practices are employed. Examples are provided which demonstrate the substantial benefits derived from employing these practices. Also included is a discussion of issues encountered when porting legacy Fortran 77 code to Fortran 95 and a Fortran 95 coding standard.

  16. A Tool to Enhance Cooperation and Knowledge Transfer among Software Developers

    NASA Astrophysics Data System (ADS)

    Aydin, Seçil; Mishra, Deepti

    Software developers have been successfully tailoring software development methods according to the project situation, particularly in small-scale software development organizations. There is a need to share this knowledge with other developers who may be facing the same project situation so that they can benefit from other people's experiences. In this paper, an approach to enhance cooperation among software developers, in terms of sharing the knowledge that was used successfully in past projects, is proposed. A web-based tool is developed that can assist in the creation, storage and extraction of methods related to the requirement elicitation phase. These methods are categorized according to certain criteria, which helps in searching for the method that will be most appropriate in a given project situation. This approach and tool can also be used for other software development activities.

  17. Arc Flash Boundary Calculations Using Computer Software Tools

    SciTech Connect

    Gibbs, M.D.

    2005-01-07

    Arc flash protection boundary calculations have become easier to perform with the availability of personal computer software. These programs incorporate arc flash protection boundary formulas for different voltage and current levels, calculate the bolted fault current at each bus, and use built-in time-current coordination curves to determine the clearing time of protective devices in the system. Results of the arc flash protection boundary calculations can be presented in several different forms--as an annotation to the one-line diagram, as a table of arc flash protection boundary distances, and as printed placards to be attached to the appropriate equipment. Basic arc flash protection boundary principles are presented in this paper along with several helpful suggestions for performing arc flash protection boundary calculations.
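
    For context, one published flash-boundary formulation is the Lee method from IEEE 1584, reproduced here from memory as an assumption; the commercial packages described above automate calculations of this kind, and any real study should verify against the current edition of the standard.

    ```python
    # Lee-method flash-protection boundary (formula cited from IEEE 1584 from
    # memory -- treat as an assumption and verify before any real-world use).
    import math

    def lee_boundary_mm(v_kv, i_bf_ka, t_s, e_b_j_cm2=5.0):
        """Distance (mm) at which incident energy drops to e_b_j_cm2
        (5.0 J/cm^2 ~ 1.2 cal/cm^2, the curable-burn threshold)."""
        return math.sqrt(2.142e6 * v_kv * i_bf_ka * (t_s / e_b_j_cm2))

    # Example: 0.48 kV bus, 30 kA bolted fault, 0.2 s clearing time (invented).
    d_mm = lee_boundary_mm(0.48, 30.0, 0.2)
    print(f"arc flash boundary ~ {d_mm / 304.8:.1f} ft")
    ```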

  18. Critical Evaluations and Instructional Potential of Authoring and Titled Program Software.

    ERIC Educational Resources Information Center

    Reppert, James E.

    This paper describes and evaluates the instructional uses of the Claris Works and Ultimedia Tools Series authoring programs and the following titled software programs: CNN Time Capsule: 100 Defining Moments of 1993; Windows Magazine: 1994; Ultimate Digital Studio; Data Trek Manager Series; and Cinemania '95. (AEF)

  19. Tool School. Review Software for Basic CHOICE. CHOICE (Challenging Options in Career Education).

    ERIC Educational Resources Information Center

    Pitts, Ilse M.; And Others

    CHOICE Tool School is an Apple computer software program designed to reinforce job and role information presented to primary-aged migrant students in the Basic Job and Role activity folders and workbooks. Learners must decide if randomly displayed tools are or are not used by the worker selected for the game theme. Learners may choose the level of…

  20. Thermonuclear Reaction Rate Libraries and Software Tools for Nuclear Astrophysics Research

    NASA Astrophysics Data System (ADS)

    Smith, Michael S.; Cyburt, Richard; Schatz, Hendrik; Wiescher, Michael; Smith, Karl; Warren, Scott; Ferguson, Ryan; Lingerfelt, Eric; Buckner, Kim; Nesaraja, Caroline D.

    2008-05-01

    Thermonuclear reaction rates are a crucial input for simulating a wide variety of astrophysical environments. A new collaboration has been formed to ensure that astrophysical modelers have access to reaction rates based on the most recent experimental and theoretical nuclear physics information. To reach this goal, a new version of the REACLIB library has been created by the Joint Institute for Nuclear Astrophysics (JINA), now available online at http://www.nscl.msu.edu/~nero/db. A complementary effort is the development of software tools in the Computational Infrastructure for Nuclear Astrophysics, online at nucastrodata.org, to streamline, manage, and access the workflow of the reaction evaluations from their initiation to peer review to incorporation into the library. Details of these new projects will be described.

  1. Development of tools for safety analysis of control software in advanced reactors

    SciTech Connect

    Guarro, S.; Yau, M.; Motamed, M.

    1996-04-01

    Software based control systems have gained a pervasive presence in a wide variety of applications, including nuclear power plant control and protection systems which are within the oversight and licensing responsibility of the US Nuclear Regulatory Commission. While the cost effectiveness and flexibility of software based plant process control is widely recognized, it is very difficult to achieve and prove high levels of demonstrated dependability and safety assurance for the functions performed by process control software, due to the very flexibility and potential complexity of the software itself. The development of tools to model, analyze and test software design and implementations in the context of the system that the software is designed to control can greatly assist the task of providing higher levels of assurance than those obtainable by software testing alone. This report presents and discusses the development of the Dynamic Flowgraph Methodology (DFM) and its application in the dependability and assurance analysis of software-based control systems. The features of the methodology and full-scale examples of application to both generic process and nuclear power plant control systems are presented and discussed in detail. The features of a workstation software tool developed to assist users in the application of DFM are also described.

  2. Computer-Based Tools for Evaluating Graphical User Interfaces

    NASA Technical Reports Server (NTRS)

    Moore, Loretta A.

    1997-01-01

    The user interface is the component of a software system that connects two very complex systems: humans and computers. Each of these two systems imposes certain requirements on the final product. The user is the judge of the usability and utility of the system; the computer software and hardware are the tools with which the interface is constructed. Mistakes are sometimes made in designing and developing user interfaces because the designers and developers have limited knowledge about human performance (e.g., problem solving, decision making, planning, and reasoning). Even those trained in user interface design make mistakes because they are unable to address all of the known requirements and constraints on design. Evaluation of the user interface is therefore a critical phase of the user interface development process. Evaluation should not be considered the final phase of design; it should be part of an iterative design cycle, with the output of evaluation being fed back into design. The goal of this research was to develop a set of computer-based tools for objectively evaluating graphical user interfaces. The research was organized into three phases. The first phase resulted in the development of an embedded evaluation tool which evaluates the usability of a graphical user interface based on a user's performance. An expert system to assist in the design and evaluation of user interfaces based upon rules and guidelines was developed during the second phase. During the final phase of the research, an automatic layout tool to be used in the initial design of graphical interfaces was developed. The research was coordinated with NASA Marshall Space Flight Center's Mission Operations Laboratory's efforts in developing onboard payload display specifications for the Space Station.

  3. Open software tools for eddy covariance flux partitioning

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Agro-ecosystem management and assessment will benefit greatly from the development of reliable techniques for partitioning evapotranspiration (ET) into evaporation (E) and transpiration (T). Among other activities, flux partitioning can aid in evaluating consumptive vs. non-consumptive agricultural...

  4. Lessons learned applying CASE methods/tools to Ada software development projects

    NASA Technical Reports Server (NTRS)

    Blumberg, Maurice H.; Randall, Richard L.

    1993-01-01

    This paper describes the lessons learned from introducing CASE methods/tools into organizations and applying them to actual Ada software development projects. This paper will be useful to any organization planning to introduce a software engineering environment (SEE) or evolving an existing one. It contains management level lessons learned, as well as lessons learned in using specific SEE tools/methods. The experiences presented are from Alpha Test projects established under the STARS (Software Technology for Adaptable and Reliable Systems) project. They reflect the front end efforts by those projects to understand the tools/methods, initial experiences in their introduction and use, and later experiences in the use of specific tools/methods and the introduction of new ones.

  5. PCS Security Technology Evaluation Tool

    Energy Science and Technology Software Center (ESTSC)

    2007-01-30

    P-STET assists in the security technology decision-making process from a cost/benefit perspective. It aids in addressing such questions as whether to acquire and deploy new security technology, to re-configure an existing product or system, or to maintain the status quo. P-STET offers both a qualitative and a quantitative option. P-STET is most efficient when tailored to an organization's security cost/benefit environment. It then serves both as a guide to show what types of security questions should be addressed and as a means to analyze the data gathered from those questions to make an informed decision. The quantitative option provides a straightforward way to express costs/benefits in terms of dollars. It relies on the organization to quantify benefits or cost avoidances and, therefore, best serves as a guide to ensure various cost and benefit angles are evaluated. The qualitative option allows the organization to assess costs by levels with respect to security and PCS budgets, operational impacts, and opportunity costs. Benefits are represented in terms of improvements to the organization's operations and are also assessed by levels with respect to some benchmark, such as compliance with best practices. Results are displayed graphically using radar charts, allowing the user to make a more intuitive decision. The shaded area of each chart represents the overall cost and benefit of the security investment. A good investment is indicated when the ratio of benefit shaded area to cost shaded area is large.

  6. A software tool for tomographic axial superresolution in STED microscopy.

    PubMed

    Koho, S; Deguchi, T; Hänninen, P E

    2015-11-01

    A method for generating three-dimensional tomograms from multiple three-dimensional axial projections in STimulated Emission Depletion (STED) superresolution microscopy is introduced. Our STED< method, based on the use of a micromirror placed on top of a standard microscopic sample, is used to record a three-dimensional projection at an oblique angle in relation to the main optical axis. Combining the STED< projection with the regular STED image into a single view by tomographic reconstruction is shown to result in a tomogram with three-to-four-fold improved apparent axial resolution. Registration of the different projections is based on the use of a mutual-information histogram similarity metric. Fusion of the projections into a single view is based on the Richardson-Lucy iterative deconvolution algorithm, modified to work with multiple projections. Our tomographic reconstruction method is demonstrated to work with real biological STED superresolution images, including a data set with a limited signal-to-noise ratio (SNR); the reconstruction software (SuperTomo) and its source code will be released under the BSD open-source license. PMID:26258639
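
    The mutual-information similarity metric mentioned for registration can be computed from a joint intensity histogram; a generic NumPy sketch (not SuperTomo's implementation) is shown below. A registration routine would maximize this value over candidate transforms of one projection.

    ```python
    # Mutual information between two images from their joint histogram.
    import numpy as np

    def mutual_information(a, b, bins=32):
        joint, _, _ = np.histogram2d(a.ravel(), b.ravel(), bins=bins)
        pxy = joint / joint.sum()
        px = pxy.sum(axis=1, keepdims=True)
        py = pxy.sum(axis=0, keepdims=True)
        nz = pxy > 0                      # skip log(0) terms
        return float((pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])).sum())

    rng = np.random.default_rng(2)
    img = rng.random((64, 64))
    print("MI(img, img)   =", round(mutual_information(img, img), 3))  # high
    print("MI(img, noise) =",
          round(mutual_information(img, rng.random((64, 64))), 3))     # near 0
    ```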

  7. Effectiveness of Crown Preparation Assessment Software As an Educational Tool in Simulation Clinic: A Pilot Study.

    PubMed

    Tiu, Janine; Cheng, Enxin; Hung, Tzu-Chiao; Yu, Chuan-Chia; Lin, Tony; Schwass, Don; Al-Amleh, Basil

    2016-08-01

    The aim of this pilot study was to evaluate the feasibility of a new tooth preparation assessment software, Preppr, as an educational tool for dental students in achieving optimal parameters for a crown preparation. In February 2015, 30 dental students in their fourth year in a five-year undergraduate dental curriculum in New Zealand were randomly selected from a pool of volunteers (N=40) out of the total class of 85. The participants were placed into one of three groups of ten students each: Group A, the control group, received only written and pictorial instructions; Group B received tutor evaluation and feedback; and Group C performed self-directed learning with the aid of Preppr. Each student was asked to prepare an all-ceramic crown on the lower first molar typodont within three hours and to repeat the exercise three times over the next four weeks. The exercise stipulated a 1 mm finish line dimension and total convergence angles (TOC) between 10 and 20 degrees. Fulfillment of these parameters was taken as an acceptable preparation. The results showed that Group C had the highest percentage of students who achieved minimum finish line dimensions and acceptable TOC angles. Those students also achieved the stipulated requirements earlier than the other groups. This study's findings provide promising data on the feasibility of using Preppr as a self-directed educational tool for students training to prepare dental crowns. PMID:27480712

  8. Users' manual for the Hydroecological Integrity Assessment Process software (including the New Jersey Assessment Tools)

    USGS Publications Warehouse

    Henriksen, James A.; Heasley, John; Kennen, Jonathan G.; Nieswand, Steven

    2006-01-01

    Applying the Hydroecological Integrity Assessment Process involves four steps: (1) a hydrologic classification of relatively unmodified streams in a geographic area using long-term gage records and 171 ecologically relevant indices; (2) the identification of statistically significant, nonredundant, hydroecologically relevant indices associated with the five major flow components for each stream class; (3) the development of a stream-classification tool; and (4) the development of a hydrologic assessment tool. Four computer software tools have been developed.

  9. Software tools for developing parallel applications. Part 1: Code development and debugging

    SciTech Connect

    Brown, J.; Geist, A.; Pancake, C.; Rover, D.

    1997-04-01

    Developing an application for parallel computers can be a lengthy and frustrating process making it a perfect candidate for software tool support. Yet application programmers are often the last to hear about new tools emerging from R and D efforts. This paper provides an overview of two focuses of tool support: code development and debugging. Each is discussed in terms of the programmer needs addressed, the extent to which representative current tools meet those needs, and what new levels of tool support are important if parallel computing is to become more widespread.

  10. A diagnostic tool for malaria based on computer software

    PubMed Central

    Kotepui, Manas; Uthaisar, Kwuntida; Phunphuech, Bhukdee; Phiwklam, Nuoil

    2015-01-01

    Nowadays, the gold standard method for malaria diagnosis is the staining of thick and thin blood films examined by expert laboratorists. It requires well-trained laboratorists, is time consuming, and is not an automated protocol. For this study, Maladiag Software was developed to predict malaria infection in suspected malaria patients. The demographic data of patients, examination for malaria parasites, and complete blood count (CBC) profiles were analyzed. Binary logistic regression was used to create the equation for malaria diagnosis. The diagnostic parameters of the equation were tested on 4,985 samples (703 infected and 4,282 control samples). The equation showed 81.2% sensitivity and 80.3% specificity for predicting malaria infection. The positive likelihood and negative likelihood ratios were 4.12 (95% CI = 4.01–4.23) and 0.23 (95% CI = 0.22–0.25), respectively. The equation was also associated with a significant odds ratio (P value < 0.0001, OR = 17.6, 95% CI = 16.0–19.3). The equation can predict malaria infection after adjusting for age, gender, nationality, monocyte (%), platelet count, neutrophil (%), lymphocyte (%), and the RBC count of patients. The diagnostic accuracy was 0.877 (area under the curve, AUC) (95% CI = 0.871–0.883). The system, when used in combination with other clinical and microscopy methods, might improve malaria diagnoses and enhance prompt treatment. PMID:26559606
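
    A schematic version of the approach, a logistic-regression classifier over CBC-style features evaluated by sensitivity, specificity, and AUC, is sketched below on synthetic data; the features and coefficients are invented and do not correspond to the paper's fitted equation.

    ```python
    # Logistic-regression diagnostic on synthetic CBC-like data, reporting
    # sensitivity, specificity, and AUC; all values are invented.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score

    rng = np.random.default_rng(4)
    n = 1000
    infected = rng.integers(0, 2, n)
    platelets = rng.normal(250 - 80 * infected, 40, n)     # invented shift
    monocyte_pct = rng.normal(6 + 2 * infected, 1.5, n)    # invented shift
    X = np.column_stack([platelets, monocyte_pct])

    model = LogisticRegression(max_iter=1000).fit(X, infected)
    pred = model.predict(X)
    tp = ((pred == 1) & (infected == 1)).sum()
    fn = ((pred == 0) & (infected == 1)).sum()
    tn = ((pred == 0) & (infected == 0)).sum()
    fp = ((pred == 1) & (infected == 0)).sum()
    auc = roc_auc_score(infected, model.predict_proba(X)[:, 1])
    print(f"sensitivity={tp / (tp + fn):.2f} "
          f"specificity={tn / (tn + fp):.2f} AUC={auc:.2f}")
    ```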

  11. Development of evaluation method for software hazard identification techniques

    SciTech Connect

    Huang, H. W.; Chen, M. H.; Shih, C.; Yih, S.; Kuo, C. T.; Wang, L. H.; Yu, Y. C.; Chen, C. W.

    2006-07-01

    This research evaluated currently applicable software hazard identification techniques, such as Preliminary Hazard Analysis (PHA), Failure Modes and Effects Analysis (FMEA), Fault Tree Analysis (FTA), Markov chain modeling, Dynamic Flowgraph Methodology (DFM), and simulation-based model analysis, and then determined evaluation indexes in view of their characteristics, which include dynamic capability, completeness, achievability, detail, signal/noise ratio, complexity, and implementation cost. With this proposed method, analysts can evaluate various software hazard identification combinations for a specific purpose. According to the case study results, the traditional PHA + FMEA + FTA (with failure rate) + Markov chain modeling (with transfer rate) combination is not competitive because of the dilemma of obtaining acceptable software failure rates. However, the systematic architecture of FTA and Markov chain modeling is still valuable for understanding the software fault structure. The system-centric techniques, such as DFM and simulation-based model analysis, show advantages in dynamic capability, achievability, detail, and signal/noise ratio. However, their disadvantages lie in completeness, complexity, and implementation cost. This evaluation method can be a platform for reaching common consensus among the stakeholders. As software hazard identification techniques evolve, the evaluation results could change. However, the insight into software hazard identification techniques is much more important than the numbers obtained by the evaluation. (authors)

  12. Computer software evaluation methodology and data base management system selection

    SciTech Connect

    Huntley, A.F.

    1986-04-01

    This document presents a Data Base Management System (DBMS) evaluation methodology that has been developed under the sponsorship of the Department of the Navy, Naval Management Systems Support Office (NAVMASSO), Norfolk, Virginia. NAVMASSO has recognized the need for a DBMS to support the Shipboard Nontactical Automated Data Processing Program (SNAP) and has tasked Oak Ridge National Laboratory (ORNL) with evaluating DBMSs that are available for the SNAP-I computer system - a Honeywell DPS-6 minicomputer - and the SNAP-II computer system - a Harris 300 minicomputer. In preparation for the SNAP-I/SNAP-II DBMS evaluation, ORNL has developed the DBMS evaluation methodology presented in this document. First, a discussion of the traditional computer software evaluation methodology is provided, with identification of aspects of the methodology that may cause the resulting evaluation to be deficient. A DBMS evaluation methodology that stresses the layered functionality of the software is then presented. The methodology requires a large amount of hands-on testing and allows evaluation team members to evaluate the software from the perspective of application developers and end-users who will use the system on a day-to-day basis. The document contains a discussion of several general considerations that must be evaluated. These are items that form a supportive environment and enhance the usability of the software, even though they may not affect the intrinsic functionality of the software. The technical facilities that define the limits of functionality of the software are then presented for evaluation. Areas where these facilities may not meet the desired functionality are identified. 14 refs.

  13. [Construction and evaluation of educational software on urinary indwelling catheters].

    PubMed

    Lopes, Ana Carolina Cristino; de Andrade Ferreira, Andréia; Fernandes, Jussara Alaíde Leite; da Silva Morita, Ana Beatriz Pinto; de Brito Poveda, Vanessa; de Souza, Adriano José Sorbile

    2011-03-01

    In an era in which information is openly shared because of the benefits it brings, the field of nursing informatics is coming into its own. The objective of this study was to design educational software for teaching and learning the technique of indwelling urinary catheterization and to compare the acquisition of knowledge regarding the technique before and after use of the educational software. This is a descriptive study using a quantitative approach. The pedagogical foundations for designing the software were the theories of Piaget and Vygotsky. The teaching-learning process was evaluated through a questionnaire consisting of 10 multiple-choice questions, which the 60 participants completed before and after using the software. The results showed that the software made significant contributions to learning, proving very useful in the teaching-learning process. PMID:21445511

  14. PAW, a general-purpose portable software tool for data analysis and presentation

    NASA Astrophysics Data System (ADS)

    Brun, René; Couet, Olivier; Vandoni, Carlo E.; Zanarini, Pietro

    1989-12-01

    During the last twenty years, CERN has played a leading role as the focus for development of packages and software libraries to solve problems related to high energy physics (HEP). The results of the integration of resources from many different laboratories can be expressed in several million lines of code written at CERN during this period of time, used at CERN and distributed to collaborating laboratories. Nowadays, this role of software developer is considered very important by the entire HEP community. In this paper a large software package, where man-machine interaction and graphics play a key role (PAW - Physics Analysis Workstation), is described. PAW is essentially an interactive system which includes many different software tools, strongly oriented towards data analysis and data presentation. Some of these tools have been available in different forms and with different human interfaces for several years.

  15. Evaluation of commercially available lighting design software

    SciTech Connect

    McConnell, D.G.

    1990-09-01

    This report addresses the need for commercially available lighting design computer programs and evaluates several of these programs. Sandia National Laboratories uses these programs to provide lighting designs for exterior closed-circuit television camera intrusion detection assessment for high-security perimeters.

  16. On the evaluation of segmentation editing tools

    PubMed Central

    Heckel, Frank; Moltz, Jan H.; Meine, Hans; Geisler, Benjamin; Kießling, Andreas; D’Anastasi, Melvin; dos Santos, Daniel Pinto; Theruvath, Ashok Joseph; Hahn, Horst K.

    2014-01-01

    Abstract. Efficient segmentation editing tools are important components in the segmentation process, as no automatic methods exist that always generate sufficient results. Evaluating segmentation editing algorithms is challenging, because their quality depends on the user’s subjective impression. So far, no established methods for an objective, comprehensive evaluation of such tools exist and, particularly, intermediate segmentation results are not taken into account. We discuss the evaluation of editing algorithms in the context of tumor segmentation in computed tomography. We propose a rating scheme to qualitatively measure the accuracy and efficiency of editing tools in user studies. In order to objectively summarize the overall quality, we propose two scores based on the subjective rating and the quantified segmentation quality over time. Finally, a simulation-based evaluation approach is discussed, which allows a more reproducible evaluation without the need for human input. This automated evaluation complements user studies, allowing a more convincing evaluation, particularly during development, where frequent user studies are not possible. The proposed methods have been used to evaluate two dedicated editing algorithms on 131 representative tumor segmentations. We show how the comparison of editing algorithms benefits from the proposed methods. Our results also show the correlation of the suggested quality score with the qualitative ratings. PMID:26158063
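
    One way to realize the proposed quality-over-time score is to track an overlap metric after each editing interaction and reduce the curve to a single number. The sketch below is an assumption, using the Dice coefficient and the normalized area under the Dice-vs-step curve, and is not necessarily the paper's exact formula.

```python
# A minimal sketch of summarizing segmentation quality over time: track an
# overlap metric (here Dice) after each editing interaction and reduce the
# curve to the normalized area under it. Masks here are synthetic toys.
import numpy as np

def dice(a, b):
    """Dice coefficient of two boolean masks."""
    inter = np.logical_and(a, b).sum()
    return 2.0 * inter / (a.sum() + b.sum())

def quality_over_time_score(masks, reference):
    """Normalized area under the Dice-vs-interaction-step curve, in [0, 1]."""
    d = np.array([dice(m, reference) for m in masks])
    t = np.linspace(0.0, 1.0, len(d))
    return np.trapz(d, t)

# Toy example: three editing steps converging toward the reference mask.
ref = np.zeros((32, 32), bool)
ref[8:24, 8:24] = True
steps = []
for grow in (4, 2, 0):  # each edit removes some oversegmentation
    m = np.zeros((32, 32), bool)
    m[8 - grow:24 + grow, 8 - grow:24 + grow] = True
    steps.append(m)
print(f"score = {quality_over_time_score(steps, ref):.3f}")
```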

  17. Software Mapping Assessment Tool Documenting Behavioral Content in Computer Interaction: Examples of Mapped Problems with "Kid Pix" Program

    ERIC Educational Resources Information Center

    Bayram, Servet

    2005-01-01

    The purpose of software mapping is to delineate a method for software menu, tool, and palette use in the construction of elementary school science and mathematics curriculum activities. With this method, software "maps" were created for traversing science and math curriculum problems and activities using software. The other purpose of…

  18. Leveraging Existing Mission Tools in a Re-Usable, Component-Based Software Environment

    NASA Technical Reports Server (NTRS)

    Greene, Kevin; Grenander, Sven; Kurien, James; O'Reilly, Taifun

    2006-01-01

    Emerging methods in component-based software development offer significant advantages but may seem incompatible with existing mission operations applications. In this paper we relate our positive experiences integrating existing mission applications into component-based tools we are delivering to three missions. In most operations environments, a number of software applications have been integrated together to form the mission operations software. In contrast, with component-based software development, chunks of related functionality and data structures, referred to as components, can be individually delivered, integrated and re-used. With the advent of powerful tools for managing component-based development, complex software systems can potentially see significant benefits in ease of integration, testability and reusability from these techniques. These benefits motivate us to ask how component-based development techniques can be relevant in a mission operations environment, where there is significant investment in software tools that are not component-based and may not be written in languages for which component-based tools even exist. Trusted and complex software tools for sequencing, validation, navigation, and other vital functions cannot simply be re-written or abandoned in order to gain the advantages offered by emerging component-based software techniques. Thus some middle ground must be found. We have faced exactly this issue, and have found several solutions. Ensemble is an open platform for development, integration, and deployment of mission operations software that we are developing. Ensemble itself is an extension of an open source, component-based software development platform called Eclipse. Due to the advantages of component-based development, we have been able to very rapidly develop mission operations tools for three surface missions by mixing and matching from a common set of mission operation components. We have also had to determine how to…

  19. Apache Open Climate Workbench: Building Open Source Climate Science Tools and Community at the Apache Software Foundation

    NASA Astrophysics Data System (ADS)

    Joyce, M.; Ramirez, P.; Boustani, M.; Mattmann, C. A.; Khudikyan, S.; McGibbney, L. J.; Whitehall, K. D.

    2014-12-01

    Apache Open Climate Workbench (OCW; https://climate.apache.org/) is a Top-Level Project at the Apache Software Foundation that aims to provide a suite of tools for performing climate science evaluations using model outputs from a multitude of different sources (ESGF, CORDEX, U.S. NCA, NARCCAP) with remote sensing data from NASA, NOAA, and other agencies. Apache OCW is the second NASA project to become a Top-Level Project at the Apache Software Foundation. It grew out of the Jet Propulsion Laboratory's (JPL) Regional Climate Model Evaluation System (RCMES) project, a collaboration between JPL and the University of California, Los Angeles' Joint Institute for Regional Earth System Science and Engineering (JIFRESSE). Apache OCW provides scientists and developers with tools for data manipulation, metrics for dataset comparisons, and a visualization suite. In addition to a powerful low-level API, Apache OCW also supports a web application for quick, browser-controlled evaluations, a command line application for local evaluations, and a virtual machine for isolated experimentation with minimal setup. This talk will look at the difficulties and successes of moving a closed community research project out into the wild world of open source. We'll explore the growing pains Apache OCW went through to become a Top-Level Project at the Apache Software Foundation as well as the benefits gained by opening up development to the broader climate and computer science communities.

  20. scoringRules - A software package for probabilistic model evaluation

    NASA Astrophysics Data System (ADS)

    Lerch, Sebastian; Jordan, Alexander; Krüger, Fabian

    2016-04-01

    Models in the geosciences are generally surrounded by uncertainty, and being able to quantify this uncertainty is key to good decision making. Accordingly, probabilistic forecasts in the form of predictive distributions have become popular over the last decades. With the proliferation of probabilistic models arises the need for decision-theoretically principled tools to evaluate the appropriateness of models and forecasts in a generalized way. Various scoring rules have been developed over the past decades to address this demand. Proper scoring rules are functions S(F,y) which evaluate the accuracy of a forecast distribution F, given that an outcome y was observed. As such, they make it possible to compare alternative models, a crucial ability given the variety of theories, data sources and statistical specifications that are available in many situations. This poster presents the software package scoringRules for the statistical programming language R, which contains functions to compute popular scoring rules, such as the continuous ranked probability score, for a variety of distributions F that come up in applied work. Two main classes are parametric distributions like normal, t, or gamma distributions, and distributions that are not known analytically but are indirectly described through a sample of simulation draws. For example, Bayesian forecasts produced via Markov chain Monte Carlo take this form. The scoringRules package thereby provides a framework for generalized model evaluation that covers Bayesian as well as classical parametric models. The scoringRules package aims to be a convenient dictionary-like reference for computing scoring rules. We offer state-of-the-art implementations of several known (but not routinely applied) formulas, and implement closed-form expressions that were previously unavailable. Whenever more than one implementation variant exists, we offer statistically principled default choices.
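
    The continuous ranked probability score mentioned above has a closed form for a normal predictive distribution, CRPS(N(mu, sigma^2), y) = sigma * [z(2*Phi(z) - 1) + 2*phi(z) - 1/sqrt(pi)] with z = (y - mu)/sigma, where Phi and phi are the standard normal CDF and PDF, and a sample-based estimator for simulation output. A Python transcription follows for illustration; the scoringRules package itself is written in R.

```python
# A hedged sketch of the two CRPS flavors the package covers: the closed
# form for a normal predictive distribution and the sample-based estimator
# used for simulation output (e.g., MCMC draws).
import numpy as np
from scipy.stats import norm

def crps_normal(mu, sigma, y):
    """CRPS of N(mu, sigma^2) at observation y (Gneiting & Raftery, 2007)."""
    z = (y - mu) / sigma
    return sigma * (z * (2.0 * norm.cdf(z) - 1.0)
                    + 2.0 * norm.pdf(z) - 1.0 / np.sqrt(np.pi))

def crps_sample(draws, y):
    """Sample-based CRPS estimate: E|X - y| - 0.5 E|X - X'|."""
    draws = np.asarray(draws, float)
    term1 = np.mean(np.abs(draws - y))
    term2 = 0.5 * np.mean(np.abs(draws[:, None] - draws[None, :]))
    return term1 - term2

rng = np.random.default_rng(1)
mu, sigma, y = 0.0, 1.0, 0.5
print(crps_normal(mu, sigma, y))                    # exact closed form
print(crps_sample(rng.normal(mu, sigma, 4000), y))  # Monte Carlo estimate
```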

  1. Characterizing Verification Tools Through Coding Error Candidates Reported in Space Flight Software

    NASA Astrophysics Data System (ADS)

    Prause, Christian R.; Gerlich, Ralf; Gerlich, Rainer; Fischer, Anton

    2015-09-01

    Mastering the continuously increasing amount of software requires identification of more efficient strategies for software verification. Currently, fault coverage is only indirectly addressed, e.g. by code coverage. The idea as presented in this paper is to get a better understanding of fault coverage by a systematic classification of software fault types, derivation of footprints of verification tools regarding coverage of such fault types, and recording of required effort. A number of issues regarding fault identification and classification are discussed in this context.

  2. Knowledge-engineering software. A demonstration of a high-end tool.

    PubMed

    Salzman, G C; Krall, R B; Marinuzzi, J G

    1988-06-01

    Many investigators wanting to apply knowledge-based systems (KBSs) as consultants for cancer diagnosis have turned to tools running on personal computers. While some of these tools serve well for small tasks, they lack the power available with such high-end KBS tools as KEE (Knowledge Engineering Environment) and ART (Automated Reasoning Tool). These tools were originally developed on Lisp machines and have the full functionality of the Lisp language as well as many additional features. They provide a rich and highly productive environment for the software developer. This paper illustrates the capability of one of these high-end tools. First, a table showing the classification of benign soft tissue tumors was converted into a KEE knowledge base. The tools available in KEE were then used to identify the tumor type for a hypothetical patient. PMID:3408548

  3. Proceedings of the Workshop on software tools for distributed intelligent control systems

    SciTech Connect

    Herget, C.J.

    1990-09-01

    The Workshop on Software Tools for Distributed Intelligent Control Systems was organized by Lawrence Livermore National Laboratory for the United States Army Headquarters Training and Doctrine Command and the Defense Advanced Research Projects Agency. The goals of the workshop were to identify the current state of the art in tools which support control systems engineering design and implementation, identify research issues associated with writing software tools which would provide a design environment to assist engineers in multidisciplinary control design and implementation, formulate a potential investment strategy to resolve the research issues and develop public domain code which can form the core of more powerful engineering design tools, and recommend test cases to focus the software development process and test associated performance metrics. Recognizing that the development of software tools for distributed intelligent control systems will require a multidisciplinary effort, experts in systems engineering, control systems engineering, and computer science were invited to participate in the workshop. In particular, experts who could address the following topics were selected: operating systems, engineering data representation and manipulation, emerging standards for manufacturing data, mathematical foundations, coupling of symbolic and numerical computation, user interface, system identification, system representation at different levels of abstraction, system specification, system design, verification and validation, automatic code generation, and integration of modular, reusable code.

  4. GenomeTools: a comprehensive software library for efficient processing of structured genome annotations.

    PubMed

    Gremme, Gordon; Steinbiss, Sascha; Kurtz, Stefan

    2013-01-01

    Genome annotations are often published as plain text files describing genomic features and their subcomponents by an implicit annotation graph. In this paper, we present the GenomeTools, a convenient and efficient software library and associated software tools for developing bioinformatics software intended to create, process or convert annotation graphs. The GenomeTools strictly follow the annotation graph approach, offering a unified graph-based representation. This gives the developer intuitive and immediate access to genomic features and tools for their manipulation. To process large annotation sets with low memory overhead, we have designed and implemented an efficient pull-based approach for sequential processing of annotations. This makes it possible to handle even the largest annotation sets, such as a complete catalogue of human variations. Our object-oriented C-based software library enables a developer to conveniently implement their own functionality on annotation graphs and to integrate it into larger workflows, simultaneously accessing compressed sequence data if required. The careful C implementation of the GenomeTools not only ensures a light-weight memory footprint while allowing full sequential as well as random access to the annotation graph, but also facilitates the creation of bindings to a variety of script programming languages (like Python and Ruby) sharing the same interface. PMID:24091398
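
    The pull-based processing model can be illustrated with a short sketch. The following is not the GenomeTools C API but a hypothetical Python generator showing how lazy, one-feature-at-a-time parsing keeps memory use constant on arbitrarily large GFF3 files.

```python
# Hedged sketch of pull-based sequential annotation processing: features
# are parsed lazily, one at a time, so even very large GFF3 files are
# handled in constant memory. This illustrates the idea, not GenomeTools.
from typing import Iterator, NamedTuple

class Feature(NamedTuple):
    seqid: str
    ftype: str
    start: int
    end: int
    attributes: str

def pull_features(path: str) -> Iterator[Feature]:
    """Yield one GFF3 feature at a time instead of loading the whole file."""
    with open(path) as fh:
        for line in fh:
            if line.startswith("#") or not line.strip():
                continue
            cols = line.rstrip("\n").split("\t")
            yield Feature(cols[0], cols[2], int(cols[3]), int(cols[4]), cols[8])

# Downstream code consumes the stream lazily, e.g. counting genes
# ("annotation.gff3" is a hypothetical input file):
# n_genes = sum(1 for f in pull_features("annotation.gff3")
#               if f.ftype == "gene")
```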

  5. Assessment Tools for the Evaluation of Risk

    EPA Science Inventory

    ASTER (Assessment Tools for the Evaluation of Risk) was developed by the U.S. EPA Mid-Continent Ecology Division, Duluth, MN to assist regulators in performing ecological risk assessments. ASTER is an integration of the ECOTOXicology Database (ECOTOX; …

  Techniques and software tools for estimating ultrasonic signal-to-noise ratios

    NASA Astrophysics Data System (ADS)

    Chiou, Chien-Ping; Margetan, Frank J.; McKillip, Matthew; Engle, Brady J.; Roberts, Ronald A.

    2016-02-01

    At Iowa State University's Center for Nondestructive Evaluation (ISU CNDE), the use of models to simulate ultrasonic inspections has played a key role in R&D efforts for over 30 years. To this end a series of wave propagation models, flaw response models, and microstructural backscatter models have been developed to address inspection problems of interest. One use of the combined models is the estimation of signal-to-noise ratios (S/N) in circumstances where backscatter from the microstructure (grain noise) acts to mask sonic echoes from internal defects. Such S/N models have been used in the past to address questions of inspection optimization and reliability. Under the sponsorship of the National Science Foundation's Industry/University Cooperative Research Center at ISU, an effort was recently initiated to improve existing research-grade software by adding a graphical user interface (GUI), turning it into a user-friendly tool for the rapid estimation of S/N for ultrasonic inspections of metals. The software combines: (1) a Python-based GUI for specifying an inspection scenario and displaying results; and (2) a Fortran-based engine for computing defect signal and backscattered grain noise characteristics. The latter makes use of several models including: the Multi-Gaussian Beam Model for computing sonic fields radiated by commercial transducers; the Thompson-Gray Model for the response from an internal defect; the Independent Scatterer Model for backscattered grain noise; and the Stanke-Kino Unified Model for attenuation. The initial emphasis was on reformulating the research-grade code into a suitable modular form, adding the graphical user interface, and performing computations rapidly and robustly. Thus the initial inspection problem being addressed is relatively simple: a normal-incidence pulse/echo immersion inspection is simulated for a curved metal component having a non-uniform microstructure, specifically an equiaxed, untextured microstructure in which the average…
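
    The basic S/N figure such models estimate can be illustrated directly. The sketch below uses a synthetic waveform and assumed gate positions; it computes the peak defect signal relative to the RMS grain noise in a defect-free gate of an A-scan, one common convention among several.

```python
# Hedged sketch of a basic ultrasonic S/N estimate: peak signal amplitude
# in a defect gate over RMS noise in a defect-free gate, expressed in dB.
# The waveform and gate indices below are synthetic placeholders.
import numpy as np

def snr_db(ascan, signal_gate, noise_gate):
    """Peak-signal-to-RMS-noise ratio, in dB, over two index ranges."""
    sig = np.max(np.abs(ascan[slice(*signal_gate)]))
    noise_rms = np.sqrt(np.mean(ascan[slice(*noise_gate)] ** 2))
    return 20.0 * np.log10(sig / noise_rms)

rng = np.random.default_rng(2)
t = np.arange(2000)
ascan = 0.05 * rng.standard_normal(t.size)  # backscattered grain noise
ascan += 0.6 * np.exp(-((t - 1200) / 15.0) ** 2) * np.sin(0.8 * t)  # flaw echo
print(f"S/N = {snr_db(ascan, (1100, 1300), (200, 800)):.1f} dB")
```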

  6. Review of Ground Systems Development and Operations (GSDO) Tools for Verifying Command and Control Software

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.; Bonanne, Kevin H.; Favretto, Jeffrey A.; Jackson, Maddalena M.; Jones, Stephanie L.; Mackey, Ryan M.; Sarrel, Marc A.; Simpson, Kimberly A.

    2014-01-01

    The Exploration Systems Development (ESD) Standing Review Board (SRB) requested the NASA Engineering and Safety Center (NESC) conduct an independent review of the plan developed by Ground Systems Development and Operations (GSDO) for identifying models and emulators to create a tool(s) to verify their command and control software. The NESC was requested to identify any issues or weaknesses in the GSDO plan. This document contains the outcome of the NESC review.

  7. Performance Evaluation of 3d Modeling Software for Uav Photogrammetry

    NASA Astrophysics Data System (ADS)

    Yanagi, H.; Chikatsu, H.

    2016-06-01

    UAV (Unmanned Aerial Vehicle) photogrammetry, which combines UAVs with freely available Internet-based 3D modeling software, is widely used as a low-cost and user-friendly photogrammetry technique in fields such as remote sensing and the geosciences. In UAV photogrammetry, only the platform used in conventional aerial photogrammetry is changed; consequently, 3D modeling software contributes significantly to its expansion. However, the algorithms of the 3D modeling software are black boxes. As a result, only a few studies have been able to evaluate their accuracy using 3D coordinate check points. With this motivation, Smart3DCapture and Pix4Dmapper were downloaded from the Internet and the commercial software PhotoScan was also employed; investigations were performed in this paper using check points and images obtained from a UAV.
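
    The check-point evaluation described reduces to a per-axis RMSE between software-reported and surveyed coordinates. A minimal sketch follows, with invented placeholder coordinates.

```python
# Minimal sketch of check-point accuracy evaluation: compare the 3D
# coordinates each software package reports for signalized check points
# against surveyed reference coordinates via per-axis RMSE. The arrays
# below are invented placeholders, not data from the paper.
import numpy as np

def checkpoint_rmse(estimated, reference):
    """Per-axis (X, Y, Z) RMSE over n check points, inputs of shape (n, 3)."""
    diff = np.asarray(estimated, float) - np.asarray(reference, float)
    return np.sqrt(np.mean(diff ** 2, axis=0))

reference = np.array([[10.00, 20.00, 5.00],
                      [15.00, 22.00, 5.20],
                      [12.50, 18.00, 4.90]])
estimated = reference + np.array([[0.02, -0.01, 0.05],
                                  [-0.03, 0.02, -0.04],
                                  [0.01, 0.01, 0.06]])
rx, ry, rz = checkpoint_rmse(estimated, reference)
print(f"RMSE X={rx:.3f} m  Y={ry:.3f} m  Z={rz:.3f} m")
```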

  8. Performance evaluation of bound diamond ring tools

    SciTech Connect

    Piscotty, M.A.; Taylor, J.S.; Blaedel, K.L.

    1995-07-14

    LLNL is collaborating with the Center for Optics Manufacturing (COM) and the American Precision Optics Manufacturers Association (APOMA) to optimize bound diamond ring tools for the spherical generation of high quality optical surfaces. An important element of this work is establishing an experimentally-verified link between tooling properties and workpiece quality indicators such as roughness, subsurface damage and removal rate. In this paper, we report on a standardized methodology for assessing ring tool performance and its preliminary application to a set of commercially-available wheels. Our goals are to (1) assist optics manufacturers (users of the ring tools) in evaluating tools and in assessing their applicability for a given operation, and (2) provide performance feedback to wheel manufacturers to help optimize tooling for the optics industry. Our paper includes measurements of wheel performance for three 2-4 micron diamond bronze-bond wheels that were supplied by different manufacturers to nominally-identical specifications. Preliminary data suggest that the differences in performance among the wheels were small.

  9. Microsoft Producer: A Software Tool for Creating Multimedia PowerPoint[R] Presentations

    ERIC Educational Resources Information Center

    Leffingwell, Thad R.; Thomas, David G.; Elliott, William H.

    2007-01-01

    Microsoft[R] Producer[R] is a powerful yet user-friendly PowerPoint companion tool for creating on-demand multimedia presentations. Instructors can easily distribute these presentations via compact disc or streaming media over the Internet. We describe the features of the software, system requirements, and other required hardware. We also describe…

  10. Review of Software Tools for Design and Analysis of Large scale MRM Proteomic Datasets

    PubMed Central

    Colangelo, Christopher M.; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-01-01

    Selective or Multiple Reaction monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368

  11. A Guide to the Use of Tool Software for the Apple Computer.

    ERIC Educational Resources Information Center

    Collett, Charles R.; Goldberg, Fred S.

    Designed to give teachers and supervisors a working knowledge of various approaches to enhancing pupil learning through software application programs, this guide is presented in a hands-on fashion. It supports a dual purpose, i.e., it can serve as an individual tutorial or as a turnkey staff development tool. All program files referred to may be…

  12. Using a Self-Administered Visual Basic Software Tool To Teach Psychological Concepts.

    ERIC Educational Resources Information Center

    Strang, Harold R.; Sullivan, Amie K.; Schoeny, Zahrl G.

    2002-01-01

    Introduces LearningLinks, a Visual Basic software tool that allows teachers to create individualized learning modules that use constructivist and behavioral learning principles. Describes field testing of undergraduates at the University of Virginia that tested a module designed to improve understanding of the psychological concepts of…

  13. DairyGEM: A software tool for assessing emissions and mitigation strategies for dairy production systems

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Many gaseous compounds are emitted from dairy farms. Those of current interest include the toxic compounds of ammonia and hydrogen sulfide and the greenhouse gases of methane, nitrous oxide and carbon dioxide. A relatively easy to use software tool was developed that predicts these emissions through...

  14. Review of software tools for design and analysis of large scale MRM proteomic datasets.

    PubMed

    Colangelo, Christopher M; Chung, Lisa; Bruce, Can; Cheung, Kei-Hoi

    2013-06-15

    Selective or Multiple Reaction monitoring (SRM/MRM) is a liquid-chromatography (LC)/tandem-mass spectrometry (MS/MS) method that enables the quantitation of specific proteins in a sample by analyzing precursor ions and the fragment ions of their selected tryptic peptides. Instrumentation software has advanced to the point that thousands of transitions (pairs of primary and secondary m/z values) can be measured in a triple quadrupole instrument coupled to an LC, by a well-designed scheduling and selection of m/z windows. The design of a good MRM assay relies on the availability of peptide spectra from previous discovery-phase LC-MS/MS studies. The tedious aspect of manually developing and processing MRM assays involving thousands of transitions has spurred the development of software tools to automate this process. Software packages have been developed for project management, assay development, assay validation, data export, peak integration, quality assessment, and biostatistical analysis. No single tool provides a complete end-to-end solution; thus this article reviews the current state and discusses future directions of these software tools in order to enable researchers to combine these tools for a comprehensive targeted proteomics workflow. PMID:23702368

  15. mMass as a Software Tool for the Annotation of Cyclic Peptide Tandem Mass Spectra

    PubMed Central

    Niedermeyer, Timo H. J.; Strohalm, Martin

    2012-01-01

    Natural or synthetic cyclic peptides often possess pronounced bioactivity. Their mass spectrometric characterization is difficult due to the predominant occurrence of non-proteinogenic monomers and the complex fragmentation patterns observed. Even though several software tools for cyclic peptide tandem mass spectra annotation have been published, these tools are still unable to annotate a majority of the signals observed in experimentally obtained mass spectra. They are thus not suitable for extensive mass spectrometric characterization of these compounds. This lack of advanced and user-friendly software tools has motivated us to extend the fragmentation module of a freely available open-source software, mMass (http://www.mmass.org), to allow for cyclic peptide tandem mass spectra annotation and interpretation. The resulting software has been tested on several cyanobacterial and other naturally occurring peptides. It has been found to be superior to other currently available tools concerning both usability and annotation extensiveness. Thus it is highly useful for accelerating the structure confirmation and elucidation of cyclic as well as linear peptides and depsipeptides. PMID:23028676

  16. 76 FR 5832 - International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-02-02

    From the Federal Register Online via the Government Publishing Office: Department of Labor, Employment and Training Administration. International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA, San Jose, CA; Notice of Affirmative Determination Regarding Application for Reconsideration. By application dated…

  17. New Tools for New Literacies Research: An Exploration of Usability Testing Software

    ERIC Educational Resources Information Center

    Asselin, Marlene; Moayeri, Maryam

    2010-01-01

    Competency in the new literacies of the Internet is essential for participating in contemporary society. Researchers studying these new literacies are recognizing the limitations of traditional methodological tools and adapting new technologies and new media for use in research. This paper reports our exploration of usability testing software to…

  18. Wiki as a Corporate Learning Tool: Case Study for Software Development Company

    ERIC Educational Resources Information Center

    Milovanovic, Milos; Minovic, Miroslav; Stavljanin, Velimir; Savkovic, Marko; Starcevic, Dusan

    2012-01-01

    In our study, we attempted to further investigate how Web 2.0 technologies influence workplace learning. Our particular interest was on using Wiki as a tool for corporate exchange of knowledge with the focus on informal learning. In this study, we collaborated with a multinational software development company that uses Wiki as a corporate tool…

  19. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records.

    PubMed

    Kong, Xiaoquan; Huang, Minyi; Duan, Renyan

    2015-01-01

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL) that allows anyone to access and manipulate the source code. SDMdata is available online free of charge from . PMID:26030926
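
    The two steps SDMdata automates can be sketched against the public GBIF occurrence API. This is an illustrative approximation, not SDMdata's own code: the species name is an arbitrary example and the validity checks are simplified assumptions.

```python
# Hedged sketch: pull occurrence records from the public GBIF occurrence
# API and flag obviously bad coordinates. Field names follow the GBIF
# JSON response; the checks are simplified relative to a real cleaner.
import requests

def fetch_occurrences(species, limit=100):
    """Query the GBIF occurrence search endpoint for georeferenced records."""
    resp = requests.get(
        "https://api.gbif.org/v1/occurrence/search",
        params={"scientificName": species, "hasCoordinate": "true",
                "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    return resp.json()["results"]

def coordinate_ok(rec):
    """Reject missing, out-of-range, or (0, 0) coordinates."""
    lat, lon = rec.get("decimalLatitude"), rec.get("decimalLongitude")
    if lat is None or lon is None:
        return False
    return -90 <= lat <= 90 and -180 <= lon <= 180 and (lat, lon) != (0, 0)

records = fetch_occurrences("Andrias davidianus", limit=50)
clean = [r for r in records if coordinate_ok(r)]
print(f"kept {len(clean)} of {len(records)} records")
```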

  1. SDMdata: A Web-Based Software Tool for Collecting Species Occurrence Records

    PubMed Central

    Kong, Xiaoquan; Huang, Minyi; Duan, Renyan

    2015-01-01

    It is important to easily and efficiently obtain high quality species distribution data for predicting the potential distribution of species using species distribution models (SDMs). There is a need for a powerful software tool to automatically or semi-automatically assist in identifying and correcting errors. Here, we use Python to develop a web-based software tool (SDMdata) to easily collect occurrence data from the Global Biodiversity Information Facility (GBIF) and check species names and the accuracy of coordinates (latitude and longitude). It is open-source software (GNU Affero General Public License/AGPL) that allows anyone to access and manipulate the source code. SDMdata is available online free of charge from . PMID:26030926

  2. Proposal for constructing an advanced software tool for planetary atmospheric modeling

    NASA Technical Reports Server (NTRS)

    Keller, Richard M.; Sims, Michael H.; Podolak, Esther; Mckay, Christopher P.; Thompson, David E.

    1990-01-01

    Scientific model building can be a time intensive and painstaking process, often involving the development of large and complex computer programs. Despite the effort involved, scientific models cannot easily be distributed and shared with other scientists. In general, implemented scientific models are complex, idiosyncratic, and difficult for anyone but the original scientist/programmer to understand. We believe that advanced software techniques can facilitate both the model building and model sharing process. We propose to construct a scientific modeling software tool that serves as an aid to the scientist in developing and using models. The proposed tool will include an interactive intelligent graphical interface and a high level, domain specific, modeling language. As a testbed for this research, we propose development of a software prototype in the domain of planetary atmospheric modeling.

  3. Simplified tools for evaluating domestic ventilation systems

    SciTech Connect

    Maansson, L.G.; Orme, M.

    1999-07-01

    Within an International Energy Agency (IEA) project, Annex 27, experts from 8 countries (Canada, France, Italy, Japan, The Netherlands, Sweden, UK and USA) have developed simplified tools for evaluating domestic ventilation systems during the heating season. Tools for building and user aspects, thermal comfort, noise, energy, life cycle cost, reliability and indoor air quality (IAQ) have been devised. The results can be used both for dwellings at the design stage and after construction. The tools lead to immediate answers and indications about the consequences of different choices that may arise during discussion with clients. This paper presents an introduction to these tools. Example applications of the simplified indoor air quality and energy tools are also provided. The IAQ tool accounts for constant emission sources, CO2, cooking products, tobacco smoke, condensation risks, humidity levels (i.e., for judging the risk of mould and house dust mites), and pressure differences (for identifying the risk of radon or landfill spillage entering the dwelling or problems with indoor combustion appliances). An elaborated set of design parameters was worked out, resulting in about 17,000 combinations. By using multi-variate analysis it was possible to reduce this to 174 combinations for IAQ. In addition, a sensitivity analysis was made using 990 combinations. The results from all the runs were used to develop a simplified tool, as well as quantifying equations relying on the design parameters. A computerized energy tool has also been developed within this project, which takes into account air tightness, climate, window airing pattern, outdoor air flow rate and heat exchange efficiency.

  4. Software Construction and Composition Tools for Petascale Computing SCW0837 Progress Report

    SciTech Connect

    Epperly, T W; Hochstein, L

    2011-09-12

    The majority of scientific software is distributed as source code. As the number of library dependencies and supported platforms increases, so does the complexity of describing the rules for configuring and building software. In this project, we have performed an empirical study of the magnitude of the build problem by examining the development history of two DOE-funded scientific software projects. We have developed MixDown, a meta-build tool, to simplify the task of building applications that depend on multiple third-party libraries. The results of this research indicate that the effort scientific programmers spend configuring and building software accounts for a significant fraction of the total development effort, and that the use of MixDown can significantly simplify the task of building software with multiple dependencies.

  5. An infrastructure for the creation of high end scientific and engineering software tools and applications

    SciTech Connect

    Drummond, L.A.; Marques, O.A.; Wilson, G.V.

    2003-04-01

    This document has been prepared as a response to the High End Computing Revitalization Task Force (HECRTF) call for white papers. Our main goal is to identify the mechanisms necessary for the design and implementation of an infrastructure to support development of high-end scientific and engineering software tools and applications. This infrastructure will provide a plethora of software services to facilitate the efficient deployment of future HEC technology as well as collaborations among researchers and engineers across disciplines and institutions. In particular, we address the following points: key software technologies that must be advanced to strengthen the foundation for developing new generations of HEC systems, and a software infrastructure for minimizing "time to solution" by users of HEC systems.

  6. RNAsoft: a suite of RNA secondary structure prediction and design software tools

    PubMed Central

    Andronescu, Mirela; Aguirre-Hernández, Rosalía; Condon, Anne; Hoos, Holger H.

    2003-01-01

    DNA and RNA strands are employed in novel ways in the construction of nanostructures, as molecular tags in libraries of polymers and in therapeutics. New software tools for prediction and design of molecular structure will be needed in these applications. The RNAsoft suite of programs provides tools for predicting the secondary structure of a pair of DNA or RNA molecules, testing that combinatorial tag sets of DNA and RNA molecules have no unwanted secondary structure and designing RNA strands that fold to a given input secondary structure. The tools are based on standard thermodynamic models of RNA secondary structure formation. RNAsoft can be found online at http://www.RNAsoft.ca. PMID:12824338

  7. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  8. Using McIDAS-V data analysis and visualization software as an educational tool for understanding the atmosphere

    NASA Astrophysics Data System (ADS)

    Achtor, T. H.; Rink, T.

    2010-12-01

    The University of Wisconsin’s Space Science and Engineering Center (SSEC) has been at the forefront in developing data analysis and visualization tools for environmental satellites and other geophysical data. The fifth generation of the Man-computer Interactive Data Access System (McIDAS-V) is Java-based, open-source, freely available software that operates on Linux, Macintosh and Windows systems. The software tools provide powerful new data manipulation and visualization capabilities that work with geophysical data in research, operational and educational environments. McIDAS-V provides unique capabilities to support innovative techniques for evaluating research results, teaching and training. McIDAS-V is based on three powerful software elements. VisAD is a Java library for building interactive, collaborative, 4 dimensional visualization and analysis tools. The Integrated Data Viewer (IDV) is a reference application based on the VisAD system and developed by the Unidata program that demonstrates the flexibility that is needed in this evolving environment, using a modern, object-oriented software design approach. The third tool, HYDRA, allows users to build, display and interrogate multi and hyperspectral environmental satellite data in powerful ways. The McIDAS-V software is being used for training and education in several settings. The McIDAS User Group provides training workshops at its annual meeting. Numerous online tutorials with training data sets have been developed to aid users in learning simple and more complex operations in McIDAS-V, all are available online. In a University of Wisconsin-Madison undergraduate course in Radar and Satellite Meteorology, McIDAS-V is used to create and deliver laboratory exercises using case study and real time data. At the high school level, McIDAS-V is used in several exercises in our annual Summer Workshop in Earth and Atmospheric Sciences to provide young scientists the opportunity to examine data with friendly and…

  9. A software tool for automatic analysis of selected area diffraction patterns within Digital Micrograph™.

    PubMed

    Wu, C H; Reynolds, W T; Murayama, M

    2012-01-01

    A software package, "SADP Tools", was developed as a complementary diffraction pattern analysis tool. The core program, called AutoSADP, is designed to facilitate automated measurements of d-spacings and interplanar angles from TEM selected area diffraction patterns (SADPs) of single crystals. The software uses iterative cross-correlations to locate the forward-scattered beam position and to find the coordinates of the diffraction spots. The newly developed algorithm is suitable for fully automated analysis and it works well with asymmetric diffraction patterns, off-zone-axis patterns, patterns with streaks, and noisy patterns such as Fast Fourier transforms of high-resolution images. The AutoSADP tool runs as a macro for the Digital Micrograph program and can determine d-spacing values and interplanar angles based on the pixel ratio with an accuracy of better than about 2%. PMID:22079497
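
    The measurements AutoSADP automates can be approximated with generic peak finding in place of the paper's iterative cross-correlation. In the sketch below, the camera constant (lambda x L, in angstrom-pixels) is an assumed calibration, and the pattern is a synthetic toy.

```python
# Hedged sketch of the two measurements such a tool automates: locate
# diffraction spots in a SADP image, then convert spot distances from the
# pattern center into d-spacings via an assumed camera constant. Generic
# local-maximum peak finding stands in for iterative cross-correlation.
import numpy as np
from scipy.ndimage import maximum_filter

def find_spots(img, size=9, threshold=0.5):
    """Return (row, col) coordinates of local maxima above a threshold."""
    peaks = (img == maximum_filter(img, size)) & (img > threshold * img.max())
    return np.argwhere(peaks)

def d_spacings(spots, center, camera_constant):
    """d = (lambda * L) / r for each spot at pixel distance r from center."""
    r = np.linalg.norm(spots - np.asarray(center), axis=1)
    r = r[r > 1e-6]  # drop the central (forward-scattered) beam itself
    return camera_constant / r

def interplanar_angle(p1, p2, center):
    """Angle (degrees) subtended at the pattern center by two spots."""
    v1, v2 = p1 - np.asarray(center), p2 - np.asarray(center)
    cosang = v1 @ v2 / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))

# Toy pattern: central beam plus two spots 100 px away, 90 degrees apart.
img = np.zeros((256, 256))
for rr, cc in [(128, 128), (128, 228), (28, 128)]:
    img[rr, cc] = 1.0
spots = find_spots(img)
print(d_spacings(spots, (128, 128), camera_constant=200.0))  # ~2.0 A each
print(interplanar_angle(np.array([28, 128]), np.array([128, 228]),
                        (128, 128)))  # ~90 degrees
```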

  10. IPAT: a freely accessible software tool for analyzing multiple patent documents with inbuilt landscape visualizer.

    PubMed

    Ajay, Dara; Gangwal, Rahul P; Sangamwar, Abhay T

    2015-01-01

    Intelligent Patent Analysis Tool (IPAT) is an online data retrieval tool that uses a text-mining algorithm to extract specific patent information in a predetermined pattern into an Excel sheet. The software is designed and developed to retrieve and analyze technology information from multiple patent documents and generate various patent landscape graphs and charts. The software is coded in C# in Visual Studio 2010; it extracts publicly available patent information from web pages such as Google Patents and simultaneously studies various technology trends based on user-defined parameters. In other words, IPAT, combined with manual categorization, can act as an excellent technology assessment tool in competitive intelligence and due diligence for predicting the future R&D forecast. PMID:26452016

  11. A transparent and transportable methodology for evaluating Data Linkage software.

    PubMed

    Ferrante, Anna; Boyd, James

    2012-02-01

    There has been substantial growth in Data Linkage (DL) activities in recent years. This reflects growth in both the demand for, and the supply of, linked or linkable data. Increased utilisation of DL "services" has brought with it an increased need for impartial information about the suitability and performance capabilities of DL software programs and packages. Although evaluations of DL software exist, most have been restricted to the comparison of two or three packages. Evaluations of a large number of packages are rare because of the time and resource burden placed on the evaluators and the need for a suitable "gold standard" evaluation dataset. In this paper we present an evaluation methodology that overcomes a number of these difficulties. Our approach involves the generation and use of representative synthetic data; the execution of a series of linkages using a pre-defined linkage strategy; and the use of standard linkage quality metrics to assess performance. The methodology is both transparent and transportable, producing genuinely comparable results. The methodology was used by the Centre for Data Linkage (CDL) at Curtin University in an evaluation of ten DL software packages. It is also being used to evaluate larger linkage systems (not just packages). The methodology provides a unique opportunity to benchmark the quality of linkages in different operational environments. PMID:22061295
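
    The final step of the methodology, scoring a linkage run against the known truth embedded in the synthetic data, can be sketched with standard linkage quality metrics. The record-pair sets below are invented placeholders.

```python
# Minimal sketch of linkage quality assessment: compare the record pairs a
# linkage run produced against the known true pairs from synthetic data,
# and report precision, recall, and F-measure.
def linkage_quality(predicted, truth):
    """Precision, recall and F-measure over sets of record-ID pairs."""
    predicted = {tuple(sorted(p)) for p in predicted}
    truth = {tuple(sorted(p)) for p in truth}
    tp = len(predicted & truth)
    precision = tp / len(predicted) if predicted else 0.0
    recall = tp / len(truth) if truth else 0.0
    f = (2 * precision * recall / (precision + recall)
         if precision + recall else 0.0)
    return precision, recall, f

# Invented example: one wrong link (A2-B8) and one missed link (A2-B9).
truth = {("A1", "B7"), ("A2", "B9"), ("A3", "B4")}
predicted = {("A1", "B7"), ("A2", "B8"), ("A3", "B4")}
print("P=%.2f R=%.2f F=%.2f" % linkage_quality(predicted, truth))
```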

  12. Some Remarks on Guidelines for Evaluating Statistical Software.

    ERIC Educational Resources Information Center

    Cox, Lawrence H.; Eddy, William F.

    The advisability of drafting guidelines for evaluating statistical software is considered. The Committee on Applied and Theoretical Statistics of the National Research Council has decided to initiate a project to articulate issues relating to guidelines and to determine their priorities. Because there has been a proliferation in statistical…

  13. R&D Speaks: Evaluation of Educational Software.

    ERIC Educational Resources Information Center

    Southwest Educational Development Lab., Austin, TX.

    A brief introduction states the purpose of and summarizes the five presentations made at a Southwest Educational Development Laboratory Regional Exchange (SEDL/RX) Project conference and notes that the SEDL/RX publication, "Evaluation of Educational Software: A Guide to Guides," was produced for use as a reference guide by participants in this…

  14. Microcomputers: Instrument Generation Software. Evaluation Guides. Guide Number 11.

    ERIC Educational Resources Information Center

    Gray, Peter J.

    Designed to assist evaluators in selecting the appropriate software for the generation of various data collection instruments, this guide discusses such key program characteristics as text entry, item storage and manipulation, item retrieval, and printing. Some characteristics of a good instrument generation program are discussed; these include…

  15. Software Engineering Laboratory (SEL) programmer workbench phase 1 evaluation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Phase 1 of the SEL programmer workbench consists of the design of the following three components: communications link, command language processor, and collection of software aids. A brief description, and evaluation, and recommendations are presented for each of these three components.

  16. Criteria for Evaluating and Selecting Multimedia Software for Instruction.

    ERIC Educational Resources Information Center

    Lee, Sung Heum; And Others

    Evaluating and selecting the appropriate software is a very important component of success in using multimedia systems in both educational and corporate settings. Computer-mediated multimedia (CMM) is the integration of two or more communication media, controlled or manipulated by the user via a computer, to present information. CMM can be…

  17. A comprehensive evaluation of assembly scaffolding tools

    PubMed Central

    2014-01-01

    Background Genome assembly is typically a two-stage process: contig assembly followed by the use of paired sequencing reads to join contigs into scaffolds. Scaffolds are usually the focus of reported assembly statistics; longer scaffolds greatly facilitate the use of genome sequences in downstream analyses, and it is appealing to present larger numbers as metrics of assembly performance. However, scaffolds are highly prone to errors, especially when generated using short reads, which can directly result in inflated assembly statistics. Results Here we provide the first independent evaluation of scaffolding tools for second-generation sequencing data. We find large variations in the quality of results depending on the tool and dataset used. Even extremely simple test cases of perfect input, constructed to elucidate the behaviour of each algorithm, produced some surprising results. We further dissect the performance of the scaffolders using real and simulated sequencing data derived from the genomes of Staphylococcus aureus, Rhodobacter sphaeroides, Plasmodium falciparum and Homo sapiens. The results from simulated data are of high quality, with several of the tools producing perfect output. However, at least 10% of joins remain unidentified when using real data. Conclusions The scaffolders vary in their usability, speed and number of correct and missed joins made between contigs. Results from real data highlight opportunities for further improvements of the tools. Overall, SGA, SOPRA and SSPACE generally outperform the other tools on our datasets. However, the quality of the results is highly dependent on the read mapper and genome complexity. PMID:24581555
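
    Counting correct and missed joins requires comparing orientation-aware contig adjacencies, where a join and its reverse complement describe the same physical join. A minimal sketch of that bookkeeping follows, with invented contig names; it illustrates the metric, not any particular scaffolder's evaluation code.

```python
# Hedged sketch of join evaluation: each join is an orientation-aware
# adjacency between two contigs, canonicalized so that a join and its
# reverse complement compare equal. Contig names and joins are invented.
def canonical(join):
    """A join is ((contig, strand), (contig, strand)); flipping both
    strands and reversing the order describes the same physical join."""
    (a, sa), (b, sb) = join
    flipped = ((b, "-" if sb == "+" else "+"), (a, "-" if sa == "+" else "+"))
    return min(join, flipped)

def evaluate_joins(predicted, truth):
    """Count correct, wrong, and missed joins between two join sets."""
    pred = {canonical(j) for j in predicted}
    true = {canonical(j) for j in truth}
    return {"correct": len(pred & true),
            "wrong": len(pred - true),
            "missed": len(true - pred)}

truth = [(("c1", "+"), ("c2", "+")), (("c2", "+"), ("c3", "-"))]
predicted = [(("c3", "+"), ("c2", "-")),  # same join, written reversed
             (("c1", "+"), ("c4", "+"))]  # a wrong join
print(evaluate_joins(predicted, truth))
```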

  18. Second Evaluation of Job Queuing/Scheduling Software. Phase 1

    NASA Technical Reports Server (NTRS)

    Jones, James Patton; Brickell, Cristy; Chancellor, Marisa (Technical Monitor)

    1997-01-01

    The recent proliferation of high performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, NAS compiled a requirements checklist for job queuing/scheduling software. Next, NAS evaluated the leading job management system (JMS) software packages against the checklist. A year has now elapsed since the first comparison was published, and NAS has repeated the evaluation. This report describes this second evaluation and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still lacking; however, definite progress has been made by the vendors to correct the deficiencies. This report is supplemented by a WWW interface to the data collected, to aid other sites in extracting the evaluation information on specific requirements of interest.

  19. Evaluating software development by analysis of changes - Some data from the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Weiss, D. M.; Basili, V. R.

    1985-01-01

    Basili and Weiss (1984) have discussed an approach for obtaining valid data which may be used to evaluate software development methodologies in a production environment. The methodology consists of five elements, including the identification of goals, the determination of questions of interest from the goals, the development of a data collection form, the development of data collection procedures, and the validation and analysis of the data. The current investigation is concerned with the presentation of the results from such an evaluation. The presented data were collected as part of studies reported by Basili et al. (1977). These studies had been conducted by NASA's Software Engineering Laboratory (SEL). Attention is given to an overview of the SEL, the application of the considered methodology, the results of a data analysis, and conclusions about the SEL environment.

  20. Backup flight control system functional evaluator software manual

    NASA Technical Reports Server (NTRS)

    Helmke, C. A.; Hasara, S. H.; Mount, F. E.

    1977-01-01

    The software for the Backup Flight Control System Functional Evaluator (BFCSFE) on a Data General Corporation Nova 1200 computer consists of three programs: the ground support program, the operational flight program (OFP), and the ground pulse code modulation (PCM) program. The Nova OFP software is structurally as close as possible to the AP101 code; therefore, this document highlights and describes only those areas of the Nova OFP that are significantly different from the AP101. Since the Ground Support Program was developed to meet BFCSFE requirements and differs considerably from the AP101 code, it is described in detail.

  1. DAISY: a new software tool to test global identifiability of biological and physiological systems.

    PubMed

    Bellu, Giuseppina; Saccomani, Maria Pia; Audoly, Stefania; D'Angiò, Leontina

    2007-10-01

    A priori global identifiability is a structural property of biological and physiological models. It is considered a prerequisite for well-posed estimation, since it concerns the possibility of recovering uniquely the unknown model parameters from measured input-output data, under ideal conditions (noise-free observations and error-free model structure). Of course, determining if the parameters can be uniquely recovered from observed data is essential before investing resources, time and effort in performing actual biomedical experiments. Many interesting biological models are nonlinear, but identifiability analysis for nonlinear systems turns out to be a difficult mathematical problem. Different methods have been proposed in the literature to test identifiability of nonlinear models but, to the best of our knowledge, so far no software tools have been proposed for automatically checking identifiability of nonlinear models. In this paper, we describe a software tool implementing a differential algebra algorithm to perform parameter identifiability analysis for (linear and) nonlinear dynamic models described by polynomial or rational equations. Our goal is to provide the biological investigator with a completely automated software tool, requiring minimum prior knowledge of mathematical modelling and no in-depth understanding of the mathematical tools. The DAISY (Differential Algebra for Identifiability of SYstems) software will potentially be useful in biological modelling studies, especially in physiology and clinical medicine, where research experiments are particularly expensive and/or difficult to perform. Practical examples of use of the software tool DAISY are presented. DAISY is available at the web site http://www.dei.unipd.it/~pia/. PMID:17707944

  2. A Review of Diffusion Tensor Magnetic Resonance Imaging Computational Methods and Software Tools

    PubMed Central

    Hasan, Khader M.; Walimuni, Indika S.; Abid, Humaira; Hahn, Klaus R.

    2010-01-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766

  3. A review of diffusion tensor magnetic resonance imaging computational methods and software tools.

    PubMed

    Hasan, Khader M; Walimuni, Indika S; Abid, Humaira; Hahn, Klaus R

    2011-12-01

    In this work we provide an up-to-date short review of computational magnetic resonance imaging (MRI) and software tools that are widely used to process and analyze diffusion-weighted MRI data. A review of different methods used to acquire, model and analyze diffusion-weighted imaging data (DWI) is first provided with focus on diffusion tensor imaging (DTI). The major preprocessing, processing and post-processing procedures applied to DTI data are discussed. A list of freely available software packages to analyze diffusion MRI data is also provided. PMID:21087766
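
    As a concrete illustration of the post-processing stage such reviews cover, the sketch below (ours, not from the paper) computes the two standard DTI scalar maps, mean diffusivity (MD) and fractional anisotropy (FA), from one voxel's diffusion tensor using numpy.

        import numpy as np

        def md_fa(tensor):
            """MD and FA from a 3x3 symmetric diffusion tensor."""
            lam = np.linalg.eigvalsh(tensor)          # eigenvalues
            md = lam.mean()                           # mean diffusivity
            num = np.sqrt(((lam - md) ** 2).sum())
            den = np.sqrt((lam ** 2).sum())
            fa = np.sqrt(1.5) * num / den if den > 0 else 0.0
            return md, fa

        # Prolate tensor typical of white matter (units: mm^2/s)
        D = np.diag([1.7e-3, 0.3e-3, 0.3e-3])
        print(md_fa(D))    # MD ~ 0.77e-3, FA ~ 0.80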

  4. Cerec Smile Design--a software tool for the enhancement of restorations in the esthetic zone.

    PubMed

    Kurbad, Andreas; Kurbad, Susanne

    2013-01-01

    Restorations in the esthetic zone can now be enhanced using software tools. In addition to the design of the restoration itself, a part or all of the patient's face can be displayed on the monitor to increase the predictability of treatment results. Using the Smile Design components of the Cerec and inLab software, a digital photograph of the patient can be projected onto a three-dimensional dummy head. In addition to its use for the enhancement of the CAD process, this technology can also be utilized for marketing purposes. PMID:24364196

  5. Software tools for simultaneous data visualization and T cell epitopes and disorder prediction in proteins.

    PubMed

    Jandrlić, Davorka R; Lazić, Goran M; Mitić, Nenad S; Pavlović, Mirjana D

    2016-04-01

    We have developed EpDis and MassPred, extendable open source software tools that support bioinformatic research and enable parallel use of different methods for the prediction of T cell epitopes, disorder and disordered binding regions and hydropathy calculation. These tools offer a semi-automated installation of chosen sets of external predictors and an interface allowing for easy application of the prediction methods, which can be applied either to individual proteins or to datasets of a large number of proteins. In addition to access to prediction methods, the tools also provide visualization of the obtained results, calculation of consensus from the results of different methods, as well as import of experimental data and their comparison with results obtained with different predictors. The tools also offer a graphical user interface and the possibility to store data and the results obtained using all of the integrated methods in a relational database or a flat file for further analysis. The MassPred part enables massively parallel application of all integrated predictors to a set of proteins. Both tools can be downloaded from http://bioinfo.matf.bg.ac.rs/home/downloads.wafl?cat=Software. Appendix A includes the technical description of the created tools and a list of supported predictors. PMID:26851400
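
    The consensus calculation mentioned above can be as simple as a per-residue majority vote over the binary calls of several predictors; the toy sketch below (ours, not the EpDis/MassPred code) illustrates the idea.

        import numpy as np

        preds = np.array([          # rows: predictors, cols: residues
            [0, 1, 1, 1, 0, 0],
            [0, 1, 1, 0, 0, 1],
            [1, 1, 0, 1, 0, 0],
        ])
        consensus = (preds.mean(axis=0) >= 0.5).astype(int)
        print(consensus)            # -> [0 1 1 1 0 0]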

  6. Pathway Tools version 13.0: integrated software for pathway/genome informatics and systems biology

    PubMed Central

    Paley, Suzanne M.; Krummenacker, Markus; Latendresse, Mario; Dale, Joseph M.; Lee, Thomas J.; Kaipa, Pallavi; Gilham, Fred; Spaulding, Aaron; Popescu, Liviu; Altman, Tomer; Paulsen, Ian; Keseler, Ingrid M.; Caspi, Ron

    2010-01-01

    Pathway Tools is a production-quality software environment for creating a type of model-organism database called a Pathway/Genome Database (PGDB). A PGDB such as EcoCyc integrates the evolving understanding of the genes, proteins, metabolic network and regulatory network of an organism. This article provides an overview of Pathway Tools capabilities. The software performs multiple computational inferences including prediction of metabolic pathways, prediction of metabolic pathway hole fillers and prediction of operons. It enables interactive editing of PGDBs by DB curators. It supports web publishing of PGDBs, and provides a large number of query and visualization tools. The software also supports comparative analyses of PGDBs, and provides several systems biology analyses of PGDBs including reachability analysis of metabolic networks, and interactive tracing of metabolites through a metabolic network. More than 800 PGDBs have been created using Pathway Tools by scientists around the world, many of which are curated DBs for important model organisms. Those PGDBs can be exchanged using a peer-to-peer DB sharing system called the PGDB Registry. PMID:19955237
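
    As an illustration of the reachability analysis mentioned above, the following toy sketch (ours, not Pathway Tools code; the metabolite names are made up) propagates a seed set of metabolites forward through a reaction list until no new compounds appear.

        def reachable(reactions, seeds):
            """reactions: list of (substrate_set, product_set) pairs."""
            known = set(seeds)
            changed = True
            while changed:
                changed = False
                for subs, prods in reactions:
                    if subs <= known and not prods <= known:
                        known |= prods
                        changed = True
            return known

        rxns = [({"glucose", "atp"}, {"g6p", "adp"}),
                ({"g6p"}, {"f6p"}),
                ({"f6p", "atp"}, {"fbp", "adp"})]
        print(reachable(rxns, {"glucose", "atp"}))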

  7. Distortion Analysis Toolkit—A Software Tool for Easy Analysis of Nonlinear Audio Systems

    NASA Astrophysics Data System (ADS)

    Pakarinen, Jyri

    2010-12-01

    Several audio effects devices deliberately add nonlinear distortion to the processed signal in order to create a desired sound. When creating virtual analog models of nonlinearly distorting devices, it would be very useful to carefully analyze the type of distortion, so that the model could be made as realistic as possible. While traditional system analysis tools such as the frequency response give detailed information on the operation of linear and time-invariant systems, they are less useful for analyzing nonlinear devices. Furthermore, although there do exist separate algorithms for nonlinear distortion analysis, there is currently no unified, easy-to-use tool for rapid analysis of distorting audio systems. This paper offers a remedy by introducing a new software tool for easy analysis of distorting effects. A comparison between a well-known guitar tube amplifier and two commercial software simulations is presented as a case study. This freely available software is written in Matlab language, but the analysis tool can also run as a standalone program, so the user does not need to have Matlab installed in order to perform the analysis.
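
    The classic starting point for such an analysis is to drive a memoryless nonlinearity with a sine and read the harmonic levels off an FFT to estimate total harmonic distortion (THD). The sketch below is ours, not the toolkit (which is Matlab code); tanh stands in for a tube-like clipper.

        import numpy as np

        fs, f0, n = 48000, 1000, 48000
        t = np.arange(n) / fs
        x = np.sin(2 * np.pi * f0 * t)
        y = np.tanh(3 * x)                     # stand-in nonlinearity

        spec = np.abs(np.fft.rfft(y * np.hanning(n)))
        bins = [k * f0 * n // fs for k in range(1, 6)]
        amps = [spec[b - 2:b + 3].max() for b in bins]  # peak near each harmonic
        thd = np.sqrt(sum(a * a for a in amps[1:])) / amps[0]
        print(f"THD ~ {100 * thd:.1f}%")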

  8. A Runtime Environment for Supporting Research in Resilient HPC System Software & Tools

    SciTech Connect

    Vallee, Geoffroy R; Naughton, III, Thomas J; Boehm, Swen; Engelmann, Christian

    2013-01-01

    The high-performance computing (HPC) community continues to increase the size and complexity of hardware platforms that support advanced scientific workloads. The runtime environment (RTE) is a crucial layer in the software stack for these large-scale systems. The RTE manages the interface between the operating system and the application running in parallel on the machine. The deployment of applications and tools on large-scale HPC computing systems requires the RTE to manage process creation in a scalable manner, support sparse connectivity, and provide fault tolerance. We have developed a new RTE that provides a basis for building distributed execution environments and developing tools for HPC to aid research in system software and resilience. This paper describes the software architecture of the Scalable runTime Component Infrastructure (STCI), which is intended to provide a complete infrastructure for scalable start-up and management of many processes in large-scale HPC systems. We highlight features of the current implementation, which is provided as a system library that allows developers to easily use and integrate STCI in their tools and/or applications. The motivation for this work has been to support ongoing research activities in fault-tolerance for large-scale systems. We discuss the advantages of the modular framework employed and describe two use cases that demonstrate its capabilities: (i) an alternate runtime for a Message Passing Interface (MPI) stack, and (ii) a distributed control and communication substrate for a fault-injection tool.

  9. Case study for the evaluation and selection of man-machine interface (MMI) software

    SciTech Connect

    Nekimken, H.; Pope, N.; Macdonald, J.; Bibeau, R.; Gomez, B.; Sellon, D.

    1996-06-01

    The authors evaluated three of the top man-machine interface (MMI) software systems. The main categories upon which they based their evaluation were the following: operator interface; network and data distribution; input/output (I/O) interface; application development; alarms; real-time and historical trending; support, documentation, and training; processing tools (batch, recipe, logic); reports; custom interfacing; start-up/recovery; external database; and multimedia. They also present their MMI requirements and guidelines for the selection and evaluation of these MMI systems.

  10. Evaluation of Job Queuing/Scheduling Software: Phase I Report

    NASA Technical Reports Server (NTRS)

    Jones, James Patton

    1996-01-01

    The recent proliferation of high-performance workstations and the increased reliability of parallel systems have illustrated the need for robust job management systems to support parallel applications. To address this issue, the Numerical Aerodynamic Simulation (NAS) supercomputer facility compiled a requirements checklist for job queuing/scheduling software. Next, NAS began an evaluation of the leading job management system (JMS) software packages against the checklist. This report describes the three-phase evaluation process and presents the results of Phase 1: Capabilities versus Requirements. We show that JMS support for running parallel applications on clusters of workstations and parallel systems is still insufficient, even in the leading JMSs. However, by ranking each JMS evaluated against the requirements, we provide data that will be useful to other sites in selecting a JMS.
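
    The ranking step can be pictured as a weighted scorecard, as in this sketch (the requirement names, weights, and scores are invented, not NAS's actual checklist):

        requirements = {"parallel job support": 3, "fault tolerance": 2,
                        "checkpointing": 2, "fair-share scheduling": 1}
        scores = {      # 0 = unsupported .. 2 = fully supported
            "JMS-A": {"parallel job support": 1, "fault tolerance": 2,
                      "checkpointing": 0, "fair-share scheduling": 2},
            "JMS-B": {"parallel job support": 2, "fault tolerance": 1,
                      "checkpointing": 1, "fair-share scheduling": 1},
        }
        for jms, s in scores.items():
            total = sum(w * s[r] for r, w in requirements.items())
            print(jms, total)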

  11. Development, comparison, and evaluation of software for radial distortion elimination

    NASA Astrophysics Data System (ADS)

    Papadaki, A. I.; Georgopoulos, A.

    2015-05-01

    Lately, the interest of the Computer Vision and Photogrammetry communities has focused on automating the identification and elimination of radial distortion, with the aim of correcting image coordinates and finally obtaining digital images with reliable geometric information. This effort has reached the point of development of commercial and free image-processing software claiming that it can automatically identify and remove radial distortion from an image. In this paper, in-depth research has been conducted on radial distortion and the methods for its identification and elimination. Specifically, we attempted to evaluate such software for its effectiveness, accuracy and applicability to the elimination of radial distortion from images. To attain this aim, four different methods of comparison and evaluation of the performance of the software, with respect to the correction of an image, have been employed. The applied methods are (i) optical evaluation of the produced digital images, (ii) subtraction of the images, (iii) comparison of the curves of the remaining radial distortion in the images, and (iv) comparison of the results of the orientation of an image pair. However, it was very important to have a benchmark for the evaluation, in order to ensure the objectivity and accuracy of the comparison. Therefore, a new reliable algorithm of known and controllable accuracy has been developed. The results of these comparisons are presented and evaluated for their reliability and usefulness.
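
    For reference, the model such software estimates and inverts is usually the polynomial radial distortion x_d = x_u (1 + k1 r^2 + k2 r^4). A minimal forward implementation (ours, with illustrative coefficients) is shown below; undistorting an image amounts to numerically inverting this mapping.

        import numpy as np

        def distort(xy, k1, k2):
            """x_d = x_u * (1 + k1*r^2 + k2*r^4), normalized coords,
            radially about the principal point at the origin."""
            p = np.asarray(xy, dtype=float)
            r2 = (p ** 2).sum(axis=-1, keepdims=True)
            return p * (1.0 + k1 * r2 + k2 * r2 ** 2)

        pts = np.array([[0.0, 0.0], [0.5, 0.0], [0.5, 0.5]])
        print(distort(pts, k1=-0.2, k2=0.05))   # barrel: points move inward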

  12. OERL: A Tool For Geoscience Education Evaluators

    NASA Astrophysics Data System (ADS)

    Zalles, D. R.

    2002-12-01

    The Online Evaluation Resource Library (OERL) is a Web-based set of resources for improving the evaluation of projects funded by the Directorate for Education and Human Resources (EHR) of the National Science Foundation (NSF). OERL provides prospective project developers and evaluators with material that they can use to design, conduct, document, and review evaluations. OERL helps evaluators tackle the challenges of seeing if a project is meeting its implementation and outcome-related goals. Within OERL is a collection of exemplary plans, instruments, and reports from evaluations of EHR-funded projects in the geosciences and in other areas of science and mathematics. In addition, OERL contains criteria about good evaluation practices, professional development modules about evaluation design and questionnaire development, a dictionary of key evaluation terms, and links to evaluation standards. Scenarios illustrate how the resources can be used or adapted. Currently housed in OERL are 137 instruments, and full or excerpted versions of 38 plans and 60 reports. 143 science and math projects have contributed to the collection so far. OERL's search tool permits the launching of precise searches based on key attributes of resources such as their subject area and the name of the sponsoring university or research institute. OERL's goals are to 1) meet the needs for continuous professional development of evaluators and principal investigators, 2) complement traditional vehicles of learning about evaluation, 3) utilize the affordances of current technologies (e.g., Web-based digital libraries, relational databases, and electronic performance support systems) for improving evaluation practice, 4) provide anytime/anyplace access to update-able resources that support evaluators' needs, and 5) provide a forum by which professionals can interact on evaluation issues and practices. Geoscientists can search the collection of resources from geoscience education projects that have

  13. The State of Children's Software Evaluation--Yesterday, Today, and in the 21st Century.

    ERIC Educational Resources Information Center

    Buckleitner, Warren

    1999-01-01

    Examines the current state of children's software evaluation practice. Predicts future educational software evaluation in light of the dynamic nature of educational and "edutainment" software. Discusses key issues such as the best way to evaluate the appropriateness of software for children at each age group and the most efficient means of making this…

  14. PC Software graphics tool for conceptual design of space/planetary electrical power systems

    NASA Technical Reports Server (NTRS)

    Truong, Long V.

    1995-01-01

    This paper describes the Decision Support System (DSS), a personal computer software graphics tool for designing conceptual space and/or planetary electrical power systems. By using the DSS, users can obtain desirable system design and operating parameters, such as system weight, electrical distribution efficiency, and bus power. With this tool, a large-scale specific power system was designed in a matter of days. It is an excellent tool to help designers make tradeoffs between system components, hardware architectures, and operation parameters in the early stages of the design cycle. The DSS is a user-friendly, menu-driven tool with online help and a custom graphical user interface. An example design and results are illustrated for a typical space power system with multiple types of power sources, frequencies, energy storage systems, and loads.

  15. RAVEN as a tool for dynamic probabilistic risk assessment: Software overview

    SciTech Connect

    Alfonsi, A.; Rabiti, C.; Mandelli, D.; Cogliati, J. J.; Kinoshita, R. A.

    2013-07-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities. (authors)

  16. RAVEN AS A TOOL FOR DYNAMIC PROBABILISTIC RISK ASSESSMENT: SOFTWARE OVERVIEW

    SciTech Connect

    Alfonsi Andrea; Mandelli Diego; Rabiti Cristian; Joshua Cogliati; Robert Kinoshita

    2013-05-01

    RAVEN is a software tool under development at the Idaho National Laboratory (INL) that acts as the control logic driver and post-processing tool for the newly developed Thermal-Hydraulic code RELAP-7. The scope of this paper is to show the software structure of RAVEN and its utilization in connection with RELAP-7. A short overview of the mathematical framework behind the code is presented along with its main capabilities such as on-line controlling/monitoring and Monte-Carlo sampling. A demo of a Station Black Out PRA analysis of a simplified Pressurized Water Reactor (PWR) model is shown in order to demonstrate the Monte-Carlo and clustering capabilities.
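
    To give a flavor of the Monte-Carlo sampling RAVEN drives (a toy illustration of ours, not RAVEN or RELAP-7 code; the recovery-time distribution and battery life are assumed), consider estimating the core-damage probability of a station-blackout sequence from sampled diesel-generator recovery times:

        import numpy as np

        rng = np.random.default_rng(0)
        n = 100_000
        battery_life = 4.0                          # hours, assumed
        dg_recovery = rng.lognormal(mean=1.0, sigma=0.8, size=n)   # hours
        core_damage = dg_recovery > battery_life    # simplistic criterion
        print(f"P(core damage) ~ {core_damage.mean():.3f}")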

  17. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
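
    One simple automated check of the kind described, flagging points that sit far from a rolling median on a robust (MAD) scale, might look like this sketch of ours:

        import numpy as np

        def flag_outliers(x, window=11, k=5.0):
            x = np.asarray(x, dtype=float)
            pad = window // 2
            xp = np.pad(x, pad, mode='edge')
            med = np.array([np.median(xp[i:i + window]) for i in range(len(x))])
            resid = x - med
            mad = np.median(np.abs(resid)) or 1e-12   # robust scale
            return np.abs(resid) > k * 1.4826 * mad

        data = np.sin(np.linspace(0, 6, 200))
        data[50] += 5.0                          # inject a spike
        print(np.where(flag_outliers(data))[0])  # -> [50]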

  18. The -mdoc macro package: A software tool to support computer documentation standards

    SciTech Connect

    Sanders, C.E.

    1987-09-16

    At Los Alamos National Laboratory a small staff of writers and word processors in the Computer Documentation Group is responsible for producing computer documentation for the over 8000 users of the Laboratory's computer network. The -mdoc macro package was developed as a software tool to support that effort. The -mdoc macro package is used with the NROFF/TROFF document preparation system on the UNIX operating system. The -mdoc macro package incorporates the standards for computer documentation at Los Alamos that were established by the writers. Use of the -mdoc macro package has freed the staff of programming format details, allowing writers to concentrate on content of documents and word processors to produce documents in a timely manner. It is an easy-to-use software tool that adapts to changing skills, needs, and technology. 5 refs.

  19. Object-oriented software for evaluating measurement uncertainty

    NASA Astrophysics Data System (ADS)

    Hall, B. D.

    2013-05-01

    An earlier publication (Hall 2006 Metrologia 43 L56-61) introduced the notion of an uncertain number that can be used in data processing to represent quantity estimates with associated uncertainty. The approach can be automated, allowing data processing algorithms to be decomposed into convenient steps, so that complicated measurement procedures can be handled. This paper illustrates the uncertain-number approach using several simple measurement scenarios and two different software tools. One is an extension library for Microsoft Excel®. The other is a special-purpose calculator using the Python programming language.
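
    The uncertain-number idea can be sketched in a few lines: wrap a value together with its standard uncertainty and propagate first-order (GUM) uncertainty through overloaded operators. This toy of ours handles independent inputs only; Hall's actual libraries also track correlations between quantities.

        import math

        class UN:
            def __init__(self, x, u):
                self.x, self.u = x, u      # value, standard uncertainty
            def __add__(self, o):
                return UN(self.x + o.x, math.hypot(self.u, o.u))
            def __mul__(self, o):
                x = self.x * o.x
                return UN(x, abs(x) * math.hypot(self.u / self.x, o.u / o.x))
            def __repr__(self):
                return f"{self.x:.4g} +/- {self.u:.2g}"

        V = UN(12.00, 0.05)     # volts
        I = UN(1.50, 0.02)      # amperes
        print(V * I)            # power with propagated uncertainty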

  20. Project I-COP - architecture of software tool for decision support in oncology.

    PubMed

    Blaha, Milan; Janča, Dalibor; Klika, Petr; Mužík, Jan; Dušek, Ladislav

    2013-01-01

    This article briefly describes the development of the I-COP tool, which is designed to promote education and decision making of clinical oncologists. It is based on real data from medical facilities, which are processed, stored in a database, analyzed and finally displayed in an interactive software application. The data sources used are briefly described in individual sections, together with the functionality of the developed tools. The final goal of this project is to provide support for work and education within each involved partner center. Clinical oncologists are therefore supposed to be the authors and users at the same time. PMID:23542983

  1. A SOFTWARE TOOL TO COMPARE MEASURED AND SIMULATED BUILDING ENERGY PERFORMANCE DATA

    SciTech Connect

    Maile, Tobias; Bazjanac, Vladimir; O'Donnell, James; Garr, Matthew

    2011-11-01

    Building energy performance is often inadequate when compared to design goals. To link design goals to actual operation one can compare measured with simulated energy performance data. Our previously developed comparison approach is the Energy Performance Comparison Methodology (EPCM), which enables the identification of performance problems based on a comparison of measured and simulated performance data. In context of this method, we developed a software tool that provides graphing and data processing capabilities of the two performance data sets. The software tool called SEE IT (Stanford Energy Efficiency Information Tool) eliminates the need for manual generation of data plots and data reformatting. SEE IT makes the generation of time series, scatter and carpet plots independent of the source of data (measured or simulated) and provides a valuable tool for comparing measurements with simulation results. SEE IT also allows assigning data points on a predefined building object hierarchy and supports different versions of simulated performance data. This paper briefly introduces the EPCM, describes the SEE IT tool and illustrates its use in the context of a building case study.
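
    Of the plot types mentioned, the carpet plot is the least familiar: it is simply an hour-by-day heat map of a performance time series. A minimal matplotlib sketch (ours, on synthetic data) follows.

        import numpy as np
        import matplotlib.pyplot as plt

        hourly = np.random.default_rng(1).gamma(2.0, 10.0, size=24 * 60)  # 60 days
        carpet = hourly.reshape(60, 24)       # rows: days, cols: hours

        plt.imshow(carpet.T, aspect='auto', origin='lower', cmap='viridis')
        plt.xlabel('day'); plt.ylabel('hour of day')
        plt.colorbar(label='measured load [kW]')
        plt.title('Carpet plot of measured energy use')
        plt.show()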

  2. A new software tool for computing Earth's atmospheric transmission of near- and far-infrared radiation

    NASA Technical Reports Server (NTRS)

    Lord, Steven D.

    1992-01-01

    This report describes a new software tool, ATRAN, which computes the transmittance of Earth's atmosphere at near- and far-infrared wavelengths. We compare the capabilities of this program with others currently available and demonstrate its utility for observational data calibration and reduction. The program employs current water-vapor and ozone models to produce fast and accurate transmittance spectra for wavelengths ranging from 0.8 microns to 10 mm.
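
    At a single wavelength the computed quantity reduces to Beer-Lambert attenuation, T = exp(-tau); the toy below (our assumed coefficient, not ATRAN's line-by-line physics) shows how transmittance drops as the water-vapor column grows.

        import numpy as np

        tau_per_mm_pwv = 0.12                  # assumed optical depth per mm
        pwv = np.array([1.0, 3.0, 10.0])       # precipitable water vapor, mm
        print(np.exp(-tau_per_mm_pwv * pwv))   # transmittance per column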

  3. PlanetPack software tool for exoplanets detection: coming new features

    NASA Astrophysics Data System (ADS)

    Baluev, Roman V.

    2014-07-01

    We briefly overview the new features of PlanetPack2, the forthcoming update of PlanetPack, which is a software tool for exoplanets detection and characterization from Doppler radial velocity data. Among other things, this major update brings parallelized computing, new advanced models of the Doppler noise, handling of the so-called Keplerian periodogram, and routines for transits fitting and transit timing variation analysis.
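
    PlanetPack itself is a compiled package, but the first step of Doppler exoplanet detection, scanning radial-velocity data for periodicity, can be sketched with astropy's Lomb-Scargle periodogram (a simpler stand-in for the Keplerian periodogram mentioned above; the data here are synthetic).

        import numpy as np
        from astropy.timeseries import LombScargle

        rng = np.random.default_rng(2)
        t = np.sort(rng.uniform(0, 180, 80))                  # epochs, days
        rv = 35.0 * np.sin(2 * np.pi * t / 12.7) \
             + rng.normal(0, 5, t.size)                       # m/s

        freq, power = LombScargle(t, rv, dy=5.0).autopower()
        print(f"best period ~ {1 / freq[np.argmax(power)]:.1f} d")   # ~12.7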

  4. Tools to aid the specification and design of flight software, appendix B

    NASA Technical Reports Server (NTRS)

    Bristow, G.

    1980-01-01

    The tasks that are normally performed during the specification and architecture design stages of software development are identified. Ways that tools could perform, or aid the performance, of such tasks are also identified. Much of the verification and analysis that is suggested is currently rarely performed during these early stages, but it is believed that this analysis should be done as early as possible so as to detect errors as early as possible.

  5. Techniques and tools for measuring energy efficiency of scientific software applications

    NASA Astrophysics Data System (ADS)

    Abdurachmanov, David; Elmer, Peter; Eulisse, Giulio; Knight, Robert; Niemi, Tapio; Nurminen, Jukka K.; Nyback, Filip; Pestana, Gonçalo; Ou, Zhonghong; Khan, Kashif

    2015-05-01

    The scale of scientific High Performance Computing (HPC) and High Throughput Computing (HTC) has increased significantly in recent years, and is becoming sensitive to total energy use and cost. Energy-efficiency has thus become an important concern in scientific fields such as High Energy Physics (HEP). There has been a growing interest in utilizing alternate architectures, such as low power ARM processors, to replace traditional Intel x86 architectures. Nevertheless, even though such solutions have been successfully used in mobile applications with low I/O and memory demands, it is unclear if they are suitable and more energy-efficient in the scientific computing environment. Furthermore, there is a lack of tools and experience to derive and compare power consumption between the architectures for various workloads, and eventually to support software optimizations for energy efficiency. To that end, we have performed several physical and software-based measurements of workloads from HEP applications running on ARM and Intel architectures, and compare their power consumption and performance. We leverage several profiling tools (both in hardware and software) to extract different characteristics of the power use. We report the results of these measurements and the experience gained in developing a set of measurement techniques and profiling tools to accurately assess the power consumption for scientific workloads.
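
    On Linux/x86 one common software-based approach reads the RAPL energy counters exposed through sysfs. The sketch below is ours, not the papers' toolchain: the path varies by machine, reading it often requires elevated privileges, and counter wraparound is ignored.

        import time, pathlib

        rapl = pathlib.Path("/sys/class/powercap/intel-rapl:0/energy_uj")

        def measure(fn, *args):
            e0, t0 = int(rapl.read_text()), time.time()
            result = fn(*args)
            e1, t1 = int(rapl.read_text()), time.time()
            joules = (e1 - e0) / 1e6
            print(f"{joules:.2f} J in {t1 - t0:.2f} s "
                  f"-> {joules / (t1 - t0):.1f} W")
            return result

        measure(sum, range(50_000_000))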

  6. User Driven Development of Software Tools for Open Data Discovery and Exploration

    NASA Astrophysics Data System (ADS)

    Schlobinski, Sascha; Keppel, Frank; Dihe, Pascal; Boot, Gerben; Falkenroth, Esa

    2016-04-01

    The use of open data in research faces challenges that are not restricted to inherent properties such as the quality and resolution of open data sets. Open data is often catalogued insufficiently or in fragmented form. Software tools that support effective discovery, including assessment of a data set's appropriateness for research, have shortcomings such as the lack of essential functionality like support for data provenance. We believe that one of the reasons is the neglect of real end-user requirements in the development process of the aforementioned software tools. In the context of the FP7 Switch-On project we have pro-actively engaged the relevant user community to collaboratively develop a means to publish, find and bind open data relevant for hydrologic research. Implementing key concepts of data discovery and exploration, we have used state-of-the-art web technologies to provide an interactive software tool that is easy to use yet powerful enough to satisfy the data discovery and access requirements of the hydrological research community.

  7. Selecting Software with Caution: An Empirical Evaluation of Popular Beginning Reading Software for Children with Early Literacy Difficulties

    ERIC Educational Resources Information Center

    Santoro, Lana Edwards; Bishop, M. J.

    2010-01-01

    It seems appropriate, if not necessary, to use empirically supported criteria to evaluate reading software applications. This study's purpose was to develop a research-based evaluation framework and review selected beginning reading software that might be used with struggling beginning readers. Thirty-one products were reviewed according to…

  8. gLAB-A Fully Software Tool to Generate, Process and Analyze GNSS Signals

    NASA Astrophysics Data System (ADS)

    Dionisio, Cesare; Citterico, Dario; Pirazzi, Gabriele; De Quattro, Nicola; Marracci, Riccardo; Cucchi, Luca; Valdambrini, Nicola; Formaioni, Irene

    2010-08-01

    In this paper the concept of Software Defined Radio (SDR) and its use in modern GNSS receivers is highlighted, demonstrating how software receivers are important in many situations, especially for verification and validation. After a brief introduction to gLab, a fully software-based, highly modular tool to generate, process and analyze current and future GNSS signals, the different software modules are described. To demonstrate the wide range of uses of gLab, different practical examples are briefly overviewed: from the analysis of real data from the experimental GIOVE-B satellite, to antenna group delay determination or CN0 estimation over a wide dynamic range, etc. gLab is the result of different projects led by Intecs in GNSS SW radio: the signal generator is the result of the SWAN (Sistemi softWare per Applicazioni di Navigazione) project under Italian Space Agency (ASI) contract; the analyzer and the processing module have been developed for ESA to verify and validate the IOV (In Orbit Validation) phase of Galileo. In this case the GNSS SW RX works in parallel with Test User Receivers (TUR) in order to validate the Signal In Space (SiS). It is remarkable that gLab is the result of over three years of development and approximately one year of test and validation under ESA (European Space Agency) supervision.

  9. Database Tools for Evaluating Thermophysical Property Data

    NASA Astrophysics Data System (ADS)

    Rowley, Richard L.; Wilding, W. Vincent; Oscarson, John L.; Yang, Yan

    2007-06-01

    Most thermophysical-property databases (TPD) provide low-level quality control checks. This manuscript focuses on additional, higher-level data evaluations made possible by the breadth of data stored in the database. For example, thermodynamic equations relate the critical point, vapor-pressure curve, enthalpy of vaporization, liquid density, and liquid and vapor heat capacities to each other. Thermodynamic consistency among these properties can be used to guide selection of the best data sets. Even more broadly, molecular structure-based trends in properties can be identified within the database, and the properties of structurally related compounds can be effectively used to discriminate among available datasets. Automated property predictions can be used in conjunction with the TPD to guide the selection of the most accurate data. These and other high-level consistency tools will be illustrated based on evaluation and quality control work associated with the DIPPR® 801 TPD project for pure chemicals.
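
    One such consistency check follows from the Clausius-Clapeyron relation: at low pressure the vapor-pressure curve implies dHvap ~ R*T^2 * d(ln P)/dT, so an independently reported enthalpy of vaporization can be cross-checked against candidate Psat data. A sketch with hypothetical numbers:

        import numpy as np

        R = 8.314                                   # J/(mol K)
        T = np.array([340.0, 350.0, 360.0])         # K (hypothetical data)
        P = np.array([31.7e3, 46.8e3, 67.4e3])      # Pa

        dHvap = R * T**2 * np.gradient(np.log(P), T)
        print(dHvap / 1e3)   # kJ/mol; compare with the reported value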

  10. Performance evaluation of swimmers: scientific tools.

    PubMed

    Smith, David J; Norris, Stephen R; Hogg, John M

    2002-01-01

    The purpose of this article is to provide a critical commentary of the physiological and psychological tools used in the evaluation of swimmers. The first-level evaluation should be the competitive performance itself, since it is at this juncture that all elements interplay and provide the 'highest form' of assessment. Competition video analysis of major swimming events has progressed to the point where it has become an indispensable tool for coaches, athletes, sport scientists, equipment manufacturers, and even the media. The breakdown of each swimming performance at the individual level to its constituent parts allows for comparison with the predicted or sought after execution, as well as allowing for comparison with identified world competition levels. The use of other 'on-going' monitoring protocols to evaluate training efficacy typically involves criterion 'effort' swims and specific training sets where certain aspects are scrutinised in depth. Physiological parameters that are often examined alongside swimming speed and technical aspects include oxygen uptake, heart rate, blood lactate concentration, blood lactate accumulation and clearance rates. Simple and more complex procedures are available for in-training examination of technical issues. Strength and power may be quantified via several modalities although, typically, tethered swimming and dry-land isokinetic devices are used. The availability of a 'swimming flume' does afford coaches and sport scientists a higher degree of flexibility in the type of monitoring and evaluation that can be undertaken. There is convincing evidence that athletes can be distinguished on the basis of their psychological skills and emotional competencies and that these differences become further accentuated as the athlete improves. No matter what test format is used (physiological, biomechanical or psychological), similar criteria of validity must be ensured so that the test provides useful and associative information

  11. The anatomy of E-Learning tools: Does software usability influence learning outcomes?

    PubMed

    Van Nuland, Sonya E; Rogers, Kem A

    2016-07-01

    Reductions in laboratory hours have increased the popularity of commercial anatomy e-learning tools. It is critical to understand how the functionality of such tools can influence the mental effort required during the learning process, also known as cognitive load. Using dual-task methodology, two anatomical e-learning tools were examined to determine the effect of their design on cognitive load during two joint learning exercises. A.D.A.M. Interactive Anatomy is a simplistic, two-dimensional tool that presents like a textbook, whereas Netter's 3D Interactive Anatomy has a more complex three-dimensional usability that allows structures to be rotated. It was hypothesized that longer reaction times on an observation task would be associated with the more complex anatomical software (Netter's 3D Interactive Anatomy), indicating a higher cognitive load imposed by the anatomy software, which would result in lower post-test scores. Undergraduate anatomy students from Western University, Canada (n = 70) were assessed using a baseline knowledge test, Stroop observation task response times (a measure of cognitive load), mental rotation test scores, and an anatomy post-test. Results showed that reaction times and post-test outcomes were similar for both tools, whereas mental rotation test scores were positively correlated with post-test values when students used Netter's 3D Interactive Anatomy (P = 0.007), but not when they used A.D.A.M. Interactive Anatomy. This suggests that a simple e-learning tool, such as A.D.A.M. Interactive Anatomy, is as effective as more complicated tools, such as Netter's 3D Interactive Anatomy, and does not academically disadvantage those with poor spatial ability. Anat Sci Educ 9: 378-390. © 2015 American Association of Anatomists. PMID:26671838

  12. Evaluating the Quantitative Capabilities of Metagenomic Analysis Software.

    PubMed

    Kerepesi, Csaba; Grolmusz, Vince

    2016-05-01

    DNA sequencing technologies are applied widely and frequently today to describe metagenomes, i.e., microbial communities in environmental or clinical samples, without the need for culturing them. These technologies usually return short (100-300 base-pairs long) DNA reads, and these reads are processed by metagenomic analysis software that assigns phylogenetic composition information to the dataset. Here we evaluate three metagenomic analysis software tools (AmphoraNet, a webserver implementation of AMPHORA2; MG-RAST; and MEGAN5) for their capabilities of assigning quantitative phylogenetic information to the data, describing the frequency of appearance of microorganisms of the same taxa in the sample. The difficulty of the task arises from the fact that longer genomes produce more reads from the same organism than shorter genomes, and some software assigns higher frequencies to species with longer genomes than to those with shorter ones. This phenomenon is called the "genome length bias." Dozens of complex artificial metagenome benchmarks can be found in the literature. Because of the complexity of those benchmarks, it is usually difficult to judge the resistance of a metagenomic software tool to this "genome length bias." Therefore, we have made a simple benchmark for the evaluation of the "taxon-counting" in a metagenomic sample: we took the same number of copies of three full bacterial genomes of different lengths, broke them up randomly into short reads with an average length of 150 bp, and mixed the reads, creating our simple benchmark. Because of its simplicity, the benchmark is not supposed to serve as a mock metagenome, but if a software tool fails on that simple task, it will surely fail on most real metagenomes. We applied the three software tools to the benchmark. The ideal quantitative solution would assign the same proportion to the three bacterial taxa. We have found that AMPHORA2/AmphoraNet gave the most accurate results and the other two software were under
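
    The benchmark is easy to reproduce in outline: with equal copy numbers, the read count contributed by each genome scales with its length, so an unbiased tool must correct for length before reporting taxon frequencies. A toy calculation (ours, with invented genome sizes):

        genomes = {"short": 1_000_000, "medium": 3_000_000, "long": 6_000_000}
        copies, read_len = 10, 150

        reads = {name: copies * glen // read_len    # shredding each copy
                 for name, glen in genomes.items()}
        total = sum(reads.values())
        for name, n in reads.items():
            print(f"{name}: {n / total:.1%} of reads (true copy share 33.3%)")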

  13. Open Software Tools Applied to Jordan's National Multi-Agent Water Management Model

    NASA Astrophysics Data System (ADS)

    Knox, Stephen; Meier, Philipp; Harou, Julien; Yoon, Jim; Selby, Philip; Lachaut, Thibaut; Klassert, Christian; Avisse, Nicolas; Khadem, Majed; Tilmant, Amaury; Gorelick, Steven

    2016-04-01

    Jordan is the fourth most water-scarce country in the world, where demand exceeds supply in a politically and demographically unstable context. The Jordan Water Project (JWP) aims to perform policy evaluation by modelling the hydrology, economics, and governance of Jordan's water resource system. The multidisciplinary nature of the project requires a modelling software system capable of integrating submodels from multiple disciplines into a single decision-making process and communicating results to stakeholders. This requires a tool for building an integrated model and a system where diverse data sets can be managed and visualised. The integrated Jordan model is built using Pynsim, an open-source multi-agent simulation framework implemented in Python. Pynsim operates on network structures of nodes and links and supports institutional hierarchies, where an institution represents a grouping of nodes, links or other institutions. At each time step, code within each node, link and institution can be executed independently, allowing for fully autonomous behaviour. Additionally, engines (sub-models) perform actions over the entire network or on a subset of the network, such as taking a decision on a set of nodes. Pynsim is modular in design, allowing distinct modules to be modified easily without affecting others. Data management and visualisation are performed using Hydra (www.hydraplatform.org), an open software platform allowing users to manage network structure and data. The Hydra data manager connects to Pynsim, providing the necessary input parameters for the integrated model. By providing a high-level portal to the model, Hydra removes a barrier between the users of the model (researchers, stakeholders, planners etc) and the model itself, allowing them to manage data, run the model and visualise results all through a single user interface. Pynsim's ability to represent institutional hierarchies, inter-network communication and the separation of node, link and
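
    A minimal sketch of the node/institution pattern described above (ours, not Pynsim's API; all names and numbers are invented):

        class Node:
            def __init__(self, name, demand):
                self.name, self.demand, self.alloc = name, demand, 0.0
            def step(self):
                pass            # autonomous per-node behaviour goes here

        class Institution:
            """Groups nodes and allocates a shared supply among them."""
            def __init__(self, nodes, supply):
                self.nodes, self.supply = nodes, supply
            def step(self):
                total = sum(n.demand for n in self.nodes)
                for n in self.nodes:
                    n.alloc = self.supply * n.demand / total

        farms = [Node("farm_a", 40.0), Node("farm_b", 25.0)]
        utility = Institution(farms, supply=50.0)
        for t in range(3):                       # three timesteps
            utility.step()
            for n in farms:
                n.step()
        print({n.name: round(n.alloc, 1) for n in farms})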

  14. A Qualitative Evaluation of Evolution of a Learning Analytics Tool

    ERIC Educational Resources Information Center

    Ali, Liaqat; Hatala, Marek; Gasevic, Dragan; Jovanovic, Jelena

    2012-01-01

    LOCO-Analyst is a learning analytics tool we developed to provide educators with feedback on students' learning activities and performance. Evaluation of the first version of the tool led to the enhancement of the tool's data visualization, user interface, and supported feedback types. The second evaluation of the improved tool allowed us to see…

  15. Development of a User Interface for a Regression Analysis Software Tool

    NASA Technical Reports Server (NTRS)

    Ulbrich, Norbert Manfred; Volden, Thomas R.

    2010-01-01

    An easy-to-use user interface was implemented in a highly automated regression analysis tool. The user interface was developed from the start to run on computers that use the Windows, Macintosh, Linux, or UNIX operating system. Many user interface features were specifically designed such that a novice or inexperienced user can apply the regression analysis tool with confidence. Therefore, the user interface's design minimizes interactive input from the user. In addition, reasonable default combinations are assigned to those analysis settings that influence the outcome of the regression analysis. These default combinations will lead to a successful regression analysis result for most experimental data sets. The user interface comes in two versions. The text user interface version is used for the ongoing development of the regression analysis tool. The official release of the regression analysis tool, on the other hand, has a graphical user interface that is more efficient to use. This graphical user interface displays all input file names, output file names, and analysis settings for a specific software application mode on a single screen, which makes it easier to generate reliable analysis results and to perform input parameter studies. An object-oriented approach was used for the development of the graphical user interface. This choice keeps future software maintenance costs to a reasonable limit. Examples of both the text user interface and graphical user interface are discussed in order to illustrate the user interface's overall design approach.

  16. Robust Optimal Design of Experiments for Model Discrimination Using an Interactive Software Tool

    PubMed Central

    Stegmaier, Johannes; Skanda, Dominik; Lebiedz, Dirk

    2013-01-01

    Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative based numerical algorithms. The method takes into account parameter variabilities such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from literature. New robust optimal designs that allow to discriminate between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU General Public License

  17. Robust optimal design of experiments for model discrimination using an interactive software tool.

    PubMed

    Stegmaier, Johannes; Skanda, Dominik; Lebiedz, Dirk

    2013-01-01

    Mathematical modeling of biochemical processes significantly contributes to a better understanding of biological functionality and underlying dynamic mechanisms. To support time consuming and costly lab experiments, kinetic reaction equations can be formulated as a set of ordinary differential equations, which in turn allows to simulate and compare hypothetical models in silico. To identify new experimental designs that are able to discriminate between investigated models, the approach used in this work solves a semi-infinite constrained nonlinear optimization problem using derivative based numerical algorithms. The method takes into account parameter variabilities such that new experimental designs are robust against parameter changes while maintaining the optimal potential to discriminate between hypothetical models. In this contribution we present a newly developed software tool that offers a convenient graphical user interface for model discrimination. We demonstrate the beneficial operation of the discrimination approach and the usefulness of the software tool by analyzing a realistic benchmark experiment from literature. New robust optimal designs that allow to discriminate between the investigated model hypotheses of the benchmark experiment are successfully calculated and yield promising results. The involved robustification approach provides maximally discriminating experiments for the worst parameter configurations, which can be used to estimate the meaningfulness of upcoming experiments. A major benefit of the graphical user interface is the ability to interactively investigate the model behavior and the clear arrangement of numerous variables. In addition to a brief theoretical overview of the discrimination method and the functionality of the software tool, the importance of robustness of experimental designs against parameter variability is demonstrated on a biochemical benchmark problem. The software is licensed under the GNU General Public License
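
    A toy version of the robust discrimination criterion, choosing the design point that maximizes the worst-case (over a parameter range) gap between two rival models, can be written directly; the models and ranges below are invented, and the authors solve a much richer semi-infinite problem.

        import numpy as np

        def model_a(x, k):
            return 1 - np.exp(-k * x)

        def model_b(x, k):
            return k * x / (1 + k * x)

        xs = np.linspace(0.1, 10, 200)       # candidate design points
        ks = np.linspace(0.5, 2.0, 30)       # admissible parameter range

        # worst-case (over k) squared gap for each candidate x
        gap = np.min((model_a(xs[:, None], ks) - model_b(xs[:, None], ks)) ** 2,
                     axis=1)
        print(f"most discriminating measurement at x ~ {xs[np.argmax(gap)]:.2f}")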

  18. TESPI (Tool for Environmental Sound Product Innovation): a simplified software tool to support environmentally conscious design in SMEs

    NASA Astrophysics Data System (ADS)

    Misceo, Monica; Buonamici, Roberto; Buttol, Patrizia; Naldesi, Luciano; Grimaldi, Filomena; Rinaldi, Caterina

    2004-12-01

    TESPI (Tool for Environmental Sound Product Innovation) is the prototype of a software tool developed within the framework of the "eLCA" project. The project (www.elca.enea.it), financed by the European Commission, is realising "On line green tools and services for Small and Medium sized Enterprises (SMEs)". The implementation by SMEs of environmental product innovation (as fostered by the European Integrated Product Policy, IPP) needs specific adaptation to their economic model, their knowledge of production and management processes and their relationships with innovation and the environment. In particular, quality and costs are the main driving forces of innovation in European SMEs, and well known barriers exist to the adoption of an environmental approach in the product design. Starting from these considerations, the TESPI tool has been developed to support the first steps of product design taking into account both the quality and the environment. Two main issues have been considered: (i) classic Quality Function Deployment (QFD) can hardly be proposed to SMEs; (ii) the environmental aspects of the product life cycle need to be integrated with the quality approach. TESPI is a user friendly web-based tool, has a training approach and applies to modular products. Users are guided through the investigation of the quality aspects of their product (customer's needs and requirements fulfilment) and the identification of the key environmental aspects in the product's life cycle. A simplified check list allows analyzing the environmental performance of the product. Help is available for a better understanding of the analysis criteria. As a result, the significant aspects for the redesign of the product are identified.

  19. Utilization of Software Tools for Uncertainty Calculation in Measurement Science Education

    NASA Astrophysics Data System (ADS)

    Zangl, Hubert; Zine-Zine, Mariam; Hoermaier, Klaus

    2015-02-01

    Despite its importance, uncertainty is often neglected by practitioners in the design of systems, even in safety-critical applications. Thus, problems arising from uncertainty may be identified only late in the design process and lead to additional costs. Although numerous tools to support uncertainty calculation exist, reasons for their limited use in early design phases may be low awareness of the existence of the tools and insufficient training in their practical application. We present a teaching philosophy that addresses uncertainty from the very beginning of teaching measurement science, in particular with respect to the utilization of software tools. The developed teaching material is based on the GUM method and makes use of uncertainty toolboxes in the simulation environment. Based on examples in measurement science education, we discuss advantages and disadvantages of the proposed teaching philosophy and include feedback from students.

  20. SOFI Simulation Tool: A Software Package for Simulating and Testing Super-Resolution Optical Fluctuation Imaging.

    PubMed

    Girsault, Arik; Lukes, Tomas; Sharipov, Azat; Geissbuehler, Stefan; Leutenegger, Marcel; Vandenberg, Wim; Dedecker, Peter; Hofkens, Johan; Lasser, Theo

    2016-01-01

    Super-resolution optical fluctuation imaging (SOFI) allows one to perform sub-diffraction fluorescence microscopy of living cells. By analyzing the acquired image sequence with an advanced correlation method, i.e. a high-order cross-cumulant analysis, super-resolution in all three spatial dimensions can be achieved. Here we introduce a software tool for a simple qualitative comparison of SOFI images under simulated conditions considering parameters of the microscope setup and essential properties of the biological sample. This tool incorporates SOFI and STORM algorithms, displays and describes the SOFI image processing steps in a tutorial-like fashion. Fast testing of various parameters simplifies the parameter optimization prior to experimental work. The performance of the simulation tool is demonstrated by comparing simulated results with experimentally acquired data. PMID:27583365
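
    The heart of second-order SOFI is compact enough to sketch: the second-order auto-cumulant of a blinking-fluorophore movie is the per-pixel temporal variance, which sharpens the effective PSF by a factor of sqrt(2). This toy of ours omits cross-cumulants, higher orders, and realistic blinking statistics.

        import numpy as np

        def sofi2(stack):
            """stack: (frames, ny, nx) movie of a blinking sample."""
            mean = stack.mean(axis=0)
            return ((stack - mean) ** 2).mean(axis=0)   # 2nd-order cumulant

        movie = np.random.default_rng(3).poisson(
            50, size=(500, 64, 64)).astype(float)
        print(sofi2(movie).shape)    # (64, 64) toy "super-resolved" image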

  1. Student Evaluation of CALL Tools during the Design Process

    ERIC Educational Resources Information Center

    Nesbitt, Dallas

    2013-01-01

    This article discusses the comparative effectiveness of student input at different times during the design of CALL tools for learning kanji, the Japanese characters of Chinese origin. The CALL software "package" consisted of tools to facilitate the writing, reading and practising of kanji characters in context. A pre-design questionnaire…

  2. MASH Suite Pro: A Comprehensive Software Tool for Top-Down Proteomics.

    PubMed

    Cai, Wenxuan; Guner, Huseyin; Gregorich, Zachery R; Chen, Albert J; Ayaz-Guner, Serife; Peng, Ying; Valeja, Santosh G; Liu, Xiaowen; Ge, Ying

    2016-02-01

    Top-down mass spectrometry (MS)-based proteomics is arguably a disruptive technology for the comprehensive analysis of all proteoforms arising from genetic variation, alternative splicing, and posttranslational modifications (PTMs). However, the complexity of top-down high-resolution mass spectra presents a significant challenge for data analysis. In contrast to the well-developed software packages available for data analysis in bottom-up proteomics, the data analysis tools in top-down proteomics remain underdeveloped. Moreover, despite recent efforts to develop algorithms and tools for the deconvolution of top-down high-resolution mass spectra and the identification of proteins from complex mixtures, a multifunctional software platform, which allows for the identification, quantitation, and characterization of proteoforms with visual validation, is still lacking. Herein, we have developed MASH Suite Pro, a comprehensive software tool for top-down proteomics with multifaceted functionality. MASH Suite Pro is capable of processing high-resolution MS and tandem MS (MS/MS) data using two deconvolution algorithms to optimize protein identification results. In addition, MASH Suite Pro allows for the characterization of PTMs and sequence variations, as well as the relative quantitation of multiple proteoforms in different experimental conditions. The program also provides visualization components for validation and correction of the computational outputs. Furthermore, MASH Suite Pro facilitates data reporting and presentation via direct output of the graphics. Thus, MASH Suite Pro significantly simplifies and speeds up the interpretation of high-resolution top-down proteomics data by integrating tools for protein identification, quantitation, characterization, and visual validation into a customizable and user-friendly interface. We envision that MASH Suite Pro will play an integral role in advancing the burgeoning field of top-down proteomics. PMID:26598644
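
    One core deconvolution idea in top-down MS is easy to illustrate: the charge of an isotopic envelope follows from the m/z spacing of adjacent isotope peaks, and the neutral monoisotopic mass then follows by stripping the proton masses. The peaks below are hypothetical, and this is not MASH Suite Pro's algorithm.

        peaks = [1212.431, 1212.514, 1212.597, 1212.681]  # m/z, one envelope

        spacings = [b - a for a, b in zip(peaks, peaks[1:])]
        avg = sum(spacings) / len(spacings)
        z = round(1.0033 / avg)                   # ~1.0033 Da isotope gap
        mono_mass = peaks[0] * z - z * 1.00728    # remove z protons
        print(z, round(mono_mass, 2))             # -> 12 14537.08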

  3. Creating a strategic plan for configuration management using computer aided software engineering (CASE) tools

    SciTech Connect

    Smith, P.R.; Sarfaty, R.

    1993-05-01

    This paper provides guidance in the definition, documentation, measurement, enhancement of processes, and validation of a strategic plan for configuration management (CM). The approach and methodology used in establishing a strategic plan is the same for any enterprise, including the Department of Energy (DOE), commercial nuclear plants, the Department of Defense (DOD), or large industrial complexes. The principles and techniques presented are used worldwide by some of the largest corporations. The authors used industry knowledge and the areas of their current employment to illustrate and provide examples. Developing a strategic configuration and information management plan for DOE Idaho Field Office (DOE-ID) facilities is discussed in this paper. A good knowledge of CM principles is the key to successful strategic planning. This paper will describe and define CM elements, and discuss how CM integrates the facility's physical configuration, design basis, and documentation. The strategic plan does not need the support of a computer aided software engineering (CASE) tool. However, the use of the CASE tool provides a methodology for consistency in approach, graphics, and database capability combined to form an encyclopedia and a method of presentation that is easily understood and aids the process of reengineering. CASE tools have much more capability than those stated above. Some examples are supporting a joint application development group (JAD) to prepare a software functional specification document and, if necessary, provide the capability to automatically generate software application code. This paper briefly discusses characteristics and capabilities of two CASE tools that use different methodologies to generate similar deliverables.

  4. DVS-SOFTWARE: An Effective Tool for Applying Highly Parallelized Hardware To Computational Geophysics

    NASA Astrophysics Data System (ADS)

    Herrera, I.; Herrera, G. S.

    2015-12-01

    Most geophysical systems are macroscopic physical systems. The behavior prediction of such systems is carried out by means of computational models whose basic models are partial differential equations (PDEs) [1]. Due to the enormous size of the discretized versions of such PDEs, it is necessary to apply highly parallelized supercomputers. For them, at present, the most efficient software is based on non-overlapping domain decomposition methods (DDM). However, a limiting feature of the present state-of-the-art techniques is due to the kind of discretizations used in them. Recently, I. Herrera and co-workers, using 'non-overlapping discretizations', have produced the DVS-Software which overcomes this limitation [2]. The DVS-Software can be applied to a great variety of geophysical problems and achieves very high parallel efficiencies (90% or so [3]). It is therefore very suitable for effectively applying the most advanced parallel supercomputers available at present. In a parallel talk at this AGU Fall Meeting, Graciela Herrera Z. will present how this software is being applied to advance MODFLOW. Key Words: Parallel Software for Geophysics, High Performance Computing, HPC, Parallel Computing, Domain Decomposition Methods (DDM). REFERENCES: [1] Herrera, Ismael and George F. Pinder, "Mathematical Modelling in Science and Engineering: An Axiomatic Approach", John Wiley, 243p., 2012. [2] Herrera, I., de la Cruz, L.M. and Rosas-Medina, A., "Non-Overlapping Discretization Methods for Partial Differential Equations", NUMER METH PART D E, 30: 1427-1454, 2014, DOI 10.1002/num.21852. (Open source) [3] Herrera, I. and Contreras, Iván, "An Innovative Tool for Effectively Applying Highly Parallelized Software to Problems of Elasticity", Geofísica Internacional, 2015 (In press).

  5. A user friendly software tool for the simulation and optimization of high power fiber lasers

    NASA Astrophysics Data System (ADS)

    Shang, Liang; Mao, Qinghe

    2008-12-01

    Double-clad rare-earth-doped fiber lasers are a new generation of high-power solid-state lasers. Numerical simulation is an important approach to configuration design and parameter optimization for high-power fiber lasers (HPFLs). In this paper, we report our user-friendly high-power fiber laser simulation software system, which integrates design, analysis and optimization functions. The numerical simulations in the software use an HPFL model based on rate-equation theory. Using this model, for a specific laser cavity configuration, doped-fiber parameters and pump conditions, the distributions of the population inversion, the forward and backward pump power, and the circulating lasing intensity along the doped fiber can be calculated; thus the main output characteristics, such as cavity gain, output power and laser efficiency, can be obtained. On the basis of the simulation results, the software supplies functions for the design and optimization of the pump configuration, the doped-fiber length and the reflectivity of the output mirror. By combining the calculated mode-field distribution in the doped fiber with the mechanism of curvature loss to suppress higher-order modes, the software also supplies a function for optimizing beam quality. Through a graphical user interface (GUI), all functions of the software are provided as menu options; toolbar buttons, shortcut keys and pop-up menus are also provided for frequently used functions. The software has a single-document interface (SDI) and is coded in C++ in the integrated development environment of Visual C++ 6.0. We believe it will be very helpful for the investigation and development of HPFLs.
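
    A heavily simplified single-pass cousin of such a rate-equation model (ours: the coefficients and the saturation term are invented placeholders and there is no cavity feedback) can be integrated with scipy to show how signal power builds up along the doped fiber:

        from scipy.integrate import solve_ivp

        alpha_p, g0 = 0.5, 1.2      # assumed pump absorption / gain, 1/m

        def rhs(z, y):
            Pp, Ps = y
            inversion = Pp / (1 + Pp)            # crude saturation
            return [-alpha_p * Pp, g0 * inversion * Ps]

        sol = solve_ivp(rhs, [0, 10], [20.0, 0.001])   # 10 m fiber
        print(f"output signal ~ {sol.y[1, -1]:.2f} W")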

  6. Emerging role of bioinformatics tools and software in evolution of clinical research

    PubMed Central

    Gill, Supreet Kaur; Christopher, Ajay Francis; Gupta, Vikas; Bansal, Parveen

    2016-01-01

    Clinical research strives to promote the health and wellbeing of people. There is a rapid increase in the number and severity of diseases such as cancer, hepatitis, and HIV, resulting in high morbidity and mortality. Clinical research involves drug discovery and development, whereas clinical trials are performed to establish the safety and efficacy of drugs. Drug discovery is a long process, starting with target identification, validation, and lead optimization, followed by preclinical trials, intensive clinical trials, and eventually post-marketing vigilance for drug safety. Software and bioinformatics tools play a great role not only in drug discovery but also in drug development. This involves the use of informatics to develop new knowledge pertaining to health and disease, to manage data during clinical trials, and to use clinical data for secondary research. In addition, new technologies such as molecular docking, molecular dynamics simulation, proteomics, and quantitative structure-activity relationships make the drug discovery process faster and easier. During preclinical trials, software is used for randomization, to remove bias, and to plan the study design. In clinical trials, software such as electronic data capture, remote data capture, and electronic case report forms (eCRFs) is used to store the data. eClinical and Oracle Clinical are software packages used for clinical data management and for statistical analysis of the data. After a drug is marketed, its safety can be monitored with drug-safety software such as Oracle Argus or ARISg. Software is therefore used from the very early stages of drug design through drug development, clinical trials, and pharmacovigilance. This review describes different aspects of the application of computers and bioinformatics in drug design, discovery and development, formulation design, and clinical research. PMID:27453827

  7. A Practical Comparison of De Novo Genome Assembly Software Tools for Next-Generation Sequencing Technologies

    PubMed Central

    Zhang, Wenyu; Chen, Jiajia; Yang, Yang; Tang, Yifei; Shang, Jing; Shen, Bairong

    2011-01-01

    The advent of next-generation sequencing technologies has been accompanied by the development of many whole-genome sequence assembly methods and software packages, especially for de novo fragment assembly. Because the applicability and performance of these software tools are poorly understood, choosing a fitting assembler is a tough task. Here, we provide information on the adaptivity of each program and, above all, compare the performance of eight distinct tools against eight groups of simulated datasets from the Solexa sequencing platform. Considering computational time, maximum random access memory (RAM) occupancy, assembly accuracy, and integrity, our study indicates that string-based assemblers and overlap-layout-consensus (OLC) assemblers are well suited for very short reads and for longer reads of small genomes, respectively. For large datasets of more than a hundred million short reads, De Bruijn graph-based assemblers are more appropriate. In terms of software implementation, string-based assemblers are superior to graph-based ones, among which SOAPdenovo requires a complex configuration file. Our comparison study will assist researchers in selecting a well-suited assembler and offers essential information for improving existing assemblers and developing novel ones. PMID:21423806
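
    The De Bruijn graph construction that makes such assemblers suitable for huge short-read datasets can be sketched in a few lines: reads are decomposed into k-mers, each k-mer becomes an edge between its (k-1)-mer prefix and suffix, and unambiguous paths are merged into contigs. The toy below is our own illustration; real assemblers add error correction, bubble removal, and scaffolding.

        from collections import defaultdict

        def de_bruijn(reads, k):
            # Unique k-mers become edges between (k-1)-mer prefix and suffix nodes.
            kmers = {read[i:i + k] for read in reads
                     for i in range(len(read) - k + 1)}
            out_edges, in_deg = defaultdict(list), defaultdict(int)
            for kmer in kmers:
                out_edges[kmer[:-1]].append(kmer[1:])
                in_deg[kmer[1:]] += 1
            return out_edges, in_deg

        def contigs(out_edges, in_deg):
            # Walk from branch/start nodes through unambiguous chains.
            starts = [n for n in out_edges
                      if in_deg[n] != 1 or len(out_edges[n]) != 1]
            result = []
            for start in starts:
                for nxt in out_edges[start]:
                    path, node = start, nxt
                    while in_deg[node] == 1 and len(out_edges[node]) == 1:
                        path += node[-1]
                        node = out_edges[node][0]
                    result.append(path + node[-1])
            return result

        reads = ["ATGGCGT", "GGCGTGC", "GTGCAAT"]
        print(contigs(*de_bruijn(reads, k=4)))   # -> ['ATGGCGTGCAAT']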

  8. CmapTools: A Software Environment for Knowledge Modeling and Sharing

    NASA Technical Reports Server (NTRS)

    Canas, Alberto J.

    2004-01-01

    In an ongoing collaborative effort between a group of NASA Ames scientists and researchers at the Institute for Human and Machine Cognition (IHMC) of the University of West Florida, a new version of CmapTools has been developed that enables scientists to construct knowledge models of their domain of expertise, share them with other scientists, make them available to anybody on the Internet with access to a Web browser, and peer-review other scientists' models. These software tools have been successfully used at NASA to build a large-scale multimedia knowledge model on Mars and a knowledge model on Habitability Assessment. The new version of the software places emphasis on greater usability for experts constructing their own knowledge models, and on support for the creation of large knowledge models with large numbers of supporting resources in the form of images, videos, web pages, and other media. Additionally, the software allows scientists to cooperate with each other in the construction, sharing, and critiquing of knowledge models. Scientists collaborating from remote distances, for example researchers at the Astrobiology Institute, can concurrently manipulate the knowledge models they are viewing without having to do so at a special videoconferencing facility.

  9. Systematic Task Allocation Evaluation in Distributed Software Development

    NASA Astrophysics Data System (ADS)

    Münch, Jürgen; Lamersdorf, Ansgar

    Systematic task allocation to different development sites in global software development projects can open up business and engineering perspectives and help reduce the risks and problems inherent in distributed development. Relying on a single evaluation criterion such as development cost when distributing tasks to development sites has been shown to be very risky and often does not lead to successful solutions in the long run. Task allocation in global software projects is challenging due to a multitude of impact factors and constraints. Systematic allocation decisions require the ability to evaluate and compare task allocation alternatives and to effectively establish customized task allocation practices in an organization. In this article, we present a customizable process for task allocation evaluation that is based on the results of a systematic interview study with practitioners. In this process, the relevant criteria for evaluating task allocation alternatives are derived by applying principles from goal-oriented measurement. In addition, the customization of the process is demonstrated, related work and limitations are sketched, and an outlook on future work is given.

  10. Evaluating Uncertainty in Integrated Environmental Models: A Review of Concepts and Tools

    EPA Science Inventory

    This paper reviews concepts for evaluating integrated environmental models and discusses a list of relevant software-based tools. A simplified taxonomy for sources of uncertainty and a glossary of key terms with standard definitions are provided in the context of integrated appro...

  11. Evaluation of a Web Conferencing Tool and Collaborative Tasks in an Online Chinese Course

    ERIC Educational Resources Information Center

    Guo, Sijia

    2014-01-01

    This case study aims to explore the best practice of applying task-based language teaching (TBLT) via the web conferencing tool Blackboard Collaborate in a beginners' online Chinese course by evaluating the technical capacity of the software and the pedagogical values and limitations of the tasks designed. In this paper, Chapelle's (2001) criteria…

  12. SIGAPS: a prototype of bibliographic tool for medical research evaluation.

    PubMed

    Devos, P; Dufresne, E; Renard, J M; Beuscart, R

    2003-01-01

    Evaluation of research activity is extremely important but remains a complex domain. There are no standardized methods, and evaluation is often based on scientific publications. It is easy to identify, for a single researcher, all the publications produced over a given period of time. At the level of a large establishment like a university hospital, with about 500 researchers, this sort of inventory is very difficult to compile: we have to list the researchers, list their publications, determine the quality of the articles produced, store the retrieved data, and calculate summary statistics. We have developed a full-Web prototype, using free software, which, for a given list of researchers, queries the PubMed server, downloads the references found, and stores them in a local database. The references are then enriched with local data, enabling more or less complex analyses, the automatic production of reports, and keyword search. This tool is very easy to use, allowing immediate analysis of the publications of a researcher or a research team. It will make it possible to identify active teams to be maintained or emergent teams to be supported, and to compare candidate profiles for appointments to research posts. PMID:14664073
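
    The harvesting workflow the authors describe can be sketched with today's NCBI E-utilities JSON interface, which postdates the prototype: fetch each researcher's PMIDs and store them in a local SQLite table for later enrichment. The endpoint and field syntax are standard E-utilities; the researcher names and the schema below are illustrative.

        import sqlite3
        import requests

        EUTILS = "https://eutils.ncbi.nlm.nih.gov/entrez/eutils/esearch.fcgi"

        def pmids_for(author, years="2019:2024"):
            # esearch returns the matching PMIDs as a JSON id list.
            params = {"db": "pubmed",
                      "term": f"{author}[Author] AND {years}[dp]",
                      "retmode": "json", "retmax": 200}
            reply = requests.get(EUTILS, params=params, timeout=30).json()
            return reply["esearchresult"]["idlist"]

        db = sqlite3.connect("publications.db")
        db.execute("CREATE TABLE IF NOT EXISTS pubs (author TEXT, pmid TEXT, "
                   "UNIQUE(author, pmid))")
        for researcher in ["Devos P", "Renard JM"]:      # the establishment's list
            for pmid in pmids_for(researcher):
                db.execute("INSERT OR IGNORE INTO pubs VALUES (?, ?)",
                           (researcher, pmid))
        db.commit()
        print(db.execute("SELECT author, COUNT(*) FROM pubs "
                         "GROUP BY author").fetchall())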

  13. Web-based software tool for constraint-based design specification of synthetic biological systems.

    PubMed

    Oberortner, Ernst; Densmore, Douglas

    2015-06-19

    miniEugene provides computational support for solving combinatorial design problems, enabling users to specify and enumerate designs for novel biological systems based on sets of biological constraints. This technical note presents a brief tutorial for biologists and software engineers in the field of synthetic biology on how to use miniEugene. After reading this technical note, users should know which biological constraints are available in miniEugene, understand the syntax and semantics of these constraints, and be able to follow a step-by-step guide to specify the design of a classical synthetic biological system, the genetic toggle switch [1]. We also provide links and references to more information on the miniEugene web application and the integration of the miniEugene software library into sophisticated computer-aided design (CAD) tools for synthetic biology (www.eugenecad.org). PMID:25426642
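
    Conceptually, constraint-based design specification can be viewed as enumerate-and-filter over orderings of genetic parts. The sketch below is our own illustration of that idea and does not use miniEugene's actual constraint syntax; part names and rules are invented.

        from itertools import permutations

        parts = ["pTet", "RBS1", "lacI", "pLac", "RBS2", "tetR"]

        constraints = [
            lambda d: d.index("pTet") < d.index("lacI"),    # promoter before its CDS
            lambda d: d.index("pLac") < d.index("tetR"),
            lambda d: d[0].startswith("p"),                 # design starts with a promoter
            lambda d: abs(d.index("RBS1") - d.index("lacI")) == 1,  # RBS adjacent to CDS
        ]

        # Enumerate every ordering and keep only those satisfying all rules.
        designs = [d for d in permutations(parts)
                   if all(rule(d) for rule in constraints)]
        print(len(designs), "designs satisfy all constraints; first:", designs[0])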

  14. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, Sean; Hughes, Gary

    2008-07-31

    Mission Research develops analyst tools and automated quality-control software to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with its data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality-control metrics and diagnostic plots, and integrating this information into DQ HandS, the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software that runs in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for the scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that, when checked, will improve the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting and refinement and for use by other processes. Web-based applications to view the results are also available.
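
    The percentile-based limit setting described above can be sketched as follows (the column names, thresholds, and synthetic data are hypothetical): per-month statistics define a valid range, and samples outside the band are flagged for review.

        import numpy as np
        import pandas as pd

        rng = np.random.default_rng(0)
        times = pd.date_range("2007-01-01", periods=8760, freq="h")
        data = pd.DataFrame(
            {"temp": 15 + 10 * np.sin(2 * np.pi * times.dayofyear / 365)
                     + rng.normal(0, 2, times.size)}, index=times)

        # Per-month valid range from the 0.1 and 99.9 percentiles.
        limits = (data.groupby(data.index.month)["temp"]
                      .quantile([0.001, 0.999]).unstack())

        month = data.index.month
        low = limits.loc[month, 0.001].to_numpy()
        high = limits.loc[month, 0.999].to_numpy()
        data["suspect"] = (data["temp"] < low) | (data["temp"] > high)
        print(data["suspect"].sum(), "samples flagged for review")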

  15. SPRECware: software tools for Standard PREanalytical Code (SPREC) labeling - effective exchange and search of stored biospecimens.

    PubMed

    Nanni, Umberto; Betsou, Fotini; Riondino, Silvia; Rossetti, Luisa; Spila, Antonella; Valente, Maria Giovanna; Della-Morte, David; Palmirotta, Raffaele; Roselli, Mario; Ferroni, Patrizia; Guadagni, Fiorella

    2012-01-01

    Biobanks provide stored material to basic, translational, and epidemiological research, and this material should be transferred without institute-dependent intrinsic bias. The ISBER Biospecimen Science Working Group has released a "Standard PREanalytical Code" (SPREC), a proposal for a standard coding of the preanalytical options adopted, in order to track and make explicit the preanalytical variations in the collection, preparation, and storage of specimens. In this paper we address two issues arising in any biobank or biolaboratory aiming to adopt SPREC: (i) reducing the burden required to adopt this standard coding, and (ii) maximizing the immediate benefits of this adoption by providing a free, dedicated software tool. We propose SPRECware, a vision encompassing tools and solutions for the best exploitation of SPREC based on information technology (www.sprecware.org). As a first step, we make available SPRECbase, a software tool for generating, storing, managing, and exchanging SPREC-related information associated with specimens. Adopting SPREC is useful both for internal purposes (such as finding the samples having given preanalytical features) and for exchanging the preanalytical information associated with biological samples between laboratory information systems. If this coding is adopted widely, it will be easy to find out whether and where, among the participating Biological Resource Centers, the specimens needed for a given study are available in order to carry out a planned experiment. PMID:23032579

  16. Semantic integration of gene expression analysis tools and data sources using software connectors

    PubMed Central

    2013-01-01

    Background: The study and analysis of gene expression measurements is the primary focus of functional genomics. Once expression data are available, biologists are faced with the task of extracting (new) knowledge associated with the underlying biological phenomenon. Most often, in order to perform this task, biologists execute a number of analysis activities on the available gene expression dataset rather than a single analysis activity. The integration of heterogeneous tools and data sources to create an integrated analysis environment represents a challenging and error-prone task. Semantic integration enables the assignment of unambiguous meanings to data shared among different applications in an integrated environment, allowing the exchange of data in a semantically consistent and meaningful way. This work aims at developing an ontology-based methodology for the semantic integration of gene expression analysis tools and data sources. The proposed methodology relies on software connectors to support not only access to heterogeneous data sources but also the definition of transformation rules on exchanged data. Results: We have studied the different challenges involved in the integration of computer systems and the role software connectors play in this task. We have also studied a number of gene expression technologies, analysis tools, and related ontologies in order to devise basic integration scenarios and propose a reference ontology for the gene expression domain. We have then defined a number of activities and associated guidelines that prescribe how the development of connectors should be carried out. Finally, we have applied the proposed methodology in the construction of three different integration scenarios involving the use of different tools for the analysis of different types of gene expression data. Conclusions: The proposed methodology facilitates the development of connectors capable of semantically integrating different gene expression analysis tools

  17. CUSTOMER RESPONSE TO BESTPRACTICES TRAINING AND SOFTWARE TOOLS PROVIDED BY DOE'S INDUSTRIAL TECHNOLOGIES PROGRAM

    SciTech Connect

    Schweitzer, Martin; Martin, Michaela A; Schmoyer, Richard L

    2008-03-01

    The BestPractices program area, which has evolved into the Save Energy Now (SEN) Initiative, is a component of the U.S. Department of Energy's (DOE's) Industrial Technologies Program (ITP) that provides technical assistance and disseminates information on energy-efficient technologies and practices to U.S. industrial firms. The BestPractices approach to information dissemination includes conducting training sessions which address energy-intensive systems (compressed air, steam, process heat, pumps, motors, and fans) and distributing DOE software tools on those same topics. The current report documents a recent Oak Ridge National Laboratory (ORNL) study undertaken to determine the implementation rate, attribution rate, and reduction factor for industrial end-users who received BestPractices training and registered software in FY 2006. The implementation rate is the proportion of service recipients taking energy-saving actions as a result of the service received. The attribution rate applies to those individuals taking energy-saving actions as a result of the services received and represents the portion of the savings achieved through those actions that is due to the service. The reduction factor is the saving that is realized from program-induced measures as a proportion of the potential savings that could be achieved if all service recipients took action. In addition to examining those factors, the ORNL study collected information on selected characteristics of service recipients, the perceived value of the services provided, and the potential energy savings that can be achieved through implementation of measures identified from the training or software. Because the provision of training is distinctly different from the provision of software tools, the two efforts were examined independently and the findings for each are reported separately.
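
    The three evaluation factors combine multiplicatively, as the following worked example shows (the numbers are invented for illustration and are not ORNL's findings).

        # Program-attributable savings = potential savings x implementation rate
        # x attribution rate; the reduction factor expresses the same savings as
        # a share of the potential across *all* recipients.
        recipients = 500
        potential_per_recipient = 2_000.0   # MMBtu/yr identified per recipient
        implementation_rate = 0.40          # share of recipients who acted
        attribution_rate = 0.60             # share of their savings due to the program

        realized = (recipients * implementation_rate
                    * potential_per_recipient * attribution_rate)
        reduction_factor = realized / (recipients * potential_per_recipient)
        print(f"attributable savings: {realized:,.0f} MMBtu/yr")
        print(f"reduction factor: {reduction_factor:.2f}")   # 0.24 here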

  18. Proper bibeta ROC model: algorithm, software, and performance evaluation

    NASA Astrophysics Data System (ADS)

    Chen, Weijie; Hu, Nan

    2016-03-01

    Semi-parametric models are often used to fit data collected in receiver operating characteristic (ROC) experiments to obtain a smooth ROC curve and ROC parameters for statistical inference purposes. The proper bibeta model, recently proposed by Mossman and Peng, enjoys several theoretical properties. In addition to having explicit density functions for the latent decision variable and an explicit functional form of the ROC curve, the two-parameter bibeta model also has simple closed-form expressions for the true-positive fraction (TPF), the false-positive fraction (FPF), and the area under the ROC curve (AUC). In this work, we developed a computational algorithm and an R package implementing this model for ROC curve fitting. Our algorithm can deal with any ordinal data (categorical or continuous). To improve the accuracy, efficiency, and reliability of our software, we adopted several strategies in our computational algorithm, including: (1) the LABROC4 categorization to obtain the true maximum likelihood estimates of the ROC parameters; (2) a principled approach to initializing parameters; (3) analytical first-order and second-order derivatives of the likelihood function; (4) an efficient optimization procedure (the L-BFGS algorithm in the R package "nlopt"); and (5) an analytical delta method to estimate the variance of the AUC. We evaluated the performance of our software with intensive simulation studies and compared it with the conventional binormal and the proper binormal-likelihood-ratio models developed at the University of Chicago. Our simulation results indicate that our software is highly accurate, efficient, and reliable.
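
    A sketch of the maximum-likelihood fitting idea with beta-distributed latent scores is given below. The parameterization (non-diseased ~ Beta(1, b), diseased ~ Beta(a, 1), which yields a proper ROC curve and a closed-form AUC of 1 - b*B(a+1, b)) is our illustrative choice; Mossman and Peng's model and the LABROC4 treatment of ordinal data in the authors' software are more involved.

        import numpy as np
        from scipy.optimize import minimize
        from scipy.special import betaln
        from scipy.stats import beta

        rng = np.random.default_rng(1)
        x_neg = rng.beta(1.0, 3.0, 200)     # simulated non-diseased scores
        x_dis = rng.beta(2.5, 1.0, 200)     # simulated diseased scores

        def nll(theta):
            # Negative log-likelihood; log-parameterization keeps a, b > 0.
            a, b = np.exp(theta)
            return -(beta.logpdf(x_dis, a, 1.0).sum()
                     + beta.logpdf(x_neg, 1.0, b).sum())

        fit = minimize(nll, x0=np.log([2.0, 2.0]), method="L-BFGS-B")
        a, b = np.exp(fit.x)

        # Closed-form AUC for this parameterization: 1 - b * B(a + 1, b).
        auc = 1.0 - b * np.exp(betaln(a + 1.0, b))
        print(f"a = {a:.2f}, b = {b:.2f}, AUC = {auc:.3f}")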

  19. adwTools Developed: New Bulk Alloy and Surface Analysis Software for the Alloy Design Workbench

    NASA Technical Reports Server (NTRS)

    Bozzolo, Guillermo; Morse, Jeffrey A.; Noebe, Ronald D.; Abel, Phillip B.

    2004-01-01

    A suite of atomistic modeling software, called the Alloy Design Workbench, has been developed by the Computational Materials Group at the NASA Glenn Research Center and the Ohio Aerospace Institute (OAI). The main goal of this software is to guide and augment experimental materials research and development efforts by creating powerful, yet intuitive, software that combines a graphical user interface with an operating code suitable for real-time atomistic simulations of multicomponent alloy systems. Targeted at experimentalists, the interface is straightforward and requires minimal knowledge of the underlying theory, allowing researchers to focus on the scientific aspects of the work. The centerpiece of the Alloy Design Workbench suite is the adwTools module, which concentrates on the atomistic analysis of surfaces and bulk alloys containing an arbitrary number of elements. An additional module, adwParams, handles ab initio input for the parameterization used in adwTools. Future modules planned for the suite include adwSeg, which will provide numerical predictions of segregation profiles at alloy surfaces and interfaces, and adwReport, which will serve as a window into the database, providing public access to the parameterization data and a repository where users can submit their own findings from the rest of the suite. The entire suite is designed to run on desktop-scale computers. The adwTools module incorporates a custom OAI/Glenn-developed Fortran code based on the Bozzolo-Ferrante-Smith (BFS) method for alloys (ref. 1). The heart of the suite, this code is used to calculate the energetics of different compositions and configurations of atoms.

  1. A Methodology to Evaluate Agent Oriented Software Engineering Techniques

    SciTech Connect

    Lin, Chia-En; Kavi, Krishna M.; Sheldon, Frederick T; Daley, Kristopher M; Abercrombie, Robert K

    2007-01-01

    Systems using software agents (or multi-agent systems, MAS) are becoming more popular within the development mainstream because, as the name suggests, an agent aims to handle tasks autonomously and with intelligence. To benefit from autonomous control and reduced running costs, system functions are performed automatically. Agent-oriented considerations are being steadily accepted into the various software design paradigms. Agents may work alone, but most commonly they cooperate toward achieving some application goal(s). A MAS can be viewed as a society of many individuals working together. From a software engineering (SE) perspective, solving a problem should encompass problem realization, requirements analysis, architecture design, and implementation. These steps should be implemented within a life-cycle process that includes testing, verification, and reengineering to prove that the built system is sound. In this paper, we explore various applications of agent-based systems, categorized into different application domains. A baseline is developed herein to help us focus on the core agent concepts throughout the comparative study and to investigate both the object-oriented and agent-oriented techniques available for constructing agent-based systems. In each respect, we address the conceptual background of these methodologies and how available tools can be applied within specific domains.

  2. A software tool for STED-AFM correlative super-resolution microscopy

    NASA Astrophysics Data System (ADS)

    Koho, Sami; Deguchi, Takahiro; Löhmus, Madis; Näreoja, Tuomas; Hänninen, Pekka E.

    2015-03-01

    Multi-modal correlative microscopy allows combining the strengths of several imaging techniques to provide unique contrast. However, it is not always straightforward to set up instruments for such customized experiments, as most microscope manufacturers use their own proprietary software with limited or no capability to interface with other instruments; this makes correlation of the multi-modal data extremely challenging. We introduce a new software tool for the simultaneous use of a STimulated Emission Depletion (STED) microscope with an Atomic Force Microscope (AFM). In our experiments, a Leica TCS STED commercial super-resolution microscope was used together with an Agilent 5500ilm AFM microscope. With our software, it is possible to synchronize the data acquisition between the STED and AFM instruments, as well as to perform automatic registration of the AFM images with the super-resolution STED images. The software was realized in LabVIEW; the registration part was also implemented as an ImageJ script. The synchronization was realized by controlling simple trigger signals, also available in the commercial STED microscope, with a low-cost National Instruments USB-6501 digital I/O card. The registration is based on detecting the positions of the AFM tip inside the STED field of view, which are then used as registration landmarks. The registration should work on any combination of STED and tip-scanning AFM microscopes, with nanometer-scale precision. Our STED-AFM correlation method has been tested with a variety of nanoparticle and fixed-cell samples. The software will be released under a BSD open-source license.
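
    Landmark-based registration of the kind described, where detected AFM tip positions in the STED field of view serve as point pairs, reduces to a least-squares fit of a 2D affine transform. The sketch below uses invented coordinates.

        import numpy as np

        def fit_affine(src, dst):
            """src, dst: (N, 2) matched landmarks; returns a 2x3 affine matrix."""
            ones = np.ones((src.shape[0], 1))
            X = np.hstack([src, ones])                 # homogeneous source coords
            M, *_ = np.linalg.lstsq(X, dst, rcond=None)
            return M.T                                 # maps [x, y, 1] -> [x', y']

        def apply_affine(M, pts):
            return pts @ M[:, :2].T + M[:, 2]

        # Hypothetical tip positions (AFM stage um -> STED pixels), with noise.
        afm = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0],
                        [10.0, 10.0], [5.0, 5.0]])
        true_M = np.array([[20.0, 0.3, 64.0], [-0.3, 20.0, 48.0]])
        sted = (apply_affine(true_M, afm)
                + np.random.default_rng(2).normal(0, 0.2, (5, 2)))

        M = fit_affine(afm, sted)
        rms = np.sqrt(np.mean((apply_affine(M, afm) - sted) ** 2))
        print("estimated transform:\n", M.round(2),
              "\nRMS residual (px):", round(rms, 3))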

  3. A Tale of Two Cultures: Cross Cultural Comparison in Learning the Prezi Presentation Software Tool in the US and Norway

    ERIC Educational Resources Information Center

    Brock, Sabra; Brodahl, Cornelia

    2013-01-01

    Presentation software is an important tool for both student and professorial communicators. PowerPoint has been the standard since it was introduced in 1990. However, new "improved" software platforms are emerging. Prezi is one of these, claiming to remedy the linear thinking that underlies PowerPoint by creating one canvas and…

  4. The Arthroscopic Surgical Skill Evaluation Tool (ASSET)

    PubMed Central

    Koehler, Ryan J.; Amsdell, Simon; Arendt, Elizabeth A; Bisson, Leslie J; Braman, Jonathan P; Butler, Aaron; Cosgarea, Andrew J; Harner, Christopher D; Garrett, William E; Olson, Tyson; Warme, Winston J.; Nicandri, Gregg T.

    2014-01-01

    Background: Surgeries employing arthroscopic techniques are among the most commonly performed in orthopaedic clinical practice; however, valid and reliable methods of assessing the arthroscopic skill of orthopaedic surgeons are lacking. Hypothesis: The Arthroscopic Surgery Skill Evaluation Tool (ASSET) will demonstrate content validity, concurrent criterion-oriented validity, and reliability when used to assess the technical ability of surgeons performing diagnostic knee arthroscopy on cadaveric specimens. Study Design: Cross-sectional study; level of evidence, 3. Methods: Content validity was determined by a group of seven experts using a Delphi process. Intra-articular performance of right and left diagnostic knee arthroscopies was recorded for twenty-eight residents and two sports-medicine fellowship-trained attending surgeons. Subject performance was assessed by two blinded raters using the ASSET. Concurrent criterion-oriented validity, inter-rater reliability, and test-retest reliability were evaluated. Results: Content validity: the content development group identified 8 arthroscopic skill domains to evaluate using the ASSET. Concurrent criterion-oriented validity: significant differences in total ASSET score (p < 0.05) between novice, intermediate, and advanced experience groups were identified. Inter-rater reliability: the ASSET scores assigned by each rater were strongly correlated (r = 0.91, p < 0.01), and the intra-class correlation coefficient between raters for the total ASSET score was 0.90. Test-retest reliability: there was a significant correlation between ASSET scores for both procedures attempted by each individual (r = 0.79, p < 0.01). Conclusion: The ASSET appears to be a useful, valid, and reliable method for assessing surgeon performance of diagnostic knee arthroscopy in cadaveric specimens. Studies are ongoing to determine its generalizability to other procedures as well as to the live OR and other simulated environments. PMID:23548808

  5. Verification of visual odometry algorithms with an OpenGL-based software tool

    NASA Astrophysics Data System (ADS)

    Skulimowski, Piotr; Strumillo, Pawel

    2015-05-01

    We present a software tool called a stereovision egomotion sequence generator that was developed for testing visual odometry (VO) algorithms. Various approaches to single- and multi-camera VO algorithms are reviewed first, and then a reference VO algorithm that has served to demonstrate the program's features is described. The program offers simple tools for defining virtual static three-dimensional scenes and arbitrary six-degrees-of-freedom motion paths within such scenes, and outputs sequences of stereovision images, disparity ground-truth maps, and segmented scene images. A simple script language is proposed that simplifies tests of VO algorithms for user-defined scenarios. The program's capabilities are demonstrated by testing a reference VO technique that employs stereoscopy and feature tracking.

  6. The impact of software and CAE tools on SEU in field programmable gate arrays

    SciTech Connect

    Katz, R.; Wang, J.; McCollum, J.; Cronquist, B.

    1999-12-01

    Field programmable gate array (FPGA) devices, heavily used in spacecraft electronics, have grown substantially in size over the past few years, causing designers to work at a higher conceptual level, with computer aided engineering (CAE) tools synthesizing and optimizing the logic from a description. It is shown that the use of commercial-off-the-shelf (COTS) CAE tools can produce unreliable circuit designs when the device is used in a radiation environment and a flip-flop is upset. At a lower level, software can be used to improve the SEU performance of a flip-flop, exploiting the configurable nature of FPGA technology and on-chip delay, parasitic resistive, and capacitive circuit elements.

  7. Virtual Power Electronics: Novel Software Tools for Design, Modeling and Education

    NASA Astrophysics Data System (ADS)

    Hamar, Janos; Nagy, István; Funato, Hirohito; Ogasawara, Satoshi; Dranga, Octavian; Nishida, Yasuyuki

    This paper presents browser-based, multimedia-rich software tools and e-learning curricula that support the design and modeling of power electronics circuits and explain sometimes rather sophisticated phenomena. Two projects are discussed. The Inetele project, financed by the Leonardo da Vinci program of the European Union (EU), is a collaboration between numerous EU universities and institutes to develop a state-of-the-art curriculum in Electrical Engineering. Another cooperative project, with the participation of Japanese, European, and Australian institutes, focuses on developing e-learning curricula and interactive design and modeling tools, as well as a virtual laboratory. Snapshots from these two projects are presented.

  8. The NetVISA automatic association tool. Next generation software testing and performance under realistic conditions.

    NASA Astrophysics Data System (ADS)

    Le Bras, Ronan; Arora, Nimar; Kushida, Noriyuki; Tomuta, Elena; Kebede, Fekadu; Feitio, Paulino

    2016-04-01

    The CTBTO's International Data Centre is in the process of developing the next-generation software to perform the automatic association step. The NetVISA software uses a Bayesian approach with a forward physical model using probabilistic representations of the propagation, station capabilities, background seismicity, noise detection statistics, and coda phase statistics. The software has been in development for a few years and is now reaching the stage where it is being tested in a realistic operational context. An interactive module has been developed in which the NetVISA automatic events that are additional to the Global Association (GA) results are presented to the analysts. We report on a series of tests in which the results are examined and evaluated by seasoned analysts. Consistent with the statistics previously reported (Arora et al., 2013), the first test shows that the software is able to enhance analysis work by providing additional event hypotheses for consideration by analysts. A test on a three-day data set showed that the system found 42 additional real events out of 116 examined, including 6 that pass the criterion for the Reviewed Event Bulletin of the IDC. The software was functional in a realistic, real-time mode during the fourth nuclear test claimed by the Democratic People's Republic of Korea on January 6th, 2016. Confirming a previous statistical observation, the software found more associated stations (51, including 35 primary stations) than GA (36, including 26 primary stations) for this event. Reference: Arora, N. S., Russell, S., and Sudderth, E., Bulletin of the Seismological Society of America (BSSA), April 2013, vol. 103, no. 2A, pp. 709-729.

  9. Tool and method for the thermal transient evaluation of packages

    NASA Astrophysics Data System (ADS)

    Szekely, Vladimir; Rencz, Marta; Courtois, Bernard

    1999-10-01

    This paper presents a new concept for the thermal transient measurement of IC packages. The TTMK thermal transient test kit described here consists of a test chip, a dedicated software running on a PC and a special cable connecting the PC to the IC package which encapsulates the test chip. The function of the thermal transient test equipment is realized partly by the test chip itself and partly by the measuring software. The software performs both the control of the measurements and the evaluation of the results. The output of the evaluation software may be a compact model network or the structure function describing the properties of the heat conduction path. The use of the TTMK kit and the capabilities of the evaluation software are presented in this paper.

  10. The Computer-based Health Evaluation Software (CHES): a software for electronic patient-reported outcome monitoring

    PubMed Central

    2012-01-01

    Background: Patient-reported outcomes (PROs), capturing, e.g., quality of life, fatigue, depression, medication side-effects, or disease symptoms, have become important outcome parameters in medical research and daily clinical practice. Electronic PRO data capture (ePRO), with software packages to administer questionnaires, store data, and present results, has facilitated PRO assessment in hospital settings. Compared to conventional paper-pencil versions of PRO instruments, ePRO is more economical with regard to staff resources and time, and allows immediate presentation of results to the medical staff. The objective of our project was to develop software (CHES – Computer-based Health Evaluation System) for ePRO in hospital settings and at home, with a special focus on the presentation of individual patients' results. Methods: Following the Extreme Programming development approach, the architecture was not fixed up-front but was developed in close, continuous collaboration with software end users (medical staff, researchers, and patients) to meet their specific demands. Developed features include sophisticated longitudinal charts linking patients' PRO data to clinical characteristics and to PRO scores from reference populations, a web interface for questionnaire administration, and a tool for conveniently creating and editing questionnaires. Results: By 2012, CHES had been implemented at various institutions in Austria, Germany, Switzerland, and the UK, and about 5000 patients had participated in ePRO (with around 15000 assessments in total). Data entry is done by the patients themselves via tablet PCs, with a study nurse or an intern approaching patients and supervising questionnaire completion. Discussion: During the last decade several software packages for ePRO have emerged for different purposes. Whereas commercial products are available primarily for ePRO in clinical trials, academic projects have focused on data collection and presentation in daily clinical practice and

  11. A software tool for material data analysis and property prediction: CASAC-ANA

    SciTech Connect

    Zhou, J.; Xie, Q.; Feng, J.; Li, S.; Xu, Z.; Chen, L.; Gui, Z.

    1995-12-31

    In this paper, a user-friendly software package for material data analysis and property prediction, CASAC-ANA, is presented. CASAC-ANA offers seven methods: Nonlinear Mapping (NLM), Principal Component Analysis (PCA), Stepwise Discriminant Analysis (SDA), Discriminant Analysis with Constellation Graph (DACG), Hierarchical Clustering Analysis (HCA), Stepwise Multiple Linear Regression (SMLR), and Artificial Neural Networks (ANN). The software has some noteworthy features: (1) only one input file is needed and multipath output is produced; (2) both quantitative and qualitative data for dependent variables are accepted; and (3) it is easy to link with materials property databases. As a generalized modeling tool, CASAC-ANA can be used to treat material data concerning composition, technological processes, and properties, and to predict the properties of materials. The validity of the CASAC-ANA software has been tested successfully in three typical case studies concerning structural alloy steels, nickel-base superalloys, and continuously cast copper alloys. The CASAC-ANA methods are compared and discussed.

  12. TaxI: a software tool for DNA barcoding using distance methods

    PubMed Central

    Steinke, Dirk; Vences, Miguel; Salzburger, Walter; Meyer, Axel

    2005-01-01

    DNA barcoding is a promising approach to the diagnosis of biological diversity in which DNA sequences serve as the primary key for information retrieval. Most existing software for the evolutionary analysis of DNA sequences was designed for phylogenetic analyses; hence, those algorithms do not offer appropriate solutions for the rapid but precise analyses needed for DNA barcoding, and they are also unable to process the often large comparative datasets. We developed a flexible software tool for DNA taxonomy, named TaxI. This program calculates sequence divergences between a query sequence (the taxon to be barcoded) and each sequence of a dataset of reference sequences defined by the user. Because the analysis is based on separate pairwise alignments, this software can also work with sequences characterized by multiple insertions and deletions that are difficult to align in large sequence sets (i.e., thousands of sequences) by multiple alignment algorithms because of computational restrictions. Here, we demonstrate the utility of this approach with two datasets, one of fish larvae and juveniles from Lake Constance and one of juvenile land snails, under different models of sequence evolution. Sets of ribosomal 16S rRNA sequences, characterized by multiple indels, performed as well as or better than cox1 sequence sets in assigning sequences to species, demonstrating the suitability of rRNA genes for DNA barcoding. PMID:16214755
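
    The distance-based assignment idea reduces to computing a divergence between the query and every reference and reporting the nearest species. The sketch below uses an uncorrected p-distance on pre-aligned toy sequences; TaxI itself builds separate pairwise alignments and supports other models of sequence evolution, and the reference barcodes here are invented.

        def p_distance(seq1, seq2):
            """Fraction of differing sites, ignoring gap positions."""
            pairs = [(a, b) for a, b in zip(seq1, seq2) if a != "-" and b != "-"]
            return sum(a != b for a, b in pairs) / len(pairs)

        references = {                      # hypothetical reference barcodes
            "Coregonus wartmanni": "ACCTGGTACTTAGCCTA",
            "Perca fluviatilis":   "ACTTGGCACTTGGCGTA",
            "Rutilus rutilus":     "AGCTGCTACTTAGTCTA",
        }
        query = "ACCTGGTACTTGGCCTA"

        distances = sorted((p_distance(query, ref), name)
                           for name, ref in references.items())
        for d, name in distances:
            print(f"{name}: {d:.3f}")
        print("assigned to:", distances[0][1])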

  13. Acts -- A collection of high performing software tools for scientific computing

    SciTech Connect

    Drummond, L.A.; Marques, O.A.

    2002-11-01

    During the past decades there has been continuous growth in the number of physical and societal problems that have been successfully studied and solved by means of computational modeling and simulation. Further, many new discoveries depend on high-performance computer simulations to satisfy their demands for large computational resources and short response time. The Advanced CompuTational Software (ACTS) Collection brings together a number of general-purpose computational tool development projects funded and supported by the U.S. Department of Energy (DOE). These tools make it easier for scientific code developers to write high-performance applications for parallel computers. They tackle a number of computational issues that are common to a large number of scientific applications, mainly the implementation of numerical algorithms and support for code development, execution, and optimization. The ACTS Collection promotes code portability, reusability, reduction of duplicated effort, and tool maturity. This paper presents a brief introduction to the functionality available in ACTS. It also highlights the tools that are in demand by climate and weather modelers.

  14. FIND: A new software tool and development platform for enhanced multicolor flow analysis

    PubMed Central

    2011-01-01

    Background: Flow cytometry is a process by which cells, and other microscopic particles, can be identified, counted, and sorted mechanically through the use of hydrodynamic pressure and laser-activated fluorescence labeling. As immunostained cells pass individually through the flow chamber of the instrument, laser pulses cause fluorescence emissions that are recorded digitally for later analysis as multidimensional vectors. Current, widely adopted analysis software limits users to manual separation of events based on viewing two or three simultaneous dimensions. While this may be adequate for experiments using four or fewer colors, advances have led to laser flow cytometers capable of recording 20 different colors simultaneously. In addition, mass-spectrometry-based machines capable of recording at least 100 separate channels are being developed. Analysis of such high-dimensional data by visual exploration alone can be error-prone and susceptible to unnecessary bias. Fortunately, the field of data mining provides many tools for automated group classification of multidimensional data, and many algorithms have been adapted or created for flow cytometry. However, the majority of this research has not been made available to users through analysis software packages and, as such, is not in wide use. Results: We have developed a new software application for analysis of multi-color flow cytometry data. The main goals of this effort were to provide a user-friendly tool for automated gating (classification) of multi-color data as well as a platform for the development and dissemination of new analysis tools. With this software, users can easily load single or multiple data sets, perform automated event classification, and graphically compare results within and between experiments. We also make available a simple plugin system that enables researchers to implement and share their data analysis and classification/population discovery algorithms. Conclusions: The FIND (Flow
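
    Automated gating of the kind alluded to above can be sketched as model-based clustering of event vectors, here with a Gaussian mixture on two synthetic "colors". This is one of several data-mining approaches; FIND's own algorithms and plugin API are described in the paper.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        rng = np.random.default_rng(3)
        population_a = rng.normal([2.0, 6.0], 0.5, (1000, 2))   # e.g. one cell type
        population_b = rng.normal([6.0, 2.0], 0.7, (1500, 2))   # e.g. another
        events = np.vstack([population_a, population_b])

        # Fit a two-component mixture and assign each event to a "gate".
        gmm = GaussianMixture(n_components=2, random_state=0).fit(events)
        labels = gmm.predict(events)
        for k in range(2):
            share = np.mean(labels == k)
            print(f"gate {k}: {share:.1%} of events, "
                  f"centre {gmm.means_[k].round(2)}")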

  15. SCENARIOS EVALUATION TOOL FOR CHLORINATED SOLVENT MNA

    SciTech Connect

    Vangelas, K; Looney, B; Michael J. Truex; Charles J. Newell

    2006-08-16

    as in the technical and regulatory documents being developed within the ITRC. Three topic areas were identified for development during this project: mass balance, Enhanced Attenuation (EA), and new characterization and monitoring tools and approaches to support MNA and EA. Each of these topics is documented in a stand-alone report (WSRC-STI-2006-00082, WSRC-STI-2006-00083, and WSRC-STI-2006-00084, respectively). In brief, the mass balance efforts examine methods and tools that allow a site to be evaluated as a system, in which the inputs to and processes within the system are compared to its outputs, together with an understanding of which attenuation processes may be occurring and how likely they are to occur. Enhanced Attenuation is a new concept: a transition step between primary treatments and MNA for cases in which the natural attenuation processes are not sufficient to allow direct transition from the primary treatment to MNA. EA technologies are designed either to boost the natural attenuation processes or to decrease the loading of contaminants to the system for a period of time sufficient to allow the remedial goals to be met over the long term. For characterization and monitoring, a phased approach based on documenting the site-specific mass balance was developed. Tools and techniques to support the approach included direct measures of the biological processes and various tools to support cost-effective long-term monitoring of systems where the natural attenuation processes are the main treatment remedies. The effort revealed opportunities for integrating attenuation mechanisms into a systematic set of "combined remedies" for contaminated sites.

  16. Surface evaluation by estimation of fractal dimension and statistical tools.

    PubMed

    Hotar, Vlastimil; Salac, Petr

    2014-01-01

    Structured and complex data can be found in many applications in research and development, and also in industrial practice. We developed a methodology for describing the complexity of structured data and applied it in development and in industrial practice. The methodology uses the fractal dimension together with statistical tools and, with software modification, is able to analyse data in the form of sequences (signals, surface roughness), 2D images, and dividing lines. The methodology had not previously been tested on a relatively large collection of data. For this reason, samples with structured surfaces produced by different technologies and with different properties were measured and evaluated with many types of parameters. This paper analyses data measured by a surface roughness tester. The methodology compares standard and nonstandard parameters, searches for the optimal parameters for a complete analysis, and specifies the sensitivity to the directionality of samples for these types of surfaces. The text presents an application of fractal geometry (the fractal dimension) to complex surface analysis in combination with standard roughness parameters (statistical tools). PMID:25250380
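
    A box-counting estimate of the fractal dimension of a roughness profile, the kind of measure the methodology combines with statistical parameters, can be sketched as follows (our own minimal implementation, not the authors' code).

        import numpy as np

        def box_count_dimension(profile, sizes=(2, 4, 8, 16, 32, 64)):
            # Count boxes of size s covering the normalized curve, then take
            # the slope of log N(s) against log(1/s).
            z = (profile - profile.min()) / np.ptp(profile)
            counts = []
            for s in sizes:
                n_cols = len(z) // s
                boxes = 0
                for c in range(n_cols):
                    seg = z[c * s:(c + 1) * s]
                    # vertical boxes of height s/len(z) spanned by this column
                    boxes += int(np.ceil((seg.max() - seg.min()) / (s / len(z)))) + 1
                counts.append(boxes)
            slope, _ = np.polyfit(np.log(1.0 / np.asarray(sizes)),
                                  np.log(counts), 1)
            return slope

        rng = np.random.default_rng(4)
        rough = np.cumsum(rng.normal(0, 1, 4096))   # Brownian-like profile, D ~ 1.5
        print(f"estimated fractal dimension: {box_count_dimension(rough):.2f}")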

  17. Data Analysis Software Tools for Enhanced Collaboration at the DIII-D National Fusion Facility

    SciTech Connect

    Schachter, J.; Peng, Q.; Schissel, D.P.

    1999-07-01

    Data analysis at the DIII-D National Fusion Facility is simplified by the use of two software packages in analysis codes. The first is GAPlotObj, an IDL-based object-oriented library used in visualization tools for dynamic plotting. GAPlotObj gives users the ability to manipulate graphs directly through mouse- and keyboard-driven commands. The second software package is MDSplus, which is used at DIII-D as a central repository for analyzed data. GAPlotObj and MDSplus reduce the effort required for a collaborator to become familiar with the DIII-D analysis environment by providing uniform interfaces for data display and retrieval. Two visualization tools at DIII-D that benefit from them are ReviewPlus and EFITviewer. ReviewPlus is capable of displaying interactive 2D and 3D graphs of raw, analyzed, and simulation code data. EFITviewer is used to display results from the EFIT analysis code together with kinetic profiles and machine geometry. Both bring new possibilities for data exploration to the user, and are able to plot data from any fusion research site with an MDSplus data server.

  18. ConfocalCheck - A Software Tool for the Automated Monitoring of Confocal Microscope Performance

    PubMed Central

    Hng, Keng Imm; Dormann, Dirk

    2013-01-01

    Laser scanning confocal microscopy has become an invaluable tool in biomedical research but regular quality testing is vital to maintain the system’s performance for diagnostic and research purposes. Although many methods have been devised over the years to characterise specific aspects of a confocal microscope like measuring the optical point spread function or the field illumination, only very few analysis tools are available. Our aim was to develop a comprehensive quality assurance framework ranging from image acquisition to automated analysis and documentation. We created standardised test data to assess the performance of the lasers, the objective lenses and other key components required for optimum confocal operation. The ConfocalCheck software presented here analyses the data fully automatically. It creates numerous visual outputs indicating potential issues requiring further investigation. By storing results in a web browser compatible file format the software greatly simplifies record keeping allowing the operator to quickly compare old and new data and to spot developing trends. We demonstrate that the systematic monitoring of confocal performance is essential in a core facility environment and how the quantitative measurements obtained can be used for the detailed characterisation of system components as well as for comparisons across multiple instruments. PMID:24224017

  19. Software tools for quantification of X-ray microtomography at the UGCT

    NASA Astrophysics Data System (ADS)

    Vlassenbroeck, J.; Dierick, M.; Masschaele, B.; Cnudde, V.; Van Hoorebeke, L.; Jacobs, P.

    2007-09-01

    The technique of X-ray microtomography using X-ray tube radiation offers an interesting tool for the non-destructive investigation of a wide range of materials. A major challenge lies in the analysis and quantification of the resulting data, allowing for a full characterization of the sample under investigation. In this paper, we discuss the software tools for the reconstruction and analysis of tomographic data that are being developed at the UGCT. The tomographic reconstruction is performed using Octopus, a high-performance and user-friendly software package. The reconstruction process transforms the raw acquisition data into a stack of 2D cross-sections through the sample, resulting in a 3D data set. A number of artifact- and noise-reduction algorithms are integrated to reduce ring artifacts, beam-hardening artifacts, COR misalignment, detector or stage tilt, pixel non-linearities, etc. These corrections are very important to facilitate the analysis of the 3D data. The analysis of the 3D data focuses primarily on the characterization of pore structures, but will be extended to other applications. A first package for the analysis of pore structures in three dimensions was developed under Matlab®. A new package, called Morpho+, is being developed in a C++ environment, with optimizations and extensions of the previously used algorithms. The current status of this project is discussed. Examples of pore analysis can be found in pharmaceuticals, materials science, geology, and numerous other fields.
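
    Pore-structure quantification of the kind performed on reconstructed volumes can be sketched with scipy: threshold the gray-value stack into a pore mask, label connected components in 3D, and report porosity and pore sizes. The volume below is synthetic; Morpho+ implements far more elaborate analyses.

        import numpy as np
        from scipy import ndimage

        rng = np.random.default_rng(5)
        volume = ndimage.gaussian_filter(rng.normal(size=(64, 64, 64)), sigma=2)

        pores = volume < np.percentile(volume, 20)      # 20% darkest voxels as pores
        labels, n_pores = ndimage.label(pores)          # 3D connected components
        sizes = ndimage.sum_labels(pores, labels,
                                   index=np.arange(1, n_pores + 1))

        print(f"porosity: {pores.mean():.1%}, pores found: {n_pores}")
        print(f"largest pore: {int(sizes.max())} voxels, "
              f"median: {int(np.median(sizes))} voxels")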

  1. DSSR: an integrated software tool for dissecting the spatial structure of RNA

    PubMed Central

    Lu, Xiang-Jun; Bussemaker, Harmen J.; Olson, Wilma K.

    2015-01-01

    Insight into the three-dimensional architecture of RNA is essential for understanding its cellular functions. However, even the classic transfer RNA structure contains features that are overlooked by existing bioinformatics tools. Here we present DSSR (Dissecting the Spatial Structure of RNA), an integrated and automated tool for analyzing and annotating RNA tertiary structures. The software identifies canonical and noncanonical base pairs, including those with modified nucleotides, in any tautomeric or protonation state. DSSR detects higher-order coplanar base associations, termed multiplets. It finds arrays of stacked pairs, classifies them by base-pair identity and backbone connectivity, and distinguishes a stem of covalently connected canonical pairs from a helix of stacked pairs of arbitrary type/linkage. DSSR identifies coaxial stacking of multiple stems within a single helix and lists isolated canonical pairs that lie outside of a stem. The program characterizes ‘closed’ loops of various types (hairpin, bulge, internal, and junction loops) and pseudoknots of arbitrary complexity. Notably, DSSR employs isolated pairs and the ends of stems, whether pseudoknotted or not, to define junction loops. This new, inclusive definition provides a novel perspective on the spatial organization of RNA. Tests on all nucleic acid structures in the Protein Data Bank confirm the efficiency and robustness of the software, and applications to representative RNA molecules illustrate its unique features. DSSR and related materials are freely available at http://x3dna.org/. PMID:26184874

  2. APT - NASA ENHANCED VERSION OF AUTOMATICALLY PROGRAMMED TOOL SOFTWARE - STAND-ALONE VERSION

    NASA Technical Reports Server (NTRS)

    Premo, D. A.

    1994-01-01

    The APT code is one of the most widely used software tools for complex numerically controlled (N/C) machining. APT is an acronym for Automatically Programmed Tools and is used to denote both a language and the computer software that processes that language. Development of the APT language and software system was begun over twenty years ago as a U.S. government-sponsored industry and university research effort. APT is a "problem-oriented" language that was developed for the explicit purpose of aiding the programming of N/C machine tools. Machine-tool instructions and geometry definitions are written in the APT language to constitute a "part program." The APT part program is processed by the APT software to produce a cutter location (CL) file. This CL file may then be processed by user-supplied post-processors to convert the CL data into a form suitable for a particular N/C machine tool. This June 1989 offering of the APT system represents an adaptation, with enhancements, of the public-domain version of APT IV/SSX8 to the DEC VAX-11/780 for use by the Engineering Services Division of the NASA Goddard Space Flight Center. Enhancements include the super pocket feature, which allows concave and convex polygon shapes of up to 40 points, including shapes that overlap, that leave islands of material within the pocket, and that have one or more arcs as part of the pocket boundary. Recent modifications to APT include a rework of the POCKET subroutine and correction of an error that prevented the use within a macro of a macro-variable cutter move statement combined with macro-variable double check surfaces. Former modifications included the expansion of array and buffer sizes to accommodate larger part programs and the insertion of a few user-friendly error messages. The APT system software on the DEC VAX-11/780 is organized into two separate programs: the load complex and the APT processor. The load complex handles the table initiation phase and is usually only run when changes to the

  3. Evaluation of breast parenchymal density with QUANTRA software

    PubMed Central

    Pahwa, Shivani; Hari, Smriti; Thulkar, Sanjay; Angraal, Suveen

    2015-01-01

    Purpose: To evaluate breast parenchymal density using QUANTRA software and to correlate numerical breast density values obtained from QUANTRA with ACR BI-RADS breast density categories. Materials and Methods: Two-view digital mammograms of 545 consecutive women (mean age - 47.7 years) were categorized visually by three independent radiologists into one of the four ACR BI-RADS categories (D1-D4). Numerical breast density values as obtained by QUANTRA software were then used to establish the cutoff values for each category using receiver operator characteristic (ROC) analysis. Results: Numerical breast density values obtained by QUANTRA (range - 7-42%) were systematically lower than visual estimates. QUANTRA breast density value of less than 14.5% could accurately differentiate category D1 from the categories D2, D3, and D4 [area under curve (AUC) on ROC analysis - 94.09%, sensitivity - 85.71%, specificity - 84.21%]. QUANTRA density values of <19.5% accurately differentiated categories D1 and D2 from D3 and D4 (AUC - 94.4%, sensitivity - 87.50%, specificity - 84.60%); QUANTRA density values of <26.5% accurately differentiated categories D1, D2, and D3 from category D4 (AUC - 90.75%, sensitivity - 88.89%, specificity - 88.621%). Conclusions: Breast density values obtained by QUANTRA software can be used to obtain objective cutoff values for each ACR BI-RADS breast density category. Although the numerical density values obtained by QUANTRA are lower than visual estimates, they correlate well with the BI-RADS breast density categories assigned visually to the mammograms. PMID:26752820
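
    The cutoff derivation described above can be reproduced with standard ROC machinery. A minimal sketch follows, with synthetic density values in place of the study's data; Youden's J is used to select the threshold, which is one common choice (the paper does not state its criterion).

    # Sketch of deriving a density cutoff from ROC analysis, as in the study:
    # separate category D1 from D2-D4 using a numeric density value.
    # The arrays below are synthetic placeholders, not the study's data.
    import numpy as np
    from sklearn.metrics import roc_curve

    rng = np.random.default_rng(0)
    density = np.concatenate([rng.normal(12, 3, 100),   # D1-like values
                              rng.normal(25, 6, 300)])  # D2-D4-like values
    is_dense = np.concatenate([np.zeros(100), np.ones(300)])  # 1 = D2-D4

    fpr, tpr, thresholds = roc_curve(is_dense, density)
    youden_j = tpr - fpr                    # Youden's J picks the best cutoff
    best = thresholds[np.argmax(youden_j)]
    print(f"optimal cutoff: {best:.1f}% density")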

  4. Evaluation of the efficiency and reliability of software generated by code generators

    NASA Technical Reports Server (NTRS)

    Schreur, Barbara

    1994-01-01

    There are numerous studies which show that CASE Tools greatly facilitate software development. As a result of these advantages, an increasing amount of software development is done with CASE Tools. As more software engineers become proficient with these tools, their experience and feedback lead to further development with the tools themselves. What has not been widely studied, however, is the reliability and efficiency of the actual code produced by the CASE Tools. This investigation considered these matters. Three segments of code generated by MATRIXx, one of many commercially available CASE Tools, were chosen for analysis: ETOFLIGHT, a portion of the Earth to Orbit Flight software, and ECLSS and PFMC, modules for Environmental Control and Life Support System and Pump Fan Motor Control, respectively.

  5. Proofreading Using an Assistive Software Homophone Tool: Compensatory and Remedial Effects on the Literacy Skills of Students with Reading Difficulties

    ERIC Educational Resources Information Center

    Lange, Alissa A.; Mulhern, Gerry; Wylie, Judith

    2009-01-01

    The present study investigated the effects of using an assistive software homophone tool on the assisted proofreading performance and unassisted basic skills of secondary-level students with reading difficulties. Students aged 13 to 15 years proofread passages for homophonic errors under three conditions: with the homophone tool, with homophones…

  6. Evaluation of the Trajectory Operations Applications Software Task (TOAST)

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Martin, Andrea; Bavinger, Bill

    1990-01-01

The Trajectory Operations Applications Software Task (TOAST) is a software development project under the auspices of the Mission Operations Directorate. Its purpose is to provide trajectory operation pre-mission and real-time support for the Space Shuttle program. As an Application Manager, TOAST provides an isolation layer between the underlying Unix operating system and the series of user programs. It provides two main services: a common interface to operating system functions with semantics appropriate for C or FORTRAN, and a structured input and output package that can be utilized by user application programs. In order to evaluate TOAST as an Application Manager, the task was to assess current and planned capabilities, compare those capabilities to functions available in commercial off-the-shelf (COTS) software, and survey Flight Analysis Design System (FADS) users regarding TOAST implementation. As a result of the investigation, it was found that the current version of TOAST is well implemented and meets the needs of the real-time users. The plans for migrating TOAST to the X Window System are essentially sound; the Executive will port with minor changes, while Menu Handler will require a total rewrite. A series of recommendations for future TOAST directions are included.

  7. Performance Evaluation of Communication Software Systems for Distributed Computing

    NASA Technical Reports Server (NTRS)

    Fatoohi, Rod

    1996-01-01

In recent years there has been an increasing interest in object-oriented distributed computing since it is better equipped to deal with complex systems while providing extensibility, maintainability, and reusability. At the same time, several new high-speed network technologies have emerged for local and wide area networks. However, the performance of networking software is not improving as fast as the networking hardware and the workstation microprocessors. This paper gives an overview and evaluates the performance of the Common Object Request Broker Architecture (CORBA) standard in a distributed computing environment at NASA Ames Research Center. The environment consists of two testbeds of SGI workstations connected by four networks: Ethernet, FDDI, HiPPI, and ATM. The performance results for three communication software systems are presented, analyzed, and compared. These systems are: the BSD socket programming interface; IONA's Orbix, an implementation of the CORBA specification; and the PVM message-passing library. The results show that high-level communication interfaces, such as CORBA and PVM, can achieve reasonable performance under certain conditions.

  8. Photovoltaic array performance and life-cycle cost simulation using new software tools

    NASA Technical Reports Server (NTRS)

    Daniel, R. E.; Burger, D. R.; Reiter, L. J.

    1985-01-01

The three computer models, SAMICS, PVARRAY, and LCP, can be used together as a single analytical tool to compare the lifetime economic value of a photovoltaic (PV) array. This evaluation can be used to compare various module and array configurations and the performance characteristics of different module manufacturing technologies.
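
    A life-cycle comparison of the kind these models support reduces, at its core, to discounted lifetime cost divided by discounted lifetime energy. A minimal sketch follows with hypothetical figures; the actual models perform far more detailed manufacturing-cost, performance, and worth analyses.

    # Illustrative life-cycle cost comparison for two PV array options.
    # All figures are hypothetical placeholders.
    def lifecycle_cost_per_kwh(capex, annual_om, annual_kwh,
                               years=20, discount=0.05, degradation=0.01):
        """Discounted lifetime cost divided by discounted lifetime energy."""
        cost = float(capex)
        energy = 0.0
        for t in range(1, years + 1):
            cost += annual_om / (1 + discount) ** t
            energy += annual_kwh * (1 - degradation) ** t / (1 + discount) ** t
        return cost / energy

    print(lifecycle_cost_per_kwh(capex=12000, annual_om=150, annual_kwh=9000))
    print(lifecycle_cost_per_kwh(capex=10000, annual_om=250, annual_kwh=8200))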

  9. The Viability of a Software Tool to Assist Students in the Review of Literature

    ERIC Educational Resources Information Center

    Anderson, Timothy R.

    2013-01-01

    Most doctoral students are novice researchers and may not possess the skills to effectively conduct a comprehensive review of the literature and frame a problem designed to conduct original research. Students need proper training and tools necessary to critically evaluate, synthesize and organize literature. The purpose of this concurrent mixed…

  10. A Process for Evaluating Student Records Management Software. ERIC/AE Digest.

    ERIC Educational Resources Information Center

    Vecchioli, Lisa

This digest provides practical advice on evaluating software for managing student records. An evaluation of record-keeping software should start with a process to identify all of the individual needs the software product must meet in order to be considered for purchase. The first step toward establishing an administrative computing system is…

  11. Validation of a Tool Evaluating Educational Apps for Smart Education

    ERIC Educational Resources Information Center

    Lee, Jeong-Sook; Kim, Sung-Wan

    2015-01-01

    The purpose of this study is to develop and validate an evaluation tool of educational apps for smart education. Based on literature reviews, a potential model for evaluating educational apps was suggested. An evaluation tool consisting of 57 survey items was delivered to 156 students in middle and high schools. An exploratory factor analysis was…

  12. THE ATMOSPHERIC MODEL EVALUATION TOOL (AMET); AIR QUALITY MODULE

    EPA Science Inventory

This presentation reviews the development of the Atmospheric Model Evaluation Tool (AMET) air quality module. The AMET tool is being developed to aid in model evaluation. This presentation focuses on the air quality evaluation portion of AMET. Presented are examples of the...

  13. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model

    PubMed Central

    2016-01-01

The omnipresent need for optimisation requires constant improvements of companies’ business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually achieved by simulating the newly developed BP under various initial conditions and “what-if” scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible by adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694
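
    A toy illustration of qualitative hierarchical aggregation in the spirit of DEX follows; the criteria, scale, and rule are invented for illustration and are not the authors' model.

    # Toy qualitative hierarchy in the spirit of DEX (not the authors' model):
    # leaf criteria take ordered qualitative values and a rule maps child
    # values to the parent's value; QQ would then refine ties numerically.
    SCALE = ["poor", "fair", "good"]

    def aggregate(visual: str, simulation: str, reporting: str) -> str:
        """Map three child assessments to an overall BPSS rating."""
        worst = min(visual, simulation, reporting, key=SCALE.index)
        best = max(visual, simulation, reporting, key=SCALE.index)
        # Illustrative rule: a badly failing child drags the overall down.
        return worst if SCALE.index(best) - SCALE.index(worst) > 1 else best

    tools = {"ToolA": ("good", "good", "fair"),
             "ToolB": ("good", "poor", "good")}
    ranking = sorted(tools, key=lambda t: SCALE.index(aggregate(*tools[t])),
                     reverse=True)
    print(ranking)  # ['ToolA', 'ToolB'] under this toy rule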

  14. Ranking of Business Process Simulation Software Tools with DEX/QQ Hierarchical Decision Model.

    PubMed

    Damij, Nadja; Boškoski, Pavle; Bohanec, Marko; Mileva Boshkoska, Biljana

    2016-01-01

The omnipresent need for optimisation requires constant improvements of companies' business processes (BPs). Minimising the risk of implementing an inappropriate BP is usually achieved by simulating the newly developed BP under various initial conditions and "what-if" scenarios. Effectual business process simulation software (BPSS) is a prerequisite for accurate analysis of a BP. Characterisation of a BPSS tool is a challenging task due to the complex selection criteria, which include quality of visual aspects, simulation capabilities, statistical facilities, quality of reporting, etc. Under such circumstances, making an optimal decision is challenging. Therefore, various decision support models are employed to aid BPSS tool selection. The currently established decision support models are either proprietary or comprise only a limited subset of criteria, which affects their accuracy. Addressing this issue, this paper proposes a new hierarchical decision support model for ranking BPSS tools based on their technical characteristics, employing DEX and qualitative-to-quantitative (QQ) methodology. Consequently, the decision expert feeds in the required information in a systematic and user-friendly manner. There are three significant contributions of the proposed approach. Firstly, the proposed hierarchical model is easily extendible by adding new criteria to the hierarchical structure. Secondly, a fully operational decision support system (DSS) tool that implements the proposed hierarchical model is presented. Finally, the effectiveness of the proposed hierarchical model is assessed by comparing the resulting rankings of BPSS tools with currently available results. PMID:26871694

  15. Establishing a Web-based DICOM teaching file authoring tool using open-source public software.

    PubMed

    Lee, Wen-Jeng; Yang, Chung-Yi; Liu, Kao-Lang; Liu, Hon-Man; Ching, Yu-Tai; Chen, Shyh-Jye

    2005-09-01

Online teaching files are an important source of educational and referential materials in the radiology community. The Digital Imaging and Communications in Medicine (DICOM) file format commonly used in the radiology community is not natively supported by common Web browsers. The ability of the Web server to convert and parse DICOM is therefore important when DICOM-converting tools are not available. In this paper, we describe our approach to developing a Web-based teaching file authoring tool. Our server is built using the Apache Web server running on the FreeBSD operating system. The dynamic page content is produced by Hypertext Preprocessor (PHP). Digital Imaging and Communications in Medicine images are converted by ImageMagick into Joint Photographic Experts Group (JPEG) format. Digital Imaging and Communications in Medicine attributes are parsed by dicom3tools and stored in a PostgreSQL database. Using free software available from the Internet, we build a Web service that allows radiologists to create their own online teaching file cases with a common Web browser. PMID:15924271
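
    A minimal sketch of the server-side conversion step follows; ImageMagick's convert is invoked as in the paper, while pydicom stands in for dicom3tools as the attribute parser (an assumption; the paper's stack is PHP-based).

    # Sketch of the pipeline described above: convert a DICOM file to a
    # browser-friendly JPEG and extract a few header attributes for the
    # database. Requires ImageMagick (with its DICOM delegate) and pydicom.
    import subprocess
    import pydicom

    def prepare_case(dicom_path: str, jpeg_path: str) -> dict:
        # ImageMagick handles the DICOM-to-JPEG conversion.
        subprocess.run(["convert", dicom_path, jpeg_path], check=True)
        ds = pydicom.dcmread(dicom_path, stop_before_pixels=True)
        return {                      # attributes to store in the database
            "modality": ds.get("Modality", ""),
            "study_date": ds.get("StudyDate", ""),
            "body_part": ds.get("BodyPartExamined", ""),
        }

    print(prepare_case("case001.dcm", "case001.jpg"))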

  16. An automated deformable image registration evaluation of confidence tool.

    PubMed

    Kirby, Neil; Chen, Josephine; Kim, Hojin; Morin, Olivier; Nie, Ke; Pouliot, Jean

    2016-04-21

    Deformable image registration (DIR) is a powerful tool for radiation oncology, but it can produce errors. Beyond this, DIR accuracy is not a fixed quantity and varies on a case-by-case basis. The purpose of this study is to explore the possibility of an automated program to create a patient- and voxel-specific evaluation of DIR accuracy. AUTODIRECT is a software tool that was developed to perform this evaluation for the application of a clinical DIR algorithm to a set of patient images. In brief, AUTODIRECT uses algorithms to generate deformations and applies them to these images (along with processing) to generate sets of test images, with known deformations that are similar to the actual ones and with realistic noise properties. The clinical DIR algorithm is applied to these test image sets (currently 4). From these tests, AUTODIRECT generates spatial and dose uncertainty estimates for each image voxel based on a Student's t distribution. In this study, four commercially available DIR algorithms were used to deform a dose distribution associated with a virtual pelvic phantom image set, and AUTODIRECT was used to generate dose uncertainty estimates for each deformation. The virtual phantom image set has a known ground-truth deformation, so the true dose-warping errors of the DIR algorithms were also known. AUTODIRECT predicted error patterns that closely matched the actual error spatial distribution. On average AUTODIRECT overestimated the magnitude of the dose errors, but tuning the AUTODIRECT algorithms should improve agreement. This proof-of-principle test demonstrates the potential for the AUTODIRECT algorithm as an empirical method to predict DIR errors. PMID:27025957
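
    The voxel-wise uncertainty step can be sketched as follows, assuming the per-voxel errors from the N test registrations (here N = 4, as in the paper) are already in hand; the error values below are synthetic.

    # Sketch of a voxel-wise uncertainty estimate in the spirit of AUTODIRECT:
    # from N test registrations with known deformations, form a Student's-t
    # confidence bound on the DIR error at every voxel.
    import numpy as np
    from scipy import stats

    n_tests = 4
    errors = np.random.default_rng(1).normal(0, 1.5, (n_tests, 64, 64, 32))  # mm

    mean = errors.mean(axis=0)
    sem = errors.std(axis=0, ddof=1) / np.sqrt(n_tests)
    t95 = stats.t.ppf(0.975, df=n_tests - 1)   # two-sided 95% quantile
    upper_bound = np.abs(mean) + t95 * sem     # per-voxel spatial uncertainty

    print(upper_bound.mean(), "mm average 95% error bound")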

  17. An automated deformable image registration evaluation of confidence tool

    NASA Astrophysics Data System (ADS)

    Kirby, Neil; Chen, Josephine; Kim, Hojin; Morin, Olivier; Nie, Ke; Pouliot, Jean

    2016-04-01

    Deformable image registration (DIR) is a powerful tool for radiation oncology, but it can produce errors. Beyond this, DIR accuracy is not a fixed quantity and varies on a case-by-case basis. The purpose of this study is to explore the possibility of an automated program to create a patient- and voxel-specific evaluation of DIR accuracy. AUTODIRECT is a software tool that was developed to perform this evaluation for the application of a clinical DIR algorithm to a set of patient images. In brief, AUTODIRECT uses algorithms to generate deformations and applies them to these images (along with processing) to generate sets of test images, with known deformations that are similar to the actual ones and with realistic noise properties. The clinical DIR algorithm is applied to these test image sets (currently 4). From these tests, AUTODIRECT generates spatial and dose uncertainty estimates for each image voxel based on a Student’s t distribution. In this study, four commercially available DIR algorithms were used to deform a dose distribution associated with a virtual pelvic phantom image set, and AUTODIRECT was used to generate dose uncertainty estimates for each deformation. The virtual phantom image set has a known ground-truth deformation, so the true dose-warping errors of the DIR algorithms were also known. AUTODIRECT predicted error patterns that closely matched the actual error spatial distribution. On average AUTODIRECT overestimated the magnitude of the dose errors, but tuning the AUTODIRECT algorithms should improve agreement. This proof-of-principle test demonstrates the potential for the AUTODIRECT algorithm as an empirical method to predict DIR errors.

  18. New Decision Tool To Evaluate Award Selection Process.

    ERIC Educational Resources Information Center

    Thornley, Richard; Spence, Matthew W.; Taylor, Mark; Magnan, Jacques

    2002-01-01

    Describes an Alberta Heritage Foundation for Medical Research initiative to enhance the review process for its training awards using a new tool based on the ProGrid decision-assist software. Implementation resulted in several modifications to the review process in the areas of definition, rationality, fairness, timeliness, and responsiveness; the…

  19. Hybrid Modeling for Scenario-Based Evaluation of Failure Effects in Advanced Hardware-Software Designs

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Fleming, Land; Throop, David

    2001-01-01

This paper describes an incremental scenario-based simulation approach to the evaluation of intelligent software for control and management of hardware systems. A hybrid continuous/discrete event simulation of the hardware dynamically interacts with the intelligent software in operations scenarios. Embedded anomalous conditions and failures in the simulated hardware can lead to emergent software behavior and identification of missing or faulty software or hardware requirements. An approach is described for extending simulation-based automated incremental failure modes and effects analysis to support concurrent evaluation of intelligent software and of the hardware controlled by that software.
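
    A toy illustration of the hybrid idea (not the paper's simulator): continuous plant dynamics with a discrete failure event injected mid-scenario, monitored by a simple control rule. All dynamics and thresholds are invented for illustration.

    # Toy hybrid simulation: a continuous plant (tank temperature) with a
    # discrete failure event, exercised against a simple monitoring rule.
    def run_scenario(fail_at=50, steps=100, dt=1.0):
        temp, heater_on, log = 20.0, True, []
        for t in range(steps):
            if t == fail_at:
                heater_on = False          # embedded anomaly: heater fails
            power = 1.0 if heater_on else 0.0
            temp += dt * (0.5 * power - 0.05 * (temp - 20.0))  # plant model
            if temp < 22.0 and t > 10 and not heater_on:
                log.append((t, "controller flags loss-of-heating anomaly"))
                break
        return log

    print(run_scenario())  # reports when the anomaly becomes observable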

  20. Combining Software Games with Education: Evaluation of its Educational Effectiveness

    ERIC Educational Resources Information Center

    Virvou, Maria; Katsionis, George; Manos, Konstantinos

    2005-01-01

    Computer games are very popular among children and adolescents. In this respect, they could be exploited by educational software designers to render educational software more attractive and motivating. However, it remains to be explored what the educational scope of educational software games is. In this paper, we explore several issues concerning…

  1. [SIGAPS: a software package for the evaluation of medical publications].

    PubMed

    Derancourt, C; Devos, P; Moore, N; Rouvillain, J-L

    2014-01-01

The "système d'interrogation, de gestion et d'analyse des publications scientifiques" (System for Identification, Management and Analysis of Scientific Publications), or SIGAPS, is an innovative tool of French design that enables the identification and analysis of bibliographic references produced by a given researcher or unit, using the Medline database (PubMed). The evaluation takes into account the author's rank of signature and the impact factor of the journal of publication within the discipline in question; its limits are those of the impact factor. Analyses produced by SIGAPS enable hospitals to make financial assessments. PMID:25209819
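
    The shape of such a score is a journal-category weight multiplied by an author-position weight, summed over publications. The numeric weights below are hypothetical placeholders, not the official SIGAPS grid.

    # SIGAPS-style publication scoring. The weights are placeholders,
    # NOT the official SIGAPS grid.
    CATEGORY_POINTS = {"A": 8, "B": 6, "C": 4, "D": 3, "E": 2, "NC": 1}
    POSITION_WEIGHT = {"first": 4, "second": 3, "third": 2, "last": 4, "other": 1}

    def publication_score(journal_category: str, author_position: str) -> int:
        """Score one publication from journal category and signature rank."""
        return CATEGORY_POINTS[journal_category] * POSITION_WEIGHT[author_position]

    # A unit's score is the sum over its publications.
    pubs = [("A", "first"), ("C", "other"), ("B", "last")]
    print(sum(publication_score(cat, pos) for cat, pos in pubs))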

  2. A browsing tool for the Internet Logical Library of the HPCC Software Exchange

    NASA Technical Reports Server (NTRS)

    Biro, Ross

    1993-01-01

    As the quantity of information available on the Internet grows, locating a particular piece of information becomes more difficult. One possible solution is for a database of pointers to all available information to be maintained at a central site. Subject classifications for all the information could also be maintained in order to make searching possible. This paper describes one possible method of searching such an index. In particular a prototype browsing tool has been created using TCL/TK to demonstrate several possible features: rapidly scanning at any rank of the index, narrowing the index to any scope, regular-expression searching, and creation of a list of pointers answering to any set of index terms. The prototype browser is an easy-to-use independent X application designed for use in the Catalog of Repositories of the HPCC (High Performance Computing and Communications) Software Exchange.

  3. BioBrick assembly standards and techniques and associated software tools.

    PubMed

    Røkke, Gunvor; Korvald, Eirin; Pahr, Jarle; Oyås, Ove; Lale, Rahmi

    2014-01-01

    The BioBrick idea was developed to introduce the engineering principles of abstraction and standardization into synthetic biology. BioBricks are DNA sequences that serve a defined biological function and can be readily assembled with any other BioBrick parts to create new BioBricks with novel properties. In order to achieve this, several assembly standards can be used. Which assembly standards a BioBrick is compatible with, depends on the prefix and suffix sequences surrounding the part. In this chapter, five of the most common assembly standards will be described, as well as some of the most used assembly techniques, cloning procedures, and a presentation of the available software tools that can be used for deciding on the best method for assembling of different BioBricks, and searching for BioBrick parts in the Registry of Standard Biological Parts database. PMID:24395353
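
    One check such software performs can be sketched directly: a part is compatible with assembly standard 10 only if its internal sequence is free of the restriction sites used by the prefix and suffix. The recognition sequences below are the standard ones for these enzymes; the function itself is illustrative.

    # Minimal sketch of one software check described above: a part is only
    # compatible with BioBrick assembly standard 10 if its internal sequence
    # is free of the restriction sites used by the prefix/suffix.
    RFC10_SITES = {
        "EcoRI": "GAATTC",
        "XbaI": "TCTAGA",
        "SpeI": "ACTAGT",
        "PstI": "CTGCAG",
        "NotI": "GCGGCCGC",
    }

    def rfc10_conflicts(sequence: str) -> list[str]:
        """Return the names of forbidden sites found inside a candidate part."""
        seq = sequence.upper()
        return [name for name, site in RFC10_SITES.items() if site in seq]

    print(rfc10_conflicts("ATGGAATTCAAA"))  # ['EcoRI'] -> not RFC10-compatible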

  4. GeneMarker® Genotyping Software: Tools to Increase the Statistical Power of DNA Fragment Analysis

    PubMed Central

    Hulce, D.; Li, X.; Snyder-Leiby, T.; Johathan Liu, C.S.

    2011-01-01

The discriminatory power of post-genotyping analyses, such as kinship or clustering analysis, is dependent on the amount of genetic information obtained from the DNA fragment/genotyping analysis. The number of microsatellite loci amplified in one multiplex is limited by the number of dyes and overlapping loci boundaries, requiring researchers to amplify replicate samples with 2 or more multiplexes in order to obtain a genotype for 12–15 loci. AFLP is another method that is limited by the number of dyes, often requiring multiple amplifications of replicate samples to obtain more complete results. Traditionally, researchers export the genotyping results into a spreadsheet, manually combine the results for each individual, and then import them into a third software package for post-genotyping analysis. GeneMarker is highly accurate, user-friendly genotyping software that allows all of these steps to be done in one software package, avoiding potential errors from data transfer between different programs and decreasing the amount of time needed to process the results. The Merge Project tool automatically combines the results from replicate samples processed with different primer sets. Replicate animal (diploid) DNA samples were amplified with three different multiplexes; each multiplex provided information on 4–6 loci. The kinship analysis using the merged results provided a 10¹⁷ increase in statistical power, ranging from 10⁸ when 5 loci were used to 10²⁵ when 15 loci were used to determine potential relationship levels with identity-by-descent calculations. These same sample sets were used in clustering analysis to construct dendrograms. The dendrogram based on a single multiplex resulted in three branches at a given Euclidean distance. In comparison, the dendrogram that was constructed using the merged results had eight branches at the same Euclidean distance.

  5. RadNotes: a novel software development tool for radiology education.

    PubMed

    Baxter, A B; Klein, J S; Oesterle, E V

    1997-01-01

    RadNotes is a novel software development tool that enables physicians to develop teaching materials incorporating text and images in an intelligent, highly usable format. Projects undertaken in the RadNotes environment require neither programming expertise nor the assistance of a software engineer. The first of these projects, Thoracic Imaging, integrates image teaching files, concise disease and topic summaries, references, and flash card quizzes into a single program designed to provide an overview of chest radiology. RadNotes is intended to support the academic goals of teaching radiologists by enabling authors to create, edit, and electronically distribute image-oriented presentations. RadNotes also supports the educational goals of physicians who wish to quickly review selected imaging topics, as well as to develop a visual vocabulary of corresponding radiologic anatomy and pathologic conditions. Although Thoracic Imaging was developed with the aim of introducing chest radiology to residents, RadNotes can be used to develop tutorials and image-based tests for all levels; create corresponding World Wide Web sites; and organize notes, images, and references for individual use. PMID:9153710

  6. OligoSpawn: a software tool for the design of overgo probes from large unigene datasets

    PubMed Central

    Zheng, Jie; Svensson, Jan T; Madishetty, Kavitha; Close, Timothy J; Jiang, Tao; Lonardi, Stefano

    2006-01-01

Background Expressed sequence tag (EST) datasets represent perhaps the largest collection of genetic information. ESTs can be exploited in a variety of biological experiments and analyses. Here we are interested in the design of overlapping oligonucleotide (overgo) probes from large unigene (EST-contig) datasets. Results OLIGOSPAWN is a suite of software tools that offers two complementary services, namely (1) the selection of "unique" oligos, each of which appears in one unigene but does not occur (exactly or approximately) in any other, and (2) the selection of "popular" oligos, each of which occurs (exactly or approximately) in as many unigenes as possible. In this paper, we describe the functionalities of OLIGOSPAWN and the computational methods it employs, and we report on experimental results for the overgo probes designed with it. Conclusion The algorithms we designed are highly efficient and capable of processing unigene datasets of sizes on the order of several tens of Mb in a few hours on a regular PC. The software has been used to design overgo probes employed to screen a barley BAC library (Hordeum vulgare). OLIGOSPAWN is freely available at . PMID:16401345
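
    The two services can be caricatured with exact k-mer counting, as in the sketch below; the real algorithms also exclude approximate occurrences and are engineered for datasets of tens of Mb. The sequences and k value are invented for illustration.

    # Caricature of OLIGOSPAWN's two services using exact k-mer counting:
    # "unique" oligos occur in exactly one unigene, "popular" oligos in as
    # many unigenes as possible.
    from collections import Counter

    def kmer_sets(unigenes: dict[str, str], k: int = 8) -> dict[str, set[str]]:
        return {name: {seq[i:i + k] for i in range(len(seq) - k + 1)}
                for name, seq in unigenes.items()}

    unigenes = {"u1": "ACGTACGTGGTT", "u2": "TTGGACGTACGT", "u3": "ACGTACGTAACC"}
    counts = Counter(kmer for s in kmer_sets(unigenes).values() for kmer in s)

    unique = [kmer for kmer, c in counts.items() if c == 1]
    popular = max(counts, key=counts.get)
    print(len(unique), "unique oligos;", popular, "is the most popular")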

  7. Open Source Software Openfoam as a New Aerodynamical Simulation Tool for Rocket-Borne Measurements

    NASA Astrophysics Data System (ADS)

    Staszak, T.; Brede, M.; Strelnikov, B.

    2015-09-01

Sounding rockets are the only way to perform in-situ measurements, which are very important experimental studies for atmospheric science, in the mesosphere/lower thermosphere (MLT). The drawback of using rockets is the shock wave that appears because of the very high speed of the rocket motion (typically about 1000 m/s). This shock wave disturbs the density, temperature, and velocity fields in the vicinity of the rocket relative to the undisturbed values of the atmosphere. This effect, however, can be quantified, and the measured data have to be corrected not just to make them more precise but to make them usable at all. The commonly accepted and widely used tool for these calculations is the Direct Simulation Monte Carlo (DSMC) technique developed by G. A. Bird, which is available as a stand-alone program limited to a single processor. Apart from the complications of simulating flows around bodies in the different flow regimes of the MLT altitude range, which arise from the exponential change of density over several orders of magnitude, a particular hardware configuration introduces significant difficulty for aerodynamical calculations: the grid sizes must simultaneously satisfy the demands of an adequate DSMC and resolve geometries whose scales differ by several orders of magnitude. This either makes the calculation time unreasonably long or even prevents the calculation algorithm from converging. In this paper we apply the free open-source software OpenFOAM (licensed under the GNU GPL) to a three-dimensional CFD simulation of the flow around a sounding rocket instrument. An advantage of this software package, among other things, is that it can run on high-performance clusters, which are easily scalable. We present the first results and discuss the potential of the new tool in applications for sounding rockets.

  8. Evaluation of Distribution Analysis Software for DER Applications

    SciTech Connect

    Staunton, RH

    2003-01-23

    providers will be forced to both fully acknowledge the trend and plan for accommodating DER [3]. With bureaucratic barriers [4], lack of time/resources, tariffs, etc. still seen in certain regions of the country, changes still need to be made. Given continued technical advances in DER, the time is fast approaching when the industry, nation-wide, must not only accept DER freely but also provide or review in-depth technical assessments of how DER should be integrated into and managed throughout the distribution system. Characterization studies are needed to fully understand how both the utility system and DER devices themselves will respond to all reasonable events (e.g., grid disturbances, faults, rapid growth, diverse and multiple DER systems, large reactive loads). Some of this work has already begun as it relates to operation and control of DER [5] and microturbine performance characterization [6,7]. One of the most urgently needed tools that can provide these types of analyses is a distribution network analysis program in combination with models for various DER. Together, they can be used for (1) analyzing DER placement in distribution networks and (2) helping to ensure that adequate transmission reliability is maintained. Surveys of the market show products that represent a partial match to these needs; specifically, software that has been developed to plan electrical distribution systems and analyze reliability (in a near total absence of DER). The first part of this study (Sections 2 and 3 of the report) looks at a number of these software programs and provides both summary descriptions and comparisons. The second part of this study (Section 4 of the report) considers the suitability of these analysis tools for DER studies. It considers steady state modeling and assessment work performed by ORNL using one commercially available tool on feeder data provided by a southern utility. Appendix A provides a technical report on the results of this modeling effort.

  9. Performance Analysis of GYRO: A Tool Evaluation

    SciTech Connect

    Worley, P.; Roth, P.; Candy, J.; Shan, Hongzhang; Mahinthakumar,G.; Sreepathi, S.; Carrington, L.; Kaiser, T.; Snavely, A.; Reed, D.; Zhang, Y.; Huck, K.; Malony, A.; Shende, S.; Moore, S.; Wolf, F.

    2005-06-26

    The performance of the Eulerian gyrokinetic-Maxwell solver code GYRO is analyzed on five high performance computing systems. First, a manual approach is taken, using custom scripts to analyze the output of embedded wall clock timers, floating point operation counts collected using hardware performance counters, and traces of user and communication events collected using the profiling interface to Message Passing Interface (MPI) libraries. Parts of the analysis are then repeated or extended using a number of sophisticated performance analysis tools: IPM, KOJAK, SvPablo, TAU, and the PMaC modeling tool suite. The paper briefly discusses what has been discovered via this manual analysis process, what performance analyses are inconvenient or infeasible to attempt manually, and to what extent the tools show promise in accelerating or significantly extending the manual performance analyses.

  10. SHAPA: An interactive software tool for protocol analysis applied to aircrew communications and workload

    NASA Technical Reports Server (NTRS)

    James, Jeffrey M.; Sanderson, Penelope M.; Seidler, Karen S.

    1990-01-01

As modern transport environments become increasingly complex, issues such as crew communication, interaction with automation, and workload management have become crucial. Much research is being focused on holistic aspects of social and cognitive behavior, such as the strategies used to handle workload, the flow of information, the scheduling of tasks, and the verbal and non-verbal interactions between crew members. Traditional laboratory performance measures no longer sufficiently meet the needs of researchers addressing these issues. However, observational techniques are better equipped to capture the type of data needed and to build models of the requisite level of sophistication. Presented here is SHAPA, an interactive software tool for performing both verbal and non-verbal protocol analysis. It has been developed with the idea of affording researchers the closest possible degree of engagement with protocol data. The researcher can configure SHAPA to encode protocols using any theoretical framework or encoding vocabulary that is desired. SHAPA allows protocol analysis to be performed at any level of analysis, and it supplies a wide variety of tools for data aggregation and manipulation. The output generated by SHAPA can be used alone or in combination with other performance variables to build a rich picture of the influences on sequences of verbal or non-verbal behavior.

  11. Proceedings of the Ninth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  12. Tool for the evaluation of cigarette paper marking quality

    NASA Astrophysics Data System (ADS)

    Bloch, Jean-Francis; Falgueras, D. Bertran

    2000-12-01

The papermaking process consists of a succession of unit operations: the forming section, the press section, and finally the drying section. Forming and pressing are within the scope of this paper, as they may influence the appearance of the studied material: paper. The main objective is to characterize paper and, more specifically, its visual quality, mainly due to marking, which consists of successive white and dark strips. A method is described for analyzing the quality of this visual aspect of paper, which is a very important factor for the consumer. This paper is therefore devoted to the presentation of an industrial Digital Image Processing tool that allows the evaluation of cigarette paper marking quality. This problem is delicate, as different technical and physical parameters influence the paper's appearance. For example, the whiteness or the opacity of the paper influences the evaluation of marking quality. Furthermore, the classical quality test is carried out by a cigarette-paper expert who observes the paper lying on a black support, so the reflection of light is mainly observed rather than the look-through aspect. Usually, this determination is made by the experienced eye of the expert, who may distinguish 5 to 6 classes of paper quality. Moreover, sensitivity and subjectivity play an important role in establishing this grading. The aim of the presented tool is to obtain an objective classification of paper marking quality. Image analysis is used to mimic the expert's experience. In a first step, the image acquisition is done using a standard scanner. The developed software then analyzes the obtained image numerically. The sensitivity of the image analysis is high, and the results are repeatable. The classification of different cigarette papers using this method gave the same results as the human expert, confirming the validity of the developed method. Some experimental results are
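
    The core measurement lends itself to a simple sketch (not the paper's algorithm): average the scanned image along the strip direction and score the periodic contrast of the resulting profile, for example via its dominant FFT peak. The synthetic image stands in for a real scan.

    # Illustrative strip-contrast measurement: average a grayscale scan
    # along the strip direction, then score the dominant FFT peak of the
    # brightness profile as a marking-regularity measure.
    import numpy as np

    def marking_score(image: np.ndarray) -> float:
        """image: 2-D grayscale array, marking strips running along axis 0."""
        profile = image.mean(axis=0)           # 1-D brightness profile
        profile = profile - profile.mean()     # remove the DC component
        spectrum = np.abs(np.fft.rfft(profile))
        return spectrum[1:].max() / (np.abs(profile).mean() + 1e-9)

    # Synthetic scan: sinusoidal strips plus noise stand in for a real image.
    x = np.linspace(0, 20 * np.pi, 600)
    scan = 128 + 20 * np.sin(x) + np.random.default_rng(3).normal(0, 5, (400, 600))
    print(f"marking regularity score: {marking_score(scan):.1f}")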

  13. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation, and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition. PMID:27350495

  14. Design and Evaluation of Interactive Proofreading Tools for Connectomics.

    PubMed

    Haehn, Daniel; Knowles-Barley, Seymour; Roberts, Mike; Beyer, Johanna; Kasthuri, Narayanan; Lichtman, Jeff W; Pfister, Hanspeter

    2014-12-01

    Proofreading refers to the manual correction of automatic segmentations of image data. In connectomics, electron microscopy data is acquired at nanometer-scale resolution and results in very large image volumes of brain tissue that require fully automatic segmentation algorithms to identify cell boundaries. However, these algorithms require hundreds of corrections per cubic micron of tissue. Even though this task is time consuming, it is fairly easy for humans to perform corrections through splitting, merging, and adjusting segments during proofreading. In this paper we present the design and implementation of Mojo, a fully-featured single-user desktop application for proofreading, and Dojo, a multi-user web-based application for collaborative proofreading. We evaluate the accuracy and speed of Mojo, Dojo, and Raveler, a proofreading tool from Janelia Farm, through a quantitative user study. We designed a between-subjects experiment and asked non-experts to proofread neurons in a publicly available connectomics dataset. Our results show a significant improvement of corrections using web-based Dojo, when given the same amount of time. In addition, all participants using Dojo reported better usability. We discuss our findings and provide an analysis of requirements for designing visual proofreading software. PMID:26356960

  15. TIDE TOOL: Open-Source Sea-Level Monitoring Software for Tsunami Warning Systems

    NASA Astrophysics Data System (ADS)

    Weinstein, S. A.; Kong, L. S.; Becker, N. C.; Wang, D.

    2012-12-01

A tsunami warning center (TWC) typically decides to issue a tsunami warning bulletin when initial estimates of earthquake source parameters suggest it may be capable of generating a tsunami. A TWC, however, relies on sea-level data to provide prima facie evidence for the existence or non-existence of destructive tsunami waves and to constrain tsunami wave height forecast models. In the aftermath of the 2004 Sumatra disaster, the International Tsunami Information Center asked the Pacific Tsunami Warning Center (PTWC) to develop a platform-independent, easy-to-use software package to give nascent TWCs the ability to process WMO Global Telecommunications System (GTS) sea-level messages and to analyze the resulting sea-level curves (marigrams). In response PTWC developed TIDE TOOL, which has since steadily grown in sophistication to become PTWC's operational sea-level processing system. TIDE TOOL has two main parts: a decoder that reads GTS sea-level message logs, and a graphical user interface (GUI) written in the open-source platform-independent graphical toolkit scripting language Tcl/Tk. This GUI consists of dynamic map-based clients that allow the user to select and analyze a single station or groups of stations by displaying their marigrams in strip-chart or screen-tiled forms. TIDE TOOL also includes detail maps to show each station's geographical context, as well as reverse tsunami travel time contours for each station. TIDE TOOL can also be coupled to the GEOWARE™ TTT program to plot tsunami travel times and to indicate the expected tsunami arrival time on the marigrams. Because sea-level messages are structured in a rich variety of formats, TIDE TOOL includes a metadata file, COMP_META, that contains all of the information needed by TIDE TOOL to decode sea-level data as well as basic information such as the geographical coordinates of each station. TIDE TOOL can therefore continuously decode these sea-level messages in real time and display the time

  16. Disposal Systems Evaluations and Tool Development - Engineered Barrier System Evaluation (Work Package LL1015080425)

    SciTech Connect

    Blink, J A; Buscheck, T A; Halsey, W G; Wolery, T

    2010-03-19

in the field of knowledge engineering and knowledge-management systems (Umeki et al. 2009). At certain points in the logical process, the DSEF software will point the evaluator to other software tools to perform the analyses needed to move the process forward. In developing the DSEF, the developers will be mindful to make it no more complex than necessary to evaluate the system being considered. The DSEF will organize and document the work such that multiple realizations for different combinations can be compared and contrasted.

  17. Performance evaluation tools for nuclear based interrogation techniques — an application of the PFNA technology

    NASA Astrophysics Data System (ADS)

    Feinstein, R. L.; Keeley, D. A.; Bendahan, J.

    1995-02-01

    To facilitate the design and tuning of the Pulsed Fast Neutron Analysis (PFNA) system, under development for non-intrusive inspection of large cargo containers, Science Applications International Corporation (SAIC) has developed and utilized a family of computational tools to encapsulate the essential physics and system characteristics and to serve as a framework for hardware and software trade-off studies. One such tool is the PFNASIM code, a physics based, end-to-end simulator of the entire PFNA technology that maps the atomic densities of any material container to the observed γ-ray counts in each detector. Another tool is the PFNA Performance Evaluation Tool (PFNAPET) that utilizes estimation theory and the output of PFNASIM to predict the minimum error in estimated atomic densities inside the container. These two codes are described and an example of performance evaluation on a cargo container is included.

  18. Should we have blind faith in bioinformatics software? Illustrations from the SNAP web-based tool.

    PubMed

    Robiou-du-Pont, Sébastien; Li, Aihua; Christie, Shanice; Sohani, Zahra N; Meyre, David

    2015-01-01

    Bioinformatics tools have gained popularity in biology but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina using chromosome and chromosomal positions of SNPs from NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing in the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (false negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any 'false positive' SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). The Cohen's Kappa coefficient, which calculates the percentage of agreement between both methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (the HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation. PMID:25742008
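
    The agreement computation described above is straightforward to reproduce; the sketch below uses synthetic availability calls in place of the real SNAP output and Illumina manifest.

    # Sketch of the validity check described above: compare a tool's
    # present/absent calls against the vendor manifest for the same SNPs
    # and compute Cohen's kappa. The two vectors are synthetic placeholders.
    import numpy as np
    from sklearn.metrics import cohen_kappa_score

    rng = np.random.default_rng(2)
    on_chip = rng.integers(0, 2, 415)               # ground truth: manifest
    tool_says = np.where(rng.random(415) < 0.25,    # tool output with errors
                         1 - on_chip, on_chip)

    kappa = cohen_kappa_score(on_chip, tool_says)
    false_neg = np.mean((on_chip == 1) & (tool_says == 0)) / np.mean(on_chip == 1)
    print(f"kappa = {kappa:.2f}, false-negative rate = {false_neg:.1%}")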

  19. Should We Have Blind Faith in Bioinformatics Software? Illustrations from the SNAP Web-Based Tool

    PubMed Central

    Robiou-du-Pont, Sébastien; Li, Aihua; Christie, Shanice; Sohani, Zahra N.; Meyre, David

    2015-01-01

    Bioinformatics tools have gained popularity in biology but little is known about their validity. We aimed to assess the early contribution of 415 single nucleotide polymorphisms (SNPs) associated with eight cardio-metabolic traits at the genome-wide significance level in adults in the Family Atherosclerosis Monitoring In earLY Life (FAMILY) birth cohort. We used the popular web-based tool SNAP to assess the availability of the 415 SNPs in the Illumina Cardio-Metabochip genotyped in the FAMILY study participants. We then compared the SNAP output with the Cardio-Metabochip file provided by Illumina using chromosome and chromosomal positions of SNPs from NCBI Human Genome Browser (Genome Reference Consortium Human Build 37). With the HapMap 3 release 2 reference, 201 out of 415 SNPs were reported as missing in the Cardio-Metabochip by the SNAP output. However, the Cardio-Metabochip file revealed that 152 of these 201 SNPs were in fact present in the Cardio-Metabochip array (false negative rate of 36.6%). With the more recent 1000 Genomes Project release, we found a false-negative rate of 17.6% by comparing the outputs of SNAP and the Illumina product file. We did not find any ‘false positive’ SNPs (SNPs specified as available in the Cardio-Metabochip by SNAP, but not by the Cardio-Metabochip Illumina file). The Cohen’s Kappa coefficient, which calculates the percentage of agreement between both methods, indicated that the validity of SNAP was fair to moderate depending on the reference used (the HapMap 3 or 1000 Genomes). In conclusion, we demonstrate that the SNAP outputs for the Cardio-Metabochip are invalid. This study illustrates the importance of systematically assessing the validity of bioinformatics tools in an independent manner. We propose a series of guidelines to improve practices in the fast-moving field of bioinformatics software implementation. PMID:25742008

  20. Performance analysis and optimization of an advanced pharmaceutical wastewater treatment plant through a visual basic software tool (PWWT.VB).

    PubMed

    Pal, Parimal; Thakura, Ritwik; Chakrabortty, Sankha

    2016-05-01

A user-friendly, menu-driven simulation software tool has been developed for the first time to optimize and analyze the system performance of an advanced continuous membrane-integrated pharmaceutical wastewater treatment plant. The software allows pre-analysis and manipulation of input data, which helps in optimization, and shows the software performance visually on a graphical platform. Moreover, the software helps the user to "visualize" the effects of the operating parameters through its model-predicted output profiles. The software is based on a dynamic mathematical model, developed for a systematically integrated forward osmosis-nanofiltration process for removal of toxic organic compounds from pharmaceutical wastewater. The model-predicted values corroborate well with extensive experimental investigations and were found to be consistent under varying operating conditions such as operating pressure, operating flow rate, and draw solute concentration. A low value of the relative error (RE = 0.09) and a high value of the Willmott index of agreement (d = 0.981) reflect a high degree of accuracy and reliability of the software. This software is likely to be a very efficient tool for system design or simulation of an advanced membrane-integrated treatment plant for hazardous wastewater. PMID:26856870
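
    Both quoted statistics are easy to recompute from paired observed/predicted values (synthetic numbers below); Willmott's index of agreement is d = 1 − Σ(P−O)² / Σ(|P−Ō| + |O−Ō|)².

    # The two model-validation statistics quoted above, computed from paired
    # observed/predicted series. The values are synthetic placeholders.
    import numpy as np

    def willmott_d(obs: np.ndarray, pred: np.ndarray) -> float:
        obar = obs.mean()
        return 1 - np.sum((pred - obs) ** 2) / np.sum(
            (np.abs(pred - obar) + np.abs(obs - obar)) ** 2)

    def relative_error(obs: np.ndarray, pred: np.ndarray) -> float:
        return float(np.mean(np.abs(pred - obs) / np.abs(obs)))

    obs = np.array([0.82, 0.90, 1.05, 1.20, 1.34])   # hypothetical flux data
    pred = np.array([0.80, 0.93, 1.02, 1.18, 1.37])
    print(f"d = {willmott_d(obs, pred):.3f}, RE = {relative_error(obs, pred):.3f}")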

  1. The Comprehensive Evaluation of Professional Development Software: A Review of the Literature.

    ERIC Educational Resources Information Center

    Liaupsin, Carl J.

    2003-01-01

    Discussion of the development of professional development software for special education teachers reviews the literature to (1) consolidate the database of literature regarding professional development software; (2) examine the degree to which the described software has been comprehensively evaluated; and (3) provide suggestions for future…

  2. A practical overview and comparison of certain commercial forensic software tools for processing large-scale digital investigations

    NASA Astrophysics Data System (ADS)

    Kröger, Knut; Creutzburg, Reiner

    2013-05-01

The aim of this paper is to show the usefulness of modern forensic software tools for processing large-scale digital investigations. In particular, we focus on the new version of Nuix 4.2 and compare it with AccessData FTK 4.2, X-Ways Forensics 16.9, and Guidance EnCase Forensic 7 regarding performance, functionality, usability, and capability. We show how these software tools work with large forensic images and how capable they are of examining complex and big data scenarios.

  3. Differences in the Educational Software Evaluation Process for Experts and Novice Students

    ERIC Educational Resources Information Center

    Tokmak, Hatice Sancar; Incikabi, Lutfi; Yelken, Tugba Yanpar

    2012-01-01

    This comparative case study investigated the educational software evaluation processes of both experts and novices in conjunction with a software evaluation checklist. Twenty novice elementary education students, divided into groups of five, and three experts participated. Each novice group and the three experts evaluated educational software…

  4. Understanding Expertise-Based Training Effects on the Software Evaluation Process of Mathematics Education Teachers

    ERIC Educational Resources Information Center

    Incikabi, Lutfi; Sancar Tokmak, Hatice

    2012-01-01

    This case study examined the educational software evaluation processes of pre-service teachers who attended either expertise-based training (XBT) or traditional training in conjunction with a Software-Evaluation checklist. Forty-three mathematics teacher candidates and three experts participated in the study. All participants evaluated educational…

  5. Diva software, a tool for European regional seas and Ocean climatologies production

    NASA Astrophysics Data System (ADS)

    Ouberdous, M.; Troupin, C.; Barth, A.; Alvera-Azcàrate, A.; Beckers, J.-M.

    2012-04-01

Diva (Data-Interpolating Variational Analysis) is a software tool designed to perform data-gridding (or analysis) tasks while taking into account the intrinsic nature of oceanographic data, i.e., the uncertainty of in situ measurements and the anisotropy due to advection, irregular coastlines, and topography. The Variational Inverse Method (VIM, Brasseur et al., 1996) implemented in Diva minimizes a variational principle that accounts for the differences between the observations and the reconstructed field and for the gradients and variability of the reconstructed field. The numerical problem is solved with a finite-element method, which allows great numerical efficiency and the treatment of complicated contours. Along with the analysis, Diva also provides error fields (Brankart and Brasseur, 1998; Rixen et al., 2000) based on the data coverage and noise. Diva is used for the production of climatologies in the pan-European network SeaDataNet, which connects the existing marine data centres of more than 30 countries and has set up a data management infrastructure consisting of a standardized distributed system. The consortium has elaborated integrated products using common procedures and methods, and it uses the Diva software as the reference tool for computing climatologies for various European regional seas, the Atlantic, and the global ocean. During the first phase of the SeaDataNet project, a number of additional tools were developed to make climatology production easier for users. Among these tools: the advection constraint during field reconstruction, through the specification of a velocity field on a regular grid, forcing the analysis to align with the velocity vectors; the Generalized Cross Validation for the determination of analysis parameters (signal-to-noise ratio); the creation of contours at selected depths; the detection of possible outliers; the

  6. Using Numerical Models in the Development of Software Tools for Risk Management of Accidents with Oil and Inert Spills

    NASA Astrophysics Data System (ADS)

    Fernandes, R.; Leitão, P. C.; Braunschweig, F.; Lourenço, F.; Galvão, P.; Neves, R.

    2012-04-01

useful in emergency situations. The backtracking modelling feature and the possibility of importing spill locations from remote servers with observed data (for example, from flight surveillance or remote sensing) allow potential application to the evaluation of possible contamination sources. The third tool developed is an innovative system to dynamically produce quantified risk levels in real time, integrating the best available information from numerical forecasts and existing monitoring tools. This system provides coastal pollution risk levels associated with potential (or real) oil spill incidents, taking into account regional statistical information on vessel accidents and coastal sensitivity indexes (determined in the EROCIPS project), real-time vessel information (positioning, cargo type, speed, and vessel type) obtained from AIS, the best available metocean numerical forecasts (hydrodynamics and meteorology, including visibility and wave conditions), and scenarios simulated by the oil spill fate and behaviour component of the MOHID Water Modelling System (www.mohid.com). Different spill fate and behaviour simulations are continuously generated and processed in the background (assuming hypothetical spills from vessels), based on varying vessel information and metocean conditions, and results from these simulations are used to quantify the consequences of potential spills. The Dynamic Risk Tool was not designed to replace conventional mapping tools, but to complement that type of information with an innovative approach to risk mapping. Taking advantage of interoperability between forecasting models, oil spill simulations, AIS monitoring systems, statistical data, and coastal vulnerability, this software can provide end-users with real-time risk levels, allowing an innovative approach to risk mapping and providing decision-makers with an improved decision support model and intelligent risk-based traffic monitoring. For instance, this tool allows the prioritisation of individual

  7. Evaluation of Two Types of Online Help for Application Software.

    ERIC Educational Resources Information Center

    Dutke, Stephan; Reimer, T.

    2000-01-01

    Discusses online help systems in application software design and describes two experiments in which adult computer novices learned to use experimental graphics software by task-based exploration. Topics include operative help; function-oriented help; effects on learning performance; schemata; mental models; and implications for the design of…

  8. Evaluation: New Tools for New Tasks.

    ERIC Educational Resources Information Center

    Bunce-Crim, Marna

    1992-01-01

    Presents workable strategies for observing, tracking, and reporting student progress in the elementary writing classroom. The article offers tips and techniques for ongoing evaluation (observing writers as they write, conferencing, and student self-evaluation). It also includes a teacher-guided tour of a real Vermont student's writing portfolio.…

  9. A simulator tool set for evaluating HEVC/SHVC streaming

    NASA Astrophysics Data System (ADS)

    Al Hadhrami, Tawfik; Nightingale, James; Wang, Qi; Grecos, Christos; Kehtarnavaz, Nasser

    2015-02-01

    Video streaming and other multimedia applications account for an ever-increasing proportion of all network traffic. The recent adoption of High Efficiency Video Coding (HEVC) as the H.265 standard provides many opportunities for new and improved multimedia services and applications in the consumer domain. Since the delivery of version one of H.265, the Joint Collaborative Team on Video Coding has been working towards standardisation of a scalable extension (SHVC) to the H.265 standard and a series of range extensions and new profiles. As these enhancements are added to the standard, the range of potential applications and research opportunities will expand. For example, the use of video is also growing rapidly in other sectors such as safety, security, defence and health, with real-time high-quality video transmission playing an important role in areas like critical infrastructure monitoring and disaster management, each of which may benefit from the application of enhanced HEVC/H.265 and SHVC capabilities. The majority of existing research into HEVC/H.265 transmission has focussed on the consumer domain, addressing issues such as broadcast transmission and delivery to mobile devices, with the lack of freely available tools widely cited as an obstacle to conducting this type of research. In this paper we present a toolset which facilitates the transmission and evaluation of HEVC/H.265 and SHVC encoded video on the popular open-source NCTUns simulator. Our toolset provides researchers with a modular, easy-to-use platform for evaluating video transmission and adaptation proposals on large-scale wired, wireless and hybrid architectures. The toolset consists of pre-processing, transmission, SHVC adaptation and post-processing tools to gather and analyse statistics. It has been implemented using HM15 and SHM5, the latest versions of the HEVC and SHVC reference software implementations, to ensure that currently adopted proposals for scalable and range extensions to
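
    To make the SHVC adaptation step concrete, here is a minimal Python sketch of layer dropping on an HEVC/SHVC bitstream; the function names and the adaptation policy are our assumptions, not the toolset's actual code (only the 2-byte NAL unit header layout is taken from the H.265 specification):

        # Assumed illustration of SHVC adaptation by dropping enhancement layers.

        def nal_layer_id(nal: bytes) -> int:
            """Extract nuh_layer_id from the 2-byte HEVC NAL unit header."""
            return ((nal[0] & 0x01) << 5) | (nal[1] >> 3)

        def adapt_stream(nal_units, max_layer_id):
            """Keep only NAL units up to max_layer_id, e.g. when the simulated
            link bandwidth falls below the full-stream bitrate."""
            return [nal for nal in nal_units if nal_layer_id(nal) <= max_layer_id]

        # nal_units would come from splitting an Annex-B bitstream on start
        # codes (0x000001 / 0x00000001); that parsing is omitted for brevity.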

  10. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation that is first simulated digitally. In a similar way, it is thought that the other system components, and indeed the whole system itself, could be used effectively in a simulation environment.
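
    The variable range analysis underlying such a dynamic analyzer can be illustrated with a short Python sketch: run the digital simulation first, record each variable's extremes, and derive analog scale factors so that every scaled variable stays within the machine reference. The function and values below are hypothetical, not from the paper:

        # Hypothetical sketch of variable range analysis for analog scaling.

        def scale_factors(traces, reference=10.0):
            """traces maps variable name -> simulated values (assumed nonzero
            range); returns machine-units-per-problem-unit scale factors so
            each variable peaks at the analog reference voltage."""
            return {name: reference / max(abs(min(vals)), abs(max(vals)))
                    for name, vals in traces.items()}

        print(scale_factors({"velocity": [-3.2, 7.5], "altitude": [0.0, 1250.0]}))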

  11. Software comparison for evaluating genomic copy number variation for Affymetrix 6.0 SNP array platform

    PubMed Central

    2011-01-01

    Background Copy number data are routinely being extracted from genome-wide association study chips using a variety of software. We empirically evaluated and compared four freely-available software packages designed for Affymetrix SNP chips to estimate copy number: Affymetrix Power Tools (APT), Aroma.Affymetrix, PennCNV and CRLMM. Our evaluation used 1,418 GENOA samples that were genotyped on the Affymetrix Genome-Wide Human SNP Array 6.0. We compared bias and variance in the locus-level copy number data, the concordance amongst regions of copy number gains/deletions and the false-positive rate amongst deleted segments. Results APT had median locus-level copy numbers closest to a value of two, whereas PennCNV and Aroma.Affymetrix had the smallest variability associated with the median copy number. Of those evaluated, only PennCNV provides copy number specific quality-control metrics and identified 136 poor CNV samples. Regions of copy number variation (CNV) were detected using the hidden Markov models provided within PennCNV and CRLMM/VanillaIce. PennCNV detected more CNVs than CRLMM/VanillaIce; the median number of CNVs detected per sample was 39 and 30, respectively. PennCNV detected most of the regions that CRLMM/VanillaIce did as well as additional CNV regions. The median concordance between PennCNV and CRLMM/VanillaIce was 47.9% for duplications and 51.5% for deletions. The estimated false-positive rate associated with deletions was similar for PennCNV and CRLMM/VanillaIce. Conclusions If the objective is to perform statistical tests on the locus-level copy number data, our empirical results suggest that PennCNV or Aroma.Affymetrix is optimal. If the objective is to perform statistical tests on the summarized segmented data then PennCNV would be preferred over CRLMM/VanillaIce. Specifically, PennCNV allows the analyst to estimate locus-level copy number, perform segmentation and evaluate CNV-specific quality-control metrics within a single software package
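
    As an illustration of the kind of concordance computation reported above (our sketch, not the authors' code; the interval layout and threshold-free overlap rule are assumptions), one can measure the fraction of one package's CNV calls that overlap a call of the same type from the other package:

        # Hypothetical sketch of CNV call concordance between two packages.

        def overlaps(a, b):
            """Intervals as (chrom, start, end); True if they share any bases."""
            return a[0] == b[0] and a[1] < b[2] and b[1] < a[2]

        def concordance(calls_a, calls_b):
            """Fraction of calls_a overlapped by at least one call in calls_b."""
            if not calls_a:
                return 0.0
            hits = sum(any(overlaps(a, b) for b in calls_b) for a in calls_a)
            return hits / len(calls_a)

        penncnv_dels = [("chr1", 1000, 5000), ("chr2", 200, 900)]
        crlmm_dels = [("chr1", 1200, 4800)]
        print(concordance(penncnv_dels, crlmm_dels))  # 0.5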

  12. Evaluation and selection of open-source EMR software packages based on integrated AHP and TOPSIS.

    PubMed

    Zaidan, A A; Zaidan, B B; Al-Haiqi, Ahmed; Kiah, M L M; Hussain, Muzammil; Abdulnabi, Mohamed

    2015-02-01

    Evaluating and selecting software packages that meet the requirements of an organization are difficult aspects of the software engineering process. Selecting the wrong open-source EMR software package can be costly and may adversely affect business processes and the functioning of the organization. This study aims to evaluate and select open-source EMR software packages based on multi-criteria decision-making. A hands-on study was performed in which a set of open-source EMR software packages were implemented locally on separate virtual machines to examine the systems more closely. Several measures were specified as the basis of evaluation, and the systems were selected based on a set of metric outcomes using the integrated Analytic Hierarchy Process (AHP) and TOPSIS method. The experimental results showed that GNUmed and OpenEMR provide a better basis, in terms of ranking scores, than the other open-source EMR software packages. PMID:25483886
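
    The TOPSIS step can be sketched in a few lines of Python; the decision matrix, criteria and weights below are invented for illustration (in the study the weights would come from the AHP stage), so this is a sketch of the method, not the authors' implementation:

        # Minimal TOPSIS sketch; weights assumed to be derived via AHP.
        import numpy as np

        def topsis(matrix, weights, benefit):
            """matrix: alternatives x criteria; weights sum to 1;
            benefit[j] is True when larger values of criterion j are better."""
            m = matrix / np.linalg.norm(matrix, axis=0)  # vector normalization
            v = m * weights                              # weighted normalized matrix
            ideal = np.where(benefit, v.max(axis=0), v.min(axis=0))
            anti = np.where(benefit, v.min(axis=0), v.max(axis=0))
            d_best = np.linalg.norm(v - ideal, axis=1)
            d_worst = np.linalg.norm(v - anti, axis=1)
            return d_worst / (d_best + d_worst)          # closeness: higher is better

        # Three hypothetical EMR packages scored on usability, security, cost.
        scores = np.array([[7.0, 8.0, 3.0],
                           [6.0, 9.0, 2.0],
                           [8.0, 6.0, 4.0]])
        weights = np.array([0.5, 0.3, 0.2])
        print(topsis(scores, weights, benefit=np.array([True, True, False])))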

  13. Developing tools for digital radar image data evaluation

    NASA Technical Reports Server (NTRS)

    Domik, G.; Leberl, F.; Raggam, J.

    1986-01-01

    The refinement of radar image analysis methods has led to a need for a systems approach to radar image processing software. Developments stimulated by satellite radar are combined with standard image processing techniques to create a user environment for manipulating and analyzing airborne and satellite radar images. One aim is to create radar products from the original data that make the image content easier for the user to understand. The results, called secondary image products, are derived from the original digital images. Another aim is to support interactive SAR image analysis. The software permits use of a digital height model to create ortho images, synthetic images, stereo-ortho images, radar maps or color combinations of different component products. Efforts are ongoing to integrate the individual tools into a combined hardware/software environment for interactive radar image analysis.

  14. Machine tool evaluation and machining operation development

    SciTech Connect

    Morris, T.O.; Kegg, R.

    1997-03-15

    The purpose of this CRADA was to support Cincinnati Milacron's needs in fabricating precision components from difficult-to-machine materials, while maintaining and enhancing the precision manufacturing skills of the Oak Ridge Complex. Oak Ridge and Cincinnati Milacron personnel worked in a team relationship wherein each contributed equally to the success of the program. Process characterization, control technologies, machine tool capabilities, and environmental issues were the primary focus areas. In general, Oak Ridge contributed a wide range of expertise in machine tool testing and monitoring and in environmental testing of machining fluids, while Cincinnati Milacron personnel provided equipment, operations-specific knowledge and shop-floor services for each task. Cincinnati Milacron was very pleased with the results of all of the CRADA tasks. However, some of the environmental tasks were not carried through to completion because the perceived needs expanded as the work progressed, and this expansion of the desired goals exceeded the duration of the CRADA. Discussions are underway on continuing these tasks under either a Work for Others agreement or some alternate funding.

  15. Performance evaluation of automated segmentation software on optical coherence tomography volume data.

    PubMed

    Tian, Jing; Varga, Boglarka; Tatrai, Erika; Fanni, Palya; Somfai, Gabor Mark; Smiddy, William E; Debuc, Delia Cabrera

    2016-05-01

    Over the past two decades a significant number of OCT segmentation approaches have been proposed in the literature. Each methodology has been conceived for and/or evaluated using specific datasets that do not reflect the complexities of the majority of widely available retinal features observed in clinical settings. In addition, there is no appropriate OCT dataset with ground truth that reflects the realities of everyday retinal features observed in clinical settings. While the need for unbiased performance evaluation of automated segmentation algorithms is obvious, validation of segmentation algorithms has usually been performed by comparison with manual labelings from each individual study, and there has been a lack of common ground truth. Therefore, a performance comparison of different algorithms using the same ground truth has never been performed. This paper reviews research-oriented tools for automated segmentation of the retinal tissue on OCT images. It also evaluates and compares the performance of these software tools against a common ground truth. PMID:27159849
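
    A common way to score a segmentation algorithm against such a ground truth is the mean absolute boundary error per retinal layer; the short Python sketch below illustrates the idea (the axial resolution and data are assumed values, and this is not the paper's evaluation code):

        # Hypothetical sketch of a layer-boundary error metric for OCT.
        import numpy as np

        def mean_boundary_error(pred, truth, um_per_pixel=3.9):
            """pred, truth: per-A-scan boundary depths (pixels) for one layer;
            um_per_pixel is an assumed axial sampling resolution."""
            pred, truth = np.asarray(pred, float), np.asarray(truth, float)
            return float(np.mean(np.abs(pred - truth))) * um_per_pixel

        print(mean_boundary_error([10, 11, 12, 13], [10, 12, 12, 15]))  # micrometers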

  16. ASSESS (Analytic System and Software for Evaluating Safeguards and Security) update: Current status and future developments

    SciTech Connect

    Al-Ayat, R.A. ); Cousins, T.D. ); Hoover, E.R. )

    1990-07-15

    The Analytic System and Software for Evaluating Safeguards and Security (ASSESS) has been released for use by DOE field offices and their contractors. In October 1989, we offered a prototype workshop to selected representatives of the DOE community. Based on the prototype results, we held the first training workshop at the Central Training Academy in January 1990. Four additional workshops are scheduled for FY 1990. ASSESS is a state-of-the-art analytical tool that allows management to conduct integrated evaluations of safeguards systems at facilities handling special nuclear material. Currently, ASSESS focuses on the threat of theft/diversion of special nuclear material by insiders, outsiders, and a special form of insider/outsider collusion. ASSESS also includes a neutralization module. Development of the tool is continuing. Plans are underway to expand the capabilities of ASSESS to evaluate against violent insiders, to validate the databases, to expand the neutralization module, and to assist in demonstrating compliance with DOE Material Control and Accountability (MC&A) Order 5633.3. These new capabilities include the ability to: compute a weighted average of performance capability against a spectrum of insider adversaries; conduct defense-in-depth analyses; and analyze against protracted theft scenarios. As they become available, these capabilities will be incorporated in our training program. ASSESS is being developed jointly by Lawrence Livermore and Sandia National Laboratories under the sponsorship of the Department of Energy (DOE) Office of Safeguards and Security.
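
    The first of those planned capabilities can be illustrated with a small Python sketch; the adversary categories, likelihood weights and performance values are invented here and are not taken from ASSESS:

        # Hypothetical sketch: weighted-average performance capability across
        # a spectrum of insider adversaries.
        adversaries = {
            # category: (likelihood weight, performance capability against it)
            "passive insider": (0.5, 0.95),
            "active insider":  (0.3, 0.80),
            "violent insider": (0.2, 0.60),
        }

        weighted = sum(w * p for w, p in adversaries.values())
        total_w = sum(w for w, _ in adversaries.values())
        print(weighted / total_w)  # overall capability on a 0..1 scale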

  17. Selecting Software.

    ERIC Educational Resources Information Center

    Pereus, Steven C.

    2002-01-01

    Describes a comprehensive computer software selection and evaluation process, including documenting district needs, evaluating software packages, weighing the alternatives, and making the purchase. (PKP)

  18. Data-Driven Decision Making as a Tool to Improve Software Development Productivity

    ERIC Educational Resources Information Center

    Brown, Mary Erin

    2013-01-01

    The worldwide software project failure rate, based on a survey of information technology software managers' views of user satisfaction, product quality, and staff productivity, is estimated to be between 24% and 36%, and software project success has not kept pace with the advances in hardware. The problem addressed by this study was the limited…

  19. Students' Learning Experiences When Using a Dynamic Geometry Software Tool in a Geometry Lesson at Secondary School in Ethiopia

    ERIC Educational Resources Information Center

    Denbel, Dejene Girma

    2015-01-01

    Students' learning experiences when using a Dynamic Geometry Software (DGS) tool in geometry lessons were investigated with 25 Ethiopian secondary school students. The research data were drawn from the students' worksheets, classroom observations, the results of a pre- and post-test, a questionnaire and interview responses. I used GeoGebra as a DGS…

  20. EPA's science blog: "It All Starts with Science"; Article title: "EPA's Solvent Substitution Software Tool, PARIS III"

    EPA Science Inventory

    EPA's solvent substitution software tool, PARIS III, is provided by the EPA for free and can be used effectively and efficiently to help environmentally conscious individuals find better and greener solvent mixtures for many common industrial processes. People can downlo...