Sample records for applications software quality

  1. Software Acquisition: Evolution, Total Quality Management, and Applications to the Army Tactical Missile System

    DTIC Science & Technology

    1992-06-01

    presents the concept of software Total Quality Management (TQM), which focuses on the entire process of software acquisition, as a partial solution to...software TQM can be applied to software acquisition. Keywords: Software Development, Software Acquisition, Total Quality Management (TQM), Army Tactical Missile

  2. Quality measures and assurance for AI (Artificial Intelligence) software

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    This report is concerned with the application of software quality and evaluation measures to AI software and, more broadly, with the question of quality assurance for AI software. Considered are not only the metrics that attempt to measure some aspect of software quality, but also the methodologies and techniques (such as systematic testing) that attempt to improve some dimension of quality, without necessarily quantifying the extent of the improvement. The report is divided into three parts. Part 1 reviews existing software quality measures, i.e., those that have been developed for, and applied to, conventional software. Part 2 considers the characteristics of AI software, the applicability and potential utility of measures and techniques identified in the first part, and reviews those few methods developed specifically for AI software. Part 3 presents an assessment and recommendations for the further exploration of this important area.

  3. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies (ICT) in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT-purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005 to 2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software, and improve use outcomes. The ISO/IEC 25000 standard is shown as the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition.

  4. The new meaning of quality in the information age.

    PubMed

    Prahalad, C K; Krishnan, M S

    1999-01-01

    Software applications are now a mission-critical source of competitive advantage for most companies. They are also a source of great risk, as the Y2K bug has made clear. Yet many line managers still haven't confronted software issues--partly because they aren't sure how best to define the quality of the applications in their IT infrastructures. Some companies such as Wal-Mart and the Gap have successfully integrated the software in their networks, but most have accumulated an unwieldy number of incompatible applications--all designed to perform the same tasks. The authors provide a framework for measuring the performance of software in a company's IT portfolio. Quality traditionally has been measured according to a product's ability to meet certain specifications; other views of quality have emerged that measure a product's adaptability to customers' needs and a product's ability to encourage innovation. To judge software quality properly, argue the authors, managers must measure applications against all three approaches. Understanding the domain of a software application is an important part of that process. The domain is the body of knowledge about a user's needs and expectations for a product. Software domains change frequently based on how a consumer chooses to use, for example, Microsoft Word or a spreadsheet application. The domain can also be influenced by general changes in technology, such as the development of a new software platform. Thus, applications can't be judged only according to whether they conform to specifications. The authors discuss how to identify domain characteristics and software risks and suggest ways to reduce the variability of software domains.

  5. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  6. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIRONMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  7. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  8. Operational excellence (six sigma) philosophy: Application to software quality assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on operational excellence philosophy of six sigma applied to software quality assurance. This report outlines the following: goal of six sigma; six sigma tools; manufacturing vs administrative processes; Software quality assurance document inspections; map software quality assurance requirements document; failure mode effects analysis for requirements document; measuring the right response variables; and questions.

  9. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594

  10. Development and application of new quality model for software projects.

    PubMed

    Karnavel, K; Dillibabu, R

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects.

  11. A Strategy for Improved System Assurance

    DTIC Science & Technology

    2007-06-20

    Quality (Measurements, Life Cycle, Safety, Security & Others); ISO/IEC 12207 Software Life Cycle Processes; ISO 9001 Quality Management System...14598 Software Product Evaluation; Related: ISO/IEC 90003 Guidelines for the Application of ISO 9001:2000 to Computer Software; IEEE 12207 Industry...Implementation of International Standard ISO/IEC 12207; IEEE 1220 Standard for Application and Management of the Systems Engineering Process; Use in

  12. [Quality assurance of the renal applications software].

    PubMed

    del Real Núñez, R; Contreras Puertas, P I; Moreno Ortega, E; Mena Bares, L M; Maza Muret, F R; Latre Romero, J M

    2007-01-01

    The need for quality assurance of all technical aspects of nuclear medicine studies is widely recognised. However, little attention has been paid to the quality assurance of the applications software. Our work reported here aims at verifying the analysis software for processing of renal nuclear medicine studies (renograms). The software tools were used to build a synthetic dynamic model of the renal system. The model consists of two phases: perfusion and function. The organs of interest (kidneys, bladder and aortic artery) were simple geometric forms. The uptake of the renal structures was described by mathematical functions. Curves corresponding to normal or pathological conditions were simulated for kidneys, bladder and aortic artery by appropriate selection of parameters. There was no difference between the parameters of the mathematical curves and the quantitative data produced by the renal analysis program. Our test procedure is simple to apply, reliable, reproducible and rapid for verifying the renal applications software.
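
    The verification idea described here (an analytic phantom whose curves are known exactly) can be illustrated with a short numerical sketch. The following Python fragment is not the authors' code; the gamma-variate curve shape and all parameter values are assumptions chosen only to show how known synthetic time-activity curves can be compared against the output of a renogram analysis program.

    ```python
    import numpy as np

    def gamma_variate(t, A, t0, alpha, beta):
        """Illustrative uptake curve: zero before t0, gamma-variate afterwards."""
        dt = np.clip(np.asarray(t, dtype=float) - t0, 0.0, None)
        return A * dt**alpha * np.exp(-dt / beta)

    t = np.arange(0, 1200, 1.0)  # one-second frames over a 20-minute acquisition (assumed)

    # Hypothetical "ground truth" curves for the simple geometric phantom.
    aorta  = gamma_variate(t, A=5.0,  t0=5.0,  alpha=2.0, beta=6.0)    # perfusion bolus
    kidney = gamma_variate(t, A=0.02, t0=10.0, alpha=2.5, beta=180.0)  # uptake and washout
    bladder = np.cumsum(np.clip(np.gradient(-kidney), 0.0, None))      # crude accumulation

    # A renogram program fed images built from these curves should reproduce,
    # for example, the kidney time-to-peak within an agreed tolerance.
    t_peak_true = t[np.argmax(kidney)]
    t_peak_measured = ...  # value reported by the analysis software under test
    # assert abs(t_peak_measured - t_peak_true) < 5.0  # seconds, assumed tolerance
    ```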

  13. QuickEval: a web application for psychometric scaling experiments

    NASA Astrophysics Data System (ADS)

    Van Ngo, Khai; Storvik, Jehans J.; Dokkeberg, Christopher A.; Farup, Ivar; Pedersen, Marius

    2015-01-01

    QuickEval is a web application for carrying out psychometric scaling experiments. It offers the possibility of running controlled experiments in a laboratory, or large-scale experiments over the web for people all over the world. It is, to the best of our knowledge, the first software in the image quality field that supports the three most common scaling methods: paired comparison, rank order, and category judgement. Hopefully, a side effect of this newly created software is that it will lower the threshold to perform psychometric experiments, improve the quality of the experiments being carried out, make it easier to reproduce experiments, and increase research on image quality both in academia and industry. The web application is available at www.colourlab.no/quickeval.
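
    As a minimal sketch of what a paired-comparison experiment ultimately feeds into, the following Python fragment computes Thurstone Case V scale values from a win-count matrix. It is not part of QuickEval; the data, the Case V model, and the handling of the diagonal are assumptions made only for illustration.

    ```python
    import numpy as np
    from scipy.stats import norm

    # counts[i, j] = number of observers who preferred stimulus i over stimulus j (hypothetical)
    counts = np.array([[ 0, 14,  9],
                       [ 6,  0,  5],
                       [11, 15,  0]], dtype=float)

    n_obs = counts + counts.T                 # comparisons per pair
    with np.errstate(invalid="ignore"):
        p = counts / n_obs                    # preference proportions (NaN on the diagonal)
    p = np.clip(np.nan_to_num(p, nan=0.5), 0.01, 0.99)  # avoid infinite z-scores

    z = norm.ppf(p)                           # Case V: proportions to z-scores
    scale = z.mean(axis=1)                    # scale value = row mean of z-scores
    print(scale - scale.min())                # shift so the lowest stimulus sits at 0
    ```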

  14. xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina

    Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.

  15. xSDK Foundations: Toward an Extreme-scale Scientific Software Development Kit

    DOE PAGES

    Heroux, Michael A.; Bartlett, Roscoe; Demeshko, Irina; ...

    2017-03-01

    Here, extreme-scale computational science increasingly demands multiscale and multiphysics formulations. Combining software developed by independent groups is imperative: no single team has resources for all predictive science and decision support capabilities. Scientific libraries provide high-quality, reusable software components for constructing applications with improved robustness and portability. However, without coordination, many libraries cannot be easily composed. Namespace collisions, inconsistent arguments, lack of third-party software versioning, and additional difficulties make composition costly. The Extreme-scale Scientific Software Development Kit (xSDK) defines community policies to improve code quality and compatibility across independently developed packages (hypre, PETSc, SuperLU, Trilinos, and Alquimia) and provides a foundation for addressing broader issues in software interoperability, performance portability, and sustainability. The xSDK provides turnkey installation of member software and seamless combination of aggregate capabilities, and it marks first steps toward extreme-scale scientific software ecosystems from which future applications can be composed rapidly with assured quality and scalability.

  16. Design and implementation of software for automated quality control and data analysis for a complex LC/MS/MS assay for urine opiates and metabolites.

    PubMed

    Dickerson, Jane A; Schmeling, Michael; Hoofnagle, Andrew N; Hoffman, Noah G

    2013-01-16

    Mass spectrometry provides a powerful platform for performing quantitative, multiplexed assays in the clinical laboratory, but at the cost of increased complexity of analysis and quality assurance calculations compared to other methodologies. Here we describe the design and implementation of a software application that performs quality control calculations for a complex, multiplexed, mass spectrometric analysis of opioids and opioid metabolites. The development and implementation of this application improved our data analysis and quality assurance processes in several ways. First, use of the software significantly improved the procedural consistency for performing quality control calculations. Second, it reduced the amount of time technologists spent preparing and reviewing the data, saving on average over four hours per run, and in some cases improving turnaround time by a day. Third, it provides a mechanism for coupling procedural and software changes with the results of each analysis. We describe several key details of the implementation including the use of version control software and automated unit tests. These generally useful software engineering principles should be considered for any software development project in the clinical lab. Copyright © 2012 Elsevier B.V. All rights reserved.
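
    The abstract singles out automated unit tests as a generally useful engineering practice. As a loose illustration, not the authors' code, a unit test for one hypothetical QC calculation might look like the sketch below; the ion-ratio rule, the 20% tolerance, and the function name are all invented for the example.

    ```python
    # test_qc.py -- run with `pytest`; the QC rule and tolerances are hypothetical.

    def ion_ratio_within_tolerance(qualifier_area, quantifier_area,
                                   expected_ratio, tolerance=0.20):
        """Accept a result only if the qualifier/quantifier ion ratio is within tolerance."""
        if quantifier_area <= 0:
            return False
        ratio = qualifier_area / quantifier_area
        return abs(ratio - expected_ratio) <= tolerance * expected_ratio

    def test_ratio_accepts_nominal_value():
        assert ion_ratio_within_tolerance(5100, 10000, expected_ratio=0.50)

    def test_ratio_rejects_large_deviation():
        assert not ion_ratio_within_tolerance(8000, 10000, expected_ratio=0.50)

    def test_zero_quantifier_is_rejected():
        assert not ion_ratio_within_tolerance(100, 0, expected_ratio=0.50)
    ```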

  17. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.

  18. Development of innovative computer software to facilitate the setup and computation of water quality index.

    PubMed

    Nabizadeh, Ramin; Valadi Amin, Maryam; Alimohammadi, Mahmood; Naddafi, Kazem; Mahvi, Amir Hossein; Yousefzadeh, Samira

    2013-04-26

    Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases.
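
    A minimal sketch of the idea behind dynamic weight factors, not IWQIS itself: when a parameter is missing from the dataset, its weight is dropped and the remaining weights are renormalised before the weighted sum is taken. The parameter names, weights, and sub-index scores below are invented.

    ```python
    def water_quality_index(sub_indices, weights):
        """Weighted-sum WQI that renormalises weights over the available parameters.

        sub_indices: dict of parameter -> sub-index score (0-100), None if missing
        weights:     dict of parameter -> nominal weight
        """
        available = {p: s for p, s in sub_indices.items() if s is not None}
        if not available:
            raise ValueError("no parameters available")
        total_w = sum(weights[p] for p in available)
        return sum(weights[p] / total_w * s for p, s in available.items())

    # Hypothetical sample with turbidity missing from the dataset.
    scores  = {"pH": 85, "turbidity": None, "nitrate": 70, "coliforms": 90}
    weights = {"pH": 0.20, "turbidity": 0.30, "nitrate": 0.25, "coliforms": 0.25}
    print(round(water_quality_index(scores, weights), 1))  # index over the three available parameters
    ```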

  19. Open Source Molecular Modeling

    PubMed Central

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-01-01

    The success of molecular modeling and computational chemistry efforts are, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. PMID:27631126

  20. Quality in End User Documentation.

    ERIC Educational Resources Information Center

    Morrison, Ronald

    1994-01-01

    Discusses quality in end-user documentation for computer applications and explains four approaches to improving quality in end-user documents. Highlights include online help, usability testing, technical writing elements, statistical approaches, and concepts relating to software quality that are also applicable to user manuals. (LRW)

  1. Integrating automated support for a software management cycle into the TAME system

    NASA Technical Reports Server (NTRS)

    Sunazuka, Toshihiko; Basili, Victor R.

    1989-01-01

    Software managers are interested in the quantitative management of software quality, cost and progress. An integrated software management methodology, which can be applied throughout the software life cycle for any number of purposes, is required. The TAME (Tailoring A Measurement Environment) methodology is based on the improvement paradigm and the goal/question/metric (GQM) paradigm. This methodology helps generate a software engineering process and measurement environment based on the project characteristics. The SQMAR (software quality measurement and assurance technology) is a software quality metric system and methodology applied to the development processes. It is based on the feed-forward control principle. Quality target setting is carried out before the plan-do-check-action activities are performed. These methodologies are integrated to realize goal oriented measurement, process control and visual management. A metric setting procedure based on the GQM paradigm, a management system called the software management cycle (SMC), and its application to a case study based on NASA/SEL data are discussed. The expected effects of SMC are quality improvement, managerial cost reduction, accumulation and reuse of experience, and a highly visual management reporting system.
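
    As a rough sketch of the goal/question/metric structure referred to here, not the TAME or SQMAR implementation, a GQM tree can be represented as a small nested data structure from which a metric collection plan is derived; the goal, questions, and metrics below are invented examples.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class Metric:
        name: str
        unit: str

    @dataclass
    class Question:
        text: str
        metrics: list = field(default_factory=list)

    @dataclass
    class Goal:
        purpose: str      # e.g. "improve"
        issue: str        # e.g. "reliability"
        obj: str          # e.g. "ground software"
        viewpoint: str    # e.g. "project manager"
        questions: list = field(default_factory=list)

    goal = Goal("improve", "reliability", "ground software", "project manager",
                questions=[
                    Question("What is the current defect density?",
                             [Metric("defects per KLOC", "1/KLOC")]),
                    Question("In which life-cycle phase are defects introduced?",
                             [Metric("defects by phase", "count")]),
                ])

    # Flatten the tree into a measurement plan.
    for q in goal.questions:
        for m in q.metrics:
            print(f"{goal.issue} / {q.text} -> collect {m.name} [{m.unit}]")
    ```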

  2. Development of innovative computer software to facilitate the setup and computation of water quality index

    PubMed Central

    2013-01-01

    Background Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. Findings In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. Conclusion A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases. PMID:24499556

  3. Evaluating Predictive Models of Software Quality

    NASA Astrophysics Data System (ADS)

    Ciaschini, V.; Canaparo, M.; Ronchieri, E.; Salomoni, D.

    2014-06-01

    Applications from the High Energy Physics scientific community are constantly growing and implemented by a large number of developers. This implies a strong churn on the code and an associated risk of faults, which is unavoidable as long as the software undergoes active evolution. However, the necessities of production systems run counter to this. Stability and predictability are of paramount importance; in addition, a short turn-around time for the defect discovery-correction-deployment cycle is required. A way to reconcile these opposite foci is to use a software quality model to obtain an approximation of the risk before releasing a program, so as to only deliver software with a risk lower than an agreed threshold. In this article we evaluated two quality predictive models to identify the operational risk and the quality of some software products. We applied these models to the development history of several EMI packages with the intent of discovering the risk factor of each product and comparing it with its real history. We attempted to determine whether the models reasonably map reality for the applications under evaluation, and we conclude by suggesting directions for further studies.
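
    A loose illustration of the release rule described, where software is only delivered when its predicted risk falls below an agreed threshold. The sketch below is not the EMI models; the predictor, the per-release features, and the threshold are all assumptions.

    ```python
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical per-release features: [churned lines, number of authors, open defects]
    X_train = np.array([[120, 2, 1], [900, 7, 9], [300, 3, 2], [1500, 9, 14], [60, 1, 0]])
    y_train = np.array([0, 1, 0, 1, 0])            # 1 = release later judged faulty

    model = LogisticRegression().fit(X_train, y_train)

    candidate = np.array([[400, 4, 3]])            # release under evaluation
    risk = model.predict_proba(candidate)[0, 1]    # estimated probability of faults

    RISK_THRESHOLD = 0.3                           # agreed threshold (assumed)
    decision = "release" if risk < RISK_THRESHOLD else "hold for more testing"
    print(f"risk={risk:.2f} -> {decision}")
    ```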

  4. Proceedings of the Seventeenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Proceedings of the Seventeenth Annual Software Engineering Workshop are presented. The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/Goddard Space Flight Center and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. Topics covered include: the Software Engineering Laboratory; process measurement; software reuse; software quality; lessons learned; and the question of whether Ada is dying.

  5. Investigating the application of AOP methodology in development of Financial Accounting Software using Eclipse-AJDT Environment

    NASA Astrophysics Data System (ADS)

    Sharma, Amita; Sarangdevot, S. S.

    2010-11-01

    Aspect-Oriented Programming (AOP) methodology has been investigated in the development of real world business application software—Financial Accounting Software. The Eclipse-AJDT environment has been used as open source enhanced IDE support for programming in the AOP language AspectJ. Crosscutting concerns have been identified and modularized as aspects. This reduces the complexity of the design considerably due to elimination of code scattering and tangling. Improvement in modularity, quality and performance is achieved. The study concludes that AOP methodology in the Eclipse-AJDT environment offers powerful support for modular design and implementation of real world quality business software.

  6. SQA of finite element method (FEM) codes used for analyses of pit storage/transport packages

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Russel, E.

    1997-11-01

    This report contains viewgraphs on the software quality assurance of finite element method codes used for analyses of pit storage and transport projects. This methodology utilizes ISO 9000-3, Guidelines for the application of ISO 9001 to the development, supply, and maintenance of software, to establish well-defined software engineering processes that consistently maintain high-quality management approaches.

  7. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.

  8. Open source molecular modeling.

    PubMed

    Pirhadi, Somayeh; Sunseri, Jocelyn; Koes, David Ryan

    2016-09-01

    The success of molecular modeling and computational chemistry efforts are, by definition, dependent on quality software applications. Open source software development provides many advantages to users of modeling applications, not the least of which is that the software is free and completely extendable. In this review we categorize, enumerate, and describe available open source software packages for molecular modeling and computational chemistry. An updated online version of this catalog can be found at https://opensourcemolecularmodeling.github.io. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  9. Preface to QoIS 2009

    NASA Astrophysics Data System (ADS)

    Comyn-Wattiau, Isabelle; Thalheim, Bernhard

    Quality assurance is a growing research domain within the Information Systems (IS) and Conceptual Modeling (CM) disciplines. Ongoing research on quality in IS and CM is highly diverse and encompasses theoretical aspects including quality definition and quality models, and practical/empirical aspects such as the development of methods, approaches and tools for quality measurement and improvement. Current research on quality also includes quality characteristics definitions, validation instruments, methodological and development approaches to quality assurance during software and information systems development, quality monitors, quality assurance during information systems development processes and practices, quality assurance both for data and (meta)schemata, quality support for information systems data import and export, quality of query answering, and cost/benefit analysis of quality assurance processes. Quality assurance also depends on the application area and the specific requirements of applications in areas such as the health sector, logistics, the public sector, the financial sector, manufacturing, services, e-commerce, and software. Furthermore, quality assurance must also be supported for data aggregation, ETL processes, web content management and other multi-layered applications. Quality assurance typically requires resources and therefore carries, besides its benefits, a computational and economic trade-off; it is thus also based on a compromise between the value of quality data and the cost of quality assurance.

  10. General guidelines for biomedical software development

    PubMed Central

    Silva, Luis Bastiao; Jimenez, Rafael C.; Blomberg, Niklas; Luis Oliveira, José

    2017-01-01

    Most bioinformatics tools available today were not written by professional software developers, but by people that wanted to solve their own problems, using computational solutions and spending the minimum time and effort possible, since these were just the means to an end. Consequently, a vast number of software applications are currently available, hindering the task of identifying the utility and quality of each. At the same time, this situation has hindered regular adoption of these tools in clinical practice. Typically, they are not sufficiently developed to be used by most clinical researchers and practitioners. To address these issues, it is necessary to re-think how biomedical applications are built and adopt new strategies that ensure quality, efficiency, robustness, correctness and reusability of software components. We also need to engage end-users during the development process to ensure that applications fit their needs. In this review, we present a set of guidelines to support biomedical software development, with an explanation of how they can be implemented and what kind of open-source tools can be used for each specific topic. PMID:28443186

  11. beta-Aminoalcohols as Potential Reactivators of Aged Sarin-/Soman-Inhibited Acetylcholinesterase

    DTIC Science & Technology

    2017-02-08

    This approach includes high-quality quantum mechanical/molecular mechanical calculations, providing reliable reactivation steps and energetics... (I. V. Khavrutskii and A. Wallqvist, Department of Defense Biotechnology High Performance Computing Software Applications Institute, Telemedicine and Advanced...)

  12. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  13. Measuring health care process quality with software quality measures.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2012-01-01

    Existing quality models focus on some specific diseases, clinics or clinical areas. Although they contain structure, process, or output type measures, there is no model which measures the quality of health care processes comprehensively. In addition, because overall process quality is not measured, hospitals cannot compare the quality of their processes internally or externally. To address these problems, a new model is developed from software quality measures. We have adopted the ISO/IEC 9126 software quality standard for health care processes. Then, measurable elements from the JCIAS (Joint Commission International Accreditation Standards for Hospitals) were added to the model scope to unify functional requirements. Measurement results for the assessment (diagnosing) process are provided in this paper. After the application, it was concluded that the model determines weak and strong aspects of the processes, gives a more detailed picture of process quality, and provides quantifiable information that hospitals can use to compare their processes with those of multiple organizations.

  14. Factors in Software Quality. Volume I. Concepts and Definitions of Software Quality

    DTIC Science & Technology

    1977-11-01

    FLEXIBILITY, COMPLEXITY, EXPANDABILITY, PRECISION, DOCUMENTATION, TOLERANCE, REPAIRABILITY, COMPATIBILITY, SERVICEABILITY...applications. Several standard documents are required by DOD/AF regulations. The following references were used to compile the range of documents...documents are specified by the AF regulations or SPO-local regulations listed above. Each of the document types for a long life/high cost software

  15. [Impact of a software application to improve medication reconciliation at hospital discharge].

    PubMed

    Corral Baena, S; Garabito Sánchez, M J; Ruíz Rómero, M V; Vergara Díaz, M A; Martín Chacón, E R; Fernández Moyano, A

    2014-01-01

    To assess the impact of a software application to improve the quality of information concerning current patient medications and changes on the discharge report after hospitalization. To analyze the incidence of errors and to classify them. Quasi-experimental pre/post study with a non-equivalent control group. Medical patients at hospital discharge. Implementation of a software application. Percentage of reconciled patient medication on discharge, and percentage of patients with more than one unjustified discrepancy. A total of 349 patients were assessed: 199 (pre-intervention phase) and 150 (post-intervention phase). Before the implementation of the application, medication reconciliation had been completed in 157 patients (78.8%), with reconciliation errors found in 99 (63.0%). The most frequent type of error, 339 (78.5%), was a missing dose or administration frequency information. After implementation, all the patient prescriptions were reconciled when the software was used. The percentage of patients with unjustified discrepancies decreased from 63.0% to 11.8% with the use of the application (p<.001). The main type of discrepancy found on using the application was a confusing prescription, because the professionals were not yet used to the new tool. The use of a software application has been shown to improve the quality of the information on patient treatment on the hospital discharge report, but it is still necessary to continue development as a strategy for improving medication reconciliation. Copyright © 2014 SECA. Published by Elsevier Espana. All rights reserved.

  16. Simulation of textile manufacturing processes for planning, scheduling, and quality control purposes

    NASA Astrophysics Data System (ADS)

    Cropper, A. E.; Wang, Z.

    1995-08-01

    Simulation, as a management information tool, has been applied to engineering manufacture and assembly operations. The application of the principles to textile manufacturing (fiber to fabric) is discussed. The particular problems and solutions in applying the simulation software package to the yarn production processes are discussed with an indication of how the software achieves the production schedule. The system appears to have application in planning, scheduling, and quality assurance, the latter being a result of the traceability made possible through a process involving mixing and splitting of material.

  17. Transforming High School Classrooms with Free/Open Source Software: "It's Time for an Open Source Software Revolution"

    ERIC Educational Resources Information Center

    Pfaffman, Jay

    2008-01-01

    Free/Open Source Software (FOSS) applications meet many of the software needs of high school science classrooms. In spite of the availability and quality of FOSS tools, they remain unknown to many teachers and utilized by fewer still. In a world where most software has restrictions on copying and use, FOSS is an anomaly, free to use and to…

  18. Automated Translation of Safety Critical Application Software Specifications into PLC Ladder Logic

    NASA Technical Reports Server (NTRS)

    Leucht, Kurt W.; Semmel, Glenn S.

    2008-01-01

    The numerous benefits of automatic application code generation are widely accepted within the software engineering community. A few of these benefits include raising the abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at the NASA Kennedy Space Center (KSC) recognized the need for PLC code generation while developing their new ground checkout and launch processing system. They developed a process and a prototype software tool that automatically translates a high-level representation or specification of safety critical application software into ladder logic that executes on a PLC. This process and tool are expected to increase the reliability of the PLC code over that which is written manually, and may even lower life-cycle costs and shorten the development schedule of the new control system at KSC. This paper examines the problem domain and discusses the process and software tool that were prototyped by the KSC software engineers.
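
    As a toy sketch of the translation idea, not the KSC tool, a declarative boolean specification can be rendered mechanically into rung-like PLC code. The specification format and the Instruction-List-style output below are invented for illustration; a real generator for safety-critical logic is far richer.

    ```python
    # Each spec entry: output coil <- contacts that must all be TRUE (series / AND logic).
    spec = {
        "VALVE_OPEN": ["CMD_OPEN", "NOT_INTERLOCK", "PRESSURE_OK"],
        "PUMP_RUN":   ["START_PB", "TANK_LEVEL_OK"],
    }

    def to_rung(coil, contacts):
        """Render one AND-of-contacts rung as IEC 61131-3 Instruction-List-like text."""
        lines = [f"LD  {contacts[0]}"]
        lines += [f"AND {c}" for c in contacts[1:]]
        lines.append(f"ST  {coil}")
        return "\n".join(lines)

    for coil, contacts in spec.items():
        print(f"(* rung for {coil} *)")
        print(to_rung(coil, contacts))
        print()
    ```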

  19. Application Reuse Library for Software, Requirements, and Guidelines

    NASA Technical Reports Server (NTRS)

    Malin, Jane T.; Thronesbery, Carroll

    1994-01-01

    Better designs are needed for expert systems and other operations automation software, for more reliable, usable and effective human support. A prototype computer-aided Application Reuse Library shows feasibility of supporting concurrent development and improvement of advanced software by users, analysts, software developers, and human-computer interaction experts. Such a library expedites development of quality software, by providing working, documented examples, which support understanding, modification and reuse of requirements as well as code. It explicitly documents and implicitly embodies design guidelines, standards and conventions. The Application Reuse Library provides application modules with Demo-and-Tester elements. Developers and users can evaluate applicability of a library module and test modifications, by running it interactively. Sub-modules provide application code and displays and controls. The library supports software modification and reuse, by providing alternative versions of application and display functionality. Information about human support and display requirements is provided, so that modifications will conform to guidelines. The library supports entry of new application modules from developers throughout an organization. Example library modules include a timer, some buttons and special fonts, and a real-time data interface program. The library prototype is implemented in the object-oriented G2 environment for developing real-time expert systems.

  20. Software Formal Inspections Standard

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This Software Formal Inspections Standard (hereinafter referred to as Standard) is applicable to NASA software. This Standard defines the requirements that shall be fulfilled by the software formal inspections process whenever this process is specified for NASA software. The objective of this Standard is to define the requirements for a process that inspects software products to detect and eliminate defects as early as possible in the software life cycle. The process also provides for the collection and analysis of inspection data to improve the inspection process as well as the quality of the software.

  1. Software engineering with application-specific languages

    NASA Technical Reports Server (NTRS)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.

  2. Design and implementation of a cloud based lithography illumination pupil processing application

    NASA Astrophysics Data System (ADS)

    Zhang, Youbao; Ma, Xinghua; Zhu, Jing; Zhang, Fang; Huang, Huijie

    2017-02-01

    Pupil parameters are important parameters for evaluating the quality of a lithography illumination system. In this paper, a cloud based full-featured pupil processing application is implemented. A web browser is used for the UI (User Interface), the websocket protocol and JSON format are used for the communication between the client and the server, and the computing part is implemented on the server side, where the application integrates a variety of high quality professional libraries, such as the image processing libraries libvips and ImageMagick, the automatic reporting system LaTeX, etc., to support the program. The cloud based framework takes advantage of the server's superior computing power and rich software collections, and the program can run anywhere there is a modern browser due to its web UI design. Compared to the traditional way of operating software (purchased, licensed, shipped, downloaded, installed, maintained, and upgraded), the new cloud based approach, which requires no installation and is easy to use and maintain, opens up a new way. Cloud based applications are probably the future of software development.

  3. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  4. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years has increased tremendously. In particular, programs are being used more and more in embedded systems (from car-brakes to plant-control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like the illegal "cloning" of smart cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can also happen in this area. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to really safe and secure systems. In this paper, we examine to what extent automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  5. High-throughput image analysis of tumor spheroids: a user-friendly software application to measure the size of spheroids automatically and accurately.

    PubMed

    Chen, Wenjin; Wong, Chung; Vosburgh, Evan; Levine, Arnold J; Foran, David J; Xu, Eugenia Y

    2014-07-08

    The increasing number of applications of three-dimensional (3D) tumor spheroids as an in vitro model for drug discovery requires their adaptation to large-scale screening formats in every step of a drug screen, including large-scale image analysis. Currently there is no ready-to-use and free image analysis software to meet this large-scale format. Most existing methods involve manually drawing the length and width of the imaged 3D spheroids, which is a tedious and time-consuming process. This study presents a high-throughput image analysis software application - SpheroidSizer, which measures the major and minor axial length of the imaged 3D tumor spheroids automatically and accurately; calculates the volume of each individual 3D tumor spheroid; then outputs the results in two different forms in spreadsheets for easy manipulation in the subsequent data analysis. The main advantage of this software is its powerful image analysis application that is adapted for large numbers of images. It provides high-throughput computation and a quality-control workflow. The estimated time to process 1,000 images is about 15 min on a minimally configured laptop, or around 1 min on a multi-core performance workstation. The graphical user interface (GUI) is also designed for easy quality control, and users can manually override the computer results. The key method used in this software is adapted from the active contour algorithm, also known as Snakes, which is especially suitable for images with uneven illumination and noisy background that often plague automated image processing in high-throughput screens. The complementary "Manual Initialize" and "Hand Draw" tools give SpheroidSizer the flexibility to deal with various types of spheroids and diverse-quality images. This high-throughput image analysis software remarkably reduces labor and speeds up the analysis process. Implementing this software is beneficial for 3D tumor spheroids to become a routine in vitro model for drug screens in industry and academia.
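
    A minimal sketch of the measurement step described (segment the spheroid, read off the major and minor axes, estimate a volume); this is not SpheroidSizer, which uses an active-contour method, and the Otsu-threshold pipeline, the scikit-image calls, and the prolate-spheroid volume formula are simplifying assumptions.

    ```python
    import numpy as np
    from skimage import io, filters, measure

    def spheroid_axes_and_volume(image_path, microns_per_pixel=1.0):
        img = io.imread(image_path, as_gray=True)
        mask = img < filters.threshold_otsu(img)        # assume a dark spheroid on a bright field
        regions = measure.regionprops(measure.label(mask))
        spheroid = max(regions, key=lambda r: r.area)   # take the largest object
        a = spheroid.major_axis_length * microns_per_pixel  # major axis (um)
        b = spheroid.minor_axis_length * microns_per_pixel  # minor axis (um)
        volume = (np.pi / 6.0) * a * b * b                  # prolate-spheroid approximation
        return a, b, volume

    # Example call (hypothetical file and pixel size):
    # a, b, v = spheroid_axes_and_volume("spheroid_001.tif", microns_per_pixel=1.3)
    ```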

  6. DEVELOPMENT AND APPLICATIONS OF CFD SIMULATIONS SUPPORTING URBAN AIR QUALITY AND HOMELAND SECURITY

    EPA Science Inventory

    Prior to September 11, 2001 developments of Computational Fluid Dynamics (CFD) were begun to support air quality applications. CFD models are emerging as a promising technology for such assessments, in part due to the advancing power of computational hardware and software. CFD si...

  7. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  8. A software framework for developing measurement applications under variable requirements.

    PubMed

    Arpaia, Pasquale; Buzio, Marco; Fiscarelli, Lucio; Inglese, Vitaliano

    2012-11-01

    A framework for easily developing software for measurement and test applications under highly and fast-varying requirements is proposed. The framework allows the software quality, in terms of flexibility, usability, and maintainability, to be maximized. Furthermore, the development effort is reduced and finalized, by relieving the test engineer of development details. The framework can be configured for satisfying a large set of measurement applications in a generic field for an industrial test division, a test laboratory, or a research center. As an experimental case study, the design, the implementation, and the assessment inside the application to a measurement scenario of magnet testing at the European Organization for Nuclear Research is reported.

  9. Application of QC_DR software for acceptance testing and routine quality control of direct digital radiography systems: initial experiences using the Italian Association of Physicist in Medicine quality control protocol.

    PubMed

    Nitrosi, Andrea; Bertolini, Marco; Borasi, Giovanni; Botti, Andrea; Barani, Adriana; Rivetti, Stefano; Pierotti, Luisa

    2009-12-01

    Ideally, medical x-ray imaging systems should be designed to deliver maximum image quality at an acceptable radiation risk to the patient. Quality assurance procedures are employed to ensure that these standards are maintained. A quality control protocol for direct digital radiography (DDR) systems is described and discussed. Software to automatically process and analyze the required images was developed. In this paper, the initial results obtained on equipment from different DDR manufacturers are reported. The protocol was developed to highlight even small discrepancies in standard operating performance.

  10. A review of the FDA draft guidance document for software validation: guidance for industry.

    PubMed

    Keatley, K L

    1999-01-01

    A Draft Guidance Document (Version 1.1) was issued by the United States Food and Drug Administration (FDA) to address the software validation requirement of the Quality System Regulation, 21 CFR Part 820, effective June 1, 1997. The guidance document outlines validation considerations that the FDA regards as applicable to both medical device software and software used to "design, develop or manufacture" medical devices. The Draft Guidance is available at the FDA web site http:@www.fda.gov/cdrh/comps/swareval++ +.html. Presented here is a review of the main features of the FDA document for Quality System Regulation (QSR), and some guidance for its implementation in industry.

  11. The research and practice of spacecraft software engineering

    NASA Astrophysics Data System (ADS)

    Chen, Chengxin; Wang, Jinghua; Xu, Xiaoguang

    2017-06-01

    In order to ensure the safety and reliability of spacecraft software products, it is necessary to apply engineering management. Firstly, the paper introduces the problems of unsystematic planning, unclear classified management, and discontinuous improvement mechanisms in domestic and foreign spacecraft software engineering management. Then, it proposes a solution for software engineering management based on a system-integration philosophy from the perspective of the spacecraft system. Finally, an application result from a spacecraft project is given as an example. The research provides a reference for executing spacecraft software engineering management and improving software product quality.

  12. Analysis of quality raw data of second generation sequencers with Quality Assessment Software.

    PubMed

    Ramos, Rommel Tj; Carneiro, Adriana R; Baumbach, Jan; Azevedo, Vasco; Schneider, Maria Pc; Silva, Artur

    2011-04-18

    Second generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction.
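
    As a bare-bones sketch of the kind of filtering the abstract describes, not the Quality Assessment software itself, reads whose mean Phred quality falls below a cutoff are discarded and the surviving coverage is re-estimated; the cutoff value, the genome size, and the Phred+33 encoding are assumptions.

    ```python
    def read_fastq(path):
        """Yield (sequence, quality_string) pairs from a FASTQ file."""
        with open(path) as fh:
            while True:
                header = fh.readline()
                if not header:
                    break
                seq = fh.readline().strip()
                fh.readline()                      # '+' separator line
                qual = fh.readline().strip()
                yield seq, qual

    def mean_phred(qual, offset=33):               # Phred+33 encoding assumed
        return sum(ord(c) - offset for c in qual) / len(qual)

    def filter_and_estimate_coverage(path, genome_size, min_mean_q=20):
        kept_bases = 0
        for seq, qual in read_fastq(path):
            if mean_phred(qual) >= min_mean_q:
                kept_bases += len(seq)
        return kept_bases / genome_size            # estimated coverage after filtering

    # Example call (hypothetical inputs):
    # print(filter_and_estimate_coverage("reads.fastq", genome_size=2_300_000))
    ```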

  13. A quality-refinement process for medical imaging applications.

    PubMed

    Neuhaus, J; Maleike, D; Nolden, M; Kenngott, H-G; Meinzer, H-P; Wolf, I

    2009-01-01

    To introduce and evaluate a process for refinement of software quality that is suitable for research groups. In order to avoid constraining researchers too much, the quality improvement process has to be designed carefully. The scope of this paper is to present and evaluate a process to advance quality aspects of existing research prototypes in order to make them ready for initial clinical studies. The proposed process is tailored for research environments and therefore more lightweight than traditional quality management processes. It focuses on quality criteria that are important at the given stage of the software life cycle and emphasizes the usage of tools that automate aspects of the process. To evaluate the additional effort that comes along with the process, it was exemplarily applied to eight prototypical software modules for medical image processing. The introduced process has been applied to improve the quality of all prototypes so that they could be successfully used in clinical studies. The quality refinement yielded an average of 13 person-days of additional effort per project. Overall, 107 bugs were found and resolved by applying the process. Careful selection of quality criteria and the usage of automated process tools lead to a lightweight quality refinement process suitable for scientific research groups that can be applied to ensure a successful transfer of technical software prototypes into clinical research workflows.

  14. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-30

    Output features provide links from the tool to both the human user and the target machine (where applicable). They describe the types of information that are returned from the tools to the human user, and the forms in which these outputs are presented. This document and the Automated Software Tool Monitoring Program (Appendix 1) are ...

  15. Technology Infusion of CodeSonar into the Space Network Ground Segment (RII07)

    NASA Technical Reports Server (NTRS)

    Benson, Markland

    2008-01-01

    The NASA Software Assurance Research Program (in part) performs studies as to the feasibility of technologies for improving the safety, quality, reliability, cost, and performance of NASA software. This study considers the application of commercial automated source code analysis tools to mission critical ground software that is in the operations and sustainment portion of the product lifecycle.

  16. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    NASA Technical Reports Server (NTRS)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  17. State of the art metrics for aspect oriented programming

    NASA Astrophysics Data System (ADS)

    Ghareb, Mazen Ismaeel; Allen, Gary

    2018-04-01

    The quality evaluation of software, e.g., defect measurement, gains significance with the wider use of software applications. Metric measurements are considered the primary indicator for defect prediction and software maintenance in various empirical studies of software products. However, there is no agreement on which metrics are compelling quality indicators for novel development approaches such as Aspect Oriented Programming (AOP). AOP intends to enhance programming quality by providing new and novel constructs for the development of systems, for example, point cuts, advice and inter-type relationships. Hence, it is not evident whether quality indicators for AOP can be derived as direct extensions of traditional OO measurements. On the other hand, empirical investigations of AOP regularly depend on established coupling measurements; despite the relatively recent adoption of AOP in empirical studies, coupling measurements have been adopted as useful markers of fault proneness in this context. In this paper we investigate the state of the art metrics for the measurement of Aspect Oriented systems development.

  18. Web Implementation of Quality Assurance (QA) for X-ray Units in Balkanic Medical Institutions.

    PubMed

    Urošević, Vlade; Ristić, Olga; Milošević, Danijela; Košutić, Duško

    2015-08-01

    Diagnostic radiology is the major contributor to the total dose of the population from all artificial sources. In order to reduce radiation exposure and optimize diagnostic x-ray image quality, it is necessary to increase the quality and efficiency of quality assurance (QA) and audit programs. This work presents a web application providing completely new QA solutions for x-ray modalities and facilities. The software gives complete online information (using European standards) with which the corresponding institutions and individuals can evaluate and control a facility's Radiation Safety and QA program. The software enables storage of all data in one place and sharing the same information (data), regardless of whether the measured data is used by an individual user or by an authorized institution. The software overcomes the distance and time separation of institutions and individuals who take part in QA. Upgrading the software will enable assessment of the medical exposure level to ionizing radiation.

  19. Software metrics: The key to quality software on the NCC project

    NASA Technical Reports Server (NTRS)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  20. The Use of UML for Software Requirements Expression and Management

    NASA Technical Reports Server (NTRS)

    Murray, Alex; Clark, Ken

    2015-01-01

    It is common practice to write English-language "shall" statements to embody detailed software requirements in aerospace software applications. This paper explores the use of the UML language as a replacement for the English language for this purpose. Among the advantages offered by the Unified Modeling Language (UML) is a high degree of clarity and precision in the expression of domain concepts as well as architecture and design. Can this quality of UML be exploited for the definition of software requirements? While expressing logical behavior, interface characteristics, timeliness constraints, and other constraints on software using UML is commonly done and relatively straight-forward, achieving the additional aspects of the expression and management of software requirements that stakeholders expect, especially traceability, is far less so. These other characteristics, concerned with auditing and quality control, include the ability to trace a requirement to a parent requirement (which may well be an English "shall" statement), to trace a requirement to verification activities or scenarios which verify that requirement, and to trace a requirement to elements of the software design which implement that requirement. UML Use Cases, designed for capturing requirements, have not always been satisfactory. Some applications of them simply use the Use Case model element as a repository for English requirement statements. Other applications of Use Cases, in which Use Cases are incorporated into behavioral diagrams that successfully communicate the behaviors and constraints required of the software, do indeed take advantage of UML's clarity, but not in ways that support the traceability features mentioned above. Our approach uses the Stereotype construct of UML to precisely identify elements of UML constructs, especially behaviors such as State Machines and Activities, as requirements, and also to achieve the necessary mapping capabilities. We describe this approach in the context of a space-based software application currently under development at the Jet Propulsion Laboratory.

  1. A systematic literature review of open source software quality assessment models.

    PubMed

    Adewumi, Adewole; Misra, Sanjay; Omoregbe, Nicholas; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    Many open source software (OSS) quality assessment models are proposed and available in the literature. However, there is little or no adoption of these models in practice. In order to guide the formulation of newer models so they can be acceptable to practitioners, there is a need for clear discrimination of the existing models based on their specific properties. Based on this, the aim of this study is to perform a systematic literature review to investigate the properties of the existing OSS quality assessment models by classifying them with respect to their quality characteristics, the methodology they use for assessment, and their domain of application, so as to guide the formulation and development of newer models. Searches in IEEE Xplore, ACM, Science Direct, Springer and Google Search were performed to retrieve all relevant primary studies in this regard. Journal and conference papers between the years 2003 and 2015 were considered, since the first known OSS quality model emerged in 2003. A total of 19 OSS quality assessment model papers were selected. To select these models we developed assessment criteria to evaluate the quality of the existing studies. Quality assessment models are classified into five categories based on the quality characteristics they possess, namely: single-attribute, rounded category, community-only attribute, non-community attribute, and non-quality-in-use models. Our study shows that software selection based on hierarchical structures is the most popular selection method in the existing OSS quality assessment models. Furthermore, we found that nearly half (47%) of the existing models do not specify any domain of application. In conclusion, our study will be a valuable contribution to the community and will help quality assessment model developers in formulating newer models, as well as practitioners (software evaluators) in selecting suitable OSS from among alternatives.

  2. Research on Visualization Design Method in the Field of New Media Software Engineering

    NASA Astrophysics Data System (ADS)

    Deqiang, Hu

    2018-03-01

    In the current period of rapidly developing science and technology, with increasingly fierce market competition and growing user demand, a new design and application method has emerged in the field of new media software engineering: the visualization design method. Applying the visualization design method to new media software engineering can not only improve the operational efficiency of new media software engineering but, more importantly, enhance the quality of software development through appropriate media of communication and transformation; on this basis, it also continuously promotes the progress and development of new media software engineering in China. Therefore, this article concretely analyses the application of the visualization design method in the field of new media software engineering, starting from an overview of visualization design methods and a systematic analysis of the underlying technology.

  3. [Research progress of probe design software of oligonucleotide microarrays].

    PubMed

    Chen, Xi; Wu, Zaoquan; Liu, Zhengchun

    2014-02-01

    DNA microarrays have become an essential medical genetic diagnostic tool because of their high throughput, miniaturization and automation. The design and selection of oligonucleotide probes are critical for preparing gene chips of high quality. Several probe design software packages have been developed and are available to perform this work. Each package targets different kinds of sequences and has different advantages and limitations. In this article, the research and development of these software packages are reviewed against three main criteria: specificity, sensitivity and melting temperature (Tm). In addition, based on experimental results from the literature, the packages are classified according to their applications. This review should help users choose appropriate probe-design software, reduce the cost of microarrays, improve the application efficiency of microarrays, and promote both the research and development (R&D) and the commercialization of high-performance probe design software.

  4. Functional description of the ISIS system

    NASA Technical Reports Server (NTRS)

    Berman, W. J.

    1979-01-01

    Development of software for avionic and aerospace applications (flight software) is influenced by a unique combination of factors which includes: (1) length of the life cycle of each project; (2) necessity for cooperation between the aerospace industry and NASA; (3) the need for flight software that is highly reliable; (4) the increasing complexity and size of flight software; and (5) the high quality of the programmers and the tightening of project budgets. The interactive software invocation system (ISIS) which is described is designed to overcome the problems created by this combination of factors.

  5. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  6. [Development of analysis software package for the two kinds of Japanese fluoro-d-glucose-positron emission tomography guideline].

    PubMed

    Matsumoto, Keiichi; Endo, Keigo

    2013-06-01

    Two kinds of Japanese guidelines for the data acquisition protocol of oncology fluoro-D-glucose-positron emission tomography (FDG-PET)/computed tomography (CT) scans were created by the joint task force of the Japanese Society of Nuclear Medicine Technology (JSNMT) and the Japanese Society of Nuclear Medicine (JSNM), and published in Kakuigaku-Gijutsu 27(5): 425-456, 2007 and 29(2): 195-235, 2009. These guidelines aim to standardize PET image quality among facilities and different PET/CT scanner models. The objective of this study was to develop a personal computer-based performance measurement and image quality processor for the two kinds of Japanese guidelines for oncology (18)F-FDG PET/CT scans. We call this software package the "PET quality control tool" (PETquact). Microsoft Corporation's Windows(™) is used as the operating system for PETquact, which requires 1070×720 image resolution and includes 12 different applications. The accuracy was examined for numerous applications of PETquact. For example, in the sensitivity application, the system sensitivity measurement results were equivalent when comparing two PET sinograms obtained from PETquact and from the report. PETquact is suited to analysis under both Japanese guidelines, and it performs well for performance measurements and image quality analysis. PETquact can be used at any facility if the software package is installed on a laptop computer.

  7. Developing a smartphone software package for predicting atmospheric pollutant concentrations at mobile locations.

    PubMed

    Larkin, Andrew; Williams, David E; Kile, Molly L; Baird, William M

    2015-06-01

    There is considerable evidence that exposure to air pollution is harmful to health. In the U.S., ambient air quality is monitored by Federal and State agencies for regulatory purposes. There are limited options, however, for people to access this data in real time, which hinders an individual's ability to manage their own risks. This paper describes a new software package that models environmental concentrations of fine particulate matter (PM2.5), coarse particulate matter (PM10), and ozone for the state of Oregon and calculates personal health risks at the smartphone's current location. Predicted air pollution risk levels can be displayed on mobile devices as interactive maps and graphs color-coded to coincide with EPA air quality index (AQI) categories. Users have the option of setting air quality warning levels via color-coded bars and are notified whenever warning levels are exceeded by predicted levels within 10 km. We validated the software using data from participants as well as from simulations, which showed that the application was capable of identifying spatial and temporal air quality trends. This unique application provides a potential low-cost technology for reducing personal exposure to air pollution, which can improve quality of life, particularly for people with health conditions, such as asthma, that make them more susceptible to these hazards.
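
    The color-coded AQI categories mentioned above are conventionally obtained by piecewise-linear interpolation between pollutant breakpoints. The Python sketch below illustrates that mapping for PM2.5 using the commonly published EPA breakpoint table; the table and function are illustrative assumptions of this example, not code from the described application.

        # Piecewise-linear AQI interpolation for PM2.5 (assumed EPA breakpoints).
        # Each entry: (conc_low, conc_high, aqi_low, aqi_high, category)
        PM25_BREAKPOINTS = [
            (0.0,    12.0,    0,  50, "Good"),
            (12.1,   35.4,   51, 100, "Moderate"),
            (35.5,   55.4,  101, 150, "Unhealthy for Sensitive Groups"),
            (55.5,  150.4,  151, 200, "Unhealthy"),
            (150.5, 250.4,  201, 300, "Very Unhealthy"),
            (250.5, 500.4,  301, 500, "Hazardous"),
        ]

        def pm25_aqi(concentration):
            """Return (AQI value, category) for a PM2.5 concentration in ug/m3."""
            for c_lo, c_hi, i_lo, i_hi, category in PM25_BREAKPOINTS:
                if c_lo <= concentration <= c_hi:
                    aqi = (i_hi - i_lo) / (c_hi - c_lo) * (concentration - c_lo) + i_lo
                    return round(aqi), category
            raise ValueError("concentration outside the tabulated range")

        # Example: pm25_aqi(40.0) -> (112, "Unhealthy for Sensitive Groups")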

  8. Increasing quality and managing complexity in neuroinformatics software development with continuous integration.

    PubMed

    Zaytsev, Yury V; Morrison, Abigail

    2012-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique.
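
    As a concrete, if generic, illustration of a CI quality gate of the kind described above, the following Python sketch runs a test suite under coverage measurement and rejects a change when tests fail or coverage falls below a threshold; it assumes pytest and coverage.py are installed and is not the authors' actual CI configuration.

        # Minimal CI quality gate: run the test suite under coverage and
        # fail the build when tests fail or coverage drops below a threshold.
        # Assumes pytest and coverage.py are available on the build machine.
        import subprocess
        import sys

        COVERAGE_THRESHOLD = 80  # percent, project-specific choice

        def run(cmd):
            print("$", " ".join(cmd))
            return subprocess.run(cmd).returncode

        def main():
            # 1. Run the tests; a non-zero exit code means at least one failure.
            if run(["coverage", "run", "-m", "pytest", "-q"]) != 0:
                sys.exit("Build rejected: test failures")
            # 2. Enforce the coverage threshold; 'coverage report' exits non-zero
            #    when total coverage is below --fail-under.
            if run(["coverage", "report", f"--fail-under={COVERAGE_THRESHOLD}"]) != 0:
                sys.exit("Build rejected: coverage below threshold")
            print("Quality gate passed")

        if __name__ == "__main__":
            main()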

  9. Increasing quality and managing complexity in neuroinformatics software development with continuous integration

    PubMed Central

    Zaytsev, Yury V.; Morrison, Abigail

    2013-01-01

    High quality neuroscience research requires accurate, reliable and well maintained neuroinformatics applications. As software projects become larger, offering more functionality and developing a denser web of interdependence between their component parts, we need more sophisticated methods to manage their complexity. If complexity is allowed to get out of hand, either the quality of the software or the speed of development suffer, and in many cases both. To address this issue, here we develop a scalable, low-cost and open source solution for continuous integration (CI), a technique which ensures the quality of changes to the code base during the development procedure, rather than relying on a pre-release integration phase. We demonstrate that a CI-based workflow, due to rapid feedback about code integration problems and tracking of code health measures, enabled substantial increases in productivity for a major neuroinformatics project and additional benefits for three further projects. Beyond the scope of the current study, we identify multiple areas in which CI can be employed to further increase the quality of neuroinformatics projects by improving development practices and incorporating appropriate development tools. Finally, we discuss what measures can be taken to lower the barrier for developers of neuroinformatics applications to adopt this useful technique. PMID:23316158

  10. Quality Assurance of Software Used In Aircraft Or Related Products

    DOT National Transportation Integrated Search

    1993-02-01

    This advisory circular (AC) provides an acceptable means, but not the only means, to show compliance with the quality assurance requirements of Federal Aviation Regulations (FAR) Part 21, Certification Procedures for Products and Parts, as applicable...

  11. Genoviz Software Development Kit: Java tool kit for building genomics visualization applications.

    PubMed

    Helt, Gregg A; Nicol, John W; Erwin, Ed; Blossom, Eric; Blanchard, Steven G; Chervitz, Stephen A; Harmon, Cyrus; Loraine, Ann E

    2009-08-25

    Visualization software can expose previously undiscovered patterns in genomic data and advance biological science. The Genoviz Software Development Kit (SDK) is an open source, Java-based framework designed for rapid assembly of visualization software applications for genomics. The Genoviz SDK framework provides a mechanism for incorporating adaptive, dynamic zooming into applications, a desirable feature of genome viewers. Visualization capabilities of the Genoviz SDK include automated layout of features along genetic or genomic axes; support for user interactions with graphical elements (Glyphs) in a map; a variety of Glyph sub-classes that promote experimentation with new ways of representing data in graphical formats; and support for adaptive, semantic zooming, whereby objects change their appearance depending on zoom level and zooming rate adapts to the current scale. Freely available demonstration and production quality applications, including the Integrated Genome Browser, illustrate Genoviz SDK capabilities. Separation between graphics components and genomic data models makes it easy for developers to add visualization capability to pre-existing applications or build new applications using third-party data models. Source code, documentation, sample applications, and tutorials are available at http://genoviz.sourceforge.net/.

  12. Quality Improvement With Discrete Event Simulation: A Primer for Radiologists.

    PubMed

    Booker, Michael T; O'Connell, Ryan J; Desai, Bhushan; Duddalwar, Vinay A

    2016-04-01

    The application of simulation software in health care has transformed quality and process improvement. Specifically, software based on discrete-event simulation (DES) has shown the ability to improve radiology workflows and systems. Nevertheless, despite the successful application of DES in the medical literature, the power and value of simulation remains underutilized. For this reason, the basics of DES modeling are introduced, with specific attention to medical imaging. In an effort to provide readers with the tools necessary to begin their own DES analyses, the practical steps of choosing a software package and building a basic radiology model are discussed. In addition, three radiology system examples are presented, with accompanying DES models that assist in analysis and decision making. Through these simulations, we provide readers with an understanding of the theory, requirements, and benefits of implementing DES in their own radiology practices. Copyright © 2016 American College of Radiology. All rights reserved.
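
    To give a flavor of DES in an imaging context, the sketch below models patients arriving at a single scanner and reports the mean waiting time. It uses the open-source simpy library as a stand-in (an assumption of this example; the article itself discusses choosing among dedicated simulation packages), and the arrival and scan-time parameters are invented.

        # Discrete-event simulation of a one-scanner imaging suite (simpy assumed).
        import random
        import simpy

        ARRIVAL_MEAN = 15.0   # minutes between patient arrivals (assumed)
        SCAN_MEAN = 12.0      # minutes per examination (assumed)
        waits = []

        def patient(env, scanner):
            arrival = env.now
            with scanner.request() as req:        # queue for the scanner
                yield req
                waits.append(env.now - arrival)   # record time spent waiting
                yield env.timeout(random.expovariate(1.0 / SCAN_MEAN))

        def arrivals(env, scanner):
            while True:
                yield env.timeout(random.expovariate(1.0 / ARRIVAL_MEAN))
                env.process(patient(env, scanner))

        random.seed(1)
        env = simpy.Environment()
        scanner = simpy.Resource(env, capacity=1)
        env.process(arrivals(env, scanner))
        env.run(until=8 * 60)                      # one 8-hour day
        print(f"patients seen: {len(waits)}, mean wait: {sum(waits)/len(waits):.1f} min")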

  13. A proposed defect tracking model for classifying the inserted defect reports to enhance software quality control.

    PubMed

    Sultan, Torky; Khedr, Ayman E; Sayed, Mostafa

    2013-01-01

    Defect tracking systems play an important role in software development organizations, as they can store historical information about defects. There is much research on defect tracking models and systems to enhance their capabilities, make tracking more specific, and adapt them to new technology. Furthermore, there are several studies on classifying bugs in a step-by-step method so as to have a clear perception of, and an applicable method for, detecting such bugs. This paper presents a new proposed defect tracking model for classifying the inserted defect reports in a step-by-step method, to further enhance software quality.

  14. Software Tech News. The DoD Source for Software Technology Information. Software Testing Series: Part 2, Volume 3, Number 3, 1999

    DTIC Science & Technology

    1999-01-01

    published in December of 1998. In addition, Mr. Drake is the author of a theme article entitled "Measuring Software Quality: A Case Study" ... and services may run on different platforms in differing combinations; partial application failure (e.g., a client running, a service down) is ... result in a combined utility function that is some aggregation of the underlying utility functions. The benefit a client receives from a service

  15. Application of industry-standard guidelines for the validation of avionics software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  16. Software Quality Measurement for Distributed Systems. Volume 3. Distributed Computing Systems: Impact on Software Quality.

    DTIC Science & Technology

    1983-07-01

    Distributed computing systems' impact on software quality is examined. Topics discussed include "C3I Application", "Space Systems Network", "Need for Distributed Database Management", and "Adaptive Routing". This is discussed in the last part ... data reduction, buffering, encryption, and error detection and correction functions. Examples of such data streams include imagery data and video.

  17. The Medical Imaging Interaction Toolkit: challenges and advances : 10 years of open-source development.

    PubMed

    Nolden, Marco; Zelzer, Sascha; Seitel, Alexander; Wald, Diana; Müller, Michael; Franz, Alfred M; Maleike, Daniel; Fangerau, Markus; Baumhauer, Matthias; Maier-Hein, Lena; Maier-Hein, Klaus H; Meinzer, Hans-Peter; Wolf, Ivo

    2013-07-01

    The Medical Imaging Interaction Toolkit (MITK) has been available as open-source software for almost 10 years now. In this period the requirements of software systems in the medical image processing domain have become increasingly complex. The aim of this paper is to show how MITK evolved into a software system that is able to cover all steps of a clinical workflow including data retrieval, image analysis, diagnosis, treatment planning, intervention support, and treatment control. MITK provides modularization and extensibility on different levels. In addition to the original toolkit, a module system, micro services for small, system-wide features, a service-oriented architecture based on the Open Services Gateway initiative (OSGi) standard, and an extensible and configurable application framework allow MITK to be used, extended and deployed as needed. A refined software process was implemented to deliver high-quality software, ease the fulfillment of regulatory requirements, and enable teamwork in mixed-competence teams. MITK has been applied by a worldwide community and integrated into a variety of solutions, either at the toolkit level or as an application framework with custom extensions. The MITK Workbench has been released as a highly extensible and customizable end-user application. Optional support for tool tracking, image-guided therapy, diffusion imaging as well as various external packages (e.g. CTK, DCMTK, OpenCV, SOFA, Python) is available. MITK has also been used in several FDA/CE-certified applications, which demonstrates the high-quality software and rigorous development process. MITK provides a versatile platform with a high degree of modularization and interoperability and is well suited to meet the challenging tasks of today's and tomorrow's clinically motivated research.

  18. Automated Reuse of Scientific Subroutine Libraries through Deductive Synthesis

    NASA Technical Reports Server (NTRS)

    Lowry, Michael R.; Pressburger, Thomas; VanBaalen, Jeffrey; Roach, Steven

    1997-01-01

    Systematic software construction offers the potential of elevating software engineering from an art form to an engineering discipline. The desired result is more predictable software development leading to better quality and more maintainable software. However, the overhead costs associated with the formalisms, mathematics, and methods of systematic software construction have largely precluded their adoption in real-world software development. In fact, many mainstream software development organizations, such as Microsoft, still maintain a predominantly oral culture for software development projects, which is far removed from a formalism-based culture for software development. An exception is the limited domain of safety-critical software, where the high assurance inherent in systematic software construction justifies the additional cost. We believe that systematic software construction will only be adopted by mainstream software development organizations when the overhead costs have been greatly reduced. Two approaches to cost mitigation are reuse (amortizing costs over many applications) and automation. For the last four years, NASA Ames has funded the Amphion project, whose objective is to automate software reuse through techniques from systematic software construction. In particular, deductive program synthesis (i.e., program extraction from proofs) is used to derive a composition of software components (e.g., subroutines) that correctly implements a specification. The construction of reuse libraries of software components is the standard software engineering solution for improving software development productivity and quality.

  19. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. A case study research method was used, and the model was applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process.
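
    As a small illustration of how such process measures can be computed, the Python sketch below derives two of the measures named above, throughput time and fault ratio, from a hypothetical list of process-instance records; the record fields and values are assumptions for illustration, not the paper's data model.

        # Hypothetical process-instance records for an "Entry to Care" style process.
        from datetime import datetime

        instances = [
            {"start": datetime(2024, 1, 1, 9, 0),  "end": datetime(2024, 1, 1, 9, 40), "faults": 0},
            {"start": datetime(2024, 1, 1, 9, 10), "end": datetime(2024, 1, 1, 10, 5), "faults": 1},
            {"start": datetime(2024, 1, 1, 9, 30), "end": datetime(2024, 1, 1, 9, 55), "faults": 0},
        ]

        def throughput_time_minutes(records):
            """Mean elapsed time from process start to completion."""
            total = sum((r["end"] - r["start"]).total_seconds() for r in records)
            return total / len(records) / 60.0

        def fault_ratio(records):
            """Share of process instances in which at least one fault occurred."""
            return sum(1 for r in records if r["faults"] > 0) / len(records)

        print(f"throughput time: {throughput_time_minutes(instances):.1f} min")
        print(f"fault ratio: {fault_ratio(instances):.2f}")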

  20. Comparative Analyses of MIRT Models and Software (BMIRT and flexMIRT)

    ERIC Educational Resources Information Center

    Yavuz, Guler; Hambleton, Ronald K.

    2017-01-01

    Application of MIRT modeling procedures is dependent on the quality of parameter estimates provided by the estimation software and techniques used. This study investigated model parameter recovery of two popular MIRT packages, BMIRT and flexMIRT, under some common measurement conditions. These packages were specifically selected to investigate the…

  1. The software application and classification algorithms for welds radiograms analysis

    NASA Astrophysics Data System (ADS)

    Sikora, R.; Chady, T.; Baniukiewicz, P.; Grzywacz, B.; Lopato, P.; Misztal, L.; Napierała, L.; Piekarczyk, B.; Pietrusewicz, T.; Psuj, G.

    2013-01-01

    The paper presents a software implementation of an Intelligent System for Radiogram Analysis (ISAR). The system is intended to support radiologists in weld quality inspection. The image processing part of the software, with a graphical user interface, and the weld classification part are described together with selected classification results. Classification was based on several algorithms: an artificial neural network, k-means clustering, a simplified k-means, and rough set theory.

  2. Student computer attitudes, experience and perceptions about the use of two software applications in Building Engineering

    NASA Astrophysics Data System (ADS)

    Chiner, Esther; Garcia-Vera, Victoria E.

    2017-11-01

    The purpose of this study was to examine students' computer attitudes and experience, as well as students' perceptions about the use of two specific software applications (Google Drive Spreadsheets and Arquimedes) in the Building Engineering context. The relationships among these variables were also examined. Ninety-two students took part in this study. Results suggest that students hold favourable computer attitudes. Moreover, a significant positive relationship was found between students' attitudes and their computer experience. Findings also show that students find the Arquimedes software more useful and of higher output quality than Google Drive Spreadsheets, while the latter is perceived to be easier to use. Regarding the relationship between students' attitudes towards the use of computers and their perceptions about the use of both software applications, a significant positive relationship was found only in the case of Arquimedes. Findings are discussed in terms of their implications for practice and further research.

  3. User-friendly technology to help family carers cope.

    PubMed

    Chambers, Mary; Connor, Samantha L

    2002-12-01

    Increases in the older adult population are occurring simultaneously with a growth in new technology. Modern technology presents an opportunity to enhance the quality of life and independence of older people and their family carers through communication and access to health care information. The aim was to evaluate the usability of a multimedia software application designed to provide family carers of the elderly or disabled with information, advice and psychological support to increase their coping capacity. The interactive application consisted of an information-based package that provided carers with advice on the promotion of psychological health, including relaxation and other coping strategies. The software application also included a carer self-assessment instrument, designed to provide both family and professional carers with information to assess how family carers were coping with their care-giving role. Usability evaluation was carried out in two stages. In the first stage (verification), user trials and an evaluation questionnaire were used to refine and develop the content and usability of the multimedia software application. In the second (demonstration) stage, evaluation questionnaires were used to appraise the usability of the modified software application. The findings showed that the majority of users found the software to be usable and informative. Some areas were highlighted for improvement in the navigation of the software. The authors conclude that with further refinement, the software application has the potential to offer information and support to those who are caring for the elderly and disabled at home.

  4. Development and validation of a novel large field of view phantom and a software module for the quality assurance of geometric distortion in magnetic resonance imaging.

    PubMed

    Torfeh, Tarraf; Hammoud, Rabih; McGarry, Maeve; Al-Hammadi, Noora; Perkins, Gregory

    2015-09-01

    The aim was to develop and validate a large field of view phantom and quality assurance software tool for the assessment and characterization of geometric distortion in MRI scanners commissioned for radiation therapy planning. A purpose-built phantom was developed consisting of 357 rods (6mm in diameter) of polymethyl methacrylate separated by 20mm intervals, providing a three-dimensional array of control points at known spatial locations covering a large field of view up to a diameter of 420mm. An in-house software module was developed to allow automatic geometric distortion assessment. This software module was validated against a virtual dataset of the phantom that reproduced the exact geometry of the physical phantom, but with known translational and rotational displacements and warping. For validation experiments, clinical MRI sequences were acquired with and without the application of a commercial 3D distortion correction algorithm (Gradwarp™). The software module was used to characterize and assess system-related geometric distortion in the sequences relative to a benchmark CT dataset, and the efficacy of the vendor geometric distortion correction algorithms (GDC) was also assessed. Results from the validation of the software against virtual images demonstrate the algorithm's ability to accurately calculate geometric distortion with sub-pixel precision by the extraction of rods and quantification of displacements. Geometric distortion was assessed for the typical sequences used in radiotherapy applications and over a clinically relevant 420mm field of view (FOV). As expected, distortion increased towards the edges of the FOV. For all assessed sequences, the vendor GDC was able to reduce the mean distortion to below 1mm over a field of view of 5, 10, 15 and 20cm radius respectively. Results from the application of the developed phantoms and algorithms demonstrate a high level of precision. The results indicate that this platform represents an important, robust and objective tool to perform routine quality assurance of MR-guided therapeutic applications, where spatial accuracy is paramount. Copyright © 2015 Elsevier Inc. All rights reserved.
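
    The core of such a distortion assessment is the comparison of detected control-point positions with their nominal locations. The NumPy sketch below computes per-rod displacement magnitudes and summarizes them by distance from the isocenter; the 20 mm nominal grid and the synthetic detected positions are illustrative assumptions, not the authors' implementation.

        # Geometric distortion summary from detected vs. nominal control points.
        import numpy as np

        # Hypothetical inputs: N x 3 arrays of rod positions in mm (same ordering).
        nominal = np.array([[x, y, 0.0] for x in range(-200, 201, 20)
                                        for y in range(-200, 201, 20)], dtype=float)
        detected = nominal + np.random.normal(scale=0.3, size=nominal.shape)  # stand-in data

        displacement = np.linalg.norm(detected - nominal, axis=1)   # mm per control point
        radius = np.linalg.norm(nominal, axis=1)                    # distance from isocenter

        for r_max in (50, 100, 150, 200):
            mask = radius <= r_max
            print(f"FOV radius <= {r_max} mm: mean {displacement[mask].mean():.2f} mm, "
                  f"max {displacement[mask].max():.2f} mm")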

  5. Application of newly developed Fluoro-QC software for image quality evaluation in cardiac X-ray systems.

    PubMed

    Oliveira, M; Lopez, G; Geambastiani, P; Ubeda, C

    2018-05-01

    A quality assurance (QA) program is a valuable tool for the continuous production of optimal quality images. The aim of this paper is to assess newly developed automatic computer software for image quality (IQ) evaluation in fluoroscopy X-ray systems. Test object images were acquired using one fluoroscopy system, Siemens Axiom Artis model (Siemens AG, Medical Solutions Erlangen, Germany). The software was developed as an ImageJ plugin. Two image quality parameters were assessed: high-contrast spatial resolution (HCSR) and signal-to-noise ratio (SNR). The times required for the manual and automatic image quality assessment procedures were compared. The paired t-test was used to assess the data. p Values of less than 0.05 were considered significant. The Fluoro-QC software generated faster IQ evaluation results (mean = 0.31 ± 0.08 min) than the manual procedure (mean = 4.68 ± 0.09 min). The mean difference between techniques was 4.36 min. Discrepancies were identified in the region of interest (ROI) areas drawn manually, with evidence of user dependence. The new software presented the results of two tests (HCSR = 3.06, SNR = 5.17) and also collected information from the DICOM header. Significant differences were not identified between manual and automatic measures of SNR (p value = 0.22) and HCSR (p value = 0.46). The Fluoro-QC software is a feasible, fast and free-to-use method for evaluating imaging quality parameters on fluoroscopy systems. Copyright © 2017 The College of Radiographers. Published by Elsevier Ltd. All rights reserved.
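
    One common way to compute an ROI-based SNR of the kind evaluated above is to divide the mean signal in a uniform region by the standard deviation of a background region. The sketch below shows this with NumPy on a synthetic image; the ROI coordinates, the synthetic data and this particular SNR definition are assumptions for illustration and may differ from the Fluoro-QC implementation.

        # Region-of-interest SNR for a fluoroscopy test-object image (one common
        # definition: mean signal in a uniform ROI divided by the standard
        # deviation of a background ROI). ROI coordinates are illustrative.
        import numpy as np

        def roi(image, row, col, size):
            """Square ROI of side 'size' pixels with top-left corner (row, col)."""
            return image[row:row + size, col:col + size]

        def snr(image, signal_roi, background_roi):
            signal = roi(image, *signal_roi).mean()
            noise = roi(image, *background_roi).std()
            return signal / noise

        # Example with synthetic data standing in for a DICOM pixel array:
        rng = np.random.default_rng(0)
        image = rng.normal(loc=100.0, scale=5.0, size=(512, 512))
        image[200:264, 200:264] += 50.0                      # simulated test-object insert
        print(f"SNR = {snr(image, (200, 200, 64), (10, 10, 64)):.2f}")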

  6. A Proposed Defect Tracking Model for Classifying the Inserted Defect Reports to Enhance Software Quality Control

    PubMed Central

    Khedr, Ayman E.; Sayed, Mostafa

    2013-01-01

    CONFLICT OF INTEREST: NONE DECLARED. Defect tracking systems play an important role in software development organizations, as they can store historical information about defects. There is much research on defect tracking models and systems to enhance their capabilities, make tracking more specific, and adapt them to new technology. Furthermore, there are several studies on classifying bugs in a step-by-step method so as to have a clear perception of, and an applicable method for, detecting such bugs. This paper presents a new proposed defect tracking model for classifying the inserted defect reports in a step-by-step method, to further enhance software quality. PMID:24039334

  7. Applying CASE Tools for On-Board Software Development

    NASA Astrophysics Data System (ADS)

    Brammer, U.; Hönle, A.

    For many space projects the software development is facing great pressure with respect to quality, costs and schedule. One way to cope with these challenges is the application of CASE tools for automatic generation of code and documentation. This paper describes two CASE tools: Rhapsody (I-Logix) featuring UML and ISG (BSSE) that provides modeling of finite state machines. Both tools have been used at Kayser-Threde in different space projects for the development of on-board software. The tools are discussed with regard to the full software development cycle.

  8. Advanced fingerprint verification software

    NASA Astrophysics Data System (ADS)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  9. NASA guidelines for assuring the adequacy and appropriateness of security safeguards in sensitive applications

    NASA Technical Reports Server (NTRS)

    Tompkins, F. G.

    1984-01-01

    The Office of Management and Budget (OMB) Circular A-71, transmittal Memorandum No. 1, requires that each agency establish a management control process to assure that appropriate administrative, physical and technical safeguards are incorporated into all new computer applications. In addition to security specifications, the management control process should assure that the safeguards are adequate for the application. The security activities that should be integral to the system development process are examined. The software quality assurance process to assure that adequate and appropriate controls are incorporated into sensitive applications is also examined. Security for software packages is also discussed.

  10. Developing a smartphone software package for predicting atmospheric pollutant concentrations at mobile locations

    PubMed Central

    Larkin, Andrew; Williams, David E.; Kile, Molly L.; Baird, William M.

    2014-01-01

    Background There is considerable evidence that exposure to air pollution is harmful to health. In the U.S., ambient air quality is monitored by Federal and State agencies for regulatory purposes. There are limited options, however, for people to access this data in real-time which hinders an individual's ability to manage their own risks. This paper describes a new software package that models environmental concentrations of fine particulate matter (PM2.5), coarse particulate matter (PM10), and ozone concentrations for the state of Oregon and calculates personal health risks at the smartphone's current location. Predicted air pollution risk levels can be displayed on mobile devices as interactive maps and graphs color-coded to coincide with EPA air quality index (AQI) categories. Users have the option of setting air quality warning levels via color-coded bars and were notified whenever warning levels were exceeded by predicted levels within 10 km. We validated the software using data from participants as well as from simulations which showed that the application was capable of identifying spatial and temporal air quality trends. This unique application provides a potential low-cost technology for reducing personal exposure to air pollution which can improve quality of life particularly for people with health conditions, such as asthma, that make them more susceptible to these hazards. PMID:26146409

  11. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  12. A NASA family of minicomputer systems, Appendix A

    NASA Technical Reports Server (NTRS)

    Deregt, M. P.; Dulfer, J. E.

    1972-01-01

    This investigation was undertaken to establish sufficient specifications, or standards, for minicomputer hardware and software to provide NASA with realizable economies in quantity purchases, interchangeability of minicomputers, software, storage and peripherals, and a uniformly high quality. The standards will define minicomputer system component types, each specialized to its intended NASA application, in as many levels of capacity as required.

  13. EduSpeak[R]: A Speech Recognition and Pronunciation Scoring Toolkit for Computer-Aided Language Learning Applications

    ERIC Educational Resources Information Center

    Franco, Horacio; Bratt, Harry; Rossier, Romain; Rao Gadde, Venkata; Shriberg, Elizabeth; Abrash, Victor; Precoda, Kristin

    2010-01-01

    SRI International's EduSpeak[R] system is a software development toolkit that enables developers of interactive language education software to use state-of-the-art speech recognition and pronunciation scoring technology. Automatic pronunciation scoring allows the computer to provide feedback on the overall quality of pronunciation and to point to…

  14. Flexible and Low-Cost Measurements for Space Software Development- The Measurements Exploration Framework

    NASA Astrophysics Data System (ADS)

    Marculescu, Bogdan; Feldt, Robert; Torkar, Richard; Green, Lars-Goran; Liljegren, Thomas; Hult, Erika

    2011-08-01

    Verification and validation is an important part of software development and accounts for significant amounts of the costs associated with such a project. For developers of life- or mission-critical systems, such as software being developed for space applications, a balance must be reached between ensuring the quality of the system by extensive and rigorous testing and reducing costs and allowing the company to compete. Ensuring the quality of any system starts with a quality development process. To evaluate both the software development process and the product itself, measurements are needed. A balance must then be struck between ensuring the best possible quality of both process and product on the one hand, and reducing the cost of performing the measurements on the other. A number of measurements have already been defined and are being used. For some of these, data collection can be automated as well, further lowering costs associated with implementing them. In practice, however, there may be situations where existing measurements are unsuitable for a variety of reasons. This paper describes a framework for creating low-cost, flexible measurements in areas where initial information is scarce. The framework, called the Measurements Exploration Framework, is aimed in particular at the space software development industry and was developed in such an environment.

  15. An experience of qualified preventive screening: shiraz smart screening software.

    PubMed

    Islami Parkoohi, Parisa; Zare, Hashem; Abdollahifard, Gholamreza

    2015-01-01

    Computerized preventive screening software is a cost-effective intervention tool to address non-communicable chronic diseases. Shiraz Smart Screening Software (SSSS) was developed as an innovative tool for qualified screening. It allows simultaneous smart screening of several high-burden chronic diseases and supports reminder notification functionality. The extent to which SSSS affects screening quality is also described. Following software development, preventive screening and annual health examinations of 261 school staff (Medical School of Shiraz, Iran) were carried out in a software-assisted manner. To evaluate the quality of the software-assisted screening, we used a quasi-experimental study design and determined coverage, irregular attendance and inappropriateness proportions in relation to the manual and software-assisted screening, as well as the corresponding number of requested tests. With the manual screening method, 27% of employees were covered (with 94% irregular attendance), while with software-assisted screening, the coverage proportion was 79% (attendance status will become clear after the specified period). The frequency of inappropriate screening test requests, before the software implementation, was 41.37% for fasting plasma glucose, 41.37% for lipid profile, 0.84% for occult blood, 0.19% for flexible sigmoidoscopy/colonoscopy, 35.29% for Pap smear, 19.20% for mammography and 11.2% for prostate specific antigen. All of the above were corrected by the software application. In total, 366 manual screening and 334 software-assisted screening tests were requested. SSSS is an innovative tool to improve the quality of preventive screening plans in terms of increased screening coverage, reduced inappropriateness and a lower total number of requested tests.

  16. First CLIPS Conference Proceedings, volume 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The proceedings of the first Conference on the C Language Integrated Production System (CLIPS), hosted by the NASA Lyndon B. Johnson Space Center in August 1990, are presented. Articles cover engineering applications, intelligent tutors and training, intelligent software engineering, automated knowledge acquisition, network applications, verification and validation, enhancements to CLIPS, space shuttle quality control/diagnosis applications, space shuttle and real-time applications, and medical, biological, and agricultural applications.

  17. Requirement Metrics for Risk Identification

    NASA Technical Reports Server (NTRS)

    Hammer, Theodore; Huffman, Lenore; Wilson, William; Rosenberg, Linda; Hyatt, Lawrence

    1996-01-01

    The Software Assurance Technology Center (SATC) is part of the Office of Mission Assurance of the Goddard Space Flight Center (GSFC). The SATC's mission is to assist National Aeronautics and Space Administration (NASA) projects to improve the quality of software which they acquire or develop. The SATC's efforts are currently focused on the development and use of metric methodologies and tools that identify and assess risks associated with software performance and scheduled delivery. This starts at the requirements phase, where the SATC, in conjunction with software projects at GSFC and other NASA centers is working to identify tools and metric methodologies to assist project managers in identifying and mitigating risks. This paper discusses requirement metrics currently being used at NASA in a collaborative effort between the SATC and the Quality Assurance Office at GSFC to utilize the information available through the application of requirements management tools.

  18. Database Performance Monitoring for the Photovoltaic Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Klise, Katherine A.

    The Database Performance Monitoring (DPM) software (copyright in process) is being developed at Sandia National Laboratories to perform quality control analysis on time series data. The software loads time-indexed databases (currently csv format), performs a series of quality control tests defined by the user, and creates reports which include summary statistics, tables, and graphics. DPM can be set up to run on an automated schedule defined by the user. For example, the software can be run once per day to analyze data collected on the previous day. HTML formatted reports can be sent via email or hosted on a website. To compare performance of several databases, summary statistics and graphics can be gathered in a dashboard view which links to detailed reporting information for each database. The software can be customized for specific applications.
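
    A minimal flavor of the kind of automated check such software performs on time-indexed CSV data can be sketched with pandas; the column names, expected sampling interval and validity range below are assumptions for illustration, not the actual DPM configuration.

        # Minimal time-series quality-control pass over a time-indexed CSV file.
        # Column names, sampling interval and valid range are illustrative assumptions.
        import pandas as pd

        def qc_report(csv_path, column="dc_power", freq="1min", valid=(0.0, 5000.0)):
            df = pd.read_csv(csv_path, parse_dates=["timestamp"], index_col="timestamp")

            # Test 1: missing timestamps relative to the expected sampling frequency.
            expected = pd.date_range(df.index.min(), df.index.max(), freq=freq)
            missing = expected.difference(df.index)

            # Test 2: values outside the physically plausible range.
            out_of_range = df[(df[column] < valid[0]) | (df[column] > valid[1])]

            return {
                "records": len(df),
                "missing timestamps": len(missing),
                "out-of-range values": len(out_of_range),
                "mean": df[column].mean(),
            }

        # Example (hypothetical file): print(qc_report("inverter_a.csv"))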

  19. Software Design Improvements. Part 1; Software Benefits and Limitations

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology should be applied to large and small software products to improve their design, and how can software be verified?

  20. Measuring the complexity of design in real-time imaging software

    NASA Astrophysics Data System (ADS)

    Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.

    2007-02-01

    Due to the intricacies in the algorithms involved, the design of imaging software is considered to be more complex than non-image processing software (Sangwan et al, 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and nonimage processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provides a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
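
    As one concrete instance of the complexity metrics discussed above, the sketch below approximates McCabe's cyclomatic complexity for Python functions by counting decision points in the abstract syntax tree; it is a simplified stand-in for purpose-built measurement tools, not a reimplementation of any of them.

        # Approximate McCabe cyclomatic complexity per function: 1 + number of
        # decision points (branches, loops, boolean operators, exception handlers).
        import ast

        DECISION_NODES = (ast.If, ast.For, ast.While, ast.IfExp, ast.ExceptHandler)

        def cyclomatic_complexity(func_node):
            complexity = 1
            for node in ast.walk(func_node):
                if isinstance(node, DECISION_NODES):
                    complexity += 1
                elif isinstance(node, ast.BoolOp):
                    complexity += len(node.values) - 1   # each extra and/or adds a path
            return complexity

        def report(source_code):
            tree = ast.parse(source_code)
            for node in ast.walk(tree):
                if isinstance(node, (ast.FunctionDef, ast.AsyncFunctionDef)):
                    print(f"{node.name}: {cyclomatic_complexity(node)}")

        report("""
        def classify(x):
            if x < 0 or x > 255:
                return "out of range"
            for _ in range(3):
                if x % 2:
                    x += 1
            return x
        """)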

  1. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  2. Generating Safety-Critical PLC Code From a High-Level Application Software Specification

    NASA Technical Reports Server (NTRS)

    2008-01-01

    The benefits of automatic-application code generation are widely accepted within the software engineering community. These benefits include raised abstraction level of application programming, shorter product development time, lower maintenance costs, and increased code quality and consistency. Surprisingly, code generation concepts have not yet found wide acceptance and use in the field of programmable logic controller (PLC) software development. Software engineers at Kennedy Space Center recognized the need for PLC code generation while developing the new ground checkout and launch processing system, called the Launch Control System (LCS). Engineers developed a process and a prototype software tool that automatically translates a high-level representation or specification of application software into ladder logic that executes on a PLC. All the computer hardware in the LCS is planned to be commercial off the shelf (COTS), including industrial controllers or PLCs that are connected to the sensors and end items out in the field. Most of the software in LCS is also planned to be COTS, with only small adapter software modules that must be developed in order to interface between the various COTS software products. A domain-specific language (DSL) is a programming language designed to perform tasks and to solve problems in a particular domain, such as ground processing of launch vehicles. The LCS engineers created a DSL for developing test sequences of ground checkout and launch operations of future launch vehicle and spacecraft elements, and they are developing a tabular specification format that uses the DSL keywords and functions familiar to the ground and flight system users. The tabular specification format, or tabular spec, allows most ground and flight system users to document how the application software is intended to function and requires little or no software programming knowledge or experience. A small sample from a prototype tabular spec application is shown.
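
    The sketch below illustrates, in Python, the general idea of translating a tabular specification into PLC-style structured text; the keywords, column layout, and output templates are invented placeholders, not the actual KSC domain-specific language or ladder logic generator.

```python
# Hypothetical sketch of translating a tabular test-sequence specification into
# PLC-style structured text; keywords, end-item names, and templates are invented.
TABULAR_SPEC = [
    # (step, keyword,          arguments)
    (1, "SET_DISCRETE",  {"end_item": "LOX_VENT_VALVE", "state": "OPEN"}),
    (2, "WAIT_SECONDS",  {"duration": 5}),
    (3, "VERIFY_ANALOG", {"sensor": "LOX_TANK_PRESS", "max": 35.0}),
]

TEMPLATES = {
    "SET_DISCRETE":  "{end_item} := {state};",
    "WAIT_SECONDS":  "TON_step{step}(IN := TRUE, PT := T#{duration}s);",
    "VERIFY_ANALOG": "IF {sensor} > {max} THEN fault_flag := TRUE; END_IF;",
}

def generate_structured_text(spec):
    lines = ["(* auto-generated from tabular spec *)"]
    for step, keyword, args in spec:
        lines.append(f"(* step {step} *)")
        lines.append(TEMPLATES[keyword].format(step=step, **args))
    return "\n".join(lines)

print(generate_structured_text(TABULAR_SPEC))
```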

  3. Managing quality and compliance.

    PubMed

    McNeil, Alice; Koppel, Carl

    2015-01-01

    Critical care nurses assume vital roles in maintaining patient care quality. There are distinct facets to the process including standard setting, regulatory compliance, and completion of reports associated with these endeavors. Typically, multiple niche software applications are required and user interfaces are varied and complex. Although there are distinct quality indicators that must be tracked as well as a list of serious or sentinel events that must be documented and reported, nurses may not know the precise steps to ensure that information is properly documented and actually reaches the proper authorities for further investigation and follow-up actions. Technology advances have permitted the evolution of a singular software platform, capable of monitoring quality indicators and managing all facets of reporting associated with regulatory compliance.

  4. An experiment in software reliability: Additional analyses using data from automated replications

    NASA Technical Reports Server (NTRS)

    Dunham, Janet R.; Lauterbach, Linda A.

    1988-01-01

    A study undertaken to collect software error data of laboratory quality for use in the development of credible methods for predicting the reliability of software used in life-critical applications is summarized. The software error data reported were acquired through automated repetitive-run testing of three independent implementations of a launch interceptor condition module of a radar tracking problem. The results are based on 100 test applications, accumulating a sufficient sample size for error rate estimation. The data collected are used to confirm the results of two Boeing studies reported in NASA-CR-165836, Software Reliability: Repetitive Run Experimentation and Modeling, and NASA-CR-172378, Software Reliability: Additional Investigations into Modeling With Replicated Experiments, respectively. That is, the results confirm the log-linear pattern of software error rates and reject the hypothesis of equal error rates per individual fault. This rejection casts doubt on the assumption that a program's failure rate is a constant multiple of the number of residual bugs, an assumption which underlies some of the current models of software reliability. The data also raise new questions concerning the phenomenon of interacting faults.

  5. A survey of quality assurance practices in biomedical open source software projects.

    PubMed

    Koru, Günes; El Emam, Khaled; Neisa, Angelica; Umarji, Medha

    2007-05-07

    Open source (OS) software is continuously gaining recognition and use in the biomedical domain, for example, in health informatics and bioinformatics. Given the mission critical nature of applications in this domain and their potential impact on patient safety, it is important to understand to what degree and how effectively biomedical OS developers perform standard quality assurance (QA) activities such as peer reviews and testing. This would allow the users of biomedical OS software to better understand the quality risks, if any, and the developers to identify process improvement opportunities to produce higher quality software. A survey of developers working on biomedical OS projects was conducted to examine the QA activities that are performed. We took a descriptive approach to summarize the implementation of QA activities and then examined some of the factors that may be related to the implementation of such practices. Our descriptive results show that 63% (95% CI, 54-72) of projects did not include peer reviews in their development process, while 82% (95% CI, 75-89) did include testing. Approximately 74% (95% CI, 67-81) of developers did not have a background in computing, 80% (95% CI, 74-87) were paid for their contributions to the project, and 52% (95% CI, 43-60) had PhDs. A multivariate logistic regression model to predict the implementation of peer reviews was not significant (likelihood ratio test = 16.86, 9 df, P = .051) and neither was a model to predict the implementation of testing (likelihood ratio test = 3.34, 9 df, P = .95). Less attention is paid to peer review than testing. However, the former is a complementary, and necessary, QA practice rather than an alternative. Therefore, one can argue that there are quality risks, at least at this point in time, in transitioning biomedical OS software into any critical settings that may have operational, financial, or safety implications. Developers of biomedical OS applications should invest more effort in implementing systemic peer review practices throughout the development and maintenance processes.
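
    For readers wanting to reproduce the style of interval reported above, the sketch below computes a normal-approximation 95% confidence interval for a proportion; the sample size used is an assumption for illustration, and the paper's exact interval method is not stated here.

```python
# Hedged sketch: normal-approximation 95% confidence interval for a reported
# proportion, e.g. the 63% of projects without peer review (illustrative only).
import math

def proportion_ci(p_hat: float, n: int, z: float = 1.96):
    se = math.sqrt(p_hat * (1 - p_hat) / n)
    return p_hat - z * se, p_hat + z * se

# An assumed sample of roughly 110 respondents reproduces an interval near 54-72%
low, high = proportion_ci(0.63, 110)
print(f"{low:.2%} - {high:.2%}")
```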

  6. Software Fault Tolerance: A Tutorial

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2000-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
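
    The sketch below gives a hedged, minimal illustration of two of the techniques named above (a recovery block and N-version majority voting); the function names, acceptance test, and example versions are invented.

```python
# Hedged illustration of two classic fault-tolerance schemes: a recovery block
# (primary + alternate guarded by an acceptance test) and N-version voting.
from collections import Counter

def recovery_block(inputs, primary, alternate, acceptance_test):
    """Run the primary version; fall back to the alternate if its output fails."""
    result = primary(inputs)
    if acceptance_test(result):
        return result
    return alternate(inputs)          # alternate tried only after primary is rejected

def n_version_vote(inputs, versions):
    """Run independently developed versions and return the majority output."""
    outputs = [v(inputs) for v in versions]
    value, count = Counter(outputs).most_common(1)[0]
    if count < 2:
        raise RuntimeError("no majority among versions")
    return value

# N-version example: three 'independently developed' square-root routines
versions = [lambda x: round(x ** 0.5, 6),
            lambda x: round(pow(x, 0.5), 6),
            lambda x: 0.0]            # a faulty version is out-voted
print(n_version_vote(2.0, versions))  # -> 1.414214

# Recovery block example: a (deliberately poor) primary and a safe alternate
approx = recovery_block(
    2.0,
    primary=lambda x: x * 0.7,
    alternate=lambda x: x ** 0.5,
    acceptance_test=lambda r: abs(r * r - 2.0) < 1e-3,
)
print(approx)                         # -> result from the alternate version
```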

  7. Model-Driven Development of Interactive Multimedia Applications with MML

    NASA Astrophysics Data System (ADS)

    Pleuss, Andreas; Hussmann, Heinrich

    There is an increasing demand for high-quality interactive applications which combine complex application logic with a sophisticated user interface, making use of individual media objects like graphics, animations, 3D graphics, audio or video. Their development is still challenging as it requires the integration of software design, user interface design, and media design.

  8. Web Application Software for Ground Operations Planning Database (GOPDb) Management

    NASA Technical Reports Server (NTRS)

    Lanham, Clifton; Kallner, Shawn; Gernand, Jeffrey

    2013-01-01

    A Web application facilitates collaborative development of the ground operations planning document. This will reduce costs and development time for new programs by incorporating the data governance, access control, and revision tracking of the ground operations planning data. Ground Operations Planning requires the creation and maintenance of detailed timelines and documentation. The GOPDb Web application was created using state-of-the-art Web 2.0 technologies, and was deployed as SaaS (Software as a Service), with an emphasis on data governance and security needs. Application access is managed using two-factor authentication, with data write permissions tied to user roles and responsibilities. Multiple instances of the application can be deployed on a Web server to meet the robust needs for multiple, future programs with minimal additional cost. This innovation features high availability and scalability, with no additional software that needs to be bought or installed. For data governance and security (data quality, management, business process management, and risk management for data handling), the software uses NAMS. No local copy/cloning of data is permitted. Data change log/tracking is addressed, as well as collaboration, work flow, and process standardization. The software provides on-line documentation and detailed Web-based help. There are multiple ways that this software can be deployed on a Web server to meet ground operations planning needs for future programs. The software could be used to support commercial crew ground operations planning, as well as commercial payload/satellite ground operations planning. The application source code and database schema are owned by NASA.

  9. Software quality assurance | News

    Science.gov Websites

    Measure was removed: "Sufficient level of detail in the requirements to develop test cases." ; This control measure was removed since the sufficient level of detail needed to develop test cases is recorded for all test cases. (Note: This is mandatory for applications graded with a High Quality Assurance

  10. Estimation of water quality parameters of inland and coastal waters with the use of a toolkit for processing of remote sensing data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dekker, A.G.; Hoogenboom, H.J.; Rijkeboer, M.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air/water interface correction, and application of water quality algorithms. A prototype software environment has recently been developed that enables the user to perform and control these processing steps. The main parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code for removing atmospheric and air-water interface influences, (ii) a tool for analyzing algorithms for estimating water quality, and (iii) a spectral database containing apparent and inherent optical properties and associated water quality parameters. The use of the software is illustrated by applying implemented algorithms for estimating chlorophyll to data from a spectral library of Dutch inland waters with CHL ranging from 1 to 500 µg l⁻¹. The algorithms currently implemented in the Toolkit software are recommended for optically simple waters, but for optically complex waters the development of more advanced retrieval methods is required.
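
    As an illustration of the kind of water quality algorithm such a toolkit applies, the sketch below implements a generic band-ratio chlorophyll estimator; the band positions and linear coefficients are placeholders, not the calibrated values used for Dutch inland waters.

```python
# Hedged sketch of a generic band-ratio chlorophyll algorithm; the coefficients
# below are illustrative placeholders, not calibrated retrieval parameters.
def chlorophyll_from_ratio(r_705, r_675, a=100.0, b=-20.0):
    """Estimate chlorophyll-a (ug/l) from a near-infrared/red reflectance ratio."""
    return a * (r_705 / r_675) + b

# Example spectrum values (unitless subsurface irradiance reflectances)
print(chlorophyll_from_ratio(r_705=0.045, r_675=0.030))  # -> 130.0 ug/l
```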

  11. Video Image Stabilization and Registration (VISAR) Software

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Two scientists at NASA's Marshall Space Flight Center, atmospheric scientist Paul Meyer and solar physicist Dr. David Hathaway, developed promising new software, called Video Image Stabilization and Registration (VISAR). VISAR may help law enforcement agencies catch criminals by improving the quality of video recorded at crime scenes. In this photograph, the single frame at left, taken at night, was brightened in order to enhance details and reduce noise or snow. To further overcome the video defects in one frame, law enforcement officials can use VISAR software to add information from multiple frames to reveal a person. Images from less than a second of videotape were added together to create the clarified image at right. VISAR stabilizes camera motion in the horizontal and vertical directions as well as rotation and zoom effects, producing clearer images of moving objects; it also smoothes jagged edges, enhances still images, and reduces video noise or snow. VISAR could also have applications in medical and meteorological imaging. It could steady images of ultrasounds, which are infamous for their grainy, blurred quality. The software can be used for defense applications by improving reconnaissance video imagery made by military vehicles, aircraft, and ships traveling in harsh, rugged environments.

  12. Domain specific software architectures: Command and control

    NASA Technical Reports Server (NTRS)

    Braun, Christine; Hatch, William; Ruegsegger, Theodore; Balzer, Bob; Feather, Martin; Goldman, Neil; Wile, Dave

    1992-01-01

    GTE is the Command and Control contractor for the Domain Specific Software Architectures program. The objective of this program is to develop and demonstrate an architecture-driven, component-based capability for the automated generation of command and control (C2) applications. Such a capability will significantly reduce the cost of C2 applications development and will lead to improved system quality and reliability through the use of proven architectures and components. A major focus of GTE's approach is the automated generation of application components in particular subdomains. Our initial work in this area has concentrated in the message handling subdomain; we have defined and prototyped an approach that can automate one of the most software-intensive parts of C2 systems development. This paper provides an overview of the GTE team's DSSA approach and then presents our work on automated support for message processing.

  13. Monitoring Cyanobacteria with Satellites Webinar

    EPA Pesticide Factsheets

    Real-world satellite applications can quantify cyanobacterial harmful algal blooms and related water quality parameters. Provisional satellite-derived cyanobacteria data and different software tools are available to state environmental and health agencies.

  14. CCSDS Time-Critical Onboard Networking Service

    NASA Technical Reports Server (NTRS)

    Parkes, Steve; Schnurr, Rick; Marquart, Jane; Menke, Greg; Ciccone, Massimiliano

    2006-01-01

    The Consultative Committee for Space Data Systems (CCSDS) is developing recommendations for communication services onboard spacecraft. Today many different communication buses are used on spacecraft, requiring software with the same basic functionality to be rewritten for each type of bus. This impacts the application software, resulting in custom software for almost every new mission. The Spacecraft Onboard Interface Services (SOIS) working group aims to provide a consistent interface to various onboard buses and sub-networks, enabling a common interface to the application software. The eventual goal is reusable software that can be easily ported to new missions and run on a range of onboard buses without substantial modification. The system engineer will then be able to select a bus based on its performance, power, etc., and be confident that a particular choice of bus will not place excessive demands on software development. This paper describes the SOIS Intra-Networking Service, which is designed to enable data transfer and multiplexing of a variety of internetworking protocols with a range of quality of service support, over underlying heterogeneous data links. The Intra-network service interface provides users with a common Quality of Service interface when transporting data across a variety of underlying data links. Supported Quality of Service (QoS) elements include: Priority, Resource Reservation, and Retry/Redundancy. These three QoS elements combine and map into four TCONS services for onboard data communications: Best Effort, Assured, Reserved, and Guaranteed. Data to be transported is passed to the Intra-network service with a requested QoS. The requested QoS includes the type of service, priority and, where appropriate, a channel identifier. The data is de-multiplexed, prioritized, and the required resources for transport are allocated. The data is then passed to the appropriate data link for transfer across the bus. The SOIS-supported data links may inherently provide the quality of service support requested by the intra-network layer. In the case where the data link does not have the required level of support, the missing functionality is added by SOIS. As a result of this architecture, reusable software applications can be designed and used across missions, thereby promoting common mission operations. In addition, the protocol multiplexing function enables the blending of multiple onboard networks. This paper starts by giving an overview of the SOIS architecture in section II, illustrating where the TCONS services fit into the overall architecture. It then describes the quality of service approach adopted, in section III. The prototyping efforts that have been going on are introduced in section IV. Finally, in section V the current status of the CCSDS recommendations is summarized.
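
    The sketch below is a hedged illustration of how the three QoS elements might map onto the four TCONS service classes; the data structures and mapping logic are invented for illustration and are not taken from the CCSDS recommendation itself.

```python
# Hedged sketch: combining priority, resource reservation, and retry/redundancy
# into the four service classes named above (illustrative mapping only).
from dataclasses import dataclass
from enum import Enum
from typing import Optional

class ServiceClass(Enum):
    BEST_EFFORT = "Best Effort"
    ASSURED = "Assured"          # retry/redundancy without reserved resources
    RESERVED = "Reserved"        # reserved resources without retry
    GUARANTEED = "Guaranteed"    # reserved resources and retry/redundancy

@dataclass
class QosRequest:
    priority: int
    reserve_resources: bool = False
    retry: bool = False
    channel_id: Optional[int] = None   # used where a reserved channel applies

def select_service(req: QosRequest) -> ServiceClass:
    if req.reserve_resources and req.retry:
        return ServiceClass.GUARANTEED
    if req.reserve_resources:
        return ServiceClass.RESERVED
    if req.retry:
        return ServiceClass.ASSURED
    return ServiceClass.BEST_EFFORT

print(select_service(QosRequest(priority=3, reserve_resources=True, retry=True)))
# -> ServiceClass.GUARANTEED
```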

  15. A multiparametric automatic method to monitor long-term reproducibility in digital mammography: results from a regional screening programme.

    PubMed

    Gennaro, G; Ballaminut, A; Contento, G

    2017-09-01

    This study aims to illustrate a multiparametric automatic method for monitoring the long-term reproducibility of digital mammography systems, and its application on a large scale. Twenty-five digital mammography systems employed within a regional screening programme were controlled weekly using the same type of phantom, whose images were analysed by an automatic software tool. To assess system reproducibility levels, 15 image quality indices (IQIs) were extracted and compared with the corresponding indices previously determined by a baseline procedure. The coefficients of variation (COVs) of the IQIs were used to assess the overall variability. A total of 2553 phantom images were collected from the 25 digital mammography systems from March 2013 to December 2014. Most of the systems showed excellent image quality reproducibility over the surveillance interval, with mean variability below 5%. Variability of each IQI was below 5%, with the exception of one index associated with the smallest phantom objects (0.25 mm), which was below 10%. The method applied for the reproducibility tests (multi-detail phantoms, a cloud-based automatic software tool to measure multiple image quality indices, and statistical process control) was proven to be effective and applicable on a large scale and to any type of digital mammography system. • Reproducibility of mammography image quality should be monitored by appropriate quality controls. • Use of automatic software tools allows image quality evaluation by multiple indices. • System reproducibility can be assessed by comparing the current index value with baseline data. • Overall system reproducibility of modern digital mammography systems is excellent. • The method proposed and applied is cost-effective and easily scalable.
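
    A minimal sketch of the coefficient-of-variation check underlying such reproducibility monitoring is shown below; the weekly index values and the 5% action level are illustrative assumptions, not values from the study.

```python
# Hedged sketch of a coefficient-of-variation check for one image quality index
# tracked over time; sample values and the action level are illustrative.
import statistics

def coefficient_of_variation(values):
    return statistics.stdev(values) / statistics.mean(values)

weekly_cnr = [1.52, 1.49, 1.55, 1.50, 1.47, 1.53]   # one IQI tracked over weeks
cov = coefficient_of_variation(weekly_cnr)
print(f"COV = {cov:.1%}")                            # ~2%, well under a 5% limit
if cov > 0.05:
    print("reproducibility outside tolerance - investigate the system")
```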

  16. Data quality can make or break a research infrastructure

    NASA Astrophysics Data System (ADS)

    Pastorello, G.; Gunter, D.; Chu, H.; Christianson, D. S.; Trotta, C.; Canfora, E.; Faybishenko, B.; Cheah, Y. W.; Beekwilder, N.; Chan, S.; Dengel, S.; Keenan, T. F.; O'Brien, F.; Elbashandy, A.; Poindexter, C.; Humphrey, M.; Papale, D.; Agarwal, D.

    2017-12-01

    Research infrastructures (RIs) commonly support observational data provided by multiple, independent sources. Uniformity in the data distributed by such RIs is important in most applications, e.g., in comparative studies using data from two or more sources. Achieving uniformity in terms of data quality is challenging, especially considering that many data issues are unpredictable and cannot be detected until a first occurrence of the issue. As a result, many data quality control activities within RIs require a manual, human-in-the-loop element, making quality control an expensive activity. Our motivating example is the FLUXNET2015 dataset - a collection of ecosystem-level carbon, water, and energy fluxes between land and atmosphere from over 200 sites around the world, some sites with over 20 years of data. About 90% of the human effort to create the dataset was spent in data quality related activities. Based on this experience, we have been working on solutions to increase the automation of data quality control procedures. Since it is nearly impossible to fully automate all quality related checks, we have been drawing from the experience with techniques used in software development, which shares a few common constraints. In both managing scientific data and writing software, human time is a precious resource; code bases, like science datasets, can be large, complex, and full of errors; both scientific and software endeavors can be pursued by individuals, but collaborative teams can accomplish a lot more. The lucrative and fast-paced nature of the software industry fueled the creation of methods and tools to increase automation and productivity within these constraints. Issue tracking systems, methods for translating problems into automated tests, and powerful version control tools are a few examples. Terrestrial and aquatic ecosystems research relies heavily on many types of observational data. As the volume of data collection increases, ensuring data quality is becoming an unwieldy challenge for RIs. Business-as-usual approaches to data quality do not work with larger data volumes. We believe RIs can benefit greatly from adapting and imitating this body of theory and practice from software quality into data quality, enabling systematic and reproducible safeguards against errors and mistakes in datasets as much as in software.
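
    The sketch below shows, under invented variable names and physical limits, how a data quality rule can be expressed as an automated test in the spirit of the software practices described above.

```python
# Hedged sketch of treating a data quality rule as an automated test; the column
# names, limits, and input file are invented for illustration.
import pandas as pd

def check_flux_ranges(df: pd.DataFrame) -> list:
    """Return a list of data quality issues found in a site's half-hourly records."""
    issues = []
    if df["timestamp"].duplicated().any():
        issues.append("duplicate timestamps")
    if not df["air_temp_c"].between(-60, 60).all():
        issues.append("air temperature outside plausible range")
    if df["co2_flux"].isna().mean() > 0.2:
        issues.append("more than 20% of CO2 flux values missing")
    return issues

def test_site_file_passes_quality_checks():
    df = pd.read_csv("site_halfhourly.csv")   # hypothetical input file
    assert check_flux_ranges(df) == []
```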

  17. 76 FR 15004 - Proposed Collection; Comment Request

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-18

    ... computer software-based models or applications, termed under the rule as ``interactive websites.'' These... information; (c) Ways to enhance the quality, utility, and clarity of the information collected; and (d) Ways...

  18. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  19. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  20. Record of Assessment Moderation Practice (RAMP): Survey Software as a Mechanism of Continuous Quality Improvement

    ERIC Educational Resources Information Center

    Johnson, Genevieve Marie

    2015-01-01

    In higher education, assessment integrity is pivotal to student learning and satisfaction, and, therefore, a particularly important target of continuous quality improvement. This paper reports on the preliminary development and application of a process of recording and analysing current assessment moderation practices, with the aim of identifying…

  1. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  2. Towards a mature measurement environment: Creating a software engineering research environment

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1990-01-01

    Software engineering researchers are building tools and defining methods and models; however, there are problems with the nature and style of the research. The research is typically bottom-up, done in isolation, so the pieces cannot be easily logically or physically integrated. A great deal of the research is essentially the packaging of a particular piece of technology with little indication of how the work would be integrated with other pieces of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation or refined and tailored to the application environment. Thus, they cannot be easily transferred into practice. Because of these limitations we have not been able to understand the components of the discipline as a coherent whole and the relationships between various models of the process and product. What is needed is a top-down, experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory that is associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view, and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.

  3. Poster - Thur Eve - 76: A quality control to achieve planning consistency in arc radiotherapy of the prostate.

    PubMed

    Zeng, G; Murphy, J; Annis, S-L; Wu, X; Wang, Y; McGowan, T; Macpherson, M

    2012-07-01

    To report a quality control program in prostate radiation therapy at our center that includes a semi-automated planning process to generate high quality plans and in-house software to track plan quality in the subsequent clinical application. Arc planning in Eclipse v10.0 was performed for both intact prostate and post-prostatectomy treatments. The planning focuses on DVH requirements and dose distributions being able to tolerate daily setup variations. A modified structure set is used to standardize the optimization, including a short rectum and bladder in the fields to effectively tighten dose to the target, and a rectum expansion with 1 cm cropped from the PTV to block dose and shape posterior isodose lines. Structure, plan, and optimization templates are used to streamline plan generation. DVH files are exported from Eclipse to quality tracking software with a GUI written in Matlab that can report the dose-volume data either for an individual patient or over a patient population. For 100 intact prostate patients treated with 78 Gy, rectal D50, D25, D15 and D5 are 30.1±6.2 Gy, 50.6±7.9 Gy, 65.9±6.0 Gy and 76.6±1.4 Gy respectively, well below the limits of 50 Gy, 65 Gy, 75 Gy and 78 Gy respectively. For the prostate bed with a prescription of 66 Gy, rectal D50 is 35.9±6.9 Gy. In both sites, the PTV is covered by 95% of the prescription and the hotspots are less than 5%. The semi-automated planning method can efficiently create high quality plans while the tracking software can monitor the feedback from clinical application. It is a comprehensive and robust quality control program in radiation therapy. © 2012 American Association of Physicists in Medicine.
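
    A hedged sketch of the kind of dose-volume constraint check such a tracking tool performs is shown below, in Python rather than the Matlab GUI described; the limits are those quoted in the abstract, and the DVH metric extraction is assumed to have been done already.

```python
# Hedged sketch: check a patient's rectal DVH metrics (in Gy) against the
# planning limits quoted in the abstract for the 78 Gy intact prostate case.
RECTUM_LIMITS_78GY = {"D50": 50.0, "D25": 65.0, "D15": 75.0, "D5": 78.0}

def check_rectum_dvh(dvh_metrics: dict) -> dict:
    """Return pass/fail for each dose-volume limit, e.g. {'D50': True, ...}."""
    return {name: dvh_metrics[name] <= limit
            for name, limit in RECTUM_LIMITS_78GY.items()}

patient = {"D50": 30.1, "D25": 50.6, "D15": 65.9, "D5": 76.6}  # population means
print(check_rectum_dvh(patient))   # all True -> plan meets the rectal constraints
```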

  4. Video Image Stabilization and Registration (VISAR) Software

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Two scientists at NASA's Marshall Space Flight Center, atmospheric scientist Paul Meyer and solar physicist Dr. David Hathaway, developed promising new software, called Video Image Stabilization and Registration (VISAR), which is illustrated in this Quick Time movie. VISAR is a computer algorithm that stabilizes camera motion in the horizontal and vertical as well as rotation and zoom effects producing clearer images of moving objects, smoothes jagged edges, enhances still images, and reduces video noise or snow. It could steady images of ultrasounds, which are infamous for their grainy, blurred quality. VISAR could also have applications in law enforcement, medical, and meteorological imaging. The software can be used for defense application by improving reconnaissance video imagery made by military vehicles, aircraft, and ships traveling in harsh, rugged environments.

  6. Geodetic glacier mass balances at the push of a button: application of Structure from Motion technology on aerial images in mountain regions

    NASA Astrophysics Data System (ADS)

    Bolch, T.; Mölg, N.

    2017-12-01

    The application of Structure-from-Motion (SfM) to generate digital terrain models (DTMs) derived from images from various kinds of sources has increased strongly in recent years. The major reason for this is its easy-to-use handling in comparison to conventional photogrammetry. In glaciology, DTMs are used intensively, among other purposes, to calculate geodetic mass balances. Few studies have investigated the application of SfM to aerial images in mountainous terrain, and results look promising. We tested this technique in a demanding environment in the Swiss Alps including very steep slopes and snow- and ice-covered terrain. SfM (using the commercial software packages Agisoft Photoscan and Pix4DMapper) and conventional photogrammetry (ERDAS Photogrammetry) were applied to archival aerial images for nine dates between 1946 and 2005, and the results were compared regarding bundle adjustment and final DTM quality. The overall precision of the DTMs could be defined with the use of a modern, high-quality reference DTM by Swisstopo. Results suggest a high performance of SfM in producing DTMs of similar quality to conventional photogrammetry. The ground resolution of a high-quality result (little noise and artefacts) can be up to 50% higher, with 3-6 times less user effort. However, the controls available in the commercial SfM software packages are limited in comparison to ERDAS Photogrammetry. SfM performs less reliably when few images with little overlap are processed. Overall, the uncertainties of DTMs from the different software packages are comparable and mostly within the uncertainty range of the reference DTM, making them highly valuable for glaciological purposes. Even though SfM facilitates the largely automated production of high quality DTMs, the user is not exempt from a thorough quality check, at best with reference data where available.

  7. Computer applications in scientific balloon quality control

    NASA Astrophysics Data System (ADS)

    Seely, Loren G.; Smith, Michael S.

    Seal defects and seal tensile strength are primary determinants of product quality in scientific balloon manufacturing; they therefore require a unit of quality measure. The availability of inexpensive and powerful data-processing tools can serve as the basis of a quality-trends-discerning analysis of products. The results of one such analysis are presently given in graphic form for use on the production floor. Software descriptions and their sample outputs are presented, together with a summary of overall and long-term effects of these methods on product quality.

  8. International MODIS and AIRS Processing Package (IMAPP) Implementation of Infusion of Satellite Data into Environmental Applications-International (IDEA-I) for Air Quality Forecasts using Suomi-NPP, Terra and Aqua Aerosol Retrievals

    NASA Astrophysics Data System (ADS)

    Davies, J. E.; Strabala, K.; Pierce, R. B.; Huang, A.

    2016-12-01

    Fine mode aerosols play a significant role in public health through their impact on respiratory and cardiovascular disease. IDEA-I (Infusion of Satellite Data into Environmental Applications-International) is a real-time system for trajectory-based forecasts of aerosol dispersion that can assist in the prediction of poor air quality events. We released a direct broadcast version of IDEA-I for aerosol trajectory forecasts in June 2012 under the International MODIS and AIRS Processing Package (IMAPP). In January 2014 we updated this application with website software to display multi-satellite products. Now we have added VIIRS aerosols from Suomi National Polar-orbiting Partnership (S-NPP). IMAPP is a NASA-funded and freely-distributed software package developed at Space Science and Engineering Center of University of Wisconsin-Madison that has over 2,300 registered users worldwide. With IMAPP, any ground station capable of receiving direct broadcast from Terra or Aqua can produce calibrated and geolocated radiances and a suite of environmental products. These products include MODIS AOD required for IDEA-I. VIIRS AOD for IDEA-I can be generated by Community Satellite Processing Package (CSPP) VIIRS EDR Version 2.0 Software for Suomi NPP. CSPP is also developed and distributed by Space Science & Engineering Center. This presentation describes our updated IMAPP implementation of IDEA-I through an example of its operation in a region known for episodic poor air quality events.

  9. Web usability evaluation on BloobIS website by using hallway usability testing method and ISO 9241:11

    NASA Astrophysics Data System (ADS)

    Dwi Susanto, Tony; Ingesti Prasetyo, Anisa; Astuti, Hanim Maria

    2018-03-01

    At this moment, the need for the web as an information medium is highly important. Not only confined to infotainment, government, and education, the health sector also uses the web as a medium for providing information effectively. BloobIS is a web-based application which integrates blood supply and distribution information at the Blood Transfusion Unit. How easily information can be found on BloobIS depends on how conveniently the website can be used. Up until now, the BloobIS website is nearing completion but has not yet been tested with users, being in the testing and development phase of the software development life cycle. Thus, a software quality control evaluation focusing on the usability of the BloobIS web application is required. Hallway Usability Testing and ISO 9241:11 are the methods chosen to measure the usability of the BloobIS application. The expected outputs of this usability-focused evaluation are to rectify the usability deficiencies of the BloobIS web application and to provide recommendations for its development, as a basis for upgrading the quality of the BloobIS web application with the goal of increasing the satisfaction of web users based on the usability factors in ISO 9241:11.

  10. Applying Reflective Middleware Techniques to Optimize a QoS-enabled CORBA Component Model Implementation

    NASA Technical Reports Server (NTRS)

    Wang, Nanbor; Parameswaran, Kirthika; Kircher, Michael; Schmidt, Douglas

    2003-01-01

    Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.

  11. Applying Reflective Middleware Techniques to Optimize a QoS-enabled CORBA Component Model Implementation

    NASA Technical Reports Server (NTRS)

    Wang, Nanbor; Kircher, Michael; Schmidt, Douglas C.

    2000-01-01

    Although existing CORBA specifications, such as Real-time CORBA and CORBA Messaging, address many end-to-end quality-of-service (QoS) properties, they do not define strategies for configuring these properties into applications flexibly, transparently, and adaptively. Therefore, application developers must make these configuration decisions manually and explicitly, which is tedious, error-prone, and often sub-optimal. Although the recently adopted CORBA Component Model (CCM) does define a standard configuration framework for packaging and deploying software components, conventional CCM implementations focus on functionality rather than adaptive quality-of-service, which makes them unsuitable for next-generation applications with demanding QoS requirements. This paper presents three contributions to the study of middleware for QoS-enabled component-based applications. It outlines reflective middleware techniques designed to adaptively: (1) select optimal communication mechanisms, (2) manage QoS properties of CORBA components in their containers, and (3) (re)configure selected component executors dynamically. Based on our ongoing research on CORBA and the CCM, we believe the application of reflective techniques to component middleware will provide a dynamically adaptive and (re)configurable framework for COTS software that is well-suited for the QoS demands of next-generation applications.

  12. Application of advanced data collection and quality assurance methods in open prospective study - a case study of PONS project.

    PubMed

    Wawrzyniak, Zbigniew M; Paczesny, Daniel; Mańczuk, Marta; Zatoński, Witold A

    2011-01-01

    Large-scale epidemiologic studies can assess health indicators differentiating social groups and important health outcomes such as the incidence and mortality of cancer, cardiovascular disease, and others, to establish a solid knowledge base for the prevention and management of the causes of premature morbidity and mortality. This study presents new advanced methods of data collection and data management systems, with ongoing data quality control and security, to ensure high-quality assessment of health indicators in the large epidemiologic PONS study (The Polish-Norwegian Study). The material for the experiment is the data management design of the large-scale population study in Poland (PONS), and the managed processes are applied to establishing a high-quality and solid knowledge base. The functional requirements of PONS study data collection, supported by advanced web-based IT methods, are fulfilled by the IT system, which delivers medical data of high quality with data security, quality assessment, control, and evolution monitoring. Data from disparate, distributed sources of information are integrated into databases via software interfaces and archived by a multi-task secure server. The practical, implemented solution of modern advanced database technologies and a remote software/hardware structure successfully supports the research of the large PONS study project. Development and implementation of follow-up control of the consistency and quality of data analysis and of the processes of the PONS sub-databases show excellent measurement properties, with data consistency of more than 99%. The project itself, through a tailored hardware/software application, shows the positive impact of Quality Assurance (QA) on the quality of outcome analysis results and on effective data management within a shorter time. This efficiency ensures the quality of the epidemiological data and health indicators by eliminating common errors in research questionnaires and medical measurements.

  13. Application of a newly developed software program for image quality assessment in cone-beam computed tomography.

    PubMed

    de Oliveira, Marcus Vinicius Linhares; Santos, António Carvalho; Paulo, Graciano; Campos, Paulo Sergio Flores; Santos, Joana

    2017-06-01

    The purpose of this study was to apply a newly developed free software program, at low cost and with minimal time, to evaluate the quality of dental and maxillofacial cone-beam computed tomography (CBCT) images. A polymethyl methacrylate (PMMA) phantom, CQP-IFBA, was scanned in 3 CBCT units with 7 protocols. A macro program was developed, using the free software ImageJ, to automatically evaluate the image quality parameters. The image quality evaluation was based on 8 parameters: uniformity, the signal-to-noise ratio (SNR), noise, the contrast-to-noise ratio (CNR), spatial resolution, the artifact index, geometric accuracy, and low-contrast resolution. The image uniformity and noise depended on the protocol that was applied. Regarding the CNR, high-density structures were more sensitive to the effect of scanning parameters. There were no significant differences between SNR and CNR in centered and peripheral objects. The geometric accuracy assessment showed that all the distance measurements were lower than the real values. Low-contrast resolution was influenced by the scanning parameters, and the 1-mm rod present in the phantom was not depicted in any of the 3 CBCT units. Smaller voxel sizes presented higher spatial resolution. There were no significant differences among the protocols regarding artifact presence. This software package provided a fast, low-cost, and feasible method for the evaluation of image quality parameters in CBCT.
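
    The sketch below illustrates how two of the listed parameters (SNR and CNR) can be computed from phantom regions of interest; it is a hypothetical stand-in, not the published ImageJ macro, and the ROI coordinates and synthetic image are invented.

```python
# Hedged sketch of SNR and CNR computed from regions of interest in a phantom
# slice; ROI positions and the synthetic test image are illustrative only.
import numpy as np

def roi_stats(image, y, x, size=20):
    roi = image[y:y + size, x:x + size]
    return roi.mean(), roi.std()

def snr_and_cnr(image, object_roi, background_roi):
    mean_obj, _ = roi_stats(image, *object_roi)
    mean_bg, std_bg = roi_stats(image, *background_roi)
    snr = mean_bg / std_bg                       # uniform background region
    cnr = abs(mean_obj - mean_bg) / std_bg       # object vs. background contrast
    return snr, cnr

# Synthetic 256x256 slice: uniform noisy background with a brighter insert
rng = np.random.default_rng(0)
img = rng.normal(100, 5, (256, 256))
img[60:80, 60:80] += 50
print(snr_and_cnr(img, object_roi=(60, 60), background_roi=(150, 150)))
```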

  14. IEEE/AIAA/NASA Digital Avionics Systems Conference, 9th, Virginia Beach, VA, Oct. 15-18, 1990, Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The present conference on digital avionics discusses vehicle-management systems, spacecraft avionics, special vehicle avionics, communication/navigation/identification systems, software qualification and quality assurance, launch-vehicle avionics, Ada applications, sensor and signal processing, general aviation avionics, automated software development, design-for-testability techniques, and avionics-software engineering. Also discussed are optical technology and systems, modular avionics, fault-tolerant avionics, commercial avionics, space systems, data buses, crew-station technology, embedded processors and operating systems, AI and expert systems, data links, and pilot/vehicle interfaces.

  15. The Role and Quality of Software Safety in the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found that a number of traceability risks in the hazard reports may impede verification of software and system safety.

  16. Generic control software connecting astronomical instruments to the reflective memory data recording system of VLTI - bossvlti

    NASA Astrophysics Data System (ADS)

    Pozna, E.; Ramirez, A.; Mérand, A.; Mueller, A.; Abuter, R.; Frahm, R.; Morel, S.; Schmid, C.; Duc, T. Phan; Delplancke-Ströbele, F.

    2014-07-01

    The quality of data obtained by VLTI instruments may be refined by analyzing the continuous data supplied by the Reflective Memory Network (RMN). Based on five years' experience providing VLTI instruments (PACMAN, AMBER, MIDI) with RMN data, the procedure has been generalized to make the synchronization with observation trouble-free. The present software interface saves not only months of effort for each instrument but also provides the benefits of software frameworks. Recent applications (GRAVITY, MATISSE) supply feedback for the software to evolve. The paper highlights the way common features have been identified so that reusable code can be offered in due course.

  17. A Robust and Scalable Software Library for Parallel Adaptive Refinement on Unstructured Meshes

    NASA Technical Reports Server (NTRS)

    Lou, John Z.; Norton, Charles D.; Cwik, Thomas A.

    1999-01-01

    The design and implementation of Pyramid, a software library for performing parallel adaptive mesh refinement (PAMR) on unstructured meshes, is described. This software library can be easily used in a variety of unstructured parallel computational applications, including parallel finite element, parallel finite volume, and parallel visualization applications using triangular or tetrahedral meshes. The library contains a suite of well-designed and efficiently implemented modules that perform operations in a typical PAMR process. Among these are mesh quality control during successive parallel adaptive refinement (typically guided by a local-error estimator), parallel load-balancing, and parallel mesh partitioning using the ParMeTiS partitioner. The Pyramid library is implemented in Fortran 90 with an interface to the Message-Passing Interface (MPI) library, supporting code efficiency, modularity, and portability. An EM waveguide filter application, adaptively refined using the Pyramid library, is illustrated.

  18. Modernization of software quality assurance

    NASA Technical Reports Server (NTRS)

    Bhaumik, Gokul

    1988-01-01

    The customer's satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  19. Software archeology: a case study in software quality assurance and design

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  20. [Quality assurance of a virtual simulation software: application to IMAgo and SIMAgo (ISOgray)].

    PubMed

    Isambert, A; Beaudré, A; Ferreira, I; Lefkopoulos, D

    2007-06-01

    The virtual simulation process is often used to prepare three-dimensional conformal radiation therapy treatments. As the quality of the treatment depends heavily on this step, it is mandatory to perform extensive controls on this software before clinical use. The tests presented in this work were carried out on the treatment planning system ISOgray (DOSIsoft), including the delineation module IMAgo and the virtual simulation module SIMAgo. Based on our experience, the most relevant controls from international protocols were selected. These tests mainly focused on measuring and delineation tools and on virtual simulation functionalities, and were performed with three phantoms: the Quasar Multi-Purpose Body Phantom, the Quasar MLC Beam Geometry Phantom (Modus Medical Devices Inc.), and a phantom developed at Hospital Tenon. No major issues were identified while performing the tests. These controls emphasized the need for the user to view critically the results displayed by virtual simulation software: the visualisation contrast, the slice thickness, and the calculation and display mode of 3D structures used by the software are all sources of uncertainty. A virtual simulation software quality assurance procedure has been written and applied to a set of CT images. Similar tests have to be performed periodically, and at minimum at each major version change.

  1. Semantic Entity-Component State Management Techniques to Enhance Software Quality for Multimodal VR-Systems.

    PubMed

    Fischbach, Martin; Wiebusch, Dennis; Latoschik, Marc Erich

    2017-04-01

    Modularity, modifiability, reusability, and API usability are important software qualities that determine the maintainability of software architectures. Virtual, Augmented, and Mixed Reality (VR, AR, MR) systems, modern computer games, as well as interactive human-robot systems often include various dedicated input-, output-, and processing subsystems. These subsystems collectively maintain a real-time simulation of a coherent application state. The resulting interdependencies between individual state representations, mutual state access, overall synchronization, and flow of control imply conceptually close coupling, whereas software quality calls for decoupling to develop maintainable solutions. This article presents five semantics-based software techniques that address this contradiction: semantic grounding, code from semantics, grounded actions, semantic queries, and decoupling by semantics. These techniques are applied to extend the well-established entity-component-system (ECS) pattern to overcome some of this pattern's deficits with respect to the implied state access. A walk-through of central implementation aspects of a multimodal (speech and gesture) VR-interface is used to highlight the techniques' benefits. This use-case is chosen as a prototypical example of complex architectures with multiple interacting subsystems found in many VR, AR and MR architectures. Finally, implementation hints are given, lessons learned regarding maintainability are pointed out, and performance implications are discussed.
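
    A minimal sketch of the entity-component idea the article extends may make the decoupling concrete: subsystems query for the component types they need rather than holding references to one another. The classes, component names, and query interface below are illustrative assumptions, not the authors' implementation.

      # Minimal sketch of an entity-component store with a query interface, to
      # illustrate the decoupling idea behind the ECS pattern discussed above.
      # The class and component names are illustrative only, not the authors' API.
      from dataclasses import dataclass
      from itertools import count

      @dataclass
      class Position:
          x: float
          y: float

      @dataclass
      class Speech:
          utterance: str

      class World:
          def __init__(self):
              self._ids = count()
              self._components = {}            # entity id -> {type: component}

          def create_entity(self, *components):
              eid = next(self._ids)
              self._components[eid] = {type(c): c for c in components}
              return eid

          def query(self, *component_types):
              """Yield (entity id, components) for entities owning all given types."""
              for eid, comps in self._components.items():
                  if all(t in comps for t in component_types):
                      yield eid, tuple(comps[t] for t in component_types)

      world = World()
      world.create_entity(Position(0.0, 0.0), Speech("select the red cube"))
      world.create_entity(Position(1.0, 2.0))

      # A "speech processing" subsystem only declares what it needs, not who owns it.
      for eid, (pos, speech) in world.query(Position, Speech):
          print(eid, pos, speech.utterance)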

  2. Test Driven Development of Scientific Models

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.

    2012-01-01

    Test-Driven Development (TDD) is a software development process that promises many advantages for developer productivity and has become widely accepted among professional software engineers. As the name suggests, TDD practitioners alternate between writing short automated tests and producing code that passes those tests. Although this overly simplified description will undoubtedly sound prohibitively burdensome to many uninitiated developers, the advent of powerful unit-testing frameworks greatly reduces the effort required to produce and routinely execute suites of tests. By testimony, many developers find TDD to be addicting after only a few days of exposure, and find it unthinkable to return to previous practices. Of course, scientific/technical software differs from other software categories in a number of important respects, but I nonetheless believe that TDD is quite applicable to the development of such software and has the potential to significantly improve programmer productivity and code quality within the scientific community. After a detailed introduction to TDD, I will present the experience within the Software Systems Support Office (SSSO) in applying the technique to various scientific applications. This discussion will emphasize the various direct and indirect benefits as well as some of the difficulties and limitations of the methodology. I will conclude with a brief description of pFUnit, a unit testing framework I co-developed to support test-driven development of parallel Fortran applications.
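
    A minimal illustration of the TDD rhythm in Python's unittest (pFUnit plays the analogous role for Fortran): the tests are written first and drive the implementation they exercise. The function, constants, and expected values are illustrative assumptions, not code from the SSSO applications.

      # Tiny illustration of the TDD rhythm in Python's unittest. The tests below
      # are written first and fail until the function is implemented; the function
      # and its expected values are purely illustrative.
      import math
      import unittest

      def saturation_vapor_pressure(temp_c):
          """Magnus approximation, in hPa; the code written to make the tests pass."""
          return 6.112 * math.exp(17.62 * temp_c / (243.12 + temp_c))

      class TestSaturationVaporPressure(unittest.TestCase):
          def test_zero_celsius(self):
              # Written before the implementation: expected value near 6.11 hPa.
              self.assertAlmostEqual(saturation_vapor_pressure(0.0), 6.112, places=3)

          def test_monotonic_in_temperature(self):
              self.assertLess(saturation_vapor_pressure(10.0),
                              saturation_vapor_pressure(20.0))

      if __name__ == "__main__":
          unittest.main()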

  3. Scientific Use Cases for the Virtual Atomic and Molecular Data Center

    NASA Astrophysics Data System (ADS)

    Dubernet, M. L.; Aboudarham, J.; Ba, Y. A.; Boiziot, M.; Bottinelli, S.; Caux, E.; Endres, C.; Glorian, J. M.; Henry, F.; Lamy, L.; Le Sidaner, P.; Møller, T.; Moreau, N.; Rénié, C.; Roueff, E.; Schilke, P.; Vastel, C.; Zwoelf, C. M.

    2014-12-01

    The VAMDC Consortium is a worldwide consortium that federates interoperable Atomic and Molecular databases through an e-science infrastructure. The contained data are of the highest scientific quality and are crucial for many applications: astrophysics, atmospheric physics, fusion, plasma and lighting technologies, health, etc. In this paper we present astrophysical scientific use cases related to the use of the VAMDC e-infrastructure. These cover very different applications, such as: (i) modeling the spectra of interstellar objects using the myXCLASS software tool implemented in the Common Astronomy Software Applications package (CASA) or using the CASSIS software tool, in its stand-alone version or implemented in the Herschel Interactive Processing Environment (HIPE); (ii) the use of Virtual Observatory tools accessing VAMDC databases; (iii) access to VAMDC from the Paris solar BASS2000 portal; (iv) the combination of tools and databases from the APIS service (Auroral Planetary Imaging and Spectroscopy); (v) the combination of heterogeneous data for application to the interstellar medium from the SPECTCOL tool.

  4. Remote hardware-reconfigurable robotic camera

    NASA Astrophysics Data System (ADS)

    Arias-Estrada, Miguel; Torres-Huitzil, Cesar; Maya-Rueda, Selene E.

    2001-10-01

    In this work, a camera with integrated image processing capabilities is discussed. The camera is based on an imager coupled to an FPGA device (Field Programmable Gate Array) which contains an architecture for real-time computer vision low-level processing. The architecture can be reprogrammed remotely for application specific purposes. The system is intended for rapid modification and adaptation for inspection and recognition applications, with the flexibility of hardware and software reprogrammability. FPGA reconfiguration allows the same ease of upgrade in hardware as a software upgrade process. The camera is composed of a digital imager coupled to an FPGA device, two memory banks, and a microcontroller. The microcontroller is used for communication tasks and FPGA programming. The system implements a software architecture to handle multiple FPGA architectures in the device, and the possibility to download a software/hardware object from the host computer into its internal context memory. System advantages are: small size, low power consumption, and a library of hardware/software functionalities that can be exchanged during run time. The system has been validated with an edge detection and a motion processing architecture, which will be presented in the paper. Applications targeted are in robotics, mobile robotics, and vision based quality control.

  5. Mining Program Source Code for Improving Software Quality

    DTIC Science & Technology

    2013-01-01

    The project applies source-code mining to conduct static verification on the software application under analysis to detect defects around APIs.

  6. Cellular Radio Telecommunication for Health Care: Benefits and Risks

    PubMed Central

    Sneiderman, Charles A.; Ackerman, Michael J.

    2004-01-01

    Cellular radio telecommunication has increased exponentially with many applications to health care reported. The authors attempt to summarize published applications with demonstrated effect on health care, review briefly the rapid evolution of hardware and software standards, explain current limitations and future potential of data quality and security, and discuss issues of safety. PMID:15298996

  7. Friction and lubrication modelling in sheet metal forming: Influence of lubrication amount, tool roughness and sheet coating on product quality

    NASA Astrophysics Data System (ADS)

    Hol, J.; Wiebenga, J. H.; Carleer, B.

    2017-09-01

    In the stamping of automotive parts, friction and lubrication play a key role in achieving high quality products. In the development process of new automotive parts, it is therefore crucial to accurately account for these effects in sheet metal forming simulations. This paper presents a selection of results considering friction and lubrication modelling in sheet metal forming simulations of a front fender product. For varying lubrication conditions, the front fender can either show wrinkling or fractures. The front fender is modelled using different lubrication amounts, tool roughnesses and sheet coatings to show the strong influence of friction on both part quality and the overall production stability. For this purpose, the TriboForm software is used in combination with the AutoForm software. The results demonstrate that the TriboForm software enables the simulation of friction behaviour for varying lubrication conditions, i.e. resulting in a generally applicable approach for friction characterization under industrial sheet metal forming process conditions.

  8. Predicting Software Suitability Using a Bayesian Belief Network

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.; Berrios, Joseph S.

    2005-01-01

    The ability to reliably predict the end quality of software under development presents a significant advantage for a development team. It provides an opportunity to address high risk components earlier in the development life cycle, when their impact is minimized. This research proposes a model that captures the evolution of the quality of a software product, and provides reliable forecasts of the end quality of the software being developed in terms of product suitability. Development team skill, software process maturity, and software problem complexity are hypothesized as driving factors of software product quality. The cause-effect relationships between these factors and the elements of software suitability are modeled using Bayesian Belief Networks, a machine learning method. This research presents a Bayesian Network for software quality, and the techniques used to quantify the factors that influence and represent software quality. The developed model is found to be effective in predicting the end product quality of small-scale software development efforts.
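
    To make the modeling idea concrete, the following hand-rolled sketch enumerates a tiny discrete Bayesian network relating team skill, process maturity, and problem complexity to product suitability. The conditional probabilities are invented for illustration and are not the values learned in the study.

      # Hand-rolled sketch of the idea: a discrete Bayesian network relating team
      # skill, process maturity, and problem complexity to product suitability.
      # The probability numbers are invented for illustration, not from the paper.
      from itertools import product

      P_skill      = {"high": 0.6, "low": 0.4}
      P_maturity   = {"high": 0.5, "low": 0.5}
      P_complexity = {"high": 0.3, "low": 0.7}

      def p_suitable(skill, maturity, complexity):
          """P(product suitable | parents): skill and maturity help, complexity hurts."""
          base = 0.35
          base += 0.30 if skill == "high" else 0.0
          base += 0.20 if maturity == "high" else 0.0
          base -= 0.15 if complexity == "high" else 0.0
          return base

      def posterior_skill_given_suitable():
          """P(skill | suitable) by enumerating the full joint distribution."""
          joint = {"high": 0.0, "low": 0.0}
          for s, m, c in product(P_skill, P_maturity, P_complexity):
              joint[s] += P_skill[s] * P_maturity[m] * P_complexity[c] * p_suitable(s, m, c)
          total = sum(joint.values())
          return {s: v / total for s, v in joint.items()}

      print(posterior_skill_given_suitable())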

  9. Flexible control techniques for a lunar base

    NASA Technical Reports Server (NTRS)

    Kraus, Thomas W.

    1992-01-01

    The fundamental elements found in every terrestrial control system can be employed in all lunar applications. These elements include sensors which measure physical properties, controllers which acquire sensor data and calculate a control response, and actuators which apply the control output to the process. The unique characteristics of the lunar environment will certainly require the development of new control system technology. However, weightlessness, harsh atmospheric conditions, temperature extremes, and radiation hazards will most significantly impact the design of sensors and actuators. The controller and associated control algorithms, which are the most complex element of any control system, can be derived in their entirety from existing technology. Lunar process control applications -- ranging from small-scale research projects to full-scale processing plants -- will benefit greatly from the controller advances being developed today. In particular, new software technology aimed at commercial process monitoring and control applications will almost completely eliminate the need for custom programs and the lengthy development and testing cycle they require. The applicability of existing industrial software to lunar applications has other significant advantages in addition to cost and quality. This software is designed to run on standard hardware platforms and takes advantage of existing LAN and telecommunications technology. Further, in order to exploit the existing commercial market, the software is being designed to be implemented by users of all skill levels -- typically users who are familiar with their process, but not necessarily with software or control theory. This means that specialized technical support personnel will not need to be on-hand, and the associated costs are eliminated. Finally, the latest industrial software designed for the commercial market is extremely flexible, in order to fit the requirements of many types of processing applications with little or no customization. This means that lunar process control projects will not be delayed by unforeseen problems or last minute process modifications. The software will include all of the tools needed to adapt to virtually any changes. In contrast to other space programs which required the development of tremendous amounts of custom software, lunar-based processing facilities will benefit from the use of existing software technology which is being proven in commercial applications on Earth.

  10. Evaluation of a mandatory quality assurance data capture in anesthesia: a secure electronic system to capture quality assurance information linked to an automated anesthesia record.

    PubMed

    Peterfreund, Robert A; Driscoll, William D; Walsh, John L; Subramanian, Aparna; Anupama, Shaji; Weaver, Melissa; Morris, Theresa; Arnholz, Sarah; Zheng, Hui; Pierce, Eric T; Spring, Stephen F

    2011-05-01

    Efforts to assure high-quality, safe, clinical care depend upon capturing information about near-miss and adverse outcome events. Inconsistent or unreliable information capture, especially for infrequent events, compromises attempts to analyze events in quantitative terms, understand their implications, and assess corrective efforts. To enhance reporting, we developed a secure, electronic, mandatory system for reporting quality assurance data linked to our electronic anesthesia record. We used the capabilities of our anesthesia information management system (AIMS) in conjunction with internally developed, secure, intranet-based, Web application software. The application is implemented with a backend allowing robust data storage, retrieval, data analysis, and reporting capabilities. We customized a feature within the AIMS software to create a hard stop in the documentation workflow before the end of anesthesia care time stamp for every case. The software forces the anesthesia provider to access the separate quality assurance data collection program, which provides a checklist for targeted clinical events and a free text option. After completing the event collection program, the software automatically returns the clinician to the AIMS to finalize the anesthesia record. The number of events captured by the departmental quality assurance office increased by 92% (95% confidence interval [CI] 60.4%-130%) after system implementation. The major contributor to this increase was the new electronic system. This increase has been sustained over the initial 12 full months after implementation. Under our reporting criteria, the overall rate of clinical events reported by any method was 471 events out of 55,382 cases or 0.85% (95% CI 0.78% to 0.93%). The new system collected 67% of these events (95% confidence interval 63%-71%). We demonstrate the implementation in an academic anesthesia department of a secure clinical event reporting system linked to an AIMS. The system enforces entry of quality assurance information (either no clinical event or notification of a clinical event). System implementation resulted in capturing nearly twice the number of events at a relatively steady case load. © 2011 International Anesthesia Research Society
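
    For readers who want to check the reported figures, the sketch below reproduces the overall event rate and an approximate 95% confidence interval using a normal (Wald) interval for a proportion; the abstract does not state which interval method the authors used, so the bounds may differ slightly from theirs.

      # Reproducing the reported overall event rate and an approximate 95% CI with a
      # normal (Wald) interval. The abstract does not state which interval method the
      # authors used, so the exact bounds may differ slightly from theirs.
      import math

      events, cases = 471, 55382
      p = events / cases                       # observed proportion
      se = math.sqrt(p * (1 - p) / cases)      # standard error of the proportion
      low, high = p - 1.96 * se, p + 1.96 * se

      print(f"rate = {p:.2%}, 95% CI {low:.2%} to {high:.2%}")
      # -> rate = 0.85%, 95% CI roughly 0.77% to 0.93% (paper reports 0.78%-0.93%)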

  11. Specification-based software sizing: An empirical investigation of function metrics

    NASA Technical Reports Server (NTRS)

    Jeffery, Ross; Stathis, John

    1993-01-01

    For some time the software industry has espoused the need for improved specification-based software size metrics. This paper reports on a study of nineteen recently developed systems in a variety of application domains. The systems were developed by a single software services corporation using a variety of languages. The study investigated several metric characteristics. It shows that: earlier research into inter-item correlation within the overall function count is partially supported; a priori function counts, in themselves, do not explain the majority of the effort variation in software development in the organization studied; documentation quality is critical to accurate function identification; and rater error is substantial in manual function counting. The implications of these findings for organizations using function-based metrics are explored.

  12. MoniQA: a general approach to monitor quality assurance

    NASA Astrophysics Data System (ADS)

    Jacobs, J.; Deprez, T.; Marchal, G.; Bosmans, H.

    2006-03-01

    MoniQA ("Monitor Quality Assurance") is a new, non-commercial, independent quality assurance software application developed in our medical physics team. It is a complete Java-based modular environment for the evaluation of radiological viewing devices, and it thus fits into the global quality assurance network of our (filmless) radiology department. The purpose of the software tool is to guide the medical physicist through an acceptance protocol and the radiologist through a constancy check protocol by presentation of the necessary test patterns and by automated data collection. Data are then sent to a central management system for further analysis. At the moment more than 55 patterns have been implemented, which can be grouped in schemes to implement protocols (i.e. AAPM TG18, DIN, and EUREF). Some test patterns are dynamically created and 'drawn' on the viewing device with random parameters, as is the case in a recently proposed new pattern for constancy testing. The software is installed on 35 diagnostic stations (70 monitors) in a filmless radiology department. Learning time was very limited. A constancy check (with the new pattern that assesses luminance decrease, resolution problems, and geometric distortion) takes only 2 minutes and 28 seconds per monitor. The modular approach of the software allows the evaluation of new or emerging test patterns. We report on the software and its usability: the practicality of the constancy check tests in our hospital and the results from acceptance tests of viewing stations for digital mammography.

  13. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    ERIC Educational Resources Information Center

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although software quality…

  14. Fast and predictable video compression in software design and implementation of an H.261 codec

    NASA Astrophysics Data System (ADS)

    Geske, Dagmar; Hess, Robert

    1998-09-01

    The use of software codecs for video compression is becoming commonplace in several videoconferencing applications. In order to reduce conflicts with other applications used at the same time, mechanisms for resource reservation on end systems need to determine an upper bound for the computing time used by the codec. This leads to the demand for predictable execution times of compression/decompression. Since compression schemes such as H.261 inherently depend on the motion contained in the video, an adaptive admission control is required. This paper presents a data-driven approach based on dynamically reducing the number of processed macroblocks in peak situations. Absolute speed is also a point of interest: the paper examines whether, and how, software compression of high-quality video is feasible on today's desktop computers.
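
    A hedged sketch of the data-driven admission idea: when the remaining per-frame time budget cannot cover all macroblocks, only the most-changed ones are encoded. The cost figures, change metric, and selection rule below are illustrative assumptions, not the authors' H.261 implementation.

      # Sketch of the data-driven idea described above: keep per-frame encoding time
      # under a budget by capping the number of macroblocks processed in peak frames.
      # Costs and the selection rule are illustrative, not the authors' H.261 codec.
      def select_macroblocks(changed_blocks, cost_per_block_ms, frame_budget_ms):
          """Return the macroblocks to encode this frame, most-changed first."""
          max_blocks = int(frame_budget_ms // cost_per_block_ms)
          # Prioritize blocks with the largest change so dropped ones matter least.
          ranked = sorted(changed_blocks, key=lambda b: b[1], reverse=True)
          return [index for index, _change in ranked[:max_blocks]]

      # Example: 396 macroblocks (CIF), per-block cost 0.08 ms, 25 ms left in the budget.
      blocks = [(i, (i * 37) % 100) for i in range(396)]   # (index, change metric)
      kept = select_macroblocks(blocks, cost_per_block_ms=0.08, frame_budget_ms=25.0)
      print(f"encoding {len(kept)} of {len(blocks)} macroblocks this frame")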

  15. [Development of integrated support software for clinical nutrition].

    PubMed

    Siquier Homar, Pedro; Pinteño Blanco, Manel; Calleja Hernández, Miguel Ángel; Fernández Cortés, Francisco; Martínez Sotelo, Jesús

    2015-09-01

    The aim was to develop an integrated computer software application for specialized nutritional support, integrated into the electronic clinical record, which automatically detects, at an early stage, patients who are undernourished or at risk of becoming undernourished, and which identifies opportunities for improvement and evaluates results. The quality standards published by the Nutrition Work Group of the Spanish Society of Hospital Pharmacy (SEFH) and the recommendations of the Pharmacy Group of the Spanish Society of Parenteral and Enteral Nutrition (SENPE) were taken into account. According to these quality standards, nutritional support has to include the following healthcare stages or sub-processes: nutritional screening, nutritional assessment, plan for nutritional care, prescription, preparation, and administration. The software allows a specific nutritional assessment to be conducted automatically for patients at nutritional risk, implementing a nutritional treatment plan if necessary, following up and tracing the outcomes derived from the improvement actions, and quantifying how closely practice approaches the established standard. The software thus standardizes specialized nutritional support from a multidisciplinary point of view, introducing the concept of quality control per process and treating the patient as the main customer. Copyright AULA MEDICA EDICIONES 2014. Published by AULA MEDICA. All rights reserved.

  16. Software cost/resource modeling: Software quality tradeoff measurement

    NASA Technical Reports Server (NTRS)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  17. Computers--Teaching, Technology, and Applications.

    ERIC Educational Resources Information Center

    Cocco, Anthony M.; And Others

    1995-01-01

    Includes "Managing Personality Types in the Computer Classroom" (Cocco); "External I/O Input/Output with a PC" (Fryda); "The Future of CAD/CAM Computer-Assisted Design/Computer-Assisted Manufacturing Software" (Fulton); and "Teaching Quality Assurance--A Laboratory Approach" (Wojslaw). (SK)

  18. Enhancing Environmental HPC Applications: The EnCompAS approach

    NASA Astrophysics Data System (ADS)

    Frank, Anton; Donners, John; Pursula, Antti; Seinstra, Frank; Kranzlmüller, Dieter

    2015-04-01

    Many HPC applications in geoscience are of very high scientific quality and highly optimized for supercomputers. However, some of these codes lack uptake by adjacent scientific communities or industry due to deficiencies in usability, quality, and availability. Since enhancing software by, e.g., adding a graphical user interface, respecting data standards, setting up a support structure, or writing extensive documentation is not of direct and immediate scientific relevance, most developers are not willing to invest additional effort in these issues. Furthermore, if scientists who are not directly involved in the development of some scientific software could benefit from additional features or interfaces, the respective requests are often turned down due to lack of time and resources. On the other hand, such enhancements are crucial for the sustainability of the scientific assets as well as for the widespread or even worldwide distribution of European environmental software. Closely collaborating with environmental scientists, the national supercomputing and eScience centres in Helsinki, Amsterdam, and Munich have identified that the enhancement of HPC and data analysis software must be provided as a service to the scientists developing such software. Therefore, first steps have been taken to establish the respective services at these centres. In this talk we will present the already existing and envisioned service portfolio, some first success stories, and the approach to harmonizing the current status, aiming to turn this local effort into a pan-European service offering for environmental science.

  19. Building an experience factory for maintenance

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Condon, Steven E.; Briand, Lionel; Kim, Yong-Mi; Basili, Victor R.

    1994-01-01

    This paper reports the preliminary results of a study of the software maintenance process in the Flight Dynamics Division (FDD) of the National Aeronautics and Space Administration/Goddard Space Flight Center (NASA/GSFC). This study is being conducted by the Software Engineering Laboratory (SEL), a research organization sponsored by the Software Engineering Branch of the FDD, which investigates the effectiveness of software engineering technologies when applied to the development of applications software. This software maintenance study began in October 1993 and is being conducted using the Quality Improvement Paradigm (QIP), a process improvement strategy based on three iterative steps: understanding, assessing, and packaging. The preliminary results represent the outcome of the understanding phase, during which SEL researchers characterized the maintenance environment, product, and process. Findings indicate that a combination of quantitative and qualitative analysis is effective for studying the software maintenance process, that additional measures should be collected for maintenance (as opposed to new development), and that characteristics such as effort, error rate, and productivity are best considered on a 'release' basis rather than on a project basis. The research thus far has documented some basic differences between new development and software maintenance. It lays the foundation for further application of the QIP to investigate means of improving the maintenance process and product in the FDD.

  20. User’s manual for the Automated Data Assurance and Management application developed for quality control of Everglades Depth Estimation Network water-level data

    USGS Publications Warehouse

    Petkewich, Matthew D.; Daamen, Ruby C.; Roehl, Edwin A.; Conrads, Paul

    2016-09-29

    The generation of Everglades Depth Estimation Network (EDEN) daily water-level and water-depth maps is dependent on high quality real-time data from over 240 water-level stations. To increase the accuracy of the daily water-surface maps, the Automated Data Assurance and Management (ADAM) tool was created by the U.S. Geological Survey as part of Greater Everglades Priority Ecosystems Science. The ADAM tool is used to provide accurate quality-assurance review of the real-time data from the EDEN network and allows estimation or replacement of missing or erroneous data. This user’s manual describes how to install and operate the ADAM software. File structure and operation of the ADAM software is explained using examples.

  1. NOSC Program Managers Handbook. Revision 1

    DTIC Science & Technology

    1988-02-01

    ...cost. The effects of application of life-cycle cost analysis through the planning and RDT&E phases of a program, and the "design to cost" concept on... is the plan for assuring the quality of the design, design documentation, and fabricated/assembled hardware and associated computer software. ...listings and printouts, which document the requirements, design, or details of computer software; explain the capabilities and limitations of the...

  2. PQLX: A seismic data quality control system description, applications, and users manual

    USGS Publications Warehouse

    McNamara, Daniel E.; Boaz, Richard I.

    2011-01-01

    We present a detailed description and users manual for a new tool to evaluate seismic station performance and characteristics by providing quick and easy transitions between visualizations of the frequency and time domains. The software is based on the probability density functions (PDF) of power spectral densities (PSD) (McNamara and Buland, 2004) and builds on the original development of the PDF stand-alone software system (McNamara and Boaz, 2005) and the seismological data viewer application PQL (IRIS-PASSCAL Quick Look) and PQLII (available through the IRIS PASSCAL program: http://www.passcal.nmt.edu/content/pql-ii-program-viewing-data). With PQLX (PQL eXtended), computed PSDs are stored in a MySQL database, allowing a user to access specific time periods of PSDs (PDF subsets) and time series segments through a GUI-driven interface. The power of the method and software lies in the fact that there is no need to screen the data for system transients, earthquakes, or general data artifacts, because they map into a background probability level. In fact, examination of artifacts related to station operation and episodic cultural noise allows us to estimate both the overall station quality and a baseline level of Earth noise at each site. The output of this analysis tool is useful for both operational and scientific applications. Operationally, it is useful for characterizing the current and past performance of existing broadband stations, for conducting tests on potential new seismic station locations, for evaluating station baseline noise levels (McNamara and others, 2009), for detecting problems with the recording system or sensors, and for evaluating the overall quality of data and metadata. Scientifically, the tool allows for mining of PSDs for investigations on the evolution of seismic noise (for example, Aster and others, 2008; and Aster and others, 2010) and other phenomena. Currently, PQLX is operational at several organizations including the USGS National Earthquake Information Center (NEIC), the USGS Albuquerque Seismological Laboratory (ASL), and the Incorporated Research Institutions for Seismology (IRIS) Data Management Center (DMC) for station monitoring and instrument response quality control. The PQLX system is available to the community at large through the U.S. Geological Survey (USGS) (http://ehpm-earthquake.wr.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx). Also provided is a fully searchable website for bug reporting and enhancement requests (http://wush.net/bugzilla/PQLX). The first part of this document aims to describe and illustrate some of the features and capabilities of the software. The second part of this document is a detailed users manual that covers installation procedures, system requirements, operations, bug reporting, and software components (Appendix).
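
    A simplified sketch of the underlying PSD probability-density idea (after McNamara and Buland, 2004): compute many power spectral densities and histogram them per frequency into probability bins. It uses synthetic data and is not PQLX itself, which stores PSDs in a MySQL database behind a GUI.

      # Simplified sketch of the PSD probability-density idea behind PQLX: compute
      # many power spectral densities from data segments and histogram them per
      # frequency to obtain the probability of observing a given power level.
      # Synthetic stand-in data; this is not the PQLX software.
      import numpy as np
      from scipy.signal import welch

      fs = 40.0                                   # sample rate in Hz (illustrative)
      rng = np.random.default_rng(0)
      segments = [rng.normal(size=4096) for _ in range(200)]   # stand-in waveforms

      psds = []
      for seg in segments:
          freqs, pxx = welch(seg, fs=fs, nperseg=1024)
          psds.append(10.0 * np.log10(pxx + 1e-20))            # power in dB

      psds = np.array(psds)                       # shape: (n_segments, n_freqs)
      power_bins = np.arange(-60, 20, 1.0)        # 1 dB bins

      # For each frequency, the fraction of PSDs falling in each power bin.
      pdf = np.array([np.histogram(psds[:, i], bins=power_bins)[0]
                      for i in range(psds.shape[1])]) / len(segments)
      print(pdf.shape)                            # (n_freqs, n_power_bins - 1)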

  3. The Effects of Development Team Skill on Software Product Quality

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill/experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, and was derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
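
    The kind of correlation analysis described can be illustrated with a short sketch; the project data and quality metric values below are invented placeholders, not the study's 26-project data set or its ISO/IEC 9126 measures.

      # Illustration of the kind of correlation analysis described above: team skill
      # scores versus a product quality metric across projects. The numbers are
      # invented; the paper's 26 projects and ISO/IEC 9126 metrics are not reproduced.
      import numpy as np

      team_skill  = np.array([3.2, 4.1, 2.5, 4.8, 3.9, 2.8, 4.4, 3.0])   # skill score
      defect_rate = np.array([9.0, 5.5, 12.1, 3.2, 6.0, 10.4, 4.1, 8.7])  # defects/KSLOC

      r = np.corrcoef(team_skill, defect_rate)[0, 1]
      print(f"Pearson r between team skill and defect rate: {r:.2f}")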

  4. The design of real time infrared image generation software based on Creator and Vega

    NASA Astrophysics Data System (ADS)

    Wang, Rui-feng; Wu, Wei-dong; Huo, Jun-xiu

    2013-09-01

    Considering the requirement for highly realistic, real-time dynamic infrared imagery in infrared image simulation, a method for designing a real-time infrared image simulation application on the VC++ platform is proposed, based on the visual simulation software Creator and Vega. The functions of Creator are introduced briefly, and the main features of the Vega development environment are analyzed. Methods for infrared modeling and background generation are offered; the design flow chart of the development process for real-time IR image generation software and the functions of the TMM Tool, the MAT Tool, and the sensor module are explained; and the real-time performance of the software is addressed.

  5. Automated software development workstation

    NASA Technical Reports Server (NTRS)

    Prouty, Dale A.; Klahr, Philip

    1988-01-01

    A workstation is being developed that provides a computational environment for all NASA engineers across application boundaries, which automates reuse of existing NASA software and designs, and efficiently and effectively allows new programs and/or designs to be developed, catalogued, and reused. The generic workstation is made domain specific by specialization of the user interface, capturing engineering design expertise for the domain, and by constructing/using a library of pertinent information. The incorporation of software reusability principles and expert system technology into this workstation provide the obvious benefits of increased productivity, improved software use and design reliability, and enhanced engineering quality by bringing engineering to higher levels of abstraction based on a well tested and classified library.

  6. Using CAD/CAM to improve productivity - The IPAD approach

    NASA Technical Reports Server (NTRS)

    Fulton, R. E.

    1981-01-01

    Progress in designing and implementing CAD/CAM systems as a result of the NASA Integrated Programs for Aerospace-Vehicle Design is discussed. Essential software packages have been identified as executive, data management, general user, and geometry and graphics software. Data communication, as a means to integrate data over a network of computers of different vendors, provides data management with the capability of meeting design and manufacturing requirements of the vendors. Geometry software is dependent on developmental success with solid geometry software, which is necessary for continual measurements of, for example, a block of metal while it is being machined. Applications in the aerospace industry, such as for design, analysis, tooling, testing, quality control, etc., are outlined.

  7. A Core Plug and Play Architecture for Reusable Flight Software Systems

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan

    2006-01-01

    The Flight Software Branch, at Goddard Space Flight Center (GSFC), has been working on a run-time approach to facilitate a formal software reuse process. The reuse process is designed to enable rapid development and integration of high-quality software systems and to more accurately predict development costs and schedule. Previous reuse practices have been somewhat successful when the same teams are moved from project to project. But this typically requires taking the software system in an all-or-nothing approach where useful components cannot be easily extracted from the whole. As a result, the system is less flexible and scalable with limited applicability to new projects. This paper will focus on the rationale behind, and implementation of the run-time executive. This executive is the core for the component-based flight software commonality and reuse process adopted at Goddard.

  8. Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station

    NASA Technical Reports Server (NTRS)

    Kirby, Randy L.; Mann, David; Prenger, Stephen G.; Craig, Wayne; Greenwood, Andrew; Morsics, Jonathan; Fricker, Charles H.; Quach, Son; Lechese, Paul

    2003-01-01

    United Space Alliance (USA) developed and used a new software development method to meet technical, schedule, and budget challenges faced during the development and delivery of the new Shuttle Telemetry Ground Station at Kennedy Space Center. This method, called Collaborative Software Development, enabled KSC to effectively leverage industrial software and build additional capabilities to meet shuttle system and operational requirements. Application of this method resulted in reduced time to market, reduced development cost, improved product quality, and improved programmer competence while developing technologies of benefit to a small company in California (AP Labs Inc.). Many modifications were made to the baseline software product (VMEwindow), which improved its quality and functionality. In addition, six new software capabilities were developed, which are the subject of this article and add useful functionality to the VMEwindow environment. These new software programs are written in C or VXWorks and are used in conjunction with other ground station software packages, such as VMEwindow, Matlab, Dataviews, and PVWave. The Space Shuttle Telemetry Ground Station receives frequency-modulation (FM) and pulse-code-modulated (PCM) signals from the shuttle and support equipment. The hardware architecture (see figure) includes Sun workstations connected to multiple PCM- and FM-processing VersaModule Eurocard (VME) chassis. A reflective memory network transports raw data from PCM Processors (PCMPs) to the programmable digital-to-analog (D/A) converters, strip chart recorders, and analysis and controller workstations.

  9. A Framework of the Use of Information in Software Testing

    ERIC Educational Resources Information Center

    Kaveh, Payman

    2010-01-01

    With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…

  10. A proven approach for more effective software development and maintenance

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose; Hall, Dana; Sinclair, Craig

    1994-01-01

    Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software; processes that cost less, yield higher quality products, and that self-correct for continual improvement evolution. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the Code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback, analogous to that of control loop systems. It is an approach with a time-tested track record proven through repeated applications across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory, and how it has been exploited within the Flight Dynamics Division (GSFC Code 500). Examples of specific improvement in the software itself and its processes are presented to illustrate the effectiveness of the methodology. Finally, initial findings from applying this methodology across the mission operations and ground data systems software domains throughout Code 500 are given.

  11. Software Development Standard Processes (SDSP)

    NASA Technical Reports Server (NTRS)

    Lavin, Milton L.; Wang, James J.; Morillo, Ronald; Mayer, John T.; Jamshidian, Barzia; Shimizu, Kenneth J.; Wilkinson, Belinda M.; Hihn, Jairus M.; Borgen, Rosana B.; Meyer, Kenneth N.

    2011-01-01

    A JPL-created set of standard processes is to be used throughout the lifecycle of software development. These SDSPs cover a range of activities, from management and engineering activities, to assurance and support activities. These processes must be applied to software tasks per a prescribed set of procedures. JPL's Software Quality Improvement Project is currently working at the behest of the JPL Software Process Owner to ensure that all applicable software tasks follow these procedures. The SDSPs are captured as a set of 22 standards in JPL's software process domain. They were developed in-house at JPL by a number of Subject Matter Experts (SMEs) residing primarily within the Engineering and Science Directorate, but also from the Business Operations Directorate and Safety and Mission Success Directorate. These practices include not only currently performed best practices, but also JPL-desired future practices in key thrust areas like software architecting and software reuse analysis. Additionally, these SDSPs conform to many standards and requirements to which JPL projects are beholden.

  12. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of software development can be measured. If software metrics are implemented in software development, they can save time and money, and allow the organization to identify the causes of defects that have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA metrics that have been used in other projects but are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.

  13. Software life cycle methodologies and environments

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest

    1991-01-01

    Products of this project will significantly improve the quality and productivity of Space Station Freedom Program software processes by improving software reliability and safety and by broadening the range of problems that can be solved with computational solutions. The project brings in Computer Aided Software Engineering (CASE) technology in the form of environments, such as the Engineering Script Language/Parts Composition System (ESL/PCS) application generator, an intelligent user interface for cost avoidance in setting up operational computer runs, the Framework programmable platform for defining process and software development work flow control, a process for bringing CASE technology into an organization's culture, and the CLIPS/CLIPS Ada language for developing expert systems; and methodologies, such as a method for developing fault-tolerant distributed systems and a method for developing systems for common-sense reasoning and for solving expert systems problems when only approximate truths are known.

  14. A code inspection process for security reviews

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Garzoglio, Gabriele; /Fermilab

    2009-05-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  15. The development procedures and tools applied for the attitude control software of the Italian satellite SAX

    NASA Astrophysics Data System (ADS)

    Hameetman, G. J.; Dekker, G. J.

    1993-11-01

    The Italian satellite SAX (with a large Dutch contribution) is a scientific satellite whose mission is to study X-ray sources. One main requirement for the Attitude and Orbit Control Subsystem (AOCS) is to achieve and maintain a stable pointing accuracy, with a limit cycle of less than 90 arcsec, during pointings of up to 28 hours. The main SAX instrument, the Narrow Field Instrument, is highly sensitive to (indirect) radiation coming from the Sun. This sensitivity leads to another main requirement: under no circumstances may the safe attitude domain be left. The paper describes the application software in relation to the overall SAX AOCS subsystem, the CASE tools used during development, some advantages and disadvantages of the use of these tools, the measures taken to meet the somewhat conflicting requirements of reliability and flexibility, and the lessons learned during development. The quality of the development approach was demonstrated by the (separately executed) hardware/software integration tests, during which a negligible number of software errors was detected in the application software.

  16. A code inspection process for security reviews

    NASA Astrophysics Data System (ADS)

    Garzoglio, Gabriele

    2010-04-01

    In recent years, it has become more and more evident that software threat communities are taking an increasing interest in Grid infrastructures. To mitigate the security risk associated with the increased numbers of attacks, the Grid software development community needs to scale up effort to reduce software vulnerabilities. This can be achieved by introducing security review processes as a standard project management practice. The Grid Facilities Department of the Fermilab Computing Division has developed a code inspection process, tailored to reviewing security properties of software. The goal of the process is to identify technical risks associated with an application and their impact. This is achieved by focusing on the business needs of the application (what it does and protects), on understanding threats and exploit communities (what an exploiter gains), and on uncovering potential vulnerabilities (what defects can be exploited). The desired outcome of the process is an improvement of the quality of the software artifact and an enhanced understanding of possible mitigation strategies for residual risks. This paper describes the inspection process and lessons learned on applying it to Grid middleware.

  17. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Previously, several research tasks have been conducted, some observations were obtained, and several possible suggestions have been contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. Also, a brief discussion is given on the role of software quality assurance in software engineering along with some observations and suggestions. A brief discussion on a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors are also included. Finally, a process model which can be used for searching and collecting software quality assurance tools is presented.

  18. SWiFT Software Quality Assurance Plan.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berg, Jonathan Charles

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP).

  19. Pragmatic quality metrics for evolutionary software development models

    NASA Technical Reports Server (NTRS)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.

  20. Increasing productivity through Total Reuse Management (TRM)

    NASA Technical Reports Server (NTRS)

    Schuler, M. P.

    1991-01-01

    Total Reuse Management (TRM) is a new concept currently being promoted by the NASA Langley Software Engineering and Ada Lab (SEAL). It uses concepts similar to those promoted in Total Quality Management (TQM). Both technical and management personnel are continually encouraged to think in terms of reuse. Reuse is not something that is aimed for after a product is completed, but rather it is built into the product from inception through development. Lowering software development costs, reducing risk, and increasing code reliability are the more prominent goals of TRM. Procedures and methods used to adopt and apply TRM are described. Reuse is frequently thought of as only being applicable to code. However, reuse can apply to all products and all phases of the software life cycle. These products include management and quality assurance plans, designs, and testing procedures. Specific examples of successfully reused products are given and future goals are discussed.

  1. Development of a software for quantitative evaluation radiotherapy target and organ-at-risk segmentation comparison.

    PubMed

    Kalpathy-Cramer, Jayashree; Awan, Musaddiq; Bedrick, Steven; Rasch, Coen R N; Rosenthal, David I; Fuller, Clifton D

    2014-02-01

    Modern radiotherapy requires accurate region of interest (ROI) inputs for plan optimization and delivery. Target delineation, however, remains operator-dependent and potentially serves as a major source of treatment delivery error. In order to optimize this critical, yet observer-driven process, a flexible web-based platform for individual and cooperative target delineation analysis and instruction was developed in order to meet the following unmet needs: (1) an open-source/open-access platform for automated/semiautomated quantitative interobserver and intraobserver ROI analysis and comparison, (2) a real-time interface for radiation oncology trainee online self-education in ROI definition, and (3) a source for pilot data to develop and validate quality metrics for institutional and cooperative group quality assurance efforts. The resultant software, Target Contour Testing/Instructional Computer Software (TaCTICS), developed using Ruby on Rails, has since been implemented and proven flexible, feasible, and useful in several distinct analytical and research applications.
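
    One standard way to quantify interobserver agreement between two delineated ROIs is the Dice similarity coefficient over their binary masks; the abstract does not list the specific metrics TaCTICS reports, so the sketch below is illustrative only.

      # One standard interobserver overlap metric for two delineated ROIs is the
      # Dice similarity coefficient over their binary masks. Illustrative only; the
      # abstract does not specify which comparison metrics TaCTICS computes.
      import numpy as np

      def dice(mask_a, mask_b):
          """Dice coefficient: 2|A ∩ B| / (|A| + |B|); 1.0 for identical contours."""
          a, b = mask_a.astype(bool), mask_b.astype(bool)
          denom = a.sum() + b.sum()
          return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

      # Two observers' target volumes on a toy 2D grid (1 = inside the contour).
      observer1 = np.zeros((64, 64), dtype=np.uint8); observer1[20:40, 20:40] = 1
      observer2 = np.zeros((64, 64), dtype=np.uint8); observer2[22:42, 18:38] = 1
      print(f"Dice = {dice(observer1, observer2):.3f}")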

  2. Benefit from NASA

    NASA Image and Video Library

    1999-06-01

    Two scientists at NASA's Marshall Space Flight Center, atmospheric scientist Paul Meyer and solar physicist Dr. David Hathaway, developed promising new software, called Video Image Stabilization and Registration (VISAR). VISAR may help law enforcement agencies catch criminals by improving the quality of video recorded at crime scenes. In this photograph, the single frame at left, taken at night, was brightened in order to enhance details and reduce noise or snow. To further overcome the video defects in one frame, law enforcement officials can use VISAR software to add information from multiple frames to reveal a person. Images from less than a second of videotape were added together to create the clarified image at right. VISAR stabilizes camera motion in the horizontal and vertical directions as well as rotation and zoom effects, producing clearer images of moving objects; it also smoothes jagged edges, enhances still images, and reduces video noise or snow. VISAR could also have applications in medical and meteorological imaging. It could steady images of ultrasounds, which are infamous for their grainy, blurred quality. The software can also be used for defense applications by improving reconnaissance video imagery made by military vehicles, aircraft, and ships traveling in harsh, rugged environments.
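
    A simplified sketch of the multi-frame idea mentioned above: averaging several noisy frames suppresses noise and recovers detail. VISAR additionally registers (aligns) the frames before combining them; that step and the synthetic data below are simplifications, not the VISAR algorithm.

      # Simplified sketch of the multi-frame idea mentioned above: averaging several
      # noisy video frames to suppress noise and recover detail. VISAR additionally
      # registers (aligns) the frames first; that step is omitted here for brevity.
      import numpy as np

      rng = np.random.default_rng(1)
      truth = np.tile(np.linspace(0, 255, 64), (64, 1))           # stand-in scene
      frames = [truth + rng.normal(scale=40.0, size=truth.shape)   # noisy frames
                for _ in range(16)]

      stacked = np.clip(np.mean(frames, axis=0), 0, 255)

      noise_single = np.std(frames[0] - truth)
      noise_stacked = np.std(stacked - truth)
      print(f"noise: single frame {noise_single:.1f}, 16-frame average {noise_stacked:.1f}")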

  3. An Internet Protocol-Based Software System for Real-Time, Closed-Loop, Multi-Spacecraft Mission Simulation Applications

    NASA Technical Reports Server (NTRS)

    Burns, Richard D.; Davis, George; Cary, Everett; Higinbotham, John; Hogie, Keith

    2003-01-01

    A mission simulation prototype for Distributed Space Systems has been constructed using existing developmental hardware and software testbeds at NASA's Goddard Space Flight Center. A locally distributed ensemble of testbeds, connected through the local area network, operates in real time and demonstrates the potential to assess the impact of subsystem-level modifications on system-level performance and, ultimately, on the quality and quantity of the end-product science data.

  4. Models and Frameworks: A Synergistic Association for Developing Component-Based Applications

    PubMed Central

    Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A.; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development. PMID:25147858

  5. Models and frameworks: a synergistic association for developing component-based applications.

    PubMed

    Alonso, Diego; Sánchez-Ledesma, Francisco; Sánchez, Pedro; Pastor, Juan A; Álvarez, Bárbara

    2014-01-01

    The use of frameworks and components has been shown to be effective in improving software productivity and quality. However, the results in terms of reuse and standardization show a dearth of portability either of designs or of component-based implementations. This paper, which is based on the model driven software development paradigm, presents an approach that separates the description of component-based applications from their possible implementations for different platforms. This separation is supported by automatic integration of the code obtained from the input models into frameworks implemented using object-oriented technology. Thus, the approach combines the benefits of modeling applications from a higher level of abstraction than objects, with the higher levels of code reuse provided by frameworks. In order to illustrate the benefits of the proposed approach, two representative case studies that use both an existing framework and an ad hoc framework, are described. Finally, our approach is compared with other alternatives in terms of the cost of software development.

  6. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.
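
    The abstract describes the cost and quality models only qualitatively, so the following is a deliberately simplified, hypothetical illustration of the kind of relationship involved: an expected error-discovery count that scales with size and is adjusted by multiplicative criticality, environment, and capability factors. The base rate and factor values are assumptions for illustration, not the paper's calibrated model.

    ```python
    def expected_errors_discovered(ksloc: float,
                                   criticality: float = 1.0,
                                   environment: float = 1.0,
                                   team_capability: float = 1.0) -> float:
        """Illustrative (hypothetical) quality model: expected errors discovered,
        scaled by multiplicative adjustment factors. The base rate and the
        multiplicative form are assumptions, not the paper's actual model."""
        base_rate_per_ksloc = 6.0  # assumed nominal defects discovered per KSLOC
        return ksloc * base_rate_per_ksloc * criticality * environment / team_capability

    # Example: a 150 KSLOC safety-critical system, mature environment, strong team
    print(expected_errors_discovered(150, criticality=1.3, environment=0.9, team_capability=1.2))
    ```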

  7. Software quality in 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  8. Models for Deploying Open Source and Commercial Software to Support Earth Science Data Processing and Distribution

    NASA Astrophysics Data System (ADS)

    Yetman, G.; Downs, R. R.

    2011-12-01

    Software deployment is needed to process and distribute scientific data throughout the data lifecycle. Developing software in-house can take software development teams away from other software development projects and can require efforts to maintain the software over time. Adopting and reusing software and system modules that have been previously developed by others can reduce in-house software development and maintenance costs and can contribute to the quality of the system being developed. A variety of models are available for reusing and deploying software and systems that have been developed by others. These deployment models include open source software, vendor-supported open source software, commercial software, and combinations of these approaches. Deployment in Earth science data processing and distribution has demonstrated the advantages and drawbacks of each model. Deploying open source software offers advantages for developing and maintaining scientific data processing systems and applications. By joining an open source community that is developing a particular system module or application, a scientific data processing team can contribute to aspects of the software development without having to commit to developing the software alone. Communities of interested developers can share the work while focusing on activities that utilize in-house expertise and address internal requirements. Maintenance is also shared by members of the community. Deploying vendor-supported open source software offers similar advantages to open source software. However, by procuring the services of a vendor, the in-house team can rely on the vendor to provide, install, and maintain the software over time. Vendor-supported open source software may be ideal for teams that recognize the value of an open source software component or application and would like to contribute to the effort, but do not have the time or expertise to contribute extensively. Vendor-supported software may also have the additional benefits of guaranteed up-time, bug fixes, and vendor-added enhancements. Deploying commercial software can be advantageous for obtaining system or software components offered by a vendor that meet in-house requirements. The vendor can be contracted to provide installation, support, and maintenance services as needed. Combining these options offers a menu of choices, enabling selection of system components or software modules that meet the evolving requirements encountered throughout the scientific data lifecycle.

  9. Labview Interface Concepts Used in NASA Scientific Investigations and Virtual Instruments

    NASA Technical Reports Server (NTRS)

    Roth, Don J.; Parker, Bradford H.; Rapchun, David A.; Jones, Hollis H.; Cao, Wei

    2001-01-01

    This article provides an overview of several software control applications developed for NASA using LabVIEW. The applications covered here include (1) an Ultrasonic Measurement System for nondestructive evaluation of advanced structural materials, (2) an X-ray Spectral Mapping System for characterizing the quality and uniformity of developing photon detector materials, (3) a Life Testing System for these same materials, and (4) the instrument panel for an aircraft-mounted Cloud Absorption Radiometer that measures the light scattered by clouds in multiple spectral bands. Many of the software interface concepts employed are explained. Panel layout and block diagram (code) strategies for each application are described. In particular, some of the more unique features of the applications' interfaces and source code are highlighted. This article assumes that the reader has a beginner-to-intermediate understanding of LabVIEW methods.

  10. Small Business Innovations

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Under an Army Small Business Innovation Research (SBIR) grant, Symbiotics, Inc. developed a software system that permits users to upgrade products from standalone applications so they can communicate in a distributed computing environment. Under a subsequent NASA SBIR grant, Symbiotics added additional tools to the SOCIAL product to enable NASA to coordinate conventional systems for planning Shuttle launch support operations. Using SOCIAL, data may be shared among applications in a computer network even when the applications are written in different programming languages. The product was introduced to the commercial market in 1993 and is used to monitor and control equipment for operation support and to integrate financial networks. The SBIR program was established to increase small business participation in federal R&D activities and to transfer government research to industry. InQuisiX is a reuse library providing high performance classification, cataloging, searching, browsing, retrieval and synthesis capabilities. These form the foundation for software reuse, producing higher quality software at lower cost and in less time. Software Productivity Solutions, Inc. developed the technology under Small Business Innovation Research (SBIR) projects funded by NASA and the Army and is marketing InQuisiX in conjunction with Science Applications International Corporation (SAIC). The SBIR program was established to increase small business participation in federal R&D activities and to transfer government research to industry.

  11. The NASA Software Research Infusion Initiative: Successful Technology Transfer for Software Assurance

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Pressburger, Thomas; Markosian, Lawrence; Feather, Martin S.

    2006-01-01

    New processes, methods and tools are constantly appearing in the field of software engineering. Many of these augur great potential in improving software development processes, resulting in higher quality software with greater levels of assurance. However, there are a number of obstacles that impede their infusion into software development practices. These are the recurring obstacles common to many forms of research. Practitioners cannot readily identify the emerging techniques that may most benefit them, and cannot afford to risk time and effort in evaluating and experimenting with them while there is still uncertainty about whether they will have payoff in this particular context. Similarly, researchers cannot readily identify those practitioners whose problems would be amenable to their techniques, and lack the feedback from practical applications necessary to help them evolve their techniques to make them more likely to be successful. This paper describes an ongoing effort conducted by a software engineering research infusion team, and the NASA Research Infusion Initiative, established by NASA's Software Engineering Initiative, to overcome these obstacles.

  12. Orbiter Flying Qualities (OFQ) Workstation user's guide

    NASA Technical Reports Server (NTRS)

    Myers, Thomas T.; Parseghian, Zareh; Hogue, Jeffrey R.

    1988-01-01

    This project was devoted to the development of a software package, called the Orbiter Flying Qualities (OFQ) Workstation, for working with the OFQ Archives which are specially selected sets of space shuttle entry flight data relevant to flight control and flying qualities. The basic approach to creation of the workstation software was to federate and extend commercial software products to create a low cost package that operates on personal computers. Provision was made to link the workstation to large computers, but the OFQ Archive files were also converted to personal computer diskettes and can be stored on workstation hard disk drives. The primary element of the workstation developed in the project is the Interactive Data Handler (IDH) which allows the user to select data subsets from the archives and pass them to specialized analysis programs. The IDH was developed as an application in a relational database management system product. The specialized analysis programs linked to the workstation include a spreadsheet program, FREDA for spectral analysis, MFP for frequency domain system identification, and NIPIP for pilot-vehicle system parameter identification. The workstation also includes capability for ensemble analysis over groups of missions.

  13. A Software for soil quality conservation at organic waste disposal areas: The case of olive mill and pistachio wastes.

    NASA Astrophysics Data System (ADS)

    Doula, Maria; Sarris, Apostolos; Papadopoulos, Nikos; Hliaoutakis, Aggelos; Kydonakis, Aris; Argyriou, Lemonia; Theocharopoulos, Sid; Kolovos, Chronis

    2016-04-01

    For the sustainable reuse of organic wastes on agricultural areas, apart from extensive evaluation of waste properties and characteristics, it is of significant importance, in order to protect soil quality, to evaluate land suitability and to estimate the correct application doses prior to waste landspreading. In light of this precondition, a software tool was developed that integrates GIS maps of land suitability for waste reuse (wastewater and solid waste) with an algorithm for estimating waste doses in relation to soil analysis and, in the case of reuse for fertilization, in relation to soil analysis, irrigation water quality, and plant needs. EU legislation and the legislative frameworks of European Member States are also considered for the assessment of waste suitability for landspreading and for the estimation of correct doses that will not cause adverse effects on soil or on underground water (e.g., the Nitrates Directive). Two examples of software functionality are presented in this study using data collected during two LIFE projects, i.e., Prosodol for landspreading of olive mill wastes and AgroStrat for pistachio wastes.
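
    The dose-estimation algorithm itself is not published in this abstract. The sketch below shows one plausible nitrogen-balance style calculation of a maximum waste dose, with the 170 kg N/ha cap echoing the Nitrates Directive limit for manure; all inputs, the availability factor, and the rule itself are illustrative assumptions rather than the software's actual logic.

    ```python
    def max_waste_dose_t_per_ha(crop_n_need_kg_ha: float,
                                soil_available_n_kg_ha: float,
                                waste_n_content_kg_per_t: float,
                                n_availability_factor: float = 0.5,
                                regulatory_n_cap_kg_ha: float = 170.0) -> float:
        """Illustrative nitrogen-balance dose calculation (tonnes of waste per hectare).

        The limiting nitrogen load is the smaller of the crop requirement not covered
        by the soil and a regulatory cap; parameter values and the rule are assumptions."""
        n_gap = max(crop_n_need_kg_ha - soil_available_n_kg_ha, 0.0)
        allowed_n = min(n_gap, regulatory_n_cap_kg_ha)
        effective_n_per_t = waste_n_content_kg_per_t * n_availability_factor
        if effective_n_per_t <= 0:
            return 0.0
        return allowed_n / effective_n_per_t

    # Example: orchard needing 120 kg N/ha, soil supplying 40 kg N/ha,
    # composted waste with 8 kg N per tonne of which ~50% becomes plant-available.
    print(f"{max_waste_dose_t_per_ha(120, 40, 8):.1f} t/ha")  # 20.0
    ```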

  14. LEGOS: Object-based software components for mission-critical systems. Final report, June 1, 1995--December 31, 1997

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1998-08-01

    An estimated 85% of the installed base of software is a custom application with a production quantity of one. In practice, almost 100% of military software systems are custom software. Paradoxically, the marginal costs of producing additional units are near zero. So why hasn't the software market, a market with high design costs and low production costs, evolved like other similar custom widget industries, such as automobiles and hardware chips? The military software industry seems immune to market pressures that have motivated a multilevel supply chain structure in other widget industries: design cost recovery, improved quality through specialization, and rapid assembly from purchased components. The primary goal of the ComponentWare Consortium (CWC) technology plan was to overcome barriers to building and deploying mission-critical information systems by using verified, reusable software components (ComponentWare). The adoption of the ComponentWare infrastructure is predicated upon a critical mass of the leading platform vendors adopting emerging, object-based, distributed computing frameworks--initially CORBA and COM/OLE. The long-range goal of this work is to build and deploy military systems from verified reusable architectures. The promise of component-based applications is to enable developers to snap together new applications by mixing and matching prefabricated software components. A key result of this effort is the concept of reusable software architectures. A second important contribution is the notion that a software architecture is something that can be captured in a formal language and reused across multiple applications. The formalization and reuse of software architectures provide major cost and schedule improvements. The Unified Modeling Language (UML) is fast becoming the industry standard for object-oriented analysis and design notation for object-based systems. However, the lack of a standard real-time distributed object operating system, the lack of a standard Computer-Aided Software Engineering (CASE) tool notation, and the lack of a standard CASE tool repository have limited the realization of component software. The approach to fulfilling this need is the software component factory innovation. The factory approach takes advantage of emerging standards such as UML, CORBA, Java, and the Internet. The key technical innovation of the software component factory is the ability to assemble and test new system configurations, as well as new tools, on demand from existing tools and architecture design repositories.

  15. Software engineering for ESO's VLT project

    NASA Astrophysics Data System (ADS)

    Filippi, G.

    1994-12-01

    This paper reports on the experience at the European Southern Observatory in applying software engineering techniques to a 200 man-year control software project for the Very Large Telescope (VLT), which will provide astronomers, before the end of the century, with one of the most powerful telescopes in the world. Starting from the general model described in the software management plan, specific activities have been and will be defined: standards for documents and for code development, a design approach using a CASE tool, the process of reviewing both documentation and code, quality assurance, test strategy, etc. The initial choices, the current implementation, and the future planned activities are presented and, where feedback is already available, pros and cons are discussed.

  16. Identification and evaluation of software measures

    NASA Technical Reports Server (NTRS)

    Card, D. N.

    1981-01-01

    A large scale, systematic procedure for identifying and evaluating measures that meaningfully characterize one or more elements of software development is described. The background of this research, the nature of the data involved, and the steps of the analytic procedure are discussed. An example of the application of this procedure to data from real software development projects is presented. As the term is used here, a measure is a count or numerical rating of the occurrence of some property. Examples of measures include lines of code, number of computer runs, person hours expended, and degree of use of top down design methodology. Measures appeal to the researcher and the manager as a potential means of defining, explaining, and predicting software development qualities, especially productivity and reliability.

  17. SIMPATIQCO: a server-based software suite which facilitates monitoring the time course of LC-MS performance metrics on Orbitrap instruments.

    PubMed

    Pichler, Peter; Mazanek, Michael; Dusberger, Frederico; Weilnböck, Lisa; Huber, Christian G; Stingl, Christoph; Luider, Theo M; Straube, Werner L; Köcher, Thomas; Mechtler, Karl

    2012-11-02

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC-MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge.
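
    As an illustration of how a QC tool can "learn" an acceptance range for a metric from historical runs using robust statistics, the sketch below uses the median and the median absolute deviation (MAD). SIMPATIQCO's actual rule and thresholds are not given in the abstract, so the window width and example values are assumptions.

    ```python
    import numpy as np

    def robust_qc_range(values, k: float = 3.0):
        """Learn an acceptance range for a QC metric from historical runs using
        robust statistics (median and MAD), so that a few bad runs do not distort
        the range. The +/- k*MAD*1.4826 window is a common robust analogue of
        +/- k sigma; the tool's exact rule is not spelled out in the abstract."""
        v = np.asarray(values, dtype=float)
        med = np.median(v)
        mad = np.median(np.abs(v - med)) * 1.4826  # scale MAD to sigma for normal data
        return med - k * mad, med + k * mad

    # Example: peptide elution-time metric (minutes) from previous QC runs
    history = [42.1, 41.8, 42.3, 42.0, 55.0, 41.9, 42.2]  # one outlier run at 55.0
    low, high = robust_qc_range(history)
    print(f"acceptable range: {low:.1f}-{high:.1f} min")
    new_run = 44.7
    print("flag" if not (low <= new_run <= high) else "ok")
    ```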

  18. SIMPATIQCO: A Server-Based Software Suite Which Facilitates Monitoring the Time Course of LC–MS Performance Metrics on Orbitrap Instruments

    PubMed Central

    2012-01-01

    While the performance of liquid chromatography (LC) and mass spectrometry (MS) instrumentation continues to increase, applications such as analyses of complete or near-complete proteomes and quantitative studies require constant and optimal system performance. For this reason, research laboratories and core facilities alike are recommended to implement quality control (QC) measures as part of their routine workflows. Many laboratories perform sporadic quality control checks. However, successive and systematic longitudinal monitoring of system performance would be facilitated by dedicated automatic or semiautomatic software solutions that aid an effortless analysis and display of QC metrics over time. We present the software package SIMPATIQCO (SIMPle AuTomatIc Quality COntrol) designed for evaluation of data from LTQ Orbitrap, Q-Exactive, LTQ FT, and LTQ instruments. A centralized SIMPATIQCO server can process QC data from multiple instruments. The software calculates QC metrics supervising every step of data acquisition from LC and electrospray to MS. For each QC metric the software learns the range indicating adequate system performance from the uploaded data using robust statistics. Results are stored in a database and can be displayed in a comfortable manner from any computer in the laboratory via a web browser. QC data can be monitored for individual LC runs as well as plotted over time. SIMPATIQCO thus assists the longitudinal monitoring of important QC metrics such as peptide elution times, peak widths, intensities, total ion current (TIC) as well as sensitivity, and overall LC–MS system performance; in this way the software also helps identify potential problems. The SIMPATIQCO software package is available free of charge. PMID:23088386

  19. Culture shock: Improving software quality

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely "self-taught" and has been producing "good" software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, "What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!" Coupling such surprise and resentment with the shock that software really is a product, and that software quality concepts do exist, can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a "typical" quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation, and friendly assistance can overcome this culture shock. 3 refs.

  20. QCScreen: a software tool for data quality control in LC-HRMS based metabolomics.

    PubMed

    Simader, Alexandra Maria; Kluger, Bernhard; Neumann, Nora Katharina Nicole; Bueschl, Christoph; Lemmens, Marc; Lirk, Gerald; Krska, Rudolf; Schuhmacher, Rainer

    2015-10-24

    Metabolomics experiments often comprise large numbers of biological samples resulting in huge amounts of data. These data need to be inspected for plausibility before data evaluation to detect putative sources of error, e.g., retention time or mass accuracy shifts. Especially in liquid chromatography-high resolution mass spectrometry (LC-HRMS) based metabolomics research, proper quality control checks (e.g., for precision, signal drifts or offsets) are crucial prerequisites to achieve reliable and comparable results within and across experimental measurement sequences. Software tools can support this process. The software tool QCScreen was developed to offer a quick and easy data quality check of LC-HRMS derived data. It allows a flexible investigation and comparison of basic quality-related parameters within user-defined target features and the possibility to automatically evaluate multiple sample types within or across different measurement sequences in a short time. It offers a user-friendly interface that allows an easy selection of processing steps and parameter settings. The generated results include a coloured overview plot of data quality across all analysed samples and targets and, in addition, detailed illustrations of the stability and precision of the chromatographic separation, the mass accuracy and the detector sensitivity. The use of QCScreen is demonstrated with experimental data from metabolomics experiments using selected standard compounds in pure solvent. The application of the software identified problematic features, samples and analytical parameters and suggested which data files or compounds required closer manual inspection. QCScreen is an open source software tool which provides a useful basis for assessing the suitability of LC-HRMS data prior to time-consuming, detailed data processing and subsequent statistical analysis. It accepts the generic mzXML format and thus can be used with many different LC-HRMS platforms to process both quality control sample types and experimental samples in one or more measurement sequences.
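
    A minimal sketch of the kind of per-target check such a tool performs: compare each run's observed retention time for user-defined target features against expected values and flag runs that drift beyond a tolerance. The feature names, tolerance, and data layout are illustrative and are not QCScreen's actual interface.

    ```python
    def rt_drift_flags(rt_by_run: dict, expected_rt: dict, tol_min: float = 0.25):
        """Flag runs whose observed retention time for any target feature deviates
        from its expected value by more than a tolerance (a simplified stand-in
        for a per-target QC check; names and values are hypothetical)."""
        flags = {}
        for run, observed in rt_by_run.items():
            bad = {feature: rt for feature, rt in observed.items()
                   if abs(rt - expected_rt[feature]) > tol_min}
            if bad:
                flags[run] = bad
        return flags

    expected = {"caffeine": 3.10, "reserpine": 7.85}          # minutes
    runs = {
        "QC_01": {"caffeine": 3.12, "reserpine": 7.83},
        "QC_02": {"caffeine": 3.09, "reserpine": 7.88},
        "QC_03": {"caffeine": 3.55, "reserpine": 8.20},       # drifted run
    }
    print(rt_drift_flags(runs, expected))  # {'QC_03': {'caffeine': 3.55, 'reserpine': 8.2}}
    ```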

  1. Software quality for 1997 - what works and what doesn't?

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jones, C.

    1997-11-01

    This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels.

  2. Application of reiteration of Hankel singular value decomposition in quality control

    NASA Astrophysics Data System (ADS)

    Staniszewski, Michał; Skorupa, Agnieszka; Boguszewicz, Łukasz; Michalczuk, Agnieszka; Wereszczyński, Kamil; Wicher, Magdalena; Konopka, Marek; Sokół, Maria; Polański, Andrzej

    2017-07-01

    Medical centres are obliged to store past medical records, including the results of quality assurance (QA) tests of the medical equipment, which is especially useful in checking the reproducibility of medical devices and procedures. Analysis of multivariate time series is an important part of quality control of NMR data. In this work we propose an anomaly detection tool based on the Reiteration of Hankel Singular Value Decomposition method. The presented method was compared with external software, and the authors obtained comparable results.
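
    The paper's reiterated variant and its decision rule are not reproduced here. The sketch below only illustrates the underlying idea: describe a series by the singular values of its Hankel matrix and compare those spectra between a nominal and a suspect QA acquisition. The window length, signals, and distance-based score are assumptions made for the example.

    ```python
    import numpy as np
    from scipy.linalg import hankel

    def hankel_singular_values(signal: np.ndarray, window: int) -> np.ndarray:
        """Build a Hankel matrix from a 1-D signal and return its singular values,
        a compact description of the signal's dynamics."""
        h = hankel(signal[:window], signal[window - 1:])
        return np.linalg.svd(h, compute_uv=False)

    rng = np.random.default_rng(1)
    t = np.linspace(0, 1, 200)
    normal_run = np.sin(2 * np.pi * 5 * t) + 0.05 * rng.normal(size=t.size)
    anomalous_run = np.sin(2 * np.pi * 5 * t) + 0.6 * rng.normal(size=t.size)  # noisier QA run

    sv_normal = hankel_singular_values(normal_run, window=20)
    sv_anomalous = hankel_singular_values(anomalous_run, window=20)

    # A crude anomaly score: distance between normalised singular-value spectra.
    score = np.linalg.norm(sv_normal / sv_normal.sum() - sv_anomalous / sv_anomalous.sum())
    print(f"anomaly score vs. normal run: {score:.3f}")
    ```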

  3. Software metrics: Software quality metrics for distributed systems. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Post, J. V.

    1981-01-01

    The concept of software quality metrics was extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.

  4. The Impact of Software Culture on the Management of Community Data

    NASA Astrophysics Data System (ADS)

    Collins, J. A.; Pulsifer, P. L.; Sheffield, E.; Lewis, S.; Oldenburg, J.

    2013-12-01

    The Exchange for Local Observations and Knowledge of the Arctic (ELOKA), a program hosted at the National Snow and Ice Data Center (NSIDC), supports the collection, curation, and distribution of Local and Traditional Knowledge (LTK) data, as well as some quantitative data products. Investigations involving LTK data often involve community participation, and therefore require flexible and robust user interfaces to support a reliable process of data collection and management. Often, investigators focused on LTK and community-based monitoring choose to use ELOKA's data services based on our ability to provide rapid proof-of-concepts and economical delivery of a usable product. To satisfy these two overarching criteria, ELOKA is experimenting with modifications to its software development culture both in terms of how the software applications are developed as well as the kind of software applications (or components) being developed. Over the past several years, NSIDC has shifted its software development culture from one of assigning individual scientific programmers to support particular principal investigators or projects, to an Agile Software Methodology implementation using Scrum practices. ELOKA has participated in this process by working with other product owners to schedule and prioritize development work which is then implemented by a team of application developers. Scrum, along with practices such as Test Driven Development (TDD) and paired programming, improves the quality of the software product delivered to the user community. To meet the need for rapid prototyping and to maximize product development and support with limited developer input, our software development efforts are now focused on creating a platform of application modules that can be quickly customized to suit the needs of a variety of LTK projects. This approach is in contrast to the strategy of delivering custom applications for individual projects. To date, we have integrated components of the Nunaliit Atlas framework (a Java/JavaScript client-server web-based application) with an existing Ruby on Rails application. This approach requires transitioning individual applications to expose a service layer, thus allowing interapplication communication via RESTful services. In this presentation we will report on our experiences using Agile Scrum practices, our efforts to move from custom solutions to a platform of customizable modules, and the impact of each on our ability to support researchers and Arctic residents in the domain of community-based observations and knowledge.

  5. An empirical evaluation of software quality assurance practices and challenges in a developing country: a comparison of Nigeria and Turkey.

    PubMed

    Sowunmi, Olaperi Yeside; Misra, Sanjay; Fernandez-Sanz, Luis; Crawford, Broderick; Soto, Ricardo

    2016-01-01

    The importance of quality assurance in the software development process cannot be overemphasized because its adoption results in high reliability and easy maintenance of the software system and other software products. Software quality assurance includes different activities such as quality control, quality management, quality standards, quality planning, process standardization and improvement, amongst others. The aim of this work is to further investigate the software quality assurance practices of practitioners in Nigeria. While our previous work covered areas on quality planning, adherence to standardized processes and the inherent challenges, this work has been extended to include quality control, software process improvement and international quality standard organization membership. It also makes a comparison based on a similar study carried out in Turkey. The goal is to generate more robust findings that can properly support decision making by the software community. A qualitative research approach, specifically the use of questionnaire research instruments, was applied to acquire data from software practitioners. In addition to the previous results, it was observed that quality assurance practices are quite neglected, and this can be the cause of low patronage. Moreover, software practitioners are aware neither of international standards organizations nor of the required process improvement techniques; as such, their claimed standards are not aligned to those of accredited bodies and are limited to their local experience and knowledge, which makes them questionable. The comparison with Turkey also yielded similar findings, making the results typical of developing countries. The research instrument used was tested for internal consistency using Cronbach's alpha, and it was proved reliable. For the software industry in developing countries to grow strong and be a viable source of external revenue, software assurance practices have to be taken seriously because their effect is evident in the final product. Moreover, quality frameworks and tools which require minimum time and cost are highly needed in these countries.

  6. Computerized stratified random site-selection approaches for design of a ground-water-quality sampling network

    USGS Publications Warehouse

    Scott, J.C.

    1990-01-01

    Computer software was written to randomly select sites for a ground-water-quality sampling network. The software uses digital cartographic techniques and subroutines from a proprietary geographic information system. The report presents the approaches, computer software, and sample applications. It is often desirable to collect ground-water-quality samples from various areas in a study region that have different values of a spatial characteristic, such as land-use or hydrogeologic setting. A stratified network can be used for testing hypotheses about relations between spatial characteristics and water quality, or for calculating statistical descriptions of water-quality data that account for variations that correspond to the spatial characteristic. In the software described, a study region is subdivided into areal subsets that have a common spatial characteristic to stratify the population into several categories from which sampling sites are selected. Different numbers of sites may be selected from each category of areal subsets. A population of potential sampling sites may be defined by either specifying a fixed population of existing sites, or by preparing an equally spaced population of potential sites. In either case, each site is identified with a single category, depending on the value of the spatial characteristic of the areal subset in which the site is located. Sites are selected from one category at a time. One of two approaches may be used to select sites. Sites may be selected randomly, or the areal subsets in the category can be grouped into cells and sites selected randomly from each cell.
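
    A minimal sketch of the stratified selection idea described above: sites are grouped by the category of the areal subset they fall in, and a specified number of sites is drawn at random from each category. The categories, site counts, and data layout are invented for illustration and do not reflect the report's GIS implementation.

    ```python
    import random

    def stratified_site_selection(sites, n_per_category, seed=42):
        """Randomly select a specified number of sampling sites from each category
        (stratum). `sites` maps a site id to its category, e.g. a land-use class."""
        rng = random.Random(seed)
        by_category = {}
        for site_id, category in sites.items():
            by_category.setdefault(category, []).append(site_id)
        selection = {}
        for category, wanted in n_per_category.items():
            pool = sorted(by_category.get(category, []))
            selection[category] = rng.sample(pool, min(wanted, len(pool)))
        return selection

    # Example: wells classified by the land use of the areal subset they fall in
    wells = {f"well_{i:03d}": ("urban" if i % 3 == 0 else
                               "agricultural" if i % 3 == 1 else "rangeland")
             for i in range(60)}
    print(stratified_site_selection(wells, {"urban": 4, "agricultural": 6, "rangeland": 3}))
    ```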

  7. The Influence of Software Complexity on the Maintenance Effort: Case Study on Software Developed within Educational Process

    ERIC Educational Resources Information Center

    Radulescu, Iulian Ionut

    2006-01-01

    Software complexity is the most important software quality attribute and a very useful instrument in the study of software quality. It is one of the factors that affect most of the software quality characteristics, including maintainability. It is very important to quantify this influence and identify the means to keep it under control; by using…

  8. Integrated software system for improving medical equipment management.

    PubMed

    Bliznakov, Z; Pappous, G; Bliznakova, K; Pallikarakis, N

    2003-01-01

    The evolution of biomedical technology has led to an extraordinary use of medical devices in health care delivery. During the last decade, clinical engineering departments (CEDs) turned toward computerization and application of specific software systems for medical equipment management in order to improve their services and monitor outcomes. Recently, much emphasis has been given to patient safety. Through its Medical Device Directives, the European Union has required all member nations to use a vigilance system to prevent the reoccurrence of adverse events that could lead to injuries or death of patients or personnel as a result of equipment malfunction or improper use. The World Health Organization also has made this issue a high priority and has prepared a number of actions and recommendations. In the present work, a new integrated, Windows-oriented system is proposed, addressing all tasks of CEDs but also offering a global approach to their management needs, including vigilance. The system architecture is based on a star model, consisting of a central core module and peripheral units. Its development has been based on the integration of 3 software modules, each one addressing specific predefined tasks. The main features of this system include equipment acquisition and replacement management, inventory archiving and monitoring, follow up on scheduled maintenance, corrective maintenance, user training, data analysis, and reports. It also incorporates vigilance monitoring and information exchange for adverse events, together with a specific application for quality-control procedures. The system offers clinical engineers the ability to monitor and evaluate the quality and cost-effectiveness of the service provided by means of quality and cost indicators. Particular emphasis has been placed on the use of harmonized standards with regard to medical device nomenclature and classification. The system's practical applications have been demonstrated through a pilot evaluation trial.

  9. GLOBECOM '84 - Global Telecommunications Conference, Atlanta, GA, November 26-29, 1984, Conference Record. Volume 1

    NASA Astrophysics Data System (ADS)

    The subjects discussed are related to LSI/VLSI based subscriber transmission and customer access for the Integrated Services Digital Network (ISDN), special applications of fiber optics, ISDN and competitive telecommunication services, technical preparations for the Geostationary-Satellite Orbit Conference, high-capacity statistical switching fabrics, networking and distributed systems software, adaptive arrays and cancelers, synchronization and tracking, speech processing, advances in communication terminals, full-color videotex, and a performance analysis of protocols. Advances in data communications are considered along with transmission network plans and progress, direct broadcast satellite systems, packet radio system aspects, radio-new and developing technologies and applications, the management of software quality, and Open Systems Interconnection (OSI) aspects of telematic services. Attention is given to personal computers and OSI, the role of software reliability measurement in information systems, and an active array antenna for the next-generation direct broadcast satellite.

  10. Quality Market: Design and Field Study of Prediction Market for Software Quality Control

    ERIC Educational Resources Information Center

    Krishnamurthy, Janaki

    2010-01-01

    Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…

  11. The development and technology transfer of software engineering technology at NASA. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.

    1992-01-01

    The United States' big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission-critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends are outlined in CASE technology, and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.

  12. Federal Communications Commission (FCC) Transponder Loading Data Conversion Software. User's guide and software maintenance manual, version 1.2

    NASA Technical Reports Server (NTRS)

    Mallasch, Paul G.

    1993-01-01

    This volume contains the complete software system documentation for the Federal Communications Commission (FCC) Transponder Loading Data Conversion Software (FIX-FCC). This software was written to facilitate the formatting and conversion of FCC Transponder Occupancy (Loading) Data before it is loaded into the NASA Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS). The information that FCC supplies NASA is in report form and must be converted into a form readable by the database management software used in the GSOSTATS application. Both the User's Guide and Software Maintenance Manual are contained in this document. This volume of documentation passed an independent quality assurance review and certification by the Product Assurance and Security Office of the Planning Research Corporation (PRC). The manuals were reviewed for format, content, and readability. The Software Management and Assurance Program (SMAP) life cycle and documentation standards were used in the development of this document. Accordingly, these standards were used in the review. Refer to the System/Software Test/Product Assurance Report for the Geosynchronous Satellite Orbital Statistics Database System (GSOSTATS) for additional information.

  13. Four simple recommendations to encourage best practices in research software

    PubMed Central

    Jiménez, Rafael C.; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll.; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C.; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S.; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J.; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V.; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S.; Crouch, Steve

    2017-01-01

    Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations. PMID:28751965

  14. Four simple recommendations to encourage best practices in research software.

    PubMed

    Jiménez, Rafael C; Kuzak, Mateusz; Alhamdoosh, Monther; Barker, Michelle; Batut, Bérénice; Borg, Mikael; Capella-Gutierrez, Salvador; Chue Hong, Neil; Cook, Martin; Corpas, Manuel; Flannery, Madison; Garcia, Leyla; Gelpí, Josep Ll; Gladman, Simon; Goble, Carole; González Ferreiro, Montserrat; Gonzalez-Beltran, Alejandra; Griffin, Philippa C; Grüning, Björn; Hagberg, Jonas; Holub, Petr; Hooft, Rob; Ison, Jon; Katz, Daniel S; Leskošek, Brane; López Gómez, Federico; Oliveira, Luis J; Mellor, David; Mosbergen, Rowland; Mulder, Nicola; Perez-Riverol, Yasset; Pergl, Robert; Pichler, Horst; Pope, Bernard; Sanz, Ferran; Schneider, Maria V; Stodden, Victoria; Suchecki, Radosław; Svobodová Vařeková, Radka; Talvik, Harry-Anton; Todorov, Ilian; Treloar, Andrew; Tyagi, Sonika; van Gompel, Maarten; Vaughan, Daniel; Via, Allegra; Wang, Xiaochuan; Watson-Haigh, Nathan S; Crouch, Steve

    2017-01-01

    Scientific research relies on computer software, yet software is not always developed following practices that ensure its quality and sustainability. This manuscript does not aim to propose new software development best practices, but rather to provide simple recommendations that encourage the adoption of existing best practices. Software development best practices promote better quality software, and better quality software improves the reproducibility and reusability of research. These recommendations are designed around Open Source values, and provide practical suggestions that contribute to making research software and its source code more discoverable, reusable and transparent. This manuscript is aimed at developers, but also at organisations, projects, journals and funders that can increase the quality and sustainability of research software by encouraging the adoption of these recommendations.

  15. LINKING THE CMAQ AND HYSPLIT MODELING SYSTEM INTERFACE PROGRAM AND EXAMPLE APPLICATION

    EPA Science Inventory

    A new software tool has been developed to link the Eulerian-based Community Multiscale Air Quality (CMAQ) modeling system with the Lagrangian-based HYSPLIT (HYbrid Single-Particle Lagrangian Integrated Trajectory) model. Both models require many of the same hourly meteorological...

  16. Using Archives for Education.

    ERIC Educational Resources Information Center

    MacKenzie, Douglas

    1996-01-01

    Discusses the use of computer systems for archival applications based on experiences at the Demarco European Arts Foundation (Scotland) and the TAMH Project, an attempt to build a virtual museum of Tay Valley maritime history. Highlights include hardware; development software; data representation, including storage space versus quality;…

  17. CrossTalk: The Journal of Defense Software Engineering. Volume 24, Number 6. November/December 2011

    DTIC Science & Technology

    2011-11-01

    Software Development.” Software Quality Professional Journal, American Society for Quality (ASQ), (March 2010) 4-14. 3. Nair, Gopalakrishnan T.R...Inspection Performance Metric”. Software Quality Professional Journal, American Society for Quality (ASQ), Volume 13, Issue 2, (March 2011) 14-26...the discovery process and are marketed by companies such as Black Duck Software, OpenLogic, Palamida, and Protecode, among others.7 A number of open

  18. Preparation of Power Distribution System for High Penetration of Renewable Energy. Part I. Dynamic Voltage Restorer for Voltage Regulation. Part II. Distribution Circuit Modeling and Validation

    NASA Astrophysics Data System (ADS)

    Khoshkbar Sadigh, Arash

    Part I: Dynamic Voltage Restorer. In present power grids, voltage sags are recognized as a serious threat and a frequently occurring power-quality problem, with costly consequences such as the tripping of sensitive loads and production loss. Consequently, the demand for high power quality and voltage stability has become a pressing issue. The dynamic voltage restorer (DVR), a custom power device, is one of the more effective and direct solutions for "restoring" the quality of the voltage at its load-side terminals when the voltage at its source-side terminals is disturbed. In the first part of this thesis, a DVR configuration that needs no bulky dc-link capacitor or energy storage is proposed. This reduces the size of the DVR and increases the reliability of the circuit. In addition, the proposed DVR topology is based on a high-frequency isolation transformer, which reduces the transformer size. The proposed DVR circuit, which is suitable for both low- and medium-voltage applications, is based on dc-ac converters connected in series so that the main dc link is split between the inputs of the dc-ac converters. This makes it possible to use modular dc-ac converters and to employ low-voltage components in these converters whenever the DVR must be used in medium-voltage applications. The proposed configuration is tested under different conditions of load power factor and grid voltage harmonics, and it is shown that the proposed DVR can compensate voltage sags effectively and protect sensitive loads. Following the proposed DVR topology, a fundamental-voltage-amplitude detection method applicable to both single- and three-phase systems for DVR applications is proposed. The advantages of the proposed method include applicability in a distorted power grid with no need for any low-pass filter, precise and reliable detection, and simple computation and implementation without a phase-locked loop or lookup table. The proposed method has been verified by simulation and experimental tests under various conditions covering all relevant cases, such as different voltage sag depths (VSD), different points on wave (POW) at which the sag occurs, harmonic distortion, line-frequency variation, and phase jump (PJ). Furthermore, the ripple of the fundamental voltage amplitude calculated by the proposed method, and its error, are analyzed considering line-frequency variation together with harmonic distortion. The best and worst detection times of the proposed method were measured as 1 ms and 8.8 ms, respectively. Finally, the proposed method is compared with other voltage sag detection methods available in the literature. Part II: Power System Modeling for Renewable Energy Integration. As power distribution systems evolve into more complex networks, electrical engineers have to rely on software tools to perform circuit analysis. Dozens of powerful software tools are available on the market to perform power system studies. Although their main functions are similar, there are differences in features and formatting structures to suit specific applications. This creates challenges for transferring power system circuit model data (PSCMD) between different software packages and rebuilding the same circuit in the second software environment. The objective of this part of the thesis is to develop a Unified Platform (UP) to facilitate the transfer of PSCMD among different software packages and relieve the challenges of the circuit-model conversion process.
    The UP uses a commonly available spreadsheet file with a defined format, to which any source software can write data and from which any destination software can read data, via a script-based application called the PSCMD transfer application. The main considerations in developing the UP are to minimize manual intervention and to import a one-line diagram into the destination software, or export it from the source software, with all the details needed for load flow, short circuit, and other analyses. In this study, ETAP, OpenDSS, and GridLAB-D are considered, and PSCMD transfer applications written in MATLAB have been developed for each of these to read the circuit model data provided in the UP spreadsheet. To test the developed PSCMD transfer applications, circuit model data for a test circuit and for a power distribution circuit from Southern California Edison (SCE), a utility company, both built in CYME, were exported into spreadsheet files according to the UP format. Thereafter, the circuit model data were imported successfully from the spreadsheet files into the above-mentioned software using the PSCMD transfer applications developed for each package. After the studied SCE circuit was transferred into OpenDSS using the proposed UP scheme and the developed application, it was used to investigate the impacts of large-scale solar energy penetration. The main challenge of integrating solar energy into the power grid is its intermittency (i.e., discontinuity of output power) caused by cloud shading of the photovoltaic panels, which depends on weather conditions. The OpenDSS time-series simulation feature, which is required because of this intermittency, is utilized for the study. The impacts of intermittent solar energy penetration, especially at points of high variability, on voltage fluctuation and on the operation of capacitor banks and voltage regulators are presented. In addition, the need to interpolate and resample unequally spaced time-series measurement data into equally spaced data, as well as the effect of the resampling time interval on the resulting error, is discussed. Two applications are developed in MATLAB to perform the interpolation and resampling and to calculate the error for different resampling time intervals in order to determine a suitable interval. Furthermore, an approach based on cumulative distributions of line/cable lengths by type and of load power ratings is presented to prioritize the loads, lines, and cables at which meters should be installed to have the greatest effect on model validation.
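
    The thesis's resampling applications were written in MATLAB and its error analysis is not reproduced here; the following Python sketch only illustrates the basic step of interpolating unequally spaced measurements onto an equally spaced time grid. The timestamps, values, and step size are invented for the example.

    ```python
    import numpy as np

    def resample_to_uniform(times_s, values, step_s):
        """Linearly interpolate unequally spaced measurements onto an equally
        spaced time grid (step_s seconds). A simple stand-in for the kind of
        resampling described above; the chosen interval and error metric differ."""
        times_s = np.asarray(times_s, dtype=float)
        values = np.asarray(values, dtype=float)
        uniform_t = np.arange(times_s[0], times_s[-1] + 1e-9, step_s)
        return uniform_t, np.interp(uniform_t, times_s, values)

    # Irregularly timestamped PV output measurements (seconds, kW)
    t = [0.0, 2.1, 4.9, 5.4, 9.8, 15.0]
    p = [120, 118, 60, 55, 110, 115]     # a cloud transient around t = 5 s
    grid_t, grid_p = resample_to_uniform(t, p, step_s=1.0)
    print(np.round(grid_p, 1))
    ```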

  19. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK) including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly and will describe how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  20. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.
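
    To make the reliability definition above concrete, the following minimal sketch evaluates R(t) under one simple, commonly used assumption (a constant failure rate, i.e., an exponentially distributed time to failure); it is only an illustration and does not reproduce the reliability models discussed in the paper, and the numbers are invented:

```python
# A minimal sketch of the reliability definition above under one simple model
# (constant failure rate, exponentially distributed time to failure).
import math

def reliability(failure_rate_per_hour: float, mission_hours: float) -> float:
    """R(t) = exp(-lambda * t): probability of failure-free operation for t hours."""
    return math.exp(-failure_rate_per_hour * mission_hours)

# Example: an estimated 1 failure per 10,000 operating hours, 200-hour mission
lam = 1.0 / 10_000
print(f"R(200 h) = {reliability(lam, 200):.4f}")   # ~0.9802
```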

  1. A Systematic Process for Developing High Quality SaaS Cloud Services

    NASA Astrophysics Data System (ADS)

    La, Hyun Jung; Kim, Soo Dong

    Software-as-a-Service (SaaS) is a type of cloud service which provides software functionality through Internet. Its benefits are well received in academia and industry. To fully utilize the benefits, there should be effective methodologies to support the development of SaaS services which provide high reusability and applicability. Conventional approaches such as object-oriented methods do not effectively support SaaS-specific engineering activities such as modeling common features, variability, and designing quality services. In this paper, we present a systematic process for developing high quality SaaS and highlight the essentiality of commonality and variability (C&V) modeling to maximize the reusability. We first define criteria for designing the process model and provide a theoretical foundation for SaaS; its meta-model and C&V model. We clarify the notion of commonality and variability in SaaS, and propose a SaaS development process which is accompanied with engineering instructions. Using the proposed process, SaaS services with high quality can be effectively developed.

  2. Development strategies for the satellite flight software on-board Meteosat Third Generation

    NASA Astrophysics Data System (ADS)

    Tipaldi, Massimo; Legendre, Cedric; Koopmann, Olliver; Ferraguto, Massimo; Wenker, Ralf; D'Angelo, Gianni

    2018-04-01

    Nowadays, satellites are becoming increasingly software dependent. Satellite Flight Software (FSW), that is to say, the application software running on the satellite main On-Board Computer (OBC), plays a relevant role in implementing complex space mission requirements. In this paper, we examine relevant technical approaches and programmatic strategies adopted for the development of the Meteosat Third Generation Satellite (MTG) FSW. To begin with, we present its layered model-based architecture, and the means for ensuring a robust and reliable interaction among the FSW components. Then, we focus on the selection of an effective software development life cycle model. In particular, by combining plan-driven and agile approaches, we can fulfill the need of having preliminary SW versions. They can be used for the elicitation of complex system-level requirements as well as for the initial satellite integration and testing activities. Another important aspect can be identified in the testing activities. Indeed, very demanding quality requirements have to be fulfilled in satellite SW applications. This manuscript proposes a test automation framework, which uses an XML-based test procedure language independent of the underlying test environment. Finally, a short overview of the MTG FSW sizing and timing budgets concludes the paper.

  3. 78 FR 16474 - Extension of the Period for Comments on the Enhancement of Quality of Software-Related Patents

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-03-15

    ...] Extension of the Period for Comments on the Enhancement of Quality of Software-Related Patents AGENCY... announcing the formation of a partnership with the software community to enhance the quality of software-related patents (Software Partnership), and a request for comments on the preparation of patent...

  4. Generation of Long-time Complex Signals for Testing the Instruments for Detection of Voltage Quality Disturbances

    NASA Astrophysics Data System (ADS)

    Živanović, Dragan; Simić, Milan; Kokolanski, Zivko; Denić, Dragan; Dimcev, Vladimir

    2018-04-01

    A software-supported procedure for the generation of long-time complex test sequences, suitable for testing instruments that detect standard voltage quality (VQ) disturbances, is presented in this paper. This solution for test signal generation includes significant improvements to the computer-based signal generator presented and described in a previously published paper [1]. The generator is based on virtual instrumentation software for defining the basic signal parameters, a data acquisition card NI 6343, and a power amplifier for amplification of the output voltage level to the nominal RMS voltage value of 230 V. Definition of the basic signal parameters in the LabVIEW application software is supported using Script files, which allows simple repetition of specific test signals and combination of several different test sequences into a complex composite test waveform. The basic advantage of this generator compared with similar signal generation solutions is the possibility of long-time test sequence generation according to predefined complex test scenarios, including various combinations of VQ disturbances defined in accordance with the European standard EN 50160. Experimental verification of the presented signal generator's capability is performed by testing the commercial power quality analyzer Fluke 435 Series II. Some characteristic complex test signals with various disturbances, together with logged data obtained from the tested power quality analyzer, are shown in this paper.
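
    As a simplified illustration of composing scripted test sequences with VQ disturbances (this is not the LabVIEW/NI 6343 generator described in the paper, and the segment parameters are invented), a Python sketch might assemble a nominal 230 V, 50 Hz waveform, a 40% dip, and a harmonic-distortion segment as follows:

```python
# Simplified sketch of composing a long test sequence with VQ disturbances
# (a 40% voltage dip and added 5th-harmonic distortion). It only illustrates the
# idea of scripted composite test signals; all parameters are made up.
import numpy as np

FS = 10_000          # sampling rate [Hz]
F0 = 50.0            # fundamental frequency [Hz]
V_RMS = 230.0        # nominal RMS voltage [V]

def segment(duration_s, dip_depth=0.0, h5_ratio=0.0, phase=0.0):
    """One test segment: optional dip (relative depth) and 5th harmonic (relative amplitude)."""
    t = np.arange(0, duration_s, 1.0 / FS)
    amp = np.sqrt(2) * V_RMS * (1.0 - dip_depth)
    v = amp * np.sin(2 * np.pi * F0 * t + phase)
    v += h5_ratio * np.sqrt(2) * V_RMS * np.sin(2 * np.pi * 5 * F0 * t)
    return v

# Scripted sequence: nominal -> 40% dip for 0.2 s -> nominal with 6% 5th harmonic
test_sequence = np.concatenate([
    segment(1.0),
    segment(0.2, dip_depth=0.4),
    segment(1.0, h5_ratio=0.06),
])
print(f"{test_sequence.size} samples, {test_sequence.size / FS:.1f} s of test signal")
```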

  5. The State of Software for Evolutionary Biology.

    PubMed

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-05-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development.

  6. Consolidated View on Space Software Engineering Problems - An Empirical Study

    NASA Astrophysics Data System (ADS)

    Silva, N.; Vieira, M.; Ricci, D.; Cotroneo, D.

    2015-09-01

    Independent software verification and validation (ISVV) has been a key process for engineering quality assessment for decades, and is considered in several international standards. The "European Space Agency (ESA) ISVV Guide" is used in the European space market to drive the ISVV tasks and plans, and to select applicable tasks and techniques. Software artefacts have room for improvement, given the number of issues found during ISVV tasks. This article presents the analysis of a large set of ISVV issues originating from three different ESA missions, amounting to more than 1000 issues. The study presents the main types, triggers and impacts related to the ISVV issues found and sets the path for a global software engineering improvement based on the most common deficiencies identified for space projects.

  7. Clinical evaluation of reducing acquisition time on single-photon emission computed tomography image quality using proprietary resolution recovery software.

    PubMed

    Aldridge, Matthew D; Waddington, Wendy W; Dickson, John C; Prakash, Vineet; Ell, Peter J; Bomanji, Jamshed B

    2013-11-01

    A three-dimensional model-based resolution recovery (RR) reconstruction algorithm that compensates for collimator-detector response, resulting in an improvement in reconstructed spatial resolution and signal-to-noise ratio of single-photon emission computed tomography (SPECT) images, was tested. The software is said to retain image quality even with reduced acquisition time. Clinically, any improvement in patient throughput without loss of quality is to be welcomed. Furthermore, future restrictions in radiotracer supplies may add value to this type of data analysis. The aims of this study were to assess improvement in image quality using the software and to evaluate the potential of performing reduced time acquisitions for bone and parathyroid SPECT applications. Data acquisition was performed using the local standard SPECT/CT protocols for 99mTc-hydroxymethylene diphosphonate bone and 99mTc-methoxyisobutylisonitrile parathyroid SPECT imaging. The principal modification applied was the acquisition of an eight-frame gated data set acquired using an ECG simulator with a fixed signal as the trigger. This had the effect of partitioning the data such that the effect of reduced time acquisitions could be assessed without conferring additional scanning time on the patient. The set of summed data sets was then independently reconstructed using the RR software to permit a blinded assessment of the effect of acquired counts upon reconstructed image quality as adjudged by three experienced observers. Data sets reconstructed with the RR software were compared with the local standard processing protocols; filtered back-projection and ordered-subset expectation-maximization. Thirty SPECT studies were assessed (20 bone and 10 parathyroid). The images reconstructed with the RR algorithm showed improved image quality for both full-time and half-time acquisitions over local current processing protocols (P<0.05). The RR algorithm improved image quality compared with local processing protocols and has been introduced into routine clinical use. SPECT acquisitions are now acquired at half of the time previously required. The method of binning the data can be applied to any other camera system to evaluate the reduction in acquisition time for similar processes. The potential for dose reduction is also inherent with this approach.

  8. [Physical Activity in the Context of Workplace Health Promotion: A Systematic Review on the Effectiveness of Software-Based in Contrast to Personal-Based Interventions].

    PubMed

    Rudolph, Sabrina; Göring, Arne; Padrok, Dennis

    2018-01-03

    Sports and physical activity interventions are attracting considerable attention in the context of workplace health promotion. Due to increasing digitalization, software-based interventions that promote physical activity in particular are gaining acceptance in practice. Empirical evidence concerning the efficiency of software-based interventions in the context of workplace health promotion is still rather limited. This paper examines in what way software-based interventions are more efficient than personal-based interventions in terms of increasing the level of physical activity. A systematic review according to the specifications of the Cochrane Collaboration was conducted. Inclusion criteria and should-have criteria were defined, and by means of the should-have criteria the quality score of the studies was calculated. The software-based and personal-based interventions are presented in two tables with the categories author, year, country, sample group, aim of the intervention, methods, outcome and study quality. A total of 25 studies are included in the evaluation (12 personal- and 13 software-based interventions). The quality scores of the studies are heterogeneous and range from 3 to 9 points. Five personal- and five software-based studies achieved an increase in physical activity. Other positive effects on health were reported in the studies, for example a reduction in blood pressure or body mass index. A few studies did not show any improvement in health-related parameters. This paper demonstrates that positive effects can be achieved with both intervention types. Software-based interventions show advantages due to the use of new technologies: desktop or mobile applications facilitate organization, communication and data acquisition with fewer resources needed. A trained instructor, on the other hand, is able to react to the specific and varying needs of the employees, which should be considered very significant. © Georg Thieme Verlag KG Stuttgart · New York.

  9. Real-Time Event Detection for Monitoring Natural and Source ...

    EPA Pesticide Factsheets

    The use of event detection systems in finished drinking water systems is increasing in order to monitor water quality in both operational and security contexts. Recent incidents involving harmful algal blooms and chemical spills into watersheds have increased interest in monitoring source water quality prior to treatment. This work highlights the use of the CANARY event detection software in detecting suspected illicit events in an actively monitored watershed in South Carolina. CANARY is an open source event detection software that was developed by USEPA and Sandia National Laboratories. The software works with any type of sensor, utilizes multiple detection algorithms and approaches, and can incorporate operational information as needed. Monitoring has been underway for several years to detect events related to intentional or unintentional dumping of materials into the monitored watershed. This work evaluates the feasibility of using CANARY to enhance the detection of events in this watershed. This presentation will describe the real-time monitoring approach used in this watershed, the selection of CANARY configuration parameters that optimize detection for this watershed and monitoring application, and the performance of CANARY during the time frame analyzed. Further, this work will highlight how rainfall events impacted analysis, and the innovative application of CANARY taken in order to effectively detect the suspected illicit events. This presentation d
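
    For orientation only, the sketch below shows a generic window-based anomaly flagging scheme on a single water-quality signal; it is not CANARY's algorithms or configuration, and the signal, window length, and threshold are invented:

```python
# Generic illustration of window-based event detection on a single water-quality
# signal (e.g., conductivity). This is NOT CANARY's algorithm or configuration;
# it only shows the flag-when-residual-exceeds-threshold idea in a few lines.
import numpy as np

def detect_events(signal, window=60, n_sigmas=4.0):
    """Flag samples that deviate from the trailing-window mean by > n_sigmas std devs."""
    flags = np.zeros(signal.size, dtype=bool)
    for i in range(window, signal.size):
        hist = signal[i - window:i]
        mu, sigma = hist.mean(), hist.std() + 1e-9
        flags[i] = abs(signal[i] - mu) > n_sigmas * sigma
    return flags

rng = np.random.default_rng(1)
cond = 250 + rng.normal(0, 2, 2000)     # baseline conductivity [uS/cm]
cond[1200:1260] += 40                   # injected anomaly (e.g., a spill signature)
print("flagged samples:", np.flatnonzero(detect_events(cond))[:5], "...")
```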

  10. Quality Attributes for Mission Flight Software: A Reference for Architects

    NASA Technical Reports Server (NTRS)

    Wilmot, Jonathan; Fesq, Lorraine; Dvorak, Dan

    2016-01-01

    In the international standards for architecture descriptions in systems and software engineering (ISO/IEC/IEEE 42010), "concern" is a primary concept that often manifests itself in relation to the quality attributes or "ilities" that a system is expected to exhibit - qualities such as reliability, security and modifiability. One of the main uses of an architecture description is to serve as a basis for analyzing how well the architecture achieves its quality attributes, and that requires architects to be as precise as possible about what they mean in claiming, for example, that an architecture supports "modifiability." This paper describes a table, generated by NASA's Software Architecture Review Board, which lists fourteen key quality attributes, identifies different important aspects of each quality attribute and considers each aspect in terms of requirements, rationale, evidence, and tactics to achieve the aspect. This quality attribute table is intended to serve as a guide to software architects, software developers, and software architecture reviewers in the domain of mission-critical real-time embedded systems, such as space mission flight software.

  11. A research review of quality assessment for software

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored and methods of evaluating those factors are discussed. Quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because the documentation of many factors about a software component, such as its efficiency, portability, and development history, constitutes a class of factors that are important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures which have been shown to indicate software quality. However, it is believed that there exist many factors that indicate quality but have not been empirically validated due to their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.

  12. Obtaining Valid Safety Data for Software Safety Measurement and Process Improvement

    NASA Technical Reports Server (NTRS)

    Basili, Victor r.; Zelkowitz, Marvin V.; Layman, Lucas; Dangle, Kathleen; Diep, Madeline

    2010-01-01

    We report on a preliminary case study to examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Our goal is to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. Our purpose was two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to identify potential risks due to incorrect application of the safety process, deficiencies in the safety process, or the lack of a defined process. One early outcome of this work was to show that there are structural deficiencies in collecting valid safety data that make software safety different from hardware safety. In our conclusions we present some of these deficiencies.

  13. HeatmapGenerator: high performance RNAseq and microarray visualization software suite to examine differential gene expression levels using an R and C++ hybrid computational pipeline.

    PubMed

    Khomtchouk, Bohdan B; Van Booven, Derek J; Wahlestedt, Claes

    2014-01-01

    The graphical visualization of gene expression data using heatmaps has become an integral component of modern-day medical research. Heatmaps are used extensively to plot quantitative differences in gene expression levels, such as those measured with RNAseq and microarray experiments, to provide qualitative large-scale views of the transcriptomic landscape. Creating high-quality heatmaps is a computationally intensive task, often requiring considerable programming experience, particularly for customizing features to a specific dataset at hand. Software to create publication-quality heatmaps is developed with the R programming language, C++ programming language, and OpenGL application programming interface (API) to create industry-grade high performance graphics. We create a graphical user interface (GUI) software package called HeatmapGenerator for Windows OS and Mac OS X as an intuitive, user-friendly alternative for researchers with minimal prior coding experience to allow them to create publication-quality heatmaps using R graphics without sacrificing their desired level of customization. The simplicity of HeatmapGenerator is that it only requires the user to upload a preformatted input file and download the publicly available R software language, among a few other operating system-specific requirements. Advanced features such as color, text labels, scaling, legend construction, and even database storage can be easily customized with no prior programming knowledge. We provide an intuitive and user-friendly software package, HeatmapGenerator, to create high-quality, customizable heatmaps generated using the high-resolution color graphics capabilities of R. The software is available for Microsoft Windows and Apple Mac OS X. HeatmapGenerator is released under the GNU General Public License and publicly available at: http://sourceforge.net/projects/heatmapgenerator/. The Mac OS X direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_MAC_OSX.tar.gz/download. The Windows OS direct download is available at: http://sourceforge.net/projects/heatmapgenerator/files/HeatmapGenerator_WINDOWS.zip/download.
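
    HeatmapGenerator itself is an R/C++/OpenGL GUI application; purely to illustrate the underlying task of rendering an expression matrix as a publication-resolution heatmap, a minimal Python sketch (with an invented matrix and labels) could look like this:

```python
# Minimal illustration of the underlying task (plotting an expression matrix as a
# heatmap). This is not HeatmapGenerator itself; the matrix and labels are made up.
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(2)
expr = rng.normal(0, 1, size=(20, 6))        # 20 genes x 6 samples (log2 fold change)
genes = [f"gene_{i}" for i in range(20)]
samples = [f"sample_{j}" for j in range(6)]

fig, ax = plt.subplots(figsize=(4, 6))
im = ax.imshow(expr, cmap="RdBu_r", aspect="auto")
ax.set_xticks(range(len(samples)))
ax.set_xticklabels(samples, rotation=45, ha="right")
ax.set_yticks(range(len(genes)))
ax.set_yticklabels(genes, fontsize=6)
fig.colorbar(im, label="log2 fold change")
fig.tight_layout()
fig.savefig("heatmap.png", dpi=300)          # publication-resolution output
```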

  14. RNA-SeQC: RNA-seq metrics for quality control and process optimization.

    PubMed

    DeLuca, David S; Levin, Joshua Z; Sivachenko, Andrey; Fennell, Timothy; Nazaire, Marc-Danie; Williams, Chris; Reich, Michael; Winckler, Wendy; Getz, Gad

    2012-06-01

    RNA-seq, the application of next-generation sequencing to RNA, provides transcriptome-wide characterization of cellular activity. Assessment of sequencing performance and library quality is critical to the interpretation of RNA-seq data, yet few tools exist to address this issue. We introduce RNA-SeQC, a program which provides key measures of data quality. These metrics include yield, alignment and duplication rates; GC bias, rRNA content, regions of alignment (exon, intron and intragenic), continuity of coverage, 3'/5' bias and count of detectable transcripts, among others. The software provides multi-sample evaluation of library construction protocols, input materials and other experimental parameters. The modularity of the software enables pipeline integration and the routine monitoring of key measures of data quality such as the number of alignable reads, duplication rates and rRNA contamination. RNA-SeQC allows investigators to make informed decisions about sample inclusion in downstream analysis. In summary, RNA-SeQC provides quality control measures critical to experiment design, process optimization and downstream computational analysis. See www.genepattern.org to run online, or www.broadinstitute.org/rna-seqc/ for a command line tool.
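
    RNA-SeQC operates on aligned sequencing data; to make two of the listed metrics concrete, the toy sketch below computes a duplication rate and GC content from plain read strings (the reads are invented, and this is not RNA-SeQC code):

```python
# Toy sketch of two of the metrics listed above (duplication rate and GC content),
# computed from plain read sequences, only to make the metric definitions concrete.
from collections import Counter

def duplication_rate(reads):
    """Fraction of reads that are exact duplicates of an earlier read."""
    counts = Counter(reads)
    duplicates = sum(c - 1 for c in counts.values())
    return duplicates / len(reads)

def gc_content(reads):
    """Fraction of G/C bases over all reads."""
    bases = "".join(reads)
    return (bases.count("G") + bases.count("C")) / len(bases)

reads = ["ACGTACGT", "ACGTACGT", "GGGCCCAT", "ATATATAT"]
print(f"duplication rate = {duplication_rate(reads):.2f}, GC = {gc_content(reads):.2f}")
```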

  15. Recent experience with the CQE{trademark}

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harrison, C.D.; Kehoe, D.B.; O'Connor, D.C.

    1997-12-31

    CQE (the Coal Quality Expert) is a software tool that brings a new level of sophistication to fuel decisions by seamlessly integrating the system-wide effects of fuel purchase decisions on power plant performance, emissions, and power generation costs. The CQE technology, which addresses fuel quality from the coal mine to the busbar and the stack, is an integration and improvement of predecessor software tools including: EPRI's Coal Quality Information System, EPRI's Coal Cleaning Cost Model, EPRI's Coal Quality Impact Model, and EPRI and DOE models to predict slagging and fouling. CQE can be used as a stand-alone workstation or as a network application for utilities, coal producers, and equipment manufacturers to perform detailed analyses of the impacts of coal quality, capital improvements, operational changes, and/or environmental compliance alternatives on power plant emissions, performance and production costs. It can be used as a comprehensive, precise and organized methodology for systematically evaluating all such impacts or it may be used in pieces with some default data to perform more strategic or comparative studies.

  16. Designing Tracking Software for Image-Guided Surgery Applications: IGSTK Experience

    PubMed Central

    Enquobahrie, Andinet; Gobbi, David; Turek, Matt; Cheng, Patrick; Yaniv, Ziv; Lindseth, Frank; Cleary, Kevin

    2009-01-01

    Objective Many image-guided surgery applications require tracking devices as part of their core functionality. The Image-Guided Surgery Toolkit (IGSTK) was designed and developed to interface tracking devices with software applications incorporating medical images. Methods IGSTK was designed as an open source C++ library that provides the basic components needed for fast prototyping and development of image-guided surgery applications. This library follows a component-based architecture with several components designed for specific sets of image-guided surgery functions. At the core of the toolkit is the tracker component that handles communication between a control computer and navigation device to gather pose measurements of surgical instruments present in the surgical scene. The representations of the tracked instruments are superimposed on anatomical images to provide visual feedback to the clinician during surgical procedures. Results The initial version of the IGSTK toolkit has been released in the public domain and several trackers are supported. The toolkit and related information are available at www.igstk.org. Conclusion With the increased popularity of minimally invasive procedures in health care, several tracking devices have been developed for medical applications. Designing and implementing high-quality and safe software to handle these different types of trackers in a common framework is a challenging task. It requires establishing key software design principles that emphasize abstraction, extensibility, reusability, fault-tolerance, and portability. IGSTK is an open source library that satisfies these needs for the image-guided surgery community. PMID:20037671

  17. A Structure for Creating Quality Software.

    ERIC Educational Resources Information Center

    Christensen, Larry C.; Bodey, Michael R.

    1990-01-01

    Addresses the issue of assuring quality software for use in computer-aided instruction and presents a structure by which developers can create quality courseware. Differences between courseware and computer-aided instruction software are discussed, methods for testing software are described, and human factors issues as well as instructional design…

  18. Software Formal Inspections Guidebook

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Software Formal Inspections Guidebook is designed to support the inspection process of software developed by and for NASA. This document provides information on how to implement a recommended and proven method for conducting formal inspections of NASA software. This Guidebook is a companion document to NASA Standard 2202-93, Software Formal Inspections Standard, approved April 1993, which provides the rules, procedures, and specific requirements for conducting software formal inspections. Application of the Formal Inspections Standard is optional to NASA program or project management. In cases where program or project management decide to use the formal inspections method, this Guidebook provides additional information on how to establish and implement the process. The goal of the formal inspections process as documented in the above-mentioned Standard and this Guidebook is to provide a framework and model for an inspection process that will enable the detection and elimination of defects as early as possible in the software life cycle. An ancillary aspect of the formal inspection process incorporates the collection and analysis of inspection data to effect continual improvement in the inspection process and the quality of the software subjected to the process.

  19. Boutiques: a flexible framework to integrate command-line applications in computing platforms.

    PubMed

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-05-01

    We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science.

  20. DPOI: Distributed software system development platform for ocean information service

    NASA Astrophysics Data System (ADS)

    Guo, Zhongwen; Hu, Keyong; Jiang, Yongguo; Sun, Zhaosui

    2015-02-01

    Ocean information management is of great importance as it has been employed in many areas of ocean science and technology. However, the developments of Ocean Information Systems (OISs) often suffer from low efficiency because of repetitive work and continuous modifications caused by dynamic requirements. In this paper, the basic requirements of OISs are analyzed first, and then a novel platform DPOI is proposed to improve development efficiency and enhance software quality of OISs by providing off-the-shelf resources. In the platform, the OIS is decomposed hierarchically into a set of modules, which can be reused in different system developments. These modules include the acquisition middleware and data loader that collect data from instruments and files respectively, the database that stores data consistently, the components that support fast application generation, the web services that make the data from distributed sources syntactical by use of predefined schemas and the configuration toolkit that enables software customization. With the assistance of the development platform, the software development needs no programming and the development procedure is thus accelerated greatly. We have applied the development platform in practical developments and evaluated its efficiency in several development practices and different development approaches. The results show that DPOI significantly improves development efficiency and software quality.

  1. HDBStat!: a platform-independent software suite for statistical analysis of high dimensional biology data.

    PubMed

    Trivedi, Prinal; Edwards, Jode W; Wang, Jelai; Gadbury, Gary L; Srinivasasainagendra, Vinodh; Zakharkin, Stanislav O; Kim, Kyoungmi; Mehta, Tapan; Brand, Jacob P L; Patki, Amit; Page, Grier P; Allison, David B

    2005-04-06

    Many efforts in microarray data analysis are focused on providing tools and methods for the qualitative analysis of microarray data. HDBStat! (High-Dimensional Biology-Statistics) is a software package designed for analysis of high dimensional biology data such as microarray data. It was initially developed for the analysis of microarray gene expression data, but it can also be used for some applications in proteomics and other aspects of genomics. HDBStat! provides statisticians and biologists a flexible and easy-to-use interface to analyze complex microarray data using a variety of methods for data preprocessing, quality control analysis and hypothesis testing. Results generated from data preprocessing methods, quality control analysis and hypothesis testing methods are output in the form of Excel CSV tables, graphs and an Html report summarizing data analysis. HDBStat! is a platform-independent software that is freely available to academic institutions and non-profit organizations. It can be downloaded from our website http://www.soph.uab.edu/ssg_content.asp?id=1164.

  2. Micro sensor node for air pollutant monitoring: hardware and software issues.

    PubMed

    Choi, Sukwon; Kim, Nakyoung; Cha, Hojung; Ha, Rhan

    2009-01-01

    Wireless sensor networks equipped with various gas sensors have been actively used for air quality monitoring. Previous studies have typically explored system issues that include middleware or networking performance, but most research has barely considered the details of the hardware and software of the sensor node itself. In this paper, we focus on the design and implementation of a sensor board for air pollutant monitoring applications. Several hardware and software issues are discussed to explore the possibilities of a practical WSN-based air pollution monitoring system. Through extensive experiments and evaluation, we have determined the various characteristics of the gas sensors and their practical implications for air pollutant monitoring systems.

  3. A method to establish seismic noise baselines for automated station assessment

    USGS Publications Warehouse

    McNamara, D.E.; Hutt, C.R.; Gee, L.S.; Benz, H.M.; Buland, R.P.

    2009-01-01

    We present a method for quantifying station noise baselines and characterizing the spectral shape of out-of-nominal noise sources. Our intent is to automate this method in order to ensure that only the highest-quality data are used in rapid earthquake products at NEIC. In addition, the station noise baselines provide a valuable tool to support the quality control of GSN and ANSS backbone data and metadata. The procedures addressed here are currently in development at the NEIC, and work is underway to understand how quickly changes from nominal can be observed and used within the NEIC processing framework. The spectral methods and software used to compute station baselines and described herein (PQLX) can be useful to both permanent and portable seismic station operators. Applications include: general seismic station and data quality control (QC), evaluation of instrument responses, assessment of near real-time communication system performance, characterization of site cultural noise conditions, and evaluation of sensor vault design, as well as assessment of gross network capabilities (McNamara et al. 2005). Future PQLX development plans include incorporating station baselines for automated QC methods and automating station status report generation and notification based on user-defined QC parameters. The PQLX software is available through the USGS (http://earthquake.usgs.gov/research/software/pqlx.php) and IRIS (http://www.iris.edu/software/pqlx/).
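
    The sketch below illustrates the baseline idea only (per-window power spectral densities summarized by percentiles); it is not the PQLX implementation, and the synthetic record, window length, and percentiles are placeholders:

```python
# Sketch of the baseline idea (not the PQLX implementation): compute a power
# spectral density for many short windows of a station's record and summarize
# them with percentiles, so later windows can be compared against the baseline.
import numpy as np
from scipy.signal import welch

FS = 40.0                                        # sample rate [Hz]
rng = np.random.default_rng(3)
record = rng.normal(0, 1, int(FS * 3600))        # one hour of synthetic noise

win = int(FS * 60)                               # 60-second analysis windows
psds = []
for start in range(0, record.size - win, win):
    f, pxx = welch(record[start:start + win], fs=FS, nperseg=1024)
    psds.append(10 * np.log10(pxx))              # dB, as in PSD/PDF noise studies
psds = np.array(psds)

baseline_low, baseline_med, baseline_high = np.percentile(psds, [10, 50, 90], axis=0)
print("median baseline at ~1 Hz:", baseline_med[np.argmin(np.abs(f - 1.0))], "dB")
```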

  4. The development and application of composite complexity models and a relative complexity metric in a software maintenance environment

    NASA Technical Reports Server (NTRS)

    Hops, J. M.; Sherif, J. S.

    1994-01-01

    A great deal of effort is now being devoted to the study, analysis, prediction, and minimization of expected software maintenance cost, long before software is delivered to users or customers. It has been estimated that, on average, the effort spent on software maintenance is as costly as the effort spent on all other software activities. Software design methods should be the starting point to aid in alleviating the problems of software maintenance complexity and high costs. Two aspects of maintenance deserve attention: (1) protocols for locating and rectifying defects, and for ensuring that no new defects are introduced in the development phase of the software process; and (2) protocols for modification, enhancement, and upgrading. This article focuses primarily on the second aspect, the development of protocols to help increase the quality and reduce the costs associated with modifications, enhancements, and upgrades of existing software. This study developed parsimonious models and a relative complexity metric for complexity measurement of software that were used to rank the modules in the system relative to one another. Some success was achieved in using the models and the relative metric to identify maintenance-prone modules.
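
    As a hedged illustration of the general idea of a relative complexity metric (the article's composite models are not reproduced; the module data, metrics, and weights below are invented), per-module measurements can be standardized and combined into a single ranking score:

```python
# Illustrative sketch of a relative complexity metric: standardize several
# per-module measurements and combine them into one score used to rank modules.
# The weights, metrics, and module data are placeholders, not the article's models.
import numpy as np

modules = ["telemetry.c", "scheduler.c", "parser.c", "ui.c"]
# columns: lines of code, cyclomatic complexity, fan-out
metrics = np.array([
    [1200, 45, 12],
    [ 300, 10,  3],
    [ 800, 60,  7],
    [ 500, 15, 20],
], dtype=float)

z = (metrics - metrics.mean(axis=0)) / metrics.std(axis=0)   # standardize each metric
weights = np.array([0.4, 0.4, 0.2])                          # illustrative weighting
relative_complexity = z @ weights

for name, score in sorted(zip(modules, relative_complexity), key=lambda p: -p[1]):
    print(f"{name:14s} relative complexity = {score:+.2f}")
```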

  5. Testing, Requirements, and Metrics

    NASA Technical Reports Server (NTRS)

    Rosenberg, Linda; Hyatt, Larry; Hammer, Theodore F.; Huffman, Lenore; Wilson, William

    1998-01-01

    The criticality of correct, complete, testable requirements is a fundamental tenet of software engineering. Also critical is complete requirements based testing of the final product. Modern tools for managing requirements allow new metrics to be used in support of both of these critical processes. Using these tools, potential problems with the quality of the requirements and the test plan can be identified early in the life cycle. Some of these quality factors include: ambiguous or incomplete requirements, poorly designed requirements databases, excessive or insufficient test cases, and incomplete linkage of tests to requirements. This paper discusses how metrics can be used to evaluate the quality of the requirements and test to avoid problems later. Requirements management and requirements based testing have always been critical in the implementation of high quality software systems. Recently, automated tools have become available to support requirements management. At NASA's Goddard Space Flight Center (GSFC), automated requirements management tools are being used on several large projects. The use of these tools opens the door to innovative uses of metrics in characterizing test plan quality and assessing overall testing risks. In support of these projects, the Software Assurance Technology Center (SATC) is working to develop and apply a metrics program that utilizes the information now available through the application of requirements management tools. Metrics based on this information provides real-time insight into the testing of requirements and these metrics assist the Project Quality Office in its testing oversight role. This paper discusses three facets of the SATC's efforts to evaluate the quality of the requirements and test plan early in the life cycle, thus preventing costly errors and time delays later.
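
    A small sketch of the kind of requirements-to-test linkage metric described here (with invented project data, not SATC's tooling) follows; it reports untested requirements and tests traced to no requirement:

```python
# Sketch of a requirements/test linkage check: given a mapping of test cases to
# the requirements they exercise, report uncovered requirements and unlinked tests.
# The project data below are invented for illustration.
requirements = {"REQ-1", "REQ-2", "REQ-3", "REQ-4"}
test_links = {
    "TC-01": {"REQ-1", "REQ-2"},
    "TC-02": {"REQ-2"},
    "TC-03": set(),              # test not traced to any requirement
}

covered = set().union(*test_links.values())
untested = requirements - covered
unlinked_tests = [t for t, reqs in test_links.items() if not reqs]

print(f"requirements coverage: {len(covered & requirements)}/{len(requirements)}")
print("untested requirements:", sorted(untested))
print("tests with no linked requirement:", unlinked_tests)
```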

  6. The State of Software for Evolutionary Biology

    PubMed Central

    Darriba, Diego; Flouri, Tomáš; Stamatakis, Alexandros

    2018-01-01

    With Next Generation Sequencing data being routinely used, evolutionary biology is transforming into a computational science. Thus, researchers have to rely on a growing number of increasingly complex software tools. All widely used core tools in the field have grown considerably, in terms of the number of features as well as lines of code, and consequently also with respect to software complexity. A topic that has received little attention is the software engineering quality of widely used core analysis tools. Software developers appear to rarely assess the quality of their code, and this can have potential negative consequences for end-users. To this end, we assessed the code quality of 16 highly cited and compute-intensive tools mainly written in C/C++ (e.g., MrBayes, MAFFT, SweepFinder, etc.) and JAVA (BEAST) from the broader area of evolutionary biology that are being routinely used in current data analysis pipelines. Because the software engineering quality of the tools we analyzed is rather unsatisfactory, we provide a list of best practices for improving the quality of existing tools and list techniques that can be deployed for developing reliable, high quality scientific software from scratch. Finally, we also discuss journal as well as science policy and, more importantly, funding issues that need to be addressed for improving software engineering quality as well as ensuring support for developing new and maintaining existing software. Our intention is to raise the awareness of the community regarding software engineering quality issues and to emphasize the substantial lack of funding for scientific software development. PMID:29385525

  7. Usability evaluation of mobile applications using ISO 9241 and ISO 25062 standards.

    PubMed

    Moumane, Karima; Idri, Ali; Abran, Alain

    2016-01-01

    This paper presents an empirical study based on a set of measures to evaluate the usability of mobile applications running on different mobile operating systems, including Android, iOS and Symbian. The aim is to evaluate empirically a framework that we have developed on the use of the Software Quality Standard ISO 9126 in mobile environments, especially the usability characteristic. To do that, 32 users participated in the experiment, and we used the ISO 25062 and ISO 9241 standards for objective measures by working with two widely used mobile applications: Google Apps and Google Maps. The QUIS 7.0 questionnaire has been used to collect measures assessing the users' level of satisfaction when using these two mobile applications. By analyzing the results, we highlighted a set of mobile usability issues that are related to the hardware as well as to the software and that need to be taken into account by designers and developers in order to improve the usability of mobile applications.
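
    To make the objective measures concrete, the sketch below computes effectiveness, efficiency, and satisfaction in one common operationalization consistent with ISO 9241-11; the task observations and the specific efficiency formula chosen here are illustrative, not the study's actual analysis:

```python
# Illustrative computation of the three usability measures (effectiveness,
# efficiency, satisfaction) from task observations; the data are invented.
tasks = [
    # (completed, time_seconds, satisfaction_score_1_to_9)
    (True,  42.0, 7),
    (True,  55.0, 8),
    (False, 90.0, 3),
    (True,  38.0, 6),
]

effectiveness = sum(done for done, _, _ in tasks) / len(tasks)    # task completion rate
efficiency = sum(done / t for done, t, _ in tasks) / len(tasks)   # time-based efficiency [goals/s]
satisfaction = sum(s for _, _, s in tasks) / len(tasks)           # mean questionnaire score

print(f"effectiveness = {effectiveness:.2f}")
print(f"time-based efficiency = {efficiency:.4f} goals/s")
print(f"mean satisfaction = {satisfaction:.1f} / 9")
```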

  8. [Assessment of a software application tool for managing nursing care processes in the period 2005-2010].

    PubMed

    Medina-Valverde, M José; Rodríguez-Borrego, M Aurora; Luque-Alcaraz, Olga; de la Torre-Barbero, M José; Parra-Perea, Julia; Moros-Molina, M del Pilar

    2012-01-01

    The aim was to identify problems and critical points in the software application, through an assessment of the implementation of the software tool "Azahar" used to manage nursing care processes. The monitored population consisted of nurses who used the tool at the hospital and nurses who benefited from it in primary care. Each group was selected randomly and its size was determined by data saturation. A qualitative approach was employed, using in-depth interviews and group discussion as data collection techniques. The nurses considered that the most beneficial and useful applications of the tool were the initial assessment and the continuity-of-care release forms, as well as the recording of all data on the nursing process to ensure quality. The disadvantages and weaknesses identified were associated with the continuous variability of their daily care. The nurses associated an increased workload with being unable to enter records into the computer, which forced them to keep paper records and thus duplicate the recording process. Likewise, they considered that the operation of the software should be improved in terms of simplicity and functionality. The simplicity of the tool and the adjustment of workloads would favour its use and, as a result, continuity of care. Copyright © 2010 Elsevier España, S.L. All rights reserved.

  9. Telemedicine using free voice over internet protocol (VoIP) technology.

    PubMed

    Miller, David J; Miljkovic, Nikola; Chiesa, Chad; Callahan, John B; Webb, Brad; Boedeker, Ben H

    2011-01-01

    Though dedicated videoteleconference (VTC) systems deliver high quality, low-latency audio and video for telemedical applications, they require expensive hardware and extensive infrastructure. The purpose of this study was to investigate free commercially available Voice over Internet Protocol (VoIP) software as a low cost alternative for telemedicine.

  10. USER MANUAL FOR THE EPA THIRD-GENERATION AIR QUALITY MODELING SYSTEM (MODELS-3 VERSION 3.0)

    EPA Science Inventory

    Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...

  11. SYSTEM INSTALLATION AND OPERATION MANUAL FOR THE EPA THIRD-GENERATION AIR QUALITY MODELING SYSTEM (MODELS-3) VERSION 3.0

    EPA Science Inventory

    Models-3 is a flexible software system designed to simplify the development and use of environmental assessment and other decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of atmospheri...

  12. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    DOE PAGES

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  13. Parallel Computing and Model Evaluation for Environmental Systems: An Overview of the Supermuse and Frames Software Technologies

    EPA Science Inventory

    ERD’s Supercomputer for Model Uncertainty and Sensitivity Evaluation (SuperMUSE) is a key to enhancing quality assurance in environmental models and applications. Uncertainty analysis and sensitivity analysis remain critical, though often overlooked steps in the development and e...

  14. MODELS-3 INSTALLATION PROCEDURES FOR A PC WITH AN NT OPERATING SYSTEM (MODELS-3 VERSION 4.0)

    EPA Science Inventory

    Models-3 is a flexible software system designed to simplify the development and use of air quality models and other environmental decision support tools. It is designed for applications ranging from regulatory and policy analysis to understanding the complex interactions of at...

  15. Predictive Modeling of Microbial Indicators for Timely Beach Notifications and Advisories at Marine Beaches

    EPA Science Inventory

    Marine beaches are occasionally contaminated by unacceptably high levels of fecal indicator bacteria (FIB) that exceed EPA water quality criteria. Here we describe application of a recent version of the software package Virtual Beach tool (VB 3.0.6) to build and evaluate multiple...

  16. The Simulation of a Major Automated Information System (AIS) on a Microcomputer

    DTIC Science & Technology

    1984-03-01

    The AIS would be selected from a public sector application. The rationale for this choice is: a. Many public sector organizations have a far richer...19. Boehm, Barry W. and others, Characteristics of Software Quality, North-Holland, 1978. 20. Freeman, Peter and Wasserman, Anthony I., Tutorial on

  17. Engineering Quality Software: 10 Recommendations for Improved Software Quality Management

    DTIC Science & Technology

    2010-04-27

    lack of user involvement • Inadequate Software Process Management & Control By Contractors • No “Team” of Vendors and users; little SME participation...1990 Quality Perspectives • Process Quality (CMMI) • Product Quality (ISO/IEC 2500x) – Internal Quality Attributes – External Quality Attributes... CMMI/ISO 9000 Assessments – Capture organizational knowledge • Identify best practices, lessons learned Know where you are, and where you need to be

  18. Software Dependability and Safety Evaluations ESA's Initiative

    NASA Astrophysics Data System (ADS)

    Hernek, M.

    ESA has allocated funds for an initiative to evaluate Dependability and Safety methods of Software. The objectives of this initiative are: · More extensive validation of Safety and Dependability techniques for Software · Provide valuable results to improve the quality of the Software, thus promoting the application of Dependability and Safety methods and techniques. ESA space systems are being developed according to defined PA requirement specifications. These requirements may be implemented through various design concepts, e.g. redundancy, diversity etc., varying from project to project. Analysis methods (FMECA, FTA, HA, etc.) are frequently used during requirements analysis and design activities to assure the correct implementation of system PA requirements. The criticality level of failures, functions and systems is determined, and by doing that the critical sub-systems are identified, on which dependability and safety techniques are to be applied during development. Proper performance of the software development requires the development of a technical specification for the products at the beginning of the life cycle. Such a technical specification comprises both functional and non-functional requirements. These non-functional requirements address characteristics of the product such as quality, dependability, safety and maintainability. Software in space systems is increasingly used in critical functions. Also, the trend towards more frequent use of COTS and reusable components poses new difficulties in terms of assuring reliable and safe systems. Because of this, software dependability and safety must be carefully analysed. ESA identified and documented techniques, methods and procedures to ensure that software dependability and safety requirements are specified and taken into account during the design and development of a software system and to verify/validate that the implemented software systems comply with these requirements [R1].

  19. Getting started on metrics - Jet Propulsion Laboratory productivity and quality

    NASA Technical Reports Server (NTRS)

    Bush, M. W.

    1990-01-01

    A review is presented to describe the effort and difficulties of reconstructing fifteen years of JPL software history. In 1987 the collection and analysis of project data were started with the objective of creating laboratory-wide measures of quality and productivity for software development. As a result of this two-year Software Product Assurance metrics study, a rough measurement foundation for software productivity and software quality, and an order-of-magnitude quantitative baseline for software systems and subsystems are now available.

  20. Software designs of image processing tasks with incremental refinement of computation.

    PubMed

    Anastasia, Davide; Andreopoulos, Yiannis

    2010-08-01

    Software realizations of computationally-demanding image processing tasks (e.g., image transforms and convolution) do not currently provide graceful degradation when their clock-cycles budgets are reduced, e.g., when delay deadlines are imposed in a multitasking environment to meet throughput requirements. This is an important obstacle in the quest for full utilization of modern programmable platforms' capabilities since worst-case considerations must be in place for reasonable quality of results. In this paper, we propose (and make available online) platform-independent software designs performing bitplane-based computation combined with an incremental packing framework in order to realize block transforms, 2-D convolution and frame-by-frame block matching. The proposed framework realizes incremental computation: progressive processing of input-source increments improves the output quality monotonically. Comparisons with the equivalent nonincremental software realization of each algorithm reveal that, for the same precision of the result, the proposed approach can lead to comparable or faster execution, while it can be arbitrarily terminated and provide the result up to the computed precision. Application examples with region-of-interest based incremental computation, task scheduling per frame, and energy-distortion scalability verify that our proposal provides significant performance scalability with graceful degradation.
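
    The following sketch conveys the incremental-refinement idea for a linear operation (processing bitplanes from most to least significant so that an early stop still yields a usable approximation); it is not the authors' optimized designs, and the image, kernel, and sizes are placeholders:

```python
# Sketch of incremental refinement for a linear operation: process the input
# bitplane by bitplane (most significant first) and accumulate partial results,
# so stopping early still yields a usable approximation of the full result.
import numpy as np
from scipy.signal import convolve2d

rng = np.random.default_rng(4)
image = rng.integers(0, 256, size=(64, 64)).astype(np.uint8)
kernel = np.ones((3, 3)) / 9.0                      # simple averaging filter

exact = convolve2d(image.astype(float), kernel, mode="same")
partial = np.zeros_like(exact)

for bit in range(7, -1, -1):                        # MSB first
    plane = ((image >> bit) & 1).astype(float) * (1 << bit)
    partial += convolve2d(plane, kernel, mode="same")   # linearity lets planes add up
    err = np.mean(np.abs(partial - exact))
    print(f"after bitplane {bit}: mean abs error = {err:6.3f}")
```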

  1. Water quality modeling in the systems impact assessment model for the Klamath River basin - Keno, Oregon to Seiad Valley, California

    USGS Publications Warehouse

    Hanna, R. Blair; Campbell, Sharon G.

    2000-01-01

    This report describes the water quality model developed for the Klamath River System Impact Assessment Model (SIAM). The Klamath River SIAM is a decision support system developed by the authors and other US Geological Survey (USGS), Midcontinent Ecological Science Center staff to study the effects of basin-wide water management decisions on anadromous fish in the Klamath River. The Army Corps of Engineers' HEC5Q water quality modeling software was used to simulate water temperature, dissolved oxygen and conductivity in 100 miles of the Klamath River Basin in Oregon and California. The water quality model simulated three reservoirs and the mainstem Klamath River influenced by the Shasta and Scott River tributaries. Model development, calibration and two validation exercises are described as well as the integration of the water quality model into the SIAM decision support system software. Within SIAM, data are exchanged between the water quantity model (MODSIM), the water quality model (HEC5Q), the salmon population model (SALMOD) and methods for evaluating ecosystem health. The overall predictive ability of the water quality model is described in the context of calibration and validation error statistics. Applications of SIAM and the water quality model are described.

  2. Software Reviews Since Acquisition Reform - The Artifact Perspective

    DTIC Science & Technology

    2004-01-01

    Risk Management OLD NEW Slide 13 Acquisition of Software Intensive Systems 2004 – Peter Hantos Single, basic software paradigm Single processor Low...software risk mitigation related trade-offs must be done together Integral Software Engineering Activities Process Maturity and Quality Frameworks Quality

  3. Software Development Projects: Estimation of Cost and Effort (A Manager’s Digest).

    DTIC Science & Technology

    1982-12-01

    it later when changes need to be made to it. Various levels of difficulty are experienced due to the skill level of the programmer, poor program...impurities that, if eliminated, reduce the level of complexity of the program. They are as follows: 1. Complementary Operations: unreduced expressions 2...greater quality than that to support standard business applications. Remus defines quality as "...the number of program defects normalized by size over

  4. Simulation Modelling in Healthcare: An Umbrella Review of Systematic Literature Reviews.

    PubMed

    Salleh, Syed; Thokala, Praveen; Brennan, Alan; Hughes, Ruby; Booth, Andrew

    2017-09-01

    Numerous studies examine simulation modelling in healthcare. These studies present a bewildering array of simulation techniques and applications, making it challenging to characterise the literature. The aim of this paper is to provide an overview of the level of activity of simulation modelling in healthcare and the key themes. We performed an umbrella review of systematic literature reviews of simulation modelling in healthcare. Searches were conducted of academic databases (JSTOR, Scopus, PubMed, IEEE, SAGE, ACM, Wiley Online Library, ScienceDirect) and grey literature sources, enhanced by citation searches. The articles were included if they performed a systematic review of simulation modelling techniques in healthcare. After quality assessment of all included articles, data were extracted on numbers of studies included in each review, types of applications, techniques used for simulation modelling, data sources and simulation software. The search strategy yielded a total of 117 potential articles. Following sifting, 37 heterogeneous reviews were included. Most reviews achieved moderate quality rating on a modified AMSTAR (A Measurement Tool used to Assess systematic Reviews) checklist. All the review articles described the types of applications used for simulation modelling; 15 reviews described techniques used for simulation modelling; three reviews described data sources used for simulation modelling; and six reviews described software used for simulation modelling. The remaining reviews either did not report or did not provide enough detail for the data to be extracted. Simulation modelling techniques have been used for a wide range of applications in healthcare, with a variety of software tools and data sources. The number of reviews published in recent years suggest an increased interest in simulation modelling in healthcare.

  5. The experience factory: Can it make you a 5? or what is its relationship to other quality and improvement concepts?

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1992-01-01

    The concepts of quality improvements have permeated many businesses. It is clear that the nineties will be the quality era for software and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. There have been a variety of organizational frameworks proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvements through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long term success through customer satisfaction based on the participation of all members of an organization; the SEI capability maturity model, a staged process improvement based upon assessment with regard to a set of key process areas until you reach level 5, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of the production on 'value added' activities and the elimination or reduction of 'non-value-added' activities.

  6. ETICS: the international software engineering service for the grid

    NASA Astrophysics Data System (ADS)

    Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.

    2008-07-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.

  7. Establishing Qualitative Software Metrics in Department of the Navy Programs

    DTIC Science & Technology

    2015-10-29

    dedicated to providing the highest quality software to its users. In doing so, there is a need for a formalized set of Software Quality Metrics. The goal...of this paper is to establish the validity of those necessary quality metrics. In our approach we collected the data of over a dozen programs...provide the necessary variable data for our formulas and tested the formulas for validity. Keywords: metrics; software; quality

  8. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework.

    PubMed

    Zheng, Qi; Grice, Elizabeth A

    2016-10-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost's algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost.
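
    The core idea of a Bayesian mapping-quality framework can be sketched in a few lines: for one read with several candidate alignments, a posterior probability is computed for each candidate and the reported quality is the Phred-scaled probability that the chosen alignment is wrong. The function below is a generic illustration of that calculation only, assuming uniform priors and user-supplied log10 likelihoods; it is not AlignerBoost's actual model.

      import math

      def mapping_quality(log10_likelihoods, prior=None):
          """Posterior-based mapping quality for one read with several candidate alignments.

          log10_likelihoods: log10 P(observed read | alignment i) for each candidate.
          Returns (index_of_best_candidate, phred_scaled_mapping_quality).
          """
          n = len(log10_likelihoods)
          prior = prior or [1.0 / n] * n                    # uniform prior over candidates
          # Posterior is proportional to prior * likelihood; work in log10 for stability.
          log_post = [math.log10(p) + ll for p, ll in zip(prior, log10_likelihoods)]
          m = max(log_post)
          weights = [10 ** (lp - m) for lp in log_post]
          total = sum(weights)
          best = max(range(n), key=lambda i: weights[i])
          p_best = weights[best] / total
          p_wrong = max(1.0 - p_best, 1e-10)                # avoid log(0) for unique hits
          return best, -10.0 * math.log10(p_wrong)

      # Example: two candidate hits, the first fitting the read much better.
      print(mapping_quality([-2.0, -6.0]))

    With this scaling, a read whose best candidate fits 10,000 times better than the alternative receives a quality near 40.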

  9. Software-codec-based full motion video conferencing on the PC using visual pattern image sequence coding

    NASA Astrophysics Data System (ADS)

    Barnett, Barry S.; Bovik, Alan C.

    1995-04-01

    This paper presents a real time full motion video conferencing system based on the Visual Pattern Image Sequence Coding (VPISC) software codec. The prototype system hardware is comprised of two personal computers, two camcorders, two frame grabbers, and an ethernet connection. The prototype system software has a simple structure. It runs under the Disk Operating System, and includes a user interface, a video I/O interface, an event driven network interface, and a free running or frame synchronous video codec that also acts as the controller for the video and network interfaces. Two video coders have been tested in this system. Simple implementations of Visual Pattern Image Coding and VPISC have both proven to support full motion video conferencing with good visual quality. Future work will concentrate on expanding this prototype to support the motion compensated version of VPISC, as well as encompassing point-to-point modem I/O and multiple network protocols. The application will be ported to multiple hardware platforms and operating systems. The motivation for developing this prototype system is to demonstrate the practicality of software based real time video codecs. Furthermore, software video codecs are not only cheaper, but are more flexible system solutions because they enable different computer platforms to exchange encoded video information without requiring on-board protocol compatible video codec hardware. Software based solutions enable true low cost video conferencing that fits the `open systems' model of interoperability that is so important for building portable hardware and software applications.

  10. Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    BEVINS, R.R.

    2000-09-20

    The MPSS Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPSS software. The document also describes the methodology for controlling and managing changes to the software.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Errion, S.M.; Thommes, M.M.; Caruthers, C.M.

    Using the Apple LaserWriter at ANL (ANL/TM 452) explains how Argonne computer users (with CMS, MVS, or VAX/VMS accounts) can print quality text and graphics on the Apple LaserWriter. Currently, applications at Argonne that are compatible with the Apple LaserWriter include Waterloo Script, CA/ISSCO graphics software (i.e., Cuechart, Tellagraf, and Disspla), SAS/Graph, ANSYS (version 4.2), and some personal computer text and graphics software. This manual does not attempt to cover use of the Apple LaserWriter with other applications, though some information on the handling of PostScript-compatible files may be valid for other applications. Refer to the documentation of those applications to learn how they work with the Apple LaserWriter. Most of the information in this manual applies to the Allied Linotype L300P typesetter in Building 222. However, the typesetter is not a high volume output device and should be used primarily for high quality (1250 and 2500 dots per inch) final copy output for Laboratory publications prior to making printing plates. You should print all drafts and proof pages on LaserWriters or other printers compatible with the PostScript page description language. Consult with Graphic Arts (at extension 2-5603) to determine the availability of the typesetter for printing the final copy of your document or graphics application. Since the Apple LaserWriter itself produces good quality output (300 dots per inch), we expect that most internal documents consisting of text or graphics will continue to be printed at LaserWriters distributed throughout the Laboratory. 5 figs., 2 tabs.

  12. Software Quality and Copyright: Issues in Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Helm, Virginia

    The two interconnected problems of educational quality and piracy are described and analyzed in this book, which begins with an investigation of the accusations regarding the alleged dismal quality of educational software. The reality behind accusations of rampant piracy and the effect of piracy on the quality of educational software is examined…

  13. Software Quality Assurance Audits Guidebooks

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  14. Boutiques: a flexible framework to integrate command-line applications in computing platforms

    PubMed Central

    Glatard, Tristan; Kiar, Gregory; Aumentado-Armstrong, Tristan; Beck, Natacha; Bellec, Pierre; Bernard, Rémi; Bonnet, Axel; Brown, Shawn T; Camarasu-Pop, Sorina; Cervenansky, Frédéric; Das, Samir; Ferreira da Silva, Rafael; Flandin, Guillaume; Girard, Pascal; Gorgolewski, Krzysztof J; Guttmann, Charles R G; Hayot-Sasson, Valérie; Quirion, Pierre-Olivier; Rioux, Pierre; Rousseau, Marc-Étienne; Evans, Alan C

    2018-01-01

    Abstract We present Boutiques, a system to automatically publish, integrate, and execute command-line applications across computational platforms. Boutiques applications are installed through software containers described in a rich and flexible JSON language. A set of core tools facilitates the construction, validation, import, execution, and publishing of applications. Boutiques is currently supported by several distinct virtual research platforms, and it has been used to describe dozens of applications in the neuroinformatics domain. We expect Boutiques to improve the quality of application integration in computational platforms, to reduce redundancy of effort, to contribute to computational reproducibility, and to foster Open Science. PMID:29718199
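
    As a rough illustration of describing a command-line tool declaratively and rendering a concrete invocation from it, the toy example below uses a Python dictionary loosely modeled on the JSON descriptors mentioned above. The field names (command_line, value_key, container_image) and the smoothing tool are invented for the example; they are not the actual Boutiques schema.

      # Toy descriptor for a hypothetical smoothing tool; not a real Boutiques document.
      descriptor = {
          "name": "example-smoothing-tool",
          "container_image": "docker://example/smooth:1.0",      # hypothetical image
          "command_line": "smooth [INPUT] -o [OUTPUT] --fwhm [FWHM]",
          "inputs": [
              {"id": "input_file", "value_key": "[INPUT]"},
              {"id": "output_file", "value_key": "[OUTPUT]"},
              {"id": "fwhm", "value_key": "[FWHM]"},
          ],
      }

      def render_command(desc, invocation):
          """Substitute user-supplied values into the command-line template."""
          cmd = desc["command_line"]
          for inp in desc["inputs"]:
              cmd = cmd.replace(inp["value_key"], str(invocation[inp["id"]]))
          return cmd

      print(render_command(descriptor, {"input_file": "sub-01_T1w.nii.gz",
                                        "output_file": "sub-01_smooth.nii.gz",
                                        "fwhm": 6}))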

  15. Section 3. The SPARROW Surface Water-Quality Model: Theory, Application and User Documentation

    USGS Publications Warehouse

    Schwarz, G.E.; Hoos, A.B.; Alexander, R.B.; Smith, R.A.

    2006-01-01

    SPARROW (SPAtially Referenced Regressions On Watershed attributes) is a watershed modeling technique for relating water-quality measurements made at a network of monitoring stations to attributes of the watersheds containing the stations. The core of the model consists of a nonlinear regression equation describing the non-conservative transport of contaminants from point and diffuse sources on land to rivers and through the stream and river network. The model predicts contaminant flux, concentration, and yield in streams and has been used to evaluate alternative hypotheses about the important contaminant sources and watershed properties that control transport over large spatial scales. This report provides documentation for the SPARROW modeling technique and computer software to guide users in constructing and applying basic SPARROW models. The documentation gives details of the SPARROW software, including the input data and installation requirements, and guidance in the specification, calibration, and application of basic SPARROW models, as well as descriptions of the model output and its interpretation. The documentation is intended for both researchers and water-resource managers with interest in using the results of existing models and developing and applying new SPARROW models. The documentation of the model is presented in two parts. Part 1 provides a theoretical and practical introduction to SPARROW modeling techniques, which includes a discussion of the objectives, conceptual attributes, and model infrastructure of SPARROW. Part 1 also includes background on the commonly used model specifications and the methods for estimating and evaluating parameters, evaluating model fit, and generating water-quality predictions and measures of uncertainty. Part 2 provides a user's guide to SPARROW, which includes a discussion of the software architecture and details of the model input requirements and output files, graphs, and maps. The text documentation and computer software are available on the Web at http://usgs.er.gov/sparrow/sparrow-mod/.
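
    The source-plus-attenuation bookkeeping that a SPARROW-type model estimates by nonlinear regression can be caricatured in a few lines: each reach delivers a fraction of its local sources to the stream and passes the accumulated flux downstream with first-order in-stream loss. The coefficients, attenuation terms and two-reach network below are invented for this sketch; the calibrated model structure is documented in the report itself.

      import math

      def reach_flux(upstream_flux, local_sources, source_coefs, land_delivery,
                     stream_decay_rate, travel_time_days):
          """Flux leaving a reach: attenuated upstream flux plus delivered local sources."""
          local = sum(coef * amount * land_delivery
                      for coef, amount in zip(source_coefs, local_sources))
          instream_loss = math.exp(-stream_decay_rate * travel_time_days)
          return (upstream_flux + local) * instream_loss

      # Headwater reach, then a downstream reach receiving its output.
      f1 = reach_flux(0.0, [120.0, 40.0], [0.05, 0.30], 0.6, 0.10, 1.5)
      f2 = reach_flux(f1,  [300.0, 10.0], [0.05, 0.30], 0.4, 0.10, 2.0)
      print(round(f1, 2), round(f2, 2))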

  16. Quality assurance and control issues for HF radar wave and current measurements

    NASA Astrophysics Data System (ADS)

    Wyatt, Lucy

    2015-04-01

    HF radars are now widely used to provide surface current measurements over wide areas of the coastal ocean for scientific and operational applications. In general, data quality is acceptable for these applications but there remain issues that impact on the quantity and quality of the data. These include problems with calibration and interference which impact on both phased array (e.g. WERA, Pisces) and direction-finding (e.g. SeaSonde) radars. These same issues and others (e.g. signal-to-noise, in-cell current variability, antenna sidelobes) also impact on the quality and quantity of wave data that can be obtained. These issues will be discussed in this paper, illustrated with examples from deployments of WERA, Pisces and SeaSonde radars in the UK, Europe, USA and Australia. These issues involve both quality assurance (making sure the radars perform to spec and the software is fully operational) and quality control (identifying problems with the data due to radar hardware or software performance issues and flagging these in the provided data streams). Recommendations for the former, and current practice (of the author and within the Australian Coastal Ocean Radar Network, ACORN*) for the latter, will be discussed. The quality control processes for wave measurement are not yet as well developed as those for currents and data from some deployments can be rather noisy. Some new methods, currently under development by SeaView Sensing Ltd and being tested with ACORN data, will be described and results presented. *ACORN is a facility of the Australian Integrated Marine Observing System, IMOS. IMOS is a national collaborative research infrastructure, supported by the Australian Government. It is led by the University of Tasmania in partnership with the Australian marine and climate science community.

  17. What Is An Expert System? ERIC Digest.

    ERIC Educational Resources Information Center

    Boss, Richard W.

    This digest describes and defines the various components of an expert system, e.g., a computerized tool designed to enhance the quality and availability of knowledge required by decision makers. It is noted that expert systems differ from conventional applications software in the following areas: (1) the existence of the expert systems shell, or…

  18. An integrated toolbox for processing and analysis of remote sensing data of inland and coastal waters - atmospheric correction

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Haan, J.F. de; Kokke, J.M.M.; Hoogenboom, H.J.

    1997-06-01

    Deriving thematic maps of water quality parameters from a remote sensing image requires a number of processing steps, such as calibration, atmospheric correction, air-water interface correction, and application of water quality algorithms. A prototype version of an integrated software environment has recently been developed that enables the user to perform and control these processing steps. Major parts of this environment are: (i) access to the MODTRAN 3 radiative transfer code, (ii) a database of water quality algorithms, and (iii) a spectral library of Dutch coastal and inland waters, containing subsurface irradiance reflectance spectra and associated water quality parameters. The atmospheric correction part of this environment is discussed here. It is shown that this part can be used to accurately retrieve spectral signatures of inland water for wavelengths between 450 and 750 nm, provided in situ measurements are used to determine atmospheric model parameters. Assessment of the usefulness of the completely integrated software system in an operational environment requires a revised version that is presently being developed.

  19. The NCC project: A quality management perspective

    NASA Technical Reports Server (NTRS)

    Lee, Raymond H.

    1993-01-01

    The Network Control Center (NCC) Project introduced the concept of total quality management (TQM) in mid-1990. The CSC project team established a program which focused on continuous process improvement in software development methodology and consistent deliveries of high quality software products for the NCC. The vision of the TQM program was to produce error free software. Specific goals were established to allow continuing assessment of the progress toward meeting the overall quality objectives. The total quality environment, now a part of the NCC Project culture, has become the foundation for continuous process improvement and has resulted in the consistent delivery of quality software products over the last three years.

  20. OpenChrom: a cross-platform open source software for the mass spectrometric analysis of chromatographic data.

    PubMed

    Wenig, Philip; Odermatt, Juergen

    2010-07-30

    Today, data evaluation has become a bottleneck in chromatographic science. Analytical instruments equipped with automated samplers yield large amounts of measurement data, which needs to be verified and analyzed. Since nearly every GC/MS instrument vendor offers its own data format and software tools, the consequences are problems with data exchange and a lack of comparability between the analytical results. To challenge this situation a number of either commercial or non-profit software applications have been developed. These applications provide functionalities to import and analyze several data formats but have shortcomings in terms of the transparency of the implemented analytical algorithms and/or are restricted to a specific computer platform. This work describes a native approach to handle chromatographic data files. The approach can be extended in its functionality such as facilities to detect baselines, to detect, integrate and identify peaks and to compare mass spectra, as well as the ability to internationalize the application. Additionally, filters can be applied on the chromatographic data to enhance its quality, for example to remove background and noise. Extended operations like do, undo and redo are supported. OpenChrom is a software application to edit and analyze mass spectrometric chromatographic data. It is extensible in many different ways, depending on the demands of the users or the analytical procedures and algorithms. It offers a customizable graphical user interface. The software is independent of the operating system, due to the fact that the Rich Client Platform is written in Java. OpenChrom is released under the Eclipse Public License 1.0 (EPL). There are no license constraints regarding extensions. They can be published using open source as well as proprietary licenses. OpenChrom is available free of charge at http://www.openchrom.net.
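
    OpenChrom itself is a Java/Eclipse RCP application; the short Python sketch below only illustrates the kind of baseline-removal filter the abstract mentions (a rolling-minimum baseline subtracted from the signal) and is not code from, or for, OpenChrom.

      def rolling_min_baseline(signal, half_window):
          """Estimate a crude baseline as the local minimum around each point."""
          n = len(signal)
          baseline = []
          for i in range(n):
              lo, hi = max(0, i - half_window), min(n, i + half_window + 1)
              baseline.append(min(signal[lo:hi]))
          return baseline

      def subtract_baseline(signal, half_window=25):
          """Remove the drifting baseline and clip negative residuals to zero."""
          base = rolling_min_baseline(signal, half_window)
          return [max(s - b, 0.0) for s, b in zip(signal, base)]

      # Synthetic chromatogram: a drifting baseline with one peak near scan 100.
      raw = [0.02 * i + (5.0 if 95 <= i <= 105 else 0.0) for i in range(200)]
      corrected = subtract_baseline(raw)
      print(max(corrected[95:106]), round(corrected[150], 3))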

  1. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Hughes, Mark S.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for a non-proprietary data acquisition systems (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer which provides the transparency of the software application layers to underlying hardware regardless of test facility location and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis development and deployment.

  2. Monte Carlo calculations for reporting patient organ doses from interventional radiology

    NASA Astrophysics Data System (ADS)

    Huo, Wanli; Feng, Mang; Pi, Yifei; Chen, Zhi; Gao, Yiming; Xu, X. George

    2017-09-01

    This paper describes a project to generate organ dose data for the purpose of extending the VirtualDose software from CT imaging to interventional radiology (IR) applications. A library of 23 mesh-based anthropometric patient phantoms was used in Monte Carlo simulations for the database calculations. Organ doses and effective doses of IR procedures with specific beam projection, field of view (FOV) and beam quality for all parts of the body were obtained. Comparing organ doses generated by VirtualDose-IR for different beam qualities, beam projections, patients' ages and body mass indexes (BMIs), significant discrepancies were observed. For the relatively long exposures typical of IR, doses depend on beam quality, beam direction and patient size. Therefore, VirtualDose-IR, which is based on the latest anatomically realistic patient phantoms, can generate accurate doses for IR treatment. This software is suitable for clinical IR dose management as an effective tool to estimate patient doses and optimize IR treatment plans.
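
    For reference, effective dose in the ICRP formalism is the tissue-weighted sum of organ equivalent doses, E = sum over tissues T of w_T * H_T. The snippet below applies that sum to a few illustrative organ doses; the dose values are invented and only a handful of ICRP Publication 103 weighting factors are shown, so this is not VirtualDose-IR output or its implementation.

      # A small subset of ICRP 103 tissue weighting factors (illustrative subset only).
      ICRP103_WEIGHTS = {
          "lung": 0.12, "stomach": 0.12, "liver": 0.04, "thyroid": 0.04, "skin": 0.01,
      }

      def effective_dose(organ_doses_mSv, weights=ICRP103_WEIGHTS):
          """Weighted sum over the organs for which doses and weights are available."""
          return sum(weights[o] * d for o, d in organ_doses_mSv.items() if o in weights)

      # Invented organ equivalent doses for one hypothetical procedure, in mSv.
      print(effective_dose({"lung": 8.0, "stomach": 2.5, "liver": 3.0}))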

  3. A neurite quality index and machine vision software for improved quantification of neurodegeneration.

    PubMed

    Romero, Peggy; Miller, Ted; Garakani, Arman

    2009-12-01

    Current methods to assess neurodegeneration in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications with decreased costs.

  4. The impact of software quality characteristics on healthcare outcome: a literature review.

    PubMed

    Aghazadeh, Sakineh; Pirnejad, Habibollah; Moradkhani, Alireza; Aliev, Alvosat

    2014-01-01

    The aim of this study was to discover the effect of software quality characteristics on healthcare quality and efficiency indicators. Through a systematic literature review, we selected and analyzed 37 original research papers to investigate the impact of the software indicators (coming from the standard ISO 9126 quality characteristics and sub-characteristics) on some important healthcare outcome indicators and finally ranked these software indicators. The results showed that the software characteristics usability, reliability and efficiency were mostly favored in the studies, indicating their importance. On the other hand, user satisfaction, quality of patient care, clinical workflow efficiency, providers' communication and information exchange, patient satisfaction and care costs were among the healthcare outcome indicators frequently evaluated in relation to the mentioned software characteristics. Logistic regression was the most common assessment methodology, and Confirmatory Factor Analysis and Structural Equation Modeling were performed to test the structural model's fit. The software characteristics were considered to impact the healthcare outcome indicators through other intermediate factors (variables).

  5. Development (design and systematization) of HMS Group pump ranges

    NASA Astrophysics Data System (ADS)

    Tverdokhleb, I.; Yamburenko, V.

    2017-08-01

    The article explains the need to develop pump range charts for different applications and describes the main principles used by HMS Group. Some modern approaches to pump selection are reviewed, and the need for pump compliance with international standards and modern customer requirements is highlighted. Even though pump design types are similar across applications, they need adjustment to specific requirements, which leads manufacturers to develop their own design for each pump range. Having wide pump ranges for different applications makes it possible to create pump selection software, helping manufacturers prepare high-quality quotations in the shortest time.

  6. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    PubMed

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
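
    The quantification step performed by such a toolbox is often based on a single-compartment (pseudo-)continuous ASL model of the general form used in the ASL consensus literature. The function below is a hedged illustration of that model with typical 3T parameter defaults; it is not necessarily the exact formula or the parameter values implemented in ASAP.

      import math

      def cbf_ml_per_100g_min(delta_m, m0, label_dur_s=1.8, pld_s=1.8,
                              t1_blood_s=1.65, labeling_eff=0.85, lambda_ml_g=0.9):
          """Cerebral blood flow from the control-label difference and the M0 image.

          Single-compartment (p)CASL-style quantification; defaults are typical
          3T values, not values taken from the paper.
          """
          numerator = 6000.0 * lambda_ml_g * delta_m * math.exp(pld_s / t1_blood_s)
          denominator = (2.0 * labeling_eff * t1_blood_s * m0
                         * (1.0 - math.exp(-label_dur_s / t1_blood_s)))
          return numerator / denominator

      # Example: a labeled-control difference equal to 0.8% of the M0 signal.
      print(round(cbf_ml_per_100g_min(delta_m=12.0, m0=1500.0), 1))

    For these example inputs the result is roughly 69 mL/100 g/min, a physiologically plausible gray-matter value.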

  7. Data from selected U.S. Geological Survey National Stream Water Quality Monitoring Networks

    USGS Publications Warehouse

    Alexander, Richard B.; Slack, James R.; Ludtke, Amy S.; Fitzgerald, Kathleen K.; Schertz, Terry L.

    1998-01-01

    A nationally consistent and well-documented collection of water quality and quantity data compiled during the past 30 years for streams and rivers in the United States is now available on CD-ROM and accessible over the World Wide Web. The data include measurements from two U.S. Geological Survey (USGS) national networks for 122 physical, chemical, and biological properties of water collected at 680 monitoring stations from 1962 to 1995, quality assurance information that describes the sample collection agencies, laboratories, analytical methods, and estimates of laboratory measurement error (bias and variance), and information on selected cultural and natural characteristics of the station watersheds. The data are easily accessed via user-supplied software including Web browser, spreadsheet, and word processor, or may be queried and printed according to user-specified criteria using the supplied retrieval software on CD-ROM. The water quality data serve a variety of scientific uses including research and educational applications related to trend detection, flux estimation, investigations of the effects of the natural environment and cultural sources on water quality, and the development of statistical methods for designing efficient monitoring networks and interpreting water resources data.

  8. Software-safety and software quality assurance in real-time applications Part 2: Real-time structures and languages

    NASA Astrophysics Data System (ADS)

    Schoitsch, Erwin

    1988-07-01

    Our society is depending more and more on the reliability of embedded (real-time) computer systems even in everyday life. Considering the complexity of the real world, this might become a severe threat. Real-time programming is a discipline important not only in process control and data acquisition systems, but also in fields like communication, office automation, interactive databases, interactive graphics and operating systems development. General concepts of concurrent programming and constructs for process-synchronization are discussed in detail. Tasking and synchronization concepts, methods of process communication, interrupt- and timeout handling in systems based on semaphores, signals, conditional critical regions or on real-time languages like Concurrent PASCAL, MODULA, CHILL and ADA are explained and compared with each other and with respect to their potential for quality and safety.

  9. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    ERIC Educational Resources Information Center

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development; however, this goal often remains unattained. Problem factors in software development, and how these have affected the maintainability of the delivered software systems, require a thorough investigation. It was, therefore, very important to understand software…

  10. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    ERIC Educational Resources Information Center

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  11. Experience report: Using formal methods for requirements analysis of critical spacecraft software

    NASA Technical Reports Server (NTRS)

    Lutz, Robyn R.; Ampo, Yoko

    1994-01-01

    Formal specification and analysis of requirements continues to gain support as a method for producing more reliable software. However, the introduction of formal methods to a large software project is difficult, due in part to the unfamiliarity of the specification languages and the lack of graphics. This paper reports results of an investigation into the effectiveness of formal methods as an aid to the requirements analysis of critical, system-level fault-protection software on a spacecraft currently under development. Our experience indicates that formal specification and analysis can enhance the accuracy of the requirements and add assurance prior to design development in this domain. The work described here is part of a larger, NASA-funded research project whose purpose is to use formal-methods techniques to improve the quality of software in space applications. The demonstration project described here is part of the effort to evaluate experimentally the effectiveness of supplementing traditional engineering approaches to requirements specification with the more rigorous specification and analysis available with formal methods.

  12. [EpiData: the natural heir to EpiInfo 6?].

    PubMed

    Bohigas, Pedro Arias; Lauritsen, Jens L

    2007-01-01

    EpiData is epidemiological software developed by the EpiData Association (www.epidata.dk). Following the EpiInfo 6 philosophy, EpiData offers all the advantages of EpiInfo 6: simplicity, applicability, and modest operating and communication system requirements, extending them with a clear focus on data quality and documentation, plus the advantages that the Windows operating system has for many users. The aim of this note is to introduce potential users to the strengths and limitations of EpiData, software that could quickly become the equivalent of what EpiInfo 6 was a few years ago.

  13. Astronomy Data Visualization with Blender

    NASA Astrophysics Data System (ADS)

    Kent, Brian R.

    2015-08-01

    We present innovative methods and techniques for using Blender, a 3D software package, in the visualization of astronomical data. N-body simulations, data cubes, galaxy and stellar catalogs, and planetary surface maps can be rendered in high quality videos for exploratory data analysis. Blender's API is Python based, making it advantageous for use in astronomy with flexible libraries like astroPy. Examples are exhibited that showcase the features of the software in astronomical visualization paradigms. 2D and 3D voxel texture applications, animations, camera movement, and composite renders are introduced to the astronomer's toolkit, along with how they mesh with different forms of data.

  14. The application of virtual reality systems as a support of digital manufacturing and logistics

    NASA Astrophysics Data System (ADS)

    Golda, G.; Kampa, A.; Paprocka, I.

    2016-08-01

    Modern trends in the development of computer-aided techniques are heading toward the integration of designing competitive products and so-called "digital manufacturing and logistics", supported by computer simulation software. All phases of the product lifecycle, starting from the design of a new product, through planning and control of manufacturing, assembly, internal logistics and repairs, quality control, distribution to customers and after-sale service, up to its recycling or utilization, should be aided and managed by advanced packages of product lifecycle management software. Important problems in providing an efficient flow of materials in supply chain management across the whole product lifecycle, using computer simulation, are described in this paper. The authors pay attention to the processes of acquiring relevant information and correct data, necessary for virtual modeling and computer simulation of integrated manufacturing and logistics systems. The article describes possible uses and applications of virtual reality software for modeling and simulating production and logistics processes in the enterprise across different aspects of product lifecycle management. The authors demonstrate an effective method of creating computer simulations for digital manufacturing and logistics and show modeled and programmed examples and solutions. They pay attention to development trends and show options for applications that go beyond the enterprise.

  15. A mHealth Application for Chronic Wound Care: Findings of a User Trial

    PubMed Central

    Friesen, Marcia R.; Hamel, Carole; McLeod, Robert D.

    2013-01-01

    This paper reports on the findings of a user trial of a mHealth application for pressure ulcer (bedsore) documentation. Pressure ulcers are a leading iatrogenic cause of death in developed countries and significantly impact quality of life for those affected. Pressure ulcers will be an increasing public health concern as the population ages. Electronic information systems are being explored to improve consistency and accuracy of documentation, improve patient and caregiver experience and ultimately improve patient outcomes. A software application was developed for Android Smartphones and tablets and was trialed in a personal care home in Western Canada. The software application provides an electronic medical record for chronic wounds, replacing nurses’ paper-based charting and is positioned for integration with facility’s larger eHealth framework. The mHealth application offers three intended benefits over paper-based charting of chronic wounds, including: (1) the capacity for remote consultation (telehealth between facilities, practitioners, and/or remote communities), (2) data organization and analysis, including built-in alerts, automatically-generated text-based and graph-based wound histories including wound images, and (3) tutorial support for non-specialized caregivers. The user trial yielded insights regarding the software application’s design and functionality in the clinical setting, and highlighted the key role of wound photographs in enhancing patient and caregiver experiences, enhancing communication between multiple healthcare professionals, and leveraging the software’s telehealth capacities. PMID:24256739

  16. AnClim and ProClimDB software for data quality control and homogenization of time series

    NASA Astrophysics Data System (ADS)

    Stepanek, Petr

    2015-04-01

    During the last decade, a software package consisting of AnClim, ProClimDB and LoadData for processing (mainly climatological) data has been created. This software offers a comprehensive solution for processing climatological time series, starting from loading the data from a central database (e.g. Oracle, via the LoadData software), through data quality control and homogenization, to time series analysis, extreme value evaluations and RCM output verification and correction (ProClimDB and AnClim software). The detection of inhomogeneities is carried out on a monthly scale through the application of AnClim, or newly by R functions called from ProClimDB, while quality control, the preparation of reference series and the correction of detected breaks are carried out by the ProClimDB software. The software combines many statistical tests, types of reference series and time scales (monthly, seasonal and annual, daily and sub-daily ones). These can be used to create an "ensemble" of solutions, which may be more reliable than any single method. The AnClim software is suitable for educational purposes, e.g. for students getting acquainted with methods used in climatology. Built-in graphical tools and comparison of various statistical tests help in better understanding a given method. ProClimDB, by contrast, is a tool aimed at processing large climatological datasets. Recently, R functions can be used within the software, making it more efficient in data processing and allowing easy inclusion of new methods (when available in R). An example of usage is the easy comparison of methods for the correction of inhomogeneities in daily data (HOM of Paul Della-Marta, SPLIDHOM method of Olivier Mestre, DAP - own method, QM of Xiaolan Wang and others). The software is available, together with further information, at www.climahom.eu. Acknowledgement: this work was partially funded by the project "Building up a multidisciplinary scientific team focused on drought" No. CZ.1.07/2.3.00/20.0248.
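
    One of the daily-correction families mentioned above, quantile matching, can be illustrated with a minimal empirical version: a value from the inhomogeneous segment is mapped to the value occupying the same quantile in the homogeneous reference segment. The code below is a deliberately simple sketch of that idea with invented numbers, not the algorithm implemented in ProClimDB or AnClim.

      def empirical_quantile(sorted_vals, q):
          """Linearly interpolated empirical quantile of a sorted sample, 0 <= q <= 1."""
          pos = q * (len(sorted_vals) - 1)
          lo = int(pos)
          hi = min(lo + 1, len(sorted_vals) - 1)
          frac = pos - lo
          return sorted_vals[lo] * (1 - frac) + sorted_vals[hi] * frac

      def quantile_map(value, biased_sample, reference_sample):
          """Map a value's quantile in the biased segment onto the reference distribution."""
          b = sorted(biased_sample)
          r = sorted(reference_sample)
          rank = sum(1 for x in b if x <= value)     # empirical CDF position in the biased sample
          q = min(rank / len(b), 1.0)
          return empirical_quantile(r, q)

      biased = [10.2, 11.0, 12.5, 13.1, 14.0]        # segment measured after a station move
      reference = [9.0, 9.8, 11.2, 11.9, 12.7]       # homogeneous reference segment
      print(round(quantile_map(12.5, biased, reference), 2))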

  17. Statistical analysis of water-quality data containing multiple detection limits: S-language software for regression on order statistics

    USGS Publications Warehouse

    Lee, L.; Helsel, D.

    2005-01-01

    Trace contaminants in water, including metals and organics, often are measured at sufficiently low concentrations to be reported only as values below the instrument detection limit. Interpretation of these "less thans" is complicated when multiple detection limits occur. Statistical methods for multiply censored, or multiple-detection limit, datasets have been developed for medical and industrial statistics, and can be employed to estimate summary statistics or model the distributions of trace-level environmental data. We describe S-language-based software tools that perform robust linear regression on order statistics (ROS). The ROS method has been evaluated as one of the most reliable procedures for developing summary statistics of multiply censored data. It is applicable to any dataset that has 0 to 80% of its values censored. These tools are a part of a software library, or add-on package, for the R environment for statistical computing. This library can be used to generate ROS models and associated summary statistics, plot modeled distributions, and predict exceedance probabilities of water-quality standards. © 2005 Elsevier Ltd. All rights reserved.
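
    The basic mechanics of regression on order statistics can be shown for the simple case of a single detection limit: detected values are regressed on the normal scores of their plotting positions, and the censored observations are imputed from the fitted line before summary statistics are computed. The full Helsel/Cohn treatment of multiple detection limits implemented in the library described above is more involved; this sketch, with invented data, only illustrates the idea.

      import math
      from statistics import NormalDist, mean

      def simple_ros(detects, n_censored):
          """Single-detection-limit ROS: impute censored values from a lognormal fit."""
          n = len(detects) + n_censored
          detects = sorted(detects)
          # Plotting positions (Blom-type); censored values occupy the lowest ranks.
          pp = [(n_censored + i + 1 - 0.375) / (n + 0.25) for i in range(len(detects))]
          z = [NormalDist().inv_cdf(p) for p in pp]
          y = [math.log(d) for d in detects]
          # Least-squares fit of log concentration on normal score.
          zbar, ybar = mean(z), mean(y)
          slope = (sum((zi - zbar) * (yi - ybar) for zi, yi in zip(z, y))
                   / sum((zi - zbar) ** 2 for zi in z))
          intercept = ybar - slope * zbar
          # Impute the censored observations from the fitted line.
          pp_cens = [(i + 1 - 0.375) / (n + 0.25) for i in range(n_censored)]
          imputed = [math.exp(intercept + slope * NormalDist().inv_cdf(p)) for p in pp_cens]
          return imputed + detects

      values = simple_ros(detects=[0.8, 1.2, 2.0, 3.5, 6.0], n_censored=3)
      print(round(mean(values), 3))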

  18. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience...result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is...Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry

  19. A software communication tool for the tele-ICU.

    PubMed

    Pimintel, Denise M; Wei, Shang Heng; Odor, Alberto

    2013-01-01

    The Tele Intensive Care Unit (tele-ICU) supports a high volume, high acuity population of patients. There is a high volume of incoming and outgoing calls, especially during the evening and night hours, through the tele-ICU hubs. The tele-ICU clinicians must be able to communicate effectively with team members in order to support the care of complex and critically ill patients while supporting and maintaining a standard to improve time to intervention. This study describes a software communication tool that will improve the time to intervention, over the paper-driven communication format presently used in the tele-ICU. The software provides a multi-relational database of message instances to mine information for evaluation and quality improvement for all entities that touch the tele-ICU. The software design incorporates years of critical care and software design experience combined with new skills acquired in an applied Health Informatics program. This software tool will function in the tele-ICU environment and perform as a front-end application that gathers, routes, and displays internal communication messages for intervention by priority and provider.
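
    A message store that is routed and displayed "by priority and provider" can be approximated with a priority queue over structured message records. The toy schema and priority scheme below are hypothetical, invented only to make the routing behaviour concrete; they are not the study's actual data model.

      import heapq
      from dataclasses import dataclass, field
      from datetime import datetime

      @dataclass(order=True)
      class Message:
          priority: int                         # 1 = most urgent (hypothetical scheme)
          created: datetime                     # ties broken by arrival time
          provider: str = field(compare=False)
          text: str = field(compare=False)

      queue = []
      heapq.heappush(queue, Message(2, datetime(2013, 5, 1, 2, 10), "RN-7", "Vent alarm review"))
      heapq.heappush(queue, Message(1, datetime(2013, 5, 1, 2, 12), "MD-3", "Hypotension, bed 14"))
      heapq.heappush(queue, Message(3, datetime(2013, 5, 1, 2, 15), "RN-2", "Routine lab posted"))

      # Pop messages in intervention order: priority first, then arrival time.
      while queue:
          msg = heapq.heappop(queue)
          print(msg.priority, msg.provider, msg.text)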

  20. A Framework for Evaluating the Software Product Quality of Pregnancy Monitoring Mobile Personal Health Records.

    PubMed

    Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis

    2016-03-01

    Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring.
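
    The degree-of-impact figures quoted above can be reproduced mechanically from a checklist that marks which requirements affect which quality characteristics: the impact on a characteristic is simply the share of requirements that touch it. The requirements and characteristic assignments below are invented examples, not the paper's actual checklist.

      # Invented requirements and checklist cells; only the calculation is illustrated.
      requirements = ["record weight", "remind appointments", "export data", "secure login"]
      checklist = {
          "Functional suitability": {"record weight", "remind appointments", "export data", "secure login"},
          "Operability":            {"record weight", "remind appointments", "secure login"},
          "Compatibility":          {"export data"},
      }

      def impact_percentages(checklist, requirements):
          """Share of requirements affecting each quality characteristic, in percent."""
          return {ch: 100.0 * len(affected) / len(requirements)
                  for ch, affected in checklist.items()}

      for characteristic, pct in impact_percentages(checklist, requirements).items():
          print(f"{characteristic}: {pct:.0f}%")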

  1. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper will present data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. Data collected varied by project but usually included defect and fault density with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 and suggested productivity measures in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.

  2. A Smartphone Application for Customized Frequency Table Selection in Cochlear Implants.

    PubMed

    Jethanamest, Daniel; Azadpour, Mahan; Zeman, Annette M; Sagi, Elad; Svirsky, Mario A

    2017-09-01

    A novel smartphone-based software application can facilitate self-selection of frequency allocation tables (FAT) in postlingually deaf cochlear implant (CI) users. CIs use FATs to represent the tonotopic organization of a normal cochlea. Current CI fitting methods typically use a standard FAT for all patients regardless of individual differences in cochlear size and electrode location. In postlingually deaf patients, different amounts of mismatch can result between the frequency-place function they experienced when they had normal hearing and the frequency-place function that results from the standard FAT. For some CI users, an alternative FAT may enhance sound quality or speech perception. Currently, no widely available tools exist to aid real-time selection of different FATs. This study aims to develop a new smartphone tool for this purpose and to evaluate speech perception and sound quality measures in a pilot study of CI subjects using this application. A smartphone application for a widely available mobile platform (iOS) was developed to serve as a preprocessor of auditory input to a clinical CI speech processor and enable interactive real-time selection of FATs. The application's output was validated by measuring electrodograms for various inputs. A pilot study was conducted in six CI subjects. Speech perception was evaluated using word recognition tests. All subjects successfully used the portable application with their clinical speech processors to experience different FATs while listening to running speech. The users were all able to select one table that they judged provided the best sound quality. All subjects chose a FAT different from the standard FAT in their everyday clinical processor. Using the smartphone application, the mean consonant-nucleus-consonant score with the default FAT selection was 28.5% (SD 16.8) and 29.5% (SD 16.4) when using a self-selected FAT. A portable smartphone application enables CI users to self-select frequency allocation tables in real time. Even though the self-selected FATs that were deemed to have better sound quality were only tested acutely (i.e., without long-term experience with them), speech perception scores were not inferior to those obtained with the clinical FATs. This software application may be a valuable tool for improving future methods of CI fitting.

  3. Cost Estimation of Software Development and the Implications for the Program Manager

    DTIC Science & Technology

    1992-06-01

    Software Lifecycle Model (SLIM), the Jensen System-4 model, the Software Productivity, Quality, and Reliability Estimator (SPQR/20), the Constructive...function models in current use are the Software Productivity, Quality, and Reliability Estimator (SPQR/20) and the Software Architecture Sizing and...Estimator (SPQR/20) was developed by T. Capers Jones of Software Productivity Research, Inc., in 1985. The model is intended to estimate the outcome

  4. MetaDB a Data Processing Workflow in Untargeted MS-Based Metabolomics Experiments.

    PubMed

    Franceschi, Pietro; Mylonas, Roman; Shahaf, Nir; Scholz, Matthias; Arapitsas, Panagiotis; Masuero, Domenico; Weingart, Georg; Carlin, Silvia; Vrhovsek, Urska; Mattivi, Fulvio; Wehrens, Ron

    2014-01-01

    Due to their sensitivity and speed, mass-spectrometry based analytical technologies are widely used in metabolomics to characterize biological phenomena. To address issues like metadata organization, quality assessment, data processing, data storage, and, finally, submission to public repositories, bioinformatic pipelines of a non-interactive nature are often employed, complementing the interactive software used for initial inspection and visualization of the data. These pipelines often are created as open-source software allowing the complete and exhaustive documentation of each step, ensuring the reproducibility of the analysis of extensive and often expensive experiments. In this paper, we will review the major steps which constitute such a data processing pipeline, discussing them in the context of open-source software for untargeted MS-based metabolomics experiments recently developed at our institute. The software has been developed by integrating our metaMS R package with a user-friendly web-based application written in Grails. MetaMS takes care of data pre-processing and annotation, while the interface deals with the creation of the sample lists, the organization of the data storage, and the generation of survey plots for quality assessment. Experimental and biological metadata are stored in the ISA-Tab format, making the proposed pipeline fully integrated with the Metabolights framework.

  5. AlignerBoost: A Generalized Software Toolkit for Boosting Next-Gen Sequencing Mapping Accuracy Using a Bayesian-Based Mapping Quality Framework

    PubMed Central

    Zheng, Qi; Grice, Elizabeth A.

    2016-01-01

    Accurate mapping of next-generation sequencing (NGS) reads to reference genomes is crucial for almost all NGS applications and downstream analyses. Various repetitive elements in human and other higher eukaryotic genomes contribute in large part to ambiguously (non-uniquely) mapped reads. Most available NGS aligners attempt to address this by either removing all non-uniquely mapping reads, or reporting one random or "best" hit based on simple heuristics. Accurate estimation of the mapping quality of NGS reads is therefore critical albeit completely lacking at present. Here we developed a generalized software toolkit "AlignerBoost", which utilizes a Bayesian-based framework to accurately estimate mapping quality of ambiguously mapped NGS reads. We tested AlignerBoost with both simulated and real DNA-seq and RNA-seq datasets at various thresholds. In most cases, but especially for reads falling within repetitive regions, AlignerBoost dramatically increases the mapping precision of modern NGS aligners without significantly compromising the sensitivity even without mapping quality filters. When using higher mapping quality cutoffs, AlignerBoost achieves a much lower false mapping rate while exhibiting comparable or higher sensitivity compared to the aligner default modes, therefore significantly boosting the detection power of NGS aligners even using extreme thresholds. AlignerBoost is also SNP-aware, and higher quality alignments can be achieved if provided with known SNPs. AlignerBoost’s algorithm is computationally efficient, and can process one million alignments within 30 seconds on a typical desktop computer. AlignerBoost is implemented as a uniform Java application and is freely available at https://github.com/Grice-Lab/AlignerBoost. PMID:27706155

  6. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit

    PubMed Central

    2014-01-01

    Background According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). Methods The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. Results The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. Conclusions The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a standards-compliant development of small and medium-sized medical software can be carried out by a small team with limited resources in a clinical setting. This is of particular relevance as the upcoming revision of the Medical Device Directive is expected to harmonize and tighten the current legal requirements for all European in-house manufacturers. PMID:24655818

  7. First experiences with the implementation of the European standard EN 62304 on medical device software for the quality assurance of a radiotherapy unit.

    PubMed

    Höss, Angelika; Lampe, Christian; Panse, Ralf; Ackermann, Benjamin; Naumann, Jakob; Jäkel, Oliver

    2014-03-21

    According to the latest amendment of the Medical Device Directive standalone software qualifies as a medical device when intended by the manufacturer to be used for medical purposes. In this context, the EN 62304 standard is applicable which defines the life-cycle requirements for the development and maintenance of medical device software. A pilot project was launched to acquire skills in implementing this standard in a hospital-based environment (in-house manufacture). The EN 62304 standard outlines minimum requirements for each stage of the software life-cycle, defines the activities and tasks to be performed and scales documentation and testing according to its criticality. The required processes were established for the pre-existent decision-support software FlashDumpComparator (FDC) used during the quality assurance of treatment-relevant beam parameters. As the EN 62304 standard implicates compliance with the EN ISO 14971 standard on the application of risk management to medical devices, a risk analysis was carried out to identify potential hazards and reduce the associated risks to acceptable levels. The EN 62304 standard is difficult to implement without proper tools, thus open-source software was selected and integrated into a dedicated development platform. The control measures yielded by the risk analysis were independently implemented and verified, and a script-based test automation was retrofitted to reduce the associated test effort. After all documents facilitating the traceability of the specified requirements to the corresponding tests and of the control measures to the proof of execution were generated, the FDC was released as an accessory to the HIT facility. The implementation of the EN 62304 standard was time-consuming, and a learning curve had to be overcome during the first iterations of the associated processes, but many process descriptions and all software tools can be re-utilized in follow-up projects. It has been demonstrated that a standards-compliant development of small and medium-sized medical software can be carried out by a small team with limited resources in a clinical setting. This is of particular relevance as the upcoming revision of the Medical Device Directive is expected to harmonize and tighten the current legal requirements for all European in-house manufacturers.

  8. An application framework of three-dimensional reconstruction and measurement for endodontic research.

    PubMed

    Gao, Yuan; Peters, Ove A; Wu, Hongkun; Zhou, Xuedong

    2009-02-01

    The purpose of this study was to customize an application framework by using the MeVisLab image processing and visualization platform for three-dimensional reconstruction and assessment of tooth and root canal morphology. One maxillary first molar was scanned before and after preparation with ProTaper by using micro-computed tomography. With a customized application framework based on MeVisLab, internal and external anatomy was reconstructed. Furthermore, the dimensions of root canal and radicular dentin were quantified, and effects of canal preparation were assessed. Finally, a virtual preparation with risk analysis was performed to simulate the removal of a broken instrument. This application framework provided an economical platform and met current requirements of endodontic research. The broad-based use of high-quality free software and the resulting exchange of experience might help to improve the quality of endodontic research with micro-computed tomography.

  9. Evaluating software development characteristics: Assessment of software measures in the Software Engineering Laboratory. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1981-01-01

    Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics are discussed in terms of criterion achievement, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.

  10. Automated daily quality control analysis for mammography in a multi-unit imaging center.

    PubMed

    Sundell, Veli-Matti; Mäkelä, Teemu; Meaney, Alexander; Kaasalainen, Touko; Savolainen, Sauli

    2018-01-01

    Background: The high requirements for mammography image quality necessitate a systematic quality assurance process. Digital imaging allows automation of the image quality analysis, which can potentially improve repeatability and objectivity compared to a visual evaluation made by the users. Purpose: To develop an automatic image quality analysis software for daily mammography quality control in a multi-unit imaging center. Material and Methods: An automated image quality analysis software using the discrete wavelet transform and multiresolution analysis was developed for the American College of Radiology accreditation phantom. The software was validated by analyzing 60 randomly selected phantom images from six mammography systems and 20 phantom images with different dose levels from one mammography system. The results were compared to a visual analysis made by four reviewers. Additionally, long-term image quality trends of a full-field digital mammography system and a computed radiography mammography system were investigated. Results: The automated software produced feature detection levels comparable to visual analysis. The agreement was good in the case of fibers, while the software detected somewhat more microcalcifications and characteristic masses. Long-term follow-up via a quality assurance web portal demonstrated the feasibility of using the software for monitoring the performance of mammography systems in a multi-unit imaging center. Conclusion: Automated image quality analysis enables monitoring the performance of digital mammography systems in an efficient, centralized manner.
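    A minimal sketch of the kind of multiresolution analysis described above is given below, using the PyWavelets library; the wavelet choice, decomposition depth and the synthetic "phantom" are assumptions for illustration, not the validated analysis pipeline of the paper.

```python
import numpy as np
import pywt

def detail_energy_by_scale(phantom_image, wavelet="db2", levels=3):
    """Decompose a phantom image and return detail-band energies per scale.

    Bands are returned from coarsest to finest; fine-scale detail energy
    responds to microcalcification-like specks, coarser scales to fibers and
    masses. Pass/fail thresholds would be set per mammography system.
    """
    coeffs = pywt.wavedec2(phantom_image, wavelet, level=levels)
    energies = []
    for band, (cH, cV, cD) in enumerate(coeffs[1:], start=1):
        energy = float(np.sum(cH**2) + np.sum(cV**2) + np.sum(cD**2))
        energies.append((band, energy))
    return energies

# Example with a synthetic 256x256 image standing in for a phantom exposure.
img = np.random.normal(100.0, 2.0, (256, 256))
img[120:123, 120:123] += 50.0   # a speck-like feature
print(detail_energy_by_scale(img))
```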

  11. Improving the quality of EHR recording in primary care: a data quality feedback tool.

    PubMed

    van der Bij, Sjoukje; Khan, Nasra; Ten Veen, Petra; de Bakker, Dinny H; Verheij, Robert A

    2017-01-01

    Electronic health record (EHR) data are used to exchange information among health care providers. For this purpose, the quality of the data is essential. We developed a data quality feedback tool that evaluates differences in EHR data quality among practices and software packages as part of a larger intervention. The tool was applied in 92 practices in the Netherlands using different software packages. Practices received data quality feedback in 2010 and 2012. We observed large differences in the quality of recording. For example, the percentage of episodes of care that had a meaningful diagnostic code ranged from 30% to 100%. Differences were highly related to the software package. A year after the first measurement, the quality of recording had improved significantly and differences decreased, with 67% of the physicians indicating that they had actively changed their recording habits based on the results of the first measurement. About 80% found the feedback helpful in pinpointing recording problems. One of the software vendors made changes in functionality as a result of the feedback. Our EHR data quality feedback tool is capable of highlighting differences among practices and software packages. As such, it also stimulates improvements. As substantial variability in recording is related to the software package, our study strengthens the evidence that data quality can be improved substantially by standardizing the functionalities of EHR software packages.
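    One of the reported indicators, the share of episodes of care carrying a meaningful diagnostic code, can be sketched as follows; the column names, codes and pandas-based layout are hypothetical and only illustrate how such a per-practice, per-package feedback figure might be computed.

```python
import pandas as pd

# Hypothetical extract: one row per episode of care.
episodes = pd.DataFrame({
    "practice_id": [1, 1, 2, 2, 2, 3],
    "software":    ["A", "A", "B", "B", "B", "A"],
    "icpc_code":   ["K86", None, "T90", "A97", None, "R96"],
})

# An episode counts as "meaningfully coded" if it carries a specific ICPC code;
# here "A97" (no disease) and missing codes are treated as non-meaningful.
episodes["meaningful"] = episodes["icpc_code"].notna() & (episodes["icpc_code"] != "A97")

feedback = (episodes
            .groupby(["software", "practice_id"])["meaningful"]
            .mean()
            .mul(100)
            .rename("pct_meaningful_diagnosis"))
print(feedback)
```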

  12. Recent developments in software tools for high-throughput in vitro ADME support with high-resolution MS.

    PubMed

    Paiva, Anthony; Shou, Wilson Z

    2016-08-01

    The last several years have seen the rapid adoption of the high-resolution MS (HRMS) for bioanalytical support of high throughput in vitro ADME profiling. Many capable software tools have been developed and refined to process quantitative HRMS bioanalysis data for ADME samples with excellent performance. Additionally, new software applications specifically designed for quan/qual soft spot identification workflows using HRMS have greatly enhanced the quality and efficiency of the structure elucidation process for high throughput metabolite ID in early in vitro ADME profiling. Finally, novel approaches in data acquisition and compression, as well as tools for transferring, archiving and retrieving HRMS data, are being continuously refined to tackle the issue of large data file size typical for HRMS analyses.

  13. Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories

    NASA Astrophysics Data System (ADS)

    Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly

    The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In this setting, successful project health checks and quality monitoring of software processes require strong project management skills, well-established onshore-offshore coordination, and often regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.

  14. Evaluating Software Assurance Knowledge and Competency of Acquisition Professionals

    DTIC Science & Technology

    2014-10-01

    ...of ISO 12207-2008, both internationally and in the United States [7]. That standard documents a comprehensive set of activities and supporting... ...cyberattacks grows, organizations must ensure that their procurement agents acquire high quality, secure software. ISO 12207 and the Software Assurance Competency...

  15. Value Engineering: An Application to Computer Software

    DTIC Science & Technology

    1995-06-01

    Ref. 4: P.2081 [Parentheses added] Figure S shows the cost function C(x) graphed with the Total Value function TV(x). It can be seen that for any... to be meaningful and accurate for the use... since cost structures for each software development project... maintainability quality characteristics due to long-term considerations affecting life-cycle costs... VE applications provide alternative

  16. SOA: A Quality Attribute Perspective

    DTIC Science & Technology

    2011-06-23

    in software engineering from CMU. Agenda: Service-Oriented Architecture and Software Architecture: Review; Service-Orientation and Quality Attributes; Summary and Future Challenges.

  17. Parallel design patterns for a low-power, software-defined compressed video encoder

    NASA Astrophysics Data System (ADS)

    Bruns, Michael W.; Hunt, Martin A.; Prasad, Durga; Gunupudi, Nageswara R.; Sonachalam, Sekar

    2011-06-01

    Video compression algorithms such as H.264 offer much potential for parallel processing that is not always exploited by the technology of a particular implementation. Consumer mobile encoding devices often achieve real-time performance and low power consumption through parallel processing in Application Specific Integrated Circuit (ASIC) technology, but many other applications require a software-defined encoder. High quality compression features needed for some applications such as 10-bit sample depth or 4:2:2 chroma format often go beyond the capability of a typical consumer electronics device. An application may also need to efficiently combine compression with other functions such as noise reduction, image stabilization, real time clocks, GPS data, mission/ESD/user data or software-defined radio in a low power, field upgradable implementation. Low power, software-defined encoders may be implemented using a massively parallel memory-network processor array with 100 or more cores and distributed memory. The large number of processor elements allows the silicon device to operate more efficiently than conventional DSP or CPU technology. A dataflow programming methodology may be used to express all of the encoding processes including motion compensation, transform and quantization, and entropy coding. This is a declarative programming model in which the parallelism of the compression algorithm is expressed as a hierarchical graph of tasks with message communication. Data parallel and task parallel design patterns are supported without the need for explicit global synchronization control. An example is described of an H.264 encoder developed for a commercially available, massively parallel memory-network processor device.
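    The dataflow idea, encoding stages as tasks connected by message channels rather than a globally synchronized loop, can be sketched in ordinary Python; the queue-and-thread arrangement below is only a conceptual stand-in for the memory-network processor's declarative task graph, and the stage functions are stubs.

```python
import queue
import threading

def stage(name, fn, inbox, outbox):
    """Run one pipeline task: consume messages, transform, forward."""
    def run():
        while True:
            item = inbox.get()
            if item is None:                 # poison pill terminates the task
                if outbox is not None:
                    outbox.put(None)
                break
            if outbox is not None:
                outbox.put(fn(item))
    t = threading.Thread(target=run, name=name, daemon=True)
    t.start()
    return t

# Stub transforms standing in for motion estimation, transform/quantisation
# and entropy coding of one macroblock row.
q1, q2, q3 = queue.Queue(), queue.Queue(), queue.Queue()
threads = [
    stage("motion",  lambda row: ("mv",  row), q1, q2),
    stage("txq",     lambda mb:  ("coef", mb), q2, q3),
    stage("entropy", lambda c:   print("coded", c), q3, None),
]

for row in range(4):                          # feed four macroblock rows
    q1.put(row)
q1.put(None)
for t in threads:
    t.join()
```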

  18. Improving Software Quality and Management Through Use of Service Level Agreements

    DTIC Science & Technology

    2005-03-01

    ...many who believe that the quality of the development process is the best predictor of software product quality (Fenton). Repeatable software processes... reduced errors per KLOC for small projects (Fenton), and the quality management metric (QMM) (Machniak, Osmundson). There are also numerous IEEE 14... attention to cosmetic user interface issues and any problems that may arise with the prototype (Sawyer). The validation process is also another check...

  19. Testing Software Development Project Productivity Model

    NASA Astrophysics Data System (ADS)

    Lipkin, Ilya

    Software development is an increasingly influential factor in today's business environment, and a major issue affecting software development is how an organization estimates projects. If the organization underestimates cost, schedule, and quality requirements, the end results will not meet customer needs. On the other hand, if the organization overestimates these criteria, resources that could have been used more profitably will be wasted. There is no accurate model or measure available that can guide an organization in a quest for software development, with existing estimation models often underestimating software development efforts by as much as 500 to 600 percent. To address this issue, existing models usually are calibrated using local data with a small sample size, with resulting estimates not offering improved cost analysis. This study presents a conceptual model for accurately estimating software development, based on an extensive literature review and theoretical analysis based on Sociotechnical Systems (STS) theory. The conceptual model serves as a solution to bridge organizational and technological factors and is validated using an empirical dataset provided by the DoD. Practical implications of this study allow practitioners to concentrate on specific constructs of interest that provide the best value for the least amount of time. This study outlines key contributing constructs that are unique for Software Size E-SLOC, Man-hours Spent, and Quality of the Product, those constructs having the largest contribution to project productivity. This study discusses customer characteristics and provides a framework for a simplified project analysis for source selection evaluation and audit task reviews for the customers and suppliers. Theoretical contributions of this study provide an initial theory-based hypothesized project productivity model that can be used as a generic overall model across several application domains, such as IT, Command and Control, and Simulation. This research validates findings from previous work concerning software project productivity and leverages those results in this study. The hypothesized project productivity model provides statistical support and validation of expert opinions used by practitioners in the field of software project estimation.

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Peck, T; Sparkman, D; Storch, N

    ''The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I'' document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate time lines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. Additionally, the Overview provides steps to using the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I'' document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the ''LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I''.

  1. Computer graphics and the graphic artist

    NASA Technical Reports Server (NTRS)

    Taylor, N. L.; Fedors, E. G.; Pinelli, T. E.

    1985-01-01

    A centralized computer graphics system is being developed at the NASA Langley Research Center. This system was required to satisfy multiuser needs, ranging from presentation quality graphics prepared by a graphic artist to 16-mm movie simulations generated by engineers and scientists. While the major thrust of the central graphics system was directed toward engineering and scientific applications, hardware and software capabilities to support the graphic artists were integrated into the design. This paper briefly discusses the importance of computer graphics in research; the central graphics system in terms of systems, software, and hardware requirements; the application of computer graphics to graphic arts, discussed in terms of the requirements for a graphic arts workstation; and the problems encountered in applying computer graphics to the graphic arts. The paper concludes by presenting the status of the central graphics system.

  2. A data management approach to quality assurance using colorectal carcinoma reports from two institutions as a model.

    PubMed

    Sorge, John P; Harmon, C Reid; Sherman, Susan M; Baillie, E Eugene

    2005-07-01

    We used data management software to compare pathology report data concerning regional lymph node sampling for colorectal carcinoma from 2 institutions using different dissection methods. Data were retrieved from 2 disparate anatomic pathology information systems for all cases of colorectal carcinoma in 2003 involving the ascending and descending colon. Initial sorting of the data included overall lymph node recovery to assess differences between the dissection methods at the 2 institutions. Additional segregation of the data was used to challenge the application's capability of accurately addressing the complexity of the process. This software approach can be used to evaluate data from disparate computer systems, and we demonstrate how an automated function can enable institutions to compare internal pathologic assessment processes and the results of those comparisons. The use of this process has future implications for pathology quality assurance in other areas.
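    A rough sketch of the kind of cross-institution comparison described, aggregating lymph node recovery per institution and applying a simple two-sample test, is shown below; the data frame, column names and choice of test are assumptions, not the study's actual data management software.

```python
import pandas as pd
from scipy import stats

# Hypothetical export from two anatomic pathology information systems.
cases = pd.DataFrame({
    "institution": ["A"] * 4 + ["B"] * 4,
    "segment":     ["ascending", "descending"] * 4,
    "nodes_found": [14, 11, 18, 9, 22, 17, 20, 15],
})

# Overall lymph node recovery per institution (different dissection methods).
print(cases.groupby("institution")["nodes_found"].describe())

# Simple two-sample comparison; the real study's statistics may differ.
a = cases.loc[cases.institution == "A", "nodes_found"]
b = cases.loc[cases.institution == "B", "nodes_found"]
print(stats.mannwhitneyu(a, b))
```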

  3. Video Image Stabilization and Registration (VISAR) Software

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Two scientists at NASA Marshall Space Flight Center, atmospheric scientist Paul Meyer (left) and solar physicist Dr. David Hathaway, have developed promising new software, called Video Image Stabilization and Registration (VISAR), that may help law enforcement agencies to catch criminals by improving the quality of video recorded at crime scenes. VISAR stabilizes camera motion in the horizontal and vertical directions as well as rotation and zoom effects; produces clearer images of moving objects; smoothes jagged edges; enhances still images; and reduces video noise such as snow. VISAR could also have applications in medical and meteorological imaging. It could steady ultrasound images, which are infamous for their grainy, blurred quality, and it would be especially useful for studying tornadoes, tracking whirling objects and helping to determine a tornado's wind speed. This image shows two scientists reviewing an enhanced video image of a license plate taken from a moving automobile.

  4. Guidelines for software inspections

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Quality control inspections, software problem-finding procedures that provide defect removal as well as improvements in software functionality, maintenance, quality, and development and testing methodology, are discussed. The many side benefits include education, documentation, training, and scheduling.

  5. ECLIPSE, an Emerging Standardized Modular, Secure and Affordable Software Toolset in Support of Product Assurance, Quality Assurance and Project Management for the Entire European Space Industry (from Innovative SMEs to Primes and Institutions)

    NASA Astrophysics Data System (ADS)

    Bennetti, Andrea; Ansari, Salim; Dewhirst, Tori; Catanese, Giuseppe

    2010-08-01

    The development of satellites and ground systems (and the technologies that support them) is complex and demands a great deal of rigor in the management of both the information it relies upon and the information it generates via the performance of well established processes. To this extent, for the past fifteen years Sapienza Consulting has been supporting the European Space Agency (ESA) in the management of this information and has provided ESA with ECSS (European Cooperation for Space Standardization) Standards based Project Management (PM), Product Assurance (PA) and Quality Assurance (QA) software applications. In 2009 Sapienza recognised the need to modernize, standardize and integrate its core ECSS-based software tools into a single yet modularised suite of applications named ECLIPSE aimed at: • Fulfilling a wider range of historical and emerging requirements, • Providing a better experience for users, • Increasing the value of the information it collects and manages, • Lowering the cost of ownership and operation, • Increasing collaboration within and between space sector organizations, • Aiding in the performance of several PM, PA, QA, and configuration management tasks in adherence to ECSS standards. In this paper, Sapienza will first present the toolset, and a rationale for its development, describing and justifying its architecture and basic module composition. Having defined the toolset architecture, this paper will address the current status of the individual applications. A compliance assessment will be presented for each module in the toolset with respect to the ECSS standard it addresses. Lastly, experience from early industry and institutional users will be presented.

  6. Industrial applications of automated X-ray inspection

    NASA Astrophysics Data System (ADS)

    Shashishekhar, N.

    2015-03-01

    Many industries require that 100% of manufactured parts be X-ray inspected. Factors such as high production rates, focus on inspection quality, operator fatigue and inspection cost reduction translate to an increasing need for automating the inspection process. Automated X-ray inspection involves the use of image processing algorithms and computer software for analysis and interpretation of X-ray images. This paper presents industrial applications and illustrative case studies of automated X-ray inspection in areas such as automotive castings, fuel plates, air-bag inflators and tires. It is usually necessary to employ application-specific automated inspection strategies and techniques, since each application has unique characteristics and interpretation requirements.

  7. Rule-based statistical data mining agents for an e-commerce application

    NASA Astrophysics Data System (ADS)

    Qin, Yi; Zhang, Yan-Qing; King, K. N.; Sunderraman, Rajshekhar

    2003-03-01

    Intelligent data mining techniques have useful e-Business applications. Because an e-Commerce application is related to multiple domains such as statistical analysis, market competition, price comparison, profit improvement and personal preferences, this paper presents a hybrid knowledge-based e-Commerce system fusing intelligent techniques, statistical data mining, and personal information to enhance QoS (Quality of Service) of e-Commerce. A Web-based e-Commerce application software system, eDVD Web Shopping Center, is successfully implemented using Java servlets and an Oracle8i database server. Simulation results have shown that the hybrid intelligent e-Commerce system is able to make smart decisions for different customers.

  8. Assessment of the integration capability of system architectures from a complex and distributed software systems perspective

    NASA Astrophysics Data System (ADS)

    Leuchter, S.; Reinert, F.; Müller, W.

    2014-06-01

    Procurement and design of system architectures capable of network centric operations demand an assessment scheme in order to compare different alternative realizations. In this contribution an assessment method for system architectures targeted at the C4ISR domain is presented. The method addresses the integration capability of software systems from a complex and distributed software system perspective, focusing on communication, interfaces and software. The aim is to evaluate the capability to integrate a system or its functions within a system-of-systems network. This method uses approaches from software architecture quality assessment and applies them on the system architecture level. It features a specific goal tree of several dimensions that are relevant for enterprise integration. These dimensions have to be weighed against each other and totalized using methods from normative decision theory in order to reflect the intention of the particular enterprise integration effort. The indicators and measurements for many of the considered quality features rely on a model based view on systems, networks, and the enterprise. That means it is applicable to system-of-systems specifications based on enterprise architectural frameworks relying on defined meta-models or domain ontologies for defining views and viewpoints. In the defense context we use the NATO Architecture Framework (NAF) to ground respective system models. The proposed assessment method allows evaluating and comparing competing system designs regarding their future integration potential. It is a contribution to the system-of-systems engineering methodology.
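    The aggregation step, weighing the goal-tree dimensions against each other and totalizing them into a single score, can be illustrated with a simple additive value model; the dimension names, weights and scores below are invented for illustration and are not taken from the paper.

```python
# Illustrative additive value model over assessment dimensions.
def total_score(scores, weights):
    """Weighted sum of normalised dimension scores (each in 0..1)."""
    assert abs(sum(weights.values()) - 1.0) < 1e-9, "weights must total 1"
    return sum(weights[d] * scores[d] for d in weights)

weights = {"communication": 0.4, "interfaces": 0.35, "software": 0.25}

candidate_a = {"communication": 0.8, "interfaces": 0.6, "software": 0.7}
candidate_b = {"communication": 0.6, "interfaces": 0.9, "software": 0.5}

# Compare two competing system designs under the same weighting.
for name, scores in [("design A", candidate_a), ("design B", candidate_b)]:
    print(name, round(total_score(scores, weights), 3))
```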

  9. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities - A General Overview

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Hughes, Mark S.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Marshall, PeggL.; Duncan, Michael E.; Morris, Jon A.; Franzl, Richard W.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for a non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer which provides the transparency of the software application layers to underlying hardware regardless of test facility location and a flexible and easily accessible database. This presentation addresses system technical design, issues encountered, and the status of Stennis' development and deployment.

  10. [Development and practice evaluation of blood acid-base imbalance analysis software].

    PubMed

    Chen, Bo; Huang, Haiying; Zhou, Qiang; Peng, Shan; Jia, Hongyu; Ji, Tianxing

    2014-11-01

    To develop computer software for blood gas and acid-base imbalance analysis that systematically, rapidly, accurately and automatically determines the type of acid-base imbalance, and to evaluate its clinical application. Using the VBA programming language, a computer-aided diagnostic software for the judgment of acid-base balance was developed. The clinical data of 220 patients admitted to the Second Affiliated Hospital of Guangzhou Medical University were retrospectively analyzed. The arterial blood gas data [pH value, HCO(3)(-), arterial partial pressure of carbon dioxide (PaCO₂)] and electrolyte data (Na⁺ and Cl⁻) were collected. Data were entered into the software for acid-base imbalance judgment. At the same time, the type of acid-base imbalance was determined manually using the Henderson-Hasselbalch (H-H) compensation formula. The consistency of judgment results from software and manual calculation was evaluated, and the judgment time of the two methods was compared. The clinical diagnosis of the types of acid-base imbalance for the 220 patients: 65 cases were normal, 90 cases with simple type, mixed type in 41 cases, and triplex type in 24 cases. The accuracy of the judgment results of the normal and triplex types from the computer software compared with those calculated manually was 100%, the accuracy of the simple type judgment was 98.9% and 78.0% for the mixed type, and the total accuracy was 95.5%. The Kappa value of judgment result from software and manual judgment was 0.935, P=0.000. It was demonstrated that the consistency was very good. The time for the software to determine acid-base imbalances was significantly shorter than the manual judgment (seconds: 18.14 ± 3.80 vs. 43.79 ± 23.86, t=7.466, P=0.000), so the software method was much faster than the manual method. Software judgment can replace manual judgment with the characteristics of being rapid, accurate and convenient; it can improve the work efficiency and quality of clinical doctors and has great clinical application promotion value.
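    A highly simplified sketch of rule-based acid-base classification is shown below; it uses textbook reference ranges and Winter's compensation formula, handles only a few single disorders, and is not the VBA implementation evaluated in the study.

```python
def classify_acid_base(ph, paco2, hco3):
    """Very simplified primary-disorder classification with one compensation check.

    ph    : arterial pH
    paco2 : arterial CO2 partial pressure in mmHg
    hco3  : bicarbonate in mmol/L
    Textbook reference ranges and Winter's formula are used; real software
    must also handle mixed and triple disorders, which are omitted here.
    """
    if 7.35 <= ph <= 7.45 and 35 <= paco2 <= 45 and 22 <= hco3 <= 26:
        return "normal"
    if ph < 7.35 and hco3 < 22:
        expected_paco2 = 1.5 * hco3 + 8          # Winter's formula, +/- 2 mmHg
        if paco2 > expected_paco2 + 2:
            return "metabolic acidosis with concurrent respiratory acidosis"
        if paco2 < expected_paco2 - 2:
            return "metabolic acidosis with concurrent respiratory alkalosis"
        return "compensated metabolic acidosis"
    if ph < 7.35 and paco2 > 45:
        return "respiratory acidosis"
    if ph > 7.45 and hco3 > 26:
        return "metabolic alkalosis"
    if ph > 7.45 and paco2 < 35:
        return "respiratory alkalosis"
    return "indeterminate / mixed disorder"

# Example: expected output is "compensated metabolic acidosis".
print(classify_acid_base(ph=7.25, paco2=28, hco3=12))
```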

  11. Copyright and the Assurance of Quality Courseware.

    ERIC Educational Resources Information Center

    Helm, Virginia M.

    Issues related to the illegal copying or piracy of educational software in the schools and its potential effect on quality software availability are discussed. Copyright violation is examined as a reason some software producers may be abandoning the school software market. An explanation of what the copyright allows and prohibits in terms of…

  12. Knowledge-Based Aircraft Automation: Managers Guide on the use of Artificial Intelligence for Aircraft Automation and Verification and Validation Approach for a Neural-Based Flight Controller

    NASA Technical Reports Server (NTRS)

    Broderick, Ron

    1997-01-01

    The ultimate goal of this report was to integrate the powerful tools of artificial intelligence into the traditional process of software development. To maintain the US aerospace competitive advantage, traditional aerospace and software engineers need to more easily incorporate the technology of artificial intelligence into the advanced aerospace systems being designed today. The future goal was to transition artificial intelligence from an emerging technology to a standard technology that is considered early in the life cycle process to develop state-of-the-art aircraft automation systems. This report addressed the future goal in two ways. First, it provided a matrix that identified typical aircraft automation applications conducive to various artificial intelligence methods. The purpose of this matrix was to provide top-level guidance to managers contemplating the possible use of artificial intelligence in the development of aircraft automation. Second, the report provided a methodology to formally evaluate neural networks as part of the traditional process of software development. The matrix was developed by organizing the discipline of artificial intelligence into the following six methods: logical, object representation-based, distributed, uncertainty management, temporal and neurocomputing. Next, a study of existing aircraft automation applications that have been conducive to artificial intelligence implementation resulted in the following five categories: pilot-vehicle interface, system status and diagnosis, situation assessment, automatic flight planning, and aircraft flight control. The resulting matrix provided management guidance to understand artificial intelligence as it applied to aircraft automation. The approach taken to develop a methodology to formally evaluate neural networks as part of the software engineering life cycle was to start with the existing software quality assurance standards and to change these standards to include neural network development. The changes were to include evaluation tools that can be applied to neural networks at each phase of the software engineering life cycle. The result was a formal evaluation approach to increase the product quality of systems that use neural networks for their implementation.

  13. Description and pilot evaluation of the Metabolic Irregularities Narrowing down Device software: a case analysis of physician programming.

    PubMed

    Kashiouris, Markos G; Miljković, Miloš; Herasevich, Vitaly; Goldberg, Andrew D; Albrecht, Charles

    2015-01-01

    There is a gap between the abilities and the everyday applications of Computerized Decision Support Systems (CDSSs). This gap is further exacerbated by the different 'worlds' between the software designers and the clinician end-users. Software programmers often lack clinical experience whereas practicing physicians lack skills in design and engineering. Our primary objective was to evaluate the performance of Metabolic Irregularities Narrowing down Device (MIND) intelligent medical calculator and differential diagnosis software through end-user surveys and discuss the roles of CDSS in the inpatient setting. A tertiary care, teaching community hospital. Thirty-one responders answered the survey. Responders consisted of medical students, 24%; attending physicians, 16%, and residents, 60%. About 62.5% of the responders reported that MIND has the ability to potentially improve the quality of care, 20.8% were sure that MIND improves the quality of care, and only 4.2% of the responders felt that it does not improve the quality of care. Ninety-six percent of the responders felt that MIND definitely serves or has the potential to serve as a useful tool for medical students, and only 4% of the responders felt otherwise. Thirty-five percent of the responders rated the differential diagnosis list as excellent, 56% as good, 4% as fair, and 4% as poor. MIND is a suggesting, interpreting, alerting, and diagnosing CDSS with good performance and end-user satisfaction. In the era of the electronic medical record, the ongoing development of efficient CDSS platforms should be carefully considered by practicing physicians and institutions.

  14. A survey of Canadian medical physicists: software quality assurance of in-house software.

    PubMed

    Salomons, Greg J; Kelly, Diane

    2015-01-05

    This paper reports on a survey of medical physicists who write and use in-house written software as part of their professional work. The goal of the survey was to assess the extent of in-house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple-choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software-related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines.

  15. Impact of Requirements Quality on Project Success or Failure

    NASA Astrophysics Data System (ADS)

    Tamai, Tetsuo; Kamata, Mayumi Itakura

    We are interested in the relationship between the quality of the requirements specifications for software projects and the subsequent outcome of the projects. To examine this relationship, we investigated 32 projects started and completed between 2003 and 2005 by the software development division of a large company in Tokyo. The company has collected reliable data on requirements specification quality, as evaluated by software quality assurance teams, and overall project performance data relating to cost and time overruns. The data for requirements specification quality were first converted into a multiple-dimensional space, with each dimension corresponding to an item of the recommended structure for software requirements specifications (SRS) defined in IEEE Std. 830-1998. We applied various statistical analysis methods to the SRS quality data and project outcomes.
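    The conversion of SRS quality ratings into a vector keyed to IEEE Std 830-1998 items, and their relation to project outcome, can be sketched as below; the item names, scores and overrun figures are hypothetical and only illustrate the type of analysis.

```python
import pandas as pd

# Hypothetical per-project vectors: one column per IEEE Std 830-1998 SRS item
# (quality score 0..5 from the QA team) plus the observed cost overrun in %.
projects = pd.DataFrame({
    "functional_req":     [4, 2, 5, 3, 1],
    "external_interface": [3, 2, 4, 4, 2],
    "performance_req":    [4, 1, 5, 2, 1],
    "constraints":        [3, 3, 4, 2, 2],
    "cost_overrun_pct":   [5, 40, 0, 25, 60],
})

# Correlate each SRS-quality dimension with the project outcome.
corr = projects.drop(columns="cost_overrun_pct").corrwith(projects["cost_overrun_pct"])
print(corr.sort_values())
```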

  16. Knowledge base methodology: Methodology for first Engineering Script Language (ESL) knowledge base

    NASA Technical Reports Server (NTRS)

    Peeris, Kumar; Izygon, Michel E.

    1992-01-01

    The primary goal of reusing software components is that software can be developed faster, cheaper and with higher quality. However, reuse is not automatic and cannot just happen; it has to be carefully engineered. For example, a component needs to be easily understandable in order to be reused, and it also has to be malleable enough to fit into different applications. In fact, the software development process is deeply affected when reuse is applied. During component development, a serious effort has to be directed toward making components as reusable as possible. This implies defining reuse coding-style guidelines and applying them to any new component to be created as well as to any old component to be modified. These guidelines should point out the favorable reuse features and may apply to naming conventions, module size and cohesion, internal documentation, etc. During application development, effort is shifted from writing new code toward finding and eventually modifying existing pieces of code, then assembling them together. We see here that reuse is not free, and therefore has to be carefully managed.

  17. Validation of a Quality Management Metric

    DTIC Science & Technology

    2000-09-01

    quality management metric (QMM) was used to measure the performance of ten software managers on Department of Defense (DoD) software development programs. Informal verification and validation of the metric compared the QMM score to an overall program success score for the entire program and yielded positive correlation. The results of applying the QMM can be used to characterize the quality of software management and can serve as a template to improve software management performance. Future work includes further refining the QMM, applying the QMM scores to provide feedback

  18. Standard classification of software documentation

    NASA Technical Reports Server (NTRS)

    Tausworthe, R. C.

    1976-01-01

    General conceptual requirements are given for standard levels of documentation and for the application of these requirements to intended usages. These standards encourage the policy of producing only those forms of documentation that are needed and adequate for the purpose. Documentation standards are defined with respect to detail and format quality. Classes A through D range, in order, from the most definitive down to the least definitive, and categories 1 through 4 range, in order, from high-quality typeset down to handwritten material. Criteria for each of the classes and categories, as well as suggested selection guidelines for each, are given.

  19. An improved real time superresolution FPGA system

    NASA Astrophysics Data System (ADS)

    Lakshmi Narasimha, Pramod; Mudigoudar, Basavaraj; Yue, Zhanfeng; Topiwala, Pankaj

    2009-05-01

    In numerous computer vision applications, enhancing the quality and resolution of captured video can be critical. Acquired video is often grainy and low quality due to motion, transmission bottlenecks, etc. Postprocessing can enhance it. Superresolution greatly decreases camera jitter to deliver a smooth, stabilized, high quality video. In this paper, we extend previous work on a real-time superresolution application implemented in ASIC/FPGA hardware. A gradient based technique is used to register the frames at the sub-pixel level. Once we get the high resolution grid, we use an improved regularization technique in which the image is iteratively modified by applying back-projection to get a sharp and undistorted image. The algorithm was first tested in software and migrated to hardware, to achieve 320x240 -> 1280x960, about 30 fps, a stunning superresolution by 16X in total pixels. Various input parameters, such as size of input image, enlarging factor and the number of nearest neighbors, can be tuned conveniently by the user. We use a maximum word size of 32 bits to implement the algorithm in Matlab Simulink as well as in FPGA hardware, which gives us a fine balance between the number of bits and performance. The proposed system is robust and highly efficient. We have shown the performance improvement of the hardware superresolution over the software version (C code).
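    The registration-plus-back-projection loop can be sketched in a few lines of NumPy/SciPy; the blur model, interpolation orders and step size below are assumptions for illustration and do not reflect the fixed-point FPGA implementation described above.

```python
import numpy as np
from scipy import ndimage

def back_projection_sr(low_res_frames, shifts, scale=4, iterations=10, step=0.5):
    """Iterative back-projection superresolution sketch.

    low_res_frames : list of 2-D arrays (registered at sub-pixel 'shifts')
    shifts         : list of (dy, dx) offsets, e.g. from a gradient-based
                     sub-pixel registration step, in low-resolution pixels
    """
    hr = ndimage.zoom(low_res_frames[0].astype(float), scale, order=1)  # initial guess
    for _ in range(iterations):
        correction = np.zeros_like(hr)
        for frame, (dy, dx) in zip(low_res_frames, shifts):
            # Simulate observation: shift, blur and decimate the current HR estimate.
            simulated = ndimage.shift(hr, (dy * scale, dx * scale), order=1)
            simulated = ndimage.uniform_filter(simulated, size=scale)[::scale, ::scale]
            residual = frame - simulated
            # Back-project the residual onto the high-resolution grid.
            up = ndimage.zoom(residual, scale, order=1)
            correction += ndimage.shift(up, (-dy * scale, -dx * scale), order=1)
        hr += step * correction / len(low_res_frames)
    return hr
```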

  20. Developing information technology at the Medical Research Unit of the Albert Schweitzer Hospital in Lambaréné, Gabon.

    PubMed

    Dibacka, Paterne Lessihuin; Bounda, Yann; Nguema, Davy Ondo; Lell, Bertrand

    2010-03-01

    Information technology has become a key resource for research institutions, providing services such as hardware, software and network maintenance, as well as data management services. The IT department of the Medical Research Unit (MRU) of the Albert Schweitzer Hospital in Lambaréné, Gabon is a good example of how IT has developed at African Research Centres in recent years and demonstrates the scope of work that a modern research centre needs to offer. It illustrates the development in the past 15 years--from single computers maintained by investigators to the present situation of a group of well-trained local IT personnel who are in charge of a variety of hardware and software and who also develop applications for use in a research environment. Open source applications are particularly suited for these needs and various applications are used in data management, data analysis, accounting, administration and quality management.

  1. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  2. Formulation of consumables management models. Development approach for the mission planning processor working model

    NASA Technical Reports Server (NTRS)

    Connelly, L. C.

    1977-01-01

    The mission planning processor is a user oriented tool for consumables management and is part of the total consumables subsystem management concept. The approach to be used in developing a working model of the mission planning processor is documented. The approach includes top-down design, structured programming techniques, and application of NASA approved software development standards. This development approach: (1) promotes cost effective software development, (2) enhances the quality and reliability of the working model, (3) encourages the sharing of the working model through a standard approach, and (4) promotes portability of the working model to other computer systems.

  3. Digital enhancement of sub-quality bitemark photographs.

    PubMed

    Karazalus, C P; Palmbach, T T; Lee, H C

    2001-07-01

    Digital enhancement software was used to enhance bitemark photographs. This enhancement technique improved the resolution of the bitemark images. Lucis was the software program utilized in this study and case applications. First, this technique was applied on known bitemark images to evaluate the potential effectiveness of this digital enhancement method. Subsequently, Lucis was utilized on two separate unsolved cases involving enhancement of bitemark evidence. One case involved a severely beaten infant with a bitemark on the upper thigh. The second case involves a bitemark observed on the breast of a female sexual assault strangulation victim. In both cases, bitemark images were significantly improved after digital enhancement.

  4. A hybrid integrated services digital network-internet protocol solution for resident education.

    PubMed

    Erickson, Delnora; Greer, Lester; Belard, Arnaud; Tinnel, Brent; O'Connell, John

    2010-05-01

    The purpose of this study was to explore the effectiveness of incorporating Web-based application sharing of virtual medical simulation software within a multipoint video teleconference (VTC) as a training tool in graduate medical education. National Capital Consortium Radiation Oncology Residency Program resident and attending physicians participated in dosimetry teaching sessions held via VTC using Acrobat Connect application sharing. Residents at remote locations could take turns designing radiation treatments using standard three-dimensional planning software, whereas instructors gave immediate feedback and demonstrated proper techniques. Immediately after each dosimetry lesson, residents were asked to complete a survey that evaluated the effectiveness of the session. At the end of a 3-month trial of using Adobe Connect, residents completed a final survey that compared this teaching technology to the prior VTC-alone method. The mean difference from equality across all quality measures from the weekly survey was 0.8, where 0 indicated neither enhanced nor detracted from the learning experience and 1 indicated a minor enhancement in the learning experience. The mean difference from equality across all measures from the final survey comparing use of application sharing with VTC to VTC alone was 1.5, where 1 indicated slightly better and 2 indicated a somewhat better experience. The teaching efficacy of multipoint VTC is perceived by medical residents to be more effective when complemented by application-sharing software such as Adobe Acrobat Connect.

  5. An MS-DOS-based program for analyzing plutonium gamma-ray spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ruhter, W.D.; Buckley, W.M.

    1989-09-07

    A plutonium gamma-ray analysis system that operates on MS-DOS-based computers has been developed for the International Atomic Energy Agency (IAEA) to perform in-field analysis of plutonium gamma-ray spectra for plutonium isotopics. The program titled IAEAPU consists of three separate applications: a data-transfer application for transferring spectral data from a CICERO multichannel analyzer to a binary data file, a data-analysis application to analyze plutonium gamma-ray spectra, for plutonium isotopic ratios and weight percents of total plutonium, and a data-quality assurance application to check spectral data for proper data-acquisition setup and performance. Volume 3 contains the software listings for these applications.

  6. Commissioning and validation of COMPASS system for VMAT patient specific quality assurance

    NASA Astrophysics Data System (ADS)

    Pimthong, J.; Kakanaporn, C.; Tuntipumiamorn, L.; Laojunun, P.; Iampongpaiboon, P.

    2016-03-01

    Pre-treatment patient-specific quality assurance (QA) of advanced treatment techniques such as volumetric modulated arc therapy (VMAT) is one of the important QA tasks in radiotherapy, and a fast and reliable dosimetric device is required. The objective of this study is to commission and validate the performance of the COMPASS system for dose verification of the VMAT technique. The COMPASS system is composed of an array of ionization detectors (MatriXX) mounted to the gantry using a custom holder, and software for the analysis and visualization of QA results. We validated the COMPASS software for basic and advanced clinical applications. For the basic clinical study, simple open fields of various field sizes were validated in a homogeneous phantom. For the advanced clinical application, fifteen prostate and fifteen nasopharyngeal cancer VMAT plans were chosen for study. The treatment plans were measured with the MatriXX. The doses and dose-volume histograms (DVHs) reconstructed from the fluence measurements were compared to the TPS-calculated plans. In addition, the doses and DVHs computed using the collapsed cone convolution (CCC) algorithm were compared with Eclipse TPS plans calculated using the Analytical Anisotropic Algorithm (AAA), according to the dose specification of ICRU Report 83 for the PTV.
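    The DVH comparison underlying such a validation can be illustrated with a short sketch that computes a cumulative DVH from a dose grid and a structure mask; the toy dose grid and bin width are assumptions, and this is not the COMPASS reconstruction algorithm itself.

```python
import numpy as np

def cumulative_dvh(dose, mask, bin_width=0.1):
    """Cumulative dose-volume histogram for one structure.

    dose : 3-D dose grid in Gy
    mask : boolean array of the same shape selecting the structure (e.g. PTV)
    Returns (dose_bins, volume_percent), where volume_percent[i] is the
    fraction of the structure receiving at least dose_bins[i].
    """
    voxels = dose[mask]
    bins = np.arange(0.0, voxels.max() + bin_width, bin_width)
    volume = np.asarray([(voxels >= d).mean() * 100.0 for d in bins])
    return bins, volume

# Example with a toy dose grid; real use would load TPS and COMPASS dose matrices
# and overlay the two DVH curves per structure.
dose = np.random.normal(60.0, 2.0, (40, 40, 40))
ptv = np.zeros_like(dose, dtype=bool)
ptv[10:30, 10:30, 10:30] = True
bins, vol = cumulative_dvh(dose, ptv)
print("approximate D95:", bins[np.argmax(vol <= 95.0)], "Gy")
```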

  7. Identifying strengths and weaknesses of Quality Management Unit University of Sumatera Utara software using SCAMPI C

    NASA Astrophysics Data System (ADS)

    Gunawan, D.; Amalia, A.; Rahmat, R. F.; Muchtar, M. A.; Siregar, I.

    2018-02-01

    Identification of the software maturity level is a technique to determine the quality of software. By identifying the software maturity level, the weaknesses of the software can be observed, and the resulting recommendations can serve as a reference for future software maintenance and development. This paper discusses the software Capability Level (CL) with case studies on the Quality Management Unit (Unit Manajemen Mutu) of the University of Sumatera Utara (UMM-USU). This research utilized the Standard CMMI Appraisal Method for Process Improvement class C (SCAMPI C) model with continuous representation. This model focuses on activities for developing quality products and services. The observation was done in three process areas: Project Planning (PP), Project Monitoring and Control (PMC), and Requirements Management (REQM). According to the measurement of the software capability level for the UMM-USU software, the capability level for the observed process areas is in the range of CL1 to CL2. Project Planning (PP) is the only process area which reaches capability level 2, while PMC and REQM remain at CL1, the performed level. This research reveals several weaknesses of the existing UMM-USU software. Therefore, this study proposes several recommendations for UMM-USU to improve the capability level of the observed process areas.
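    A much-simplified illustration of rolling practice ratings up into a capability level per process area is given below; real SCAMPI C characterization rules are richer, and the process areas and ratings shown are invented for the example.

```python
# Simplified roll-up: a process area reaches capability level N only if all
# practices up to and including level N are at least "largely implemented".
RATING_OK = {"fully implemented", "largely implemented"}

def capability_level(practice_ratings):
    """practice_ratings: dict mapping capability level -> list of practice ratings."""
    level = 0
    for lvl in sorted(practice_ratings):
        if all(r in RATING_OK for r in practice_ratings[lvl]):
            level = lvl
        else:
            break
    return level

pp = {1: ["fully implemented", "largely implemented"],
      2: ["fully implemented", "fully implemented"]}
pmc = {1: ["largely implemented", "fully implemented"],
       2: ["partially implemented", "fully implemented"]}

print("PP  capability level:", capability_level(pp))    # 2
print("PMC capability level:", capability_level(pmc))   # 1
```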

  8. Framework for the quality assurance of 'omics technologies considering GLP requirements.

    PubMed

    Kauffmann, Hans-Martin; Kamp, Hennicke; Fuchs, Regine; Chorley, Brian N; Deferme, Lize; Ebbels, Timothy; Hackermüller, Jörg; Perdichizzi, Stefania; Poole, Alan; Sauer, Ursula G; Tollefsen, Knut E; Tralau, Tewes; Yauk, Carole; van Ravenzwaay, Ben

    2017-12-01

    'Omics technologies are gaining importance to support regulatory toxicity studies. Prerequisites for performing 'omics studies considering GLP principles were discussed at the European Centre for Ecotoxicology and Toxicology of Chemicals (ECETOC) Workshop Applying 'omics technologies in Chemical Risk Assessment. A GLP environment comprises a standard operating procedure system, proper pre-planning and documentation, and inspections of independent quality assurance staff. To prevent uncontrolled data changes, the raw data obtained in the respective 'omics data recording systems have to be specifically defined. Further requirements include transparent and reproducible data processing steps, and safe data storage and archiving procedures. The software for data recording and processing should be validated, and data changes should be traceable or disabled. GLP-compliant quality assurance of 'omics technologies appears feasible for many GLP requirements. However, challenges include (i) defining, storing, and archiving the raw data; (ii) transparent descriptions of data processing steps; (iii) software validation; and (iv) ensuring complete reproducibility of final results with respect to raw data. Nevertheless, 'omics studies can be supported by quality measures (e.g., GLP principles) to ensure quality control, reproducibility and traceability of experiments. This enables regulators to use 'omics data in a fit-for-purpose context, which enhances their applicability for risk assessment.

  9. Sapc - Application for Adapting Scanned Analogue Photographs to Use Them in Structure from Motion Technology

    NASA Astrophysics Data System (ADS)

    Salach, A.

    2017-05-01

    The documentary value of scanned analogue photographs is considerable; a large and rich collection of archival photographs is often the only source of information about the past of a given area. This paper presents a method for adapting scanned analogue photographs into a form suitable for use in Structure from Motion (SfM) technology. For this purpose, an automatic algorithm was developed and implemented in an application called SAPC (Scanned Aerial Photographs Correction), which transforms scans into a form with characteristics similar to images captured by a digital camera. The output images have the same principal point position and the same resolution in each photo, achieved by cutting out the black photo frame. Additionally, SAPC generates a binary image file that can mask the areas of fiducial marks. In the experimental section, scanned analogue photographs of Warsaw, captured in 1986, were used in two variants: unprocessed and processed with the SAPC application. The influence of the SAPC transformation on the quality of the spatial orientation of the photographs was analysed. Block adjustment through aerial triangulation was calculated using two SfM software products, Agisoft PhotoScan and Pix4D, and the results were compared with those obtained from professional photogrammetric software (Trimble Inpho). The author concluded that pre-processing in the SAPC application had a positive impact on the quality of block orientation of scanned analogue photographs in SfM technology.
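
    The abstract does not detail the algorithm, but the two operations it describes, cropping the black photo frame and masking the fiducial marks, could be sketched roughly as follows; the intensity threshold, the corner size, and the assumption of corner-located fiducials are made only for illustration and do not reproduce SAPC itself.

        import numpy as np

        def crop_black_frame(img, threshold=10):
            """Crop a greyscale scan to the bounding box of non-black pixels."""
            content = img > threshold                      # True where the photo is
            rows = np.where(content.any(axis=1))[0]
            cols = np.where(content.any(axis=0))[0]
            return img[rows[0]:rows[-1] + 1, cols[0]:cols[-1] + 1]

        def fiducial_mask(shape, corner_size=100):
            """Binary mask that blanks square regions around assumed corner fiducials."""
            mask = np.ones(shape, dtype=np.uint8)
            s = corner_size
            mask[:s, :s] = mask[:s, -s:] = mask[-s:, :s] = mask[-s:, -s:] = 0
            return mask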

  10. Software technology insertion: A study of success factors

    NASA Technical Reports Server (NTRS)

    Lydon, Tom

    1990-01-01

    Managing software development in large organizations has become increasingly difficult due to increasing technical complexity, stricter government standards, a shortage of experienced software engineers, competitive pressure for improved productivity and quality, the need to co-develop hardware and software together, and the rapid changes in both hardware and software technology. The 'software factory' approach to software development minimizes risks while maximizing productivity and quality through standardization, automation, and training. However, in practice, this approach is relatively inflexible when adopting new software technologies. The methods that a large multi-project software engineering organization can use to increase the likelihood of successful software technology insertion (STI), especially in a standardized engineering environment, are described.

  11. a Low-Cost and Lightweight 3d Interactive Real Estate-Purposed Indoor Virtual Reality Application

    NASA Astrophysics Data System (ADS)

    Ozacar, K.; Ortakci, Y.; Kahraman, I.; Durgut, R.; Karas, I. R.

    2017-11-01

    Interactive 3D architectural indoor design has become more popular since it began to benefit from Virtual Reality (VR) technologies. VR brings computer-generated 3D content to real-life scale and enables users to observe, and directly modify, immersive indoor environments. This makes it possible for buyers to purchase a property off-the-plan, more cheaply, on the basis of virtual models. Instead of showing a property through 2D plans or renders, the interior architecture of an unbuilt property on sale is visualized beforehand, so that investors have the impression of being in the physical building. However, current applications either use highly resource-consuming software, are non-interactive, or require a specialist to create such environments. In this study, we created a low-cost, high-quality, fully interactive real estate-purposed VR application that provides a realistic interior architecture of the property using free and lightweight software: Sweet Home 3D and Unity. A preliminary study showed that participants generally liked the proposed real estate-purposed VR application and that it satisfied the expectations of the property buyers.

  12. Requirements model for an e-Health awareness portal

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Nawi, Mohd Nasrun M.

    2016-08-01

    Requirements engineering is at the heart and foundation of the software engineering process. Poor-quality requirements inevitably lead to poor-quality software solutions, and poor requirements modeling is tantamount to designing a poor-quality product. Quality-assured requirements development therefore goes hand in hand with usable products and gives the software product the quality it demands. In the light of the foregoing, the requirements for an e-Ebola Awareness Portal were modeled with careful attention to these software engineering concerns. The requirements for the e-Health Awareness Portal are modeled as a contribution to the fight against Ebola and to the fulfilment of the United Nations' Millennium Development Goal No. 6. In this study, the requirements were modeled using the UML 2.0 modeling technique.

  13. Software quality assurance plan for GCS

    NASA Technical Reports Server (NTRS)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  14. A systematic composite service design modeling method using graph-based theory.

    PubMed

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, in which the candidate services, composite services, operations, and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems.
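
    A minimal sketch of representing service dependencies as a directed graph and reading off simple coupling figures is given below; the service names and the fan-in/fan-out metrics are illustrative assumptions rather than the ComSDM formalism itself.

        # Directed dependency graph of composite/atomic services (names are illustrative).
        from collections import defaultdict

        dependencies = {
            "SmartHomeController": ["Lighting", "Heating", "Security"],
            "Security": ["Camera", "DoorLock"],
        }

        graph = defaultdict(list, dependencies)
        fan_out = {s: len(t) for s, t in graph.items()}   # outgoing coupling per service
        fan_in = defaultdict(int)
        for targets in graph.values():
            for t in targets:
                fan_in[t] += 1                            # incoming coupling per service

        print("fan-out:", fan_out)
        print("fan-in:", dict(fan_in))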

  15. A Systematic Composite Service Design Modeling Method Using Graph-Based Theory

    PubMed Central

    Elhag, Arafat Abdulgader Mohammed; Mohamad, Radziah; Aziz, Muhammad Waqar; Zeshan, Furkh

    2015-01-01

    Composite service design modeling is an essential process of the service-oriented software development life cycle, in which the candidate services, composite services, operations, and their dependencies are required to be identified and specified before their design. However, a systematic service-oriented design modeling method for composite services is still in its infancy, as most of the existing approaches provide the modeling of atomic services only. For these reasons, a new method (ComSDM) is proposed in this work for modeling the concept of service-oriented design to increase the reusability and decrease the complexity of the system while keeping the service composition considerations in mind. Furthermore, the ComSDM method provides a mathematical representation of the components of service-oriented design using graph-based theory to facilitate design quality measurement. To demonstrate that the ComSDM method is also suitable for composite service design modeling of distributed embedded real-time systems along with enterprise software development, it is implemented in the case study of a smart home. The results of the case study not only check the applicability of ComSDM, but can also be used to validate its complexity and reusability. This also guides future research towards design quality measurement, such as using the ComSDM method to measure the quality of composite service design in service-oriented software systems. PMID:25928358

  16. Detailed design of a Ride Quality Augmentation System for commuter aircraft

    NASA Technical Reports Server (NTRS)

    Suikat, Reiner; Donaldson, Kent E.; Downing, David R.

    1989-01-01

    The design of a Ride Quality Augmentation System (RQAS) for commuter aircraft is documented. The RQAS is designed for a Cessna 402B, an 8-passenger twin-engine propeller aircraft representative of this class. The purpose of the RQAS is the reduction of the vertical and lateral accelerations of the aircraft due to atmospheric turbulence by the application of active control. The detailed design of the hardware (the aircraft modifications, the Ride Quality Instrumentation System (RQIS), and the required computer software) is examined. The aircraft modifications, consisting of the dedicated control surfaces and the hydraulic actuation system, were designed at Cessna Aircraft by the Kansas University Flight Research Laboratory. The instrumentation system, which consists of the sensor package, the flight computer, a Data Acquisition System, and the pilot and test engineer control panels, was designed by NASA-Langley. The overall system design and the design of the software, both for the flight control algorithms and for ground system checkout, are detailed. The system performance is predicted from linear simulation results and from power spectral densities of the aircraft response to a Dryden gust. The results indicate that reductions in both vertical and lateral accelerations are achievable.
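
    For reference, the vertical-gust power spectral density of the Dryden turbulence model mentioned above has a standard form (conventions vary by a constant factor), written here in terms of the gust intensity, scale length, and spatial frequency:

        \Phi_w(\Omega) = \sigma_w^2 \, \frac{L_w}{\pi} \, \frac{1 + 3 (L_w \Omega)^2}{\left[ 1 + (L_w \Omega)^2 \right]^2}

    where \sigma_w is the root-mean-square gust intensity, L_w the turbulence scale length, and \Omega the spatial frequency.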

  17. Application of neural networks to software quality modeling of a very large telecommunications system.

    PubMed

    Khoshgoftaar, T M; Allen, E B; Hudepohl, J P; Aud, S J

    1997-01-01

    Society relies on telecommunications to such an extent that telecommunications software must have high reliability. Enhanced measurement for early risk assessment of latent defects (EMERALD) is a joint project of Nortel and Bell Canada for improving the reliability of telecommunications software products. This paper reports a case study of neural-network modeling techniques developed for the EMERALD system. The resulting neural network is currently in the prototype testing phase at Nortel. Neural-network models can be used to identify fault-prone modules for extra attention early in development, and thus reduce the risk of operational problems with those modules. We modeled a subset of modules representing over seven million lines of code from a very large telecommunications software system. The set consisted of those modules reused with changes from the previous release. The dependent variable was membership in the class of fault-prone modules. The independent variables were principal components of nine measures of software design attributes. We compared the neural-network model with a nonparametric discriminant model and found the neural-network model had better predictive accuracy.
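
    A minimal sketch of the modeling pattern described, principal components of design measures feeding a classifier of fault-prone modules, is given below; the synthetic data, the use of scikit-learn, and the network size are assumptions of the sketch and do not reproduce the EMERALD models.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline
        from sklearn.preprocessing import StandardScaler

        # Hypothetical data: nine design measures per module, label 1 = fault-prone.
        rng = np.random.default_rng(0)
        X = rng.normal(size=(500, 9))
        y = (X[:, 0] + 0.5 * X[:, 3] + rng.normal(size=500) > 1).astype(int)

        # Standardize, project onto principal components, then classify.
        model = make_pipeline(StandardScaler(), PCA(n_components=5),
                              MLPClassifier(hidden_layer_sizes=(8,), max_iter=2000,
                                            random_state=0))
        model.fit(X[:400], y[:400])
        print("holdout accuracy:", model.score(X[400:], y[400:]))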

  18. Defect measurement and analysis of JPL ground software: a case study

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Spagnuolo, John N., Jr.

    2004-01-01

    Ground software systems at JPL must meet high assurance standards while remaining on schedule due to relatively immovable launch dates for the spacecraft that such systems will control. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data from JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. The predictive models will quantitatively define typical trends for JPL ground systems as well as Critical Discriminators (CDs) that explain atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. Thus, these CDs will assist in planning for the degree to which software quality activities for a project are likely to deviate from the norm for JPL ground systems, based on past experience across the laboratory.

  19. Chapter 16: text mining for translational bioinformatics.

    PubMed

    Cohen, K Bretonnel; Hunter, Lawrence E

    2013-04-01

    Text mining for translational bioinformatics is a new field with tremendous research potential. It is a subfield of biomedical natural language processing that concerns itself directly with the problem of relating basic biomedical research to clinical practice, and vice versa. Applications of text mining fall both into the category of T1 translational research-translating basic science results into new interventions-and T2 translational research, or translational research for public health. Potential use cases include better phenotyping of research subjects, and pharmacogenomic research. A variety of methods for evaluating text mining applications exist, including corpora, structured test suites, and post hoc judging. Two basic principles of linguistic structure are relevant for building text mining applications. One is that linguistic structure consists of multiple levels. The other is that every level of linguistic structure is characterized by ambiguity. There are two basic approaches to text mining: rule-based, also known as knowledge-based; and machine-learning-based, also known as statistical. Many systems are hybrids of the two approaches. Shared tasks have had a strong effect on the direction of the field. Like all translational bioinformatics software, text mining software for translational bioinformatics can be considered health-critical and should be subject to the strictest standards of quality assurance and software testing.

  20. BioSig: The Free and Open Source Software Library for Biomedical Signal Processing

    PubMed Central

    Vidaurre, Carmen; Sander, Tilmann H.; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals. PMID:21437227

  1. BioSig: the free and open source software library for biomedical signal processing.

    PubMed

    Vidaurre, Carmen; Sander, Tilmann H; Schlögl, Alois

    2011-01-01

    BioSig is an open source software library for biomedical signal processing. The aim of the BioSig project is to foster research in biomedical signal processing by providing free and open source software tools for many different application areas. Some of the areas where BioSig can be employed are neuroinformatics, brain-computer interfaces, neurophysiology, psychology, cardiovascular systems, and sleep research. Moreover, the analysis of biosignals such as the electroencephalogram (EEG), electrocorticogram (ECoG), electrocardiogram (ECG), electrooculogram (EOG), electromyogram (EMG), or respiration signals is a very relevant element of the BioSig project. Specifically, BioSig provides solutions for data acquisition, artifact processing, quality control, feature extraction, classification, modeling, and data visualization, to name a few. In this paper, we highlight several methods to help students and researchers to work more efficiently with biomedical signals.

  2. Development and case study of a science-based software platform to support policy making on air quality.

    PubMed

    Zhu, Yun; Lao, Yanwen; Jang, Carey; Lin, Chen-Jen; Xing, Jia; Wang, Shuxiao; Fu, Joshua S; Deng, Shuang; Xie, Junping; Long, Shicheng

    2015-01-01

    This article describes the development and implementation of a novel software platform that supports real-time, science-based policy making on air quality through a user-friendly interface. The software, RSM-VAT, uses a response surface modeling (RSM) methodology and serves as a visualization and analysis tool (VAT) for three-dimensional air quality data obtained by atmospheric models. The software features a number of powerful and intuitive data visualization functions for illustrating the complex nonlinear relationship between emission reductions and air quality benefits. A case study of the contiguous U.S. demonstrates that the enhanced RSM-VAT is capable of reproducing the air quality model results with a Normalized Mean Bias of <2% and of assisting air quality policy making in near real time. Copyright © 2014. Published by Elsevier B.V.
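
    The Normalized Mean Bias quoted above is conventionally defined over paired model predictions and observations, expressed as a percentage:

        \mathrm{NMB} = \frac{\sum_{i=1}^{N} (P_i - O_i)}{\sum_{i=1}^{N} O_i} \times 100\%

    where P_i are the platform's reproduced values and O_i the corresponding air quality model results.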

  3. Space Shuttle flying qualities and flight control system assessment study, phase 2

    NASA Technical Reports Server (NTRS)

    Myers, T. T.; Johnston, D. E.; Mcruer, D. T.

    1983-01-01

    A program of flying qualities experiments as part of the Orbiter Experiments Program (OEX) is defined. Phase 1, published as CR-170391, reviewed flying qualities criteria and shuttle data. The review of applicable experimental and shuttle data to further define the OEX plan is continued. An unconventional feature of this approach is the use of pilot strategy model identification to relate flight and simulator results. Instrumentation, software, and data analysis techniques for pilot model measurements are examined. The relationship between shuttle characteristics and superaugmented aircraft is established. STS flights 1 through 4 are reviewed from the point of view of flying qualities. A preliminary plan for a coordinated program of inflight and simulator research is presented.

  4. SU-E-T-254: Development of a HDR-BT QA Tool for Verification of Source Position with Oncentra Applicator Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumazaki, Y; Miyaura, K; Hirai, R

    2015-06-15

    Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions obtained with this tool. Methods: We developed an HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30x30x3 cm cuboid with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; the positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and the center position of the two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using the in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. The applicator offset, determined as the distance from the applicator tips at the first source position in the treatment planning system, was accurate. Conclusion: The source position accuracy of the applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as an HDR-BT QA tool for verification of source position with Oncentra applicator modeling.
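
    The source-position check described, the distance from the exposed source spot to the centre of a radiopaque marker pair, could be computed along the following lines; the coordinates and names are illustrative placeholders, not the in-house analysis software.

        import numpy as np

        def source_offset(exposed_xy, marker_a_xy, marker_b_xy):
            """Distance (mm) from the exposed source spot to the midpoint of a marker pair."""
            midpoint = (np.asarray(marker_a_xy) + np.asarray(marker_b_xy)) / 2.0
            return float(np.linalg.norm(np.asarray(exposed_xy) - midpoint))

        # Hypothetical film coordinates in mm.
        print(source_offset((6.4, 0.2), (6.0, 10.0), (6.0, -10.0)))  # -> ~0.45 mm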

  5. CProb: a computational tool for conducting conditional probability analysis.

    PubMed

    Hollister, Jeffrey W; Walker, Henry A; Paul, John F

    2008-01-01

    Conditional probability is the probability of observing one event given that another event has occurred. In an environmental context, conditional probability helps to assess the association between an environmental contaminant (i.e., the stressor) and the ecological condition of a resource (i.e., the response). These analyses, when combined with controlled experiments and other methodologies, show great promise in evaluating ecological conditions from observational data and in defining water quality and other environmental criteria. Current applications of conditional probability analysis (CPA) are largely done via scripts or cumbersome spreadsheet routines, which may prove daunting to end-users and do not provide access to the underlying scripts. Combining spreadsheets with scripts eases computation through a familiar interface (i.e., Microsoft Excel) and creates a transparent process through full accessibility to the scripts. With this in mind, we developed a software application, CProb, as an Add-in for Microsoft Excel with R, R(D)com Server, and Visual Basic for Applications. CProb calculates and plots scatterplots, empirical cumulative distribution functions, and conditional probability. In this short communication, we describe CPA, our motivation for developing a CPA tool, and our implementation of CPA as a Microsoft Excel Add-in. Further, we illustrate the use of our software with two examples: a water quality example and a landscape example. CProb is freely available for download at http://www.epa.gov/emap/nca/html/regions/cprob.
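
    The core CPA quantity, the probability of a poor ecological response given that a stressor exceeds a threshold, can be sketched as follows; the synthetic data, thresholds, and variable names are assumptions for illustration and are unrelated to the CProb implementation.

        import numpy as np

        def conditional_probability(stressor, response, stressor_cut, response_cut):
            """P(response >= response_cut | stressor >= stressor_cut) from paired observations."""
            exceeds = stressor >= stressor_cut
            if not exceeds.any():
                return np.nan
            return float((response[exceeds] >= response_cut).mean())

        # Hypothetical paired water-quality observations.
        rng = np.random.default_rng(1)
        stressor = rng.lognormal(size=200)
        response = 0.5 * stressor + rng.normal(scale=0.5, size=200)
        print(conditional_probability(stressor, response, stressor_cut=1.0, response_cut=1.0))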

  6. Enhanced Endometriosis Archiving Software (ENEAS): An Application for Storing, Retrieving, Comparing, and Sharing Data of Patients Affected by Endometriosis Integrated in the Daily Practice.

    PubMed

    Centini, Gabriele; Zannoni, Letizia; Lazzeri, Lucia; Buiarelli, Paolo; Limatola, Gianluca; Petraglia, Felice; Seracchioli, Renato; Zupi, Errico

    Endometriosis is a chronic benign disease affecting women of fertile age, associated with pelvic pain and subfertility, and often with a negative impact on quality of life. Five gynecologists skilled in endometriosis management and two information technology consultants competent in data management and website administration were enlisted, through a series of meetings, to create an endometriosis databank known as ENEAS (Enhanced Endometriosis Archiving Software). This processing system allows users to store, retrieve, compare, and correlate all data collected in conjunction with different Italian endometriosis centers, with the collective objective of obtaining homogeneous data for a large population sample. ENEAS is a web-oriented application that can be based on any open-source database installed locally on a server; it allows the collection of data on approximately 700 items, providing standardized and homogeneous data for comparison and analysis. ENEAS is capable of generating a sheet incorporating all data on the management of endometriosis that is both accurate and suitable for each individual patient. ENEAS is an effective and universal web application that encourages providers in the field of endometriosis to use a common language and share data to standardize medical and surgical treatment, with the main benefits being improved patient satisfaction and quality of life. Copyright © 2016 AAGL. Published by Elsevier Inc. All rights reserved.

  7. Evaluating the effect of a third-party implementation of resolution recovery on the quality of SPECT bone scan imaging using visual grading regression.

    PubMed

    Hay, Peter D; Smith, Julie; O'Connor, Richard A

    2016-02-01

    The aim of this study was to evaluate the benefits to SPECT bone scan image quality of applying resolution recovery (RR) during image reconstruction using software provided by a third-party supplier. Bone SPECT data from 90 clinical studies were reconstructed retrospectively using software supplied independently of the gamma camera manufacturer. The current clinical datasets contain 120×10 s projections and are reconstructed using an iterative method with a Butterworth postfilter. Five further reconstructions were created with the following characteristics: 10 s projections with a Butterworth postfilter (to assess intraobserver variation); 10 s projections with a Gaussian postfilter with and without RR; and 5 s projections with a Gaussian postfilter with and without RR. Two expert observers were asked to rate image quality on a five-point scale relative to our current clinical reconstruction. Datasets were anonymized and presented in random order. The benefits of RR on image scores were evaluated using ordinal logistic regression (visual grading regression). The application of RR during reconstruction increased the probability that both observers scored image quality as better than the current clinical reconstruction, even where the dataset contained half the normal counts. Type of reconstruction and observer were both statistically significant variables in the ordinal logistic regression model. Visual grading regression was found to be a useful method for validating the local introduction of technological developments in nuclear medicine imaging. RR, as implemented by the independent software supplier, improved bone SPECT image quality when applied during image reconstruction. In the majority of clinical cases, acquisition times for bone SPECT intended for localization purposes can safely be halved (from 10 s to 5 s projections) when RR is applied.
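
    Visual grading regression is typically a cumulative-logit (proportional-odds) model; in one common parameterization, with ordinal image score Y, covariates x coding reconstruction type and observer, cut-points theta_j, and coefficients beta, it can be written as:

        \log \frac{P(Y \le j \mid \mathbf{x})}{1 - P(Y \le j \mid \mathbf{x})} = \theta_j - \boldsymbol{\beta}^{\mathsf{T}} \mathbf{x}

    so that a positive coefficient for the RR covariate shifts probability mass towards higher image-quality scores.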

  8. High-quality remote interactive imaging in the operating theatre

    NASA Astrophysics Data System (ADS)

    Grimstead, Ian J.; Avis, Nick J.; Evans, Peter L.; Bocca, Alan

    2009-02-01

    We present a high-quality display system that enables the remote access within an operating theatre of high-end medical imaging and surgical planning software. Currently, surgeons often use printouts from such software for reference during surgery; our system enables surgeons to access and review patient data in a sterile environment, viewing real-time renderings of MRI & CT data as required. Once calibrated, our system displays shades of grey in Operating Room lighting conditions (removing any gamma correction artefacts). Our system does not require any expensive display hardware, is unobtrusive to the remote workstation and works with any application without requiring additional software licenses. To extend the native 256 levels of grey supported by a standard LCD monitor, we have used the concept of "PseudoGrey" where slightly off-white shades of grey are used to extend the intensity range from 256 to 1,785 shades of grey. Remote access is facilitated by a customized version of UltraVNC, which corrects remote shades of grey for display in the Operating Room. The system is successfully deployed at Morriston Hospital, Swansea, UK, and is in daily use during Maxillofacial surgery. More formal user trials and quantitative assessments are being planned for the future.
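
    One common way to realize the PseudoGrey idea is to perturb individual colour channels of each grey level by one step and order the resulting near-grey triplets by luminance; the sketch below, using the Rec. 601 luma weights as an assumed luminance model, illustrates the idea and is not the authors' exact implementation.

        from itertools import combinations

        LUMA = (0.299, 0.587, 0.114)  # Rec. 601 luma weights (assumed luminance model)

        def pseudogrey_ramp():
            shades = set()
            for g in range(255):
                shades.add((g, g, g))
                # Bump one or two channels by +1 (bumping all three equals the next level).
                for r in (1, 2):
                    for chans in combinations(range(3), r):
                        rgb = [g, g, g]
                        for ch in chans:
                            rgb[ch] += 1
                        shades.add(tuple(rgb))
            shades.add((255, 255, 255))
            # Order by luminance to obtain a monotone extended grey ramp.
            return sorted(shades, key=lambda rgb: sum(w * c for w, c in zip(LUMA, rgb)))

        ramp = pseudogrey_ramp()
        print(len(ramp))   # 1786 near-grey shades in this variant, of the order of the ~1,785 quoted above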

  9. Use of a quality improvement tool, the prioritization matrix, to identify and prioritize triage software algorithm enhancement.

    PubMed

    North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg

    2007-10-11

    Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.

  10. The National Library of Medicine Pill Image Recognition Challenge: An Initial Report.

    PubMed

    Yaniv, Ziv; Faruque, Jessica; Howe, Sally; Dunn, Kathel; Sharlip, David; Bond, Andrew; Perillan, Pablo; Bodenreider, Olivier; Ackerman, Michael J; Yoo, Terry S

    2016-10-01

    In January 2016 the U.S. National Library of Medicine announced a challenge competition calling for the development and discovery of high-quality algorithms and software that rank how well consumer images of prescription pills match reference images of pills in its authoritative RxIMAGE collection. This challenge was motivated by the need for healthcare personnel and the general public to easily identify unknown prescription pills. Potential benefits of this capability include confirmation of a pill in settings where the documentation and medication have been separated, such as in a disaster or emergency, and confirmation of a pill when the prescribed medication changes from brand to generic, or when the shape and color of the pill change for any other reason. The data for the competition consisted of two types of images: high-quality macro photographs (reference images) and consumer-quality photographs of the kind we expect users of a proposed application to acquire. A training dataset consisting of 2000 reference images and 5000 corresponding consumer-quality images acquired from 1000 pills was provided to challenge participants. A second dataset acquired from 1000 pills with similar distributions of shape and color was reserved as a segregated testing set. Challenge submissions were required to produce a ranking of the reference images, given a consumer-quality image as input. The winning teams were determined using the mean average precision quality metric, with the three winners obtaining mean average precision scores of 0.27, 0.09, and 0.08. In the retrieval results, the correct image was amongst the top five ranked images 43%, 12%, and 11% of the time, out of 5000 query/consumer images. This is a promising initial step towards development of an NLM software system and application-programming interface facilitating pill identification. The training dataset will continue to be freely available online at: http://pir.nlm.nih.gov/challenge/submission.html.
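
    With exactly one correct reference pill per consumer image, average precision reduces to the reciprocal rank of the correct image, and mean average precision is its mean over queries; the sketch below uses hypothetical rankings and IDs purely for illustration.

        def average_precision(ranked_ids, relevant_id):
            """AP for a query with a single relevant item: 1 / rank of that item (0 if absent)."""
            for rank, item in enumerate(ranked_ids, start=1):
                if item == relevant_id:
                    return 1.0 / rank
            return 0.0

        def mean_average_precision(rankings, truths):
            return sum(average_precision(r, t) for r, t in zip(rankings, truths)) / len(truths)

        # Hypothetical rankings of reference-image IDs for three consumer photos.
        rankings = [["p7", "p2", "p9"], ["p4", "p1", "p3"], ["p5", "p8", "p6"]]
        truths = ["p2", "p1", "p6"]
        print(mean_average_precision(rankings, truths))   # (1/2 + 1/2 + 1/3) / 3 ≈ 0.44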

  11. Method for automation of tool preproduction

    NASA Astrophysics Data System (ADS)

    Rychkov, D. A.; Yanyushkin, A. S.; Lobanov, D. V.; Arkhipov, P. V.

    2018-03-01

    The primary objective of tool production is the creation or selection of a tool design that secures high process efficiency, tool availability, and quality of the machined surfaces with a minimum of means and resources. Selecting the appropriate tool from a set of design variants takes specialists engaged in tool preparation a great deal of time. Software has been developed to solve this problem: it helps to create, systematize, and comparatively analyse tool designs in order to identify the rational variant under given production conditions. Systematization and selection of the rational tool design are carried out in accordance with the developed modeling technology and comparative design analysis. Applying the software makes it possible to reduce the design period by 80-85% and to obtain significant annual savings.

  12. Marathon: An Open Source Software Library for the Analysis of Markov-Chain Monte Carlo Algorithms

    PubMed Central

    Rechner, Steffen; Berger, Annabell

    2016-01-01

    We present the software library marathon, which is designed to support the analysis of sampling algorithms that are based on the Markov-Chain Monte Carlo principle. The main application of this library is the computation of properties of so-called state graphs, which represent the structure of Markov chains. We demonstrate applications and the usefulness of marathon by investigating the quality of several bounding methods on four well-known Markov chains for sampling perfect matchings and bipartite graphs. In a set of experiments, we compute the total mixing time and several of its bounds for a large number of input instances. We find that the upper bound obtained by the famous canonical path method is often several orders of magnitude larger than the total mixing time and deteriorates with growing input size. In contrast, the spectral bound is found to be a precise approximation of the total mixing time. PMID:26824442
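
    For reference, the spectral bound on the total mixing time discussed above is the standard estimate in terms of the second-largest eigenvalue modulus of the chain and its smallest stationary probability:

        t_{\mathrm{mix}}(\varepsilon) \le \frac{1}{1 - \lambda_{*}} \, \ln\!\left(\frac{1}{\varepsilon \, \pi_{\min}}\right)

    where \lambda_{*} is the second-largest eigenvalue modulus and \pi_{\min} the smallest stationary probability of any state.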

  13. Radiosurgery planning supported by the GEMSS grid.

    PubMed

    Fenner, J W; Mehrem, R A; Ganesan, V; Riley, S; Middleton, S E; Potter, K; Walton, L

    2005-01-01

    GEMSS (Grid Enabled Medical Simulation Services IST-2001-37153) is an EU project funded to provide a test bed for Grid-enabled health applications. Its purpose is evaluation of Grid computing in the health sector. The health context imposes particular constraints on Grid infrastructure design, and it is this that has driven the feature set of the middleware. In addition to security, the time critical nature of health applications is accommodated by a Quality of Service component, and support for a well defined business model is also included. This paper documents experience of a GEMSS compliant radiosurgery application running within the Medical Physics department at the Royal Hallamshire Hospital in the UK. An outline of the Grid-enabled RAPT radiosurgery application is presented and preliminary experience of its use in the hospital environment is reported. The performance of the software is compared against GammaPlan (an industry standard) and advantages/disadvantages are highlighted. The RAPT software relies on features of the GEMSS middleware that are integral to the success of this application, and together they provide a glimpse of an enabling technology that can impact upon patient management in the 21st century.

  14. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process was speculated to have measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data was extracted and examined from nearly 50 software development projects. All were related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product as exemplified by the subject NASA data was examined. Based upon the results, a number of computer resource-related implications are provided.

  15. Cone beam computed tomography in veterinary dentistry.

    PubMed

    Van Thielen, Bert; Siguenza, Francis; Hassan, Bassam

    2012-01-01

    The purpose of this study was to assess the feasibility of cone beam computed tomography (CBCT) in imaging dogs and cats for diagnostic dental veterinary applications. CBCT scans of heads of six dogs and two cats were made. Dental panoramic and multi-planar reformatted (MPR) para-sagittal reconstructions were created using specialized software. Image quality and visibility of anatomical landmarks were subjectively assessed by two observers. Good image quality was obtained for the MPR para-sagittal reconstructions through multiple teeth. The image quality of the panoramic reconstructions of dogs was moderate while the panoramic reconstructions of cats were poor since the images were associated with an increased noise level. Segmental panoramic reconstructions of the mouth seem to be useful for studying the dental anatomy especially in dogs. The results of this study using human dental CBCT technology demonstrate the potential of this scanning technology in veterinary medicine. Unfortunately, the moderate image quality obtained with the CBCT technique reported here seems to be inferior to the diagnostic image quality obtained from 2-dimensional dental radiographs. Further research is required to optimize scanning and reconstruction protocols for veterinary applications.

  16. Global land cover mapping and characterization: present situation and future research priorities

    USGS Publications Warehouse

    Giri, Chandra

    2005-01-01

    The availability and accessibility of global land cover data sets plays an important role in many global change studies. The importance of such science-based information is also reflected in a number of international, regional, and national projects and programs. Recent developments in earth observing satellite technology, information technology, computer hardware and software, and infrastructure have helped to develop better-quality land cover data sets. As a result, such data sets are increasingly becoming available, the user base is ever widening, application areas have been expanding, and the potential of many other applications is enormous. Yet, we are far from producing high-quality global land cover data sets. This paper examines the progress in the development of digital global land cover data, their availability, and current applications. Problems and opportunities are also explained. The overview sets the stage for identifying future research priorities needed for operational land cover assessment and monitoring.

  17. Software process assessments

    NASA Technical Reports Server (NTRS)

    Miller, Sharon E.; Tucker, George T.; Verducci, Anthony J., Jr.

    1992-01-01

    Software process assessments (SPA's) are part of an ongoing program of continuous quality improvements in AT&T. Their use was found to be very beneficial by software development organizations in identifying the issues facing the organization and the actions required to increase both quality and productivity in the organization.

  18. Video Compression

    NASA Technical Reports Server (NTRS)

    1996-01-01

    Optivision developed two PC-compatible boards and associated software under a Goddard Space Flight Center Small Business Innovation Research grant for NASA applications in areas such as telerobotics, telesciences and spaceborne experimentation. From this technology, the company used its own funds to develop commercial products, the OPTIVideo MPEG Encoder and Decoder, which are used for realtime video compression and decompression. They are used in commercial applications including interactive video databases and video transmission. The encoder converts video source material to a compressed digital form that can be stored or transmitted, and the decoder decompresses bit streams to provide high quality playback.

  19. Tesla: An application for real-time data analysis in High Energy Physics

    NASA Astrophysics Data System (ADS)

    Aaij, R.; Amato, S.; Anderlini, L.; Benson, S.; Cattaneo, M.; Clemencic, M.; Couturier, B.; Frank, M.; Gligorov, V. V.; Head, T.; Jones, C.; Komarov, I.; Lupton, O.; Matev, R.; Raven, G.; Sciascia, B.; Skwarnicki, T.; Spradlin, P.; Stahl, S.; Storaci, B.; Vesterinen, M.

    2016-11-01

    Upgrades to the LHCb computing infrastructure in the first long shutdown of the LHC have allowed high-quality decay information to be calculated by the software trigger, making a separate offline event reconstruction unnecessary. Furthermore, the storage space of the triggered candidate is an order of magnitude smaller than that of the entire raw event that would otherwise need to be persisted. Tesla is an application designed to process the information calculated by the trigger, with the resulting output used to directly perform physics measurements.

  20. Shoestring Digital Library: If Existing Digital Library Software Doesn't Suit Your Needs, Create Your Own

    ERIC Educational Resources Information Center

    Weber, Jonathan

    2006-01-01

    Creating a digital library might seem like a task best left to a large research collection with a vast staff and generous budget. However, tools for successfully creating digital libraries are getting easier to use all the time. The explosion of people creating content for the web has led to the availability of many high-quality applications and…

  1. Networking and Information Technology Research and Development. Supplement to the President’s Budget for FY 2002

    DTIC Science & Technology

    2001-07-01

    Web-based applications to improve health data systems and quality of care; innovative strategies for data collection in clinical settings; approaches...research to increase interoperability and integration of software in distributed systems; protocols and tools for data annotation and management; and...Generation National Defense and National Security Systems; Improved Health Care Systems for All Citizens

  2. Quality by control: Towards model predictive control of mammalian cell culture bioprocesses.

    PubMed

    Sommeregger, Wolfgang; Sissolak, Bernhard; Kandra, Kulwant; von Stosch, Moritz; Mayer, Martin; Striedner, Gerald

    2017-07-01

    The industrial production of complex biopharmaceuticals using recombinant mammalian cell lines is still mainly built on a quality by testing approach, which is represented by fixed process conditions and extensive testing of the end-product. In 2004 the FDA launched the process analytical technology initiative, aiming to guide the industry towards advanced process monitoring and better understanding of how critical process parameters affect the critical quality attributes. Implementation of process analytical technology into the bio-production process enables moving from the quality by testing to a more flexible quality by design approach. The application of advanced sensor systems in combination with mathematical modelling techniques offers enhanced process understanding, allows on-line prediction of critical quality attributes and subsequently real-time product quality control. In this review opportunities and unsolved issues on the road to a successful quality by design and dynamic control implementation are discussed. A major focus is directed on the preconditions for the application of model predictive control for mammalian cell culture bioprocesses. Design of experiments providing information about the process dynamics upon parameter change, dynamic process models, on-line process state predictions and powerful software environments seem to be a prerequisite for quality by control realization. © 2017 The Authors. Biotechnology Journal published by WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
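
    A generic finite-horizon model predictive control formulation of the kind such a control layer would solve at each sampling instant can be written as follows; the model functions f and g, the weights Q and R, and the input bounds are assumptions of the sketch rather than any specific bioprocess controller:

        \min_{u_0, \dots, u_{N-1}} \; \sum_{k=0}^{N-1} \left\| \hat{y}_k - y_{\mathrm{ref}} \right\|_Q^2 + \left\| \Delta u_k \right\|_R^2
        \quad \text{s.t.} \quad x_{k+1} = f(x_k, u_k), \quad \hat{y}_k = g(x_k), \quad u_{\min} \le u_k \le u_{\max}

    where x is the process state, u the manipulated feed or control inputs, \hat{y} the predicted critical quality attributes, and y_{\mathrm{ref}} their targets; only the first optimized input is applied before the horizon is re-solved at the next sampling instant.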

  3. Software IV and V Research Priorities and Applied Program Accomplishments Within NASA

    NASA Technical Reports Server (NTRS)

    Blazy, Louis J.

    2000-01-01

    The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance, early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center Of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: Ensure safety of NASA missions, ensure requirements are met, minimize programmatic and technological risks of software development and operations, improve software quality, reduce costs and time to delivery, and improve the science of software engineering

  4. Stormwater quality modelling in combined sewers: calibration and uncertainty analysis.

    PubMed

    Kanso, A; Chebbo, G; Tassin, B

    2005-01-01

    Estimating the level of uncertainty in urban stormwater quality models is vital for their utilization. This paper presents the results of applying a Markov chain Monte Carlo method based on Bayesian theory to the calibration and uncertainty analysis of a stormwater quality model commonly used in available software. The tested model uses a hydrologic/hydrodynamic scheme to estimate the accumulation, erosion, and transport of pollutants on surfaces and in sewers. It was calibrated for four different initial conditions of in-sewer deposits. Calibration results showed large variability in the model's responses as a function of the initial conditions. They demonstrated that the model's predictive capacity is very low.
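
    A minimal Metropolis-Hastings loop of the kind used for such Bayesian calibration is sketched below; the one-parameter linear model, the flat prior, and the synthetic data are placeholders rather than the stormwater quality model calibrated in the paper.

        import numpy as np

        rng = np.random.default_rng(0)

        # Hypothetical observations and a one-parameter toy model y = theta * x.
        x = np.linspace(1.0, 10.0, 20)
        y_obs = 2.0 * x + rng.normal(scale=1.0, size=x.size)

        def log_posterior(theta, sigma=1.0):
            if theta <= 0:                        # flat prior on theta > 0
                return -np.inf
            residuals = y_obs - theta * x
            return -0.5 * np.sum((residuals / sigma) ** 2)

        samples, theta = [], 1.0
        for _ in range(5000):
            proposal = theta + rng.normal(scale=0.1)        # random-walk proposal
            if np.log(rng.uniform()) < log_posterior(proposal) - log_posterior(theta):
                theta = proposal                            # accept the proposal
            samples.append(theta)

        burned = np.array(samples[1000:])                   # discard burn-in
        print("posterior mean and 95% interval:",
              burned.mean(), np.percentile(burned, [2.5, 97.5]))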

  5. Four applications of a software data collection and analysis methodology

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Selby, Richard W., Jr.

    1985-01-01

    The evaluation of software technologies suffers because of the lack of quantitative assessment of their effect on software development and modification. A seven-step data collection and analysis methodology couples software technology evaluation with software measurement. Four in-depth applications of the methodology are presented. The four studies represent each of the general categories of analyses on the software product and development process: blocked subject-project studies, replicated project studies, multi-project variation studies, and single project strategies. The four applications are in the areas of, respectively, software testing, cleanroom software development, characteristic software metric sets, and software error analysis.

  6. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  7. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  8. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  9. 15 CFR 995.25 - Quality management system.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... management system are those defined in this part. The quality management system must ensure that the... type approved conversion software is maintained by a third party, CEVAD shall ensure that no changes made to the conversion software render the type approval of the conversion software invalid, and shall...

  10. Establishing Quantitative Software Metrics in Department of the Navy Programs

    DTIC Science & Technology

    2016-04-01

    Quality to Metrics Dependency Matrix... Quality characteristics to metrics dependency matrix... In accomplishing this goal, a need exists for a formalized set of software quality metrics. This document establishes the validity of those necessary

  11. Curatr: a web application for creating, curating and sharing a mass spectral library.

    PubMed

    Palmer, Andrew; Phapale, Prasad; Fay, Dominik; Alexandrov, Theodore

    2018-04-15

    We have developed a web application, curatr, for the rapid generation of high-quality mass spectral fragmentation libraries from liquid-chromatography mass spectrometry datasets. Curatr handles datasets from single or multiplexed standards and extracts chromatographic profiles and potential fragmentation spectra for multiple adducts. An intuitive interface helps users to select high-quality spectra, which are stored along with searchable molecular information, the provenance of each standard, and experimental metadata. Curatr supports exports to several standard formats for use with third-party software or submission to repositories. We demonstrate the use of curatr to generate the EMBL Metabolomics Core Facility spectral library http://curatr.mcf.embl.de. Source code and example data are at http://github.com/alexandrovteam/curatr/. palmer@embl.de. Supplementary data are available at Bioinformatics online.

  12. [Software for illustrating a cost-quality balance carried out by clinical laboratory practice].

    PubMed

    Nishibori, Masahiro; Asayama, Hitoshi; Kimura, Satoshi; Takagi, Yasushi; Hagihara, Michio; Fujiwara, Mutsunori; Yoneyama, Akiko; Watanabe, Takashi

    2010-09-01

    We have no proper reference indicating the quality of clinical laboratory practice, one that would clearly illustrate that better medical tests require greater expense. Concerned about the recent difficult medical economy, the Japanese Society of Laboratory Medicine issued a committee report proposing a guideline for evaluating good laboratory practice. Following the guideline, we developed software that illustrates the cost-quality balance achieved by clinical laboratory practice. We encountered a number of controversial problems, for example how to measure and weight each quality-related factor, how to calculate the costs of a laboratory test, and how to take the characteristics of a clinical laboratory into account. Consequently, we finished only prototype software within the given period and budget. In this paper, the software implementation of the guideline and the above-mentioned problems are summarized. To stimulate further discussion, the working software will be made available on the Society's homepage for trial use.

  13. IMAT graphics manual

    NASA Technical Reports Server (NTRS)

    Stockwell, Alan E.; Cooper, Paul A.

    1991-01-01

    The Integrated Multidisciplinary Analysis Tool (IMAT) consists of a menu driven executive system coupled with a relational database which links commercial structures, structural dynamics and control codes. The IMAT graphics system, a key element of the software, provides a common interface for storing, retrieving, and displaying graphical information. The IMAT Graphics Manual shows users of commercial analysis codes (MATRIXx, MSC/NASTRAN and I-DEAS) how to use the IMAT graphics system to obtain high quality graphical output using familiar plotting procedures. The manual explains the key features of the IMAT graphics system, illustrates their use with simple step-by-step examples, and provides a reference for users who wish to take advantage of the flexibility of the software to customize their own applications.

  14. A survey of Canadian medical physicists: software quality assurance of in‐house software

    PubMed Central

    Kelly, Diane

    2015-01-01

    This paper reports on a survey of medical physicists who write and use in‐house written software as part of their professional work. The goal of the survey was to assess the extent of in‐house software usage and the desire or need for related software quality guidelines. The survey contained eight multiple‐choice questions, a ranking question, and seven free text questions. The survey was sent to medical physicists associated with cancer centers across Canada. The respondents to the survey expressed interest in having guidelines to help them in their software‐related work, but also demonstrated extensive skills in the area of testing, safety, and communication. These existing skills form a basis for medical physicists to establish a set of software quality guidelines. PACS number: 87.55.Qr PMID:25679168

  15. A Guide to Software Evaluation.

    ERIC Educational Resources Information Center

    Leonard, Rex; LeCroy, Barbara

    Arguing that software evaluation is crucial to the quality of courseware available in a school, this paper begins by discussing reasons why microcomputers are making such a tremendous impact on education, and notes that, although the quality of software has improved over the years, the challenge for teachers to integrate computing into the…

  16. 76 FR 54800 - International Business Machines (IBM), Software Group Business Unit, Quality Assurance Group, San...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-02

    ... Machines (IBM), Software Group Business Unit, Quality Assurance Group, San Jose, California; Notice of... workers of International Business Machines (IBM), Software Group Business Unit, Optim Data Studio Tools QA... February 2, 2011 (76 FR 5832). The subject worker group supplies acceptance testing services, design...

  17. Enabling Tools and Methods for International, Inter-disciplinary and Educational Collaboration

    NASA Astrophysics Data System (ADS)

    Robinson, E. M.; Hoijarvi, K.; Falke, S.; Fialkowski, E.; Kieffer, M.; Husar, R. B.

    2008-05-01

    In the past, collaboration has taken place in tightly-knit workgroups whose members had direct connections to each other; such collaboration was confined to small groups and person-to-person communication. Recent developments on the Internet foster virtual workgroups and organizations in which dynamic, 'just-in-time' collaboration can take place on a much larger scale. The emergence of virtual workgroups has strongly influenced international, interdisciplinary, and educational activities. In this paper we present an array of enabling tools and methods that incorporate these new technologies, including web services, software mashups, tag-based structuring and searching, and wikis for collaborative writing and content organization. Large monolithic, 'do-it-all' software tools are giving way to web service modules combined through service chaining, and application software can now be created using Service Oriented Architecture (SOA). In the air quality community, data providers and users are distributed in space and time, creating barriers to data access; exposing the data on the internet lessens these barriers. The federated data system DataFed, developed at Washington University, accesses data from autonomous, distributed providers. Through data "wrappers", DataFed provides uniform, standards-based access services to heterogeneous, distributed data. Service orientation not only lowers the entry resistance for service providers, it also allows the creation of user-defined applications and mashups; for example, Google Earth's open API allowed many groups to mash their content with Google Earth. Ad hoc tagging gives a rich description of internet resources, but it has the disadvantage of providing a fuzzy schema. The semantic uniformity of internet resources can be improved by controlled tagging, which applies a consistent namespace and tag combinations to diverse objects. One example of this is the photo-sharing web application Flickr. Just like data, photos exposed through the internet can be reused in ways unknown and unanticipated by the provider. For air quality applications, Flickr allowed a rich collection of images of forest fire smoke, wind-blown dust, and haze events to be tagged with controlled tags and used to evaluate subtle features of those events. Wikis, originally used just for collaboratively writing and discussing documents, now also act as social-software workflow managers. For air quality data, wikis provide the means to collaboratively create rich metadata, and they become a virtual meeting place to discuss ideas before a workshop or conference, to display tagged internet resources, and to work collaboratively on documents. Wikis are also useful in the classroom: in class projects, the wiki displays harvested resources, maintains collaborative documents and discussions, and serves as the organizational memory for the project.

  18. The Assistant for Specifying the Quality Software (ASQS) Operational Concept Document. Volume 1

    DTIC Science & Technology

    1990-09-01

    Assistant in which the manager supplies system-specific characteristics and needs and the Assistant fills in the software quality concepts and methods. The...member(s) of the Computer Resources Working Group (CRWG) to aid in performing a software quality engineering study. Figure 3.4-1 outlines the...need to recover from faults more likely than need to provide alternative functions or interfaces), and more on Autonomy...that Modularity

  19. Secure software practices among Malaysian software practitioners: An exploratory study

    NASA Astrophysics Data System (ADS)

    Mohamed, Shafinah Farvin Packeer; Baharom, Fauziah; Deraman, Aziz; Yahya, Jamaiah; Mohd, Haslina

    2016-08-01

    Secure software practices are gaining increasing importance among software practitioners and researchers due to the rise of computer crime in the software industry. They have become one of the determinant factors for producing high-quality software. Even though their importance is recognised, their adoption in the software industry is still scarce, particularly in Malaysia. Thus, an exploratory study was conducted among software practitioners in Malaysia to study their experiences and practices in real-world projects. This paper discusses the findings from the study, which involved 93 software practitioners. A structured questionnaire was used for data collection, whilst statistical methods such as frequencies, means, and cross tabulation were used for data analysis. The outcomes reveal that software practitioners are becoming increasingly aware of the importance of secure software practices; however, they lack appropriate implementation, which could affect the quality of the software they produce.

  20. NASA Tech Briefs, April 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics include: Tool for Bending a Metal Tube Precisely in a Confined Space; Multiple-Use Mechanisms for Attachment to Seat Tracks; Force-Measuring Clamps; Cellular Pressure-Actuated Joint; Block QCA Fault-Tolerant Logic Gates; Hybrid VLSI/QCA Architecture for Computing FFTs; Arrays of Carbon Nanotubes as RF Filters in Waveguides; Carbon Nanotubes as Resonators for RF Spectrum Analyzers; Software for Viewing Landsat Mosaic Images; Updated Integrated Mission Program; Software for Sharing and Management of Information; Optical-Quality Thin Polymer Membranes; Rollable Thin Shell Composite-Material Paraboloidal Mirrors; Folded Resonant Horns for Power Ultrasonic Applications; Touchdown Ball-Bearing System for Magnetic Bearings; Flux-Based Deadbeat Control of Induction-Motor Torque; Block Copolymers as Templates for Arrays of Carbon Nanotubes; Throttling Cryogen Boiloff To Control Cryostat Temperature; Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station; Turbulence in Supercritical O2/H2 and C7H16/N2 Mixing Layers; and Time-Resolved Measurements in Optoelectronic Microbioanal.

  1. Hydrology for Engineers, Geologists, and Environmental Professionals

    NASA Astrophysics Data System (ADS)

    Ince, Simon

    For people who are involved in the applied aspects of hydrology, it is refreshing to find a textbook that begins with a meaningful disclaimer, albeit in fine print on the back side of the frontispiece:“The present book and the accompanying software have been written according to the latest techniques in scientific hydrology. However, hydrology is at best an inexact science. A good book and a good computer software by themselves do not guarantee accurate or even realistic predictions. Acceptable results in the applications of hydrologic methods to engineering and environmental problems depend to a greater extend (sic) on the skills, logical assumptions, and practical experience of the user, and on the quantity and quality of long-term hydrologic data available. Neither the author nor the publisher assumes any responsibility or any liability, explicitly or implicitly, on the results or the consequences of using the information contained in this book or its accompanying software.”

  2. Experimental evaluation of the optical quality of DMD SLM for its application as Fourier holograms displaying device

    NASA Astrophysics Data System (ADS)

    Molodtsov, D. Y.; Cheremkhin, P. A.; Krasnov, V. V.; Rodin, V. G.

    2016-04-01

    In this paper, the optical quality of a micromirror DMD spatial light modulator (SLM) is evaluated and its applicability as an output device for holographic filters in dispersive correlators is analyzed. The possibility of using a DMD SLM extracted from a consumer DLP projector was experimentally evaluated by displaying Fourier holograms. Software for displaying holograms was developed. Experiments on hologram reconstruction were conducted with different numbers of hologram pixels (and different placements on the SLM). Reducing the number of pixels of the output hologram (i.e., the size of the minimum resolvable element) led to an improvement in reconstructed image quality. The evaluation shows that not every DMD chip has acceptable optical quality for application as a display device for Fourier holograms. It was determined that the major factor in reconstructed image quality degradation is the curvature of the surface of the SLM or its safety glass. Varying the hologram size allowed the approximate size of a sufficiently flat area of the SLM matrix to be estimated; for the tested SLM it was about 1.5 mm. Further increases in hologram size led to significant degradation of reconstructed image quality. The technique developed and applied here allows one to quickly estimate the maximum hologram size that can be displayed with a specific SLM without significant degradation of the reconstructed image. Additionally, it allows identification of areas on the SLM with increased surface curvature.

  3. Student satisfaction and loyalty in Denmark: Application of EPSI methodology.

    PubMed

    Shahsavar, Tina; Sudzina, Frantisek

    2017-01-01

    Monitoring and managing customers' satisfaction are key to benefiting from today's competitive environment. In the higher education context, only a few studies are available on the satisfaction and loyalty of the main customers, the students, which signifies the need to investigate the field more thoroughly. The aim of this research is to measure the strength of the determinants of students' satisfaction and the importance of the antecedents of students' satisfaction and loyalty in Denmark. Our research model is a modification of the European Performance Satisfaction Index (EPSI) that takes into account the direct effects of the university's image on students' expectations, from the students' perspective. The structural equation model of student satisfaction and loyalty was evaluated using partial least squares path modelling. Our findings confirm that the EPSI framework is applicable to student satisfaction and loyalty among Danish universities. We show that all the relationships among the variables of the research model are significant except the relationship between quality of software and students' loyalty. The results further verify the significance of the antecedents of students' satisfaction and loyalty at Danish universities; the university image and student satisfaction are antecedents of student loyalty with a significant direct effect, while perceived value, quality of hardware, quality of software, expectations, and university image are antecedents of student satisfaction. Our findings may serve as inspiration for maintaining and improving students' experiences during their studies at the university: dedicating resources to the important factors identified from the students' perspective enables universities to attract more students and make them highly satisfied and loyal.

  4. Parallel processing architecture for H.264 deblocking filter on multi-core platforms

    NASA Astrophysics Data System (ADS)

    Prasad, Durga P.; Sonachalam, Sekar; Kunchamwar, Mangesh K.; Gunupudi, Nageswara Rao

    2012-03-01

    Massively parallel computing (multi-core) chips offer outstanding new solutions that satisfy the increasing demand for high-resolution, high-quality video compression technologies such as H.264. Such solutions provide not only exceptional quality but also efficiency, low power, and low latency, previously unattainable in software-based designs. While custom hardware and Application Specific Integrated Circuit (ASIC) technologies may achieve low latency, low power, and real-time performance in some consumer devices, many applications require a flexible and scalable software-defined solution. The deblocking filter in the H.264 encoder/decoder poses difficult implementation challenges because of heavy data dependencies and the conditional nature of the computations. Deblocking filter implementations tend to be fixed and difficult to reconfigure for different needs. The ability to scale up for higher quality requirements, such as 10-bit pixel depth or a 4:2:2 chroma format, often reduces the throughput of a parallel architecture designed for a lower feature set. A scalable architecture for deblocking filtering, created with a massively parallel processor based solution, means that the same encoder or decoder can be deployed in a variety of applications, at different video resolutions, for different power requirements, and at higher bit depths and richer color subsampling patterns such as YUV 4:2:2 or 4:4:4. Low-power, software-defined encoders/decoders may be implemented using a massively parallel processor array, like that found in HyperX technology, with 100 or more cores and distributed memory. The large number of processor elements allows the silicon device to operate more efficiently than conventional DSP or CPU technology. This software programming model for massively parallel processors offers a flexible implementation and a power efficiency close to that of ASIC solutions. This work describes a scalable parallel architecture for an H.264-compliant deblocking filter for multi-core platforms such as HyperX technology. Parallel techniques such as parallel processing at the independent macroblock, sub-block, and pixel-row levels are examined in this work. The deblocking architecture consists of a basic cell called the deblocking filter unit (DFU) and a dependent data buffer manager (DFM). Multiple instances of the DFU can be used, catering to different performance needs; the DFM serves the data required by the different number of DFUs and also manages all the neighboring data required for future data processing by the DFUs. This approach achieves the scalability, flexibility, and performance excellence required in deblocking filters.
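
    As a toy illustration of the macroblock-level parallelism discussed above (not the HyperX implementation), the Python sketch below schedules macroblocks in anti-diagonal waves: a block (x, y) depends on its left and top neighbours, so all blocks on the same diagonal can be filtered concurrently. The names and the trivial "filter" are hypothetical.

      # Illustrative wavefront scheduling: macroblock (x, y) depends on its left
      # and top neighbours, so all blocks with x + y == d are independent.
      from concurrent.futures import ThreadPoolExecutor

      def filter_macroblock(block):
          x, y = block
          return (x, y, "filtered")          # stand-in for the real deblocking math

      def deblock_frame(mb_cols, mb_rows, workers=4):
          results = []
          with ThreadPoolExecutor(max_workers=workers) as pool:
              for d in range(mb_cols + mb_rows - 1):       # one wave per diagonal
                  wave = [(x, d - x) for x in range(mb_cols) if 0 <= d - x < mb_rows]
                  results.extend(pool.map(filter_macroblock, wave))
          return results

      print(len(deblock_frame(8, 4)))        # 32 macroblocks filtered in 11 waves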

  5. Usefulness of the automatic quantitative estimation tool for cerebral blood flow: clinical assessment of the application software tool AQCEL.

    PubMed

    Momose, Mitsuhiro; Takaki, Akihiro; Matsushita, Tsuyoshi; Yanagisawa, Shin; Yano, Kesato; Miyasaka, Tadashi; Ogura, Yuka; Kadoya, Masumi

    2011-01-01

    AQCEL enables automatic reconstruction of single-photon emission computed tomograms (SPECT) without image degradation and quantitative analysis of cerebral blood flow (CBF) after the input of simple parameters. We ascertained the usefulness and quality of images obtained by the application software AQCEL in clinical practice. Twelve patients underwent brain perfusion SPECT using technetium-99m ethyl cysteinate dimer at rest and after acetazolamide (ACZ) loading. Images reconstructed using AQCEL were compared with those reconstructed using the conventional filtered back projection (FBP) method for qualitative estimation. Two experienced nuclear medicine physicians interpreted the image quality using the following visual scores: 0, same; 1, slightly superior; 2, superior. For quantitative estimation, the mean CBF values of the normal hemisphere of the 12 patients using ACZ calculated by the AQCEL method were compared with those calculated by the conventional method. The CBF values of the 24 regions of the 3-dimensional stereotaxic region of interest template (3DSRT) calculated by the AQCEL method at rest and after ACZ loading were compared to those calculated by the conventional method. No significant qualitative difference was observed between the AQCEL and conventional FBP methods in the rest study. The average score by the AQCEL method was 0.25 ± 0.45 and that by the conventional method was 0.17 ± 0.39 (P = 0.34). There was a significant qualitative difference between the AQCEL and conventional methods in the ACZ loading study. The average score for AQCEL was 0.83 ± 0.58 and that for the conventional method was 0.08 ± 0.29 (P = 0.003). In the quantitative estimation using ACZ, the mean CBF values of the 12 patients calculated by the AQCEL method were 3-8% higher than those calculated by the conventional method. The square of the correlation coefficient between these methods was 0.995. When comparing the 24 3DSRT regions of the 12 patients, the squares of the correlation coefficient between the AQCEL and conventional methods were 0.973 and 0.986 for the normal and affected sides at rest, respectively, and 0.977 and 0.984 for the normal and affected sides after ACZ loading, respectively. The quality of images reconstructed using the application software AQCEL was superior to that obtained using the conventional method after ACZ loading, and high quantitative correlations were shown at rest and after ACZ loading. This software can be applied in clinical practice and is a useful tool for improving reproducibility and throughput.

  6. MARZ: Manual and automatic redshifting software

    NASA Astrophysics Data System (ADS)

    Hinton, S. R.; Davis, Tamara M.; Lidman, C.; Glazebrook, K.; Lewis, G. F.

    2016-04-01

    The Australian Dark Energy Survey (OzDES) is a 100-night spectroscopic survey underway on the Anglo-Australian Telescope using the fibre-fed 2-degree-field (2dF) spectrograph. We have developed a new redshifting application, MARZ, with greater usability, flexibility, and the capacity to analyse a wider range of object types than the RUNZ software package previously used for redshifting spectra from 2dF. MARZ is an open-source, client-based JavaScript web application which provides an intuitive interface and powerful automatic matching capabilities on spectra generated from the AAOmega spectrograph to produce high-quality spectroscopic redshift measurements. The software can be run interactively or via the command line, and is easily adaptable to other instruments and pipelines if conforming to the current FITS file standard is not possible. Behind the scenes, a modified version of the AUTOZ cross-correlation algorithm is used to match input spectra against a variety of stellar and galaxy templates, and automatic matching performance for OzDES spectra has increased from 54% (RUNZ) to 91% (MARZ). Spectra not matched correctly by the automatic algorithm can easily be redshifted manually by cycling through automatic results, manual template comparison, or marking spectral features.
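
    The core idea behind cross-correlation redshifting can be sketched in a few lines: on a log-wavelength grid a redshift becomes a constant pixel shift, and the shift that maximises the correlation with a template gives the redshift estimate. The Python snippet below (using NumPy) is a deliberately simplified, hypothetical illustration, not the AUTOZ or MARZ implementation.

      # Simplified cross-correlation redshifting on a log-wavelength grid.
      import numpy as np

      def best_redshift(spectrum, template, dloglam, max_shift=200):
          spectrum = (spectrum - spectrum.mean()) / spectrum.std()
          template = (template - template.mean()) / template.std()
          scores = []
          for s in range(max_shift):
              n = min(len(spectrum), len(template) - s)
              scores.append(float(np.dot(spectrum[:n], template[s:s + n])) / n)
          s_best = int(np.argmax(scores))
          return 10 ** (s_best * dloglam) - 1.0          # pixel shift -> redshift

      rng = np.random.default_rng(0)
      tmpl = rng.normal(size=4000)                          # toy template
      spec = tmpl[120:3800] + 0.3 * rng.normal(size=3680)   # shifted, noisy "data"
      print(round(best_redshift(spec, tmpl, 1e-4), 4))      # ~0.028 for a 120 px shift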

  7. Helios: History and Anatomy of a Successful In-House Enterprise High-Throughput Screening and Profiling Data Analysis System.

    PubMed

    Gubler, Hanspeter; Clare, Nicholas; Galafassi, Laurent; Geissler, Uwe; Girod, Michel; Herr, Guy

    2018-06-01

    We describe the main characteristics of the Novartis Helios data analysis software system (Novartis, Basel, Switzerland) for plate-based screening and profiling assays, which was designed and built about 11 years ago. It has been in productive use for more than 10 years and is one of the important standard software applications running for a large user community at all Novartis Institutes for BioMedical Research sites globally. A high degree of automation is reached by embedding the data analysis capabilities into a software ecosystem that deals with the management of samples, plates, and result data files, including automated data loading. The application provides a series of analytical procedures, ranging from very simple to advanced, which can easily be assembled by users in very flexible ways. This also includes the automatic derivation of a large set of quality control (QC) characteristics at every step. Any of the raw, intermediate, and final results and QC-relevant quantities can be easily explored through linked visualizations. Links to global assay metadata management, data warehouses, and an electronic lab notebook system are in place. Automated transfer of relevant data to data warehouses and electronic lab notebook systems are also implemented.

  8. Geneious Basic: An integrated and extendable desktop software platform for the organization and analysis of sequence data

    PubMed Central

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-01-01

    Summary: The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Availability and implementation: Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl. Contact: peter@biomatters.com PMID:22543367

  9. Geneious Basic: an integrated and extendable desktop software platform for the organization and analysis of sequence data.

    PubMed

    Kearse, Matthew; Moir, Richard; Wilson, Amy; Stones-Havas, Steven; Cheung, Matthew; Sturrock, Shane; Buxton, Simon; Cooper, Alex; Markowitz, Sidney; Duran, Chris; Thierer, Tobias; Ashton, Bruce; Meintjes, Peter; Drummond, Alexei

    2012-06-15

    The two main functions of bioinformatics are the organization and analysis of biological data using computational resources. Geneious Basic has been designed to be an easy-to-use and flexible desktop software application framework for the organization and analysis of biological data, with a focus on molecular sequences and related data types. It integrates numerous industry-standard discovery analysis tools, with interactive visualizations to generate publication-ready images. One key contribution to researchers in the life sciences is the Geneious public application programming interface (API) that affords the ability to leverage the existing framework of the Geneious Basic software platform for virtually unlimited extension and customization. The result is an increase in the speed and quality of development of computation tools for the life sciences, due to the functionality and graphical user interface available to the developer through the public API. Geneious Basic represents an ideal platform for the bioinformatics community to leverage existing components and to integrate their own specific requirements for the discovery, analysis and visualization of biological data. Binaries and public API freely available for download at http://www.geneious.com/basic, implemented in Java and supported on Linux, Apple OSX and MS Windows. The software is also available from the Bio-Linux package repository at http://nebc.nerc.ac.uk/news/geneiousonbl.

  10. Computer-aided navigation in dental implantology: 7 years of clinical experience.

    PubMed

    Ewers, Rolf; Schicho, Kurt; Truppe, Michael; Seemann, Rudolf; Reichwein, Astrid; Figl, Michael; Wagner, Arne

    2004-03-01

    This long-term study gives a review over 7 years of research, development, and routine clinical application of computer-aided navigation technology in dental implantology. Benefits and disadvantages of up-to-date technologies are discussed. In the course of the current advancement, various hardware and software configurations are used. In the initial phase, universally applicable navigation software is adapted for implantology. Since 2001, a special software module for dental implantology is available. Preoperative planning is performed on the basis of prosthetic aspects and requirements. In clinical routine use, patient and drill positions are intraoperatively registered by means of optoelectronic tracking systems; during preclinical tests, electromagnetic trackers are also used. In 7 years (1995 to 2002), 55 patients with 327 dental implants were successfully positioned with computer-aided navigation technology. The mean number of implants per patient was 6 (minimum, 1; maximum, 11). No complications were observed; the preoperative planning could be exactly realized. The average expenditure of time for the preparation of a surgical intervention with navigation decreased from 2 to 3 days in the initial phase to one-half day in clinical routine use with software that is optimized for dental implantology. The use of computer-aided navigation technology can contribute to considerable quality improvement. Preoperative planning is exactly realized and intraoperative safety is increased, because damage to nerves or neighboring teeth can be avoided.

  11. Linking data to decision-making: applying qualitative data analysis methods and software to identify mechanisms for using outcomes data.

    PubMed

    Patel, Vaishali N; Riley, Anne W

    2007-10-01

    A multiple case study was conducted to examine how staff in child out-of-home care programs used data from an Outcomes Management System (OMS) and other sources to inform decision-making. Data collection consisted of thirty-seven semi-structured interviews with clinicians, managers, and directors from two treatment foster care programs and two residential treatment centers, and individuals involved with developing the OMS; and observations of clinical and quality management meetings. Case study and grounded theory methodology guided analyses. The application of qualitative data analysis software is described. Results show that although staff rarely used data from the OMS, they did rely on other sources of systematically collected information to inform clinical, quality management, and program decisions. Analyses of how staff used these data suggest that improving the utility of OMS will involve encouraging staff to participate in data-based decision-making, and designing and implementing OMS in a manner that reflects how decision-making processes operate.

  12. SU-G-BRB-02: An Open-Source Software Analysis Library for Linear Accelerator Quality Assurance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kerns, J; Yaldo, D

    Purpose: Routine linac quality assurance (QA) tests have become complex enough to require automation of most test analyses. A new data analysis software library was built that allows physicists to automate routine linear accelerator quality assurance tests. The package is open source, code tested, and benchmarked. Methods: Images and data were generated on a TrueBeam linac for the following routine QA tests: VMAT, starshot, CBCT, machine logs, Winston Lutz, and picket fence. The analysis library was built using the general programming language Python. Each test was analyzed with the library algorithms and compared to manual measurements taken at the time of acquisition. Results: VMAT QA results agreed within 0.1% between the library and manual measurements. Machine logs (dynalogs & trajectory logs) were successfully parsed; mechanical axis positions were verified for accuracy and MLC fluence agreed well with EPID measurements. CBCT QA measurements were within 10 HU and 0.2 mm where applicable. Winston Lutz isocenter size measurements were within 0.2 mm of TrueBeam’s Machine Performance Check. Starshot analysis was within 0.2 mm of the Winston Lutz results for the same conditions. Picket fence images with and without a known error showed that the library was capable of detecting MLC offsets within 0.02 mm. Conclusion: A new routine QA software library has been benchmarked and is available for use by the community. The library is open-source and extensible for use in larger systems.
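
    As an illustration of the kind of sub-millimetre measurement such a library automates, the snippet below estimates a single picket's centre from an intensity profile by a background-subtracted centre of mass and reports its offset from the nominal position. This is a hypothetical sketch of the underlying idea, not the published library's API.

      # Hypothetical picket-fence building block: centre-of-mass picket position.
      import numpy as np

      def picket_offset(profile, pixel_mm, nominal_mm):
          x = np.arange(len(profile)) * pixel_mm
          weights = profile - profile.min()            # crude background subtraction
          centre = float(np.sum(x * weights) / np.sum(weights))
          return centre - nominal_mm

      x_mm = np.arange(101) * 0.1                                 # 0..10 mm grid
      profile = np.exp(-0.5 * ((x_mm - 5.03) / 0.4) ** 2)         # peak at 5.03 mm
      print(round(picket_offset(profile, 0.1, 5.0), 3))           # ~0.03 mm offset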

  13. Identification of Patient Safety Risks Associated with Electronic Health Records: A Software Quality Perspective.

    PubMed

    Virginio, Luiz A; Ricarte, Ivan Luiz Marques

    2015-01-01

    Although Electronic Health Records (EHR) can offer benefits to the health care process, there is a growing body of evidence that these systems can also incur risks to patient safety when developed or used improperly. This work is a literature review to identify these risks from a software quality perspective. Therefore, the risks were classified based on the ISO/IEC 25010 software quality model. The risks identified were related mainly to the characteristics of "functional suitability" (i.e., software bugs) and "usability" (i.e., interface prone to user error). This work elucidates the fact that EHR quality problems can adversely affect patient safety, resulting in errors such as incorrect patient identification, incorrect calculation of medication dosages, and lack of access to patient data. Therefore, the risks presented here provide the basis for developers and EHR regulating bodies to pay attention to the quality aspects of these systems that can result in patient harm.

  14. Software Process Improvement Journey: IBM Australia Application Management Services

    DTIC Science & Technology

    2005-03-01

    learned from its successes and mistakes and then applied that learning to the next project. ...worldwide requirements for project management and quality; it was the organization’s staff members who played a part in the development of the ...environment and that it involves personnel from a variety of areas, ideally not part of the group that developed the technology or process

  15. Evaluation and purchase of an analytical flow cytometer: some of the numerous factors to consider.

    PubMed

    Zucker, Robert M; Fisher, Nancy C

    2013-01-01

    When purchasing a flow cytometer, deciding which brand, model, specifications, and accessories to choose may be challenging. The decision should initially be guided by the specific applications intended for the instrument. However, many other factors need to be considered, including hardware, software, quality assurance, support, service, price, and recommendations from colleagues. These issues are discussed to help guide the purchasing process.

  16. Revealing the ISO/IEC 9126-1 Clique Tree for COTS Software Evaluation

    NASA Technical Reports Server (NTRS)

    Morris, A. Terry

    2007-01-01

    Previous research has shown that acyclic dependency models, if they exist, can be extracted from software quality standards and that these models can be used to assess software safety and product quality. In the case of commercial off-the-shelf (COTS) software, the extracted dependency model can be used in a probabilistic Bayesian network context for COTS software evaluation. Furthermore, while experts typically employ Bayesian networks to encode domain knowledge, secondary structures (clique trees) from Bayesian network graphs can be used to determine the probabilistic distribution of any software variable (attribute) using any clique that contains that variable. Secondary structures, therefore, provide insight into the fundamental nature of graphical networks. This paper will apply secondary structure calculations to reveal the clique tree of the acyclic dependency model extracted from the ISO/IEC 9126-1 software quality standard. Suggestions will be provided to describe how the clique tree may be exploited to aid efficient transformation of an evaluation model.
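
    The secondary-structure construction described above can be illustrated on a toy dependency model: moralize a small attribute DAG, enumerate its maximal cliques, and join them with a maximum-weight spanning tree over shared variables. The Python sketch below (using NetworkX) uses made-up attribute names rather than the actual ISO/IEC 9126-1 model, and it omits the triangulation step because the toy moral graph is already chordal.

      # Toy clique-tree construction from a small quality-attribute DAG.
      import itertools
      import networkx as nx

      dag = nx.DiGraph([("maturity", "reliability"), ("recoverability", "reliability"),
                        ("reliability", "quality"), ("usability", "quality")])

      moral = dag.to_undirected()                  # drop directions ...
      for node in dag:                             # ... and marry each node's parents
          for a, b in itertools.combinations(dag.predecessors(node), 2):
              moral.add_edge(a, b)

      cliques = [frozenset(c) for c in nx.find_cliques(moral)]

      clique_graph = nx.Graph()
      for c1, c2 in itertools.combinations(cliques, 2):
          if c1 & c2:
              clique_graph.add_edge(c1, c2, weight=len(c1 & c2))
      clique_tree = nx.maximum_spanning_tree(clique_graph)

      for u, v in clique_tree.edges():
          print(sorted(u), "--", sorted(v), "separator:", sorted(u & v))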

  17. LA-iMageS: a software for elemental distribution bioimaging using LA-ICP-MS data.

    PubMed

    López-Fernández, Hugo; de S Pessôa, Gustavo; Arruda, Marco A Z; Capelo-Martínez, José L; Fdez-Riverola, Florentino; Glez-Peña, Daniel; Reboiro-Jato, Miguel

    2016-01-01

    The spatial distribution of chemical elements in different types of samples is an important field in several research areas such as biology, paleontology or biomedicine, among others. Elemental distribution imaging by laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) is an effective technique for qualitative and quantitative imaging due to its high spatial resolution and sensitivity. By applying this technique, vast amounts of raw data are generated to obtain high-quality images, essentially making the use of specific LA-ICP-MS imaging software that can process such data absolutely mandatory. Since existing solutions are usually commercial or hard-to-use for average users, this work introduces LA-iMageS, an open-source, free-to-use multiplatform application for fast and automatic generation of high-quality elemental distribution bioimages from LA-ICP-MS data in the PerkinElmer Elan XL format, whose results can be directly exported to external applications for further analysis. A key strength of LA-iMageS is its substantial added value for users, with particular regard to the customization of the elemental distribution bioimages, which allows, among other features, the ability to change color maps, increase image resolution or toggle between 2D and 3D visualizations.

  18. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    A framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  19. A method for tailoring the information content of a software process model

    NASA Technical Reports Server (NTRS)

    Perkins, Sharon; Arend, Mark B.

    1990-01-01

    A framework is defined for a general method for selecting a necessary and sufficient subset of a general software life cycle's information products to support a new software development process. Procedures for characterizing problem domains in general and mapping them to a tailored set of life cycle processes and products are presented. An overview of the method is shown using the following steps: (1) During the problem concept definition phase, perform standardized interviews and dialogs between developer and user, and between user and customer; (2) Generate a quality needs profile of the software to be developed, based on information gathered in step 1; (3) Translate the quality needs profile into a profile of quality criteria that must be met by the software to satisfy the quality needs; (4) Map the quality criteria to a set of accepted processes and products for achieving each criterion; (5) Select the information products which match or support the accepted processes and products of step 4; and (6) Select the design methodology which produces the information products selected in step 5.

  20. PD5: a general purpose library for primer design software.

    PubMed

    Riley, Michael C; Aubrey, Wayne; Young, Michael; Clare, Amanda

    2013-01-01

    Complex PCR applications for large genome-scale projects require fast, reliable and often highly sophisticated primer design software applications. Presently, such applications use pipelining methods to utilise many third party applications and this involves file parsing, interfacing and data conversion, which is slow and prone to error. A fully integrated suite of software tools for primer design would considerably improve the development time, the processing speed, and the reliability of bespoke primer design software applications. The PD5 software library is an open-source collection of classes and utilities, providing a complete collection of software building blocks for primer design and analysis. It is written in object-oriented C++ with an emphasis on classes suitable for efficient and rapid development of bespoke primer design programs. The modular design of the software library simplifies the development of specific applications and also integration with existing third party software where necessary. We demonstrate several applications created using this software library that have already proved to be effective, but we view the project as a dynamic environment for building primer design software and it is open for future development by the bioinformatics community. Therefore, the PD5 software library is published under the terms of the GNU General Public License, which guarantee access to source code and allow redistribution and modification. The PD5 software library is downloadable from Google Code and the accompanying Wiki includes instructions and examples: http://code.google.com/p/primer-design.

  1. A Smart Voltage and Current Monitoring System for Three Phase Inverters Using an Android Smartphone Application

    PubMed Central

    Mnati, Mohannad Jabbar; Van den Bossche, Alex; Chisab, Raad Farhood

    2017-01-01

    In this paper, a new smart voltage and current monitoring system (SVCMS) technique is proposed. It monitors a three phase electrical system using an Arduino platform as a microcontroller to read the voltage and current from sensors and then wirelessly send the measured data to monitor the results using a new Android application. The integrated SVCMS design uses an Arduino Nano V3.0 as the microcontroller to measure the results from three voltage and three current sensors and then send this data, after calculation, to the Android smartphone device of an end user using Bluetooth HC-05. The Arduino Nano V3.0 controller and Bluetooth HC-05 are a cheap microcontroller and wireless device, respectively. The new Android smartphone application that monitors the voltage and current measurements uses the open source MIT App Inventor 2 software. It allows for monitoring some elementary fundamental voltage power quality properties. An effort has been made to investigate what is possible using available off-the-shelf components and open source software. PMID:28420132

  2. The SHIP: A SIP to HTTP Interaction Protocol

    NASA Astrophysics Data System (ADS)

    Zeiß, Joachim; Gabner, Rene; Bessler, Sandford; Happenhofer, Marco

    IMS is capable of providing a wide range of services. As a result, terminal software becomes more and more complex in order to deliver network intelligence to user applications. Currently, mobile terminal software needs to be updated constantly so that the latest network services and functionality can be delivered to the user. On the Internet, browser-based user interfaces ensure that the user is immediately offered an interface to the latest services in the net. Our approach combines the benefits of the Session Initiation Protocol (SIP) and those of the HTTP protocol to bring the same type of user interfacing to IMS. SIP (IMS) realizes authentication, session management, charging, and Quality of Service (QoS); HTTP provides access to Internet services and allows the user interface of an application to run on a mobile terminal while processing and orchestration are done on the server. A SHIP-enabled IMS client only needs to handle data transport and session management via SIP, HTTP, and RTP and to render streaming media, HTML, and JavaScript. SHIP allows new kinds of applications, which combine audio, video, and data within a single multimedia session.

  3. A Smart Voltage and Current Monitoring System for Three Phase Inverters Using an Android Smartphone Application.

    PubMed

    Mnati, Mohannad Jabbar; Van den Bossche, Alex; Chisab, Raad Farhood

    2017-04-15

    In this paper, a new smart voltage and current monitoring system (SVCMS) technique is proposed. It monitors a three phase electrical system using an Arduino platform as a microcontroller to read the voltage and current from sensors and then wirelessly send the measured data to monitor the results using a new Android application. The integrated SVCMS design uses an Arduino Nano V3.0 as the microcontroller to measure the results from three voltage and three current sensors and then send this data, after calculation, to the Android smartphone device of an end user using Bluetooth HC-05. The Arduino Nano V3.0 controller and Bluetooth HC-05 are a cheap microcontroller and wireless device, respectively. The new Android smartphone application that monitors the voltage and current measurements uses the open source MIT App Inventor 2 software. It allows for monitoring some elementary fundamental voltage power quality properties. An effort has been made to investigate what is possible using available off-the-shelf components and open source software.

  4. EEG Recording and Online Signal Processing on Android: A Multiapp Framework for Brain-Computer Interfaces on Smartphone

    PubMed Central

    Debener, Stefan; Emkes, Reiner; Volkening, Nils; Fudickar, Sebastian; Bleichner, Martin G.

    2017-01-01

    Objective Our aim was the development and validation of a modular signal processing and classification application enabling online electroencephalography (EEG) signal processing on off-the-shelf mobile Android devices. The software application SCALA (Signal ProCessing and CLassification on Android) supports a standardized communication interface to exchange information with external software and hardware. Approach In order to implement a closed-loop brain-computer interface (BCI) on the smartphone, we used a multiapp framework, which integrates applications for stimulus presentation, data acquisition, data processing, classification, and delivery of feedback to the user. Main Results We have implemented the open source signal processing application SCALA. We present timing test results supporting sufficient temporal precision of audio events. We also validate SCALA with a well-established auditory selective attention paradigm and report above chance level classification results for all participants. Regarding the 24-channel EEG signal quality, evaluation results confirm typical sound onset auditory evoked potentials as well as cognitive event-related potentials that differentiate between correct and incorrect task performance feedback. Significance We present a fully smartphone-operated, modular closed-loop BCI system that can be combined with different EEG amplifiers and can easily implement other paradigms. PMID:29349070

  5. webpic: A flexible web application for collecting distance and count measurements from images

    PubMed Central

    2018-01-01

    Despite increasing ability to store and analyze large amounts of data for organismal and ecological studies, the process of collecting distance and count measurements from images has largely remained time consuming and error-prone, particularly for tasks for which automation is difficult or impossible. Improving the efficiency of these tasks, which allows for more high quality data to be collected in a shorter amount of time, is therefore a high priority. The open-source web application, webpic, implements common web languages and widely available libraries and productivity apps to streamline the process of collecting distance and count measurements from images. In this paper, I introduce the framework of webpic and demonstrate one readily available feature of this application, linear measurements, using fossil leaf specimens. This application fills the gap between workflows accomplishable by individuals through existing software and those accomplishable by large, unmoderated crowds. It demonstrates that flexible web languages can be used to streamline time-intensive research tasks without the use of specialized equipment or proprietary software and highlights the potential for web resources to facilitate data collection in research tasks and outreach activities with improved efficiency. PMID:29608592

  6. Application of Metamorphic Testing to Supervised Classifiers

    PubMed Central

    Xie, Xiaoyuan; Ho, Joshua; Kaiser, Gail; Xu, Baowen; Chen, Tsong Yueh

    2010-01-01

    Many applications in the field of scientific computing - such as computational biology, computational linguistics, and others - depend on Machine Learning algorithms to provide important core functionality to support solutions in the particular problem domains. However, it is difficult to test such applications because often there is no “test oracle” to indicate what the correct output should be for arbitrary input. To help address the quality of such software, in this paper we present a technique for testing the implementations of supervised machine learning classification algorithms on which such scientific computing software depends. Our technique is based on an approach called “metamorphic testing”, which has been shown to be effective in such cases. More importantly, we demonstrate that our technique not only serves the purpose of verification, but also can be applied in validation. In addition to presenting our technique, we describe a case study we performed on a real-world machine learning application framework, and discuss how programmers implementing machine learning algorithms can avoid the common pitfalls discovered in our study. We also discuss how our findings can be of use to other areas outside scientific computing, as well. PMID:21243103
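
    A metamorphic test checks a necessary relation between the outputs of a "source" run and a transformed "follow-up" run when no oracle exists for individual outputs. The Python sketch below (using scikit-learn) shows one such relation for a k-nearest-neighbour classifier: permuting the order of the training samples must leave the predictions unchanged. The relation is our own illustrative example, not necessarily one of the relations studied in the paper.

      # One illustrative metamorphic relation: training-order permutation
      # must not change the predictions of a kNN classifier.
      import numpy as np
      from sklearn.neighbors import KNeighborsClassifier

      rng = np.random.default_rng(42)
      X_train = rng.normal(size=(200, 5))
      y_train = (X_train[:, 0] + X_train[:, 1] > 0).astype(int)
      X_test = rng.normal(size=(50, 5))

      baseline = KNeighborsClassifier(n_neighbors=3).fit(X_train, y_train).predict(X_test)

      perm = rng.permutation(len(X_train))          # the metamorphic transformation
      followup = KNeighborsClassifier(n_neighbors=3).fit(
          X_train[perm], y_train[perm]).predict(X_test)

      assert np.array_equal(baseline, followup), "metamorphic relation violated"
      print("permutation relation holds for all", len(X_test), "test inputs")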

  7. Automated reuseable components system study results

    NASA Technical Reports Server (NTRS)

    Gilroy, Kathy

    1989-01-01

    The Automated Reusable Components System (ARCS) was developed under a Phase 1 Small Business Innovative Research (SBIR) contract for the U.S. Army CECOM. The objectives of the ARCS program were: (1) to investigate issues associated with automated reuse of software components, identify alternative approaches, and select promising technologies, and (2) to develop tools that support component classification and retrieval. The approach followed was to research emerging techniques and experimental applications associated with reusable software libraries, to investigate the more mature information retrieval technologies for applicability, and to investigate the applicability of specialized technologies to improve the effectiveness of a reusable component library. Various classification schemes and retrieval techniques were identified and evaluated for potential application in an automated library system for reusable components. Strategies for library organization and management, component submittal and storage, and component search and retrieval were developed. A prototype ARCS was built to demonstrate the feasibility of automating the reuse process. The prototype was created using a subset of the classification and retrieval techniques that were investigated. The demonstration system was exercised and evaluated using reusable Ada components selected from the public domain. A requirements specification for a production-quality ARCS was also developed.
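
    The classification-and-retrieval idea at the heart of such a library can be sketched very simply: describe each component by facet terms and rank candidates by how many query terms they satisfy. The Python snippet below uses hypothetical component records and facets, not the actual ARCS classification scheme.

      # Hypothetical faceted classification and retrieval of reusable components.
      components = {
          "queue_mgr.ada":  {"function": "buffer", "object": "message", "medium": "memory"},
          "crc_check.ada":  {"function": "verify", "object": "packet",  "medium": "memory"},
          "log_writer.ada": {"function": "record", "object": "event",   "medium": "file"},
      }

      def retrieve(query):
          scored = []
          for name, facets in components.items():
              score = sum(1 for key, term in query.items() if facets.get(key) == term)
              if score:
                  scored.append((score, name))
          return [name for score, name in sorted(scored, reverse=True)]

      print(retrieve({"function": "buffer", "medium": "memory"}))
      # ['queue_mgr.ada', 'crc_check.ada'] -- the strongest facet match is ranked first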

  8. EEG Recording and Online Signal Processing on Android: A Multiapp Framework for Brain-Computer Interfaces on Smartphone.

    PubMed

    Blum, Sarah; Debener, Stefan; Emkes, Reiner; Volkening, Nils; Fudickar, Sebastian; Bleichner, Martin G

    2017-01-01

    Our aim was the development and validation of a modular signal processing and classification application enabling online electroencephalography (EEG) signal processing on off-the-shelf mobile Android devices. The software application SCALA (Signal ProCessing and CLassification on Android) supports a standardized communication interface to exchange information with external software and hardware. In order to implement a closed-loop brain-computer interface (BCI) on the smartphone, we used a multiapp framework, which integrates applications for stimulus presentation, data acquisition, data processing, classification, and delivery of feedback to the user. We have implemented the open source signal processing application SCALA. We present timing test results supporting sufficient temporal precision of audio events. We also validate SCALA with a well-established auditory selective attention paradigm and report above chance level classification results for all participants. Regarding the 24-channel EEG signal quality, evaluation results confirm typical sound onset auditory evoked potentials as well as cognitive event-related potentials that differentiate between correct and incorrect task performance feedback. We present a fully smartphone-operated, modular closed-loop BCI system that can be combined with different EEG amplifiers and can easily implement other paradigms.

  9. Free and simple GIS as appropriate for health mapping in a low resource setting: a case study in eastern Indonesia.

    PubMed

    Fisher, Rohan P; Myers, Bronwyn A

    2011-02-25

    Despite the demonstrated utility of GIS for health applications, there are perceived problems in low resource settings: GIS software can be expensive and complex; input data are often of low quality. This study aimed to test the appropriateness of new, inexpensive and simple GIS tools in poorly resourced areas of a developing country. GIS applications were trialled in pilot studies based on mapping of health resources and health indicators at the clinic and district level in the predominantly rural province of Nusa Tenggara Timur in eastern Indonesia. The pilot applications were (i) rapid field collection of health infrastructure data using a GPS enabled PDA, (ii) mapping health indicator data using open source GIS software, and (iii) service availability mapping using a free modelling tool. Through contextualised training, district and clinic staff acquired skills in spatial analysis and visualisation and, six months after the pilot studies, they were using these skills for advocacy in the planning process, to inform the allocation of some health resources, and to evaluate some public health initiatives. We demonstrated that GIS can be a useful and inexpensive tool for the decentralisation of health data analysis to low resource settings through the use of free and simple software, locally relevant training materials and by providing data collection tools to ensure data reliability.

  10. Free and simple GIS as appropriate for health mapping in a low resource setting: a case study in eastern Indonesia

    PubMed Central

    2011-01-01

    Background Despite the demonstrated utility of GIS for health applications, there are perceived problems in low resource settings: GIS software can be expensive and complex; input data are often of low quality. This study aimed to test the appropriateness of new, inexpensive and simple GIS tools in poorly resourced areas of a developing country. GIS applications were trialled in pilot studies based on mapping of health resources and health indicators at the clinic and district level in the predominantly rural province of Nusa Tenggara Timur in eastern Indonesia. The pilot applications were (i) rapid field collection of health infrastructure data using a GPS enabled PDA, (ii) mapping health indicator data using open source GIS software, and (iii) service availability mapping using a free modelling tool. Results Through contextualised training, district and clinic staff acquired skills in spatial analysis and visualisation and, six months after the pilot studies, they were using these skills for advocacy in the planning process, to inform the allocation of some health resources, and to evaluate some public health initiatives. Conclusions We demonstrated that GIS can be a useful and inexpensive tool for the decentralisation of health data analysis to low resource settings through the use of free and simple software, locally relevant training materials and by providing data collection tools to ensure data reliability. PMID:21352553

  11. Advanced techniques and technology for efficient data storage, access, and transfer

    NASA Technical Reports Server (NTRS)

    Rice, Robert F.; Miller, Warner

    1991-01-01

    Advanced techniques for efficiently representing most forms of data are being implemented in practical hardware and software form through the joint efforts of three NASA centers. These techniques adapt to local statistical variations to continually provide near optimum code efficiency when representing data without error. Demonstrated in several earlier space applications, these techniques are the basis of initial NASA data compression standards specifications. Since the techniques clearly apply to most NASA science data, NASA invested in the development of both hardware and software implementations for general use. This investment includes high-speed single-chip very large scale integration (VLSI) coding and decoding modules as well as machine-transferrable software routines. The hardware chips were tested in the laboratory at data rates as high as 700 Mbits/s. A coding module's definition includes a predictive preprocessing stage and a powerful adaptive coding stage. The function of the preprocessor is to optimally process incoming data into a standard form data source that the second stage can handle. The built-in preprocessor of the VLSI coder chips is ideal for high-speed sampled data applications such as imaging and high-quality audio, but additionally, the second stage adaptive coder can be used separately with any source that can be externally preprocessed into the 'standard form'. This generic functionality assures that the applicability of these techniques and their recent high-speed implementations should be equally broad outside of NASA.
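
    The "standard form" preprocessing described above can be illustrated with a previous-sample predictor and a sign-interleaving map, which together turn correlated samples into small non-negative integers ready for an adaptive entropy coder. The Python sketch below is a simplified, hypothetical illustration of that idea, not the flight hardware or the NASA standard itself; the entropy-coding stage is omitted.

      # Simplified predictive preprocessing: previous-sample prediction plus
      # sign interleaving of the residuals; the inverse shows it is lossless.
      def preprocess(samples):
          out, prediction = [], 0
          for s in samples:
              delta = s - prediction                                   # prediction error
              out.append(2 * delta if delta >= 0 else -2 * delta - 1)  # interleave sign
              prediction = s                                           # previous-sample predictor
          return out

      def unpreprocess(mapped):
          samples, prediction = [], 0
          for m in mapped:
              delta = m // 2 if m % 2 == 0 else -(m + 1) // 2
              prediction += delta
              samples.append(prediction)
          return samples

      data = [100, 101, 103, 102, 102, 105]
      coded = preprocess(data)
      print(coded)                          # [200, 2, 4, 1, 0, 6] -- mostly small values
      assert unpreprocess(coded) == data    # the mapping is lossless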

  12. Molecular radiotherapy: the NUKFIT software for calculating the time-integrated activity coefficient.

    PubMed

    Kletting, P; Schimmel, S; Kestler, H A; Hänscheid, H; Luster, M; Fernández, M; Bröer, J H; Nosske, D; Lassmann, M; Glatting, G

    2013-10-01

    Calculation of the time-integrated activity coefficient (residence time) is a crucial step in dosimetry for molecular radiotherapy. However, available software is deficient in that it is either not tailored for use in molecular radiotherapy and/or does not include all required estimation methods. The aim of this work was therefore the development and programming of an algorithm which allows for an objective and reproducible determination of the time-integrated activity coefficient and its standard error. The algorithm includes the selection of a set of fitting functions from predefined sums of exponentials and the choice of an error model for the used data. To estimate the values of the adjustable parameters, an objective function, depending on the data, the parameters of the error model, the fitting function and (if required and available) Bayesian information, is minimized. To increase reproducibility and user-friendliness, the starting values are automatically determined using a combination of curve stripping and random search. Visual inspection, the coefficient of determination, the standard error of the fitted parameters, and the correlation matrix are provided to evaluate the quality of the fit. The functions which are most supported by the data are determined using the corrected Akaike information criterion. The time-integrated activity coefficient is estimated by analytically integrating the fitted functions. Its standard error is determined assuming Gaussian error propagation. The software was implemented using MATLAB. To validate the proper implementation of the objective function and the fit functions, the results of NUKFIT and SAAM numerical, a commercially available software tool, were compared. The automatic search for starting values was successfully tested for reproducibility. The quality criteria applied in conjunction with the Akaike information criterion allowed the selection of suitable functions. Function fit parameters and their standard errors estimated using SAAM numerical and NUKFIT showed differences of <1%. The differences for the time-integrated activity coefficients were also <1% (standard error between 0.4% and 3%). In general, the application of the software is user-friendly and the results are mathematically correct and reproducible. An application of NUKFIT is presented for three different clinical examples. The software tool with its underlying methodology can be employed to objectively and reproducibly estimate the time-integrated activity coefficient and its standard error for most time-activity data in molecular radiotherapy.
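
    As a much-simplified illustration of the workflow (one candidate function, no error model or Bayesian information, not the NUKFIT implementation), the Python sketch below fits a mono-exponential to hypothetical time-activity data with SciPy and integrates it analytically to obtain a time-integrated activity coefficient.

      # Fit A*exp(-lambda*t) to toy time-activity data and integrate analytically.
      import numpy as np
      from scipy.optimize import curve_fit

      def monoexp(t, amplitude, decay):
          return amplitude * np.exp(-decay * t)

      t = np.array([1.0, 4.0, 24.0, 48.0, 96.0])             # h after administration
      activity = np.array([0.32, 0.29, 0.17, 0.09, 0.03])    # fraction of injected activity

      params, cov = curve_fit(monoexp, t, activity, p0=(0.3, 0.02))
      amplitude, decay = params
      tiac = amplitude / decay            # integral of A*exp(-lambda*t) from 0 to infinity
      err = np.sqrt(np.diag(cov))
      print(f"A = {amplitude:.3f} +/- {err[0]:.3f}, lambda = {decay:.4f} 1/h, TIAC = {tiac:.1f} h")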

  13. The role of open-source software in innovation and standardization in radiology.

    PubMed

    Erickson, Bradley J; Langer, Steve; Nagy, Paul

    2005-11-01

    The use of open-source software (OSS), in which developers release the source code to applications they have developed, is popular in the software industry. This is done to allow others to modify and improve software (which may or may not be shared back to the community) and to allow others to learn from the software. Radiology was an early participant in this model, supporting OSS that implemented the ACR-National Electrical Manufacturers Association (now Digital Imaging and Communications in Medicine) standard for medical image communications. In radiology and in other fields, OSS has promoted innovation and the adoption of standards. Popular OSS is of high quality because access to source code allows many people to identify and resolve errors. Open-source software is analogous to the peer-review scientific process: one must be able to see and reproduce results to understand and promote what is shared. The authors emphasize that support for OSS need not threaten vendors; most vendors embrace and benefit from standards. Open-source development does not replace vendors but more clearly defines their roles, typically focusing on areas in which proprietary differentiators benefit customers and on professional services such as implementation planning and service. Continued support for OSS is essential for the success of our field.

  14. g-PRIME: A Free, Windows Based Data Acquisition and Event Analysis Software Package for Physiology in Classrooms and Research Labs.

    PubMed

    Lott, Gus K; Johnson, Bruce R; Bonow, Robert H; Land, Bruce R; Hoy, Ronald R

    2009-01-01

    We present g-PRIME, a software based tool for physiology data acquisition, analysis, and stimulus generation in education and research. This software was developed in an undergraduate neurophysiology course and strongly influenced by instructor and student feedback. g-PRIME is a free, stand-alone, windows application coded and "compiled" in Matlab (does not require a Matlab license). g-PRIME supports many data acquisition interfaces from the PC sound card to expensive high throughput calibrated equipment. The program is designed as a software oscilloscope with standard trigger modes, multi-channel visualization controls, and data logging features. Extensive analysis options allow real time and offline filtering of signals, multi-parameter threshold-and-window based event detection, and two-dimensional display of a variety of parameters including event time, energy density, maximum FFT frequency component, max/min amplitudes, and inter-event rate and intervals. The software also correlates detected events with another simultaneously acquired source (event triggered average) in real time or offline. g-PRIME supports parameter histogram production and a variety of elegant publication quality graphics outputs. A major goal of this software is to merge powerful engineering acquisition and analysis tools with a biological approach to studies of nervous system function.
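
    A minimal version of the threshold-based event detection described above can be written in a few lines: report an event at each upward threshold crossing and then skip a refractory window so a single spike is not counted twice. The Python sketch below is our own toy illustration on synthetic data, not the g-PRIME code.

      # Toy threshold-and-window event detection on a synthetic spike train.
      import numpy as np

      def detect_events(signal, threshold, window):
          events, i = [], 1
          while i < len(signal):
              if signal[i - 1] < threshold <= signal[i]:   # upward crossing
                  events.append(i)
                  i += window                              # skip the event window
              else:
                  i += 1
          return events

      rng = np.random.default_rng(1)
      trace = rng.normal(0.0, 0.1, 2000)
      for start in (300, 900, 1500):                       # three synthetic spikes
          trace[start:start + 20] += 1.0
      print(detect_events(trace, 0.5, 50))                 # [300, 900, 1500]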

  15. OpenROCS: a software tool to control robotic observatories

    NASA Astrophysics Data System (ADS)

    Colomé, Josep; Sanz, Josep; Vilardell, Francesc; Ribas, Ignasi; Gil, Pere

    2012-09-01

    We present the Open Robotic Observatory Control System (OpenROCS), an open-source software platform developed for the robotic control of telescopes. It acts as a software infrastructure that executes all the processes necessary to implement responses to the system events that arise in the routine and non-routine operations associated with data-flow and housekeeping control. The OpenROCS software design and implementation provide high flexibility for adaptation to different observatory configurations and event-action specifications. It is based on an abstract model that is independent of the specific hardware or software and is highly configurable. Interfaces to the system components are defined in a simple manner to achieve this goal. We give a detailed description of version 2.0 of this software, based on a modular architecture developed in PHP and XML configuration files, and using standard communication protocols to interface with applications for hardware monitoring and control, environment monitoring, scheduling of tasks, image processing and data quality control. We provide two examples of how it is used as the core element of the control system in two robotic observatories: the Joan Oró Telescope at the Montsec Astronomical Observatory (Catalonia, Spain) and the SuperWASP Qatar Telescope at the Roque de los Muchachos Observatory (Canary Islands, Spain).
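    OpenROCS itself is implemented in PHP with XML configuration files; the following Python fragment is only an illustration of the event-action pattern the abstract describes, with invented event names and handlers: system events are looked up in a configuration table and every registered action is executed.

```python
# Sketch of an event-action dispatcher of the kind described above.
from typing import Callable, Dict, List

class EventDispatcher:
    def __init__(self) -> None:
        self._actions: Dict[str, List[Callable[[dict], None]]] = {}

    def register(self, event: str, action: Callable[[dict], None]) -> None:
        """Attach an action to an event name (many actions per event allowed)."""
        self._actions.setdefault(event, []).append(action)

    def dispatch(self, event: str, payload: dict) -> None:
        """Run every action registered for the event, passing the event payload."""
        for action in self._actions.get(event, []):
            action(payload)

# hypothetical configuration: close the dome and abort exposures on a weather alarm
dispatcher = EventDispatcher()
dispatcher.register("weather.alarm", lambda p: print("closing dome:", p))
dispatcher.register("weather.alarm", lambda p: print("aborting exposure"))
dispatcher.dispatch("weather.alarm", {"wind_kmh": 72})
```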

  16. A software tool of digital tomosynthesis application for patient positioning in radiotherapy.

    PubMed

    Yan, Hui; Dai, Jian-Rong

    2016-03-08

    Digital tomosynthesis (DTS) is an imaging modality that reconstructs tomographic images from two-dimensional kV projections covering a narrow scan angle. Compared with conventional cone-beam CT (CBCT), it requires less time and radiation dose for data acquisition, so it is feasible to apply this technique to patient positioning in radiotherapy. To facilitate its clinical application, a software tool was developed and the reconstruction processes were accelerated by a graphics processing unit (GPU). Two reconstruction and two registration processes are required for the DTS application, in contrast to conventional CBCT, which requires one image reconstruction process and one image registration process. The reconstruction stage consists of the production of two types of DTS. One type is reconstructed from cone-beam (CB) projections covering a narrow scan angle and is named onboard DTS (ODTS); it represents the real patient position in the treatment room. The other type is reconstructed from digitally reconstructed radiographs (DRRs) and is named reference DTS (RDTS); it represents the ideal patient position in the treatment room. Prior to the reconstruction of RDTS, the DRRs are generated from the planning CT using the same acquisition settings as the CB projections. The registration stage consists of two matching processes between ODTS and RDTS. The target shifts in the lateral and longitudinal axes are obtained from the match between ODTS and RDTS in the coronal view, while the target shifts in the longitudinal and vertical axes are obtained from the match in the sagittal view. In this software, both the DRR and DTS reconstruction algorithms were implemented in a GPU environment for acceleration. A comprehensive evaluation of the software tool was performed, covering geometric accuracy, image quality, registration accuracy, and reconstruction efficiency. The average correlation coefficient between DRR/DTS images generated by the GPU-based algorithm and the CPU-based algorithm is 0.99. Based on measurements of a cube phantom on DTS, the geometric errors are within 0.5 mm in all three axes. For both the cube phantom and a pelvic phantom, the registration errors are within 0.5 mm in all three axes. Compared with the reconstruction performance of the CPU-based algorithms, the performance of the DRR and DTS reconstructions is improved by a factor of 15 to 20. A GPU-based software tool was developed for the DTS application to patient positioning in radiotherapy. The geometric and registration accuracy met the clinical requirements for patient setup. The high performance of the DRR and DTS reconstruction algorithms was achieved by the GPU-based computation environment. It is a useful software tool for researchers and clinicians evaluating DTS application to patient positioning in radiotherapy.
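    As an illustration of the registration step (not the published tool; the slices and shift values are synthetic), the in-plane translation between a reference DTS slice and an onboard DTS slice can be estimated by phase correlation; in the workflow above, the coronal match yields the lateral/longitudinal shifts and the sagittal match the longitudinal/vertical shifts.

```python
# Sketch: estimate a 2D translation between two slices by phase correlation.
import numpy as np

def phase_correlation_shift(ref, mov):
    """Estimated (row, col) displacement of `mov` relative to `ref`."""
    F_ref, F_mov = np.fft.fft2(ref), np.fft.fft2(mov)
    cross = F_mov * np.conj(F_ref)
    cross /= np.abs(cross) + 1e-12            # normalised cross-power spectrum
    corr = np.abs(np.fft.ifft2(cross))
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    shift = np.array(peak, dtype=float)
    dims = np.array(corr.shape)
    shift[shift > dims / 2] -= dims[shift > dims / 2]   # wrap to signed shifts
    return shift

# hypothetical coronal slices: the onboard slice is offset by (3, -5) pixels
ref = np.random.default_rng(0).random((128, 128))
mov = np.roll(np.roll(ref, 3, axis=0), -5, axis=1)
print(phase_correlation_shift(ref, mov))      # approximately [ 3. -5.]
```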

  17. Development of the quality assessment model of EHR software in family medicine practices: research based on user satisfaction.

    PubMed

    Kralj, Damir; Kern, Josipa; Tonkovic, Stanko; Koncar, Miroslav

    2015-09-09

    Family medicine practices (FMPs) form the basis of the Croatian health care system. Use of electronic health record (EHR) software is mandatory and plays an important role in running these practices, but important functional features remain uneven and largely left to the discretion of the software developers. The objective of this study was to develop a novel and comprehensive model for functional evaluation of EHR software in FMPs, based on current world standards, models and projects, as well as on actual user satisfaction and requirements. Based on previous theoretical and experimental research in this area, we built an initial framework model consisting of six basic categories as the basis for an online survey questionnaire. Family doctors assessed perceived software quality using a five-point Likert-type scale. Using exploratory factor analysis and appropriate statistical methods over the collected data, the final optimal structure of the novel model was formed. Special attention was paid to the validity and quality of the novel model. The online survey collected a total of 384 cases. The obtained results indicate both the quality of the assessed software and the quality in use of the novel model. The strong ergonomic orientation of the novel measurement model was particularly emphasised. The resulting model is validated in multiple ways, comprehensive and universal. It could be used to assess the user-perceived quality of almost all forms of ambulatory EHR software and is therefore useful to all stakeholders in this area of health care informatisation.
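    A hypothetical sketch of the analysis style used in the study: exploratory factor analysis over five-point Likert responses to group questionnaire items into quality categories. The simulated respondents, items, and loadings below are invented; only the general procedure is illustrated.

```python
# Sketch: exploratory factor analysis of simulated Likert-scale survey data.
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(0)
n_respondents, n_items = 384, 12
# simulate two latent quality dimensions driving the 12 Likert items
latent = rng.normal(size=(n_respondents, 2))
loadings = rng.uniform(0.3, 0.9, size=(2, n_items))
responses = latent @ loadings + rng.normal(scale=0.5, size=(n_respondents, n_items))
responses = np.clip(np.round(3 + responses), 1, 5)       # map onto a 1-5 scale

fa = FactorAnalysis(n_components=2, random_state=0)
fa.fit(responses)
print(np.round(fa.components_, 2))   # item loadings on the two retained factors
```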

  18. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. An umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  19. A framework for assessing the adequacy and effectiveness of software development methodologies

    NASA Technical Reports Server (NTRS)

    Arthur, James D.; Nance, Richard E.

    1990-01-01

    Tools, techniques, environments, and methodologies dominate the software engineering literature, but relatively little research in the evaluation of methodologies is evident. This work reports an initial attempt to develop a procedural approach to evaluating software development methodologies. Prominent in this approach are: (1) an explication of the role of a methodology in the software development process; (2) the development of a procedure based on linkages among objectives, principles, and attributes; and (3) the establishment of a basis for reduction of the subjective nature of the evaluation through the introduction of properties. An application of the evaluation procedure to two Navy methodologies has provided consistent results that demonstrate the utility and versatility of the evaluation procedure. Current research efforts focus on the continued refinement of the evaluation procedure through the identification and integration of product quality indicators reflective of attribute presence, and the validation of metrics supporting the measure of those indicators. The consequent refinement of the evaluation procedure offers promise of a flexible approach that admits to change as the field of knowledge matures. In conclusion, the procedural approach presented in this paper represents a promising path toward the end goal of objectively evaluating software engineering methodologies.

  20. Modernized build and test infrastructure for control software at ESO: highly flexible building, testing, and automatic quality practices for telescope control software

    NASA Astrophysics Data System (ADS)

    Pellegrin, F.; Jeram, B.; Haucke, J.; Feyrin, S.

    2016-07-01

    The paper describes the introduction of a new automated build and test infrastructure, based on the open-source software Jenkins, into the ESO Very Large Telescope control software to replace the preexisting in-house solution. A brief introduction to software quality practices is given, along with a description of the previous solution, its limitations, and new upcoming requirements. The modifications required to adopt the new system are described, how these were applied to the current software, and the results obtained. An overview of how the new system may be used in future projects is also presented.

  1. Improving the quality of care of patients with rheumatic disease using patient-centric electronic redesign software.

    PubMed

    Newman, Eric D; Lerch, Virginia; Billet, Jon; Berger, Andrea; Kirchner, H Lester

    2015-04-01

    Electronic health records (EHRs) are not optimized for chronic disease management. To improve the quality of care for patients with rheumatic disease, we developed electronic data capture, aggregation, display, and documentation software. The software integrated and reassembled information from the patient (via a touchscreen questionnaire), nurse, physician, and EHR into a series of actionable views. Core functions included trends over time, rheumatology-related demographics, and documentation for patient and provider. Quality measures collected included patient-reported outcomes, disease activity, and function. The software was tested and implemented in 3 rheumatology departments, and integrated into routine care delivery. Post-implementation evaluation measured adoption, efficiency, productivity, and patient perception. Over 2 years, 6,725 patients completed 19,786 touchscreen questionnaires. The software was adopted for use by 86% of patients and rheumatologists. Chart review and documentation time trended downward, and productivity increased by 26%. Patient satisfaction, activation, and adherence remained unchanged, although pre-implementation values were high. A strong correlation was seen between use of the software and disease control (weighted Pearson's correlation coefficient 0.5927, P = 0.0095), and a relative increase in patients with low disease activity of 3% per quarter was noted. We describe innovative software that aggregates, stores, and displays information vital to improving the quality of care for patients with chronic rheumatic disease. The software was well-adopted by patients and providers. Post-implementation, significant improvements in quality of care, efficiency of care, and productivity were demonstrated. Copyright © 2015 by the American College of Rheumatology.

  2. [A review on studies and applications of near infrared spectroscopy technique(NIRS) in detecting quality of hay].

    PubMed

    Ding, Wu-Rong; Gan, You-Min; Guo, Xu-Sheng; Yang, Fu-Yu

    2009-02-01

    The quality of hay directly affects both its price and livestock productivity. Many methods have been developed for detecting the quality of hay, and near infrared spectroscopy (NIRS) has been widely used because the detection process is fast, effective and nondestructive. In the present paper, the feasibility and effectiveness of applying NIRS to the detection of hay quality are reviewed. Advances in the use of NIRS to detect chemical composition, the extent of infestation by epiphytes, the amount of toxins excreted by endogenous epiphytes, and some trace components that cannot be detected by chemical methods are also introduced in detail. Based on this review of progress in using NIRS to detect hay quality, it can be concluded that NIRS avoids the time, complexity and high cost of traditional chemical methods. For better utilization of NIRS in practice, further studies are needed to perfect and improve its use for detecting forage quality, and more accurate models and systematic analysis software need to be established.

  3. Online tools for uncovering data quality issues in satellite-based global precipitation products

    NASA Astrophysics Data System (ADS)

    Liu, Z.; Heo, G.

    2015-12-01

    Accurate and timely global precipitation products are important to many applications such as flood forecasting, hydrological modeling, vector-borne disease research, and crop yield estimates. However, data quality issues such as biases and uncertainties are common in satellite-based precipitation products, and it is important to understand these issues in applications. In recent years, algorithms using multiple satellites and multiple sensors for satellite-based precipitation estimates have become popular, such as the TRMM (Tropical Rainfall Measuring Mission) Multi-satellite Precipitation Analysis (TMPA) and the latest Integrated Multi-satellitE Retrievals for GPM (IMERG). Studies show that data quality issues for multi-satellite and multi-sensor products can vary with space and time and can be difficult to summarize. Online tools can provide customized results for a given area of interest, allowing customized investigation or comparison of several precipitation products. Because downloading data and software is not required, online tools can facilitate precipitation product evaluation and comparison. In this presentation, we will present online tools to uncover data quality issues in satellite-based global precipitation products. Examples will be presented as well.

  4. Software Quality Assurance and Controls Standard

    DTIC Science & Technology

    2010-04-27

    Software Quality Assurance and Controls Standard. Sue Carroll, Principal Software Quality Analyst, SAS; John Walz, VP Technology and... What is the Software Life Cycle (SLC) process? What is in an SQA process? Where are SQA controls? What is the SQA standards history? What is changing in SQA?

  5. Simple solution to the medical instrumentation software problem

    NASA Astrophysics Data System (ADS)

    Leif, Robert C.; Leif, Suzanne B.; Leif, Stephanie H.; Bingue, E.

    1995-04-01

    Medical devices now include a substantial software component, which is both difficult and expensive to produce and maintain. Medical software must be developed according to `Good Manufacturing Practices', GMP. Good Manufacturing Practices, as specified by the FDA and ISO, require the definition of and compliance with a software process which ensures quality products by specifying a detailed method of software construction. The software process should be based on accepted standards. US Department of Defense software standards and technology can both facilitate the development and improve the quality of medical systems. We describe the advantages of employing Mil-Std-498, Software Development and Documentation, and the Ada programming language. Ada provides the very broad range of functionality, from embedded real-time systems to management information systems, required by many medical devices. It also includes advanced facilities for object-oriented programming and software engineering.

  6. [Instruments in Brazilian Sign Language for assessing the quality of life of the deaf population].

    PubMed

    Chaveiro, Neuma; Duarte, Soraya Bianca Reis; Freitas, Adriana Ribeiro de; Barbosa, Maria Alves; Porto, Celmo Celeno; Fleck, Marcelo Pio de Almeida

    2013-06-01

    To construct versions of the WHOQOL-BREF and WHOQOL-DIS instruments in Brazilian sign language to evaluate the Brazilian deaf population's quality of life. The methodology proposed by the World Health Organization (WHOQOL-BREF and WHOQOL-DIS) was used to construct instruments adapted to the deaf community using Brazilian Sign Language (Libras). The research for constructing the instrument took place in 13 phases: 1) creating the QUALITY OF LIFE sign; 2) developing the answer scales in Libras; 3) translation by a bilingual group; 4) synthesized version; 5) first back translation; 6) production of the version in Libras to be provided to the focal groups; 7) carrying out the focal groups; 8) review by a monolingual group; 9) revision by the bilingual group; 10) semantic/syntactic analysis and second back translation; 11) re-evaluation of the back translation by the bilingual group; 12) recording the version into the software; 13) developing the WHOQOL-BREF and WHOQOL-DIS software in Libras. Characteristics peculiar to the culture of the deaf population indicated the necessity of adapting the application methodology of focal groups composed of deaf people. The writing conventions of sign languages have not yet been consolidated, leading to difficulties in graphically registering the translation phases. Linguistic structures that caused major problems in translation were those that included idiomatic Portuguese expressions, for many of which there are no equivalent concepts between Portuguese and Libras. In the end, it was possible to create WHOQOL-BREF and WHOQOL-DIS software in Libras. The WHOQOL-BREF and the WHOQOL-DIS in Libras will allow the deaf to express themselves about their quality of life in an autonomous way, making it possible to investigate these issues more accurately.

  7. Programming Language Software For Graphics Applications

    NASA Technical Reports Server (NTRS)

    Beckman, Brian C.

    1993-01-01

    New approach reduces repetitive development of features common to different applications. High-level programming language and interactive environment with access to graphical hardware and software created by adding graphical commands and other constructs to standardized, general-purpose programming language, "Scheme". Designed for use in developing other software incorporating interactive computer-graphics capabilities into application programs. Provides alternative to programming entire applications in C or FORTRAN, specifically ameliorating design and implementation of complex control and data structures typifying applications with interactive graphics. Enables experimental programming and rapid development of prototype software, and yields high-level programs serving as executable versions of software-design documentation.

  8. Projecting manpower to attain quality

    NASA Technical Reports Server (NTRS)

    Rone, K. Y.

    1983-01-01

    In these days of soaring software costs it becomes increasingly important to properly manage a software development project. One element of the management task is the projection and tracking of the manpower required to perform the task. In addition, since the total cost of the task is directly related to the initial quality built into the software, it becomes a necessity to project the development manpower in a way that attains that quality. An approach to projecting and tracking manpower with quality in mind is described. The resulting model is useful as a projection tool but must be validated in order to be used as an on-going software cost engineering tool. A procedure is developed to facilitate the tracking of model projections and actual data to allow the model to be tuned. Finally, since the model must be used in an environment of overlapping development activities on a progression of software elements in development and maintenance, a manpower allocation model is developed for use in a steady-state development/maintenance environment.

  9. Automated delineation and characterization of watersheds for more than 3,000 surface-water-quality monitoring stations active in 2010 in Texas

    USGS Publications Warehouse

    Archuleta, Christy-Ann M.; Gonzales, Sophia L.; Maltby, David R.

    2012-01-01

    The U.S. Geological Survey (USGS), in cooperation with the Texas Commission on Environmental Quality, developed computer scripts and applications to automate the delineation of watershed boundaries and compute watershed characteristics for more than 3,000 surface-water-quality monitoring stations in Texas that were active during 2010. Microsoft Visual Basic applications were developed using ArcGIS ArcObjects to format the source input data required to delineate watershed boundaries. Several automated scripts and tools were developed or used to calculate watershed characteristics using Python, Microsoft Visual Basic, and the RivEX tool. Automated methods were augmented by the use of manual methods, including those done using ArcMap software. Watershed boundaries delineated for the monitoring stations are limited to the extent of the Subbasin boundaries in the USGS Watershed Boundary Dataset, which may not include the total watershed boundary from the monitoring station to the headwaters.

  10. Control of Space-Based Electron Beam Free Form Fabrication

    NASA Technical Reports Server (NTRS)

    Seifzer. W. J.; Taminger, K. M.

    2007-01-01

    Engineering a closed-loop control system for an electron beam welder for space-based additive manufacturing is challenging. For Earth- and space-based applications, components must work in a vacuum, and optical components become occluded by metal vapor deposition. For extraterrestrial applications, added components increase launch weight, complexity, and space flight certification effort. Here we present a software tool that closely couples path planning and E-beam parameter controls into the build process to increase flexibility. In an environment where data collection hinders real-time control, another approach is considered that will still yield a high-quality build.

  11. orthAgogue: an agile tool for the rapid prediction of orthology relations.

    PubMed

    Ekseth, Ole Kristian; Kuiper, Martin; Mironov, Vladimir

    2014-03-01

    The comparison of genes and gene products across species depends on high-quality tools to determine the relationships between gene or protein sequences from various species. Although some excellent applications are available and widely used, their performance leaves room for improvement. We developed orthAgogue: a multithreaded C application for high-speed estimation of homology relations in massive datasets, operated via a flexible and easy command-line interface. The orthAgogue software is distributed under the GNU license. The source code and binaries compiled for Linux are available at https://code.google.com/p/orthagogue/.

  12. WinHPC System Software | High-Performance Computing | NREL

    Science.gov Websites

    WinHPC System Software: learn about the software applications, tools, and toolchains available on the WinHPC system for industrial applications, including the Intel Compilers development tool and toolchain suite.

  13. SNAPSHOT: A MODERN, SUSTAINABLE HOLDUP MEASUREMENT SYSTEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rowe, Nathan C; Younkin, James R; Smith, Steven E

    2016-01-01

    SNAPSHOT is a software platform designed to eventually replace Holdup Measurement System 4 (HMS 4), which is the current state of the art for acquisition and analysis of nondestructive assay measurement data for in situ nuclear materials (holdup) in support of criticality safety and material control and accounting. HMS 4 is over 10 years old and is currently unsustainable due to hardware and software incompatibilities that have arisen from advances in detector electronics, primarily updates to multi-channel analyzers (MCAs), and both computer and handheld operating systems. SNAPSHOT is a complete redesign of HMS 4 that addresses the issue of compatibility with modern MCAs and operating systems and that is designed with a flexible architecture to support long-term sustainability. It also provides an updated and more user-friendly interface and is being developed under an NQA-1 software quality assurance (SQA) program to facilitate site acceptance for safety-related applications. This paper provides an overview of the SNAPSHOT project including details of the software development process, the SQA program, and the architecture designed to support sustainability.

  14. SPOT4 Operational Control Center (CMP)

    NASA Technical Reports Server (NTRS)

    Zaouche, G.

    1993-01-01

    CNES (France) is responsible for the development of a new generation of Operational Control Center (CMP) which will operate the new heliosynchronous remote sensing satellite (SPOT4). This Operational Control Center benefits greatly from the experience of the first generation of control centers and from recent advances in computer technology and standards. The CMP is designed for operating two satellites at the same time with a reduced pool of controllers. The architecture of this CMP is simple, robust, and flexible, since it is based on powerful distributed workstations interconnected through an Ethernet LAN. The application software uses modern and formal software engineering methods in order to improve quality and reliability and to facilitate maintenance. This software is table-driven so it can be easily adapted to other operational needs. Operation tasks are automated to the maximum extent, so that it is possible to operate the CMP automatically with very limited human intervention for supervision and decision making. This paper provides an overview of the SPOT4 mission and associated ground segment. It also details the CMP, its functions, and its software and hardware architecture.

  15. An intelligent maximum permissible exposure meter for safety assessments of laser radiation

    NASA Astrophysics Data System (ADS)

    Corder, D. A.; Evans, D. R.; Tyrer, J. R.

    1996-09-01

    There is frequently a need to make laser power or energy density measurements when determining whether radiation from a laser system exceeds the Maximum Permissible Exposure (MPE) as defined in BS EN 60825. This can be achieved using standard commercially available laser power or energy measurement equipment, but some of these have shortcomings when used in this application. Calculations must be performed by the user to compare the measured value to the MPE. The measurement and calculation procedure appears complex to the nonexpert who may be performing the assessment. A novel approach is described which uses purpose designed hardware and software to simplify the process. The hardware is optimized for measuring the relatively low powers associated with MPEs. The software runs on a Psion Series 3a palmtop computer. This reduces the cost and size of the system yet allows graphical and numerical presentation of data. Data output to other software running on PCs is also possible, enabling the instrument to be used as part of a quality system. Throughout the measurement process the opportunity for user error has been minimized by the hardware and software design.

  16. SIMULATION TOOL KIT FOR INDOOR AIR QUALITY AND INHALATION EXPOSURE (IAQX) VERSION 1.0 USER'S GUIDE

    EPA Science Inventory

    The User's Guide describes a Microsoft Windows-based indoor air quality (IAQ) simulation software package designated Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or IAQX for short. This software complements and supplements existing IAQ simulation programs and...

  17. Test Analysis Tools to Ensure Higher Quality of On-Board Real Time Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Boudillet, O.; Mescam, J.-C.; Dalemagne, D.

    2008-08-01

    EADS Astrium Space Transportation, at its Les Mureaux premises, is responsible for the onboard SW of the French M51 nuclear deterrent missile. Over 1 million lines of code, mostly in Ada, have also been developed there for the Automated Transfer Vehicle (ATV) onboard SW and for the flight control SW of the ARIANE 5 launcher that put the ATV into orbit. As part of the ATV SW, ASTRIUM ST has developed the first Category A SW ever qualified for a European space application. To ensure that all this embedded SW is developed to the highest quality and reliability levels, specific development tools have been designed to cover source code verification, automated validation testing, and complete target instruction coverage verification. Three such dedicated tools are presented here.

  18. Digital radiography: optimization of image quality and dose using multi-frequency software.

    PubMed

    Precht, H; Gerke, O; Rosendahl, K; Tingberg, A; Waaler, D

    2012-09-01

    New developments in processing of digital radiographs (DR), including multi-frequency processing (MFP), allow optimization of image quality and radiation dose. This is particularly promising in children as they are believed to be more sensitive to ionizing radiation than adults. To examine whether the use of MFP software reduces the radiation dose without compromising quality at DR of the femur in 5-year-old-equivalent anthropomorphic and technical phantoms. A total of 110 images of an anthropomorphic phantom were imaged on a DR system (Canon DR with CXDI-50 C detector and MLT[S] software) and analyzed by three pediatric radiologists using Visual Grading Analysis. In addition, 3,500 images taken of a technical contrast-detail phantom (CDRAD 2.0) provide an objective image-quality assessment. Optimal image-quality was maintained at a dose reduction of 61% with MLT(S) optimized images. Even for images of diagnostic quality, MLT(S) provided a dose reduction of 88% as compared to the reference image. Software impact on image quality was found significant for dose (mAs), dynamic range dark region and frequency band. By optimizing image processing parameters, a significant dose reduction is possible without significant loss of image quality.

  19. VoroMQA: Assessment of protein structure quality using interatomic contact areas.

    PubMed

    Olechnovič, Kliment; Venclovas, Česlovas

    2017-06-01

    In the absence of an experimentally determined protein structure, many biological questions can be addressed using computational structural models. However, the utility of protein structural models depends on their quality. Therefore, the estimation of the quality of predicted structures is an important problem. One approach to this problem is the use of knowledge-based statistical potentials. Such methods typically rely on the statistics of distances and angles of residue-residue or atom-atom interactions collected from experimentally determined structures. Here, we present VoroMQA (Voronoi tessellation-based Model Quality Assessment), a new method for the estimation of protein structure quality. Our method combines the idea of statistical potentials with the use of interatomic contact areas instead of distances. Contact areas, derived using Voronoi tessellation of the protein structure, are used to describe and seamlessly integrate both explicit interactions between protein atoms and implicit interactions of protein atoms with solvent. VoroMQA produces scores at atomic, residue, and global levels, all in the fixed range from 0 to 1. The method was tested on CASP data and compared to several other single-model quality assessment methods. VoroMQA showed strong performance in the recognition of the native structure and in structural model selection tests, thus demonstrating the efficacy of interatomic contact areas in estimating protein structure quality. The software implementation of VoroMQA is freely available as a standalone application and as a web server at http://bioinformatics.lt/software/voromqa. Proteins 2017; 85:1131-1145. © 2017 Wiley Periodicals, Inc.
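    The following is an invented illustration of a contact-area-weighted, knowledge-based score, not the VoroMQA method itself: the atom types, contact areas, and pseudo-energy table are placeholders, whereas VoroMQA derives contact areas from the Voronoi tessellation of the actual structure and uses statistically derived potentials.

```python
# Sketch: area-weighted pseudo-energy squashed into a 0-1 quality-like score.
import math

# hypothetical pseudo-energies per contact type (favourable contacts negative)
potential = {
    ("C", "C"): -0.20, ("C", "N"): -0.05, ("C", "O"): 0.10,
    ("N", "O"): -0.15, ("C", "solvent"): 0.25, ("N", "solvent"): -0.10,
}

def contact_score(contacts):
    """contacts: iterable of (type_a, type_b, area_A2); returns a score in (0, 1)."""
    total_area = sum(a for _, _, a in contacts)
    if total_area == 0:
        return 0.0
    energy = sum(potential.get(tuple(sorted((x, y))), 0.0) * a
                 for x, y, a in contacts) / total_area
    return 1.0 / (1.0 + math.exp(energy))     # lower energy -> higher score

contacts = [("C", "C", 12.5), ("C", "O", 4.1), ("N", "solvent", 22.0)]
print(round(contact_score(contacts), 3))
```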

  20. Quality Traceability System of Traditional Chinese Medicine Based on Two Dimensional Barcode Using Mobile Intelligent Technology.

    PubMed

    Cai, Yong; Li, Xiwen; Wang, Runmiao; Yang, Qing; Li, Peng; Hu, Hao

    2016-01-01

    Currently, chemical fingerprint comparison and analysis is mainly based on professional equipment and software, which is expensive and inconvenient. This study aims to integrate QR (Quick Response) codes with quality data and mobile intelligent technology to develop a convenient query terminal for tracing quality across the whole industrial chain of TCM (traditional Chinese medicine). Three herbal medicines were randomly selected and their two-dimensional (2D) barcode chemical fingerprints were constructed. A smartphone application (app) based on the Android system was developed to read the initial data of the 2D chemical barcodes and to compare multiple fingerprints from different batches of the same species or from different species. It was demonstrated that there were no significant differences between the original and the scanned TCM chemical fingerprints. Meanwhile, different TCM chemical fingerprint QR codes could be rendered in the same coordinate system, showing the differences intuitively. To distinguish variations between chemical fingerprints more directly, a linear interpolation angle cosine similarity algorithm (LIACSA) was proposed to obtain a similarity ratio. This study showed that QR codes can be used as an effective information carrier to transfer quality data. The smartphone application can rapidly read quality information from QR codes and convert the data into TCM chemical fingerprints.
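    A hedged sketch of the comparison idea (the LIACSA details in the paper may differ, and the fingerprints below are synthetic): resample two chromatographic fingerprints onto a common retention-time grid by linear interpolation and report their cosine similarity.

```python
# Sketch: cosine similarity of two fingerprints after linear interpolation.
import numpy as np

def fingerprint_similarity(t1, y1, t2, y2, n_points=512):
    """Cosine similarity of two fingerprints resampled onto a shared grid."""
    lo = max(t1.min(), t2.min())
    hi = min(t1.max(), t2.max())
    grid = np.linspace(lo, hi, n_points)
    f1 = np.interp(grid, t1, y1)
    f2 = np.interp(grid, t2, y2)
    return float(np.dot(f1, f2) / (np.linalg.norm(f1) * np.linalg.norm(f2)))

# hypothetical fingerprints of two batches (retention time in min, intensity)
t_a = np.linspace(0, 60, 200)
y_a = np.exp(-(t_a - 20) ** 2 / 4) + 0.6 * np.exp(-(t_a - 45) ** 2 / 9)
t_b = np.linspace(0, 60, 180)
y_b = np.exp(-(t_b - 20.5) ** 2 / 4) + 0.55 * np.exp(-(t_b - 45) ** 2 / 9)
print(round(fingerprint_similarity(t_a, y_a, t_b, y_b), 4))
```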

  1. Unisys' experience in software quality and productivity management of an existing system

    NASA Technical Reports Server (NTRS)

    Munson, John B.

    1988-01-01

    A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.

  2. Quality and standardization of telecommunication switching system software

    NASA Astrophysics Data System (ADS)

    Ranko, K.; Hivensaio, J.; Myllykangas, A.

    1981-12-01

    The purpose of this paper is to illustrate the quality and standardization of switching system software from the authors' point of view, with the aim of developing standardization in the user environment.

  3. Application of desktop computers in nuclear engineering education

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Graves, H.W. Jr.

    1990-01-01

    Utilization of desktop computers in the academic environment is based on the same objectives as in the industrial environment - increased quality and efficiency. Desktop computers can be extremely useful teaching tools in two general areas: classroom demonstrations and homework assignments. Although differences in emphasis exist, tutorial programs share many characteristics with interactive software developed for the industrial environment. In the Reactor Design and Fuel Management course at the University of Maryland, several interactive tutorial programs provided by Energy Analysis Software Service have been utilized. These programs have been designed to be sufficiently structured to permit an orderly, disciplined solution to the problem being solved, and yet be flexible enough to accommodate most problem solution options.

  4. Analysis of methods of processing of expert information by optimization of administrative decisions

    NASA Astrophysics Data System (ADS)

    Churakov, D. Y.; Tsarkova, E. G.; Marchenko, N. D.; Grechishnikov, E. V.

    2018-03-01

    This work proposes a methodology for defining measures for the expert estimation of the quality and reliability of application software products. Methods for aggregating expert estimates are described using the example of a collective choice of project tooling for the development of special-purpose software for the needs of institutions. Results from the operation of an interactive decision-support system are given, together with an algorithm for solving the selection task based on the analytic hierarchy process. The developed algorithm can be applied in the development of expert systems for solving a wide class of problems connected with multi-criteria choice.
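    A minimal sketch of the analytic hierarchy process step mentioned above: criterion weights are derived from a pairwise comparison matrix via its principal eigenvector, with a simple consistency check. The criteria and comparison values are invented for illustration.

```python
# Sketch: AHP criterion weights from a reciprocal pairwise comparison matrix.
import numpy as np

def ahp_weights(pairwise):
    """Principal-eigenvector weights plus consistency ratio for a 3x3 matrix."""
    vals, vecs = np.linalg.eig(pairwise)
    principal = np.real(vecs[:, np.argmax(np.real(vals))])
    weights = principal / principal.sum()
    lam = np.max(np.real(vals))
    ci = (lam - len(pairwise)) / (len(pairwise) - 1)   # consistency index
    return weights, ci / 0.58                          # random index RI = 0.58 for n = 3

# criteria: reliability vs. cost vs. delivery time (hypothetical judgments)
pairwise = np.array([[1.0, 3.0, 5.0],
                     [1/3, 1.0, 2.0],
                     [1/5, 1/2, 1.0]])
w, cr = ahp_weights(pairwise)
print(np.round(w, 3), round(cr, 3))
```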

  5. A Submersible Holographic Camera for the Undisturbed Characterization of Optically Relevant Particles in Water (HOLOCAM)

    DTIC Science & Technology

    2012-09-01

    how to improve both reconstruction and analytical software during testing of the submersible system. IMPACT AND APPLICATIONS: Quality of Life: ...project (see related projects below). It could also be used for sediment load monitoring and assessment. The HOLOCAM could provide critical data to any... Science Education and Communication: Currently the link between the suspended particle field and the bulk scattering properties of natural waters is

  6. ’Pushing a Big Rock Up a Steep Hill’: Acquisition Lessons Learned from DoD Applications Storefront

    DTIC Science & Technology

    2014-04-30

    automated delivery of software patches, web applications, widgets, and mobile application packages. The envisioned application store will deliver software from a central... mobile technologies, hoping to enhance warfighter situational awareness and access to information. Unfortunately, the Defense Acquisition System has not

  7. Organizational management practices for achieving software process improvement

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2004-01-01

    The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software on time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.

  8. Air Quality Monitor

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Stak-Tracker CEM (Continuous Emission Monitor) Gas Analyzer is an air quality monitor capable of separating the various gases in a bulk exhaust stream and determining the amounts of individual gases present within the stream. The monitor is produced by GE Reuter-Stokes, a subsidiary of GE Corporate Research & Development Center. The Stak-Tracker uses a Langley Research Center software package which measures the concentration of a target gas by determining the degree to which molecules of that gas absorb an infrared beam. The system is environmentally friendly, fast, and has relatively low installation and maintenance costs. It is applicable to gas turbines and various industries including glass, paper and cement.

  9. Proceedings of Tenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.

  10. Some issues in numerical simulation of nonlinear structural response

    NASA Technical Reports Server (NTRS)

    Hibbitt, H. D.

    1989-01-01

    The development of commercial finite element software is addressed. This software provides practical tools that are used in an astonishingly wide range of engineering applications that include critical aspects of the safety evaluation of nuclear power plants or of heavily loaded offshore structures in the hostile environments of the North Sea or the Arctic, major design activities associated with the development of airframes for high strength and minimum weight, thermal analysis of electronic components, and the design of sports equipment. In the more advanced application areas, the effectiveness of the product depends critically on the quality of the mechanics and mechanics-related algorithms that are implemented. Algorithmic robustness is of primary concern. The methods chosen should maximize reliability while requiring minimal understanding on the part of the user. Computational efficiency is also important because resources are always limited, and hence some problems are too time consuming or costly. Finally, some areas where research work will provide new methods and improvements are discussed.

  11. An optimized web-based approach for collaborative stereoscopic medical visualization

    PubMed Central

    Kaspar, Mathias; Parsad, Nigel M; Silverstein, Jonathan C

    2013-01-01

    Objective: Medical visualization tools have traditionally been constrained to tethered imaging workstations or proprietary client viewers, typically part of hospital radiology systems. To improve accessibility to real-time, remote, interactive, stereoscopic visualization and to enable collaboration among multiple viewing locations, we developed an open source approach requiring only a standard web browser with no added client-side software. Materials and Methods: Our collaborative, web-based, stereoscopic, visualization system, CoWebViz, has been used successfully for the past 2 years at the University of Chicago to teach immersive virtual anatomy classes. It is a server application that streams server-side visualization applications to client front-ends, comprised solely of a standard web browser with no added software. Results: We describe optimization considerations, usability, and performance results, which make CoWebViz practical for broad clinical use. We clarify technical advances including: enhanced threaded architecture, optimized visualization distribution algorithms, a wide range of supported stereoscopic presentation technologies, and the salient theoretical and empirical network parameters that affect our web-based visualization approach. Discussion: The implementations demonstrate usability and performance benefits of a simple web-based approach for complex clinical visualization scenarios. Using this approach overcomes technical challenges that require third-party web browser plug-ins, resulting in the most lightweight client. Conclusions: Compared to special software and hardware deployments, unmodified web browsers enhance remote user accessibility to interactive medical visualization. Whereas local hardware and software deployments may provide better interactivity than remote applications, our implementation demonstrates that a simplified, stable, client approach using standard web browsers is sufficient for high quality three-dimensional, stereoscopic, collaborative and interactive visualization. PMID:23048008

  12. Analysis of Single-Frequency GNSS Data for Determination of Time-Dependent Flow and Deformation of Fast-Moving Glaciers

    NASA Astrophysics Data System (ADS)

    Davis, J. L.; Elosegui, P.; Nettles, M.

    2012-12-01

    Single-frequency GNSS data has not generally been used for high-accuracy geodetic applications since the 1990s, but there are significant advantages if single-frequency GNSS receivers can be usefully deployed for studies of fast-moving outlet glaciers. The cost for these receivers is significantly lower (~50%) than for dual-frequency receivers, a significant benefit given the high spatial density at which these systems are deployed on the glacier and the high risk for damage or loss in the glacial environment. In addition, the size of the data files that need to be transferred from extremely remote locations, often at very slow transmission rates, is significantly reduced. Consideration of single-frequency systems for this application is viable because of the relatively small extent (< 50 km) of the entire network to be deployed. Unfortunately, the availability of research-quality software that can perform kinematic solutions on single-frequency data is limited. We have developed the BAKAR software employing a stochastic filter to analyze single-frequency GNSS data. The software can implement a range of stochastic models for time-dependent site position. In this presentation, we describe the BAKAR software, and discuss its strengths and weaknesses. On one hand, chief among the challenges we have encountered are determination of accurate prior positions, and bursts of polar ionospheric activity that impede cycle-slip detection, even over intersite distances as short as 10 km. On the other hand, use of a single-frequency observable is theoretically less sensitive to multipath and signal scattering. We will quantitatively assess these effects, and assess the accuracy of BAKAR in a range of situations and applications.
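    A hedged sketch of a stochastic filter for time-dependent site position, not the BAKAR software: a one-dimensional random-walk Kalman filter applied to noisy position estimates. The motion, noise levels, and variances below are invented.

```python
# Sketch: scalar random-walk Kalman filter for a time series of position estimates.
import numpy as np

def random_walk_filter(obs, obs_sigma, process_sigma, x0=0.0, p0=1.0):
    """Return filtered positions for a scalar random-walk state model."""
    x, p = x0, p0
    out = []
    for z in obs:
        p += process_sigma ** 2                 # predict: random-walk variance growth
        k = p / (p + obs_sigma ** 2)            # Kalman gain
        x += k * (z - x)                        # update with the observation
        p *= (1.0 - k)
        out.append(x)
    return np.array(out)

# hypothetical along-flow positions of a glacier marker (metres per epoch)
rng = np.random.default_rng(1)
truth = np.cumsum(np.full(200, 0.15))           # steady 0.15 m/epoch motion
obs = truth + rng.normal(scale=0.05, size=truth.size)
filtered = random_walk_filter(obs, obs_sigma=0.05, process_sigma=0.2)
print(float(np.abs(filtered - truth).mean()))   # mean absolute filtering error
```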

  13. Current perspectives of CASA applications in diverse mammalian spermatozoa.

    PubMed

    van der Horst, Gerhard; Maree, Liana; du Plessis, Stefan S

    2018-03-26

    Since the advent of computer-aided sperm analysis (CASA) some four decades ago, advances in computer technology and software algorithms have helped establish it as a research and diagnostic instrument for the analysis of spermatozoa. Despite mammalian spermatozoa being the most diverse cell type known, CASA is a great tool that has the capacity to provide rapid, reliable and objective quantitative assessment of sperm quality. This paper provides contemporary research findings illustrating the scientific and commercial applications of CASA and its ability to evaluate diverse mammalian spermatozoa (human, primates, rodents, domestic mammals, wildlife species) at both structural and functional levels. The potential of CASA to quantitatively measure essential aspects related to sperm subpopulations, hyperactivation, morphology and morphometry is also demonstrated. Furthermore, applications of CASA are provided for improved mammalian sperm quality assessment, evaluation of sperm functionality and the effect of different chemical substances or pathologies on sperm fertilising ability. It is clear that CASA has evolved significantly and is currently superior to many manual techniques in the research and clinical setting.

  14. Forensic-metrological considerations on assessment of compliance (or non-compliance) in forensic blood alcohol content determinations: A case study with software application.

    PubMed

    Zamengo, Luca; Frison, Giampietro; Tedeschi, Gianpaola; Frasson, Samuela

    2016-08-01

    Blood alcohol concentration is the most frequent analytical determination carried out in forensic toxicology laboratories worldwide. It is usually required to assess whether an offence has been committed by comparing blood alcohol levels with specified legal limits, which can vary widely among countries. Due to the possibly serious legal consequences associated with non-compliant alcohol levels, measurement uncertainty should be carefully evaluated, along with other metrological aspects which can influence the final result. The whole procedure can be time-consuming and error-prone in routine practice, increasing the risk of unreliable assessments. A software application named Ethanol WorkBook (EtWB) was developed at the authors' laboratory using Visual Basic for Applications and MS Excel(®), with the aim of helping forensic analysts involved in blood alcohol determinations. The program can (i) calculate measurement uncertainties and decision limits with different methodologies; (ii) assess compliance with specification limits using a guard-band approach; (iii) manage quality control (QC) data and create control charts for QC samples; (iv) create control maps from real case data archives; (v) provide laboratory reports with graphical outputs for elaborated data and (vi) create comprehensive searchable case archives. A typical example of a drink-driving case is presented and discussed to illustrate the importance of a metrological approach for reliable compliance assessment and to demonstrate the software's application in routine practice. The tool is made freely available to the scientific community on request. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
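    The guard-band idea can be illustrated with a short, hypothetical sketch (the limits, uncertainty, and coverage factor are invented and do not reproduce EtWB's methodology): a result is declared non-compliant only when it exceeds the legal limit by more than k times the standard uncertainty.

```python
# Sketch: guard-band compliance assessment against a legal limit.
def assess_compliance(result, legal_limit, std_uncertainty, k=1.64):
    """Return the decision limit and a compliance verdict.

    k = 1.64 corresponds to roughly 95 % one-sided confidence under a Gaussian
    error model; laboratories may adopt other coverage factors.
    """
    decision_limit = legal_limit + k * std_uncertainty
    if result > decision_limit:
        verdict = "non-compliant beyond the guard band"
    elif result > legal_limit:
        verdict = "above the limit but within the guard band"
    else:
        verdict = "compliant"
    return decision_limit, verdict

# hypothetical case: 0.53 g/L measured, 0.50 g/L legal limit, u = 0.015 g/L
print(assess_compliance(0.53, 0.50, 0.015))
```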

  15. Usability Prediction & Ranking of SDLC Models Using Fuzzy Hierarchical Usability Model

    NASA Astrophysics Data System (ADS)

    Gupta, Deepak; Ahlawat, Anil K.; Sagar, Kalpna

    2017-06-01

    Evaluation of software quality is an important aspect of controlling and managing software. Through such evaluation, improvements in the software process can be made. Software quality is significantly dependent on software usability. Many researchers have proposed a number of usability models. Each model considers a set of usability factors but does not cover all usability aspects. Practical implementation of these models is still missing, as there is a lack of a precise definition of usability. Also, it is very difficult to integrate these models into current software engineering practices. In order to overcome these challenges, this paper aims to define the term `usability' using the proposed hierarchical usability model with its detailed taxonomy. The taxonomy considers generic evaluation criteria for identifying the quality components, bringing together factors, attributes and characteristics defined in various HCI and software models. For the first time, the usability model is also implemented to predict more accurate usability values. The proposed system is named the fuzzy hierarchical usability model and can be easily integrated into current software engineering practices. In order to validate the work, a dataset of six software development life cycle models is created and employed. These models are ranked according to their predicted usability values. This research also focuses on a detailed comparison of the proposed model with existing usability models.
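    An invented sketch, not the authors' model: linguistic usability ratings of SDLC models are converted into triangular fuzzy numbers, aggregated with factor weights, and ranked by the defuzzified (centroid) score. The factors, weights, and ratings are placeholders.

```python
# Sketch: fuzzy aggregation and ranking of SDLC models by usability.
import numpy as np

FUZZY = {                     # triangular fuzzy numbers (low, mid, high)
    "poor":      (0.0, 0.1, 0.3),
    "fair":      (0.2, 0.4, 0.6),
    "good":      (0.5, 0.7, 0.9),
    "excellent": (0.7, 0.9, 1.0),
}

def usability_score(ratings, weights):
    """Weighted centroid of triangular fuzzy ratings (weights sum to 1)."""
    tri = np.array([FUZZY[r] for r in ratings])            # shape (n_factors, 3)
    agg = (np.asarray(weights)[:, None] * tri).sum(axis=0)  # aggregated fuzzy number
    return float(agg.mean())                                 # centroid = (a + b + c) / 3

factors = ["learnability", "efficiency", "satisfaction"]
weights = [0.4, 0.35, 0.25]
sdlc_models = {
    "waterfall": ["fair", "good", "fair"],
    "agile":     ["good", "excellent", "good"],
}
ranking = sorted(sdlc_models,
                 key=lambda m: usability_score(sdlc_models[m], weights),
                 reverse=True)
print(ranking)
```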

  16. A new full-field digital mammography system with and without the use of an advanced post-processing algorithm: comparison of image quality and diagnostic performance.

    PubMed

    Ahn, Hye Shin; Kim, Sun Mi; Jang, Mijung; Yun, Bo La; Kim, Bohyoung; Ko, Eun Sook; Han, Boo-Kyung; Chang, Jung Min; Yi, Ann; Cho, Nariya; Moon, Woo Kyung; Choi, Hye Young

    2014-01-01

    To compare new full-field digital mammography (FFDM) with and without use of an advanced post-processing algorithm to improve image quality, lesion detection, diagnostic performance, and priority rank. During a 22-month period, we prospectively enrolled 100 cases of specimen FFDM mammography (Brestige®), which was performed alone or in combination with a post-processing algorithm developed by the manufacturer: group A (SMA), specimen mammography without application of "Mammogram enhancement ver. 2.0"; group B (SMB), specimen mammography with application of "Mammogram enhancement ver. 2.0". Two sets of specimen mammographies were randomly reviewed by five experienced radiologists. Image quality, lesion detection, diagnostic performance, and priority rank with regard to image preference were evaluated. Three aspects of image quality (overall quality, contrast, and noise) of the SMB were significantly superior to those of SMA (p < 0.05). SMB was significantly superior to SMA for visualizing calcifications (p < 0.05). Diagnostic performance, as evaluated by cancer score, was similar between SMA and SMB. SMB was preferred to SMA by four of the five reviewers. The post-processing algorithm may improve image quality with better image preference in FFDM than without use of the software.

  17. Application of microscopy technique and high performance liquid chromatography for quality assessment of Polygonum multiflorum Thunb. (Heshouwu)

    PubMed Central

    Liang, Li; Zhao, Zhongzhen; Kang, Tingguo

    2014-01-01

    Background: The technique of microscopy has been applied for identification of Chinese materia medica (CMM) for decades. However, very few scientific publications report the combination of conventional microscopy and high performance liquid chromatography (HPLC) techniques for further application to quality assessment of CMM. Objective: The objective of this study is to analyze the quality of the dried root tuber of Polygonum multiflorum Thunb. (Heshouwu) and to establish the relationships between 2,3,5,4’-tetrahydroxystilbene-2-O-β-glucoside, combined anthraquinone (CAQ) and the quantity of clusters of calcium oxalate. Materials and Methods: In this study, microscopy and HPLC techniques were applied to assess the quality of P. multiflorum Thunb., and SPSS software was used to establish the relationship between microscopic characteristics and chemical components. Results: The results showed close and direct correlations between the quantity of clusters of calcium oxalate in P. multiflorum Thunb. and the contents of 2,3,5,4’-tetrahydroxystilbene-2-O-β-glucoside and CAQ. From these results, it can be deduced that Polygoni Multiflori Radix with a higher quantity of clusters of calcium oxalate should be of better quality. Conclusion: The established method can be helpful for evaluating the quality of CMM based upon the identification and quantitation of the chemical and ergastic substances of cells. PMID:25422540

  18. Spectrum analysis on quality requirements consideration in software design documents.

    PubMed

    Kaiya, Haruhiko; Umemura, Masahiro; Ogata, Shinpei; Kaijiri, Kenji

    2013-12-01

    Software quality requirements defined in the requirements analysis stage should be implemented in the final products, such as source code and system deployment. To guarantee this meta-requirement, quality requirements should be considered in the intermediate stages, such as the design stage or the architectural definition stage. We propose a novel method for checking whether quality requirements are considered in the design stage. In this method, a technique called "spectrum analysis for quality requirements" is applied not only to requirements specifications but also to design documents. The technique enables us to derive the spectrum of a document, and quality requirements considerations in the document are numerically represented in the spectrum. We can thus objectively identify whether the considerations of quality requirements in a requirements document are carried over into its design document. To validate the method, we applied it to commercial software systems with the help of a supporting tool, and we confirmed that the method worked well.
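    A hypothetical sketch of the spectrum idea: represent a document by the normalised frequencies of quality-related terms and compare the spectra of a requirements document and a design document. The term list, documents, and distance measure are invented for illustration.

```python
# Sketch: quality-term "spectrum" of a document and a simple spectrum distance.
import re
from collections import Counter

QUALITY_TERMS = ["security", "performance", "usability", "reliability",
                 "availability", "maintainability"]

def quality_spectrum(text):
    """Normalised frequencies of the quality terms found in the text."""
    words = Counter(re.findall(r"[a-z]+", text.lower()))
    counts = [words[t] for t in QUALITY_TERMS]
    total = sum(counts) or 1
    return [c / total for c in counts]

def spectrum_distance(spec_a, spec_b):
    """L1 distance between two spectra (0 means identical emphasis)."""
    return sum(abs(a - b) for a, b in zip(spec_a, spec_b))

requirements = "The system shall ensure security and availability; performance targets apply."
design = "The design addresses performance through caching and improves usability."
print(quality_spectrum(requirements))
print(spectrum_distance(quality_spectrum(requirements), quality_spectrum(design)))
```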

  19. Student satisfaction and loyalty in Denmark: Application of EPSI methodology

    PubMed Central

    Shahsavar, Tina

    2017-01-01

    Monitoring and managing customer satisfaction are key to benefiting from today's competitive environment. In the higher education context, only a few studies are available on the satisfaction and loyalty of the main customers, the students, which signifies the need to investigate the field more thoroughly. The aim of this research is to measure the strength of the determinants of students' satisfaction and the importance of the antecedents of students' satisfaction and loyalty in Denmark. Our research model is a modification of the European Performance Satisfaction Index (EPSI), which takes into account the direct effects of the university's image on students' expectations, from the students' perspective. The structural equation model of student satisfaction and loyalty was evaluated using partial least squares path modelling. Our findings confirm that the EPSI framework is applicable to student satisfaction and loyalty among Danish universities. We show that all the relationships among the variables of the research model are significant except the relationship between quality of software and students' loyalty. The results further verify the significance of the antecedents of students' satisfaction and loyalty at Danish universities; the university image and student satisfaction are antecedents of student loyalty with a significant direct effect, while perceived value, quality of hardware, quality of software, expectations, and university image are antecedents of student satisfaction. Eventually, our findings may serve as inspiration for maintaining and improving students' experiences during their study at the university. Dedicating resources to the important factors identified from students' perceptions enables universities to attract more students and make them highly satisfied and loyal. PMID:29240801

  20. Clinical software development for the Web: lessons learned from the BOADICEA project

    PubMed Central

    2012-01-01

    Background In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. Results We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. Conclusions We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback. PMID:22490389

  1. Clinical software development for the Web: lessons learned from the BOADICEA project.

    PubMed

    Cunningham, Alex P; Antoniou, Antonis C; Easton, Douglas F

    2012-04-10

    In the past 20 years, society has witnessed the following landmark scientific advances: (i) the sequencing of the human genome, (ii) the distribution of software by the open source movement, and (iii) the invention of the World Wide Web. Together, these advances have provided a new impetus for clinical software development: developers now translate the products of human genomic research into clinical software tools; they use open-source programs to build them; and they use the Web to deliver them. Whilst this open-source component-based approach has undoubtedly made clinical software development easier, clinical software projects are still hampered by problems that traditionally accompany the software process. This study describes the development of the BOADICEA Web Application, a computer program used by clinical geneticists to assess risks to patients with a family history of breast and ovarian cancer. The key challenge of the BOADICEA Web Application project was to deliver a program that was safe, secure and easy for healthcare professionals to use. We focus on the software process, problems faced, and lessons learned. Our key objectives are: (i) to highlight key clinical software development issues; (ii) to demonstrate how software engineering tools and techniques can facilitate clinical software development for the benefit of individuals who lack software engineering expertise; and (iii) to provide a clinical software development case report that can be used as a basis for discussion at the start of future projects. We developed the BOADICEA Web Application using an evolutionary software process. Our approach to Web implementation was conservative and we used conventional software engineering tools and techniques. The principal software development activities were: requirements, design, implementation, testing, documentation and maintenance. The BOADICEA Web Application has now been widely adopted by clinical geneticists and researchers. BOADICEA Web Application version 1 was released for general use in November 2007. By May 2010, we had > 1200 registered users based in the UK, USA, Canada, South America, Europe, Africa, Middle East, SE Asia, Australia and New Zealand. We found that an evolutionary software process was effective when we developed the BOADICEA Web Application. The key clinical software development issues identified during the BOADICEA Web Application project were: software reliability, Web security, clinical data protection and user feedback.

  2. QRev—Software for computation and quality assurance of acoustic doppler current profiler moving-boat streamflow measurements—Technical manual for version 2.8

    USGS Publications Warehouse

    Mueller, David S.

    2016-06-21

    The software program QRev applies common and consistent computational algorithms, combined with automated filtering and quality assessment of the data, to improve the quality and efficiency of streamflow measurements. It helps ensure that U.S. Geological Survey streamflow measurements are consistent, accurate, and independent of the manufacturer of the instrument used to make the measurement. Software from different manufacturers uses different algorithms for various aspects of the data processing and discharge computation. The algorithms used by QRev to filter data, interpolate data, and compute discharge are documented and compared to the algorithms used in the manufacturers’ software. QRev applies consistent algorithms and creates a data structure that is independent of the data source. QRev saves an extensible markup language (XML) file that can be imported into databases or electronic field notes software. This report is the technical manual for version 2.8 of QRev.
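
    The abstract does not spell out the discharge algorithm; the sketch below shows the cross-product formulation generally used for moving-boat ADCP discharge (water velocity crossed with boat velocity, integrated over depth cells and ensembles), with invented placeholder values. It is not QRev's actual implementation.

```python
# Sketch of the cross-product discharge computation commonly used for
# moving-boat ADCP measurements: per depth cell,
# dQ = (u_water * v_boat - v_water * u_boat) * cell_height * dt,
# summed over cells and ensembles. All numbers are invented placeholders.
import numpy as np

n_ensembles, n_cells = 5, 4
dt = 1.0                                         # s between ensembles
cell_height = 0.25                               # m per depth cell
u_water = np.full((n_ensembles, n_cells), 0.8)   # east water velocity, m/s
v_water = np.full((n_ensembles, n_cells), 0.1)   # north water velocity, m/s
u_boat = np.full(n_ensembles, 0.2)               # east boat velocity, m/s
v_boat = np.full(n_ensembles, 1.0)               # north boat velocity, m/s

cross = u_water * v_boat[:, None] - v_water * u_boat[:, None]
measured_q = np.sum(cross * cell_height * dt)    # discharge through the measured area
print(f"discharge through measured portion: {measured_q:.2f} m^3/s")
```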

  3. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
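
    A minimal sketch of the kind of multivariate regression described above: error density is regressed on a few design metrics (interconnectivity, visibility, reuse, change counts). The metric values and coefficients below are invented placeholders, not the project data.

```python
# Fit a multivariate linear model of error density on design metrics and
# report the fraction of variation explained (R^2). Data are invented.
import numpy as np

# columns: imports per unit, fraction of visible declarations,
# fraction of reused code, number of changes
X = np.array([[ 4, 0.30, 0.50,  5],
              [ 9, 0.55, 0.20, 12],
              [ 6, 0.40, 0.35,  8],
              [12, 0.70, 0.10, 20],
              [ 3, 0.25, 0.60,  4],
              [10, 0.60, 0.15, 15]], dtype=float)
error_density = np.array([1.2, 3.4, 2.0, 4.8, 0.9, 3.9])   # errors per KSLOC

X1 = np.column_stack([np.ones(len(X)), X])
beta, *_ = np.linalg.lstsq(X1, error_density, rcond=None)
pred = X1 @ beta
ss_res = np.sum((error_density - pred) ** 2)
ss_tot = np.sum((error_density - error_density.mean()) ** 2)
print("coefficients:", beta.round(3))
print("R^2:", round(1 - ss_res / ss_tot, 3))
```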

  4. The evaluation of a 2D diode array in "magic phantom" for use in high dose rate brachytherapy pretreatment quality assurance.

    PubMed

    Espinoza, A; Petasecca, M; Fuduli, I; Howie, A; Bucci, J; Corde, S; Jackson, M; Lerch, M L F; Rosenfeld, A B

    2015-02-01

    High dose rate (HDR) brachytherapy is a treatment method that is used increasingly worldwide. The development of a sound quality assurance program for the verification of treatment deliveries can be challenging due to the high source activity utilized and the need for precise measurements of dwell positions and times. This paper describes the application of a novel phantom, based on a 2D 11 × 11 diode array detection system, named "magic phantom" (MPh), to accurately measure plan dwell positions and times, compare them directly to the treatment plan, determine errors in treatment delivery, and calculate absorbed dose. The magic phantom system was CT scanned and a 20 catheter plan was generated to simulate a nonspecific treatment scenario. This plan was delivered to the MPh and, using a custom developed software suite, the dwell positions and times were measured and compared to the plan. The original plan was also modified, with changes not disclosed to the primary authors, and measured again using the device and software to determine the modifications. A new metric, the "position-time gamma index," was developed to quantify the quality of a treatment delivery when compared to the treatment plan. The MPh was evaluated to determine the minimum measurable dwell time and step size. The incorporation of the TG-43U1 formalism directly into the software allows for dose calculations to be made based on the measured plan. The estimated dose distributions calculated by the software were compared to the treatment plan and to calibrated EBT3 film, using the 2D gamma analysis method. For the original plan, the magic phantom system was capable of measuring all dwell points and dwell times and the majority were found to be within 0.93 mm and 0.25 s, respectively, from the plan. By measuring the altered plan and comparing it to the unmodified treatment plan, the use of the position-time gamma index showed that all modifications made could be readily detected. The MPh was able to measure dwell times down to 0.067 ± 0.001 s and planned dwell positions separated by 1 mm. The dose calculation carried out by the MPh software was found to be in agreement with values calculated by the treatment planning system within 0.75%. Using the 2D gamma index, the dose map of the MPh plane and measured EBT3 were found to have a pass rate of over 95% when compared to the original plan. The application of this magic phantom quality assurance system to HDR brachytherapy has demonstrated promising ability to perform the verification of treatment plans, based upon the measured dwell positions and times. The introduction of the quantitative position-time gamma index allows for direct comparison of measured parameters against the plan and could be used prior to patient treatment to ensure accurate delivery. © 2015 American Association of Physicists in Medicine.
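
    The exact formulation of the position-time gamma index is not given in the abstract; by analogy with the dose gamma index, the sketch below combines position and time deviations against separate tolerances and takes the minimum over planned dwells. The tolerances and dwell values are illustrative assumptions.

```python
# Hedged sketch of a "position-time gamma index": for each measured dwell,
# take the minimum combined, tolerance-scaled position/time deviation over
# all planned dwells; values <= 1 count as passing.
import numpy as np

def position_time_gamma(measured, planned, tol_pos=1.0, tol_time=0.2):
    """measured, planned: sequences of (position_mm, dwell_time_s)."""
    gammas = []
    for pos_m, t_m in measured:
        g = min(np.hypot((pos_m - pos_p) / tol_pos, (t_m - t_p) / tol_time)
                for pos_p, t_p in planned)
        gammas.append(g)
    return np.array(gammas)

planned  = [(0.0, 5.0), (5.0, 5.0), (10.0, 3.0)]
measured = [(0.4, 5.1), (5.6, 4.9), (10.2, 3.4)]   # invented example values
gamma = position_time_gamma(measured, planned)
print("gamma values:", gamma.round(2), "pass rate:", np.mean(gamma <= 1.0))
```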

  5. Landsat surface reflectance quality assurance extraction (version 1.7)

    USGS Publications Warehouse

    Jones, J.W.; Starbuck, M.J.; Jenkerson, Calli B.

    2013-01-01

    The U.S. Geological Survey (USGS) Land Remote Sensing Program is developing an operational capability to produce Climate Data Records (CDRs) and Essential Climate Variables (ECVs) from the Landsat Archive to support a wide variety of science and resource management activities from regional to global scale. The USGS Earth Resources Observation and Science (EROS) Center is charged with prototyping systems and software to generate these high-level data products. Various USGS Geographic Science Centers are charged with particular ECV algorithm development and (or) selection as well as the evaluation and application demonstration of various USGS CDRs and ECVs. Because it is a foundation for many other ECVs, the first CDR in development is the Landsat Surface Reflectance Product (LSRP). The LSRP incorporates data quality information in a bit-packed structure that is not readily accessible without postprocessing services performed by the user. This document describes two general methods of LSRP quality-data extraction for use in image processing systems. Helpful hints for the installation and use of software originally developed for manipulation of Hierarchical Data Format (HDF) produced through the National Aeronautics and Space Administration (NASA) Earth Observing System are first provided for users who wish to extract quality data into separate HDF files. Next, steps follow to incorporate these extracted data into an image processing system. Finally, an alternative example is illustrated in which the data are extracted within a particular image processing system.
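
    Extracting quality flags from a bit-packed band typically comes down to bitwise shifts and masks; the sketch below shows this with NumPy on a toy raster. The bit positions used are illustrative assumptions, not the documented LSRP layout.

```python
# Unpack individual flags from a bit-packed quality-assurance band.
# Bit positions (cloud = bit 1, water = bit 2) are illustrative assumptions.
import numpy as np

qa = np.array([[0b0000, 0b0010],
               [0b0110, 0b0000]], dtype=np.uint16)   # toy 2x2 QA raster

def unpack_bit(band, bit):
    """Return a boolean mask for a single QA bit."""
    return ((band >> bit) & 1).astype(bool)

cloud_mask = unpack_bit(qa, 1)
water_mask = unpack_bit(qa, 2)
print("cloud:\n", cloud_mask)
print("water:\n", water_mask)
```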

  6. Software Quality Control at Belle II

    NASA Astrophysics Data System (ADS)

    Ritter, M.; Kuhr, T.; Hauth, T.; Gebard, T.; Kristof, M.; Pulvermacher, C.; Belle Software Group, II

    2017-10-01

    Over the last seven years the software stack of the next generation B factory experiment Belle II has grown to over one million lines of C++ and Python code, counting only the part included in offline software releases. There are several thousand commits to the central repository by about 100 individual developers per year. Keeping a coherent software stack of high quality, so that it can be sustained and used efficiently for data acquisition, simulation, reconstruction, and analysis over the lifetime of the Belle II experiment, is a challenge. A set of tools is employed to monitor the quality of the software and provide fast feedback to the developers. They are integrated in a machinery that is controlled by a buildbot master and automates the quality checks. The tools include different compilers, cppcheck, the clang static analyzer, valgrind memcheck, doxygen, a geometry overlap checker, a check for missing or extra library links, unit tests, steering file level tests, a sophisticated high-level validation suite, and an issue tracker. The technological development infrastructure is complemented by organizational means to coordinate the development.
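
    A minimal sketch of an automated quality-check step of the kind listed above: run a static analyzer and the unit tests, collect exit codes, and fail the build on any error. The commands and paths are hypothetical, and this is not the Belle II buildbot configuration.

```python
# Run a set of quality checks as subprocesses and report pass/fail per check.
import subprocess

CHECKS = {
    "cppcheck":   ["cppcheck", "--enable=warning,style", "--error-exitcode=1", "src/"],
    "unit tests": ["ctest", "--output-on-failure"],
}

def run_checks(checks):
    results = {}
    for name, cmd in checks.items():
        proc = subprocess.run(cmd, capture_output=True, text=True)
        results[name] = proc.returncode
        print(f"{name}: {'OK' if proc.returncode == 0 else 'FAILED'}")
    return results

if __name__ == "__main__":
    failures = [name for name, rc in run_checks(CHECKS).items() if rc != 0]
    raise SystemExit(1 if failures else 0)
```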

  7. Software for improving the quality of project management, a case study: international manufacture of electrical equipment

    NASA Astrophysics Data System (ADS)

    Preradović, D. M.; Mićić, Lj S.; Barz, C.

    2017-05-01

    Production conditions in today’s world require software support at every stage of production and development of new products, for quality assurance and compliance with ISO standards. In addition to ISO standards as the usual quality metrics, companies today focus on other optional standards, such as CMMI (Capability Maturity Model Integration), or prescribe their own standards. However, while intensive progress is being made in project management (PM), a significant number of projects at the global level still fail, not achieving their goals within budget or timeframe. This paper focuses on checking the role of software tools through the rate of success in projects implemented in the case of internationally manufactured electrical equipment. The results of this research show the level of contribution of the project management software used to manage and develop new products to improve PM processes and PM functions, and how the selection of software tools affects the quality of PM processes and successfully completed projects.

  8. Mindcontrol: A web application for brain segmentation quality control.

    PubMed

    Keshavan, Anisha; Datta, Esha; M McDonough, Ian; Madan, Christopher R; Jordan, Kesshi; Henry, Roland G

    2018-04-15

    Tissue classification plays a crucial role in the investigation of normal neural development, brain-behavior relationships, and the disease mechanisms of many psychiatric and neurological illnesses. Ensuring the accuracy of tissue classification is important for quality research and, in particular, the translation of imaging biomarkers to clinical practice. Assessment with the human eye is vital to correct various errors inherent to all currently available segmentation algorithms. Manual quality assurance becomes methodologically difficult at a large scale - a problem of increasing importance as the number of data sets is on the rise. To make this process more efficient, we have developed Mindcontrol, an open-source web application for the collaborative quality control of neuroimaging processing outputs. The Mindcontrol platform consists of a dashboard to organize data, descriptive visualizations to explore the data, an imaging viewer, and an in-browser annotation and editing toolbox for data curation and quality control. Mindcontrol is flexible and can be configured for the outputs of any software package in any data organization structure. Example configurations for three large, open-source datasets are presented: the 1000 Functional Connectomes Project (FCP), the Consortium for Reliability and Reproducibility (CoRR), and the Autism Brain Imaging Data Exchange (ABIDE) Collection. These demo applications link descriptive quality control metrics, regional brain volumes, and thickness scalars to a 3D imaging viewer and editing module, resulting in an easy-to-implement quality control protocol that can be scaled for any size and complexity of study. Copyright © 2017 The Authors. Published by Elsevier Inc. All rights reserved.

  9. Assessing and Managing Quality of Information Assurance

    DTIC Science & Technology

    2010-11-01

    such as firewalls, antivirus scanning tools and mechanisms for user authentication and authorization. Advanced mission-critical systems often...imply increased risk to DoD information systems. The Process and Organizational Maturity (POM) class focuses on the maturity of the software and...include architectural quality. Common Weakness Enumeration (CWE) is a recent example that highlights the connection between software quality and

  10. Analysis of metal artifact reduction tools for dental hardware in CT scans of the oral cavity: kVp, iterative reconstruction, dual-energy CT, metal artifact reduction software: does it make a difference?

    PubMed

    De Crop, An; Casselman, Jan; Van Hoof, Tom; Dierens, Melissa; Vereecke, Elke; Bossu, Nicolas; Pamplona, Jaime; D'Herde, Katharina; Thierens, Hubert; Bacher, Klaus

    2015-08-01

    Metal artifacts may negatively affect radiologic assessment in the oral cavity. The aim of this study was to evaluate different metal artifact reduction techniques for metal artifacts induced by dental hardware in CT scans of the oral cavity. Clinical image quality was assessed using a Thiel-embalmed cadaver. A Catphan phantom and a polymethylmethacrylate (PMMA) phantom were used to evaluate physical-technical image quality parameters such as artifact area, artifact index (AI), and contrast detail (IQFinv). Metal cylinders were inserted in each phantom to create metal artifacts. CT images of both phantoms and the Thiel-embalmed cadaver were acquired on a multislice CT scanner using 80, 100, 120, and 140 kVp; model-based iterative reconstruction (Veo); and synthesized monochromatic keV images with and without metal artifact reduction software (MARs). Four radiologists assessed the clinical image quality using an image criteria score (ICS). Increasing kVp and the use of Veo had a significant influence on clinical image quality (p = 0.007 and p = 0.014, respectively). Application of MARs resulted in a smaller artifact area (p < 0.05). However, MARs-reconstructed images received lower ICS. Of all investigated techniques, Veo proved to be the most promising, with a significant improvement of both the clinical and physical-technical image quality without adversely affecting contrast detail. MARs reconstruction of CT images of the oral cavity to reduce metal artifacts from dental hardware is insufficient and may even adversely affect image quality.

  11. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  12. 78 FR 32169 - Facilitating the Deployment of Text-to-911 and Other Next Generation 911 Applications

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-29

    ... providers of interconnected text messaging services (i.e., all providers of software applications that enable a consumer to send text ...

  13. Software Quality Metrics: A Software Management Monitoring Method for Air Force Logistics Command in Its Software Quality Assurance Program for the Quantitative Assessment of the System Development Life Cycle under Configuration Management.

    DTIC Science & Technology

    1982-03-01

    pilot systems. Magnitude of the mutant error is classified as: o Program does not compute. o Program computes but does not run test data. o Program ... and funds. While the test phase concludes the normal development cycle, one should realize that with software the development continues in the

  14. Offset Printing Plate Quality Sensor on a Low-Cost Processor

    PubMed Central

    Poljak, Jelena; Botella, Guillermo; García, Carlos; Poljaček, Sanja Mahović; Prieto-Matías, Manuel; Tirado, Francisco

    2013-01-01

    The aim of this work is to develop a microprocessor-based sensor that measures the quality of the offset printing plate through the introduction of different image analysis applications. The main features of the presented system are its low cost, low power consumption, modularity and easy integration with other industrial modules for printing plates, and its robustness in noisy environments. For the sake of clarity, a viability analysis of previous software is presented through different strategies, based on the dynamic histogram and the Hough transform. This paper provides performance and scalability data compared with existing costly commercial devices. Furthermore, a general overview of quality control possibilities for printing plates is presented, which could be useful for systems where such controls are regularly conducted. PMID:24284766
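
    The two strategies mentioned (dynamic histogram and Hough transform) can be sketched with OpenCV as below; the synthetic test image, thresholds, and quality measures are illustrative assumptions, not the authors' algorithm.

```python
# (i) a grayscale-histogram measure of tonal coverage and
# (ii) line detection with the probabilistic Hough transform.
import cv2
import numpy as np

# Synthetic "printing plate" image: mid-grey background with dark screen lines
img = np.full((200, 200), 180, dtype=np.uint8)
for x in range(0, 200, 10):
    cv2.line(img, (x, 0), (x, 199), color=60, thickness=1)

# Strategy 1: histogram-based coverage estimate
hist = cv2.calcHist([img], [0], None, [256], [0, 256]).ravel()
dark_fraction = hist[:128].sum() / hist.sum()
print(f"dark (printing) area fraction: {dark_fraction:.2f}")

# Strategy 2: Hough transform on detected edges to count screen lines
edges = cv2.Canny(img, 50, 150)
lines = cv2.HoughLinesP(edges, 1, np.pi / 180, threshold=50,
                        minLineLength=100, maxLineGap=5)
print("detected line segments:", 0 if lines is None else len(lines))
```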

  15. User experience integrated life-style cloud-based medical application.

    PubMed

    Serban, Alexandru; Lupşe, Oana Sorina; Stoicu-Tivadar, Lăcrămioara

    2015-01-01

    Having a modern application capable of automatically collecting and processing data from users, based on information and lifestyle answers, is one of the current challenges for researchers and medical science. The purpose of the current study is to integrate user experience design (UXD) into a cloud-based medical application to improve patient safety, quality of care and organizational efficiency. The process consists of collecting traditional and new data from patients and users using online questionnaires. A questionnaire dynamically asks questions about the user's current diet and lifestyle. After the user enters the data, the application formulates a presumptive nutritional plan, suggests medical recommendations regarding a healthy lifestyle, and calculates a risk factor for diseases. By design and usability, this software application will be an efficient tool for fitness, nutrition and health professionals.

  16. Optical analysis of electro-optical systems by MTF calculus

    NASA Astrophysics Data System (ADS)

    Barbarini, Elisa Signoreto; Dos Santos, Daniel, Jr.; Stefani, Mário Antonio; Yasuoka, Fátima Maria Mitsue; Castro Neto, Jarbas C.; Rodrigues, Evandro Luís Linhari

    2011-08-01

    One of the widely used methods for performance analysis of an optical system is the determination of the Modulation Transfer Function (MTF). The MTF represents a quantitative and direct measure of image quality and, besides being an objective test, it can be used on concatenated optical systems. This paper presents the application of software called SMTF (software modulation transfer function), built on C++ and OpenCV platforms, for MTF calculation on electro-optical systems. Through this technique, it is possible to develop a specific method to measure the real-time performance of a digital fundus camera, an infrared sensor and an ophthalmological surgery microscope. Each optical instrument mentioned has a particular device to measure the MTF response, which is being developed. The MTF information then assists in the analysis of the optical system alignment and also defines its resolution limit from the MTF graph. The results obtained from the implemented software are compared with the theoretical MTF curves of the analyzed systems.
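
    The underlying relationship used by such tools is that the MTF is the normalized magnitude of the Fourier transform of the line spread function (LSF); the sketch below computes it for a synthetic Gaussian LSF. It is not the SMTF implementation itself, and the sample spacing and LSF width are assumed values.

```python
# Compute the MTF as the normalized magnitude of the FFT of a synthetic LSF
# and report the spatial frequencies at 50% and 10% modulation.
import numpy as np

dx = 0.005                                   # sample spacing in mm (assumed)
x = np.arange(-1.0, 1.0, dx)
sigma = 0.02                                 # synthetic Gaussian LSF width, mm
lsf = np.exp(-x**2 / (2 * sigma**2))

mtf = np.abs(np.fft.rfft(lsf))
mtf /= mtf[0]                                # normalize so MTF(0) = 1
freq = np.fft.rfftfreq(len(lsf), d=dx)       # spatial frequency in cycles/mm

for level in (0.5, 0.1):
    f_cut = freq[np.argmax(mtf < level)]     # first frequency below the level
    print(f"MTF reaches {level:.0%} at about {f_cut:.1f} cycles/mm")
```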

  17. NASA Tech Briefs, March 2003

    NASA Technical Reports Server (NTRS)

    2003-01-01

    Topics covered include: Tool for Bending a Metal Tube Precisely in a Confined Space; Multiple-Use Mechanisms for Attachment to Seat Tracks; Force-Measuring Clamps; Cellular Pressure-Actuated Joint; Block QCA Fault-Tolerant Logic Gates; Hybrid VLSI/QCA Architecture for Computing FFTs; Arrays of Carbon Nanotubes as RF Filters in Waveguides; Carbon Nanotubes as Resonators for RF Spectrum Analyzers; Software for Viewing Landsat Mosaic Images; Updated Integrated Mission Program; Software for Sharing and Management of Information; Update on Integrated Optical Design Analyzer; Optical-Quality Thin Polymer Membranes; Rollable Thin Shell Composite-Material Paraboloidal Mirrors; Folded Resonant Horns for Power Ultrasonic Applications; Touchdown Ball-Bearing System for Magnetic Bearings; Flux-Based Deadbeat Control of Induction-Motor Torque; Block Copolymers as Templates for Arrays of Carbon Nanotubes; Throttling Cryogen Boiloff To Control Cryostat Temperature; Collaborative Software Development Approach Used to Deliver the New Shuttle Telemetry Ground Station; Turbulence in Supercritical O2/H2 and C7H16/N2 Mixing Layers; and Time-Resolved Measurements in Optoelectronic Microbioanal.

  18. HALO--a Java framework for precise transcript half-life determination.

    PubMed

    Friedel, Caroline C; Kaufmann, Stefanie; Dölken, Lars; Zimmer, Ralf

    2010-05-01

    Recent improvements in experimental technologies now allow measurements of de novo transcription and/or RNA decay at the whole-transcriptome level and determination of precise transcript half-lives. Such transcript half-lives provide important insights into the regulation of biological processes and the relative contributions of RNA decay and de novo transcription to differential gene expression. In this article, we present HALO (Half-life Organizer), the first software for the precise determination of transcript half-lives from measurements of RNA de novo transcription or decay determined with microarrays or RNA-seq. In addition, methods for quality control, filtering and normalization are supplied. HALO provides a graphical user interface, command-line tools and a well-documented Java application programming interface (API). Thus, it can be used both by biologists, to determine transcript half-lives quickly and reliably with the provided user interfaces, and by software developers integrating transcript half-life analysis into other gene expression profiling pipelines. Source code, executables and documentation are available at http://www.bio.ifi.lmu.de/software/halo.
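
    The basic relation behind transcript half-life estimation can be shown in a few lines: fit an exponential decay to RNA abundance over time and convert the decay rate k to a half-life via t1/2 = ln(2)/k. The sketch below uses invented abundance values and is not HALO's actual model.

```python
# Estimate a transcript half-life from a simple exponential-decay fit:
# a linear fit of log-abundance versus time gives the decay rate k.
import numpy as np

time_h = np.array([0.0, 1.0, 2.0, 4.0, 8.0])          # hours after blocking transcription
abundance = np.array([1.00, 0.78, 0.60, 0.37, 0.14])  # invented normalized RNA levels

slope, intercept = np.polyfit(time_h, np.log(abundance), 1)
half_life = np.log(2) / -slope
print(f"decay rate k = {-slope:.3f} /h, half-life = {half_life:.2f} h")
```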

  19. Comparison of Uas-Based Photogrammetry Software for 3d Point Cloud Generation: a Survey Over a Historical Site

    NASA Astrophysics Data System (ADS)

    Alidoost, F.; Arefi, H.

    2017-11-01

    Nowadays, Unmanned Aerial System (UAS)-based photogrammetry offers an affordable, fast and effective approach to real-time acquisition of high resolution geospatial information and automatic 3D modelling of objects for numerous applications such as topographic mapping, 3D city modelling, orthophoto generation, and cultural heritage preservation. In this paper, the capability of four different state-of-the-art software packages, namely 3DSurvey, Agisoft Photoscan, Pix4Dmapper Pro and SURE, is examined to generate a high density point cloud as well as a Digital Surface Model (DSM) over a historical site. The main steps of this study are: image acquisition, point cloud generation, and accuracy assessment. The overlapping images are first captured using a quadcopter and then processed by the different software packages to generate point clouds and DSMs. In order to evaluate the accuracy and quality of the point clouds and DSMs, both visual and geometric assessments are carried out, and the comparison results are reported.
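
    A simple form of the geometric assessment mentioned above is to compare DSM elevations against surveyed check points and report bias and RMSE; the sketch below does this with invented check-point values and a toy DSM.

```python
# Compare DSM elevations at surveyed check points and report bias and RMSE.
import numpy as np

dsm = np.random.default_rng(1).normal(100.0, 0.05, size=(50, 50))  # toy DSM, metres

# (row, col, surveyed elevation) for a few GNSS check points (invented)
checkpoints = [(10, 12, 100.02), (25, 30, 99.97), (40, 8, 100.05), (5, 44, 99.99)]

diffs = np.array([dsm[r, c] - z_ref for r, c, z_ref in checkpoints])
rmse = np.sqrt(np.mean(diffs ** 2))
print(f"bias = {diffs.mean():+.3f} m, RMSE = {rmse:.3f} m over {len(diffs)} points")
```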

  20. RSAT 2015: Regulatory Sequence Analysis Tools

    PubMed Central

    Medina-Rivera, Alejandra; Defrance, Matthieu; Sand, Olivier; Herrmann, Carl; Castro-Mondragon, Jaime A.; Delerce, Jeremy; Jaeger, Sébastien; Blanchet, Christophe; Vincens, Pierre; Caron, Christophe; Staines, Daniel M.; Contreras-Moreira, Bruno; Artufel, Marie; Charbonnier-Khamvongsa, Lucie; Hernandez, Céline; Thieffry, Denis; Thomas-Chollier, Morgane; van Helden, Jacques

    2015-01-01

    RSAT (Regulatory Sequence Analysis Tools) is a modular software suite for the analysis of cis-regulatory elements in genome sequences. Its main applications are (i) motif discovery, appropriate to genome-wide data sets like ChIP-seq, (ii) transcription factor binding motif analysis (quality assessment, comparisons and clustering), (iii) comparative genomics and (iv) analysis of regulatory variations. Nine new programs have been added to the 43 described in the 2011 NAR Web Software Issue, including a tool to extract sequences from a list of coordinates (fetch-sequences from UCSC), novel programs dedicated to the analysis of regulatory variants from GWAS or population genomics (retrieve-variation-seq and variation-scan), a program to cluster motifs and visualize the similarities as trees (matrix-clustering). To deal with the drastic increase of sequenced genomes, RSAT public sites have been reorganized into taxon-specific servers. The suite is well-documented with tutorials and published protocols. The software suite is available through Web sites, SOAP/WSDL Web services, virtual machines and stand-alone programs at http://www.rsat.eu/. PMID:25904632
