Science.gov

Sample records for software quality engineering

  1. Software metrics: Software quality metrics for distributed systems. [reliability engineering]

    NASA Technical Reports Server (NTRS)

    Post, J. V.

    1981-01-01

    Software quality metrics were extended to cover distributed computer systems. Emphasis is placed on studying embedded computer systems and on viewing them within a system life cycle. The hierarchy of quality factors, criteria, and metrics was maintained. New software quality factors were added, including survivability, expandability, and evolvability.
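
    The record above describes a hierarchy of quality factors, criteria, and metrics. The sketch below is a minimal, hypothetical illustration of such a factor-criteria-metric rollup in Python; the factor names, criteria, and scores are assumptions for illustration only, not values from the cited report.

        # Minimal factor -> criteria -> metric rollup (illustrative only).
        from statistics import mean

        quality_model = {
            "survivability": {                  # quality factor
                "fault_tolerance": [0.8, 0.7],  # criterion -> normalized metric scores (0..1)
                "reconfigurability": [0.6],
            },
            "expandability": {
                "modularity": [0.9, 0.85],
                "generality": [0.7],
            },
        }

        def factor_scores(model):
            """Average metric scores up to criteria, then criteria up to factors."""
            return {
                factor: mean(mean(scores) for scores in criteria.values())
                for factor, criteria in model.items()
            }

        print(factor_scores(quality_model))
        # {'survivability': 0.675, 'expandability': 0.7875}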

  2. 2003 SNL ASCI applications software quality engineering assessment report.

    SciTech Connect

    Schofield, Joseph Richard, Jr.; Ellis, Molly A.; Williamson, Charles Michael; Bonano, Lora A.

    2004-02-01

    This document describes the 2003 SNL ASCI Software Quality Engineering (SQE) assessment of twenty ASCI application code teams and the results of that assessment. The purpose of this assessment was to determine code team compliance with the Sandia National Laboratories ASCI Applications Software Quality Engineering Practices, Version 2.0 as part of an overall program assessment.

  3. On Quality and Measures in Software Engineering

    ERIC Educational Resources Information Center

    Bucur, Ion I.

    2006-01-01

    Complexity measures are mainly used to estimate vital information about the reliability and maintainability of software systems from regular analysis of the source code. Such measures also provide constant feedback during a software project to assist in controlling the development process. There exist several models to classify a software product's…

  4. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    SciTech Connect

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2002-01-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool. These sections map practices and activities at Sandia to the ASCI Software Quality Engineering: Goals, Principles, and Guidelines, a Department of Energy document.

  5. Sandia National Laboratories ASCI Applications Software Quality Engineering Practices

    SciTech Connect

    ZEPPER, JOHN D.; ARAGON, KATHRYN MARY; ELLIS, MOLLY A.; BYLE, KATHLEEN A.; EATON, DONNA SUE

    2003-04-01

    This document provides a guide to the deployment of the software verification activities, software engineering practices, and project management principles that guide the development of Accelerated Strategic Computing Initiative (ASCI) applications software at Sandia National Laboratories (Sandia). The goal of this document is to identify practices and activities that will foster the development of reliable and trusted products produced by the ASCI Applications program. Document contents include an explanation of the structure and purpose of the ASCI Quality Management Council, an overview of the software development lifecycle, an outline of the practices and activities that should be followed, and an assessment tool.

  6. Trilinos developers SQE guide : ASC software quality engineering practices.

    SciTech Connect

    Willenbring, James Michael; Heroux, Michael Allen

    2013-05-01

    The Trilinos Project is an effort to develop algorithms and enabling technologies within an object-oriented software framework for the solution of large-scale, complex multi-physics engineering and scientific problems. A new software capability is introduced into Trilinos as a package. A Trilinos package is an integral unit and, although there are exceptions such as utility packages, each package is typically developed by a small team of experts in a particular algorithms area such as algebraic preconditioners, nonlinear solvers, etc. The Trilinos Developers SQE Guide is a resource for Trilinos package developers who are working under Advanced Simulation and Computing (ASC) and are therefore subject to the ASC Software Quality Engineering Practices as described in the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan: ASC Software Quality Engineering Practices Version 3.0 document [1]. The Trilinos Developer Policies webpage [2] contains detailed information that is essential for all Trilinos developers. The Trilinos Software Lifecycle Model [3] defines the default lifecycle model for Trilinos packages and provides a context for many of the practices listed in this document.

  7. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan : ASC software quality engineering practices Version 3.0.

    SciTech Connect

    Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia; Pilch, Martin M.

    2009-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Administration (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.

  8. Software Engineering Guidebook

    NASA Technical Reports Server (NTRS)

    Connell, John; Wenneson, Greg

    1993-01-01

    The Software Engineering Guidebook describes SEPG (Software Engineering Process Group) supported processes and techniques for engineering quality software in NASA environments. Three process models are supported: structured, object-oriented, and evolutionary rapid-prototyping. The guidebook covers software life-cycles, engineering, assurance, and configuration management. The guidebook is written for managers and engineers who manage, develop, enhance, and/or maintain software under the Computer Software Services Contract.

  9. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan part 2 mappings for the ASC software quality engineering practices, version 2.0.

    SciTech Connect

    Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR001.3.2 and CPR001.3.6 and to a Department of Energy document, ''ASCI Software Quality Engineering: Goals, Principles, and Guidelines''. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  10. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1: ASC software quality engineering practices, Version 2.0.

    SciTech Connect

    Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Minana, Molly A.; Hackney, Patricia; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2006-09-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  11. Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan. Part 2, Mappings for the ASC software quality engineering practices. Version 1.0.

    SciTech Connect

    Ellis, Molly A.; Heaphy, Robert; Sturtevant, Judith E.; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. The plan defines the ASC program software quality practices and provides mappings of these practices to Sandia Corporate Requirements CPR 1.3.2 and 1.3.6 and to a Department of Energy document, 'ASCI Software Quality Engineering: Goals, Principles, and Guidelines'. This document also identifies ASC management and software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals.

  12. Software engineering

    NASA Technical Reports Server (NTRS)

    Fridge, Ernest M., III; Hiott, Jim; Golej, Jim; Plumb, Allan

    1993-01-01

    Today's software systems generally use obsolete technology, are not integrated properly with other software systems, and are difficult and costly to maintain. The discipline of reverse engineering is becoming prominent as organizations try to move their systems up to more modern and maintainable technology in a cost effective manner. The Johnson Space Center (JSC) created a significant set of tools to develop and maintain FORTRAN and C code during development of the space shuttle. This tool set forms the basis for an integrated environment to reengineer existing code into modern software engineering structures which are then easier and less costly to maintain and which allow a fairly straightforward translation into other target languages. The environment will support these structures and practices even in areas where the language definition and compilers do not enforce good software engineering. The knowledge and data captured using the reverse engineering tools is passed to standard forward engineering tools to redesign or perform major upgrades to software systems in a much more cost effective manner than using older technologies. The latest release of the environment was in Feb. 1992.

  13. Sandia National Laboratories Advanced Simulation and Computing (ASC) software quality plan. Part 1 : ASC software quality engineering practices version 1.0.

    SciTech Connect

    Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert; Hodges, Ann Louise; Boucheron, Edward A.; Drake, Richard Roy; Forsythe, Christi A.; Schofield, Joseph Richard, Jr.; Pavlakos, Constantine James; Williamson, Charles Michael; Edwards, Harold Carter

    2005-01-01

    The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.

  14. Engineering and Software Engineering

    NASA Astrophysics Data System (ADS)

    Jackson, Michael

    The phrase 'software engineering' has many meanings. One central meaning is the reliable development of dependable computer-based systems, especially those for critical applications. This is not a solved problem. Failures in software development have played a large part in many fatalities and in huge economic losses. While some of these failures may be attributable to programming errors in the narrowest sense—a program's failure to satisfy a given formal specification—there is good reason to think that most of them have other roots. These roots are located in the problem of software engineering rather than in the problem of program correctness. The famous 1968 conference was motivated by the belief that software development should be based on "the types of theoretical foundations and practical disciplines that are traditional in the established branches of engineering." Yet after forty years of currency the phrase 'software engineering' still denotes no more than a vague and largely unfulfilled aspiration. Two major causes of this disappointment are immediately clear. First, too many areas of software development are inadequately specialised, and consequently have not developed the repertoires of normal designs that are the indispensable basis of reliable engineering success. Second, the relationship between structural design and formal analytical techniques for software has rarely been one of fruitful synergy: too often it has defined a boundary between competing dogmas, at which mutual distrust and incomprehension deprive both sides of advantages that should be within their grasp. This paper discusses these causes and their effects. Whether the common practice of software development will eventually satisfy the broad aspiration of 1968 is hard to predict; but an understanding of past failure is surely a prerequisite of future success.

  15. Software Quality Assurance Metrics

    NASA Technical Reports Server (NTRS)

    McRae, Kalindra A.

    2004-01-01

    Software Quality Assurance (SQA) is a planned and systematic set of activities that ensures that software life cycle processes and products conform to requirements, standards, and procedures. In software development, software quality means meeting requirements and a degree of excellence and refinement of a project or product. Software Quality is a set of attributes of a software product by which its quality is described and evaluated. The set of attributes includes functionality, reliability, usability, efficiency, maintainability, and portability. Software Metrics help us understand the technical process that is used to develop a product. The process is measured to improve it, and the product is measured to increase quality throughout the life cycle of software. Software Metrics are measurements of the quality of software. Software is measured to indicate the quality of the product, to assess the productivity of the people who produce the product, to assess the benefits derived from new software engineering methods and tools, to form a baseline for estimation, and to help justify requests for new tools or additional training. Any part of the software development can be measured. If Software Metrics are implemented in software development, they can save time and money and allow the organization to identify the causes of defects that have the greatest effect on software development. During the summer of 2004, I worked with Cynthia Calhoun and Frank Robinson in the Software Assurance/Risk Management department. My task was to research, collect, compile, and analyze SQA Metrics that have been used in other projects and that are not currently being used by the SA team, and to report them to the Software Assurance team to see if any metrics can be implemented in their software assurance life cycle process.
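
    As a minimal, hypothetical sketch of the kind of metrics described above (defect density, test pass rate, review coverage), the Python snippet below computes a few quality indicators from project counts; the field names and figures are assumptions for illustration, not data from the NASA project.

        # Illustrative SQA metrics computed from hypothetical project counts.
        from dataclasses import dataclass

        @dataclass
        class ProjectSnapshot:
            ksloc: float          # thousands of source lines of code
            defects_found: int    # defects logged during the reporting period
            tests_run: int
            tests_passed: int
            items_reviewed: int
            items_total: int

        def sqa_metrics(p: ProjectSnapshot) -> dict:
            return {
                "defect_density_per_ksloc": p.defects_found / p.ksloc,
                "test_pass_rate": p.tests_passed / p.tests_run,
                "review_coverage": p.items_reviewed / p.items_total,
            }

        snapshot = ProjectSnapshot(ksloc=120.0, defects_found=84,
                                   tests_run=400, tests_passed=388,
                                   items_reviewed=45, items_total=50)
        print(sqa_metrics(snapshot))
        # {'defect_density_per_ksloc': 0.7, 'test_pass_rate': 0.97, 'review_coverage': 0.9}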

  16. Software quality in 1997

    SciTech Connect

    Jones, C.

    1997-11-01

    For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to a parity with manufacturing quality levels. Since software is on the critical path for many engineered products, and for internal business systems as well, the new approaches are starting to affect global competition and attract widespread international interest. It can be hypothesized that success in mastering software quality will be a key strategy for dominating global software markets in the 21st century.

  17. Software engineering methodologies and tools

    NASA Technical Reports Server (NTRS)

    Wilcox, Lawrence M.

    1993-01-01

    Over the years many engineering disciplines have developed, including chemical, electronic, etc. Common to all engineering disciplines is the use of rigor, models, metrics, and predefined methodologies. Recently, a new engineering discipline has appeared on the scene, called software engineering. For over thirty years computer software has been developed, and the track record has not been good. Software development projects often miss schedules, are over budget, do not give the user what is wanted, and produce defects. One estimate is that there are one to three defects per 1000 lines of deployed code. More and more systems are requiring larger and more complex software for support. As this requirement grows, the software development problems grow exponentially. It is believed that software quality can be improved by applying engineering principles. Another compelling reason to bring the engineering disciplines to software development is productivity. It has been estimated that the productivity of producing software has increased only one to two percent a year over the last thirty years. Ironically, the computer and its software have contributed significantly to industry-wide productivity, but computer professionals have done a poor job of using the computer to do their job. Engineering disciplines and methodologies are now emerging, supported by software tools that address the problems of software development. This paper addresses some of the current software engineering methodologies as a backdrop for the general evaluation of computer assisted software engineering (CASE) tools from actual installation of and experimentation with some specific tools.
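
    The two figures quoted above lend themselves to back-of-envelope arithmetic. The sketch below, using purely illustrative inputs (a hypothetical 500 KLOC code base), compounds a one-to-two percent annual productivity gain over thirty years and scales the one-to-three defects per 1000 lines estimate.

        # Back-of-envelope arithmetic for the quoted estimates (illustrative only).
        def cumulative_growth(annual_rate: float, years: int) -> float:
            """Total productivity multiplier after compounding an annual improvement rate."""
            return (1 + annual_rate) ** years

        def expected_defects(deployed_loc: int, defects_per_kloc: float) -> float:
            """Latent defects expected at a given rate per 1000 lines of deployed code."""
            return deployed_loc / 1000 * defects_per_kloc

        print(f"{cumulative_growth(0.01, 30):.2f}x to {cumulative_growth(0.02, 30):.2f}x "
              "productivity over 30 years")        # ~1.35x to ~1.81x
        print(expected_defects(500_000, 1), "to",
              expected_defects(500_000, 3), "latent defects in 500 KLOC")  # 500.0 to 1500.0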

  18. Experimentation in software engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Selby, R. W.; Hutchens, D. H.

    1986-01-01

    Experimentation in software engineering supports the advancement of the field through an iterative learning process. In this paper, a framework for analyzing most of the experimental work performed in software engineering over the past several years is presented. A variety of experiments in the framework is described and their contribution to the software engineering discipline is discussed. Some useful recommendations for the application of the experimental process in software engineering are included.

  19. Proceedings of Tenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1985-01-01

    Papers are presented on the following topics: measurement of software technology, recent studies of the Software Engineering Lab, software management tools, expert systems, error seeding as a program validation technique, software quality assurance, software engineering environments (including knowledge-based environments), the Distributed Computing Design System, and various Ada experiments.

  20. Software productivity improvement through software engineering technology

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.

    1985-01-01

    It has been estimated that NASA expends anywhere from 6 to 10 percent of its annual budget on the acquisition, implementation, and maintenance of computer software. Although researchers have produced numerous software engineering approaches over the past 5-10 years, each claiming to be more effective than the others, there is very limited quantitative information verifying the measurable impact that any of these technologies may have in a production environment. At NASA/GSFC, an extended research effort aimed at identifying and measuring software techniques that favorably impact the productivity of software development has been active over the past 8 years. Specific, measurable software development technologies have been applied and measured in a production environment. The resulting software development approaches have been shown to be effective in improving both quality and productivity in this one environment.

  1. Future of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1997-01-01

    In the new millennium, software engineering standards are expected to continue to influence the process of producing software-intensive systems which are cost-effective and of high quality. These systems may range from ground and flight systems used for planetary exploration to educational support systems used in schools, as well as consumer-oriented systems.

  2. Software Engineering Improvement Plan

    NASA Technical Reports Server (NTRS)

    2006-01-01

    In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.

  3. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  4. SOFTWARE ENGINEERING INSTITUTE (SEI)

    EPA Science Inventory

    The Software Engineering Institute (SEI) is a federally funded research and development center established in 1984 by the U.S. Department of Defense and operated by Carnegie Mellon University. SEI has a broad charter to provide leadership in the practice of software engineering t...

  5. Proceedings of the Seventeenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Proceedings of the Seventeenth Annual Software Engineering Workshop are presented. The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/Goddard Space Flight Center and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. Topics covered include: the Software Engineering Laboratory; process measurement; software reuse; software quality; lessons learned; and the question of whether Ada is dying.

  6. Software component quality evaluation

    NASA Technical Reports Server (NTRS)

    Clough, A. J.

    1991-01-01

    The paper describes a software inspection process that can be used to evaluate the quality of software components. Quality criteria, process application, independent testing of the process and proposed associated tool support are covered. Early results indicate that this technique is well suited for assessing software component quality in a standardized fashion. With automated machine assistance to facilitate both the evaluation and selection of software components, such a technique should promote effective reuse of software components.

  7. Software engineering as an engineering discipline

    NASA Technical Reports Server (NTRS)

    Berard, Edward V.

    1988-01-01

    The following topics are discussed in the context of software engineering: early use of the term; the 1968 NATO conference; Barry Boehm's definition; four requirements for software engineering; and additional criteria for software engineering. Additionally, the four major requirements for software engineering--computer science, mathematics, engineering disciplines, and excellent communication skills--are discussed. The presentation is given in vugraph form.

  8. Software engineering ethics

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L.

    1991-01-01

    Software engineering ethics is reviewed. The following subject areas are covered: lack of a system viewpoint; arrogance of PC DOS software vendors; violation of upward compatibility; internet worm; internet worm revisited; student cheating and company hiring interviews; computing practitioners and the commodity market; new projects and old programming languages; schedule and budget; and recent public domain comments.

  9. Software engineering as an engineering discipline

    NASA Technical Reports Server (NTRS)

    Gibbs, Norman

    1988-01-01

    The goals of the Software Engineering Institute's Education Program are as follows: to increase the number of highly qualified software engineers--new software engineers and existing practitioners; and to be the leading center of expertise for software engineering education and training. A discussion of these goals is presented in vugraph form.

  10. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 15

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  11. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 14

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  12. Software Engineering Laboratory Series: Collected Software Engineering Papers. Volume 13

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  13. Sandia software guidelines: Software quality planning

    SciTech Connect

    Not Available

    1987-08-01

    This volume is one in a series of Sandia Software Guidelines intended for use in producing quality software within Sandia National Laboratories. In consonance with the IEEE Standard for Software Quality Assurance Plans, this volume identifies procedures to follow in producing a Software Quality Assurance Plan for an organization or a project, and provides an example project SQA plan. 2 figs., 4 tabs.

  14. Software quality assurance handbook

    SciTech Connect

    Not Available

    1990-09-01

    There are two important reasons for Software Quality Assurance (SQA) at Allied-Signal Inc., Kansas City Division (KCD): First, the benefits from SQA make good business sense. Second, the Department of Energy has requested SQA. This handbook is one of the first steps in a plant-wide implementation of Software Quality Assurance at KCD. The handbook has two main purposes. The first is to provide information that you will need to perform software quality assurance activities. The second is to provide a common thread to unify the approach to SQA at KCD. 2 figs.

  15. Effective Software Engineering Leadership for Development Programs

    ERIC Educational Resources Information Center

    Cagle West, Marsha

    2010-01-01

    Software is a critical component of systems ranging from simple consumer appliances to complex health, nuclear, and flight control systems. The development of quality, reliable, and effective software solutions requires the incorporation of effective software engineering processes and leadership. Processes, approaches, and methodologies for…

  16. SWiFT Software Quality Assurance Plan.

    SciTech Connect

    Berg, Jonathan Charles

    2016-01-01

    This document describes the software development practice areas and processes which contribute to the ability of SWiFT software developers to provide quality software. These processes are designed to satisfy the requirements set forth by the Sandia Software Quality Assurance Program (SSQAP).

  17. Software Engineering for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Fredrickson, Steven E.

    2014-01-01

    The Spacecraft Software Engineering Branch of NASA Johnson Space Center (JSC) provides world-class products, leadership, and technical expertise in software engineering, processes, technology, and systems management for human spaceflight. The branch contributes to major NASA programs (e.g. ISS, MPCV/Orion) with in-house software development and prime contractor oversight, and maintains the JSC Engineering Directorate CMMI rating for flight software development. Software engineering teams work with hardware developers, mission planners, and system operators to integrate flight vehicles, habitats, robotics, and other spacecraft elements. They seek to infuse automation and autonomy into missions, and apply new technologies to flight processor and computational architectures. This presentation will provide an overview of key software-related projects, software methodologies and tools, and technology pursuits of interest to the JSC Spacecraft Software Engineering Branch.

  18. Software engineering as an engineering discipline

    NASA Technical Reports Server (NTRS)

    Freedman, Glenn B.

    1988-01-01

    The purpose of this panel is to explore the emerging field of software engineering from a variety of perspectives: university programs; industry training and definition; government development; and technology transfer. In doing this, the panel will address the issues of distinctions among software engineering, computer science, and computer hardware engineering as they relate to the challenges of large, complex systems.

  19. Empirically Driven Software Engineering Research

    NASA Astrophysics Data System (ADS)

    Rombach, Dieter

    Software engineering is a design discipline. As such, its engineering methods are based on cognitive instead of physical laws, and their effectiveness depends highly on context. Empirical methods can be used to observe the effects of software engineering methods in vivo and in vitro, to identify improvement potentials, and to validate new research results. This paper summarizes both the current body of knowledge and further challenges with respect to empirical methods in software engineering, as well as empirically derived evidence regarding typical software engineering methods. Finally, future challenges with respect to education, research, and technology transfer will be outlined.

  20. Software engineering and Ada in design

    NASA Technical Reports Server (NTRS)

    Oneill, Don

    1986-01-01

    Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach, including the software engineering practices that guide the systematic design and development of software products and the management of the software process, is examined. The revised Ada design language adaptation is revealed. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.

  1. Software engineering standards and practices

    NASA Technical Reports Server (NTRS)

    Durachka, R. W.

    1981-01-01

    Guidelines are presented for the preparation of a software development plan. The various phases of a software development project are discussed throughout its life cycle including a general description of the software engineering standards and practices to be followed during each phase.

  2. Modernization of software quality assurance

    NASA Technical Reports Server (NTRS)

    Bhaumik, Gokul

    1988-01-01

    Customer satisfaction depends not only on functional performance but also on the quality characteristics of the software products. An examination of this quality aspect of software products will provide a clear, well-defined framework for quality assurance functions, which improve the life-cycle activities of software development. Software developers must be aware of the following aspects, which have been expressed by many quality experts: quality cannot be added on; the level of quality built into a program is a function of the quality attributes employed during the development process; and finally, quality must be managed. These concepts have guided our development of the following definition for a Software Quality Assurance function: Software Quality Assurance is a formal, planned approach of actions designed to evaluate the degree of an identifiable set of quality attributes present in all software systems and their products. This paper is an explanation of how this definition was developed and how it is used.

  3. Sandia National Laboratories Advanced Simulation and Computing (ASC) : appraisal method for the implementation of the ASC software quality engineering practices: Version 1.0.

    SciTech Connect

    Turgeon, Jennifer; Minana, Molly A.

    2008-02-01

    This document provides a guide to the process of conducting software appraisals under the Sandia National Laboratories (SNL) ASC Program. The goal of this document is to describe a common methodology for planning, conducting, and reporting results of software appraisals, thereby enabling: development of an objective baseline on implementation of the software quality engineering (SQE) practices identified in the ASC Software Quality Plan across the ASC Program; feedback from project teams on SQE opportunities for improvement; identification of strengths and opportunities for improvement for individual project teams; and guidance to the ASC Program on the focus of future SQE activities. Document contents include process descriptions, templates to promote consistent conduct of appraisals, and an explanation of the relationship of this procedure to the SNL ASC software program.

  4. Artificial intelligence approaches to software engineering

    NASA Technical Reports Server (NTRS)

    Johannes, James D.; Macdonald, James R.

    1988-01-01

    Artificial intelligence approaches to software engineering are examined. The software development life cycle is a sequence of not-so-well-defined phases. Improved techniques for developing systems have been formulated over the past 15 years, but pressure to reduce costs continues. Software development technology seems to be standing still. The primary objective of the knowledge-based approach to software development presented in this paper is to avoid problem areas that lead to schedule slippages, cost overruns, or software products that fall short of their desired goals. Identifying and resolving software problems early, often in the phase in which they first occur, has been shown to contribute significantly to reducing risks in software development. Software development is not a mechanical process but a basic human activity. It requires clear thinking, work, and rework to be successful. The artificial intelligence approaches to software engineering presented here support the software development life cycle by changing current practices and methods: these should be replaced by better techniques that improve the process of software development and the quality of the resulting products. The software development process can be structured into well-defined steps whose interfaces are standardized, supported, and checked by automated procedures that provide error detection, produce the documentation, and ultimately support the actual design of complex programs.

  5. Proceedings of the Eighth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The four major topics of discussion included: the NASA Software Engineering Laboratory, software testing, human factors in software engineering and software quality assessment. As in the past years, there were 12 position papers presented (3 for each topic) followed by questions and very heavy participation by the general audience.

  6. Technology transfer in software engineering

    NASA Technical Reports Server (NTRS)

    Bishop, Peter C.

    1989-01-01

    The University of Houston-Clear Lake is the prime contractor for the AdaNET Research Project under the direction of NASA Johnson Space Center. AdaNET was established to promote the principles of software engineering to the software development industry. AdaNET will contain not only environments and tools, but also concepts, principles, models, standards, guidelines and practices. Initially, AdaNET will serve clients from the U.S. government and private industry who are working in software development. It will seek new clients from those who have not yet adopted the principles and practices of software engineering. Some of the goals of AdaNET are to become known as an objective, authoritative source of new software engineering information and parts, to provide easy access to information and parts, and to keep abreast of innovations in the field.

  7. Computer systems and software engineering

    NASA Technical Reports Server (NTRS)

    Mckay, Charles W.

    1988-01-01

    The High Technologies Laboratory (HTL) was established in the fall of 1982 at the University of Houston Clear Lake. Research conducted at the High Tech Lab is focused upon computer systems and software engineering. There is a strong emphasis on the interrelationship of these areas of technology and the United States' space program. In Jan. of 1987, NASA Headquarters announced the formation of its first research center dedicated to software engineering. Operated by the High Tech Lab, the Software Engineering Research Center (SERC) was formed at the University of Houston Clear Lake. The High Tech Lab/Software Engineering Research Center promotes cooperative research among government, industry, and academia to advance the edge-of-knowledge and the state-of-the-practice in key topics of computer systems and software engineering which are critical to NASA. The center also recommends appropriate actions, guidelines, standards, and policies to NASA in matters pertinent to the center's research. Results of the research conducted at the High Tech Lab/Software Engineering Research Center have given direction to many decisions made by NASA concerning the Space Station Program.

  8. Concurrent Software Engineering Project

    ERIC Educational Resources Information Center

    Stankovic, Nenad; Tillo, Tammam

    2009-01-01

    Concurrent engineering or overlapping activities is a business strategy for schedule compression on large development projects. Design parameters and tasks from every aspect of a product's development process and their interdependencies are overlapped and worked on in parallel. Concurrent engineering suffers from negative effects such as excessive…

  9. Software engineering tools.

    PubMed

    Wear, L L; Pinkert, J R

    1994-01-01

    We have looked at general descriptions and illustrations of several software development tools, such as tools for prototyping, developing DFDs, testing, and maintenance. Many others are available, and new ones are being developed. However, you have at least seen some examples of powerful CASE tools for systems development. PMID:10131419

  10. A Multidimensional Software Engineering Course

    ERIC Educational Resources Information Center

    Barzilay, O.; Hazzan, O.; Yehudai, A.

    2009-01-01

    Software engineering (SE) is a multidimensional field that involves activities in various areas and disciplines, such as computer science, project management, and system engineering. Though modern SE curricula include designated courses that address these various subjects, an advanced summary course that synthesizes them is still missing. Such a…

  11. Software archeology: a case study in software quality assurance and design

    SciTech Connect

    Macdonald, John M; Lloyd, Jane A; Turner, Cameron J

    2009-01-01

    Ideally, quality is designed into software, just as quality is designed into hardware. However, when dealing with legacy systems, demonstrating that the software meets required quality standards may be difficult to achieve. As the need to demonstrate the quality of existing software was recognized at Los Alamos National Laboratory (LANL), an effort was initiated to uncover and demonstrate that legacy software met the required quality standards. This effort led to the development of a reverse engineering approach referred to as software archaeology. This paper documents the software archaeology approaches used at LANL to document legacy software systems. A case study for the Robotic Integrated Packaging System (RIPS) software is included.

  12. NASA Software Engineering Benchmarking Study

    NASA Technical Reports Server (NTRS)

    Rarick, Heather L.; Godfrey, Sara H.; Kelly, John C.; Crumbley, Robert T.; Wifl, Joel M.

    2013-01-01

    To identify best practices for the improvement of software engineering on projects, NASA's Offices of Chief Engineer (OCE) and Safety and Mission Assurance (OSMA) formed a team led by Heather Rarick and Sally Godfrey to conduct this benchmarking study. The primary goals of the study are to identify best practices that: Improve the management and technical development of software intensive systems; Have a track record of successful deployment by aerospace industries, universities [including research and development (R&D) laboratories], and defense services, as well as NASA's own component Centers; and Identify candidate solutions for NASA's software issues. Beginning in the late fall of 2010, focus topics were chosen and interview questions were developed, based on the NASA top software challenges. Between February 2011 and November 2011, the Benchmark Team interviewed a total of 18 organizations, consisting of five NASA Centers, five industry organizations, four defense services organizations, and four university or university R and D laboratory organizations. A software assurance representative also participated in each of the interviews to focus on assurance and software safety best practices. Interviewees provided a wealth of information on each topic area that included: software policy, software acquisition, software assurance, testing, training, maintaining rigor in small projects, metrics, and use of the Capability Maturity Model Integration (CMMI) framework, as well as a number of special topics that came up in the discussions. NASA's software engineering practices compared favorably with the external organizations in most benchmark areas, but in every topic, there were ways in which NASA could improve its practices. Compared to defense services organizations and some of the industry organizations, one of NASA's notable weaknesses involved communication with contractors regarding its policies and requirements for acquired software. One of NASA's strengths

  13. On the software quality of climate models

    NASA Astrophysics Data System (ADS)

    Pipitone, J.; Easterbrook, S.

    2009-12-01

    A climate model is an executable theory of the climate; the model encapsulates climatological theories in software so that they can be simulated and their implications investigated directly. Thus, in order to trust a climate model one must trust that the software it is built from is robust. Our study explores the nature of software quality in the context of climate modelling: How do we characterise and assess the quality of climate modelling software? We use two major research strategies: (1) analysis of defect densities of leading global climate models and (2) semi-structured interviews with researchers from several climate modelling centres. Defect density analysis is an established software engineering technique for studying software quality. We collected our defect data from bug tracking systems, version control repository comments, and from static analysis of the source code. As a result of our analysis, we characterise common defect types found in climate model software and we identify the software quality factors that are relevant for climate scientists. We also provide a roadmap to achieve proper benchmarks for climate model software quality, and we discuss the implications of our findings for the assessment of climate model software trustworthiness.
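
    As a minimal sketch of the defect-density analysis described above, the snippet below normalizes defect counts (as might be collected from bug trackers or version control comments) by source size; the model names, counts, and line totals are hypothetical, not the study's data.

        # Illustrative defect-density calculation for hypothetical climate models.
        def defect_density(defect_count: int, sloc: int) -> float:
            """Defects per thousand source lines of code (KSLOC)."""
            return defect_count / (sloc / 1000)

        models = {
            # model name: (defects collected, source lines of code)
            "model_a": (220, 410_000),
            "model_b": (95, 180_000),
        }

        for name, (defects, sloc) in models.items():
            print(f"{name}: {defect_density(defects, sloc):.2f} defects/KSLOC")
        # model_a: 0.54 defects/KSLOC
        # model_b: 0.53 defects/KSLOC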

  14. Collected software engineering papers, volume 2

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Topics addressed include: summaries of the software engineering laboratory (SEL) organization, operation, and research activities; results of specific research projects in the areas of resource models and software measures; and strategies for data collection for software engineering research.

  15. The Research of Software Engineering Curriculum Reform

    NASA Astrophysics Data System (ADS)

    Kuang, Li-Qun; Han, Xie

    Because software engineering training cannot meet the needs of the software industry, this paper analyzes several outstanding problems in software engineering curriculum teaching, such as outdated teaching content, weak practical training, and low teacher quality. We propose teaching reforms guided by market demand: updating the teaching content, optimizing the teaching methods, reforming the teaching practice, and strengthening teacher-student exchange so that teachers and students improve together. We carried out and actively explored the reform and achieved the desired results.

  16. Engine Structural Analysis Software

    NASA Technical Reports Server (NTRS)

    McKnight, R. L.; Maffeo, R. J.; Schrantz, S.; Hartle, M. S.; Bechtel, G. S.; Lewis, K.; Ridgway, M.; Chamis, Christos C. (Technical Monitor)

    2001-01-01

    The report describes the technical effort to develop: (1) geometry recipes for nozzles, inlets, disks, frames, shafts, and ducts in finite element form, (2) component design tools for nozzles, inlets, disks, frames, shafts, and ducts which utilize the recipes and (3) an integrated design tool which combines the simulations of the nozzles, inlets, disks, frames, shafts, and ducts with the previously developed combustor, turbine blade, and turbine vane models for a total engine representation. These developments will be accomplished in cooperation and in conjunction with comparable efforts of NASA Glenn Research Center.

  17. Software quality assurance plans for safety-critical software

    SciTech Connect

    Liddle, P.

    2006-07-01

    Application software is defined as safety-critical if a fault in the software could prevent the system components from performing their nuclear-safety functions. Therefore, for nuclear-safety systems, the AREVA TELEPERM XS (TXS) system is classified 1E, as defined in the Institute of Electrical and Electronics Engineers (IEEE) Std 603-1998. The application software is classified as Software Integrity Level (SIL)-4, as defined in IEEE Std 7-4.3.2-2003. The AREVA NP Inc. Software Program Manual (SPM) describes the measures taken to ensure that the TELEPERM XS application software attains a level of quality commensurate with its importance to safety. The manual also describes how TELEPERM XS correctly performs the required safety functions and conforms to established technical and documentation requirements, conventions, rules, and standards. The program manual covers the requirements definition, detailed design, integration, and test phases for the TELEPERM XS application software, and supporting software created by AREVA NP Inc. The SPM is required for all safety-related TELEPERM XS system applications. The program comprises several basic plans and practices: 1. A Software Quality-Assurance Plan (SQAP) that describes the processes necessary to ensure that the software attains a level of quality commensurate with its importance to safety function. 2. A Software Safety Plan (SSP) that identifies the process to reasonably ensure that safety-critical software performs as intended during all abnormal conditions and events, and does not introduce any new hazards that could jeopardize the health and safety of the public. 3. A Software Verification and Validation (V and V) Plan that describes the method of ensuring the software is in accordance with the requirements. 4. A Software Configuration Management Plan (SCMP) that describes the method of maintaining the software in an identifiable state at all times. 5. A Software Operations and Maintenance Plan (SO and MP) that

  18. Software Quality Assurance Audits Guidebooks

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The growth in cost and importance of software to NASA has caused NASA to address the improvement of software development across the agency. One of the products of this program is a series of guidebooks that define a NASA concept of the assurance processes that are used in software development. The Software Assurance Guidebook, NASA-GB-A201, issued in September, 1989, provides an overall picture of the NASA concepts and practices in software assurance. Second level guidebooks focus on specific activities that fall within the software assurance discipline, and provide more detailed information for the manager and/or practitioner. This is the second level Software Quality Assurance Audits Guidebook that describes software quality assurance audits in a way that is compatible with practices at NASA Centers.

  19. Culture shock: Improving software quality

    SciTech Connect

    de Jong, K.; Trauth, S.L.

    1988-01-01

    The concept of software quality can represent a significant shock to an individual who has been developing software for many years and who believes he or she has been doing a high quality job. The very idea that software includes lines of code and associated documentation is foreign and difficult to grasp, at best. Implementation of a software quality program hinges on the concept that software is a product whose quality needs improving. When this idea is introduced into a technical community that is largely "self-taught" and has been producing "good" software for some time, a fundamental understanding of the concepts associated with software is often weak. Software developers can react as if to say, "What are you talking about? What do you mean I'm not doing a good job? I haven't gotten any complaints about my code yet!" Coupling such surprise and resentment with the shock that software really is a product and software quality concepts do exist can fuel the volatility of these emotions. In this paper, we demonstrate that the concept of software quality can indeed pose a culture shock to developers. We also show that a "typical" quality assurance approach, that of imposing a standard and providing inspectors and auditors to assure its adherence, contributes to this shock and detracts from the very goal the approach should achieve. We offer an alternative, adopted through experience, to implement a software quality program: cooperative assistance. We show how cooperation, education, consultation, and friendly assistance can overcome this culture shock. 3 refs.

  20. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon D.

    1991-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. All materials have been grouped into eight general subject areas for easy reference: The Software Engineering Laboratory; The Software Engineering Laboratory: Software Development Documents; Software Tools; Software Models; Software Measurement; Technology Evaluations; Ada Technology; and Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  1. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1993-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory: software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. This document contains an index of these publications classified by individual author.

  2. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Groves, Paula; Valett, Jon

    1990-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is given. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: the Software Engineering Laboratory; the Software Engineering Laboratory-software development documents; software tools; software models; software measurement; technology evaluations; Ada technology; and data collection. Subject and author indexes further classify these documents by specific topic and individual author.

  3. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  4. Software quality: Process or people

    NASA Technical Reports Server (NTRS)

    Palmer, Regina; Labaugh, Modenna

    1993-01-01

    This paper presents data related to software development processes and personnel involvement from the perspective of software quality assurance. We examine eight years of data collected from six projects. The data collected varied by project but usually included defect and fault density, with limited use of code metrics, schedule adherence, and budget growth information. The data are a blend of AFSCP 800-14 measures and the productivity measures suggested in Software Metrics: A Practitioner's Guide to Improved Product Development. A software quality assurance database tool, SQUID, was used to store and tabulate the data.
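
    A minimal, illustrative sketch of the defect- and fault-density style of measure this kind of quality-assurance data set records; the project names and numbers below are invented, and the paper's SQUID tool is not reproduced here.

      # Illustrative only: defect density per KSLOC for hypothetical projects.
      def defect_density(defects_found: int, ksloc: float) -> float:
          """Defects per thousand source lines of code (KSLOC)."""
          return defects_found / ksloc

      projects = {
          "PROJ-A": {"defects": 120, "ksloc": 45.0},
          "PROJ-B": {"defects": 64, "ksloc": 30.5},
      }

      for name, data in projects.items():
          print(f"{name}: {defect_density(data['defects'], data['ksloc']):.2f} defects/KSLOC")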

  5. Practical quality metrics for resolution enhancement software

    NASA Astrophysics Data System (ADS)

    Boone, Robert E.; Lucas, Kevin; Wynd, Raphael; Boatright, Mike; Thompson, Matthew A.; Reich, Alfred J.

    2003-06-01

    The past few years have seen an explosion in the application of software techniques to improve lithographic printing. Techniques such as optical proximity correction (OPC) and phase shift masks (PSM) increase resolution and CD control by distorting the mask pattern data from the original designed pattern. These software techniques are becoming increasingly complicated and non-intuitive, and the rate of complexity increase appears to be accelerating [1]. The benefits of these techniques in improving CD control and lowering cost of ownership (COO) are balanced against the effort required to implement them and the additional problems they create. One severe problem for users of immature and complex software tools and methodologies is quality control [2], as it ultimately becomes a COO problem. Software quality can be defined very simply as the ability of an application to meet detailed customer requirements. Software quality practice can be defined as the adherence to proven methods for planning, developing, testing, and maintaining software. Although software quality for lithographic resolution enhancement is extremely important, the understanding and recognition of good software development practices among lithographers is generally poor. We therefore start by reviewing the essential terms and concepts of software quality that impact lithography and COO. We then propose methods by which semiconductor process and design engineers can estimate and compare the quality of the software tools and vendors they are evaluating or using. We include examples from advanced process technology resolution enhancement work that highlight the need for high-quality software practices, and show how to avoid many problems. Note that, although several authors have worked in software application development, our analysis here is largely a black-box analysis. The black box is the software development organization of an RET software supplier. Our access to actual developers within these

  6. Software Engineering Improvement Activities/Plan

    NASA Technical Reports Server (NTRS)

    2003-01-01

    bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model; a literature search and evaluation of software tools available for code analysis and requirements analysis; and participation in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED14 in accomplishing the software engineering improvement activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.

  7. Training, Quality Assurance Factors, and Tools Investigation: a Work Report and Suggestions on Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Lee, Pen-Nan

    1991-01-01

    Several research tasks were previously conducted, observations were obtained, and possible suggestions were contemplated involving software quality assurance engineering at NASA Johnson. These research tasks are briefly described. A brief discussion is also given on the role of software quality assurance in software engineering, along with some observations and suggestions. A brief discussion of a training program for software quality assurance engineers is provided. A list of assurance factors as well as quality factors is also included. Finally, a process model that can be used for searching and collecting software quality assurance tools is presented.

  8. A Guideline of Using Case Method in Software Engineering Courses

    ERIC Educational Resources Information Center

    Zainal, Dzulaiha Aryanee Putri; Razali, Rozilawati; Shukur, Zarina

    2014-01-01

    Software Engineering (SE) education has been reported to fall short in producing high quality software engineers. In seeking alternative solutions, Case Method (CM) is regarded as having potential to solve the issue. CM is a teaching and learning (T&L) method that has been found to be effective in Social Science education. In principle,…

  9. Software cost/resource modeling: Software quality tradeoff measurement

    NASA Technical Reports Server (NTRS)

    Lawler, R. W.

    1980-01-01

    A conceptual framework for treating software quality from a total system perspective is developed. Examples are given to show how system quality objectives may be allocated to hardware and software; to illustrate trades among quality factors, both hardware and software, to achieve system performance objectives; and to illustrate the impact of certain design choices on software functionality.

  10. The Effects of Development Team Skill on Software Product Quality

    NASA Technical Reports Server (NTRS)

    Beaver, Justin M.; Schiavone, Guy A.

    2006-01-01

    This paper provides an analysis of the effect of the skill and experience of the software development team on the quality of the final software product. A method for the assessment of software development team skill and experience is proposed, derived from a workforce management tool currently in use by the National Aeronautics and Space Administration. Using data from 26 small-scale software development projects, the team skill measures are correlated to 5 software product quality metrics from the ISO/IEC 9126 Software Engineering Product Quality standard. In the analysis of the results, development team skill is found to be a significant factor in the adequacy of the design and implementation. In addition, the results imply that inexperienced software developers are tasked with responsibilities ill-suited to their skill level, and thus have a significant adverse effect on the quality of the software product. Keywords: software quality, development skill, software metrics
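
    As a rough illustration of the correlation analysis described above (the numbers are invented, not the 26-project NASA data set), the following sketch computes a Pearson correlation between a team skill score and a single quality metric.

      # Hypothetical data: team skill score vs. post-delivery defect rate.
      import numpy as np

      team_skill = np.array([2.1, 3.4, 1.8, 4.0, 2.9, 3.7])    # e.g., composite skill/experience score
      defect_rate = np.array([9.5, 4.2, 11.0, 2.8, 6.1, 3.5])  # e.g., defects per KSLOC after delivery

      r = np.corrcoef(team_skill, defect_rate)[0, 1]  # Pearson correlation coefficient
      print(f"Pearson r between team skill and defect rate: {r:.2f}")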

  11. Software Engineering Program: Software Process Improvement Guidebook

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.

  12. Improving Software Engineering on NASA Projects

    NASA Technical Reports Server (NTRS)

    Crumbley, Tim; Kelly, John C.

    2010-01-01

    Software Engineering Initiative: reduces risk of software failure and increases mission safety; yields more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.

  13. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Buhler, Melanie; Valett, Jon

    1989-01-01

    An annotated bibliography is presented of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. The bibliography was updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials were grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. Subject and author indexes further classify these documents by specific topic and individual author.

  14. Software engineering from a Langley perspective

    NASA Technical Reports Server (NTRS)

    Voigt, Susan

    1994-01-01

    A brief introduction to software engineering is presented. The talk is divided into four sections beginning with the question 'What is software engineering', followed by a brief history of the progression of software engineering at the Langley Research Center in the context of an expanding computing environment. Several basic concepts and terms are introduced, including software development life cycles and maturity levels. Finally, comments are offered on what software engineering means for the Langley Research Center and where to find more information on the subject.

  15. Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Kistler, David; Bristow, John; Smith, Don

    1994-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. Nearly 200 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) The Software Engineering Laboratory; (2) The Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  16. Software Engineering Laboratory Series: Proceedings of the Twenty-First Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1996-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  17. Software Engineering Laboratory Series: Proceedings of the Twenty-Second Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1997-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  18. Software Engineering Laboratory Series: Proceedings of the Twentieth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of application software. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that includes this document.

  19. Requirements Engineering in Building Climate Science Software

    ERIC Educational Resources Information Center

    Batcheller, Archer L.

    2011-01-01

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling…

  20. Ten recommendations for software engineering in research.

    PubMed

    Hastings, Janna; Haug, Kenneth; Steinbeck, Christoph

    2014-01-01

    Research in the context of data-driven science requires a backbone of well-written software, but scientific researchers are typically not trained at length in software engineering, the principles for creating better software products. To address this gap, in particular for young researchers new to programming, we give ten recommendations to ensure the usability, sustainability and practicality of research software. PMID:25685331

  1. Annotated bibliography of Software Engineering Laboratory literature

    NASA Technical Reports Server (NTRS)

    1985-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is presented. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials are grouped into five general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) software tools; (3) models and measures; (4) technology evaluations; and (5) data collection. An index further classifies these documents by specific topic.

  2. Modular Rocket Engine Control Software (MRECS)

    NASA Technical Reports Server (NTRS)

    Tarrant, C.; Crook, J.

    1998-01-01

    The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for advanced engine control systems that will result in lower software maintenance (operations) costs. It effectively accommodates software requirement changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives, benefits, and status of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, reuse software, and reduced software reverification time related to software changes. MRECS was recently modified to support a Space Shuttle Main Engine (SSME) hot-fire test. Cold Flow and Flight Readiness Testing were completed before the test was cancelled. Currently, the program is focused on supporting NASA MSFC in accomplishing development testing of the Fastrac Engine, part of NASA's Low Cost Technologies (LCT) Program. MRECS will be used for all engine development testing.

  3. A reflection on Software Engineering in HEP

    NASA Astrophysics Data System (ADS)

    Carminati, Federico

    2012-12-01

    High Energy Physics (HEP) has been making very extensive usage of computers to achieve its research goals. Fairly large program suites have been developed, maintained and used over the years and it is fair to say that, overall, HEP has been successful in software development. Yet, HEP software development has not used classical Software Engineering techniques, which have been invented and refined to help the production of large programmes. In this paper we will review the development of HEP code with its strengths and weaknesses. Using several well-known HEP software projects as examples, we will try to demonstrate that our community has used a form of Software Engineering, albeit in an informal manner. The software development techniques employed in these projects are indeed very close in many aspects to the modern tendencies of Software Engineering itself, in particular the so-called “agile technologies”. The paper will conclude with an outlook on the future of software development in HEP.

  4. Evaluating software development characteristics: Assessment of software measures in the Software Engineering Laboratory. [reliability engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1981-01-01

    Work on metrics is discussed. Factors that affect software quality are reviewed. Metrics are discussed in terms of criteria achievement, reliability, and fault tolerance. Subjective and objective metrics are distinguished. Product/process and cost/quality metrics are characterized and discussed.

  5. Software Engineering Education: Some Important Dimensions

    ERIC Educational Resources Information Center

    Mishra, Alok; Cagiltay, Nergiz Ercil; Kilic, Ozkan

    2007-01-01

    Software engineering education has been emerging as an independent and mature discipline. Accordingly, various studies are being done to provide guidelines for curriculum design. The main focus of these guidelines is around core and foundation courses. This paper summarizes the current problems of software engineering education programs. It also…

  6. Modular Rocket Engine Control Software (MRECS)

    NASA Technical Reports Server (NTRS)

    Tarrant, Charlie; Crook, Jerry

    1997-01-01

    The Modular Rocket Engine Control Software (MRECS) Program is a technology demonstration effort designed to advance the state-of-the-art in launch vehicle propulsion systems. Its emphasis is on developing and demonstrating a modular software architecture for a generic, advanced engine control system that will result in lower software maintenance (operations) costs. It effectively accommodates software requirements changes that occur due to hardware technology upgrades and engine development testing. Ground rules directed by MSFC were to optimize modularity and implement the software in the Ada programming language. MRECS system software and the software development environment utilize Commercial-Off-the-Shelf (COTS) products. This paper presents the objectives and benefits of the program. The software architecture, design, and development environment are described. MRECS tasks are defined and timing relationships given. Major accomplishments are listed. MRECS offers benefits to a wide variety of advanced technology programs in the areas of modular software architecture, reuse software, and reduced software reverification time related to software changes. Currently, the program is focused on supporting MSFC in accomplishing a Space Shuttle Main Engine (SSME) hot-fire test at Stennis Space Center and the Low Cost Boost Technology (LCBT) Program.

  7. Implementing Large Projects in Software Engineering Courses

    ERIC Educational Resources Information Center

    Coppit, David

    2006-01-01

    In software engineering education, large projects are widely recognized as a useful way of exposing students to the real-world difficulties of team software development. But large projects are difficult to put into practice. First, educators rarely have additional time to manage software projects. Second, classrooms have inherent limitations that…

  8. Selection of software for mechanical engineering undergraduates

    NASA Astrophysics Data System (ADS)

    Cheah, C. T.; Yin, C. S.; Halim, T.; Naser, J.; Blicblau, A. S.

    2016-07-01

    A major problem with the undergraduate mechanical engineering course is the limited exposure of students to software packages, coupled with the long learning curve of the existing software packages. This work proposes the use of appropriate software packages for the entire mechanical engineering curriculum to ensure that students get sufficient exposure to real-life design problems. A variety of software packages are highlighted as being suitable for undergraduate work in mechanical engineering, e.g. simultaneous non-linear equations; uncertainty analysis; 3-D modeling software with FEA; and analysis tools for the solution of problems in thermodynamics, fluid mechanics, mechanical system design, and solid mechanics.

  9. Glossary of Software Engineering Laboratory terms

    NASA Technical Reports Server (NTRS)

    1983-01-01

    A glossary of terms used in the Software Engineering Laboratory (SEL) is given. The terms are defined within the context of the software development environment for flight dynamics at the Goddard Space Flight Center. A concise reference for clarifying the language employed in SEL documents and data collection forms is given. Basic software engineering concepts are explained and standard definitions for use by SEL personnel are established.

  10. MOSS, an evaluation of software engineering techniques

    NASA Technical Reports Server (NTRS)

    Bounds, J. R.; Pruitt, J. L.

    1976-01-01

    An evaluation of the software engineering techniques used for the development of a Modular Operating System (MOSS) was described. MOSS is a general purpose real time operating system which was developed for the Concept Verification Test (CVT) program. Each of the software engineering techniques was described and evaluated based on the experience of the MOSS project. Recommendations for the use of these techniques on future software projects were also given.

  11. Software Process Improvement through the Removal of Project-Level Knowledge Flow Obstacles: The Perceptions of Software Engineers

    ERIC Educational Resources Information Center

    Mitchell, Susan Marie

    2012-01-01

    Uncontrollable costs, schedule overruns, and poor end product quality continue to plague the software engineering field. Innovations formulated with the expectation to minimize or eliminate cost, schedule, and quality problems have generally fallen into one of three categories: programming paradigms, software tools, and software process…

  12. Engine Structures Modeling Software System (ESMOSS)

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Engine Structures Modeling Software System (ESMOSS) is the development of a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components, and substructures which can be transferred to finite element analysis programs such as NASTRAN. The NASA Lewis Engine Structures Program is concerned with the development of technology for the rational structural design and analysis of advanced gas turbine engines with emphasis on advanced structural analysis, structural dynamics, structural aspects of aeroelasticity, and life prediction. Fundamental and common to all of these developments is the need for geometric and analytical model descriptions at various engine assembly levels which are generated using ESMOSS.

  13. Collected Software Engineering Papers, Volume 10

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from Oct. 1991 - Nov. 1992. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document. For the convenience of this presentation, the 11 papers contained here are grouped into 5 major sections: (1) the Software Engineering Laboratory; (2) software tools studies; (3) software models studies; (4) software measurement studies; and (5) Ada technology studies.

  14. Development of a comprehensive software engineering environment

    NASA Technical Reports Server (NTRS)

    Hartrum, Thomas C.; Lamont, Gary B.

    1987-01-01

    The generation of a set of tools for software lifecycle is a recurring theme in the software engineering literature. The development of such tools and their integration into a software development environment is a difficult task because of the magnitude (number of variables) and the complexity (combinatorics) of the software lifecycle process. An initial development of a global approach was initiated in 1982 as the Software Development Workbench (SDW). Continuing efforts focus on tool development, tool integration, human interfacing, data dictionaries, and testing algorithms. Current efforts are emphasizing natural language interfaces, expert system software development associates and distributed environments with Ada as the target language. The current implementation of the SDW is on a VAX-11/780. Other software development tools are being networked through engineering workstations.

  15. Pragmatic quality metrics for evolutionary software development models

    NASA Technical Reports Server (NTRS)

    Royce, Walker

    1990-01-01

    Due to the large number of product, project, and people parameters which impact large custom software development efforts, measurement of software product quality is a complex undertaking. Furthermore, the absolute perspective from which quality is measured (customer satisfaction) is intangible. While we probably can't say what the absolute quality of a software product is, we can determine the relative quality, the adequacy of this quality with respect to pragmatic considerations, and identify good and bad trends during development. While no two software engineers will ever agree on an optimum definition of software quality, they will agree that the most important perspective of software quality is its ease of change. We can call this flexibility, adaptability, or some other vague term, but the critical characteristic of software is that it is soft. The easier the product is to modify, the easier it is to achieve any other software quality perspective. This paper presents objective quality metrics derived from consistent lifecycle perspectives of rework which, when used in concert with an evolutionary development approach, can provide useful insight to produce better quality per unit cost/schedule or to achieve adequate quality more efficiently. The usefulness of these metrics is evaluated by applying them to a large, real world, Ada project.
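
    As a sketch of the kind of rework-oriented indicator the abstract describes (the paper's actual metric definitions are not reproduced, and the effort numbers below are invented), one could track the fraction of total effort spent on rework across successive builds; a falling trend suggests the product is becoming easier to change.

      # Illustrative rework-ratio trend over successive builds (hypothetical effort data).
      builds = [
          {"build": 1, "total_hours": 1200, "rework_hours": 420},
          {"build": 2, "total_hours": 1100, "rework_hours": 300},
          {"build": 3, "total_hours": 1300, "rework_hours": 260},
      ]

      for b in builds:
          ratio = b["rework_hours"] / b["total_hours"]
          print(f"Build {b['build']}: rework ratio = {ratio:.1%}")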

  16. Software Engineering Approaches to Ontology Development

    NASA Astrophysics Data System (ADS)

    Gaševic, Dragan; Djuric, Dragan; Devedžic, Vladan

    Ontologies, as formal representations of domain knowledge, enable knowledge sharing between different knowledge-based applications. Diverse techniques originating from the field of artificial intelligence are aimed at facilitating ontology development. However, these techniques, although well known to AI experts, are typically unknown to a large population of software engineers. In order to overcome the gap between the knowledge of software engineering practitioners and AI techniques, a few proposals have been made suggesting the use of well-known software engineering techniques, such as UML, for ontology development (Cranefield 2001a).

  17. Software engineering laboratory series: Annotated bibliography of software engineering laboratory literature

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Valett, Jon

    1992-01-01

    This document is an annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory. More than 100 publications are summarized. These publications cover many areas of software engineering and range from research reports to software documentation. This document has been updated and reorganized substantially since the original version (SEL-82-006, November 1982). All materials have been grouped into eight general subject areas for easy reference: (1) the Software Engineering Laboratory; (2) the Software Engineering Laboratory: Software Development Documents; (3) Software Tools; (4) Software Models; (5) Software Measurement; (6) Technology Evaluations; (7) Ada Technology; and (8) Data Collection. This document contains an index of these publications classified by individual author.

  18. The Software Engineering Laboratory: An operational software experience factory

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon

    1992-01-01

    For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.

  19. Studies and experiments in the Software Engineering Lab (SEL)

    NASA Technical Reports Server (NTRS)

    Mcgarry, F. E.; Card, D. N.

    1985-01-01

    The Software Engineering Laboratory (SEL) is an organization created nearly 10 years ago for the purpose of identifying, measuring, and applying quality software engineering techniques in a production environment. The members of the SEL include NASA/Goddard Space Flight Center (GSFC), the sponsor and organizer; the University of Maryland; and Computer Sciences Corporation. Since its inception the SEL has conducted numerous experiments and has evaluated a wide range of software technologies. This paper describes several of the more recent experiments as well as some of the general conclusions the SEL has reached.

  20. The role of software engineering in the space station program

    NASA Technical Reports Server (NTRS)

    Hall, Dana

    1988-01-01

    Software engineering applications snapshots within the Space Station Freedom Program; software engineering and Ada training; software reuse; hierarchial command and control; program characteristics; integrated, international environments; software production, integration, and management; and integrated simulation environment are outlined in viewgraph format.

  1. Software-Engineering Process Simulation (SEPS) model

    NASA Technical Reports Server (NTRS)

    Lin, C. Y.; Abdel-Hamid, T.; Sherif, J. S.

    1992-01-01

    The Software Engineering Process Simulation (SEPS) model, developed at JPL, is described. SEPS is a dynamic simulation model of the software project development process. It uses the feedback principles of system dynamics to simulate the dynamic interactions among various software life cycle development activities and management decision-making processes. The model is designed to be a planning tool to examine tradeoffs of cost, schedule, and functionality, and to test the implications of different managerial policies on a project's outcome. Furthermore, SEPS will enable software managers to gain a better understanding of the dynamics of software project development and to perform postmortem assessments.
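
    The following toy model is not the actual SEPS formulation; every parameter is invented for illustration. It only shows the flavor of a system-dynamics simulation in which latent errors feed back to slow productive work.

      # Toy feedback model of a software project (illustrative parameters only).
      remaining_tasks = 1000.0     # tasks left to complete
      undiscovered_errors = 0.0    # latent defects that will generate rework
      staff = 10.0                 # full-time developers
      nominal_rate = 1.0           # tasks per developer per day
      error_injection = 0.15       # errors created per completed task
      detect_fraction = 0.05       # fraction of latent errors found (and reworked) per day

      day = 0
      while remaining_tasks > 0 and day < 1000:
          rework = undiscovered_errors * detect_fraction          # effort diverted to fixes
          progress = max(staff * nominal_rate - rework, 0.0)      # net productive output
          remaining_tasks -= progress
          undiscovered_errors += progress * error_injection - rework
          day += 1

      print(f"Simulated completion in {day} days")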

  2. Repository-Based Software Engineering (RBSE) program

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Support of a software engineering program was provided in the following areas: client/customer liaison; research representation/outreach; and program support management. Additionally, a list of deliverables is presented.

  3. Software engineering environment tool set integration

    NASA Technical Reports Server (NTRS)

    Selfridge, William P.

    1986-01-01

    Space Transportation System Division (STSD) Engineering has a program to promote excellence within the engineering function. This program resulted in a capital-funded facility based on a VAX cluster called the Rockwell Operational Engineering System (ROSES). The second phase of a three-phase plan to establish an integrated software engineering environment for ROSES is examined. Phase one, which established the basic capability for a modern software development environment, including a tool set, training, and standards, is discussed briefly. Phase two is tool set integration. The tool set consists primarily of off-the-shelf tools acquired through vendors or government agencies (public domain). These tools were placed into categories of software development: requirements, design, and construction support; verification and validation support; and software management support. The integration of the tool set is being performed through concept prototyping and development of tools specifically designed to support the life cycle and provide transition from one phase to the next.

  4. A Discussion of the Software Quality Assurance Role

    NASA Technical Reports Server (NTRS)

    Kandt, Ronald Kirk

    2010-01-01

    The basic idea underlying this paper is that the conventional understanding of the role of a Software Quality Assurance (SQA) engineer is unduly limited. This is because few have asked who the customers of a SQA engineer are. Once you do this, you can better define what tasks a SQA engineer should perform, as well as identify the knowledge and skills that such a person should have. The consequence of doing this is that a SQA engineer can provide greater value to his or her customers. It is the position of this paper that a SQA engineer providing significant value to his or her customers must not only assume the role of an auditor, but also that of a software and systems engineer. This is because software engineers and their managers particularly value contributions that directly impact products and their development. These ideas are summarized as lessons learned, based on my experience at Jet Propulsion Laboratory (JPL).

  5. Software process improvement in the NASA software engineering laboratory

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin

    1994-01-01

    The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.

  6. Glossary of software engineering laboratory terms

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A glossary of terms used in the Software Engineering Laboratory (SEL) is presented. The terms are defined within the context of the software development environment for flight dynamics at Goddard Space Flight Center. A concise reference for clarifying and understanding the language employed in SEL documents and data collection forms is provided.

  7. Software engineering technology transfer: Understanding the process

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1993-01-01

    Technology transfer is of crucial concern to both government and industry today. In this report, the mechanisms developed by NASA to transfer technology are explored and the actual mechanisms used to transfer software development technologies are investigated. Time, cost, and effectiveness of software engineering technology transfer is reported.

  8. An Ontology for Software Engineering Education

    ERIC Educational Resources Information Center

    Ling, Thong Chee; Jusoh, Yusmadi Yah; Adbullah, Rusli; Alwi, Nor Hayati

    2013-01-01

    Software agents communicate using ontology. It is important to build an ontology for a specific domain such as Software Engineering Education. Building an ontology from scratch is not only hard, but also incurs much time and cost. This study aims to propose an ontology through adaptation of the existing ontology which is originally built based on a…

  9. SERI QC Solar Data Quality Assessment Software

    Energy Science and Technology Software Center (ESTSC)

    1994-12-31

    SERI QC is a mathematical software package that assesses the quality of solar radiation data. The SERI QC software is a function written in the C programming language. IT IS NOT A STANDALONE SOFTWARE APPLICATION. The user must write the calling application that requires quality assessment of solar data. The C function returns data quality flags to the calling program. A companion program, QCFIT, is a standalone Windows application that provides support files for the SERI QC function (data quality boundaries). The QCFIT software can also be used as an analytical tool for visualizing solar data quality independent of the SERI QC function.

  10. SERI QC Solar Data Quality Assessment Software

    SciTech Connect

    1994-12-31

    SERI QC is a mathematical software package that assesses the quality of solar radiation data. The SERI QC software is a function written in the C programming language. IT IS NOT A STANDALONE SOFTWARE APPLICATION. The user must write the calling application that requires quality assessment of solar data. The C function returns data quality flags to the calling program. A companion program, QCFIT, is a standalone Windows application that provides support files for the SERI QC function (data quality boundaries). The QCFIT software can also be used as an analytical tool for visualizing solar data quality independent of the SERI QC function.
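
    A sketch of the calling-application pattern the two records above describe. The real SERI QC entry point is a C function whose exact name and signature are not given here, so the 'seri_qc' placeholder below is purely hypothetical and only illustrates the control flow: the caller supplies solar radiation records and receives quality flags back.

      # 'seri_qc' is a hypothetical stand-in for the real C function; its logic is invented.
      def seri_qc(global_irr, direct_irr, diffuse_irr, zenith_deg):
          """Return a bit-mask quality flag for one solar radiation record (illustrative only)."""
          flag = 0
          if not (0.0 <= global_irr <= 1500.0):      # crude plausibility bound, for illustration
              flag |= 0x01
          if direct_irr < 0.0 or diffuse_irr < 0.0:  # negative components are physically impossible
              flag |= 0x02
          return flag

      # The calling application loops over its own data and stores the returned flags.
      records = [(850.0, 700.0, 150.0, 35.0), (1700.0, 900.0, 200.0, 20.0)]
      for rec in records:
          print(rec, "-> quality flag:", seri_qc(*rec))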

  11. Happy software developers solve problems better: psychological measurements in empirical software engineering

    PubMed Central

    Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers’ productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states—emotions and moods—deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint

  12. Happy software developers solve problems better: psychological measurements in empirical software engineering.

    PubMed

    Graziotin, Daniel; Wang, Xiaofeng; Abrahamsson, Pekka

    2014-01-01

    For more than thirty years, it has been claimed that a way to improve software developers' productivity and software quality is to focus on people and to provide incentives to make developers satisfied and happy. This claim has rarely been verified in software engineering research, which faces an additional challenge in comparison to more traditional engineering fields: software development is an intellectual activity and is dominated by often-neglected human factors (called human aspects in software engineering research). Among the many skills required for software development, developers must possess high analytical problem-solving skills and creativity for the software construction process. According to psychology research, affective states-emotions and moods-deeply influence the cognitive processing abilities and performance of workers, including creativity and analytical problem solving. Nonetheless, little research has investigated the correlation between the affective states, creativity, and analytical problem-solving performance of programmers. This article echoes the call to employ psychological measurements in software engineering research. We report a study with 42 participants to investigate the relationship between the affective states, creativity, and analytical problem-solving skills of software developers. The results offer support for the claim that happy developers are indeed better problem solvers in terms of their analytical abilities. The following contributions are made by this study: (1) providing a better understanding of the impact of affective states on the creativity and analytical problem-solving capacities of developers, (2) introducing and validating psychological measurements, theories, and concepts of affective states, creativity, and analytical-problem-solving skills in empirical software engineering, and (3) raising the need for studying the human factors of software engineering by employing a multidisciplinary viewpoint. PMID

  13. The IEEE Software Engineering Standards Process

    PubMed Central

    Buckley, Fletcher J.

    1984-01-01

    Software Engineering has emerged as a field in recent years, and those involved increasingly recognize the need for standards. As a result, members of the Institute of Electrical and Electronics Engineers (IEEE) formed a subcommittee to develop these standards. This paper discusses the ongoing standards development, and associated efforts.

  14. Software engineering education in medical informatics.

    PubMed

    Leven, F J

    1989-11-01

    Requirements and approaches of Software Engineering education in the field of Medical Informatics are described with respect to the impact of (1) experiences characterizing the "software misery", (2) status and tendencies in software methodology, and (3) educational status and needs in computer science education influenced by the controversy "theoretical versus practical education". Special attention is directed toward the growing importance of analysis, design methods, and techniques in the professional spectrum of Medical Informatics, the relevance of general principles of systems engineering in health care, the potential of non-procedural programming paradigms, and the intersection of Artificial Intelligence and education. Realizations of and experiences with programs in the field of Software Engineering are reported with respect to special requirements in Medical Informatics. PMID:2695780

  15. Design of software engineering teaching website

    NASA Astrophysics Data System (ADS)

    Li, Yuxiang; Liu, Xin; Zhang, Guangbin; Liu, Xingshun; Gao, Zhenbo

    "􀀶oftware engineering" is different from the general professional courses, it is born for getting rid of the software crisis and adapting to the development of software industry, it is a theory course, especially a practical course. However, due to the own characteristics of software engineering curriculum, in the daily teaching process, concerning theoretical study, students may feel boring, obtain low interest in learning and poor test results and other problems. ASPNET design technique is adopted and Access 2007 database is used for system to design and realize "Software Engineering" teaching website. System features mainly include theoretical teaching, case teaching, practical teaching, teaching interaction, database, test item bank, announcement, etc., which can enhance the vitality, interest and dynamic role of learning.

  16. Collected software engineering papers, volume 8

    NASA Technical Reports Server (NTRS)

    1990-01-01

    A collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period November 1989 through October 1990 is presented. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography. The seven presented papers are grouped into four major categories: (1) experimental research and evaluation of software measurement; (2) studies on models for software reuse; (3) a software tool evaluation; and (4) Ada technology and studies in the areas of reuse and specification.

  17. Software systems development in petroleum engineering

    NASA Astrophysics Data System (ADS)

    Browning, D. J.; Cain, G. M.; Carmichael, N. P.; Gouldstone, F. G.; Wadsley, A. W.; Webb, S. J.; Winder, P.

    1985-10-01

    Many approaches to designing software systems have been developed for use in commercial or business environments. These development methods and procedures have improved dramatically over the last ten years although it is only recently that these have been employed in scientific and technological applications. Many of these implementations have been unsuccessful because the design methodology has been divorced from the practical requirements of the industry in which the software system is to operate. This paper discusses a modern approach to software development which directly relates to an engineering environment and which is designed to satisfy practical criteria of acceptability of the software when delivered to the petroleum engineer. Since all field developments nowadays rely heavily on associated software systems, the approach presented here can lead to improved mechanical systems reliability and shorter development/design cycles.

  18. ETICS: the international software engineering service for the grid

    NASA Astrophysics Data System (ADS)

    Meglio, A. D.; Bégin, M.-E.; Couvares, P.; Ronchieri, E.; Takacs, E.

    2008-07-01

    The ETICS system is a distributed software configuration, build and test system designed to fulfil the needs of improving the quality, reliability and interoperability of distributed software in general and grid software in particular. The ETICS project is a consortium of five partners (CERN, INFN, Engineering Ingegneria Informatica, 4D Soft and the University of Wisconsin-Madison). The ETICS service consists of a build and test job execution system based on the Metronome software and an integrated set of web services and software engineering tools to design, maintain and control build and test scenarios. The ETICS system allows taking into account complex dependencies among applications and middleware components and provides a rich environment to perform static and dynamic analysis of the software and execute deployment, system and interoperability tests. This paper gives an overview of the system architecture and functionality set and then describes how the EC-funded EGEE, DILIGENT and OMII-Europe projects are using the software engineering services to build, validate and distribute their software. Finally a number of significant use and test cases will be described to show how ETICS can be used in particular to perform interoperability tests of grid middleware using the grid itself.
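
    ETICS itself is not open for inspection here, but the dependency handling the abstract mentions can be illustrated with a generic technique: topologically sorting a hypothetical component dependency graph to obtain a valid build order. The component names below are invented.

      # Generic illustration (not ETICS code): derive a build order from component dependencies.
      from graphlib import TopologicalSorter  # Python 3.9+

      # component -> set of components it depends on (invented example graph)
      deps = {
          "middleware-ui": {"security-libs", "data-client"},
          "data-client": {"security-libs"},
          "security-libs": set(),
          "test-suite": {"middleware-ui"},
      }

      build_order = list(TopologicalSorter(deps).static_order())
      print("Build order:", " -> ".join(build_order))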

  19. Software quality assurance plan for GCS

    NASA Technical Reports Server (NTRS)

    Duncan, Stephen E.; Bailey, Elizabeth K.

    1990-01-01

    The software quality assurance (SQA) function for the Guidance and Control Software (GCS) project which is part of a software error studies research program is described. The SQA plan outlines all of the procedures, controls, and audits to be carried out by the SQA organization to ensure adherence to the policies, procedures, and standards for the GCS project.

  20. Collected software engineering papers, volume 9

    NASA Technical Reports Server (NTRS)

    1991-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1990 through October 1991. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the ninth such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. For the convenience of this presentation, the eight papers contained here are grouped into three major categories: (1) software models studies; (2) software measurement studies; and (3) Ada technology studies. The first category presents studies on reuse models, including a software reuse model applied to maintenance and a model for an organization to support software reuse. The second category includes experimental research methods and software measurement techniques. The third category presents object-oriented approaches using Ada and object-oriented features proposed for Ada. The SEL is actively working to understand and improve the software development process at GSFC.

  1. Quality engineering as a profession.

    SciTech Connect

    Kolb, Rachel R.; Hoover, Marcey L.

    2012-12-01

    Over the course of time, the profession of quality engineering has witnessed significant change, from its original emphasis on quality control and inspection to a more contemporary focus on upholding quality processes throughout the organization and its product realization activities. This paper describes the profession of quality engineering, exploring how today's quality engineers and quality professionals are certified individuals committed to upholding quality processes and principles while working with different dimensions of product development. It also discusses the future of the quality engineering profession and the future of the quality movement as a whole.

  2. Advances in knowledge-based software engineering

    NASA Technical Reports Server (NTRS)

    Truszkowski, Walt

    1991-01-01

    The underlying hypothesis of this work is that a rigorous and comprehensive software reuse methodology can bring about a more effective and efficient utilization of constrained resources in the development of large-scale software systems by both government and industry. It is also believed that correct use of this type of software engineering methodology can significantly contribute to the higher levels of reliability that will be required of future operational systems. An overview and discussion of current research in the development and application of two systems that support a rigorous reuse paradigm are presented: the Knowledge-Based Software Engineering Environment (KBSEE) and the Knowledge Acquisition for the Preservation of Tradeoffs and Underlying Rationales (KAPTUR) systems. Emphasis is on a presentation of operational scenarios which highlight the major functional capabilities of the two systems.

  3. Diversification and Challenges of Software Engineering Standards

    NASA Technical Reports Server (NTRS)

    Poon, Peter T.

    1994-01-01

    The author poses certain questions in this paper: 'In the future, should there be just one software engineering standards set? If so, how can we work towards that goal? What are the challenges of internationalizing standards?' Based on the author's personal view, the statement of his position is as follows: 'There should NOT be just one set of software engineering standards in the future. At the same time, there should NOT be a proliferation of standards, and the number of sets of standards should be kept to a minimum. It is important to understand the diversification of the areas which are spanned by the software engineering standards.' The author goes on to describe the diversification of processes, the diversification in the national and international character of standards organizations, the diversification of the professional organizations producing standards, the diversification of the types of businesses and industries, and the challenges of internationalizing standards.

  4. Experience with a software engineering environment framework

    NASA Technical Reports Server (NTRS)

    Blumberg, R.; Reedy, A.; Yodis, E.

    1985-01-01

    Experience with a software engineering environment framework tool called the Automated Product Control Environment (APCE) is described. The goals of the framework design, an overview of the major functions and features of the framework, and implementation and use of the framework are presented. Aspects of the framework discussed include automation and control; portability, distributability, and interoperability; cost/benefit analysis; and productivity. Results of using the framework are discussed and the framework approach is briefly compared to other software development environment approaches.

  5. Repository-based software engineering program

    NASA Technical Reports Server (NTRS)

    Wilson, James

    1992-01-01

    The activities performed during September 1992 in support of Tasks 01 and 02 of the Repository-Based Software Engineering Program are outlined. The recommendations and implementation strategy defined at the September 9-10 meeting of the Reuse Acquisition Action Team (RAAT) are attached along with the viewgraphs and reference information presented at the Institute for Defense Analyses brief on legal and patent issues related to software reuse.

  6. Automated software engineering planning with SASEA

    SciTech Connect

    Lawlis, P.K.; Hoffman, C.L.

    1998-07-01

    Planning for effective software engineering is not easy, and software project managers would usually welcome assistance in this area. Very effective assistance could be provided by automated tools that are decision aids. However, a comprehensive suite of such tools does not yet exist. One area that has been addressed is the selection of a programming language. This paper discusses in detail a decision tool that has been developed for language selection. It also addresses the areas in which other such tools are required.

  7. Software Quality Assurance for Nuclear Safety Systems

    SciTech Connect

    Sparkman, D R; Lagdon, R

    2004-05-16

    The US Department of Energy has undertaken an initiative to improve the quality of software used to design and operate its nuclear facilities across the United States. One aspect of this initiative is to revise or create new directives and guides associated with quality practices for the safety software in its nuclear facilities. Safety software includes the safety structures, systems, and components software and firmware, support software, and design and analysis software used to ensure the safety of the facility. DOE nuclear facilities are unique when compared to commercial nuclear or other industrial activities in terms of the types and quantities of hazards that must be controlled to protect workers, the public, and the environment. Because of these differences, DOE must develop an approach to software quality assurance that ensures appropriate risk mitigation by developing a framework of requirements that accomplishes the following goals: (1) ensures that the software processes developed to address nuclear safety in the design, operation, construction, and maintenance of its facilities are safe; (2) considers the larger system that uses the software and its impacts; and (3) ensures that software failures do not create unsafe conditions. Software designers for nuclear systems and processes must reduce risks in software applications by incorporating processes that recognize, detect, and mitigate software failure in safety-related systems. They must also ensure that fail-safe modes and component testing are incorporated into software design. For nuclear facilities, the consideration of risk is not necessarily sufficient to ensure safety. Systematic evaluation, independent verification, and system safety analysis must be considered for software design, implementation, and operation. The software industry primarily uses risk analysis to determine the appropriate level of rigor applied to software practices. This risk-based approach distinguishes safety

  8. Software engineering and the role of Ada: Executive seminar

    NASA Technical Reports Server (NTRS)

    Freedman, Glenn B.

    1987-01-01

    The objective was to introduce the basic terminology and concepts of software engineering and Ada. The life cycle model is reviewed. The application of the goals and principles of software engineering is applied. An introductory understanding of the features of the Ada language is gained. Topics addressed include: the software crises; the mandate of the Space Station Program; software life cycle model; software engineering; and Ada under the software engineering umbrella.

  9. Collected software engineering papers, volume 11

    NASA Technical Reports Server (NTRS)

    1993-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1992 through November 1993. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 11th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  10. Some Future Software Engineering Opportunities and Challenges

    NASA Astrophysics Data System (ADS)

    Boehm, Barry

    This paper provides an update and extension of a 2006 paper, “Some Future Trends and Implications for Systems and Software Engineering Processes,” Systems Engineering, Spring 2006. Some of its challenges and opportunities are similar, such as the need to simultaneously achieve high levels of both agility and assurance. Others have emerged as increasingly important, such as the challenges of dealing with ultralarge volumes of data, with multicore chips, and with software as a service. The paper is organized around eight relatively surprise-free trends and two “wild cards” whose trends and implications are harder to foresee. The eight surprise-free trends are:

  11. Collected software engineering papers, volume 12

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This document is a collection of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) from November 1993 through October 1994. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. This is the 12th such volume of technical papers produced by the SEL. Although these papers cover several topics related to software engineering, they do not encompass the entire scope of SEL activities and interests. Additional information about the SEL and its research efforts may be obtained from the sources listed in the bibliography at the end of this document.

  12. Requirements Engineering for Software Integrity and Safety

    NASA Technical Reports Server (NTRS)

    Leveson, Nancy G.

    2002-01-01

    Requirements flaws are the most common cause of errors and software-related accidents in operational software. Most aerospace firms list requirements as one of their most important outstanding software development problems, and all of the recent NASA spacecraft losses related to software (including the highly publicized Mars Program failures) can be traced to requirements flaws. In light of these facts, it is surprising that relatively little research is devoted to requirements in contrast with other software engineering topics. The research proposed built on our previous work, including both criteria for determining whether a requirements specification is acceptably complete and a new approach to structuring system specifications called Intent Specifications. This grant was to fund basic research on how these ideas could be extended to leverage innovative approaches to the problems of (1) reducing the impact of changing requirements, (2) finding requirements specification flaws early through formal and informal analysis, and (3) avoiding common flaws entirely through appropriate requirements specification language design.

  13. Software engineering processes for Class D missions

    NASA Astrophysics Data System (ADS)

    Killough, Ronnie; Rose, Debi

    2013-09-01

    Software engineering processes are often seen as anathemas; thoughts of CMMI key process areas and NPR 7150.2A compliance matrices can motivate a software developer to consider other career fields. However, with adequate definition, common-sense application, and an appropriate level of built-in flexibility, software engineering processes provide a critical framework in which to conduct a successful software development project. One problem is that current models seem to be built around an underlying assumption of "bigness," and assume that all elements of the process are applicable to all software projects regardless of size and tolerance for risk. This is best illustrated in NASA's NPR 7150.2A in which, aside from some special provisions for manned missions, the software processes are to be applied based solely on the criticality of the software to the mission, completely agnostic of the mission class itself. That is, the processes applicable to a Class A mission (high priority, very low risk tolerance, very high national significance) are precisely the same as those applicable to a Class D mission (low priority, high risk tolerance, low national significance). This paper will propose changes to NPR 7150.2A, taking mission class into consideration, and discuss how some of these changes are being piloted for a current Class D mission—the Cyclone Global Navigation Satellite System (CYGNSS).

  14. Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process

    NASA Technical Reports Server (NTRS)

    Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom

    1997-01-01

    The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'The optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper describes methods that NASA has used to make software design improvements by focusing on software quality and the design and inspection process.

  15. Proceedings of the Thirteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1988-01-01

    Topics covered in the workshop included studies and experiments conducted in the Software Engineering Laboratory (SEL), a cooperative effort of NASA Goddard Space Flight Center, the University of Maryland, and Computer Sciences Corporation; software models; software products; and software tools.

  16. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use bazaar for revision control, making good use of the strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. The testing of code in this manner leads to a continuous verification process; not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical
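
    The "gold standard" checks mentioned above compare numerical output against a known analytical solution within a stated tolerance. As a rough illustration only (this is not Fluidity's actual test harness; the problem, function names, and tolerance are invented for the example), such a verification test can be as simple as:

      # Minimal sketch of a "gold standard" verification check: a numerical
      # result is compared against an analytical reference within a tolerance.
      import math

      def analytical_solution(x, t, diffusivity=1.0):
          # 1-D heat-kernel solution, used purely as an illustrative reference.
          return math.exp(-x * x / (4.0 * diffusivity * t)) / math.sqrt(4.0 * math.pi * diffusivity * t)

      def run_numerical_model(x, t):
          # Placeholder for a call into the numerical code under test.
          return analytical_solution(x, t) * (1.0 + 1e-7)  # pretend small discretisation error

      def test_point_value(tolerance=1e-5):
          x, t = 0.5, 2.0
          error = abs(run_numerical_model(x, t) - analytical_solution(x, t))
          assert error < tolerance, f"verification failed: error {error:.3e} exceeds {tolerance:.1e}"

      if __name__ == "__main__":
          test_point_value()
          print("gold-standard comparison against the analytical solution passed")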

  17. Proceedings of the Fifth Triennial Software Quality Forum 2000, Software for the Next Millennium, Software Quality Forum

    SciTech Connect

    Scientific Software Engineering Group, CIC-12

    2000-04-01

    The Software Quality Forum is a triennial conference held by the Software Quality Assurance Subcommittee for the Department of Energy's Quality Managers. The forum centers on key issues, information, and technology important in software development for the Nuclear Weapons Complex. This year it will be opened up to include local information technology companies and software vendors presenting their solutions, ideas, and lessons learned. The Software Quality Forum 2000 will take on a more hands-on, instructional tone than those previously held. There will be an emphasis on providing information, tools, and resources to assist developers in their goal of producing next generation software.

  18. Engineering software development with HyperCard

    NASA Technical Reports Server (NTRS)

    Darko, Robert J.

    1990-01-01

    The successful and unsuccessful techniques used in the development of software using HyperCard are described. The viability of the HyperCard for engineering is evaluated and the future use of HyperCard by this particular group of developers is discussed.

  19. Software Engineering for User Interfaces. Technical Report.

    ERIC Educational Resources Information Center

    Draper, Stephen W.; Norman, Donald A.

    The discipline of software engineering can be extended in a natural way to deal with the issues raised by a systematic approach to the design of human-machine interfaces. The user should be treated as part of the system being designed and projects should be organized to take into account the current lack of a priori knowledge of user interface…

  20. 2011 SAPHIRE 8 Software Quality Assurance Status Report

    SciTech Connect

    Kurt G. Vedros

    2011-09-01

    The Software Quality Assurance engineer position was created in fiscal year 2011 to better maintain and improve the quality of the SAPHIRE 8 development program. This year's Software Quality Assurance tasks concentrated on developing the framework of the SQA program. This report reviews the accomplishments and recommendations for each of the subtasks set forth for JCN V6059: (1) Reviews, Tests, and Code Walkthroughs; (2) Data Dictionary; (3) Metrics; (4) Requirements Traceability Matrix; (5) Provide Oversight on SAPHIRE QA Activities; and (6) Support NRC Presentations and Meetings.

  1. Collected software engineering papers, volume 7

    NASA Technical Reports Server (NTRS)

    1989-01-01

    A collection is presented of selected technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period Dec. 1988 to Oct. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the seven papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  2. The HELIOS medical software engineering environment.

    PubMed

    Degoulet, P; Jean, F C; Meinzer, H P; Engelmann, U; Baud, R; Rassinoux, A M; Jagermann, C; Sandblad, B; Cordelle, D; Wigertz, O

    1994-10-01

    The aim of the HELIOS project is to create an integrated Software Engineering Environment (SEE) to facilitate the development and maintenance of medical applications. HELIOS is made of a set of software components, communicating through a software bus called the HELIOS Unification Bus. The object oriented paradigm is used both as the basic structure for building the software components and as the methodology for modelling, storing and retrieving the entities and procedures used in an application. Development standards include UNIX as operating system and X Window/MOTIF as windowing environment. One of the target applications for the HELIOS prototype is the development of a multimedia medical workstation as a front end to a hospital information system. PMID:7889774

  3. Collected software engineering papers, volume 6

    NASA Technical Reports Server (NTRS)

    1988-01-01

    A collection is presented of technical papers produced by participants in the Software Engineering Laboratory (SEL) during the period 1 Jun. 1987 to 1 Jan. 1989. The purpose of the document is to make available, in one reference, some results of SEL research that originally appeared in a number of different forums. For the convenience of this presentation, the twelve papers contained here are grouped into three major categories: (1) Software Measurement and Technology Studies; (2) Measurement Environment Studies; and (3) Ada Technology Studies. The first category presents experimental research and evaluation of software measurement and technology; the second presents studies on software environments pertaining to measurement. The last category represents Ada technology and includes research, development, and measurement studies.

  4. Software Engineering Technology Infusion Within NASA

    NASA Technical Reports Server (NTRS)

    Zelkowitz, Marvin V.

    1996-01-01

    Technology transfer is of crucial concern to both government and industry today. In this paper, several software engineering technologies used within NASA are studied, and the mechanisms, schedules, and efforts at transferring these technologies are investigated. The goals of this study are: 1) to understand the difference between technology transfer (the adoption of a new method by large segments of an industry) as an industry-wide phenomenon and the adoption of a new technology by an individual organization (called technology infusion); and 2) to see if software engineering technology transfer differs from other engineering disciplines. While there is great interest today in developing technology transfer models for industry, it is the technology infusion process that actually causes changes in the current state of the practice.

  5. Integrating interface slicing into software engineering processes

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    Interface slicing is a tool which was developed to facilitate software engineering. As previously presented, it was described in terms of its techniques and mechanisms. The integration of interface slicing into specific software engineering activities is considered by discussing a number of potential applications of interface slicing. The applications discussed specifically address the problems, issues, or concerns raised in a previous project. Because a complete interface slicer is still under development, these applications must be phrased in future tenses. Nonetheless, the interface slicing techniques which were presented can be implemented using current compiler and static analysis technology. Whether implemented as a standalone tool or as a module in an integrated development or reverse engineering environment, they require analysis no more complex than that required for current system development environments. By contrast, conventional slicing is a methodology which, while showing much promise and intuitive appeal, has yet to be fully implemented in a production language environment despite 12 years of development.

  6. Parallelization of Rocket Engine System Software (Press)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1996-01-01

    The main goal is to assess parallelization requirements for the Rocket Engine Numeric Simulator (RENS) project which, aside from gathering information on liquid-propelled rocket engines and setting forth requirements, involves a large FORTRAN-based package at NASA Lewis Research Center and TDK software developed by SUBR/UWF. The ultimate aim is to develop, test, integrate, and suitably deploy a family of software packages on various aspects and facets of rocket engines using liquid propellants. At present, all project efforts by the funding agency, NASA Lewis Research Center, and the HBCU participants are disseminated over the internet using world wide web home pages. Considering the obviously expensive methods of actual field trials, the benefits of software simulators are potentially enormous. When realized, these benefits will be analogous to those provided by numerous CAD/CAM packages and flight-training simulators. According to the overall task assignments, Hampton University's role is to collect all available software, place it in a common format, assess and evaluate, define interfaces, and provide integration. Most importantly, HU's mission is to see to it that real-time performance is assured. This involves source code translations, porting, and distribution. The porting will be done in two phases: first, place all software on the Cray X-MP platform using FORTRAN. After testing and evaluation on the Cray X-MP, the code will be translated to C++ and ported to the parallel nCUBE platform. At present, we are evaluating another option of distributed processing over local area networks using Sun NFS, Ethernet, and TCP/IP. Considering the heterogeneous nature of the present software (e.g., it first started as an expert system using LISP machines), which now involves FORTRAN code, the effort is expected to be quite challenging.

  7. Software for Optimizing Quality Assurance of Other Software

    NASA Technical Reports Server (NTRS)

    Feather, Martin; Cornford, Steven; Menzies, Tim

    2004-01-01

    Software assurance is the planned and systematic set of activities that ensures that software processes and products conform to requirements, standards, and procedures. Examples of such activities are the following: code inspections, unit tests, design reviews, performance analyses, construction of traceability matrices, etc. In practice, software development projects have only limited resources (e.g., schedule, budget, and availability of personnel) to cover the entire development effort, of which assurance is but a part. Projects must therefore select judiciously from among the possible assurance activities. At its heart, this can be viewed as an optimization problem; namely, to determine the allocation of limited resources (time, money, and personnel) to minimize risk or, alternatively, to minimize the resources needed to reduce risk to an acceptable level. The end result of the work reported here is a means to optimize quality-assurance processes used in developing software.
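
    The allocation problem described above resembles a knapsack-style optimization: each candidate assurance activity has a cost and an estimated risk reduction, and the project must choose a subset within its budget. The sketch below only illustrates that framing (it is not the tool described in this record; the activity names, numbers, and the greedy heuristic are assumptions made for the example):

      # Greedy selection of assurance activities by risk reduction per unit cost.
      def select_activities(activities, budget):
          """activities: list of (name, cost, risk_reduction); returns (chosen names, total cost)."""
          chosen, spent = [], 0.0
          for name, cost, reduction in sorted(activities, key=lambda a: a[2] / a[1], reverse=True):
              if spent + cost <= budget:
                  chosen.append(name)
                  spent += cost
          return chosen, spent

      if __name__ == "__main__":
          candidates = [
              ("code inspection",      5.0, 0.30),
              ("unit tests",           8.0, 0.35),
              ("design review",        4.0, 0.20),
              ("traceability matrix",  3.0, 0.10),
              ("performance analysis", 6.0, 0.15),
          ]
          picked, cost = select_activities(candidates, budget=15.0)
          print(f"selected: {picked} (total cost {cost})")

    A real formulation would typically be solved exactly (for example, as an integer program) rather than greedily, but the structure of the decision is the same.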

  8. What's Happening in the Software Engineering Laboratory?

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose; Green, Scott; Smith, Donald

    1995-01-01

    Since 1976 the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD) at Goddard Space Flight Center, develops, maintains, and manages complex flight dynamics systems. This paper presents an overview of recent activities and studies in the SEL, using as a framework the SEL's organizational goals and experience-based software improvement approach. It focuses on two SEL experience areas: (1) the evolution of the measurement program and (2) an analysis of three generations of Cleanroom experiments.

  9. A research review of quality assessment for software

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Measures were recommended to assess the quality of software submitted to the AdaNet program. The quality factors that are important to software reuse are explored and methods of evaluating those factors are discussed. Quality factors important to software reuse are: correctness, reliability, verifiability, understandability, modifiability, and certifiability. Certifiability is included because the documentation of many factors about a software component, such as its efficiency, portability, and development history, constitutes a class of factors that are important to some users, not important at all to others, and impossible for AdaNet to distinguish between a priori. The quality factors may be assessed in different ways. There are a few quantitative measures which have been shown to indicate software quality. However, it is believed that there exist many factors that indicate quality but have not been empirically validated due to their subjective nature. These subjective factors are characterized by the way in which they support the software engineering principles of abstraction, information hiding, modularity, localization, confirmability, uniformity, and completeness.

  10. Requirements Engineering in Building Climate Science Software

    NASA Astrophysics Data System (ADS)

    Batcheller, Archer L.

    Software has an important role in supporting scientific work. This dissertation studies teams that build scientific software, focusing on the way that they determine what the software should do. These requirements engineering processes are investigated through three case studies of climate science software projects. The Earth System Modeling Framework assists modeling applications, the Earth System Grid distributes data via a web portal, and the NCAR (National Center for Atmospheric Research) Command Language is used to convert, analyze and visualize data. Document analysis, observation, and interviews were used to investigate the requirements-related work. The first research question is about how and why stakeholders engage in a project, and what they do for the project. Two key findings arise. First, user counts are a vital measure of project success, which makes adoption important and makes counting tricky and political. Second, despite the importance of quantities of users, a few particular "power users" develop a relationship with the software developers and play a special role in providing feedback to the software team and integrating the system into user practice. The second research question focuses on how project objectives are articulated and how they are put into practice. The team seeks to both build a software system according to product requirements but also to conduct their work according to process requirements such as user support. Support provides essential communication between users and developers that assists with refining and identifying requirements for the software. It also helps users to learn and apply the software to their real needs. User support is a vital activity for scientific software teams aspiring to create infrastructure. The third research question is about how change in scientific practice and knowledge leads to changes in the software, and vice versa. The "thickness" of a layer of software infrastructure impacts whether the

  11. Consolidated View on Space Software Engineering Problems - An Empirical Study

    NASA Astrophysics Data System (ADS)

    Silva, N.; Vieira, M.; Ricci, D.; Cotroneo, D.

    2015-09-01

    Independent software verification and validation (ISVV) has been a key process for engineering quality assessment for decades, and is considered in several international standards. The “European Space Agency (ESA) ISVV Guide” is used in the European space market to drive ISVV tasks and plans, and to select applicable tasks and techniques. Software artefacts have room for improvement, given the number of issues found during ISVV tasks. This article presents the analysis of the results of a large set of ISVV issues originating from three different ESA missions, amounting to more than 1000 issues. The study presents the main types, triggers and impacts related to the ISVV issues found and sets the path for a global software engineering improvement based on the most common deficiencies identified for space projects.

  12. A knowledge based software engineering environment testbed

    NASA Technical Reports Server (NTRS)

    Gill, C.; Reedy, A.; Baker, L.

    1985-01-01

    The Carnegie Group Incorporated and Boeing Computer Services Company are developing a testbed which will provide a framework for integrating conventional software engineering tools with Artificial Intelligence (AI) tools to promote automation and productivity. The emphasis is on the transfer of AI technology to the software development process. Experiments relate to AI issues such as scaling up, inference, and knowledge representation. In its first year, the project has created a model of software development by representing software activities; developed a module representation formalism to specify the behavior and structure of software objects; integrated the model with the formalism to identify shared representation and inheritance mechanisms; demonstrated object programming by writing procedures and applying them to software objects; used data-directed and goal-directed reasoning to, respectively, infer the cause of bugs and evaluate the appropriateness of a configuration; and demonstrated knowledge-based graphics. Future plans include the introduction of knowledge-based systems for rapid prototyping or rescheduling; natural language interfaces; blackboard architecture; and distributed processing.

  13. Domain and Specification Models for Software Engineering

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui

    1992-01-01

    This paper discusses our approach to representing application domain knowledge for specific software engineering tasks. Application domain knowledge is embodied in a domain model. Domain models are used to assist in the creation of specification models. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model. One aspect of the system, hierarchical organization, is described in detail.

  14. The ATLAS data management software engineering process

    NASA Astrophysics Data System (ADS)

    Lassnig, M.; Garonne, V.; Stewart, G. A.; Barisits, M.; Beermann, T.; Vigne, R.; Serfon, C.; Goossens, L.; Nairz, A.; Molfetas, A.; Atlas Collaboration

    2014-06-01

    Rucio is the next-generation data management system of the ATLAS experiment. The software engineering process to develop Rucio is fundamentally different to existing software development approaches in the ATLAS distributed computing community. Based on a conceptual design document, development takes place using peer-reviewed code in a test-driven environment. The main objectives are to ensure that every engineer understands the details of the full project, even components usually not touched by them, that the design and architecture are coherent, that temporary contributors can be productive without delay, that programming mistakes are prevented before being committed to the source code, and that the source is always in a fully functioning state. This contribution will illustrate the workflows and products used, and demonstrate the typical development cycle of a component from inception to deployment within this software engineering process. Next to the technological advantages, this contribution will also highlight the social aspects of an environment where every action is subject to detailed scrutiny.

  15. Software engineering with application-specific languages

    NASA Technical Reports Server (NTRS)

    Campbell, David J.; Barker, Linda; Mitchell, Deborah; Pollack, Robert H.

    1993-01-01

    Application-Specific Languages (ASL's) are small, special-purpose languages that are targeted to solve a specific class of problems. Using ASL's on software development projects can provide considerable cost savings, reduce risk, and enhance quality and reliability. ASL's provide a platform for reuse within a project or across many projects and enable less-experienced programmers to tap into the expertise of application-area experts. ASL's have been used on several software development projects for the Space Shuttle Program. On these projects, the use of ASL's resulted in considerable cost savings over conventional development techniques. Two of these projects are described.

  16. Data systems and computer science: Software Engineering Program

    NASA Technical Reports Server (NTRS)

    Zygielbaum, Arthur I.

    1991-01-01

    An external review of the Integrated Technology Plan for the Civil Space Program is presented. This review is specifically concerned with the Software Engineering Program. The goals of the Software Engineering Program are as follows: (1) improve NASA's ability to manage development, operation, and maintenance of complex software systems; (2) decrease NASA's cost and risk in engineering complex software systems; and (3) provide technology to assure safety and reliability of software in mission critical applications.

  17. SWQM: Source Water Quality Modeling Software

    Energy Science and Technology Software Center (ESTSC)

    2008-01-08

    The Source Water Quality Modeling software (SWQM) simulates the water quality conditions that reflect properties of water generated by water treatment facilities. SWQM consists of a set of Matlab scripts that model the statistical variation that is expected in a water treatment facility’s water, such as pH and chlorine levels.
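
    As a rough illustration of the kind of statistical variation being modelled (written here in Python rather than Matlab; the distributions and parameter values below are assumptions for the example and are not taken from SWQM), hourly water-quality readings can be drawn from simple probability distributions:

      import random

      def simulate_day(n_samples=24, ph_mean=7.2, ph_sd=0.15, cl_mean=1.0, cl_sd=0.2):
          """Draw hourly pH and free-chlorine (mg/L) readings from normal distributions."""
          readings = []
          for _ in range(n_samples):
              ph = random.gauss(ph_mean, ph_sd)
              chlorine = max(0.0, random.gauss(cl_mean, cl_sd))  # concentration cannot be negative
              readings.append((ph, chlorine))
          return readings

      if __name__ == "__main__":
          day = simulate_day()
          avg_ph = sum(r[0] for r in day) / len(day)
          avg_cl = sum(r[1] for r in day) / len(day)
          print(f"mean pH {avg_ph:.2f}, mean chlorine {avg_cl:.2f} mg/L over {len(day)} samples")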

  18. Impact of knowledge-based software engineering on aerospace systems

    NASA Technical Reports Server (NTRS)

    Peyton, Liem; Gersh, Mark A.; Swietek, Gregg

    1991-01-01

    The emergence of knowledge engineering as a software technology will dramatically alter the use of software by expanding application areas across a wide spectrum of industries. The engineering and management of large aerospace software systems could benefit from a knowledge engineering approach. An understanding of this technology can potentially make significant improvements to the current practice of software engineering, and provide new insights into future development and support practices.

  19. SAPHIRE 8 Software Quality Assurance Oversight

    SciTech Connect

    Kurt G. Vedros

    2011-09-01

    The software quality assurance oversight consists of updating and maintaining revision control of the SAPHIRE 8 quality assurance program documentation and of monitoring revision control of the SAPHIRE 8 source code. This report summarizes the oversight efforts through description of the revision control system (RCS) setup, operation and contents. Documents maintained under revision control include the Acceptance Test Plan (ATP), Configuration Management Plan, Quality Assurance Plan, Software Project Plan, Requirements Traceability Matrix (RTM), System Test Plan, SDP Interface Training Manual, and the SAPHIRE 8, 'New Features and Capabilities Overview'.

  20. Continuous Software Integration and Quality Control during Software Development

    NASA Astrophysics Data System (ADS)

    Ettl, M.; Neidhardt, A.; Brisken, W.; Dassing, R.

    2012-12-01

    Modern software has to be stable, portable, fast, and reliable. This requires a sophisticated infrastructure supporting and providing the developers with additional information about the state and the quality of the project. That is why we have created a centralized software repository, where the whole code base is managed and version controlled on a centralized server. Based on this, a hierarchical build system has been developed in which each project and its sub-projects can be compiled by simply calling the top-level Makefile. On top of this, a nightly build system has been created in which the top-level Makefiles of each project are called every night. The results of the build, including the compiler warnings, are reported to the developers using generated HTML pages. In addition, all the source code is automatically checked using a static code analysis tool called "cppcheck". This tool produces warnings, similar to those of a compiler, but more pedantic. The reports of this analysis are translated to HTML and reported to the developers in the same way as the nightly builds. Armed with this information, the developers can discover issues in their projects at an early development stage. In combination, these measures reduce the number of possible issues in our software and help ensure the quality of our projects at different development stages. These checks are also offered to the community. They are currently used within the DiFX software correlator project.
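
    The workflow described above (build each project from its top-level Makefile every night, run static analysis, and publish an HTML summary) can be sketched in a few lines. The following is an illustrative outline only, not the authors' actual system; the project paths and report file are hypothetical, and it assumes make and cppcheck are installed:

      import subprocess
      from pathlib import Path

      PROJECTS = ["projects/correlator", "projects/scheduler"]  # hypothetical paths

      def nightly(projects, report="nightly_report.html"):
          rows = []
          for proj in projects:
              # Build via the project's top-level Makefile, then run static analysis.
              build = subprocess.run(["make", "-C", proj], capture_output=True, text=True)
              check = subprocess.run(["cppcheck", "--enable=all", "--quiet", proj],
                                     capture_output=True, text=True)
              rows.append((proj, build.returncode, build.stderr.count("warning"),
                           len(check.stderr.splitlines())))
          html = ["<table><tr><th>project</th><th>build rc</th>"
                  "<th>compiler warnings</th><th>cppcheck findings</th></tr>"]
          for proj, rc, warns, findings in rows:
              html.append(f"<tr><td>{proj}</td><td>{rc}</td><td>{warns}</td><td>{findings}</td></tr>")
          html.append("</table>")
          Path(report).write_text("\n".join(html))  # page served to the developers

      if __name__ == "__main__":
          nightly(PROJECTS)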

  1. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Morasca, Sandro; Basili, Victor R.

    1997-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts, regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysts, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. We use this framework to propose definitions of several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalisms and properties we introduce are convenient and intuitive. This framework contributes constructively to a firmer theoretical ground of software measurement.
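
    As an illustration of the style of property used in such a framework (paraphrased and simplified here, not quoted from the paper), a size measure defined on a system S with element set E might be required to satisfy:

      \begin{align*}
        &\text{Non-negativity:}    && \mathrm{Size}(S) \ge 0 \\
        &\text{Null value:}        && E = \emptyset \implies \mathrm{Size}(S) = 0 \\
        &\text{Module additivity:} && E = E_{m_1} \cup E_{m_2},\ E_{m_1} \cap E_{m_2} = \emptyset
                                      \implies \mathrm{Size}(S) = \mathrm{Size}(m_1) + \mathrm{Size}(m_2)
      \end{align*}

    Complexity, cohesion, and coupling are characterized by analogous property sets, which is what allows a candidate measure to be checked against the concept it claims to quantify.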

  2. Property-Based Software Engineering Measurement

    NASA Technical Reports Server (NTRS)

    Briand, Lionel; Morasca, Sandro; Basili, Victor R.

    1995-01-01

    Little theory exists in the field of software system measurement. Concepts such as complexity, coupling, cohesion or even size are very often subject to interpretation and appear to have inconsistent definitions in the literature. As a consequence, there is little guidance provided to the analyst attempting to define proper measures for specific problems. Many controversies in the literature are simply misunderstandings and stem from the fact that some people talk about different measurement concepts under the same label (complexity is the most common case). There is a need to define unambiguously the most important measurement concepts used in the measurement of software products. One way of doing so is to define precisely what mathematical properties characterize these concepts regardless of the specific software artifacts to which these concepts are applied. Such a mathematical framework could generate a consensus in the software engineering community and provide a means for better communication among researchers, better guidelines for analysis, and better evaluation methods for commercial static analyzers for practitioners. In this paper, we propose a mathematical framework which is generic, because it is not specific to any particular software artifact, and rigorous, because it is based on precise mathematical concepts. This framework defines several important measurement concepts (size, length, complexity, cohesion, coupling). It is not intended to be complete or fully objective; other frameworks could have been proposed and different choices could have been made. However, we believe that the formalism and properties we introduce are convenient and intuitive. In addition, we have reviewed the literature on this subject and compared it with our work. This framework contributes constructively to a firmer theoretical ground of software measurement.

  3. Proceedings of the Ninth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1984-01-01

    Experiences in measurement, utilization, and evaluation of software methodologies, models, and tools are discussed. NASA's involvement in ever larger and more complex systems, like the space station project, provides a motive for the support of software engineering research and the exchange of ideas in such forums. The topics of current SEL research are software error studies, experiments with software development, and software tools.

  4. Software Engineering Laboratory (SEL) cleanroom process model

    NASA Technical Reports Server (NTRS)

    Green, Scott; Basili, Victor; Godfrey, Sally; Mcgarry, Frank; Pajerski, Rose; Waligora, Sharon

    1991-01-01

    The Software Engineering Laboratory (SEL) cleanroom process model is described. The term 'cleanroom' originates in the integrated circuit (IC) production process, where IC's are assembled in dust free 'clean rooms' to prevent the destructive effects of dust. When applying the clean room methodology to the development of software systems, the primary focus is on software defect prevention rather than defect removal. The model is based on data and analysis from previous cleanroom efforts within the SEL and is tailored to serve as a guideline in applying the methodology to future production software efforts. The phases that are part of the process model life cycle from the delivery of requirements to the start of acceptance testing are described. For each defined phase, a set of specific activities is discussed, and the appropriate data flow is described. Pertinent managerial issues, key similarities and differences between the SEL's cleanroom process model and the standard development approach used on SEL projects, and significant lessons learned from prior cleanroom projects are presented. It is intended that the process model described here will be further tailored as additional SEL cleanroom projects are analyzed.

  5. Software for Collaborative Engineering of Launch Rockets

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas Troy

    2003-01-01

    The Rocket Evaluation and Cost Integration for Propulsion and Engineering software enables collaborative computing with automated exchange of information in the design and analysis of launch rockets and other complex systems. RECIPE can interact with and incorporate a variety of programs, including legacy codes, that model aspects of a system from the perspectives of different technological disciplines (e.g., aerodynamics, structures, propulsion, trajectory, aeroheating, controls, and operations) and that are used by different engineers on different computers running different operating systems. RECIPE consists mainly of (1) ISCRM, a file-transfer subprogram that makes it possible for legacy codes executed in their original operating systems on their original computers to exchange data, and (2) CONES, an easy-to-use file-wrapper subprogram that enables the integration of legacy codes. RECIPE provides a tightly integrated conceptual framework that emphasizes connectivity among the programs used by the collaborators, linking these programs in a manner that provides some configuration control while facilitating collaborative engineering tradeoff studies, including design-to-cost studies. In comparison with prior collaborative-engineering schemes, one based on the use of RECIPE enables fewer engineers to do more in less time.

  6. Architecture-Centric Software Quality Management

    NASA Astrophysics Data System (ADS)

    Maciaszek, Leszek A.

    Software quality is a multi-faceted concept defined using different attributes and models. Of all the various quality requirements, adaptiveness is by far the most critical. Based on this assumption, this paper offers an architecture-centric approach to the production of measurably adaptive systems. The paper uses the PCBMER (Presentation, Controller, Bean, Mediator, Entity, and Resource) meta-architecture to demonstrate how the complexity of a software solution can be measured and kept under control in standalone applications. Meta-architectural extensions aimed at managing quality in integration development projects are also introduced. The DSM (Design Structure Matrix) method is used to explain our approach to measuring quality. The discussion is conducted against the background of the holonic approach to science (as the middle ground between holism and reductionism).
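
    A design structure matrix records, for each pair of modules or layers, whether one depends on the other, which makes unwanted dependencies easy to spot and count. The sketch below only illustrates that idea; it is not Maciaszek's actual method or metric, the layer ordering simply follows the PCBMER acronym, and the dependency data are hypothetical:

      # Flag dependencies that point "upward" against an assumed layer ordering.
      LAYERS = ["Presentation", "Controller", "Bean", "Mediator", "Entity", "Resource"]

      # Hypothetical DSM in adjacency form: DEPS[a] is the set of layers that a depends on.
      DEPS = {
          "Presentation": {"Controller", "Bean"},
          "Controller":   {"Bean", "Mediator"},
          "Bean":         set(),
          "Mediator":     {"Entity", "Resource"},
          "Entity":       set(),
          "Resource":     {"Controller"},   # deliberately bad: points upward
      }

      def upward_dependencies(deps, layers):
          rank = {layer: i for i, layer in enumerate(layers)}
          return [(src, dst) for src, targets in deps.items()
                  for dst in targets if rank[dst] < rank[src]]

      if __name__ == "__main__":
          for src, dst in upward_dependencies(DEPS, LAYERS):
              print(f"architectural violation: {src} -> {dst} points against the layer ordering")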

  7. Software engineering and scale-free networks.

    PubMed

    Wen, Lian; Dromey, R Geoff; Kirk, Diana

    2009-08-01

    Complex-network theory is a new approach in studying different types of large systems in both the physical and the abstract worlds. In this paper, we have studied two kinds of network from software engineering: the component dependence network and the sorting comparison network (SCN). It is found that they both show the same scale-free property under certain conditions as complex networks in other fields. These results suggest that complex-network theory can be a useful approach to the study of software systems. The special properties of SCNs provide a more repeatable and deterministic way to study the evolution and optimization of complex networks. They also suggest that the closer a sorting algorithm is to the theoretical optimal limit, the more its SCN is like a scale-free network. This may also indicate that, to store and retrieve information efficiently, a concept network might need to be scale-free. PMID:19380275
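
    A component dependence network of the kind studied here is simply a directed graph whose nodes are components and whose edges are "depends on" relations; the scale-free property shows up as a heavy-tailed degree distribution, roughly P(k) ~ k^(-gamma). The following sketch, using a hypothetical and deliberately tiny system, computes the in-degree distribution one would examine and fit on real data:

      from collections import Counter

      # Edges (a, b) mean "component a depends on component b" -- hypothetical system.
      EDGES = [("ui", "core"), ("ui", "net"), ("net", "core"), ("db", "core"),
               ("report", "core"), ("report", "db"), ("auth", "core"), ("auth", "net")]

      def in_degree_distribution(edges):
          in_deg = Counter(dst for _, dst in edges)
          return dict(sorted(Counter(in_deg.values()).items()))  # degree -> number of nodes

      if __name__ == "__main__":
          for degree, count in in_degree_distribution(EDGES).items():
              print(f"in-degree {degree}: {count} component(s)")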

  8. Software engineering and scale-free networks.

    PubMed

    Wen, Lian; Dromey, R Geoff; Kirk, Diana

    2009-06-01

    Complex-network theory is a new approach in studying different types of large systems in both the physical and the abstract worlds. In this paper, we have studied two kinds of network from software engineering: the component dependence network and the sorting comparison network (SCN). It is found that they both show the same scale-free property under certain conditions as complex networks in other fields. These results suggest that complex-network theory can be a useful approach to the study of software systems. The special properties of SCNs provide a more repeatable and deterministic way to study the evolution and optimization of complex networks. They also suggest that the closer a sorting algorithm is to the theoretical optimal limit, the more its SCN is like a scale-free network. This may also indicate that, to store and retrieve information efficiently, a concept network might need to be scale-free. PMID:19188126

  9. Relations between information system engineering and software engineering

    NASA Technical Reports Server (NTRS)

    Callender, E. D.; Hartsough, C.; Morris, R. V.

    1981-01-01

    This paper examines some of the relations between information system engineering and software engineering. A model for the development process of an information system is presented that focuses on problems common to both disciplines. The concepts of complexity, multiplicity of view, distortion in communication, and concurrency and iteration in implementation are treated. A set of design constructs for the description of an information system is presented. The role of project management is treated. The issue of how to characterize requirements analysis is answered by making it a design activity from the point of view of a user of the product system.

  10. Epistemology, software engineering and formal methods

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    1994-01-01

    One of the most basic questions anyone can ask is, 'How do I know that what I think I know is true?' The study of this question is called epistemology. Traditionally, epistemology has been considered to be of legitimate interest only to philosophers, theologians, and three year old children who respond to every statement by asking, 'Why?' Software engineers need to be interested in the subject, however, because a lack of sufficient understanding of epistemology contributes to many of the current problems in the field.

  11. Software Engineering and Swarm-Based Systems

    NASA Technical Reports Server (NTRS)

    Hinchey, Michael G.; Sterritt, Roy; Pena, Joaquin; Rouff, Christopher A.

    2006-01-01

    We discuss two software engineering aspects in the development of complex swarm-based systems. NASA researchers have been investigating various possible concept missions that would greatly advance future space exploration capabilities. The concept mission that we have focused on exploits the principles of autonomic computing as well as being based on the use of intelligent swarms, whereby a (potentially large) number of similar spacecraft collaborate to achieve mission goals. The intent is that such systems not only can be sent to explore remote and harsh environments but also are endowed with greater degrees of protection and longevity to achieve mission goals.

  12. Software Engineering Tools for Scientific Models

    NASA Technical Reports Server (NTRS)

    Abrams, Marc; Saboo, Pallabi; Sonsini, Mike

    2013-01-01

    Software tools were constructed to address issues the NASA Fortran development community faces, and they were tested on real models currently in use at NASA. These proof-of-concept tools address the High-End Computing Program and the Modeling, Analysis, and Prediction Program. Two examples are the NASA Goddard Earth Observing System Model, Version 5 (GEOS-5) atmospheric model in Cell Fortran on the Cell Broadband Engine, and the Goddard Institute for Space Studies (GISS) coupled atmosphere- ocean model called ModelE, written in fixed format Fortran.

  13. The software engineering laboratory: An approach to measuring software technology

    NASA Technical Reports Server (NTRS)

    Mcgarry, F.

    1980-01-01

    The investigations of the software evaluation laboratory into the software development process at NASA/Goddard are described. A data collection process for acquiring detailed histories of software development projects is outlined. The application of different sets of software methodologies to specific applications projects is summarized. The effect of the development methodology on productivity is discussed.

  14. Computer-Aided Software Engineering - An approach to real-time software development

    NASA Technical Reports Server (NTRS)

    Walker, Carrie K.; Turkovich, John J.

    1989-01-01

    A new software engineering discipline is Computer-Aided Software Engineering (CASE), a technology aimed at automating the software development process. This paper explores the development of CASE technology, particularly in the area of real-time/scientific/engineering software, and a history of CASE is given. The proposed software development environment for the Advanced Launch System (ALS CASE) is described as an example of an advanced software development system for real-time/scientific/engineering (RT/SE) software. The Automated Programming Subsystem of ALS CASE automatically generates executable code and corresponding documentation from a suitably formatted specification of the software requirements. Software requirements are interactively specified in the form of engineering block diagrams. Several demonstrations of the Automated Programming Subsystem are discussed.
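
    To make the idea of generating code from a block-diagram specification concrete, the toy sketch below turns a two-block diagram (a summing junction feeding a gain) into an executable function. The specification format and generator are invented purely for illustration and bear no relation to the actual ALS CASE tooling:

      # Hypothetical block-diagram spec: a sum block feeding a gain block.
      BLOCKS = [
          {"name": "err",  "type": "sum",  "inputs": ["setpoint", "-measured"]},
          {"name": "ctrl", "type": "gain", "inputs": ["err"], "k": 2.5},
      ]

      def generate(blocks):
          # Emit a Python function that evaluates the diagram top to bottom.
          lines = ["def step(setpoint, measured):"]
          for b in blocks:
              if b["type"] == "sum":
                  terms = " + ".join(b["inputs"])   # a leading "-" negates that signal
                  lines.append(f"    {b['name']} = {terms}")
              elif b["type"] == "gain":
                  lines.append(f"    {b['name']} = {b['k']} * {b['inputs'][0]}")
          lines.append(f"    return {blocks[-1]['name']}")
          return "\n".join(lines)

      if __name__ == "__main__":
          src = generate(BLOCKS)
          print(src)
          namespace = {}
          exec(src, namespace)                      # compile the generated function
          print("output:", namespace["step"](1.0, 0.6))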

  15. Systems Engineering, Quality and Testing

    NASA Technical Reports Server (NTRS)

    Shepherd, Christena C.

    2015-01-01

    AS9100 has little to say about how to apply a Quality Management System (QMS) to aerospace test programs. There is little in the quality engineering Body of Knowledge that applies to testing, unless it is nondestructive examination or some type of lab or bench testing. If one examines how the systems engineering processes are implemented throughout a test program, and how these processes can be mapped to AS9100, a number of areas for involvement of the quality professional are revealed.

  16. Math Description Engine Software Development Kit

    NASA Technical Reports Server (NTRS)

    Shelton, Robert O.; Smith, Stephanie L.; Dexter, Dan E.; Hodgson, Terry R.

    2010-01-01

    The Math Description Engine Software Development Kit (MDE SDK) can be used by software developers to make computer-rendered graphs more accessible to blind and visually-impaired users. The MDE SDK generates alternative graph descriptions in two forms: textual descriptions and non-verbal sound renderings, or sonification. It also enables display of an animated trace of a graph sonification on a visual graph component, with color and line-thickness options for users having low vision or color-related impairments. A set of accessible graphical user interface widgets is provided for operation by end users and for control of accessible graph displays. Version 1.0 of the MDE SDK generates text descriptions for 2D graphs commonly seen in math and science curriculum (and practice). The mathematically rich text descriptions can also serve as a virtual math and science assistant for blind and sighted users, making graphs more accessible for everyone. The MDE SDK has a simple application programming interface (API) that makes it easy for programmers and Web-site developers to make graphs accessible with just a few lines of code. The source code is written in Java for cross-platform compatibility and to take advantage of Java's built-in support for building accessible software application interfaces. Compiled-library and NASA Open Source versions are available with API documentation and Programmer's Guide at http://prime.jsc.nasa.gov.

  17. Software for Engineering Simulations of a Spacecraft

    NASA Technical Reports Server (NTRS)

    Shireman, Kirk; McSwain, Gene; McCormick, Bernell; Fardelos, Panayiotis

    2005-01-01

    Spacecraft Engineering Simulation II (SES II) is a C-language computer program for simulating diverse aspects of operation of a spacecraft characterized by either three or six degrees of freedom. A functional model in SES can include a trajectory flight plan; a submodel of a flight computer running navigational and flight-control software; and submodels of the environment, the dynamics of the spacecraft, and sensor inputs and outputs. SES II features a modular, object-oriented programming style. SES II supports event-based simulations, which, in turn, create an easily adaptable simulation environment in which many different types of trajectories can be simulated by use of the same software. The simulation output consists largely of flight data. SES II can be used to perform optimization and Monte Carlo dispersion simulations. It can also be used to perform simulations for multiple spacecraft. In addition to its generic simulation capabilities, SES offers special capabilities for space-shuttle simulations: for this purpose, it incorporates submodels of the space-shuttle dynamics and a C-language version of the guidance, navigation, and control components of the space-shuttle flight software.

  18. Parallelization of Rocket Engine Simulator Software (PRESS)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1997-01-01

    The Parallelization of Rocket Engine Simulator Software (PRESS) project is part of a collaborative effort with Southern University at Baton Rouge (SUBR), University of West Florida (UWF), and Jackson State University (JSU). The second-year funding, which supports two graduate students enrolled in our new Master's program in Computer Science at Hampton University and the principal investigator, has been obtained for the period from October 19, 1996 through October 18, 1997. The key part of the interim report was new directions for the second-year funding. This came about from discussions during the Rocket Engine Numeric Simulator (RENS) project meeting in Pensacola on January 17-18, 1997. At that time, a software agreement between Hampton University and NASA Lewis Research Center had already been concluded. That agreement concerns off-NASA-site experimentation with the PUMPDES/TURBDES software. Before this agreement, during the first year of the project, another large-scale FORTRAN-based software package, Two-Dimensional Kinetics (TDK), was being used for translation to an object-oriented language and parallelization experiments. However, that package proved to be too complex and lacking in sufficient documentation for an effective translation effort to object-oriented C++ source code. The focus, this time with the better documented and more manageable PUMPDES/TURBDES package, was still on translation to C++ with design improvements. At the RENS meeting, however, the new impetus for the RENS projects in general, and PRESS in particular, shifted in two important ways. One was closer alignment with the work on the Numerical Propulsion System Simulator (NPSS) through cooperation and collaboration with the LERC ACLU organization. The other was to see whether and how NASA's various rocket design software can be run over local networks and intranets without any radical efforts for redesign and translation into object-oriented source code. There were also suggestions that the Fortran-based code be

  19. Software engineering project management - A state-of-the-art report

    NASA Technical Reports Server (NTRS)

    Thayer, R. H.; Lehman, J. H.

    1977-01-01

    The management of software engineering projects in the aerospace industry was investigated. The survey assessed such features as contract type, specification preparation techniques, software documentation required by customers, planning and cost-estimating, quality control, the use of advanced program practices, software tools and test procedures, the education levels of project managers, programmers and analysts, work assignment, automatic software monitoring capabilities, design and coding reviews, production times, success rates, and organizational structure of the projects.

  20. Annotated bibliography of Software Engineering Laboratory (SEL) literature

    NASA Technical Reports Server (NTRS)

    Card, D.

    1982-01-01

    An annotated bibliography of technical papers, documents, and memorandums produced by or related to the Software Engineering Laboratory is presented. More than 75 publications are summarized. An index of these publications by subject is also included. These publications cover many areas of software engineering and range from research reports to software documentation.

  1. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  2. Proceedings of the 14th Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1989-01-01

    Several software-related topics are presented. Topics covered include studies and experiments at the Software Engineering Laboratory at the Goddard Space Flight Center, predicting project success from the Software Project Management Process, software environments, testing in a reuse environment, domain-directed reuse, and classification tree analysis using the Amadeus measurement and empirical analysis system.

  3. Software Quality Perceptions of Stakeholders Involved in the Software Development Process

    ERIC Educational Resources Information Center

    Padmanabhan, Priya

    2013-01-01

    Software quality is one of the primary determinants of project management success. Stakeholders involved in software development widely agree that quality is important (Barney and Wohlin 2009). However, they may differ on what constitutes software quality, and which of its attributes are more important than others. Although, software quality…

  4. Tool Use Within NASA Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Shigeta, Denise; Port, Dan; Nikora, Allen P.; Wilf, Joel

    2013-01-01

    As space mission software systems become larger and more complex, it is increasingly important for the software assurance effort to have the ability to effectively assess both the artifacts produced during software system development and the development process itself. Conceptually, assurance is a straightforward idea - it is the result of activities carried out by an organization independent of the software developers to better inform project management of potential technical and programmatic risks, and thus increase management's confidence in the decisions they ultimately make. In practice, effective assurance for large, complex systems often entails assessing large, complex software artifacts (e.g., requirements specifications, architectural descriptions) as well as substantial amounts of unstructured information (e.g., anomaly reports resulting from testing activities during development). In such an environment, assurance engineers can benefit greatly from appropriate tool support. In order to do so, an assurance organization will need accurate and timely information on the tool support available for various types of assurance activities. In this paper, we investigate the current use of tool support for assurance organizations within NASA, and describe on-going work at JPL for providing assurance organizations with the information about tools they need to use them effectively.

  5. Software Engineering Laboratory (SEL) data and information policy

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank

    1991-01-01

    The policies and overall procedures that are used in distributing and in making available products of the Software Engineering Laboratory (SEL) are discussed. The products include project data and measures, project source code, reports, and software tools.

  6. Seeing beyond Computer Science and Software Engineering

    NASA Astrophysics Data System (ADS)

    Nori, Kesav Vithal

    The boundaries of computer science are defined by what symbolic computation can accomplish. Software Engineering is concerned with effective use of computing technology to support automatic computation on a large scale so as to construct desirable solutions to worthwhile problems. Both focus on what happens within the machine. In contrast, most practical applications of computing support end-users in realizing (often unsaid) objectives. It is often said that such objectives cannot even be specified, e.g., what is the specification of MS Word, or for that matter, any flavour of UNIX? This situation points to the need for architecting what people do with computers. Based on Systems Thinking and Cybernetics, we present such a viewpoint, which hinges on Human Responsibility and means of living up to it.

  7. Applying optimization software libraries to engineering problems

    NASA Technical Reports Server (NTRS)

    Healy, M. J.

    1984-01-01

    Nonlinear programming, preliminary design problems, performance simulation problems, trajectory optimization, flight computer optimization, and linear least squares problems are among the topics covered. The nonlinear programming applications encountered in a large aerospace company are a real challenge to those who provide mathematical software libraries and consultation services. Typical applications include preliminary design studies, data fitting and filtering, jet engine simulations, control system analysis, and trajectory optimization and optimal control. Problem sizes range from single-variable unconstrained minimization to constrained problems with highly nonlinear functions and hundreds of variables. Most of the applications can be posed as nonlinearly constrained minimization problems. Highly complex optimization problems with many variables were formulated in the early days of computing. At the time, many problems had to be reformulated or bypassed entirely, and solution methods often relied on problem-specific strategies. Problems with more than ten variables usually went unsolved.

  8. Software Engineering for Scientific Computer Simulations

    NASA Astrophysics Data System (ADS)

    Post, Douglass E.; Henderson, Dale B.; Kendall, Richard P.; Whitney, Earl M.

    2004-11-01

    Computer simulation is becoming a very powerful tool for analyzing and predicting the performance of fusion experiments. Simulation efforts are evolving from including only a few effects to many effects, from small teams with a few people to large teams, and from workstations and small processor count parallel computers to massively parallel platforms. Successfully making this transition requires attention to software engineering issues. We report on the conclusions drawn from a number of case studies of large scale scientific computing projects within DOE, academia and the DoD. The major lessons learned include attention to sound project management including setting reasonable and achievable requirements, building a good code team, enforcing customer focus, carrying out verification and validation and selecting the optimum computational mathematics approaches.

  9. SAPHIRE 8 Quality Assurance Software Metrics Report

    SciTech Connect

    Kurt G. Vedros

    2011-08-01

    The purpose of this review of software metrics is to examine the quality of the metrics gathered in the 2010 IV&V and to set an outline for results of updated metrics runs to be performed. We find from the review that the maintenance of accepted quality standards presented in the SAPHIRE 8 initial Independent Verification and Validation (IV&V) of April 2010 is most easily achieved by continuing to utilize the tools used in that effort while adding a metric of bug tracking and resolution. Recommendations from the final IV&V were to continue periodic measurement of metrics such as McCabe's complexity measure to ensure quality is maintained. The software tools used to measure quality in the IV&V were CodeHealer, Coverage Validator, Memory Validator, Performance Validator, and Thread Validator. These are evaluated based on their capabilities. We attempted to run their latest revisions with the newer Delphi 2010-based SAPHIRE 8 code that has been developed and were successful with all of the Validator series of tools on small tests. Another recommendation from the IV&V was to incorporate a bug tracking and resolution metric. To improve our capability of producing this metric, we integrated our current web reporting system with the SpiraTest test management software purchased earlier this year to track requirements traceability.
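
    McCabe's cyclomatic complexity, cited above as the recommended recurring metric, is derived from a routine's control-flow graph rather than from raw line counts. As a minimal illustration (a generic sketch in Python, not the SAPHIRE or Validator tooling itself):

        def cyclomatic_complexity(edges: int, nodes: int, components: int = 1) -> int:
            """McCabe's measure M = E - N + 2P for a control-flow graph with
            E edges, N nodes, and P connected components (routines)."""
            return edges - nodes + 2 * components

        # Example: a routine whose control-flow graph has 9 edges and 8 nodes
        # has M = 9 - 8 + 2 = 3, i.e. three linearly independent paths to test.
        print(cyclomatic_complexity(edges=9, nodes=8))  # -> 3

    Tracking this value per routine across successive releases is one way to make the "quality is maintained" criterion above measurable.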

  10. Involving Software Engineering Students in Open Source Software Projects: Experiences from a Pilot Study

    ERIC Educational Resources Information Center

    Sowe, Sulayman K.; Stamelos, Ioannis G.

    2007-01-01

    Anecdotal and research evidence shows that the Free and Open Source Software (F/OSS) development model has produced a paradigm shift in the way we develop, support, and distribute software. This shift is not only redefining the software industry but also the way we teach and learn in our software engineering (SE) courses. But for many universities…

  11. FMT (Flight Software Memory Tracker) For Cassini Spacecraft-Software Engineering Using JAVA

    NASA Technical Reports Server (NTRS)

    Kan, Edwin P.; Uffelman, Hal; Wax, Allan H.

    1997-01-01

    The software engineering design of the Flight Software Memory Tracker (FMT) Tool is discussed in this paper. FMT is a ground analysis software set, consisting of utilities and procedures, designed to track the flight software, i.e., images of memory load and updatable parameters of the computers on board the Cassini spacecraft. FMT is implemented in Java.

  12. Proceedings of the 19th Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1994-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by NASA/GSFC and created to investigate the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effects of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include this document.

  13. Data collection procedures for the Software Engineering Laboratory (SEL) database

    NASA Technical Reports Server (NTRS)

    Heller, Gerard; Valett, Jon; Wild, Mary

    1992-01-01

    This document is a guidebook to collecting software engineering data on software development and maintenance efforts, as practiced in the Software Engineering Laboratory (SEL). It supersedes the document entitled Data Collection Procedures for the Rehosted SEL Database, number SEL-87-008 in the SEL series, which was published in October 1987. It presents procedures to be followed on software development and maintenance projects in the Flight Dynamics Division (FDD) of Goddard Space Flight Center (GSFC) for collecting data in support of SEL software engineering research activities. These procedures include detailed instructions for the completion and submission of SEL data collection forms.

  14. Performance Engineering Technology for Scientific Component Software

    SciTech Connect

    Malony, Allen D.

    2007-05-08

    Large-scale, complex scientific applications are beginning to benefit from the use of component software design methodology and technology for software development. Integral to the success of component-based applications is the ability to achieve high-performing code solutions through the use of performance engineering tools for both intra-component and inter-component analysis and optimization. Our work on this project aimed to develop performance engineering technology for scientific component software in association with the DOE CCTTSS SciDAC project (active during the contract period) and the broader Common Component Architecture (CCA) community. Our specific implementation objectives were to extend the TAU performance system and Program Database Toolkit (PDT) to support performance instrumentation, measurement, and analysis of CCA components and frameworks, and to develop performance measurement and monitoring infrastructure that could be integrated in CCA applications. These objectives have been met in the completion of all project milestones and in the transfer of the technology into the continuing CCA activities as part of the DOE TASCS SciDAC2 effort. In addition to these achievements, over the past three years, we have been an active member of the CCA Forum, attending all meetings and serving in several working groups, such as the CCA Toolkit working group, the CQoS working group, and the Tutorial working group. We have contributed significantly to CCA tutorials since SC'04, hosted two CCA meetings, participated in the annual ACTS workshops, and were co-authors on the recent CCA journal paper [24]. There are four main areas where our project has delivered results: component performance instrumentation and measurement, component performance modeling and optimization, performance database and data mining, and online performance monitoring. This final report outlines the achievements in these areas for the entire project period. The submitted progress

  15. Tools and Behavioral Abstraction: A Direction for Software Engineering

    NASA Astrophysics Data System (ADS)

    Leino, K. Rustan M.

    As in other engineering professions, software engineers rely on tools. Such tools can analyze program texts and design specifications more automatically and in more detail than ever before. While many tools today are applied to find new defects in old code, I predict that more software-engineering tools of the future will be available to software authors at the time of authoring. If such analysis tools can be made to be fast enough and easy enough to use, they can help software engineers better produce and evolve programs.

  16. Program for implementing software quality metrics

    SciTech Connect

    Yule, H.P.; Riemer, C.A.

    1992-04-01

    This report describes a program by which the Veterans Benefit Administration (VBA) can implement metrics to measure the performance of automated data systems and demonstrate that they are improving over time. It provides a definition of quality, particularly with regard to software. Requirements for management and staff to achieve a successful metrics program are discussed. It lists the attributes of high-quality software, then describes the metrics or calculations that can be used to measure these attributes in a particular system. Case studies of some successful metrics programs used by business are presented. The report ends with suggestions on which metrics the VBA should use and the order in which they should be implemented.
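
    To make the notion of calculated attribute metrics concrete, a small generic sketch in Python follows; the indicator names and sample figures are illustrative only and are not drawn from the VBA report:

        def defect_density(defects_found: int, size_kloc: float) -> float:
            """Defects per thousand delivered lines of code (KLOC)."""
            return defects_found / size_kloc

        def mean_time_between_failures(operating_hours: float, failures: int) -> float:
            """Average operating hours between observed failures."""
            return operating_hours / failures

        # Hypothetical release data: 42 defects in a 60 KLOC system,
        # 5 failures over 1,000 hours of operation.
        print(defect_density(42, 60.0))               # -> 0.7 defects per KLOC
        print(mean_time_between_failures(1000.0, 5))  # -> 200.0 hours

    Plotting such indicators release over release is what allows a metrics program to demonstrate that systems are improving over time.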

  17. Distribution and communication in software engineering environments. Application to the HELIOS Software Bus.

    PubMed Central

    Jean, F. C.; Jaulent, M. C.; Coignard, J.; Degoulet, P.

    1991-01-01

    Modularity, distribution and integration are current trends in Software Engineering. To reach these goals, HELIOS, a distributed Software Engineering Environment dedicated to the medical field, has been conceived and a prototype implemented. This environment is built from the collaboration of several well-encapsulated software components. This paper presents the architecture retained to allow communication between the different components and focuses on the implementation details of the Software Bus, the communication and integration vector of the currently running prototype. PMID:1807652

  18. Master Pump Shutdown MPS Software Quality Assurance Plan (SQAP)

    SciTech Connect

    BEVINS, R.R.

    2000-09-20

    The MPSS Software Quality Assurance Plan (SQAP) describes the tools and strategy used in the development of the MPSS software. The document also describes the methodology for controlling and managing changes to the software.

  19. Generic domain models in software engineering

    NASA Technical Reports Server (NTRS)

    Maiden, Neil

    1992-01-01

    This paper outlines three research directions related to domain-specific software development: (1) reuse of generic models for domain-specific software development; (2) empirical evidence to determine these generic models, namely elicitation of mental knowledge schema possessed by expert software developers; and (3) exploitation of generic domain models to assist modelling of specific applications. It focuses on knowledge acquisition for domain-specific software development, with emphasis on tool support for the most important phases of software development.

  20. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software

    PubMed Central

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians. PMID:25996054

  1. Engineering bioinformatics: building reliability, performance and productivity into bioinformatics software.

    PubMed

    Lawlor, Brendan; Walsh, Paul

    2015-01-01

    There is a lack of software engineering skills in bioinformatic contexts. We discuss the consequences of this lack, examine existing explanations and remedies to the problem, point out their shortcomings, and propose alternatives. Previous analyses of the problem have tended to treat the use of software in scientific contexts as categorically different from the general application of software engineering in commercial settings. In contrast, we describe bioinformatic software engineering as a specialization of general software engineering, and examine how it should be practiced. Specifically, we highlight the difference between programming and software engineering, list elements of the latter and present the results of a survey of bioinformatic practitioners which quantifies the extent to which those elements are employed in bioinformatics. We propose that the ideal way to bring engineering values into research projects is to bring engineers themselves. We identify the role of Bioinformatic Engineer and describe how such a role would work within bioinformatic research teams. We conclude by recommending an educational emphasis on cross-training software engineers into life sciences, and propose research on Domain Specific Languages to facilitate collaboration between engineers and bioinformaticians. PMID:25996054

  2. APPLICATION OF SOFTWARE QUALITY ASSURANCE CONCEPTS AND PROCEDURES TO ENVIORNMENTAL RESEARCH INVOLVING SOFTWARE DEVELOPMENT

    EPA Science Inventory

    As EPA’s environmental research expands into new areas that involve the development of software, quality assurance concepts and procedures that were originally developed for environmental data collection may not be appropriate. Fortunately, software quality assurance is a ...

  3. Adopting software quality measures for healthcare processes.

    PubMed

    Yildiz, Ozkan; Demirörs, Onur

    2009-01-01

    In this study, we investigated the adoptability of software quality measures for healthcare process measurement. Quality measures of ISO/IEC 9126 are redefined from a process perspective to build a generic healthcare process quality measurement model. The case study research method is used, and the model is applied to a public hospital's Entry to Care process. After the application, weak and strong aspects of the process can be easily observed. Access auditability, fault removal, completeness of documentation, and machine utilization are weak aspects, and these aspects are candidates for process improvement. On the other hand, functional completeness, fault ratio, input validity checking, response time, and throughput time are the strong aspects of the process. PMID:19745339

  4. Professional Ethics of Software Engineers: An Ethical Framework.

    PubMed

    Lurie, Yotam; Mark, Shlomo

    2016-04-01

    The purpose of this article is to propose an ethical framework for software engineers that connects software developers' ethical responsibilities directly to their professional standards. The implementation of such an ethical framework can overcome the traditional dichotomy between professional skills and ethical skills, which plagues the engineering professions, by proposing an approach to the fundamental tasks of the practitioner, i.e., software development, in which the professional standards are intrinsically connected to the ethical responsibilities. In so doing, the ethical framework improves the practitioner's professionalism and ethics. We call this approach Ethical-Driven Software Development (EDSD). EDSD manifests the advantages of an ethical framework as an alternative to the all too familiar approach in professional ethics that advocates "stand-alone codes of ethics". We believe that one outcome of this synergy between professional and ethical skills is simply better engineers. Moreover, since there are often different software solutions, which the engineer can provide to an issue at stake, the ethical framework provides a guiding principle, within the process of software development, that helps the engineer evaluate the advantages and disadvantages of different software solutions. It does not and cannot affect the end product in and of itself. However, it can and should make the software engineer more conscious and aware of the ethical ramifications of certain engineering decisions within the process. PMID:26047575

  5. Experiences with Integrating Simulation into a Software Engineering Curriculum

    ERIC Educational Resources Information Center

    Bollin, Andreas; Hochmuller, Elke; Mittermeir, Roland; Samuelis, Ladislav

    2012-01-01

    Software Engineering education must account for a broad spectrum of knowledge and skills software engineers will be required to apply throughout their professional life. Covering all the topics in depth within a university setting is infeasible due to curricular constraints as well as due to the inherent differences between educational…

  6. V&V Within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and Validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission-critical software. V&V is a systems engineering discipline that evaluates the software in a systems context, and is currently applied during the development of a specific application system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  7. Proceedings of the Twenty-Fourth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    2000-01-01

    On December 1 and 2, the Software Engineering Laboratory (SEL), a consortium composed of NASA/Goddard, the University of Maryland, and CSC, held the 24th Software Engineering Workshop (SEW), the last of the millennium. Approximately 240 people attended the 2-day workshop. Day 1 was composed of four sessions: International Influence of the Software Engineering Laboratory; Object Oriented Testing and Reading; Software Process Improvement; and Space Software. For the first session, three internationally known software process experts discussed the influence of the SEL with respect to software engineering research. In the Space Software session, prominent representatives from three different NASA sites (GSFC's Marti Szczur, the Jet Propulsion Laboratory's Rick Doyle, and the Ames Research Center IV&V Facility's Lou Blazy) discussed the future of space software in their respective centers. At the end of the first day, the SEW sponsored a reception at the GSFC Visitors' Center. Day 2 also provided four sessions: Using the Experience Factory; a panel discussion entitled "Software Past, Present, and Future: Views from Government, Industry, and Academia"; Inspections; and COTS. The day started with an excellent talk by CSC's Frank McGarry on "Attaining Level 5 in CMM Process Maturity." Session 2, the panel discussion on software, featured NASA Chief Information Officer Lee Holcomb (Government), our own Jerry Page (Industry), and Mike Evangelist of the National Science Foundation (Academia). Each presented his perspective on the most important developments in software in the past 10 years, in the present, and in the future.

  8. Engineering Software Suite Validates System Design

    NASA Technical Reports Server (NTRS)

    2007-01-01

    EDAptive Computing Inc.'s (ECI) EDAstar engineering software tool suite, created to capture and validate system design requirements, was significantly funded by NASA's Ames Research Center through five Small Business Innovation Research (SBIR) contracts. These programs specifically developed Syscape, used to capture executable specifications of multi-disciplinary systems, and VectorGen, used to automatically generate tests to ensure system implementations meet specifications. According to the company, the VectorGen tests considerably reduce the time and effort required to validate implementation of components, thereby ensuring their safe and reliable operation. EDASHIELD, an additional product offering from ECI, can be used to diagnose, predict, and correct errors after a system has been deployed using EDAstar-created models. Initial commercialization for EDAstar included application by a large prime contractor in a military setting, and customers include various branches within the U.S. Department of Defense, industry giants like the Lockheed Martin Corporation, Science Applications International Corporation, and Ball Aerospace and Technologies Corporation, as well as NASA's Langley and Glenn Research Centers.

  9. Proceedings of the Fifteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The Software Engineering Laboratory (SEL) is an organization sponsored by GSFC and created for the purpose of investigating the effectiveness of software engineering technologies when applied to the development of applications software. The goals of the SEL are: (1) to understand the software development process in the GSFC environment; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. Fifteen papers were presented at the Fifteenth Annual Software Engineering Workshop in five sessions: (1) SEL at age fifteen; (2) process improvement; (3) measurement; (4) reuse; and (5) process assessment. The sessions were followed by two panel discussions: (1) experiences in implementing an effective measurement program; and (2) software engineering in the 1980's. A summary of the presentations and panel discussions is given.

  10. Operational excellence (six sigma) philosophy: Application to software quality assurance

    SciTech Connect

    Lackner, M.

    1997-11-01

    This report contains viewgraphs on the operational excellence philosophy of six sigma applied to software quality assurance. The report outlines the following: the goal of six sigma; six sigma tools; manufacturing vs. administrative processes; software quality assurance document inspections; mapping the software quality assurance requirements document; failure mode effects analysis for the requirements document; measuring the right response variables; and questions.

  11. A report on NASA software engineering and Ada training requirements

    NASA Technical Reports Server (NTRS)

    Legrand, Sue; Freedman, Glenn B.; Svabek, L.

    1987-01-01

    NASA's software engineering and Ada skill base are assessed and information that may result in new models for software engineering, Ada training plans, and curricula are provided. A quantitative assessment which reflects the requirements for software engineering and Ada training across NASA is provided. A recommended implementation plan including a suggested curriculum with associated duration per course and suggested means of delivery is also provided. The distinction between education and training is made. Although it was directed to focus on NASA's need for the latter, the key relationships to software engineering education are also identified. A rationale and strategy for implementing a life cycle education and training program are detailed in support of improved software engineering practices and the transition to Ada.

  12. FSO and quality of service software prediction

    NASA Astrophysics Data System (ADS)

    Bouchet, O.; Marquis, T.; Chabane, M.; Alnaboulsi, M.; Sizun, H.

    2005-08-01

    Free-space optical (FSO) communication links constitute an alternative option to radio relay links and to optical cables facing growth needs in high-speed telecommunications (abundance of unregulated bandwidth, rapid installation, availability of low-cost optical components offering a high data rate, etc.). Their operationalisation requires a good knowledge of the atmospheric effects which can negatively affect propagation and the availability of the link, and thus the quality of service (QoS). Better control of these phenomena will allow for the evaluation of system performance and thus assist with improving reliability. The aim of this paper is to compare the behavior of an FSO link located in the south of France, in Toulouse (around 270 meters (0.2 mile) long, 34 Mbps data rate, 850 nm wavelength, and PDH frame), with airport meteorological data. The second aim of the paper is to assess in-house FSO quality of service prediction software by comparing simulations with the optical link data and the weather data. The analysis uses the in-house FSO quality of service prediction software ("FSO Prediction") developed by France Telecom Research & Development, which integrates new fog fading equations (compare to Kim et al.) and includes multiple effects (geometrical attenuation, atmospheric fading, rain, snow, scintillation and refraction attenuation due to atmospheric turbulence, and optical mispointing attenuation). The FSO link field trial, intended to enable the demonstration and evaluation of these different effects, is described, and preliminary results of the field trial, from December 2004 to May 2005, are then presented.
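
    For readers unfamiliar with the fog-fading relations mentioned above, the widely cited Kruse/Kim parameterization (given here only for illustration; the exact equations implemented in "FSO Prediction" are not reproduced in the abstract) expresses the specific attenuation in terms of the visibility V (km) and the wavelength lambda (nm):

        \beta_{\mathrm{fog}}\ [\mathrm{dB/km}] \approx \frac{3.91}{V}\left(\frac{\lambda}{550\,\mathrm{nm}}\right)^{-q},
        \qquad
        q = \begin{cases}
              1.6 & V > 50\ \mathrm{km} \\
              1.3 & 6 < V \le 50\ \mathrm{km} \\
              0.16\,V + 0.34 & 1 < V \le 6\ \mathrm{km} \\
              V - 0.5 & 0.5 < V \le 1\ \mathrm{km} \\
              0 & V \le 0.5\ \mathrm{km}
            \end{cases}

    In the Kim variant of this model, q falls to zero in dense fog, so the attenuation becomes essentially independent of wavelength; that is the regime that dominates link-availability predictions.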

  13. On the Prospects and Concerns of Integrating Open Source Software Environment in Software Engineering Education

    ERIC Educational Resources Information Center

    Kamthan, Pankaj

    2007-01-01

    Open Source Software (OSS) has introduced a new dimension in software community. As the development and use of OSS becomes prominent, the question of its integration in education arises. In this paper, the following practices fundamental to projects and processes in software engineering are examined from an OSS perspective: project management;…

  14. Automated Theorem Proving in High-Quality Software Design

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Swanson, Keith (Technical Monitor)

    2001-01-01

    The amount and complexity of software developed during the last few years have increased tremendously. In particular, programs are being used more and more in embedded systems (from car-brakes to plant-control). Many of these applications are safety-relevant, i.e. a malfunction of hardware or software can cause severe damage or loss. Tremendous risks are typically present in the area of aviation, (nuclear) power plants or (chemical) plant control. Here, even small problems can lead to thousands of casualties and huge financial losses. Large financial risks also exist when computer systems are used in the area of telecommunication (telephone, electronic commerce) or space exploration. Computer applications in this area are not only subject to safety considerations, but also security issues are important. All these systems must be designed and developed to guarantee high quality with respect to safety and security. Even in an industrial setting which is (or at least should be) aware of the high requirements in Software Engineering, many incidents occur. For example, the Warsaw Airbus crash was caused by an incomplete requirements specification. Uncontrolled reuse of an Ariane 4 software module was the reason for the Ariane 5 disaster. Some recent incidents in the telecommunication area, like illegal "cloning" of smart cards of D2 GSM mobile phones, or the extraction of (secret) passwords from German T-Online users, show that serious flaws can happen in this area as well. Due to the inherent complexity of computer systems, most authors claim that only a rigorous application of formal methods in all stages of the software life cycle can ensure high quality of the software and lead to truly safe and secure systems. In this paper, we examine the extent to which automated theorem proving can contribute to a more widespread application of formal methods and their tools, and what automated theorem provers (ATPs) must provide in order to be useful.

  15. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  16. Avionics Simulation, Development and Software Engineering

    NASA Technical Reports Server (NTRS)

    2002-01-01

    During this reporting period, all technical responsibilities were accomplished as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14), the MSFC EXPRESS Project Office (FD31), and the Huntsville Boeing Company. Accomplishments included: performing special tasks; supporting Software Review Board (SRB), Avionics Test Bed (ATB), and EXPRESS Software Control Panel (ESCP) activities; participating in technical meetings; and coordinating issues between the Boeing Company and the MSFC Project Office.

  17. On the engineering of crucial software

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.; Knight, J. C.; Gregory, S. T.

    1983-01-01

    The various aspects of the conventional software development cycle are examined. This cycle was the basis of the augmented approach contained in the original grant proposal. This cycle was found inadequate for crucial software development, and the justification for this opinion is presented. Several possible enhancements to the conventional software cycle are discussed. Software fault tolerance, a possible enhancement of major importance, is discussed separately. Formal verification using mathematical proof is considered. Automatic programming is a radical alternative to the conventional cycle and is discussed. Recommendations for a comprehensive approach are presented, and various experiments which could be conducted in AIRLAB are described.

  18. Large-scale visualization projects for teaching software engineering.

    PubMed

    Müller, Christoph; Reina, Guido; Burch, Michael; Weiskopf, Daniel

    2012-01-01

    The University of Stuttgart's software engineering major complements the traditional computer science major with more practice-oriented education. Two-semester software projects in various application areas offered by the university's different computer science institutes are a successful building block in the curriculum. With this realistic, complex project setting, students experience the practice of software engineering, including software development processes, technologies, and soft skills. In particular, visualization-based projects are popular with students. Such projects offer them the opportunity to gain profound knowledge that would hardly be possible with only regular lectures and homework assignments. PMID:24806629

  19. Software Engineering Research/Developer Collaborations in 2005

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom

    2006-01-01

    In CY 2005, three collaborations between software engineering technology providers and NASA software development personnel deployed three software engineering technologies on NASA development projects (a different technology on each project). The main purposes were to benefit the projects, infuse the technologies if beneficial into NASA, and give feedback to the technology providers to improve the technologies. Each collaboration project produced a final report. Section 2 of this report summarizes each project, drawing from the final reports and communications with the software developers and technology providers. Section 3 indicates paths to further infusion of the technologies into NASA practice. Section 4 summarizes some technology transfer lessons learned. Also included is an acronym list.

  20. Proceedings of the Eighteenth Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The workshop provided a forum for software practitioners from around the world to exchange information on the measurement, use, and evaluation of software methods, models, and tools. This year, approximately 450 people attended the workshop, which consisted of six sessions on the following topics: the Software Engineering Laboratory, measurement, technology assessment, advanced concepts, process, and software engineering issues in NASA. Three presentations were given in each of the topic areas. The content of those presentations and the research papers detailing the work reported are included in these proceedings. The workshop concluded with a tutorial session on how to start an Experience Factory.

  1. Quality measures and assurance for AI (Artificial Intelligence) software

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1988-01-01

    This report is concerned with the application of software quality and evaluation measures to AI software and, more broadly, with the question of quality assurance for AI software. Considered are not only the metrics that attempt to measure some aspect of software quality, but also the methodologies and techniques (such as systematic testing) that attempt to improve some dimension of quality, without necessarily quantifying the extent of the improvement. The report is divided into three parts. Part 1 reviews existing software quality measures, i.e., those that have been developed for, and applied to, conventional software. Part 2 considers the characteristics of AI software, the applicability and potential utility of measures and techniques identified in the first part, and reviews those few methods developed specifically for AI software. Part 3 presents an assessment and recommendations for the further exploration of this important area.

  2. Resilience Engineering in Critical Long Term Aerospace Software Systems: A New Approach to Spacecraft Software Safety

    NASA Astrophysics Data System (ADS)

    Dulo, D. A.

    Safety-critical software systems permeate spacecraft, and in a long-term venture like a starship they would be pervasive in every system of the spacecraft. Yet software failure today continues to plague both the systems and the organizations that develop them, resulting in the loss of life, time, money, and valuable system platforms. A starship cannot afford this type of software failure in long journeys away from home. A single software failure could have catastrophic results for the spaceship and the crew onboard. This paper will offer a new approach to developing safe, reliable software systems by focusing not on the traditional safety/reliability engineering paradigms but on a new paradigm: Resilience and Failure Obviation Engineering. The foremost objective of this approach is the obviation of failure, coupled with the ability of a software system to prevent or adapt to complex changing conditions in real time as a safety valve should failure occur, to ensure safe system continuity. Through this approach, safety is ensured through foresight to anticipate failure and to adapt to risk in real time before failure occurs. In a starship, this type of software engineering is vital. Through software developed in a resilient manner, a starship would have reduced or eliminated software failure, and would have the ability to rapidly adapt should a software system become unstable or unsafe. As a result, long-term software safety, reliability, and resilience would be present for a successful long-term starship mission.

  3. Empirical studies of design software: Implications for software engineering environments

    NASA Technical Reports Server (NTRS)

    Krasner, Herb

    1988-01-01

    The empirical studies team of MCC's Design Process Group conducted three studies in 1986-87 in order to gather data on professionals designing software systems in a range of situations. The first study (the Lift Experiment) used thinking aloud protocols in a controlled laboratory setting to study the cognitive processes of individual designers. The second study (the Object Server Project) involved the observation, videotaping, and data collection of a design team of a medium-sized development project over several months in order to study team dynamics. The third study (the Field Study) involved interviews with the personnel from 19 large development projects in the MCC shareholders in order to study how the process of design is affected by organizational and project behavior. The focus of this report will be on key observations of the design process (at several levels) and their implications for the design of environments.

  4. SAPHIRE 8 Software Quality Assurance Plan

    SciTech Connect

    Curtis Smith

    2010-02-01

    This Quality Assurance (QA) Plan documents the QA activities that will be managed by the INL related to JCN N6423. The NRC developed the SAPHIRE computer code for performing probabilistic risk assessments (PRAs) using a personal computer (PC) at the Idaho National Laboratory (INL) under Job Code Number (JCN) L1429. SAPHIRE started out as a feasibility study for a PRA code to be run on a desktop PC and evolved through several phases into a state-of-the-art PRA code. The developmental activity of SAPHIRE was the result of two concurrent important events: the tremendous expansion of PC software and hardware capability in the 1990s and the onset of a risk-informed regulation era.

  5. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  6. Software Engineering Code of Ethics and Professional Practice.

    PubMed

    2001-04-01

    The Software Engineering Code of Ethics and Professional Practice, intended as a standard for teaching and practicing software engineering, documents the ethical and professional obligations of software engineers. The code should instruct practitioners about the standards society expects them to meet, about what their peers strive for, and about what to expect of one another. In addition, the code should also inform the public about the responsibilities that are important to the profession. Adopted in 2000 by the IEEE Computer Society and the ACM--two leading international computing societies--the code of ethics is intended as a guide for members of the evolving software engineering profession. The code was developed by a multinational task force with additional input from other professionals from industry, government posts, military installations, and educational professions. PMID:11349363

  7. The Hidden Job Requirements for a Software Engineer

    SciTech Connect

    Marinovici, Maria C.; Kirkham, Harold; Glass, Kevin A.

    2014-01-09

    In a world increasingly operated by computers, where innovation depends on software, the software engineer’s role is changing continuously and gaining new dimensions. In commercial software development as well as scientific research environments, the way software developers are perceived is changing, because they are more important to the business than ever before. Nowadays, their job requires skills extending beyond the regular job description posted by HR, and more is expected. To advance and thrive in their new roles, software engineers must embrace change and practice the themes of the new era (integration, collaboration, and optimization). The challenges may be somewhat intimidating for freshly graduated software engineers. Through this paper, the authors hope to set them on a path to success, by helping them relinquish their fear of the unknown.

  8. Infusing Software Engineering Technology into Practice at NASA

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas; Feather, Martin S.; Hinchey, Michael; Markosia, Lawrence

    2006-01-01

    We present an ongoing effort of the NASA Software Engineering Initiative to encourage the use of advanced software engineering technology on NASA projects. Technology infusion is in general a difficult process, yet this effort seems to have found a modest approach that is successful for some types of technologies. We outline the process and describe the experience of the technology infusions that occurred over a two-year period. We also present some lessons from the experiences.

  9. Study of a Quantum Framework for Search Based Software Engineering

    NASA Astrophysics Data System (ADS)

    Wu, Nan; Song, Fangmin; Li, Xiangdong

    2013-06-01

    Search Based Software Engineering (SBSE) is widely used in software engineering to identify optimal solutions. The traditional methods and algorithms used in SBSE are criticized due to their high costs. In this paper, we propose a rapid modified-Grover quantum searching method for SBSE, and theoretically this method can be applied to any search-space structure and any type of search problem.
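
    As context for the cost advantage implied above (the paper's specific modification of Grover's algorithm is not reproduced here), standard Grover search over an unstructured space of N candidate solutions containing M acceptable ones requires on the order of

        R \approx \frac{\pi}{4}\sqrt{\frac{N}{M}}

    oracle iterations, whereas classical random sampling needs on the order of N/M fitness evaluations; this quadratic gap is the motivation for applying quantum search to expensive SBSE objective functions.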

  10. Proceedings of the Twenty-Third Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Twenty-third Annual Software Engineering Workshop (SEW) provided 20 presentations designed to further the goals of the Software Engineering Laboratory (SEL) of NASA-GSFC. The presentations were selected for their creativity. The sessions, which were held on 2-3 December 1998, centered on the SEL, Experimentation, Inspections, Fault Prediction, Verification and Validation, and Embedded Systems and Safety-Critical Systems.

  11. Models and metrics for software management and engineering

    NASA Technical Reports Server (NTRS)

    Basili, V. R.

    1988-01-01

    This paper attempts to characterize and present a state of the art view of several quantitative models and metrics of the software life cycle. These models and metrics can be used to aid in managing and engineering software projects. They deal with various aspects of the software process and product, including resource allocation and estimation, changes and errors, size, complexity and reliability. Some indication is given of the extent to which the various models have been used and the success they have achieved.

  12. Imprinting Community College Computer Science Education with Software Engineering Principles

    ERIC Educational Resources Information Center

    Hundley, Jacqueline Holliday

    2012-01-01

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and…

  13. Architecture independent environment for developing engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Valimohamed, Karim A.; Lopez, L. A.

    1990-01-01

    Engineers are constantly faced with solving problems of increasing complexity and detail. Multiple Instruction stream Multiple Data stream (MIMD) computers have been developed to overcome the performance limitations of serial computers. The hardware architectures of MIMD computers vary considerably and are much more sophisticated than serial computers. Developing large scale software for a variety of MIMD computers is difficult and expensive. There is a need to provide tools that facilitate programming these machines. First, the issues that must be considered to develop those tools are examined. The two main areas of concern were architecture independence and data management. Architecture independent software facilitates software portability and improves the longevity and utility of the software product. It provides some form of insurance for the investment of time and effort that goes into developing the software. The management of data is a crucial aspect of solving large engineering problems. It must be considered in light of the new hardware organizations that are available. Second, the functional design and implementation of a software environment that facilitates developing architecture independent software for large engineering applications are described. The topics of discussion include: a description of the model that supports the development of architecture independent software; identifying and exploiting concurrency within the application program; data coherence; engineering data base and memory management.

  14. Imprinting Community College Computer Science Education with Software Engineering Principles

    NASA Astrophysics Data System (ADS)

    Hundley, Jacqueline Holliday

    Although the two-year curriculum guide includes coverage of all eight software engineering core topics, the computer science courses taught in Alabama community colleges limit student exposure to the programming, or coding, phase of the software development lifecycle and offer little experience in requirements analysis, design, testing, and maintenance. We proposed that some software engineering principles can be incorporated into the introductory-level of the computer science curriculum. Our vision is to give community college students a broader exposure to the software development lifecycle. For those students who plan to transfer to a baccalaureate program subsequent to their community college education, our vision is to prepare them sufficiently to move seamlessly into mainstream computer science and software engineering degrees. For those students who plan to move from the community college to a programming career, our vision is to equip them with the foundational knowledge and skills required by the software industry. To accomplish our goals, we developed curriculum modules for teaching seven of the software engineering knowledge areas within current computer science introductory-level courses. Each module was designed to be self-supported with suggested learning objectives, teaching outline, software tool support, teaching activities, and other material to assist the instructor in using it.

  15. Software Engineering Infrastructure in a Large Virtual Campus

    ERIC Educational Resources Information Center

    Cristobal, Jesus; Merino, Jorge; Navarro, Antonio; Peralta, Miguel; Roldan, Yolanda; Silveira, Rosa Maria

    2011-01-01

    Purpose: The design, construction and deployment of a large virtual campus are a complex issue. Present virtual campuses are made of several software applications that complement e-learning platforms. In order to develop and maintain such virtual campuses, a complex software engineering infrastructure is needed. This paper aims to analyse the…

  16. Software Quality and Copyright: Issues in Computer-Assisted Instruction.

    ERIC Educational Resources Information Center

    Helm, Virginia

    The two interconnected problems of educational quality and piracy are described and analyzed in this book, which begins with an investigation of the accusations regarding the alleged dismal quality of educational software. The reality behind accusations of rampant piracy and the effect of piracy on the quality of educational software is examined…

  17. General purpose optimization software for engineering design

    NASA Technical Reports Server (NTRS)

    Vanderplaats, G. N.

    1990-01-01

    The author has developed several general purpose optimization programs over the past twenty years. The earlier programs were developed as research codes and served that purpose reasonably well. However, in taking the formal step from research to industrial application programs, several important lessons have been learned. Among these are the importance of clear documentation, immediate user support, and consistent maintenance. Most important has been the issue of providing software that gives a good, or at least acceptable, design at minimum computational cost. Here, the basic issues in developing optimization software for industrial applications are outlined and issues of convergence rate, reliability, and relative minima are discussed. Considerable feedback has been received from users, and new software is being developed to respond to identified needs. The basic capabilities of this software are outlined. A major motivation for the development of commercial grade software is ease of use and flexibility, and these issues are discussed with reference to general multidisciplinary applications. It is concluded that design productivity can be significantly enhanced by the more widespread use of optimization as an everyday design tool.

  18. The Business Case for Automated Software Engineering

    NASA Technical Reports Server (NTRS)

    Menzies, Tim; Elrawas, Oussama; Hihn, Jairus M.; Feather, Martin S.; Madachy, Ray; Boehm, Barry

    2007-01-01

    Adoption of advanced automated SE (ASE) tools would be more favored if a business case could be made that these tools are more valuable than alternate methods. In theory, software prediction models can be used to make that case. In practice, this is complicated by the 'local tuning' problem. Normally, predictors for software effort, defects, and threats use local data to tune their predictions. Such local tuning data is often unavailable. This paper shows that assessing the relative merits of different SE methods need not require precise local tunings. STAR 1 is a simulated annealer plus a Bayesian post-processor that explores the space of possible local tunings within software prediction models. STAR 1 ranks project decisions by their effects on effort, defects, and threats. In experiments with NASA systems, STAR 1 found one project where ASE tools were essential for minimizing effort, defects, and threats, and another project where ASE tools were merely optional.
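
    The record above does not reproduce STAR 1's internals. As a hedged illustration of the general technique it names, the following is a minimal simulated-annealing sketch over a toy space of project decisions; the decision names and the cost function are hypothetical stand-ins for the effort/defect/threat predictors, not the paper's models.

      # Minimal simulated-annealing sketch over a toy space of project decisions.
      # This is NOT STAR 1; cost() is a hypothetical stand-in for the kind of
      # effort/defect/threat predictors the abstract refers to.
      import math
      import random

      DECISIONS = ["use_ASE_tools", "extra_reviews", "more_testing", "hire_senior_staff"]

      def cost(choice):
          # Toy scoring: each enabled decision adds process cost but lowers
          # predicted defects; lower total is better.
          effort = 100 + 10 * sum(choice)
          defects = 50 * math.exp(-0.8 * sum(choice))
          return effort + 3 * defects

      def anneal(steps=5000, temp0=10.0):
          current = [random.randint(0, 1) for _ in DECISIONS]
          best = current[:]
          for step in range(steps):
              temp = temp0 * (1 - step / steps) + 1e-6
              candidate = current[:]
              candidate[random.randrange(len(DECISIONS))] ^= 1   # toggle one decision
              delta = cost(candidate) - cost(current)
              if delta < 0 or random.random() < math.exp(-delta / temp):
                  current = candidate
                  if cost(current) < cost(best):
                      best = current[:]
          return best, cost(best)

      if __name__ == "__main__":
          best, score = anneal()
          print(dict(zip(DECISIONS, best)), round(score, 1))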

  19. An Investigation of an Open-Source Software Development Environment in a Software Engineering Graduate Course

    ERIC Educational Resources Information Center

    Ge, Xun; Huang, Kun; Dong, Yifei

    2010-01-01

    A semester-long ethnography study was carried out to investigate project-based learning in a graduate software engineering course through the implementation of an Open-Source Software Development (OSSD) learning environment, which featured authentic projects, learning community, cognitive apprenticeship, and technology affordances. The study…

  20. HydroShare: Applying professional software engineering to a new NSF-funded large software project

    NASA Astrophysics Data System (ADS)

    Idaszak, R.; Tarboton, D. G.; Ames, D.; Saleem Arrigo, J. A.; Band, L. E.; Bedig, A.; Castronova, A. M.; Christopherson, L.; Coposky, J.; Couch, A.; Dash, P.; Gan, T.; Goodall, J.; Gustafson, K.; Heard, J.; Hooper, R. P.; Horsburgh, J. S.; Jackson, S.; Johnson, H.; Maidment, D. R.; Mbewe, P.; Merwade, V.; Miles, B.; Reeder, S.; Russell, T.; Song, C.; Taylor, A.; Thakur, S.; Valentine, D. W.; Whiteaker, T. L.

    2013-12-01

    HydroShare is an online, collaborative system being developed for sharing hydrologic data and models as part of the NSF's Software Infrastructure for Sustained Innovation (SI2) program (NSF collaborative award numbers 1148453 and 1148090). HydroShare involves a large software development effort requiring cooperative research and distributed software development between domain scientists, professional software engineers (here 'professional' denotes previous commercial experience in the application of modern software engineering), and university software developers. HydroShare expands upon the data sharing capabilities of the Hydrologic Information System of the Consortium of Universities for the Advancement of Hydrologic Sciences, Inc. (CUAHSI) by broadening the classes of data accommodated, expanding capability to include the sharing of models and model components, and taking advantage of emerging social media functionality to enhance information about and collaboration around hydrologic data and models. With a goal of enabling better science concomitant with improved sustainable software practices, we will describe our approach, experiences, and lessons learned thus far in applying professional software engineering to a large NSF-funded software project from the project's onset.

  1. Support for Different Roles in Software Engineering Master's Thesis Projects

    ERIC Educational Resources Information Center

    Host, M.; Feldt, R.; Luders, F.

    2010-01-01

    Like many engineering programs in Europe, the final part of most Swedish software engineering programs is a longer project in which the students write a Master's thesis. These projects are often conducted in cooperation between a university and industry, and the students often have two supervisors, one at the university and one in industry. In…

  2. Experimental software engineering: Seventeen years of lessons in the SEL

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank E.

    1992-01-01

    Seven key principles developed by the Software Engineering Laboratory (SEL) at the Goddard Space Flight Center (GSFC) of the National Aeronautics and Space Administration (NASA) are described. For the past 17 years, the SEL has been experimentally analyzing the development of production software as varying techniques and methodologies are applied in this one environment. The SEL has collected, archived, and studied detailed measures from more than 100 flight dynamics projects, thereby gaining significant insight into the effectiveness of numerous software techniques, as well as extensive experience in the overall effectiveness of 'Experimental Software Engineering'. This experience has helped formulate follow-on studies in the SEL, and it has helped other software organizations better understand just what can be accomplished and what cannot be accomplished through experimentation.

  3. Current research in the Software Engineering Laboratory (SEL)

    NASA Technical Reports Server (NTRS)

    Card, D.; Basili, V. R.; Zelkowitz, M.

    1983-01-01

    The basic goal of software engineering is to produce the best possible software at the lowest possible cost. Many practices, tools, and techniques (collectively referred to as technologies) were developed that purport to help do this, some of which have become widely accepted in the software industry. However, few of these technologies were effectively evaluated experimentally. This is due in large part to an insufficient understanding of the software development process, a lack of recognized standards for measurement, and the prohibitive cost of large-scale controlled experiments. The analysis described addresses some of these issues. The specific objectives of this study were to: measure technology use in a production environment; develop a model for evaluating software engineering technologies; and evaluate the effects on productivity and reliability of some specific technologies.

  4. Proposing an Evidence-Based Strategy for Software Requirements Engineering.

    PubMed

    Lindoerfer, Doris; Mansmann, Ulrich

    2016-01-01

    This paper discusses an evidence-based approach to software requirements engineering. The approach is called evidence-based, since it uses publications on the specific problem as a surrogate for stakeholder interests, to formulate risks and testing experiences. This complements the idea that agile software development models are more relevant, in which requirements and solutions evolve through collaboration between self-organizing cross-functional teams. The strategy is exemplified and applied to the development of a Software Requirements list used to develop software systems for patient registries. PMID:27577464

  5. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 1: Introduction and user's guide

    NASA Technical Reports Server (NTRS)

    1983-01-01

    Reporting software programs provide formatted listings and summary reports of the Software Engineering Laboratory (SEL) data base contents. The operating procedures and system information for 18 different reporting software programs are described. Sample output reports from each program are provided.

  6. Software Engineering Research/Developer Collaborations (C104)

    NASA Technical Reports Server (NTRS)

    Shell, Elaine; Shull, Forrest

    2005-01-01

    The goal of this collaboration was to produce Flight Software Branch (FSB) process standards for software inspections that could be used across three new missions within the FSB. The standard was developed by Dr. Forrest Shull (Fraunhofer Center for Experimental Software Engineering, Maryland) using the Perspective-Based Inspection (PBI) approach (PBI research has been funded by SARP), then tested on a pilot Branch project. Because the short time scale of the collaboration ruled out a quantitative evaluation, it would be decided whether the standard was suitable for roll-out to other Branch projects based on a qualitative measure: whether the standard received high ratings from Branch personnel as to usability and overall satisfaction. The project used for piloting the Perspective-Based Inspection approach was a multi-mission framework designed for reuse. This was a good choice because key representatives from the three new missions would be involved in the inspections. The perspective-based approach was applied to produce inspection procedures tailored for the specific quality needs of the branch. The technical information to do so was largely drawn through a series of interviews with Branch personnel. The framework team used the procedures to review requirements. The inspections were useful for indicating that a restructuring of the requirements document was needed, which led to changes in the development project plan. The standard was sent out to other Branch personnel for review. Branch personnel were very positive. However, important changes were identified because the perspective of Attitude Control System (ACS) developers had not been adequately represented, a result of the specific personnel interviewed. The net result is that with some further work to incorporate the ACS perspective, and in synchrony with the roll out of independent Branch standards, the PBI approach will be implemented in the FSB. Also, the project intends to continue its collaboration with

  7. Avionics Simulation, Development and Software Engineering

    NASA Technical Reports Server (NTRS)

    Francis, Ronald C.; Settle, Gray; Tobbe, Patrick A.; Kissel, Ralph; Glaese, John; Blanche, Jim; Wallace, L. D.

    2001-01-01

    This monthly report summarizes the work performed under contract NAS8-00114 for Marshall Space Flight Center in the following tasks: 1) Purchase Order No. H-32831D, Task Order 001A, GPB Program Software Oversight; 2) Purchase Order No. H-32832D, Task Order 002, ISS EXPRESS Racks Software Support; 3) Purchase Order No. H-32833D, Task Order 003, SSRMS Math Model Integration; 4) Purchase Order No. H-32834D, Task Order 004, GPB Program Hardware Oversight; 5) Purchase Order No. H-32835D, Task Order 005, Electrodynamic Tether Operations and Control Analysis; 6) Purchase Order No. H-32837D, Task Order 007, SRB Command Receiver/Decoder; and 7) Purchase Order No. H-32838D, Task Order 008, AVGS/DART SW and Simulation Support

  8. An overview of the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    1994-01-01

    This report describes the background and structure of the SEL organization, the SEL process improvement approach, and its experimentation and data collection process. Results of some sample SEL studies are included. It includes a discussion of the overall implication of trends observed over 17 years of process improvement efforts and looks at the return on investment based on a comparison of total investment in process improvement with the measurable improvements seen in the organization's software product.

  9. Multiple Viewpoints System/ Software Engineering for Space

    NASA Astrophysics Data System (ADS)

    Blondelle, Gael; Panunzio, Marco; Pequery, Jerome; Bats, Melanie; Garcia, Gerald; Brun, Cedric

    2013-08-01

    This paper presents a return of experience on using viewpoint-oriented modeling to design on-board software for satellites. First, we demonstrate the interest of integrating heterogeneous viewpoints in a tool to cover the development process of an embedded system. Then, we recall the Space Component Model, its implementation with Obeo Designer, and the capability to extend it with specific purpose Domain Specific Languages. Last, we expose further viewpoints that could be implemented to address new aspects like safety or interoperability.

  10. Quality Assurance in Software Development: An Exploratory Investigation in Software Project Failures and Business Performance

    ERIC Educational Resources Information Center

    Ichu, Emmanuel A.

    2010-01-01

    Software quality is perhaps one of the most sought-after attributes in product development, however; this goal is unattained. Problem factors in software development and how these have affected the maintainability of the delivered software systems requires a thorough investigation. It was, therefore, very important to understand software…

  11. Object oriented development of engineering software using CLIPS

    NASA Technical Reports Server (NTRS)

    Yoon, C. John

    1991-01-01

    Engineering applications involve numeric complexity and manipulation of large amounts of data. Traditionally, numeric computation has been the main concern in developing engineering software. As engineering application software became larger and more complex, management of resources such as data, rather than numeric complexity, has become the major software design problem. Object-oriented design and implementation methodologies can improve the reliability, flexibility, and maintainability of the resulting software; however, some tasks are better solved with the traditional procedural paradigm. The C Language Integrated Production System (CLIPS), with its deffunction and defgeneric constructs, supports the procedural paradigm. The natural blending of object-oriented and procedural paradigms has been cited as the reason for the popularity of the C++ language. The CLIPS Object Oriented Language's (COOL) object-oriented features are more versatile than C++'s. A software design methodology appropriate for engineering software, based on object-oriented and procedural approaches and implemented in CLIPS, was outlined. A method for sensor placement for Space Station Freedom is being implemented in COOL as a sample problem.

  12. Software quality assurance and software safety in the Biomed Control System

    SciTech Connect

    Singh, R.P.; Chu, W.T.; Ludewigt, B.A.; Marks, K.M.; Nyman, M.A.; Renner, T.R.; Stradtner, R.

    1989-10-31

    The Biomed Control System is a hardware/software system used for the delivery, measurement, and monitoring of heavy-ion beams in the patient treatment and biology experiment rooms in the Bevalac at the Lawrence Berkeley Laboratory (LBL). This paper describes some aspects of this system, including historical background and philosophy, configuration management, hardware features that facilitate software testing, software testing procedures, the release of new software, quality assurance, safety, and operator monitoring. 3 refs.

  13. A software quality model and metrics for risk assessment

    NASA Technical Reports Server (NTRS)

    Hyatt, L.; Rosenberg, L.

    1996-01-01

    A software quality model and its associated attributes are defined and used as the basis for a discussion of risk. Specific quality goals and attributes are selected based on their importance to a software development project and their ability to be quantified. Risks that can be determined by the model's metrics are identified. A core set of metrics relating to the software development process and its products is defined. Measurements for each metric and their usability and applicability are discussed.
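
    The abstract names a core metric set without listing it. As a hedged illustration only, the sketch below rolls a few normalized product and process metrics up into risk indicators; the metric names, thresholds, and weighting are hypothetical, not the paper's.

      # Hypothetical roll-up of a small metric set into risk indicators.
      # Metric names and thresholds are illustrative only, not the paper's.

      METRICS = {
          # name: (measured value, "risky" threshold)
          "defect_density":        (4.2, 5.0),    # defects per KSLOC
          "requirements_churn":    (0.18, 0.25),  # fraction changed per month
          "cyclomatic_complexity": (14.0, 20.0),  # mean per module
          "test_coverage":         (0.63, 0.70),  # statement coverage (higher is better)
      }
      HIGHER_IS_BETTER = {"test_coverage"}

      def normalized_risk(name, value, threshold):
          """Return a 0..1 risk indicator (1.0 = at or beyond the risky threshold)."""
          ratio = value / threshold
          if name in HIGHER_IS_BETTER:
              ratio = threshold / max(value, 1e-9)
          return min(ratio, 1.0)

      def quality_risk(metrics):
          scores = {n: normalized_risk(n, v, t) for n, (v, t) in metrics.items()}
          return scores, sum(scores.values()) / len(scores)

      if __name__ == "__main__":
          per_metric, overall = quality_risk(METRICS)
          for name, score in per_metric.items():
              print(f"{name:24s} risk={score:.2f}")
          print(f"{'overall':24s} risk={overall:.2f}")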

  14. Software Estimates Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    2002-01-01

    Simulation-Based Cost Model (SiCM) is a computer program that simulates pertinent aspects of the testing of rocket engines for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.
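
    SiCM itself is not reproduced in this record. The following is a hedged Monte Carlo sketch of the kind of phase-cost roll-up the abstract describes (labor plus materials per phase, plus facility maintenance during the testing phase), with entirely hypothetical staffing levels, rates, and durations.

      # Minimal Monte Carlo sketch of phase-based test-project costing, in the
      # spirit of the SiCM description above. All rates, counts, and durations
      # below are hypothetical, not values from SiCM.
      import random
      import statistics

      PHASES = {
          # phase: (engineers, technicians, eng_rate, tech_rate, duration_weeks, materials)
          "design":       (4, 1, 3000, 1800, (8, 12),  20_000),
          "construction": (2, 6, 3000, 1800, (10, 16), 150_000),
          "testing":      (3, 4, 3000, 1800, (6, 10),  40_000),
      }
      FACILITY_MAINT_PER_WEEK = 12_000   # applies to the testing phase only

      def simulate_once():
          costs = {}
          for phase, (eng, tech, er, tr, (lo, hi), materials) in PHASES.items():
              weeks = random.uniform(lo, hi)
              cost = weeks * (eng * er + tech * tr) + materials
              if phase == "testing":
                  cost += weeks * FACILITY_MAINT_PER_WEEK
              costs[phase] = cost
          return costs

      def simulate(iterations=10_000):
          totals = {phase: [] for phase in PHASES}
          for _ in range(iterations):
              for phase, cost in simulate_once().items():
                  totals[phase].append(cost)
          for phase, samples in totals.items():
              print(f"{phase:12s} mean={statistics.mean(samples):12,.0f} "
                    f"stdev={statistics.stdev(samples):10,.0f}")

      if __name__ == "__main__":
          simulate()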

  15. Software architecture and engineering for patient records: current and future.

    PubMed

    Weng, Chunhua; Levine, Betty A; Mun, Seong K

    2009-05-01

    During the "The National Forum on the Future of the Defense Health Information System," a track focusing on "Systems Architecture and Software Engineering" included eight presenters. These presenters identified three key areas of interest in this field, which include the need for open enterprise architecture and a federated database design, net centrality based on service-oriented architecture, and the need for focus on software usability and reusability. The eight panelists provided recommendations related to the suitability of service-oriented architecture and the enabling technologies of grid computing and Web 2.0 for building health services research centers and federated data warehouses to facilitate large-scale collaborative health care and research. Finally, they discussed the need to leverage industry best practices for software engineering to facilitate rapid software development, testing, and deployment. PMID:19562959

  16. Software Engineering Research/Developer Collaborations in 2004 (C104)

    NASA Technical Reports Server (NTRS)

    Pressburger, Tom; Markosian, Lawrence

    2005-01-01

    In 2004, six collaborations between software engineering technology providers and NASA software development personnel deployed a total of five software engineering technologies (for references, see Section 7.2) on the NASA projects. The main purposes were to benefit the projects, infuse the technologies if beneficial into NASA, and give feedback to the technology providers to improve the technologies. Each collaboration project produced a final report (for references, see Section 7.1). Section 2 of this report summarizes each project, drawing from the final reports and communications with the software developers and technology providers. Section 3 indicates paths to further infusion of the technologies into NASA practice. Section 4 summarizes some technology transfer lessons learned. Section 6 lists the acronyms used in this report.

  17. Problem-Solving Methods in Agent-Oriented Software Engineering

    NASA Astrophysics Data System (ADS)

    Bogg, Paul; Beydoun, Ghassan; Low, Graham

    Problem-solving methods (PSM) are abstract structures that describe specific reasoning processes employed to solve a set of similar problems. We envisage that off-the-shelf PSMs can assist in the development of agent-oriented solutions, not only as reusable and extensible components that software engineers employ for designing agent architecture solutions, but just as importantly as a set of runtime capabilities that agents themselves dynamically employ in order to solve problems. This chapter describes PSMs for agent-oriented software engineering (AOSE) that address interaction-dependent problem-solving such as negotiation or cooperation. An extension to an AOSE methodology MOBMAS is proposed whereby PSMs are integrated in the software development phases of MAS Organization Design, Internal Design, and Interaction Design. In this way, knowledge engineering drives the development of agent-oriented systems.

  18. V & V Within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission critical software. This paper describes the working group's success in identifying V&V tasks that could be performed in the domain engineering and transition levels of reuse-based software engineering. The primary motivation for V&V at the domain level is to provide assurance that the domain requirements are correct and that the domain artifacts correctly implement the domain requirements. A secondary motivation is the possible elimination of redundant V&V activities at the application level. The group also considered the criteria and motivation for performing V&V in domain engineering.

  19. Artificial Intelligence Software Engineering (AISE) model

    NASA Technical Reports Server (NTRS)

    Kiss, Peter A.

    1990-01-01

    The American Institute of Aeronautics and Astronautics has initiated a committee on standards for Artificial Intelligence. Presented are the initial efforts of one of the working groups of that committee. A candidate model is presented for the development life cycle of knowledge based systems (KBSs). The intent is for the model to be used by the aerospace community and eventually be evolved into a standard. The model is rooted in the evolutionary model, borrows from the spiral model, and is embedded in the standard Waterfall model for software development. Its intent is to satisfy the development of both stand-alone and embedded KBSs. The phases of the life cycle are shown and detailed as are the review points that constitute the key milestones throughout the development process. The applicability and strengths of the model are discussed along with areas needing further development and refinement by the aerospace community.

  20. Repository-Based Software Engineering Program: Working Program Management Plan

    NASA Technical Reports Server (NTRS)

    1993-01-01

    Repository-Based Software Engineering Program (RBSE) is a National Aeronautics and Space Administration (NASA) sponsored program dedicated to introducing and supporting common, effective approaches to software engineering practices. The process of conceiving, designing, building, and maintaining software systems by using existing software assets that are stored in a specialized operational reuse library or repository, accessible to system designers, is the foundation of the program. In addition to operating a software repository, RBSE promotes (1) software engineering technology transfer, (2) academic and instructional support of reuse programs, (3) the use of common software engineering standards and practices, (4) software reuse technology research, and (5) interoperability between reuse libraries. This Program Management Plan (PMP) is intended to communicate program goals and objectives, describe major work areas, and define a management report and control process. This process will assist the Program Manager, University of Houston at Clear Lake (UHCL) in tracking work progress and describing major program activities to NASA management. The goal of this PMP is to make managing the RBSE program a relatively easy process that improves the work of all team members. The PMP describes work areas addressed and work efforts being accomplished by the program; however, it is not intended as a complete description of the program. Its focus is on providing management tools and management processes for monitoring, evaluating, and administering the program; and it includes schedules for charting milestones and deliveries of program products. The PMP was developed by soliciting and obtaining guidance from appropriate program participants, analyzing program management guidance, and reviewing related program management documents.

  1. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost benefits modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas that limit our ability to transfer high-assurance software engineering tools into practice.

  2. The development and technology transfer of software engineering technology at NASA. Johnson Space Center

    NASA Technical Reports Server (NTRS)

    Pitman, C. L.; Erb, D. M.; Izygon, M. E.; Fridge, E. M., III; Roush, G. B.; Braley, D. M.; Savely, R. T.

    1992-01-01

    The United States' big space projects of the next decades, such as Space Station and the Human Exploration Initiative, will need the development of many millions of lines of mission critical software. NASA-Johnson (JSC) is identifying and developing some of the Computer Aided Software Engineering (CASE) technology that NASA will need to build these future software systems. The goal is to improve the quality and the productivity of large software development projects. New trends in CASE technology are outlined, and how the Software Technology Branch (STB) at JSC is endeavoring to provide some of these CASE solutions for NASA is described. Key software technology components include knowledge-based systems, software reusability, user interface technology, reengineering environments, management systems for the software development process, software cost models, repository technology, and open, integrated CASE environment frameworks. The paper presents the status and long-term expectations for CASE products. The STB's Reengineering Application Project (REAP), Advanced Software Development Workstation (ASDW) project, and software development cost model (COSTMODL) project are then discussed. Some of the general difficulties of technology transfer are introduced, and a process developed by STB for CASE technology insertion is described.

  3. Seamless Method- and Model-based Software and Systems Engineering

    NASA Astrophysics Data System (ADS)

    Broy, Manfred

    Today, engineering software-intensive systems is still more or less a handicraft, or at most at the level of manufacturing. Many steps are done ad hoc and not in a fully systematic way. Applied methods, if any, are not scientifically justified or justified by empirical data, and as a result carrying out large software projects is still an adventure. However, there is no reason why the development of software-intensive systems cannot be done in the future with the same precision and scientific rigor as in established engineering disciplines. To do that, however, a number of scientific and engineering challenges have to be mastered. The first one aims at a deep understanding of the essentials of carrying out such projects, which includes appropriate models and effective management methods. What is needed is a portfolio of models and methods, coming together with comprehensive tool support, as well as deep insights into the obstacles of developing software-intensive systems and a portfolio of established and proven techniques and methods with clear profiles and rules that indicate when each method is ready for application. In the following we argue that there is scientific evidence and enough research results so far to be confident that solid engineering of software-intensive systems can be achieved in the future. However, quite a number of scientific research problems have yet to be solved.

  4. Software Estimates Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Smith, C. L.

    2003-01-01

    Simulation-Based Cost Model (SiCM), a discrete event simulation developed in Extend, simulates pertinent aspects of the testing of rocket propulsion test articles for the purpose of estimating the costs of such testing during time intervals specified by its users. A user enters input data for control of simulations; information on the nature of, and activity in, a given testing project; and information on resources. Simulation objects are created on the basis of this input. Costs of the engineering-design, construction, and testing phases of a given project are estimated from numbers and labor rates of engineers and technicians employed in each phase; the duration of each phase; costs of materials used in each phase; and, for the testing phase, the rate of maintenance of the testing facility. The three main outputs of SiCM are (1) a curve, updated at each iteration of the simulation, that shows overall expenditures vs. time during the interval specified by the user; (2) a histogram of the total costs from all iterations of the simulation; and (3) a table displaying means and variances of cumulative costs for each phase from all iterations. Other outputs include spending curves for each phase.

  5. Quantum searching application in search based software engineering

    NASA Astrophysics Data System (ADS)

    Wu, Nan; Song, FangMin; Li, Xiangdong

    2013-05-01

    Search Based Software Engineering (SBSE) is widely used in software engineering for identifying optimal solutions. However, the traditional algorithms used in SBSE have no polynomial-time solution, which makes their cost very high. In this paper, we analyze and compare several quantum search algorithms that could be applied to SBSE: the quantum adiabatic evolution searching algorithm, fixed-point quantum search (FPQS), quantum walks, and a rapid modified Grover quantum searching method. Grover's algorithm is considered the best choice for large-scale unstructured data searching, and theoretically it can be applied to any search-space structure and any type of searching problem.
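
    Neither this record nor the earlier one includes the modified method itself. As a small, hedged illustration of the baseline these papers build on, the following classically simulates the statevector of the standard Grover iteration over an unstructured space with one marked item; it is not the proposed modified-Grover algorithm.

      # Classical statevector simulation of standard Grover search (one marked item).
      # This illustrates the baseline algorithm only, not the modified method above.
      import math
      import numpy as np

      def grover(n_qubits=4, marked=7):
          N = 2 ** n_qubits
          state = np.full(N, 1 / math.sqrt(N))          # uniform superposition

          oracle = np.ones(N)
          oracle[marked] = -1                           # phase-flip the marked item

          iterations = int(round(math.pi / 4 * math.sqrt(N)))
          for _ in range(iterations):
              state = oracle * state                    # apply the oracle
              mean = state.mean()
              state = 2 * mean - state                  # inversion about the mean

          probs = state ** 2
          return int(np.argmax(probs)), float(probs[marked])

      if __name__ == "__main__":
          found, p = grover()
          print(f"found index {found} with probability {p:.3f}")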

  6. Parallelization of Rocket Engine Simulator Software (PRESS)

    NASA Technical Reports Server (NTRS)

    Cezzar, Ruknet

    1998-01-01

    We have outlined our work in the last half of the funding period. We have shown how a demo package for RESSAP using MPI can be done. However, we also mentioned the difficulties with the UNIX platform. We have reiterated some of the suggestions made during the presentation of progress at the Fourth Annual HBCU Conference. Although we have discussed, in some detail, how the TURBDES/PUMPDES software can be run in parallel using MPI, at present we are unable to experiment any further with either MPI or PVM. Due to X windows not being implemented, we are also not able to experiment further with XPVM, which, it will be recalled, has a nice GUI interface. There are also some concerns, on our part, about MPI being an appropriate tool. The best thing about MPI is that it is public domain. Although plenty of documentation exists for the intricacies of using MPI, little information is available on its actual implementations. Other than very typical, somewhat contrived examples, such as the Jacobi algorithm for solving Laplace's equation, there are few examples which can readily be applied to real situations, such as in our case. In effect, the review of the literature on both MPI and PVM, and there is a lot, indicates something similar to the enormous effort which was spent on LISP and LISP-like languages as tools for artificial intelligence research. During the development of a book on programming languages [12], when we searched the literature for very simple examples like taking averages, reading and writing records, multiplying matrices, etc., we could hardly find any! Yet, so much was said and done on that topic in academic circles. It appears that we faced the same problem with MPI, where despite significant documentation, we could not find even a simple example which supports coarse-grain parallelism involving only a few processes. From the foregoing, it appears that a new direction may be required for more productive research during the extension period (10/19/98 - 10
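
    The report's complaint about the scarcity of simple coarse-grain examples can be made concrete. Purely as an illustration, and in Python with mpi4py rather than the C/Fortran MPI bindings discussed in the report, a few processes can each average a local slice of data and reduce the partial results to rank 0 as follows; this is not the TURBDES/PUMPDES code itself.

      # Minimal coarse-grain MPI example: a few processes each average a slice of
      # data, then rank 0 combines the partial results. Run with, for example:
      #   mpiexec -n 4 python partial_average.py   (filename is illustrative)
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      size = comm.Get_size()

      # Each rank generates (or would read) its own slice of the data.
      local_data = [float(rank * 100 + i) for i in range(100)]
      local_sum = sum(local_data)
      local_count = len(local_data)

      # Combine the partial sums and counts on rank 0.
      total_sum = comm.reduce(local_sum, op=MPI.SUM, root=0)
      total_count = comm.reduce(local_count, op=MPI.SUM, root=0)

      if rank == 0:
          print(f"global average over {total_count} values: {total_sum / total_count:.2f}")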

  7. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for i) fast and efficient construction of executable device prototypes, ii) creation of a standard, reusable baseline software architecture for a particular device family, iii) formal verification of the design against safety requirements, and iv) creation of a safety framework that reduces verification costs for future versions of the device software. PMID:21142522

  8. Software environment for implementing engineering applications on MIMD computers

    NASA Technical Reports Server (NTRS)

    Lopez, L. A.; Valimohamed, K. A.; Schiff, S.

    1990-01-01

    In this paper the concept for a software environment for developing engineering application systems for multiprocessor hardware (MIMD) is presented. The philosophy employed is to solve the largest problems possible in a reasonable amount of time, rather than solve existing problems faster. In the proposed environment most of the problems concerning parallel computation and handling of large distributed data spaces are hidden from the application program developer, thereby facilitating the development of large-scale software applications. Applications developed under the environment can be executed on a variety of MIMD hardware; it protects the application software from the effects of a rapidly changing MIMD hardware technology.

  9. Based Aspect-oriented Petri Nets in Software Engineering

    NASA Astrophysics Data System (ADS)

    Hu, Wensong; Yang, Xingui; Zuo, Ke

    Aspect orientation (AO) is a new programming technology that is attracting increasing attention. This article builds on current expert work on object-oriented Petri nets (OO PN) by adding aspect-oriented thinking and, in combination with the software design and development cycle, gives the methods and steps for applying aspect-oriented OO PN in software engineering. The method is illustrated with an application example, the design and development of a government office system using AO PN, including sample object classes, a logging aspect, and application forms. Because aspects are isolated, coupling is reduced, and the AO PN approach allows the aspects used by different applications to be combined, enhancing code reusability. OO PN itself can effectively control the design and development of the software system, ensuring the reliability and standardization of the software system.

  10. PIRANHA: A Knowledge Discovery Engine Software

    Energy Science and Technology Software Center (ESTSC)

    2008-12-18

    Clustering textual information provides a number of useful capabilities for a PIRANHA user. Finding Similar Documents: a user can select a document of interest and then quickly find other documents that are a close match. For example, a user may have an e-mail of interest found on a captured computer; clustering allows similar e-mail on other computers to be quickly found, thus potentially establishing a link. Document Sampling: a set of documents will usually contain common themes or topics. Representative documents from these themes can be quickly found and presented to a user. For example, a captured hard drive may have thousands of documents representing many different topics, from finances to methods to favorite restaurants. Ten or twenty representative documents from these themes can be found and used by a user to quickly determine what areas need further attention. Classifying Documents: a set of representative documents can be used by a user to define a topic of interest, and then related documents can be added to that set. This allows a user to pick specific documents that are of interest, perhaps about nuclear materials, and then allows the software agents to automatically put related documents into the same nuclear materials folder.
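
    The record does not describe PIRANHA's internal algorithms in enough detail to reproduce them. As a hedged sketch of the 'finding similar documents' capability only, the following uses TF-IDF vectors with cosine similarity, a common technique that is not necessarily the one PIRANHA implements; the sample documents are invented.

      # Hedged sketch of "find similar documents" using TF-IDF + cosine similarity.
      # This is a common technique, not necessarily the one PIRANHA implements.
      from sklearn.feature_extraction.text import TfidfVectorizer
      from sklearn.metrics.pairwise import cosine_similarity

      documents = [
          "quarterly finances and budget spreadsheets",
          "wire transfer instructions for the overseas account",
          "recipes and favorite restaurants downtown",
          "notes on shipment of special nuclear materials",
          "storage inventory for special nuclear materials casks",
      ]

      vectorizer = TfidfVectorizer(stop_words="english")
      matrix = vectorizer.fit_transform(documents)        # one row per document

      def most_similar(query_index, top_n=2):
          scores = cosine_similarity(matrix[query_index], matrix).ravel()
          ranked = scores.argsort()[::-1]
          return [(int(i), float(scores[i])) for i in ranked if i != query_index][:top_n]

      if __name__ == "__main__":
          for idx, score in most_similar(3):              # query: the nuclear-materials e-mail
              print(f"doc {idx}: similarity {score:.2f} -> {documents[idx]!r}")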

  11. NIF Projects Controls and Information Systems Software Quality Assurance Plan

    SciTech Connect

    Fishler, B

    2011-03-18

    Quality achievement for the National Ignition Facility (NIF) and the National Ignition Campaign (NIC) is the responsibility of the NIF Projects line organization as described in the NIF and Photon Science Directorate Quality Assurance Plan (NIF QA Plan). This Software Quality Assurance Plan (SQAP) is subordinate to the NIF QA Plan and establishes quality assurance (QA) activities for the software subsystems within Controls and Information Systems (CIS). This SQAP implements an activity level software quality assurance plan for NIF Projects as required by the LLNL Institutional Software Quality Assurance Program (ISQAP). Planned QA activities help achieve, assess, and maintain appropriate quality of software developed and/or acquired for control systems, shot data systems, laser performance modeling systems, business applications, industrial control and safety systems, and information technology systems. The objective of this SQAP is to ensure that appropriate controls are developed and implemented for management planning, work execution, and quality assessment of the CIS organization's software activities. The CIS line organization places special QA emphasis on rigorous configuration control, change management, testing, and issue tracking to help achieve its quality goals.

  12. Milestones in Software Engineering and Knowledge Engineering History: A Comparative Review

    PubMed Central

    del Águila, Isabel M.; Palma, José; Túnez, Samuel

    2014-01-01

    We present a review of the historical evolution of software engineering, intertwining it with the history of knowledge engineering because “those who cannot remember the past are condemned to repeat it.” This retrospective represents a further step forward in understanding the current state of both types of engineering; history also has positive experiences, some of which we would like to remember and to repeat. The two types of engineering had parallel and divergent evolutions, but followed a similar pattern. We also define a set of milestones that represent a convergence or divergence of the software development methodologies. These milestones do not appear at the same time in software engineering and knowledge engineering, so lessons learned in one discipline can help in the evolution of the other. PMID:24624046

  13. Milestones in software engineering and knowledge engineering history: a comparative review.

    PubMed

    del Águila, Isabel M; Palma, José; Túnez, Samuel

    2014-01-01

    We present a review of the historical evolution of software engineering, intertwining it with the history of knowledge engineering because "those who cannot remember the past are condemned to repeat it." This retrospective represents a further step forward in understanding the current state of both types of engineering; history also has positive experiences, some of which we would like to remember and to repeat. The two types of engineering had parallel and divergent evolutions, but followed a similar pattern. We also define a set of milestones that represent a convergence or divergence of the software development methodologies. These milestones do not appear at the same time in software engineering and knowledge engineering, so lessons learned in one discipline can help in the evolution of the other. PMID:24624046

  14. Modelling of diesel engine fuelled with biodiesel using engine simulation software

    NASA Astrophysics Data System (ADS)

    Said, Mohd Farid Muhamad; Said, Mazlan; Aziz, Azhar Abdul

    2012-06-01

    This paper is about modelling of a diesel engine that operates using biodiesel fuels. The model is used to simulate or predict the performance and combustion of the engine by simplifying the geometry of the engine components in the software. The model is produced using one-dimensional (1D) engine simulation software called GT-Power. The fuel properties library in the software is expanded to include palm oil based biodiesel fuels. Experimental work is performed to investigate the effect of biodiesel fuels on the heat release profiles and the engine performance curves. The model is validated with experimental data and good agreement is observed. The simulation results show that combustion characteristics and engine performances differ when biodiesel fuels are used instead of no. 2 diesel fuel.

  15. Issues in Software Engineering of Relevance to Instructional Design

    ERIC Educational Resources Information Center

    Douglas, Ian

    2006-01-01

    Software engineering is popularly misconceived as being an upmarket term for programming. In a way, this is akin to characterizing instructional design as the process of creating PowerPoint slides. In both these areas, the construction of systems, whether they are learning or computer systems, is only one part of a systematic process. The most…

  16. Microsoft Excel Software Usage for Teaching Science and Engineering Curriculum

    ERIC Educational Resources Information Center

    Singh, Gurmukh; Siddiqui, Khalid

    2009-01-01

    In this article, our main objective is to present the use of Microsoft Software Excel 2007/2003 for teaching college and university level curriculum in science and engineering. In particular, we discuss two interesting and fascinating examples of interactive applications of Microsoft Excel targeted for undergraduate students in: 1) computational…

  17. PEOPLE IN PHYSICS: Interview with Scott Durow, Software Engineer, Oxford

    NASA Astrophysics Data System (ADS)

    Burton, Conducted by Paul

    1998-05-01

    Scott Durow was educated at Bootham School, York. He studied Physics, Mathematics and Chemistry to A-level and went on to Nottingham University to read Medical Physics. After graduating from Nottingham he embarked on his present career as a Software Engineer based in Oxford. He is a musician in his spare time, as a member of a band and playing the French horn.

  18. Application of Plagiarism Screening Software in the Chemical Engineering Curriculum

    ERIC Educational Resources Information Center

    Cooper, Matthew E.; Bullard, Lisa G.

    2014-01-01

    Plagiarism is an area of increasing concern for written ChE assignments, such as laboratory and design reports, due to ease of access to text and other materials via the internet. This study examines the application of plagiarism screening software to four courses in a university chemical engineering curriculum. The effectiveness of plagiarism…

  19. QUICK - An interactive software environment for engineering design

    NASA Technical Reports Server (NTRS)

    Skinner, David L.

    1989-01-01

    QUICK, an interactive software environment for engineering design, provides a programmable FORTRAN-like calculator interface to a wide range of data structures as well as both built-in and user created functions. QUICK also provides direct access to the operating systems of eight different machine architectures. The evolution of QUICK and a brief overview of the current version are presented.

  20. The Company Approach to Software Engineering Project Courses

    ERIC Educational Resources Information Center

    Broman, D.; Sandahl, K.; Abu Baker, M.

    2012-01-01

    Teaching larger software engineering project courses at the end of a computing curriculum is a way for students to learn some aspects of real-world jobs in industry. Such courses, often referred to as capstone courses, are effective for learning how to apply the skills they have acquired in, for example, design, test, and configuration management.…

  1. A Team Building Model for Software Engineering Courses Term Projects

    ERIC Educational Resources Information Center

    Sahin, Yasar Guneri

    2011-01-01

    This paper proposes a new model for team building, which enables teachers to build coherent teams rapidly and fairly for the term projects of software engineering courses. Moreover, the model can also be used to build teams for any type of project, if the team member candidates are students, or if they are inexperienced on a certain subject. The…

  2. Open Source Projects in Software Engineering Education: A Mapping Study

    ERIC Educational Resources Information Center

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study…

  3. [Stressor and stress reduction strategies for computer software engineers].

    PubMed

    Asakura, Takashi

    2002-07-01

    First, in this article we discuss 10 significant occupational stressors for computer software engineers, based on the review of the scientific literature on their stress and mental health. The stressors include 1) quantitative work overload, 2) time pressure, 3) qualitative work load, 4) speed and diffusion of technological innovation, and technological divergence, 5) low discretional power, 6) underdeveloped career pattern, 7) low earnings/reward from jobs, 8) difficulties in managing a project team for software development and establishing support system, 9) difficulties in customer relations, and 10) personality characteristics. In addition, we delineate their working and organizational conditions that cause such occupational stressors in order to find strategies to reduce those stressors in their workplaces. Finally, we suggest three stressor and stress reduction strategies for software engineers. PMID:12229224

  4. Quality Market: Design and Field Study of Prediction Market for Software Quality Control

    ERIC Educational Resources Information Center

    Krishnamurthy, Janaki

    2010-01-01

    Given the increasing competition in the software industry and the critical consequences of software errors, it has become important for companies to achieve high levels of software quality. While cost reduction and timeliness of projects continue to be important measures, software companies are placing increasing attention on identifying the user…

  5. Development and Application of New Quality Model for Software Projects

    PubMed Central

    Karnavel, K.; Dillibabu, R.

    2014-01-01

    The IT industry tries to employ a number of models to identify the defects in the construction of software projects. In this paper, we present COQUALMO and its limitations and aim to increase the quality without increasing the cost and time. The computation time, cost, and effort to predict the residual defects are very high; this was overcome by developing an appropriate new quality model named the software testing defect corrective model (STDCM). The STDCM was used to estimate the number of remaining residual defects in the software product; a few assumptions and the detailed steps of the STDCM are highlighted. The application of the STDCM is explored in software projects. The implementation of the model is validated using statistical inference, which shows there is a significant improvement in the quality of the software projects. PMID:25478594
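
    The STDCM's equations are not reproduced in this record. As a loose, hypothetical illustration of the kind of phase-based residual-defect estimate such models produce, the sketch below carries injected defects through the lifecycle with assumed removal efficiencies; all numbers are invented and the calculation is generic, not the STDCM.

      # Hypothetical phase-based residual-defect estimate. This is a generic
      # defect-containment calculation, not the STDCM itself (whose equations
      # are not given in the abstract above).
      PHASES = [
          # phase, defects introduced, removal efficiency of that phase's V&V
          ("requirements", 120, 0.55),
          ("design",        90, 0.60),
          ("coding",       200, 0.70),
          ("testing",        0, 0.85),
      ]

      def residual_defects(phases):
          escaped = 0.0
          for name, introduced, efficiency in phases:
              pool = escaped + introduced          # carried over + newly injected
              removed = pool * efficiency
              escaped = pool - removed
              print(f"{name:12s} pool={pool:6.1f} removed={removed:6.1f} escape={escaped:6.1f}")
          return escaped

      if __name__ == "__main__":
          print(f"estimated residual defects at release: {residual_defects(PHASES):.1f}")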

  6. Adoption of Requirements Engineering Practices in Malaysian Software Development Companies

    NASA Astrophysics Data System (ADS)

    Solemon, Badariah; Sahibuddin, Shamsul; Ghani, Abdul Azim Abd

    This paper presents exploratory survey results on the Requirements Engineering (RE) practices of some software development companies in Malaysia. The survey attempted to identify patterns of RE practices that the companies are implementing. The required information was obtained through mailed, self-administered questionnaires distributed to project managers and software developers working at software development companies across the country. The results showed that the overall adoption of RE practices in these companies is strong. However, the results also indicated that fewer companies in the survey use appropriate CASE tools or software to support their RE process and practices, define traceability policies, and maintain traceability manuals in their projects.

  7. Towards a mature measurement environment: Creating a software engineering research environment

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.

    1990-01-01

    Software engineering researchers are building tools and defining methods and models; however, there are problems with the nature and style of the research. The research is typically bottom-up, done in isolation so the pieces cannot be easily logically or physically integrated. A great deal of the research is essentially the packaging of a particular piece of technology with little indication of how the work would be integrated with other pieces of research. The research is not aimed at solving the real problems of software engineering, i.e., the development and maintenance of quality systems in a productive manner. The research results are not evaluated or analyzed via experimentation or refined and tailored to the application environment. Thus, it cannot be easily transferred into practice. Because of these limitations we have not been able to understand the components of the discipline as a coherent whole and the relationships between various models of the process and product. What is needed is a top-down experimental, evolutionary framework in which research can be focused, logically and physically integrated to produce quality software productively, and evaluated and tailored to the application environment. This implies the need for experimentation, which in turn implies the need for a laboratory that is associated with the artifact we are studying. This laboratory can only exist in an environment where software is being built, i.e., as part of a real software development and maintenance organization. Thus, we propose that Software Engineering Laboratory (SEL) type activities exist in all organizations to support software engineering research. We describe the SEL from a researcher's point of view, and discuss the corporate and government benefits of the SEL. The discussion focuses on the benefits to the research community.

  8. Quality engineering as a discipline of study.

    SciTech Connect

    Kolb, Rachel R.; Hoover, Marcey L.

    2012-12-01

    The current framework for quality scholarship in the United States ranges from the training and education of future quality engineers, managers, and professionals to focused and sustained research initiatives that, through academic institutions and other organizations, aim to improve the knowledge and application of quality across a variety of sectors. Numerous quality journals also provide a forum for professional dissemination of information.

  9. Software Engineering Laboratory (SEL) data base reporting software user's guide and system description. Volume 2: Program descriptions

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The structure and functions of each reporting software program for the Software Engineering Laboratory data base are described. Baseline diagrams, module descriptions, and listings of program generation files are included.

  10. Using software metrics and software reliability models to attain acceptable quality software for flight and ground support software for avionic systems

    NASA Technical Reports Server (NTRS)

    Lawrence, Stella

    1992-01-01

    This paper is concerned with methods of measuring and developing quality software. Reliable flight and ground support software is a highly important factor in the successful operation of the space shuttle program. Reliability is probably the most important of the characteristics inherent in the concept of 'software quality'. It is the probability of failure free operation of a computer program for a specified time and environment.
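
    To make the quoted definition concrete, here is a minimal Python sketch of the constant-failure-rate (exponential) reliability model R(t) = exp(-lambda*t); the failure rate and mission time are illustrative values, not figures from the paper.

        import math

        def reliability(failure_rate_per_hour, mission_hours):
            """R(t) = exp(-lambda * t): probability of failure-free operation
            over a mission of length t under a constant failure rate lambda."""
            return math.exp(-failure_rate_per_hour * mission_hours)

        # Illustrative values: 1 failure per 10,000 hours, 100-hour mission.
        print(f"{reliability(1e-4, 100):.4f}")  # 0.9900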

  11. Sharing Research Models: Using Software Engineering Practices for Facilitation.

    PubMed

    Bryant, Stephanie P; Solano, Eric; Cantor, Susanna; Cooley, Philip C; Wagener, Diane K

    2011-03-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems' behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations (such as nonintuitive user interface features and data input specifications) may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices (the iterative software development process, object-oriented methodology, and the Unified Modeling Language) and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  12. Sharing Research Models: Using Software Engineering Practices for Facilitation

    PubMed Central

    Bryant, Stephanie P.; Solano, Eric; Cantor, Susanna; Cooley, Philip C.; Wagener, Diane K.

    2011-01-01

    Increasingly, researchers are turning to computational models to understand the interplay of important variables on systems’ behaviors. Although researchers may develop models that meet the needs of their investigation, application limitations—such as nonintuitive user interface features and data input specifications—may limit the sharing of these tools with other research groups. By removing these barriers, other research groups that perform related work can leverage these work products to expedite their own investigations. The use of software engineering practices can enable managed application production and shared research artifacts among multiple research groups by promoting consistent models, reducing redundant effort, encouraging rigorous peer review, and facilitating research collaborations that are supported by a common toolset. This report discusses three established software engineering practices— the iterative software development process, object-oriented methodology, and Unified Modeling Language—and the applicability of these practices to computational model development. Our efforts to modify the MIDAS TranStat application to make it more user-friendly are presented as an example of how computational models that are based on research and developed using software engineering practices can benefit a broader audience of researchers. PMID:21687780

  13. TriBITS lifecycle model. Version 1.0, a lean/agile software lifecycle model for research-based computational science and engineering and applied mathematical software.

    SciTech Connect

    Willenbring, James M.; Bartlett, Roscoe Ainsworth; Heroux, Michael Allen

    2012-01-01

    Software lifecycles are becoming an increasingly important issue for computational science and engineering (CSE) software. The process by which a piece of CSE software begins life as a set of research requirements and then matures into a trusted high-quality capability is both commonplace and extremely challenging. Although an implicit lifecycle is obviously being used in any effort, the challenges of this process - respecting the competing needs of research vs. production - cannot be overstated. Here we describe a proposal for a well-defined software lifecycle process based on modern Lean/Agile software engineering principles. What we propose is appropriate for many CSE software projects that are initially heavily focused on research but also are expected to eventually produce usable high-quality capabilities. The model is related to TriBITS, a build, integration and testing system, which serves as a strong foundation for this lifecycle model, and aspects of this lifecycle model are ingrained in the TriBITS system. Here, we advocate three to four phases or maturity levels that address the appropriate handling of many issues associated with the transition from research to production software. The goals of this lifecycle model are to better communicate maturity levels with customers and to help to identify and promote Software Engineering (SE) practices that will help to improve productivity and produce better software. An important collection of software in this domain is Trilinos, which is used as the motivation and the initial target for this lifecycle model. However, many other related and similar CSE (and non-CSE) software projects can also make good use of this lifecycle model, especially those that use the TriBITS system. Indeed this lifecycle process, if followed, will enable large-scale sustainable integration of many complex CSE software efforts across several institutions.

  14. Information management in software engineering: A hypertext based approach

    SciTech Connect

    Garg, P.K.

    1989-01-01

    Large scale software systems require automated support for the communication and coordination of the cooperative effort needed to manage, create, store, print, and use large volumes of information. Hypertext is a flexible, general purpose, non-linear model for storing textual information. The thesis of this dissertation is that software engineering information should be managed using a hypertext based approach. Previous models of software engineering information do not provide the flexibility and generality of a hypertext model. We describe the design and implementation of an Intelligent Software Hypertext System (Ishys) for the management of software engineering information. In Ishys, Basic Templates (BTs) are nodes of a hypertext, which can have point-to-point and node-to-node links. Forms are sequences of BTs which can be used to model software life cycle documents. Compositions are arbitrary collections of BTs. Projects represent partitions of the hypertext such that each project has the same forms. Tasks are responsibilities of agents which use or produce forms. Tasks are composed of actions which use or produce BTs. Actions are composed of primitive actions which are commands provided through Ishys. In a general user mode, users create information in Ishys by instantiating BTs. In a super user mode, users define the projects, forms, BTs, tasks, actions, and primitive actions. Structured communication about BTs is supported through annotations attached to different points within BTs. To illustrate the flexibility and generality of the hypertext based approach we show how a hypertext can support generalizations, specializations, revisions, and aggregations. To illustrate the power of hypertext systems in supporting collaborative work, we discuss how a hypertext system can support two models for node composition: (1) the assembly line model, and (2) the exploratory assembly model.
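
    Purely as a reading aid, here is a minimal sketch of the node and document structure the abstract describes, with BTs as linked, annotated nodes and forms as sequences of BTs; the class and field names are hypothetical and this is not the actual Ishys implementation.

        from dataclasses import dataclass, field

        @dataclass
        class BasicTemplate:                 # a BT: one hypertext node
            name: str
            text: str = ""
            links: list = field(default_factory=list)        # node-to-node links
            annotations: list = field(default_factory=list)  # structured comments on the BT

            def link_to(self, other):
                self.links.append(other)

        @dataclass
        class Form:                          # a sequence of BTs, e.g. a life-cycle document
            title: str
            templates: list

        # A tiny example: a requirements form built from two linked BTs.
        overview = BasicTemplate("Overview", "System goals ...")
        constraints = BasicTemplate("Constraints", "Response time under 2 s ...")
        overview.link_to(constraints)
        requirements = Form("Requirements Document", [overview, constraints])
        print(len(requirements.templates))   # -> 2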

  15. Software engineering capability for Ada (GRASP/Ada Tool)

    NASA Technical Reports Server (NTRS)

    Cross, James H., II

    1995-01-01

    The GRASP/Ada project (Graphical Representations of Algorithms, Structures, and Processes for Ada) has successfully created and prototyped a new algorithmic-level graphical representation for Ada software, the Control Structure Diagram (CSD). The primary impetus for creation of the CSD was to improve the comprehension efficiency of Ada software and, as a result, improve reliability and reduce costs. The emphasis has been on the automatic generation of the CSD from Ada PDL or source code to support reverse engineering and maintenance. The CSD has the potential to replace traditional prettyprinted Ada source code. A new Motif-compliant graphical user interface has been developed for the GRASP/Ada prototype.

  16. Software Engineering Laboratory (SEL) relationships, models, and management rules

    NASA Technical Reports Server (NTRS)

    Decker, William; Hendrick, Robert; Valett, Jon D.

    1991-01-01

    This document captures over 50 individual Software Engineering Laboratory (SEL) research results, extracted from a review of published SEL documentation, that can be applied directly to managing software development projects. Four basic categories of results are defined and discussed: environment profiles, relationships, models, and management rules. In each category, each research result is presented on a single page that summarizes the result, lists potential uses of the result by managers, and references the original SEL documentation where the result was found. The document serves as a concise reference summary of applicable research for SEL managers.

  17. Open source data analysis and visualization software for optical engineering

    NASA Astrophysics Data System (ADS)

    Smith, Greg A.; Lewis, Benjamin J.; Palmer, Michael; Kim, Dae Wook; Loeff, Adrian R.; Burge, James H.

    2012-10-01

    SAGUARO is open-source software developed to simplify data assimilation, analysis, and visualization by providing a single framework for disparate data sources, from raw hardware measurements to optical simulation output. Developed with a user-friendly graphical interface in the MATLAB™ environment, SAGUARO is intended to be easy for the end user in search of useful optical information as well as for the developer wanting to add new modules and functionalities. We present here the flexibility of the SAGUARO software and discuss how it can be applied to the wider optical engineering community.

  18. A software engineering approach for medical workstations development.

    PubMed

    Jean, F C; Lavril, M; Lemaitre, D; Sauquet, D; Degoulet, P

    1994-01-01

    Multimedia medical workstations represent the natural tool for accessing the hospital information system environment. They are complex medical systems that have to gather, in a single framework, a large collection of components dealing with multimedia medical objects. To remain current with both medical practice and with advances in the computer science field, they have to allow the iterative addition of new functions to the set of existing ones. In this paper, after a survey of commonly required medical workstation functional components, we shall try to discuss how a software engineering approach can streamline the development of a medical workstation. Different software engineering tools needed to build the functional components of a workstation are described. Their integration in a single dedicated environment is considered through four perspectives: data, presentation, communication and control. Benefits and limitations of an object-oriented approach are discussed. PMID:8125636

  19. A survey of program slicing for software engineering

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    This research concerns program slicing, which is used as a tool for the program maintenance of software systems. Program slicing decreases the level of effort required to understand and maintain complex software systems. It was first designed as a debugging aid, but it has since been generalized into various tools and extended to include program comprehension, module cohesion estimation, requirements verification, dead code elimination, and the maintenance of several software systems, including reverse engineering, parallelization, portability, and reuse component generation. This paper seeks to address and define terminology, theoretical concepts, program representation, different program graphs, developments in static slicing, dynamic slicing, and semantics and mathematical models. Applications for conventional slicing are presented, along with a prognosis of future work in this field.
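
    As a quick illustration of the concept (not an example from the survey itself), the toy Python program below is annotated with the backward slice for the value of total printed at the end; statements that only affect product fall outside that slice.

        # Slicing criterion: the value of `total` printed at print(total).
        n = 4
        total = 0
        product = 1
        for i in range(1, n + 1):
            total += i        # in the backward slice for `total`
            product *= i      # NOT in the slice: it cannot affect `total`
        print(total)          # slicing criterion
        print(product)        # irrelevant to this criterion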

  20. Aspect-Oriented Model-Driven Software Product Line Engineering

    NASA Astrophysics Data System (ADS)

    Groher, Iris; Voelter, Markus

    Software product line engineering aims to reduce development time, effort, cost, and complexity by taking advantage of the commonality within a portfolio of similar products. The effectiveness of a software product line approach directly depends on how well feature variability within the portfolio is implemented and managed throughout the development lifecycle, from early analysis through maintenance and evolution. This article presents an approach that facilitates variability implementation, management, and tracing by integrating model-driven and aspect-oriented software development. Features are separated in models and composed using aspect-oriented composition techniques at the model level. Model transformations support the transition from problem-space to solution-space models. Aspect-oriented techniques enable the explicit expression and modularization of variability at the model, template, and code levels. The presented concepts are illustrated with a case study of a home automation system.

  1. Software quality assurance (SQA) for Savannah River reactors

    SciTech Connect

    Schaumann, C.M.

    1990-01-01

    Over the last 25 years, the Savannah River Site (SRS) has developed a strong Software Quality Assurance (SQA) program. It provides the information and management controls required of a high quality, auditable system. The SRS SQA program provides the framework to meet the requirements of increasing regulation.

  2. Software Engineering Laboratory (SEL) database organization and user's guide

    NASA Technical Reports Server (NTRS)

    So, Maria; Heller, Gerard; Steinberg, Sandra; Spiegel, Douglas

    1989-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base tables is described. In addition, techniques for accessing the database, through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL), are discussed.

  3. Software Engineering Laboratory Ada performance study: Results and implications

    NASA Technical Reports Server (NTRS)

    Booth, Eric W.; Stark, Michael E.

    1992-01-01

    The SEL is an organization sponsored by NASA/GSFC to investigate the effectiveness of software engineering technologies applied to the development of applications software. The SEL was created in 1977 and has three organizational members: NASA/GSFC, Systems Development Branch; The University of Maryland, Computer Sciences Department; and Computer Sciences Corporation, Systems Development Operation. The goals of the SEL are as follows: (1) to understand the software development process in the GSFC environments; (2) to measure the effect of various methodologies, tools, and models on this process; and (3) to identify and then to apply successful development practices. The activities, findings, and recommendations of the SEL are recorded in the Software Engineering Laboratory Series, a continuing series of reports that include the Ada Performance Study Report. This paper describes the background of Ada in the Flight Dynamics Division (FDD), the objectives and scope of the Ada Performance Study, the measurement approach used, the performance tests performed, the major test results, and the implications for future FDD Ada development efforts.

  4. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.

  5. Application of software engineering to development of reactor-safety codes

    SciTech Connect

    Wilburn, N P; Niccoli, L G

    1980-11-01

    As a result of the drastically increasing cost of software and the lack of an engineering approach, the technology of Software Engineering is being developed. Software Engineering provides an answer to the increasing cost of developing and maintaining software. It has been applied extensively in the business and aerospace communities and is just now being applied to the development of scientific software and, in particular, to the development of reactor safety codes at HEDL.

  6. Assuring quality in high-consequence engineering

    SciTech Connect

    Hoover, Marcey L.; Kolb, Rachel R.

    2014-03-01

    In high-consequence engineering organizations, such as Sandia, quality assurance may be heavily dependent on staff competency. Competency-dependent quality assurance models are at risk when the environment changes, as it has with increasing attrition rates, budget and schedule cuts, and competing program priorities. Risks in Sandia's competency-dependent culture can be mitigated through changes to hiring, training, and customer engagement approaches to manage people, partners, and products. Sandia's technical quality engineering organization has been able to mitigate corporate-level risks by driving changes that benefit all departments, and in doing so has assured Sandia's commitment to excellence in high-consequence engineering and national service.

  7. Software Process Improvement Initiatives Based on Quality Assurance Strategies: A QATAM Pilot Application

    NASA Astrophysics Data System (ADS)

    Winkler, Dietmar; Elberzhager, Frank; Biffl, Stefan; Eschbach, Robert

    Quality Assurance (QA) strategies, i.e., bundles of verification and validation approaches embedded within a balanced software process, can support project and quality managers in systematically planning and implementing improvement initiatives. New and modified processes and methods come up frequently that seem to be promising candidates for improvement. Nevertheless, the impact of processes and methods strongly depends on individual project contexts. A major challenge is how to systematically select and implement "best practices" for product construction, verification, and validation. In this paper we present the Quality Assurance Tradeoff Analysis Method (QATAM) that supports engineers in (a) systematically identifying candidate QA strategies and (b) evaluating QA strategy variants in a given project context. We evaluate feasibility and usefulness in a pilot application in a medium-size software engineering organization. The main results were that QATAM was considered useful for identifying and evaluating various improvement initiatives applicable to large organizations as well as to small and medium enterprises.

  8. Interferometer software development at JPL: using software engineering to reduce integration headaches

    NASA Astrophysics Data System (ADS)

    Deck, Michael D.; Hines, Braden E.

    1998-07-01

    This paper describes some of the software engineering practices that are being used by the Realtime Interferometer Control Systems Testbed (RICST) project at JPL to address integration and integratability issues. New documentation and review techniques based on formal methods permit early identification of potential interface problems. An incremental life cycle improves the manageability of the software development process. A 'cleanroom mindset' reduces the number of defects that have to be removed during integration and test. And team ownership of work products permits the project to grow while providing a variety of opportunities to team members. This paper presents data, including software metrics and analysis, from the first several incremental deliveries developed by the RICST project.

  9. Software forecasting as it is really done: A study of JPL software engineers

    NASA Technical Reports Server (NTRS)

    Griesel, Martha Ann; Hihn, Jairus M.; Bruno, Kristin J.; Fouser, Thomas J.; Tausworthe, Robert C.

    1993-01-01

    This paper presents a summary of the results to date of a Jet Propulsion Laboratory internally funded research task to study the costing process and parameters used by internally recognized software cost estimating experts. Protocol Analysis and Markov process modeling were used to capture software engineers' forecasting mental models. While there is significant variation between the mental models that were studied, it was nevertheless possible to identify a core set of cost forecasting activities, and it was also found that the mental models cluster around three forecasting techniques. Further partitioning of the mental models revealed a clustering of activities that is very suggestive of a forecasting lifecycle. The different forecasting methods identified were based on the use of multiple decomposition steps or multiple forecasting steps. The multiple forecasting steps involved either forecasting software size or an additional effort forecast. Virtually no subject used risk reduction steps in combination. The results of the analysis include: the identification of a core set of well defined costing activities, a proposed software forecasting life cycle, and the identification of several basic software forecasting mental models. The paper concludes with a discussion of the implications of the results for current individual and institutional practices.

  10. Software to model AXAF image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees

    1993-01-01

    This draft final report describes the work performed under this delivery order from May 1992 through June 1993. The purpose of this contract was to enhance and develop an integrated optical performance modeling software for complex x-ray optical systems such as AXAF. The GRAZTRACE program developed by the MSFC Optical Systems Branch for modeling VETA-I was used as the starting baseline program. The original program was a large single file program and, therefore, could not be modified very efficiently. The original source code has been reorganized, and a 'Make Utility' has been written to update the original program. The new version of the source code consists of 36 small source files to make it easier for the code developer to manage and modify the program. A user library has also been built and a 'Makelib' utility has been furnished to update the library. With the user library, the users can easily access the GRAZTRACE source files and build a custom library. A user manual for the new version of GRAZTRACE has been compiled. The plotting capability for the 3-D point spread functions and contour plots has been provided in the GRAZTRACE using the graphics package DISPLAY. The Graphics emulator over the network has been set up for programming the graphics routine. The point spread function and the contour plot routines have also been modified to display the plot centroid, and to allow the user to specify the plot range, and the viewing angle options. A Command Mode version of GRAZTRACE has also been developed. More than 60 commands have been implemented in a Code-V like format. The functions covered in this version include data manipulation, performance evaluation, and inquiry and setting of internal parameters. The user manual for these commands has been formatted as in Code-V, showing the command syntax, synopsis, and options. An interactive on-line help system for the command mode has also been accomplished to allow the user to find valid commands, command syntax

  11. An evaluation of expert systems for software engineering management

    NASA Technical Reports Server (NTRS)

    Ramsey, Connie Loggia; Basili, Victor R.

    1989-01-01

    The development of four separate prototype expert systems to aid in software engineering management is described. Given the values for certain metrics, these systems provide interpretations which explain any abnormal patterns of these values during the development of a software project. The four expert systems, which solve the same problem, were built using two different approaches to knowledge acquisition, a bottom-up approach and a top-down approach, and two different expert system methods, rule-based deduction and frame-based abduction. In a comparison to see which methods might better suit the needs of this field, it was found that the bottom-up approach led to better results than did the top-down approach, and the rule-based deduction systems using simple rules provided more complete and correct solutions than did the frame-based abduction systems.
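
    A hedged sketch of the rule-based deduction style described here follows; the metric names, thresholds, and interpretations are hypothetical and are not taken from the prototype systems.

        # Hypothetical rules mapping metric values to candidate interpretations
        # of abnormal patterns, in the spirit of the rule-based deduction systems.
        RULES = [
            (lambda m: m["error_rate"] > 5 and m["test_effort"] < 0.2,
             "high error rate with little testing: possibly inadequate testing"),
            (lambda m: m["changes_per_kloc"] > 10,
             "unusually many changes: possibly unstable requirements"),
        ]

        def interpret(metrics):
            hits = [message for rule, message in RULES if rule(metrics)]
            return hits or ["no abnormal pattern detected"]

        print(interpret({"error_rate": 7.0, "test_effort": 0.1, "changes_per_kloc": 4}))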

  12. Measuring the effect of conflict on software engineering teams.

    PubMed

    Karn, J S; Cowling, A J

    2008-05-01

    This article describes a project that aimed to uncover the effects of different forms of conflict on team performance during the important feasibility, requirements analysis, and design phases of software engineering (SE) projects. The research subjects were master of science students who were working to produce software commissioned by real-world clients. A template was developed that allowed researchers to record details of any conflicts that occurred. It was found that some forms of conflict were more damaging than others and that the frequency and intensity of specific conflicts are important factors to consider. The experience of the researchers when using the final template suggests that it is a valuable weapon to have in one's arsenal if one is interested in observing and recording the details of conflict in either SE teams or teams in different contexts. PMID:18522070

  13. Software Product Line Engineering Approach for Enhancing Agile Methodologies

    NASA Astrophysics Data System (ADS)

    Martinez, Jabier; Diaz, Jessica; Perez, Jennifer; Garbajosa, Juan

    One of the main principles of Agile methodologies is the early and continuous delivery of valuable software in short, time-framed iterations. After each iteration, a working product is delivered according to the requirements defined at the beginning of the iteration. Testing tools facilitate the task of checking whether the system provides the expected behavior according to the specified requirements. However, since testing tools need to be adapted in order to test the new working product in each iteration, significant effort has to be invested. This work presents a Software Product Line Engineering (SPLE) approach that allows flexibility in the adaptation of testing tools to the working products in an iterative way. A case study is also presented using PLUM (Product Line Unified Modeller) as the tool suite for SPL implementation and management.

  14. Autonomous Cryogenics Loading Operations Simulation Software: Knowledgebase Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S.

    2012-01-01

    The simulation software KATE (Knowledgebase Autonomous Test Engineer) is used to demonstrate the automatic identification of faults in a system. The ACLO (Autonomous Cryogenics Loading Operation) project uses KATE to monitor and find faults in the loading of cryogenics into a vehicle fuel tank. The KATE software interfaces with the IHM (Integrated Health Management) systems bus to communicate with other systems that are part of ACLO. One system that KATE uses the IHM bus to communicate with is AIS (Advanced Inspection System). KATE sends messages to AIS when an anomaly is detected. These messages include visual inspection requests for specific valves and pressure gauges, and control messages to have AIS open or close manual valves. My goals include implementing the connection to the IHM bus within KATE and for the AIS project. I will also be working on implementing changes to KATE's UI and implementing the physics objects in KATE that will model portions of the cryogenics loading operation.

  15. Engine structures modeling software system: Computer code. User's manual

    NASA Technical Reports Server (NTRS)

    1992-01-01

    ESMOSS is a specialized software system for the construction of geometric descriptive and discrete analytical models of engine parts, components and substructures which can be transferred to finite element analysis programs such as NASTRAN. The software architecture of ESMOSS is designed in modular form with a central executive module through which the user controls and directs the development of the analytical model. Modules consist of a geometric shape generator, a library of discretization procedures, interfacing modules to join both geometric and discrete models, a deck generator to produce input for NASTRAN and a 'recipe' processor which generates geometric models from parametric definitions. ESMOSS can be executed both in interactive and batch modes. Interactive mode is considered to be the default mode and that mode will be assumed in the discussion in this document unless stated otherwise.

  16. Changes and challenges in the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Pajerski, Rose

    1994-01-01

    Since 1976, the Software Engineering Laboratory (SEL) has been dedicated to understanding and improving the way in which one NASA organization, the Flight Dynamics Division (FDD), develops, maintains, and manages complex flight dynamics systems. The SEL is composed of three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation. During the past 18 years, the SEL's overall goal has remained the same: to improve the FDD's software products and processes in a measured manner. This requires that each development and maintenance effort be viewed, in part, as a SEL experiment which examines a specific technology or builds a model of interest for use on subsequent efforts. The SEL has undertaken many technology studies while developing operational support systems for numerous NASA spacecraft missions.

  17. Quality Dimensions of Internet Search Engines.

    ERIC Educational Resources Information Center

    Xie, M.; Wang, H.; Goh, T. N.

    1998-01-01

    Reviews commonly used search engines (AltaVista, Excite, infoseek, Lycos, HotBot, WebCrawler), focusing on existing comparative studies; considers quality dimensions from the customer's point of view based on a SERVQUAL framework; and groups these quality expectations in five dimensions: tangibles, reliability, responsiveness, assurance, and…

  18. Open source software engineering for geoscientific modeling applications

    NASA Astrophysics Data System (ADS)

    Bilke, L.; Rink, K.; Fischer, T.; Kolditz, O.

    2012-12-01

    ... static analysis tools), informs developers of errors (via email), generates source code documentation, and provides binaries for end users. These points enhance the software development process considerably. Firstly, platform independence is maintained. Additionally, errors in the source code can be tracked down easily. Lastly, developers gain access to code analysis tools and up-to-date source code documentation.
    (Figure: Overview of the OpenGeoSys software engineering workflow)

  19. Towards Multi-Method Research Approach in Empirical Software Engineering

    NASA Astrophysics Data System (ADS)

    Mandić, Vladimir; Markkula, Jouni; Oivo, Markku

    This paper presents the results of a literature analysis on empirical research approaches in Software Engineering (SE). The analysis explores reasons why traditional methods, such as statistical hypothesis testing and experiment replication, are weakly utilized in the field of SE. It appears that the basic assumptions and preconditions of the traditional methods contradict the actual situation in SE. Furthermore, we have identified the main issues that should be considered by the researcher when selecting a research approach. In view of the reasons for the weak utilization of traditional methods, we propose stronger use of a Multi-Method approach with Pragmatism as the philosophical standpoint.

  20. Modelling Gaia CCD pixels with Silvaco 3D engineering software

    NASA Astrophysics Data System (ADS)

    Seabroke, G. M.; Prod'Homme, T.; Hopkinson, G.; Burt, D.; Robbins, M.; Holland, A.

    2011-02-01

    Gaia will only achieve its unprecedented measurement accuracy requirements with detailed calibration and correction for radiation damage. We present our Silvaco 3D engineering software model of the Gaia CCD pixel and two of its applications for Gaia: (1) physically interpreting supplementary buried channel (SBC) capacity measurements (pocket-pumping and first pixel response) in terms of e2v manufacturing doping alignment tolerances; and (2) deriving electron densities within a charge packet as a function of the number of constituent electrons and 3D position within the charge packet as input to microscopic models being developed to simulate radiation damage.

  1. TRITON: graphic software for rational engineering of enzymes.

    PubMed

    Damboský, J; Prokop, M; Koca, J

    2001-01-01

    Engineering of the catalytic properties of enzymes requires knowledge about amino acid residues interacting with the transition state of the substrate. TRITON is a graphic software package for modelling enzymatic reactions for the analysis of essential interactions between the enzyme and its substrate and for in silico construction of protein mutants. The reactions are modelled using semi-empirical quantum-mechanic methods and the protein mutants are constructed by homology modelling. The users are guided through the calculation and data analysis by wizards. PMID:11252253

  2. The study on network security based on software engineering

    NASA Astrophysics Data System (ADS)

    Jia, Shande; Ao, Qian

    2012-04-01

    Developing a security policy (SP) is a sensitive task, because the SP itself can lead to security weaknesses if it does not conform to the security properties. Hence, appropriate techniques are necessary to overcome such problems. These techniques must accompany the policy throughout its deployment phases. The main contribution of this paper is, then, the proposition of three of these activities: validation, test, and multi-SP conflict management. Our techniques are inspired by the well established techniques of software engineering, for which we have found some similarities with the security domain.

  3. The image related services of the HELIOS software engineering environment.

    PubMed

    Engelmann, U; Meinzer, H P; Schröter, A; Günnel, U; Demiris, A M; Makabe, M; Evers, H; Jean, F C; Degoulet, P

    1995-01-01

    This paper describes the approach of the European HELIOS project to integrate image processing tools into ward information systems. The image processing tools are the result of basic research in image analysis in the Department of Medical and Biological Informatics at the German Cancer Research Center. These tools for the analysis of two-dimensional images and three-dimensional data volumes, with 3D reconstruction and visualization, are part of the Image Related Services of HELIOS. The HELIOS software engineering environment allows the image processing functionality to be used in integrated applications. PMID:7743775

  4. Software metrics: The key to quality software on the NCC project

    NASA Technical Reports Server (NTRS)

    Burns, Patricia J.

    1993-01-01

    Network Control Center (NCC) Project metrics are captured during the implementation and testing phases of the NCCDS software development lifecycle. The metrics data collection and reporting function has interfaces with all elements of the NCC project. Close collaboration with all project elements has resulted in the development of a defined and repeatable set of metrics processes. The resulting data are used to plan and monitor release activities on a weekly basis. The use of graphical outputs facilitates the interpretation of progress and status. The successful application of metrics throughout the NCC project has been instrumental in the delivery of quality software. The use of metrics on the NCC Project supports the needs of the technical and managerial staff. This paper describes the project, the functions supported by metrics, the data that are collected and reported, how the data are used, and the improvements in the quality of deliverable software since the metrics processes and products have been in use.

  5. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    NASA Technical Reports Server (NTRS)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic execution (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several promising new avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
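
    For readers unfamiliar with SMT solving, the snippet below checks a toy constraint with Z3's Python bindings (the z3-solver package); the constraint is illustrative and not drawn from the talk.

        # pip install z3-solver
        from z3 import Int, Solver, sat

        x, y = Int("x"), Int("y")
        s = Solver()
        s.add(x > 2, y < 10, x + 2 * y == 7)   # a toy "verification condition"

        if s.check() == sat:
            print(s.model())                   # one satisfying assignment, e.g. x = 7, y = 0
        else:
            print("unsatisfiable")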

  6. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS

    PubMed Central

    Rai, Arti K.

    2014-01-01

    The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office (“PTO”) could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software. PMID:25221346

  7. IMPROVING (SOFTWARE) PATENT QUALITY THROUGH THE ADMINISTRATIVE PROCESS.

    PubMed

    Rai, Arti K

    2013-11-24

    The available evidence indicates that patent quality, particularly in the area of software, needs improvement. This Article argues that even an agency as institutionally constrained as the U.S. Patent and Trademark Office ("PTO") could implement a portfolio of pragmatic, cost-effective quality improvement strategies. The argument in favor of these strategies draws upon not only legal theory and doctrine but also new data from a PTO software examination unit with relatively strict practices. Strategies that revolve around Section 112 of the patent statute could usefully be deployed at the initial examination stage. Other strategies could be deployed within the new post-issuance procedures available to the agency under the America Invents Act. Notably, although the strategies the Article discusses have the virtue of being neutral as to technology, they are likely to have a very significant practical impact in the area of software. PMID:25221346

  8. Early experiences building a software quality prediction model

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.; Evanco, W. M.; Smith, M. C.

    1990-01-01

    Early experiences building a software quality prediction model are discussed. The overall research objective is to establish a capability to project a software system's quality from an analysis of its design. The technical approach is to build multivariate models for estimating reliability and maintainability. Data from 21 Ada subsystems were analyzed to test hypotheses about various design structures leading to failure-prone or unmaintainable systems. Current design variables highlight the interconnectivity and visibility of compilation units. Other model variables provide for the effects of reusability and software changes. Reported results are preliminary because additional project data is being obtained and new hypotheses are being developed and tested. Current multivariate regression models are encouraging, explaining 60 to 80 percent of the variation in error density of the subsystems.
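
    To illustrate the kind of multivariate model described, the sketch below fits an ordinary least squares model of error density against two design metrics and reports the fraction of variation explained; the data are entirely made up and do not come from the 21 Ada subsystems.

        import numpy as np

        # Illustrative data only: one row per subsystem, columns = two design metrics
        # (e.g., interconnectivity and visibility of compilation units).
        X = np.array([[3.0, 1.2],
                      [5.5, 2.0],
                      [2.1, 0.8],
                      [7.3, 3.1],
                      [4.4, 1.5]])
        y = np.array([1.1, 2.0, 0.7, 3.2, 1.6])   # error density per subsystem

        A = np.column_stack([np.ones(len(X)), X])  # add an intercept column
        coef, *_ = np.linalg.lstsq(A, y, rcond=None)

        pred = A @ coef
        r_squared = 1 - np.sum((y - pred) ** 2) / np.sum((y - y.mean()) ** 2)
        print(f"R^2 = {r_squared:.3f}")            # fraction of variation explained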

  9. Orthographic Software Modelling: A Novel Approach to View-Based Software Engineering

    NASA Astrophysics Data System (ADS)

    Atkinson, Colin

    The need to support multiple views of complex software architectures, each capturing a different aspect of the system under development, has been recognized for a long time. Even the very first object-oriented analysis/design methods such as the Booch method and OMT supported a number of different diagram types (e.g. structural, behavioral, operational) and subsequent methods such as Fusion, Kruchten's 4+1 views and the Rational Unified Process (RUP) have added many more views over time. Today's leading modeling languages such as the UML and SysML are also oriented towards supporting different views (i.e. diagram types), each able to portray a different facet of a system's architecture. More recently, so-called enterprise architecture frameworks such as the Zachman Framework, TOGAF and RM-ODP have become popular. These add a whole set of new non-functional views to the views typically emphasized in traditional software engineering environments.

  10. Evaluating software development by analysis of changes - Some data from the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Weiss, D. M.; Basili, V. R.

    1985-01-01

    Basili and Weiss (1984) have discussed an approach for obtaining valid data which may be used to evaluate software development methodologies in a production environment. The methodology consists of five elements, including the identification of goals, the determination of questions of interest from the goals, the development of a data collection form, the development of data collection procedures, and the validation and analysis of the data. The current investigation is concerned with the presentation of the results from such an evaluation. The presented data were collected as part of studies reported by Basili et al. (1977). These studies had been conducted by NASA's Software Engineering Laboratory (SEL). Attention is given to an overview of the SEL, the application of the considered methodology, the results of a data analysis, and conclusions about the SEL environment.

  11. The cleanroom case study in the Software Engineering Laboratory: Project description and early analysis

    NASA Technical Reports Server (NTRS)

    Green, Scott; Kouchakdjian, Ara; Basili, Victor; Weidow, David

    1990-01-01

    This case study analyzes the application of the cleanroom software development methodology to the development of production software at the NASA/Goddard Space Flight Center. The cleanroom methodology emphasizes human discipline in program verification to produce reliable software products that are right the first time. Preliminary analysis of the cleanroom case study shows that the method can be applied successfully in the FDD environment and may increase staff productivity and product quality. Compared to typical Software Engineering Laboratory (SEL) activities, there is evidence of lower failure rates, a more complete and consistent set of inline code documentation, a different distribution of phase effort activity, and a different growth profile in terms of lines of code developed. The major goals of the study were to: (1) assess the process used in the SEL cleanroom model with respect to team structure, team activities, and effort distribution; (2) analyze the products of the SEL cleanroom model and determine the impact on measures of interest, including reliability, productivity, overall life-cycle cost, and software quality; and (3) analyze the residual products in the application of the SEL cleanroom model, such as fault distribution, error characteristics, system growth, and computer usage.

  12. GenePRIMP: A software quality control tool

    SciTech Connect

    Amrita Pati

    2010-05-05

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  13. GenePRIMP: A software quality control tool

    ScienceCinema

    Amrita Pati

    2010-09-01

    Amrita Pati of the DOE Joint Genome Institute's Genome Biology group describes the software tool GenePRIMP and how it fits into the quality control pipeline for microbial genomics. Further details regarding GenePRIMP appear in a paper published online May 2, 2010 in Nature Methods.

  14. Software for Preprocessing Data from Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2004-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris). EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PV-WAVE based plotting software.
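
    The abstract does not give the SSC file format or calibration scheme, so the sketch below only illustrates the general idea behind EUGEN's conversion step, applying per-channel calibration coefficients to raw voltages to obtain engineering units; the channel names and coefficients are hypothetical.

        # Hypothetical calibration table: eng_units = offset + gain * volts, per channel.
        CALIBRATION = {
            "chamber_pressure": (0.0, 250.0),    # psi per volt (illustrative)
            "fuel_temperature": (-50.0, 20.0),   # deg F (illustrative)
        }

        def to_engineering_units(channel, volts):
            offset, gain = CALIBRATION[channel]
            return offset + gain * volts

        samples = [("chamber_pressure", 2.4), ("fuel_temperature", 3.1)]
        for channel, volts in samples:
            print(channel, round(to_engineering_units(channel, volts), 1))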

  15. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2002-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC "E" test-stand complex and utilize the SSC file format. The programs are the following: 1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel; 2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot (a free graphing program written by Rick Paris); and 3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PVWAVE based plotting software.

  16. Software for Preprocessing Data From Rocket-Engine Tests

    NASA Technical Reports Server (NTRS)

    Cheng, Chiu-Fu

    2003-01-01

    Three computer programs have been written to preprocess digitized outputs of sensors during rocket-engine tests at Stennis Space Center (SSC). The programs apply exclusively to the SSC E test-stand complex and utilize the SSC file format. The programs are the following: (1) Engineering Units Generator (EUGEN) converts sensor-output-measurement data to engineering units. The inputs to EUGEN are raw binary test-data files, which include the voltage data, a list identifying the data channels, and time codes. EUGEN effects conversion by use of a file that contains calibration coefficients for each channel. (2) QUICKLOOK enables immediate viewing of a few selected channels of data, in contradistinction to viewing only after post-test processing (which can take 30 minutes to several hours depending on the number of channels and other test parameters) of data from all channels. QUICKLOOK converts the selected data into a form in which they can be plotted in engineering units by use of Winplot. (3) EUPLOT provides a quick means for looking at data files generated by EUGEN without the necessity of relying on the PVWAVE based plotting software.

  17. BIOESTIM: software for automatic design of estimators in bioprocess engineering.

    PubMed

    Farza, M; Chéruy, A

    1994-09-01

    This paper describes BIOESTIM, a software package devoted to on-line estimation in bioprocess engineering. BIOESTIM enables bioengineers to automatically design state and parameter estimators from a minimal knowledge of the process kinetics. Such estimators allow the development of software sensors capable of coping with the lack of reliable instrumentation suited to real-time monitoring. The estimator-building procedure in BIOESTIM starts from a dynamical material balance model of the bioprocess. This model, supplied by the user, is then completed by other information with no requirement for numerical values: the user has only to specify available measurements, coupled reactions and the known yield coefficients. On the basis of this knowledge, BIOESTIM performs symbolic algebraic manipulations on the model in order to study estimation possibilities and check the identifiability of yield coefficients. When the design of an estimator is possible, the corresponding equations are automatically generated. Moreover, these estimators are stored in a user-specified file which is automatically interfaced with a specialized simulation software package including data treatment and numerical integration packages. Thus, the user can simulate the estimator performances under various operational conditions using available experimental measurements. A typical example dealing with microbial growth and biosynthesis reactions is given in order to illustrate the main functional capabilities of BIOESTIM. BIOESTIM has been designed and written in a modular fashion. The module dealing with estimator design makes use of symbolic computation; it is written in Mathematica and runs on every computer on which this language is available. PMID:7828062

  18. Open source projects in software engineering education: a mapping study

    NASA Astrophysics Data System (ADS)

    Nascimento, Debora M. C.; Almeida Bittencourt, Roberto; Chavez, Christina

    2015-01-01

    Context: It is common practice in academia to have students work with "toy" projects in software engineering (SE) courses. One way to make such courses more realistic and reduce the gap between academic courses and industry needs is getting students involved in open source projects (OSP) with faculty supervision. Objective: This study aims to summarize the literature on how OSP have been used to facilitate students' learning of SE. Method: A systematic mapping study was undertaken by identifying, filtering and classifying primary studies using a predefined strategy. Results: 72 papers were selected and classified. The main results were: (a) most studies focused on comprehensive SE courses, although some dealt with specific areas; (b) the most prevalent approach was the traditional project method; (c) studies' general goals were: learning SE concepts and principles by using OSP, learning open source software or both; (d) most studies tried out ideas in regular courses within the curriculum; (e) in general, students had to work with predefined projects; (f) there was a balance between approaches where instructors had either inside control or no control on the activities performed by students; (g) when learning was assessed, software artefacts, reports and presentations were the main instruments used by teachers, while surveys were widely used for students' self-assessment; (h) most studies were published in the last seven years. Conclusions: The resulting map gives an overview of the existing initiatives in this context and shows gaps where further research can be pursued.

  19. Continuous integration and quality control for scientific software

    NASA Astrophysics Data System (ADS)

    Neidhardt, A.; Ettl, M.; Brisken, W.; Dassing, R.

    2013-08-01

    Modern software has to be stable, portable, fast and reliable. This is becoming more and more important for scientific software as well. But this requires a sophisticated way to inspect, check and evaluate the quality of source code with a suitable, automated infrastructure. A centralized server with a software repository and a version control system is one essential part, to manage the code basis and to control the different development versions. While each project can be compiled separately, the whole code basis can also be compiled with one central “Makefile”. This is used to create automated, nightly builds. Additionally, all sources are inspected automatically with static code analysis and inspection tools, which check for well-known error situations, memory and resource leaks, performance issues, and style issues. In combination with an automatic documentation generator it is possible to create the developer documentation directly from the code and the inline comments. All reports and generated information are presented as HTML pages on a Web server. Because this environment has increased the stability and quality of the software at the Geodetic Observatory Wettzell tremendously, it is now also available for scientific communities. One regular customer is already the developer group of the DiFX software correlator project.
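
    As a rough sketch of such a nightly pipeline (not the Wettzell infrastructure; the analysis and documentation commands below are placeholders), a driver script might look like this:

        import datetime
        import pathlib
        import subprocess

        LOG_DIR = pathlib.Path("nightly-logs")
        LOG_DIR.mkdir(exist_ok=True)
        stamp = datetime.date.today().isoformat()

        steps = {
            "build": ["make", "all"],                    # one central Makefile
            "static-analysis": ["run-static-analysis"],  # placeholder command
            "docs": ["generate-docs"],                   # placeholder command
        }

        for name, cmd in steps.items():
            try:
                result = subprocess.run(cmd, capture_output=True, text=True)
            except FileNotFoundError:
                print(name, "SKIPPED (command not available)")
                continue
            # Keep a per-step log so the results can be published on a web server.
            (LOG_DIR / f"{stamp}-{name}.log").write_text(result.stdout + result.stderr)
            print(name, "OK" if result.returncode == 0 else "FAILED")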

  20. Shaping Software Engineering Curricula Using Open Source Communities: A Case Study

    ERIC Educational Resources Information Center

    Bowring, James; Burke, Quinn

    2016-01-01

    This paper documents four years of a novel approach to teaching a two-course sequence in software engineering as part of the ABET-accredited computer science curriculum at the College of Charleston. This approach is team-based and centers on learning software engineering in the context of open source software projects. In the first course, teams…

  1. The software product assurance metrics study: JPL's software systems quality and productivity

    NASA Technical Reports Server (NTRS)

    Bush, Marilyn W.

    1989-01-01

    The findings are reported of the Jet Propulsion Laboratory (JPL)/Software Product Assurance (SPA) Metrics Study, conducted as part of a larger JPL effort to improve software quality and productivity. Until recently, no comprehensive data had been assembled on how JPL manages and develops software-intensive systems. The first objective was to collect data on software development from as many projects and for as many years as possible. Results from five projects are discussed. These results reflect 15 years of JPL software development, representing over 100 data points (systems and subsystems), over a third of a billion dollars, over four million lines of code and 28,000 person months. Analysis of this data provides a benchmark for gauging the effectiveness of past, present and future software development work. In addition, the study is meant to encourage projects to record existing metrics data and to gather future data. The SPA long term goal is to integrate the collection of historical data and ongoing project data with future project estimations.

  2. In the soft-to-hard technical spectrum: Where is software engineering?

    NASA Technical Reports Server (NTRS)

    Leibfried, Theodore F.; Macdonald, Robert B.

    1992-01-01

    In the computer journals and tabloids, a plethora of articles have been written about the software engineering field. But while many authors advocate the need for an engineering approach to software development, it is striking how many treat the subject of software engineering without adequately addressing the fundamentals of what engineering as a discipline consists of. A discussion is presented of the various related facets of this issue in a logical framework to advance the thesis that the software development process is necessarily an engineering process. The purpose is to examine in more detail whether or not the design and development of software for digital computer processing systems should be both viewed and treated as a legitimate field of professional engineering. The type of academic and professional-level education programs that would be required to support a software engineering discipline is also examined.

  3. Development of an Ada programming support environment database SEAD (Software Engineering and Ada Database) administration manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

    Software Engineering and Ada Database (SEAD) was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities which are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce duplication of effort while improving quality in the development of future software systems. SEAD data is organized into five major areas: information regarding education and training resources which are relevant to the life cycle of Ada-based software engineering projects such as those in the Space Station program; research publications relevant to NASA projects such as the Space Station Program and conferences relating to Ada technology; the latest progress reports on Ada projects completed or in progress both within NASA and throughout the free world; Ada compilers and other commercial products that support Ada software development; and reusable Ada components generated both within NASA and from elsewhere in the free world. This classified listing of reusable components shall include descriptions of tools, libraries, and other components of interest to NASA. Sources for the data include technical newsletters and periodicals, conference proceedings, the Ada Information Clearinghouse, product vendors, and project sponsors and contractors.

  4. Software Engineering Support Activities for Very Small Entities

    NASA Astrophysics Data System (ADS)

    Ribaud, Vincent; Saliou, Philippe; O'Connor, Rory V.; Laporte, Claude Y.

    The emerging ISO/IEC 29110 standard, Lifecycle Profiles for Very Small Entities, has at its core a Management and Engineering Guide targeted at very small entities (enterprises, organizations, departments or projects) of up to 25 people, to assist them in unlocking the potential benefits of using standards specifically designed to address their needs. The developers of the standard, ISO/IEC JTC1/SC7 Working Group 24 (WG24), recommend the use of pilot projects as a means to trial the adoption of the new international standard in small organisations. Accordingly, an ISO/IEC 29110 pilot project has been established between the Software Engineering group of Brest University and a 14-person company with the aim of establishing an engineering discipline for a new web-based project. This paper details the lessons learned from the pilot project and, based on our experiences with using ISO/IEC 29110, identifies a potential deficiency; accordingly, we propose a new process area, "Infrastructure and Support", for inclusion in a future evolution of the ISO/IEC 29110 Process Profiles.

  5. The application of formal software engineering methods to the unattended and remote monitoring software suite at Los Alamos National Laboratory

    SciTech Connect

    Determan, John Clifford; Longo, Joseph F; Michel, Kelly D

    2009-01-01

    The Unattended and Remote Monitoring (UNARM) system is a collection of specialized hardware and software used by the International Atomic Energy Agency (IAEA) to institute nuclear safeguards at many nuclear facilities around the world. The hardware consists of detectors, instruments, and networked computers for acquiring various forms of data, including but not limited to radiation data, global position coordinates, camera images, isotopic data, and operator declarations. The software provides two primary functions: the secure and reliable collection of this data from the instruments and the ability to perform an integrated review and analysis of the disparate data sources. Several years ago, the team responsible for maintaining the software portion of the UNARM system began the process of formalizing its operations. These formal operations include a configuration management system, a change control board, an issue tracking system, and extensive formal testing for both functionality and reliability. Functionality is tested with formal test cases chosen to fully represent the data types and methods of analysis that will be commonly encountered. Reliability is tested with iterative, concurrent testing in which up to five analyses are executed simultaneously for thousands of cycles. Iterative concurrent testing helps ensure that there are no resource conflicts or leaks when multiple system components are in use simultaneously. The goal of this work is to provide a high-quality, reliable product, commensurate with the criticality of the application. Testing results demonstrating that this goal has been achieved will be presented, along with the impact of introducing a formal software engineering framework on the UNARM product.
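
    The reliability-testing pattern described, many cycles of several simultaneous analyses with any error treated as a failure, can be sketched in a few lines of Python; run_analysis below is only a placeholder for a real UNARM review/analysis task:

      import concurrent.futures, random, time

      def run_analysis(case_id):
          """Placeholder for one integrated review/analysis run."""
          time.sleep(random.uniform(0.001, 0.005))   # simulate work
          return case_id

      CONCURRENT_ANALYSES = 5    # "up to five analyses executed simultaneously"
      CYCLES = 1000              # thousands of cycles in the real campaign

      with concurrent.futures.ThreadPoolExecutor(CONCURRENT_ANALYSES) as pool:
          for cycle in range(CYCLES):
              futures = [pool.submit(run_analysis, i)
                         for i in range(CONCURRENT_ANALYSES)]
              for f in concurrent.futures.as_completed(futures):
                  f.result()     # re-raises any exception, i.e. a test failure
      print("concurrent stress test completed without errors")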

  6. A Role-Playing Game for a Software Engineering Lab: Developing a Product Line

    ERIC Educational Resources Information Center

    Zuppiroli, Sara; Ciancarini, Paolo; Gabbrielli, Maurizio

    2012-01-01

    Software product line development refers to software engineering practices and techniques for creating families of similar software systems from a basic set of reusable components, called shared assets. Teaching how to deal with software product lines in a university lab course is a challenging task, because there are several practical issues that…

  7. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings Appendices

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: flight critical software; management of real-time Ada; software reuse; megaprogramming software; Ada net; POSIX and Ada integration in the Space Station Freedom Program; and assessment of formal methods for trustworthy computer systems.

  8. First statistical analysis of Geant4 quality software metrics

    NASA Astrophysics Data System (ADS)

    Ronchieri, Elisabetta; Grazia Pia, Maria; Giacomini, Francesco

    2015-12-01

    Geant4 is a simulation system of particle transport through matter, widely used in several experimental areas from high energy physics and nuclear experiments to medical studies. Some of its applications may involve critical use cases; therefore they would benefit from an objective assessment of the software quality of Geant4. In this paper, we provide a first statistical evaluation of software metrics data related to a set of Geant4 physics packages. The analysis aims at identifying risks for Geant4 maintainability, which would benefit from being addressed at an early stage. The findings of this pilot study set the grounds for further extensions of the analysis to the whole of Geant4 and to other high energy physics software systems.
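
    The statistical screening of code metrics that such a study performs can be pictured with a toy example; the package names below are real Geant4 domains, but the complexity values are made up for illustration:

      import statistics

      # hypothetical per-class cyclomatic-complexity samples for three packages
      metrics = {"electromagnetic": [3, 5, 8, 21, 4, 35, 6],
                 "hadronic":        [4, 7, 9, 12, 44, 5, 61],
                 "geometry":        [2, 3, 4, 5, 6, 7, 3]}

      THRESHOLD = 10   # common rule-of-thumb limit for cyclomatic complexity

      for package, values in metrics.items():
          risky = sum(v > THRESHOLD for v in values)
          print(f"{package:16s} median={statistics.median(values):5.1f} "
                f"p90={statistics.quantiles(values, n=10)[-1]:6.1f} "
                f"over-threshold={risky}/{len(values)}")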

  9. RICIS Software Engineering 90 Symposium: Aerospace Applications and Research Directions Proceedings

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Papers presented at RICIS Software Engineering Symposium are compiled. The following subject areas are covered: synthesis - integrating product and process; Serpent - a user interface management system; prototyping distributed simulation networks; and software reuse.

  10. Knowledge engineering software: A demonstration of a high end tool

    SciTech Connect

    Salzman, G.C.; Krall, R.B.; Marinuzzi, J.G.

    1987-01-01

    Many investigators wanting to apply knowledge-based systems (KBS) as consultants for cancer diagnosis have turned to tools running on personal computers. While some of these tools serve well for small tasks, they lack the power available with the high-end KBS tools such as KEE (Knowledge Engineering Environment) and ART (Automated Reasoning Tool). These tools were originally developed on Lisp machines and have the full functionality of the Lisp language as well as many additional features. They provide a rich and highly productive environment for the software developer. To illustrate the capability of one of these high-end tools we have converted a table showing the classification of benign soft tissue tumors into a KEE knowledge base. We have used the tools available in KEE to identify the tumor type for a hypothetical patient. 10 figs.
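
    What "converting a classification table into a knowledge base" means can be pictured with a greatly simplified frame-style structure; the Python below is not KEE (which offers far richer frames, inheritance and rule facilities), and the attribute patterns are invented for illustration only:

      # Hypothetical frame-like knowledge base: each tumor type is a frame of
      # required attribute values; classification returns all consistent frames.
      TUMOR_FRAMES = {
          "lipoma":    {"tissue": "adipose",       "depth": "subcutaneous"},
          "fibroma":   {"tissue": "fibrous",       "depth": "dermal"},
          "leiomyoma": {"tissue": "smooth muscle", "depth": "deep"},
      }

      def classify(patient_findings):
          """Return frames whose attributes are all consistent with the findings."""
          return [name for name, frame in TUMOR_FRAMES.items()
                  if all(patient_findings.get(k) == v for k, v in frame.items())]

      # Hypothetical patient findings
      print(classify({"tissue": "adipose", "depth": "subcutaneous", "site": "arm"}))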

  11. Repository-based software engineering program: Concept document

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This document provides the context for Repository-Based Software Engineering's (RBSE's) evolving functional and operational product requirements, and it is the parent document for development of detailed technical and management plans. When furnished, requirements documents will serve as the governing RBSE product specification. The RBSE Program Management Plan will define resources, schedules, and technical and organizational approaches to fulfilling the goals and objectives of this concept. The purpose of this document is to provide a concise overview of RBSE, describe the rationale for the RBSE Program, and define a clear, common vision for RBSE team members and customers. The document also provides the foundation for developing RBSE user and system requirements and a corresponding Program Management Plan. The concept is used to express the program mission to RBSE users and managers and to provide an exhibit for community review.

  12. Internet Evolution and the Role of Software Engineering

    NASA Astrophysics Data System (ADS)

    Zave, Pamela

    The classic Internet architecture is a victim of its own success. Having succeeded so well at empowering users and encouraging innovation, it has been made obsolete by explosive growth in users, traffic, applications, and threats. For the past decade, the networking community has been focused on the many deficiencies of the current Internet and the possible paths toward a better future Internet. This paper explains why the Internet is likely to evolve toward multiple architectures running on multiple virtual networks, rather than having a single architecture. In this context, there is an urgent need for research that starts from the requirements of Internet applications and works downward toward network resources, in addition to the predominantly bottom-up work of the networking community. This paper aims to encourage the software-engineering community to participate in this research by providing a starting point and a broad program of research questions and projects.

  13. Autonomous Cryogenics Loading Operations Simulation Software: Knowledgebase Autonomous Test Engineer

    NASA Technical Reports Server (NTRS)

    Wehner, Walter S., Jr.

    2013-01-01

    Working on the ACLO (Autonomous Cryogenics Loading Operations) project I have had the opportunity to add functionality to the physics simulation software known as KATE (Knowledgebase Autonomous Test Engineer), create a new application allowing WYSIWYG (what-you-see-is-what-you-get) creation of KATE schematic files and begin a preliminary design and implementation of a new subsystem that will provide vision services on the IHM (Integrated Health Management) bus. The functionality I added to KATE over the past few months includes a dynamic visual representation of the fluid height in a pipe based on number of gallons of fluid in the pipe and implementing the IHM bus connection within KATE. I also fixed a broken feature in the system called the Browser Display, implemented many bug fixes and made changes to the GUI (Graphical User Interface).

  14. The Many Faces of a Software Engineer in a Research Community

    SciTech Connect

    Marinovici, Maria C.; Kirkham, Harold

    2013-10-14

    The ability to gather, analyze and make decisions based on real world data is changing nearly every field of human endeavor. These changes are particularly challenging for software engineers working in a scientific community, designing and developing large, complex systems. To avoid the creation of a communications gap (almost a language barrier), the software engineers should possess an ‘adaptive’ skill. In the science and engineering research community, the software engineers must be responsible for more than creating mechanisms for storing and analyzing data. They must also develop a fundamental scientific and engineering understanding of the data. This paper looks at the many faces that a software engineer should have: developer, domain expert, business analyst, security expert, project manager, tester, user experience professional, etc. Observations made during work on a power-systems scientific software development are analyzed and extended to describe more generic software development projects.

  15. Software quality for 1997 - what works and what doesn't?

    SciTech Connect

    Jones, C.

    1997-11-01

    This presentation provides a view of software quality for 1997 - what works and what doesn't. For many years, software quality assurance lagged behind hardware quality assurance in terms of methods, metrics, and successful results. New approaches such as Quality Function Deployment (QFD), the ISO 9000-9004 standards, the SEI maturity levels, and Total Quality Management (TQM) are starting to attract wide attention, and in some cases to bring software quality levels up to parity with manufacturing quality levels.

  16. Software Past, Present, and Future: Views from Government, Industry and Academia

    NASA Technical Reports Server (NTRS)

    Holcomb, Lee; Page, Jerry; Evangelist, Michael

    2000-01-01

    Views from the NASA CIO, presented at the NASA Software Engineering Workshop, on software development past, present, and future. The topics include: 1) Software Past; 2) Software Present; 3) NASA's Largest Software Challenges; 4) 8330 Software Projects in Industry (Standish Group's 1994 Report); 5) Software Future; 6) Capability Maturity Model (CMM): Software Engineering Institute (SEI) Levels; 7) System Engineering Quality Also Part of the Problem; 8) University Environment Trends Will Increase the Problem in Software Engineering; and 9) NASA Software Engineering Goals.

  17. NARAC SOFTWARE QUALITY ASSURANCE: ADAPTING FORMALISM TO MEET VARYING NEEDS

    SciTech Connect

    Walker, H; Nasstrom, J S; Homann, S G

    2007-11-20

    The National Atmospheric Release Advisory Center (NARAC) provides tools and services that predict and map the spread of hazardous material accidentally or intentionally released into the atmosphere. NARAC is a full function system that can meet a wide range of needs with a particular focus on emergency response. The NARAC system relies on computer software in the form of models of the atmosphere and related physical processes supported by a framework for data acquisition and management, user interface, visualization, communications and security. All aspects of the program's operations and research efforts are predicated to varying degrees on the reliable and correct performance of this software. Consequently, software quality assurance (SQA) is an essential component of the NARAC program. The NARAC models and system span different levels of sophistication, fidelity and complexity. These different levels require related but different approaches to SQA. To illustrate this, two different levels of software complexity are considered in this paper. As a relatively simple example, the SQA procedures that are being used for HotSpot, a straight-line Gaussian model focused on radiological releases, are described. At the other extreme, the SQA issues that must be considered and balanced for the more complex NARAC system are reviewed.
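
    To make the contrast in model complexity concrete, the core of a straight-line Gaussian plume calculation of the kind HotSpot implements (with far more physics, nuclide data and dose conversion) fits in a few lines; the dispersion-coefficient parameterisations below are simple power-law placeholders, not HotSpot's:

      import math

      def plume_concentration(Q, u, x, y, z, H):
          """Ground-reflected Gaussian plume concentration (arbitrary units)."""
          sigma_y = 0.08 * x ** 0.9     # assumed sigma-y(x) parameterisation
          sigma_z = 0.06 * x ** 0.85    # assumed sigma-z(x) parameterisation
          lateral = math.exp(-y**2 / (2 * sigma_y**2))
          vertical = (math.exp(-(z - H)**2 / (2 * sigma_z**2)) +
                      math.exp(-(z + H)**2 / (2 * sigma_z**2)))
          return Q / (2 * math.pi * u * sigma_y * sigma_z) * lateral * vertical

      # Example: unit release rate, 3 m/s wind, receptor 500 m downwind on the axis
      print(plume_concentration(Q=1.0, u=3.0, x=500.0, y=0.0, z=1.5, H=10.0))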

  18. The repository-based software engineering program: Redefining AdaNET as a mainstream NASA source

    NASA Technical Reports Server (NTRS)

    1993-01-01

    The Repository-based Software Engineering Program (RBSE) is described to inform and update senior NASA managers about the program. Background and historical perspective on software reuse and RBSE for NASA managers who may not be familiar with these topics are provided. The paper draws upon and updates information from the RBSE Concept Document, baselined by NASA Headquarters, Johnson Space Center, and the University of Houston - Clear Lake in April 1992. Several of NASA's software problems and what RBSE is now doing to address those problems are described. Also, next steps to be taken to derive greater benefit from this Congressionally-mandated program are provided. The section on next steps describes the need to work closely with other NASA software quality, technology transfer, and reuse activities and focuses on goals and objectives relative to this need. RBSE's role within NASA is addressed; however, there is also the potential for systematic transfer of technology outside of NASA in later stages of the RBSE program. This technology transfer is discussed briefly.

  19. Disciplining Discourse: Discourse Practice in the Affiliated Professions of Software Engineering Design.

    ERIC Educational Resources Information Center

    Geisler, Cheryl; Rogers, Edwin H.; Haller, Cynthia R.

    1998-01-01

    Investigates the discourse practices of the "affiliated professions" of software engineering design. Compares lists of design issues generated by students in computer science and computer communication to lists produced by experts associated with software engineering and by students entering an unaffiliated profession. Suggests that affiliated…

  20. The Effective Use of Professional Software in an Undergraduate Mining Engineering Curriculum

    ERIC Educational Resources Information Center

    Kecojevic, Vladislav; Bise, Christopher; Haight, Joel

    2005-01-01

    The use of professional software is an integral part of a student's education in the mining engineering curriculum at The Pennsylvania State University. Even though mining engineering represents a limited market across U.S. educational institutions, the goal still exists for using this type of software to enrich the learning environment with…

  1. Changes in Transferable Knowledge Resulting from Study in a Graduate Software Engineering Curriculum

    ERIC Educational Resources Information Center

    Bareiss, Ray; Sedano, Todd; Katz, Edward

    2012-01-01

    This paper presents the initial results of a study of the evolution of students' knowledge of software engineering from the beginning to the end of a master's degree curriculum in software engineering. Students were presented with a problem involving the initiation of a complex new project at the beginning of the program and again at the end of…

  2. Success Factors for Using Case Method in Teaching and Learning Software Engineering

    ERIC Educational Resources Information Center

    Razali, Rozilawati; Zainal, Dzulaiha Aryanee Putri

    2013-01-01

    The Case Method (CM) has long been used effectively in Social Science education. Its potential use in Applied Science such as Software Engineering (SE) however has yet to be further explored. SE is an engineering discipline that concerns the principles, methods and tools used throughout the software development lifecycle. In CM, subjects are…

  3. Analysis of quality raw data of second generation sequencers with Quality Assessment Software

    PubMed Central

    2011-01-01

    Background Second generation technologies have advantages over Sanger; however, they have resulted in new challenges for the genome construction process, especially because of the small size of the reads, despite the high degree of coverage. Independent of the program chosen for the construction process, DNA sequences are superimposed, based on identity, to extend the reads, generating contigs; mismatches indicate a lack of homology and are not included. This process improves our confidence in the sequences that are generated. Findings We developed Quality Assessment Software, with which one can review graphs showing the distribution of quality values from the sequencing reads. This software allows us to adopt more stringent quality standards for sequence data, based on quality-graph analysis and estimated coverage after applying the quality filter, providing acceptable sequence coverage for genome construction from short reads. Conclusions Quality filtering is a fundamental step in the process of constructing genomes, as it reduces the frequency of incorrect alignments that are caused by measuring errors, which can occur during the construction process due to the size of the reads, provoking misassemblies. Application of quality filters to sequence data, using the software Quality Assessment, along with graphing analyses, provided greater precision in the definition of cutoff parameters, which increased the accuracy of genome construction. PMID:21501521
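
    A minimal illustration of the filtering step this kind of tool supports (not the Quality Assessment Software itself): discard reads whose mean Phred quality falls below a cutoff chosen from the quality-distribution graphs. The FASTQ handling here is deliberately simplistic:

      def mean_phred(quality_line, offset=33):
          """Mean Phred score of one FASTQ quality line (Sanger/Illumina 1.8+)."""
          scores = [ord(c) - offset for c in quality_line.strip()]
          return sum(scores) / len(scores)

      def filter_fastq(path_in, path_out, cutoff=20):
          kept = total = 0
          with open(path_in) as fin, open(path_out, "w") as fout:
              while True:
                  record = [fin.readline() for _ in range(4)]   # 4 lines per read
                  if not record[0]:
                      break
                  total += 1
                  if mean_phred(record[3]) >= cutoff:
                      fout.writelines(record)
                      kept += 1
          print(f"kept {kept}/{total} reads at mean-quality cutoff {cutoff}")

      # filter_fastq("reads.fastq", "reads.filtered.fastq", cutoff=20)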

  4. Investigation of the current requirements engineering practices among software developers at the Universiti Utara Malaysia Information Technology (UUMIT) centre

    NASA Astrophysics Data System (ADS)

    Hussain, Azham; Mkpojiogu, Emmanuel O. C.; Abdullah, Inam

    2016-08-01

    Requirements Engineering (RE) is a systematic and integrated process of eliciting, elaborating, negotiating, validating and managing the requirements of a system in a software development project. UUM has been supported by various systems developed and maintained by the UUM Information Technology (UUMIT) Centre. The aim of this study was to assess the current requirements engineering practices at UUMIT. The main problem that prompted this research is the lack of studies that support software development activities at the UUMIT. The study is geared toward helping UUMIT produce quality software products while saving time and cost by implementing cutting-edge, state-of-the-art requirements engineering practices. The study also contributes to UUM by identifying the activities needed for software development so that management will be able to allocate budget to provide adequate and precise training for the software developers. Three variables were investigated: Requirements Description, Requirements Development (comprising Requirements Elicitation, Requirements Analysis and Negotiation, and Requirements Validation), and Requirements Management. The results showed that the current practice of requirements engineering at UUMIT is encouraging, but still needs further development and improvement because a few RE practices were seldom practiced.

  5. Using UML Modeling to Facilitate Three-Tier Architecture Projects in Software Engineering Courses

    ERIC Educational Resources Information Center

    Mitra, Sandeep

    2014-01-01

    This article presents the use of a model-centric approach to facilitate software development projects conforming to the three-tier architecture in undergraduate software engineering courses. Many instructors intend that such projects create software applications for use by real-world customers. While it is important that the first version of these…

  6. Parallel Multiphysics Algorithms and Software for Computational Nuclear Engineering

    SciTech Connect

    D. Gaston; G. Hansen; S. Kadioglu; D. A. Knoll; C. Newman; H. Park; C. Permann; W. Taitano

    2009-08-01

    There is a growing trend in nuclear reactor simulation to consider multiphysics problems. This can be seen in reactor analysis where analysts are interested in coupled flow, heat transfer and neutronics, and in fuel performance simulation where analysts are interested in thermomechanics with contact coupled to species transport and chemistry. These more ambitious simulations usually motivate some level of parallel computing. Many of the coupling efforts to date utilize simple 'code coupling' or first-order operator splitting, often referred to as loose coupling. While these approaches can produce answers, they usually leave questions of accuracy and stability unanswered. Additionally, the different physics often reside on separate grids which are coupled via simple interpolation, again leaving open questions of stability and accuracy. Utilizing state-of-the-art mathematics and software development techniques, we are deploying next-generation tools for nuclear engineering applications. The Jacobian-free Newton-Krylov (JFNK) method combined with physics-based preconditioning provides the underlying mathematical structure for our tools. JFNK is understood to be a modern multiphysics algorithm, but we are also utilizing its unique properties as a scale-bridging algorithm. To facilitate rapid development of multiphysics applications we have developed the Multiphysics Object-Oriented Simulation Environment (MOOSE). Examples will be presented from two MOOSE-based applications: PRONGHORN, our multiphysics gas-cooled reactor simulation tool, and BISON, our multiphysics, multiscale fuel performance simulation tool.
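
    The JFNK pattern itself is easy to demonstrate outside MOOSE: only the nonlinear residual is coded, and Jacobian-vector products are approximated by finite differences inside the Krylov solver. The sketch below uses SciPy's newton_krylov on a toy 1-D nonlinear reaction-diffusion problem, which stands in for (and is far simpler than) any reactor-physics system:

      import numpy as np
      from scipy.optimize import newton_krylov

      N = 100
      h = 1.0 / (N + 1)

      def residual(u):
          """F(u) = -u'' + u**3 - 1 with homogeneous Dirichlet boundaries."""
          u_pad = np.concatenate(([0.0], u, [0.0]))   # add boundary values
          return (-(u_pad[2:] - 2 * u_pad[1:-1] + u_pad[:-2]) / h**2
                  + u**3 - 1.0)

      u0 = np.zeros(N)                                # initial guess
      u = newton_krylov(residual, u0, method='lgmres', verbose=False)
      print("max |F(u)| at solution:", np.abs(residual(u)).max())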

  7. Parallel multiphysics algorithms and software for computational nuclear engineering

    NASA Astrophysics Data System (ADS)

    Gaston, D.; Hansen, G.; Kadioglu, S.; Knoll, D. A.; Newman, C.; Park, H.; Permann, C.; Taitano, W.

    2009-07-01

    There is a growing trend in nuclear reactor simulation to consider multiphysics problems. This can be seen in reactor analysis where analysts are interested in coupled flow, heat transfer and neutronics, and in fuel performance simulation where analysts are interested in thermomechanics with contact coupled to species transport and chemistry. These more ambitious simulations usually motivate some level of parallel computing. Many of the coupling efforts to date utilize simple code coupling or first-order operator splitting, often referred to as loose coupling. While these approaches can produce answers, they usually leave questions of accuracy and stability unanswered. Additionally, the different physics often reside on separate grids which are coupled via simple interpolation, again leaving open questions of stability and accuracy. Utilizing state-of-the-art mathematics and software development techniques, we are deploying next-generation tools for nuclear engineering applications. The Jacobian-free Newton-Krylov (JFNK) method combined with physics-based preconditioning provides the underlying mathematical structure for our tools. JFNK is understood to be a modern multiphysics algorithm, but we are also utilizing its unique properties as a scale-bridging algorithm. To facilitate rapid development of multiphysics applications we have developed the Multiphysics Object-Oriented Simulation Environment (MOOSE). Examples will be presented from two MOOSE-based applications: PRONGHORN, our multiphysics gas-cooled reactor simulation tool, and BISON, our multiphysics, multiscale fuel performance simulation tool.

  8. Advanced software development workstation project: Engineering scripting language. Graphical editor

    NASA Technical Reports Server (NTRS)

    1992-01-01

    Software development is widely considered to be a bottleneck in the development of complex systems, both in terms of development and in terms of maintenance of deployed systems. Cost of software development and maintenance can also be very high. One approach to reducing costs and relieving this bottleneck is increasing the reuse of software designs and software components. A method for achieving such reuse is a software parts composition system. Such a system consists of a language for modeling software parts and their interfaces, a catalog of existing parts, an editor for combining parts, and a code generator that takes a specification and generates code for that application in the target language. The Advanced Software Development Workstation is intended to be an expert system shell designed to provide the capabilities of a software part composition system.

  9. Software Engineering Laboratory (SEL) compendium of tools, revision 1

    NASA Technical Reports Server (NTRS)

    1982-01-01

    A set of programs used to aid software product development is listed. Known as software tools, such programs include requirements analyzers, design languages, precompilers, code auditors, code analyzers, and software librarians. Abstracts, resource requirements, documentation, processing summaries, and availability are indicated for most tools.

  10. Modeling a distributed environment for a petroleum reservoir engineering application with software product line

    NASA Astrophysics Data System (ADS)

    de Faria Scheidt, Rafael; Vilain, Patrícia; Dantas, M. A. R.

    2014-10-01

    Petroleum reservoir engineering is a complex and interesting field that requires a large amount of computational facilities to achieve successful results. Usually, software environments for this field are developed without taking into account the interactions and extensibility required by reservoir engineers. In this paper, we present a research effort characterized by the design and implementation, based on a software product line model, of a real distributed reservoir engineering environment. Experimental results indicate the successful utilization of this approach for the design of a distributed software architecture. In addition, all components of the proposal provided greater visibility of the organization and processes for the reservoir engineers.

  11. Daily quality assurance software for a satellite radiometer system

    NASA Technical Reports Server (NTRS)

    Keegstra, P. B.; Smoot, G. F.; Bennett, C. L.; Aymon, J.; Backus, C.; Deamici, G.; Hinshaw, G.; Jackson, P. D.; Kogut, A.; Lineweaver, C.

    1992-01-01

    Six Differential Microwave Radiometers (DMR) on COBE (Cosmic Background Explorer) measure the large-angular-scale isotropy of the cosmic microwave background (CMB) at 31.5, 53, and 90 GHz. Quality assurance software analyzes the daily telemetry from the spacecraft to ensure that the instrument is operating correctly and that the data are not corrupted. Quality assurance for DMR poses challenging requirements. The data are differential, so a single bad point can affect a large region of the sky, yet the CMB isotropy requires lengthy integration times (greater than 1 year) to limit potential CMB anisotropies. Celestial sources (with the exception of the moon) are not, in general, visible in the raw differential data. A 'quicklook' software system was developed that, in addition to basic plotting and limit-checking, implements a collection of data tests as well as long-term trending. Some of the key capabilities include the following: (1) stability analysis showing how well the data RMS averages down with increased data; (2) a Fourier analysis and autocorrelation routine to plot the power spectrum and confirm the presence of the 3 mK 'cosmic' dipole signal; (3) binning of the data against basic spacecraft quantities such as orbit angle; (4) long-term trending; and (5) dipole fits to confirm the spacecraft attitude azimuth angle.
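
    Two of the 'quicklook' checks listed above, the RMS-averaging (stability) test and the power-spectrum search for a periodic dipole-like signal, can be mimicked on synthetic data; the numbers below are invented and merely stand in for real DMR telemetry:

      import numpy as np

      rng = np.random.default_rng(0)
      n = 4096
      signal = 0.2 * np.sin(2 * np.pi * np.arange(n) / 256)   # fake "dipole" term
      data = signal + rng.normal(0.0, 1.0, n)

      # (1) stability: RMS of block averages should fall roughly as 1/sqrt(block)
      for block in (1, 16, 256):
          averaged = data[: n // block * block].reshape(-1, block).mean(axis=1)
          print(f"block={block:4d}  rms={averaged.std():.3f}")

      # (2) power spectrum: the injected periodic signal shows up as a narrow peak
      power = np.abs(np.fft.rfft(data)) ** 2
      peak = int(np.argmax(power[1:])) + 1          # skip the DC bin
      print("strongest nonzero-frequency bin:", peak, "- expected near", n // 256)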

  12. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, Sean; Hughes, Gary

    2008-07-31

    Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed web-based data analysis and visualization tools such as the interactive plotting program NCVweb, various diagnostic plot browsers, and a datastream processing status application. These tools allow even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers. We have also embarked on a system to comprehensively generate long time-series plots, frequency distributions, and other relevant statistics for scientific and engineering data in most high-level, publicly available ARM data streams. Furthermore, frequency distributions categorized by month or by season are made available to help define valid data ranges specific to those time domains. These statistics can be used to set limits that when checked, will improve upon the reporting of suspicious data and the early detection of instrument malfunction. The statistics and proposed limits are stored in a database for easy reporting, refining, and for use by other processes. Web-based applications to view the results are also available.
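
    The limit-setting idea, derive valid ranges from long-term frequency distributions stratified by month and then flag new samples that fall outside them, can be sketched as follows; the values and the 3-sigma rule are illustrative assumptions, not the Data Quality Office's actual limits:

      from collections import defaultdict
      import statistics

      def monthly_limits(history):
          """history: iterable of (month, value).  Returns {month: (lo, hi)}."""
          by_month = defaultdict(list)
          for month, value in history:
              by_month[month].append(value)
          limits = {}
          for month, values in by_month.items():
              mean, sd = statistics.fmean(values), statistics.stdev(values)
              limits[month] = (mean - 3 * sd, mean + 3 * sd)   # assumed 3-sigma rule
          return limits

      def flag_outliers(samples, limits):
          for month, value in samples:
              lo, hi = limits[month]
              if not lo <= value <= hi:
                  yield month, value

      history = [(1, 2.0), (1, 2.2), (1, 1.9), (7, 25.0), (7, 26.1), (7, 24.7)]
      limits = monthly_limits(history)
      print(list(flag_outliers([(1, 2.1), (1, 9.9), (7, 25.5)], limits)))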

  13. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Hawkins, Lamar

    1996-01-01

    This draft final report describes the work performed under delivery order number 145 from May 1995 through August 1996. The scope of work included a number of software development tasks for the performance modeling of AXAF-I. A number of new capabilities and functions have been added to the GT software, which is the command mode version of the GRAZTRACE software originally developed by MSFC. A structural data interface has been developed for the EAL (old SPAR) finite element analysis (FEA) program, which is being used by the MSFC Structural Analysis group for the analysis of AXAF-I. This interface utility can read the structural deformation file from EAL and other finite element analysis programs such as NASTRAN and COSMOS/M, and convert the data to a suitable format that can be used for deformation ray-tracing to predict the image quality for a distorted mirror. There is a provision in this utility to expand the data from finite element models assuming 180-degree symmetry. This utility has been used to predict image characteristics for the AXAF-I HRMA when subjected to gravity effects in the horizontal x-ray ground test configuration. The development of the metrology data processing interface software has also been completed. It can read the HDOS FITS format surface map files, manipulate and filter the metrology data, and produce a deformation file which can be used by GT for ray tracing of the mirror surface figure errors. This utility has been used to determine the optimum alignment (axial spacing and clocking) for the four pairs of AXAF-I mirrors. Based on this optimized alignment, the geometric images and effective focal lengths for the as-built mirrors were predicted to cross-check the results obtained by Kodak.

  14. Requirements for guidelines systems: implementation challenges and lessons from existing software-engineering efforts

    PubMed Central

    2012-01-01

    Background A large body of work in the clinical guidelines field has identified requirements for guideline systems, but there are formidable challenges in translating such requirements into production-quality systems that can be used in routine patient care. Detailed analysis of requirements from an implementation perspective can be useful in helping define sub-requirements to the point where they are implementable. Further, additional requirements emerge as a result of such analysis. During such an analysis, study of examples of existing software-engineering efforts in non-biomedical fields can provide useful signposts to the implementer of a clinical guideline system. Methods In addition to requirements described by guideline-system authors, comparative reviews of such systems, and publications discussing information needs for guideline systems and clinical decision support systems in general, we have incorporated additional requirements related to production-system robustness and functionality from publications in the business workflow domain, while also drawing on our own experience in the development of the Proteus guideline system (http://proteme.org). Results The sub-requirements are discussed by conveniently grouping them into the categories used by the review of Isern and Moreno (2008). We cite previous work under each category, provide sub-requirements under each category, and provide examples of similar work in software-engineering efforts that have addressed a similar problem in a non-biomedical context. Conclusions When analyzing requirements from the implementation viewpoint, knowledge of successes and failures in related software-engineering efforts can guide implementers in the choice of effective design and development strategies. PMID:22405400

  15. Evaluating software development by analysis of changes: The data from the software engineering laboratory

    NASA Technical Reports Server (NTRS)

    1982-01-01

    An effective data collection methodology for evaluating software development methodologies was applied to four different software development projects. Goals of the data collection included characterizing changes and errors, characterizing projects and programmers, identifying effective error detection and correction techniques, and investigating ripple effects. The data collected consisted of changes (including error corrections) made to the software after code was written and baselined, but before testing began. Data collection and validation were concurrent with software development. Changes reported were verified by interviews with programmers.

  16. The Use of the Software MATLAB To Improve Chemical Engineering Education.

    ERIC Educational Resources Information Center

    Damatto, T.; Maegava, L. M.; Filho, R. Maciel

    In all the Brazilian universities involved in the project "Prodenge-Reenge", the main objective is to improve teaching and learning procedures for the engineering disciplines. The Chemical Engineering College of Campinas State University focused its effort on the use of engineering software. The work developed by this project has allowed all…

  17. Questioning the Role of Requirements Engineering in the Causes of Safety-Critical Software Failures

    NASA Technical Reports Server (NTRS)

    Johnson, C. W.; Holloway, C. M.

    2006-01-01

    Many software failures stem from inadequate requirements engineering. This view has been supported both by detailed accident investigations and by a number of empirical studies; however, such investigations can be misleading. It is often difficult to distinguish between failures in requirements engineering and problems elsewhere in the software development lifecycle. Further pitfalls arise from the assumption that inadequate requirements engineering is a cause of all software related accidents for which the system fails to meet its requirements. This paper identifies some of the problems that have arisen from an undue focus on the role of requirements engineering in the causes of major accidents. The intention is to provoke further debate within the emerging field of forensic software engineering.

  18. Engineering Play: Children's Software and the Cultural Politics of Edutainment

    ERIC Educational Resources Information Center

    Ito, Mizuko

    2006-01-01

    The late 1980s saw the emergence of a new genre of instructional media, "edutainment", which utilized the capabilities of multimedia personal computers to animate software designed to both educate and entertain young children. This paper describes the production of, marketing of and play with edutainment software as a contemporary example of…

  19. Caesy: A software tool for computer-aided engineering

    NASA Technical Reports Server (NTRS)

    Wette, Matt

    1993-01-01

    A new software tool, Caesy, is described. This tool provides a strongly typed programming environment for research in the development of algorithms and software for computer-aided control system design. A description of the user language and its implementation as they currently stand are presented along with a description of work in progress and areas of future work.

  20. Software Quality Evaluation Models Applicable in Health Information and Communications Technologies. A Review of the Literature.

    PubMed

    Villamor Ordozgoiti, Alberto; Delgado Hito, Pilar; Guix Comellas, Eva María; Fernandez Sanchez, Carlos Manuel; Garcia Hernandez, Milagros; Lluch Canut, Teresa

    2016-01-01

    The use of Information and Communications Technologies in healthcare has increased the need to consider quality criteria through standardised processes. The aim of this study was to analyse the software quality evaluation models applicable to healthcare from the perspective of ICT purchasers. Through a systematic literature review with the keywords software, product, quality, evaluation and health, we selected and analysed 20 original research papers published from 2005-2016 in health science and technology databases. The results showed four main topics: non-ISO models, software quality evaluation models based on ISO/IEC standards, studies analysing software quality evaluation models, and studies analysing ISO standards for software quality evaluation. The models provide cost-efficiency criteria for specific software, and improve use outcomes. The ISO/IEC 25000 standard is shown to be the most suitable for evaluating the quality of ICTs for healthcare use from the perspective of institutional acquisition. PMID:27350495

  1. The maintenance, distribution and development of biomedical computer software: an exercise in software engineering.

    PubMed

    Boston, R C; Granek, H; Sutton, N; Weber, K; Greif, P; Zech, L

    1986-06-01

    The growing reliance of biomedical investigators on computer software in almost all facets of their work places considerable emphasis on the need for integrated management of the software. In order to efficiently develop, distribute, and maintain the software, tools are required which not only automate these tasks but also, wherever possible, 'semi-intelligently' alert their users to irregular situations. We describe an assortment of such tools routinely used in the management of the SAAM/CONSAM biokinetic software and illustrate their application. Furthermore, using these techniques we have presented some comparative performance results for numerical integrators and for computer processors. PMID:3637127

  2. Implementing Quality Assurance for the Numerical Research Software Dune / PDELab / DuMux

    NASA Astrophysics Data System (ADS)

    Flemisch, B.; Bastian, P.; Kempf, D.; Koch, T.; Helmig, R.

    2015-12-01

    Quality assurance and, in particular, automated testing should be one of the key elements of modern software development. However, applying common techniques from software engineering to numerical frameworks such as Dune may be challenging, since the requirements for a test might be very different from those for standard software. This talk gives an overview of our work in describing system tests for numerical software and developing test tools to ensure that qualitative and quantitative properties of PDE discretizations are preserved. The developed tools are employed in the Dune discretization module Dune-PDELab and the porous-media simulator DuMux. The newly developed module dune-testtools provides the following components: a domain-specific language for feature modelling, which is naturally integrated into the workflow of numerical simulation; tools to test whether a given PDE discretization still yields the correct result without performance (or scalability) regressions; integration of the above tools into a CMake-based build system; and extensions to the Dune core modules to support the development of system tests.
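
    What such a quantitative system test can look like is sketched below in a framework-neutral way (plain Python with a pytest-style assertion, not dune-testtools itself): run the solver at two mesh widths, compute the observed convergence order, and fail if it drifts from the expected one. The run_simulation stub merely fakes a second-order method:

      import math

      def run_simulation(h):
          """Stand-in for invoking the real solver at mesh width h; returns an
          error norm.  Here we fake second-order behaviour plus a small wiggle."""
          return 0.8 * h**2 * (1.0 + 0.02 * math.sin(1.0 / h))

      def observed_order(h_coarse, h_fine):
          e_coarse, e_fine = run_simulation(h_coarse), run_simulation(h_fine)
          return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

      def test_convergence_rate_is_preserved():
          rate = observed_order(0.1, 0.05)
          assert abs(rate - 2.0) < 0.2, f"expected ~2nd order, observed {rate:.2f}"

      test_convergence_rate_is_preserved()
      print("convergence-rate regression test passed")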

  3. Seven Processes that Enable NASA Software Engineering Technologies

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software be appraised for the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.

  4. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by UAH and provide versions of the software in a Macintosh and Windows compatible format.

  5. Software engineering and data management for automated payload experiment tool

    NASA Technical Reports Server (NTRS)

    Maddux, Gary A.; Provancha, Anna; Chattam, David

    1994-01-01

    The Microgravity Projects Office identified a need to develop a software package that will lead experiment developers through the development planning process, obtain necessary information, establish an electronic data exchange avenue, and allow easier manipulation/reformatting of the collected information. An MS-DOS compatible software package called the Automated Payload Experiment Tool (APET) has been developed and delivered. The objective of this task is to expand on the results of the APET work previously performed by University of Alabama in Huntsville (UAH) and provide versions of the software in a Macintosh and Windows compatible format. Appendix 1 science requirements document (SRD) Users Manual is attached.

  6. Use of project ontologies and terminology servers to support software engineering.

    PubMed

    Bång, M; Eriksson, H; Timpka, T

    1998-01-01

    Complex medical software imposes new requirements on the methods and tools used for maintenance. Appropriate maintenance tools can increase software reliability and quality by providing means to trace dependencies among software artifacts for reducing unexpected impacts in software caused by software changes. We have used the GRAIL concept-representation language for medical terminologies to build a project ontology that models relationships among software artifacts. Our approach involves modeling of the terminology used in software projects, which enables us to describe, classify and relate individual software artifacts. A networked repository accessible to the entire software development staff stores the conceptual model, source code and associated documents. We present an architecture for a maintenance tool, and show how developers can use GRAIL to build a project ontology. PMID:10384533
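
    The underlying idea, make the relationships among software artifacts explicit so the impact of a change can be traced before it is made, can be illustrated with a toy dependency graph; the Python below is not GRAIL, and the artifact names and link type are invented:

      from collections import defaultdict

      # depends_on[a] = set of artifacts that artifact a directly depends on
      depends_on = {
          "user_manual.tex":    {"dose_module.c"},
          "dose_module.c":      {"units.h", "physics_tables.csv"},
          "ui_screen.py":       {"dose_module.c"},
          "units.h":            set(),
          "physics_tables.csv": set(),
      }

      # invert the relation: which artifacts are affected if X changes?
      impacted_by = defaultdict(set)
      for artifact, deps in depends_on.items():
          for dep in deps:
              impacted_by[dep].add(artifact)

      def impact(artifact, seen=None):
          """Transitive closure of artifacts affected by a change to `artifact`."""
          if seen is None:
              seen = set()
          for user in impacted_by.get(artifact, ()):
              if user not in seen:
                  seen.add(user)
                  impact(user, seen)
          return seen

      print(sorted(impact("units.h")))   # everything that must be re-reviewed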

  7. Lessons learned from development and quality assurance of software systems at the Halden Project

    SciTech Connect

    Bjorlo, T.J.; Berg, O.; Pehrsen, M.; Dahll, G.; Sivertsen, T.

    1996-03-01

    The OECD Halden Reactor Project has developed a number of software systems within the research programmes. These programmes have comprised a wide range of topics, like studies of software for safety-critical applications, development of different operator support systems, and software systems for building and implementing graphical user interfaces. The systems have ranged from simple prototypes to installations in process plants. In the development of these software systems, Halden has gained much experience in quality assurance of different types of software. This paper summarises the accumulated experience at the Halden Project in quality assurance of software systems. The different software systems being developed at the Halden Project may be grouped into three categories. These are plant-specific software systems (one-of-a-kind deliveries), generic software products, and safety-critical software systems. This classification has been found convenient as the categories have different requirements to the quality assurance process. In addition, the experience from use of software development tools and proprietary software systems at Halden is addressed. The paper also focuses on the experience gained from the complete software life cycle, starting with the software planning phase and ending with software operation and maintenance.

  8. Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory

    NASA Technical Reports Server (NTRS)

    Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.

    1994-01-01

    As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.

  9. A Middleware Platform for Providing Mobile and Embedded Computing Instruction to Software Engineering Students

    ERIC Educational Resources Information Center

    Mattmann, C. A.; Medvidovic, N.; Malek, S.; Edwards, G.; Banerjee, S.

    2012-01-01

    As embedded software systems have grown in number, complexity, and importance in the modern world, a corresponding need to teach computer science students how to effectively engineer such systems has arisen. Embedded software systems, such as those that control cell phones, aircraft, and medical equipment, are subject to requirements and…

  10. Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) user's guide and system description

    NASA Technical Reports Server (NTRS)

    Lo, P. S.; Card, D.

    1983-01-01

    The Software Engineering Laboratory (SEL) Data Base Maintenance System (DBAM) is explained. The various software facilities of the SEL, DBAM operating procedures, and DBAM system information are described. The relationships among DBAM components (baseline diagrams), component descriptions, overlay descriptions, indirect command file listings, file definitions, and sample data collection forms are provided.

  11. Incorporating Computer-Aided Software in the Undergraduate Chemical Engineering Core Courses

    ERIC Educational Resources Information Center

    Alnaizy, Raafat; Abdel-Jabbar, Nabil; Ibrahim, Taleb H.; Husseini, Ghaleb A.

    2014-01-01

    Introductions of computer-aided software and simulators are implemented during the sophomore-year of the chemical engineering (ChE) curriculum at the American University of Sharjah (AUS). Our faculty concurs that software integration within the curriculum is beneficial to our students, as evidenced by the positive feedback received from industry…

  12. QUICK - AN INTERACTIVE SOFTWARE ENVIRONMENT FOR ENGINEERING DESIGN

    NASA Technical Reports Server (NTRS)

    Schlaifer, R. S.

    1994-01-01

    QUICK provides the computer user with the facilities of a sophisticated desk calculator which can perform scalar, vector and matrix arithmetic, propagate conic orbits, determine planetary and satellite coordinates and perform other related astrodynamic calculations within a Fortran-like environment. QUICK is an interpreter, therefore eliminating the need to use a compiler or a linker to run QUICK code. QUICK capabilities include options for automated printing of results, the ability to submit operating system commands on some systems, and access to a plotting package (MASL) and a text editor without leaving QUICK. Mathematical and programming features of QUICK include the ability to handle arbitrary algebraic expressions, the capability to define user functions in terms of other functions, built-in constants such as pi, direct access to useful COMMON areas, matrix capabilities, extensive use of double precision calculations, and the ability to automatically load user functions from a standard library. The MASL (The Multi-mission Analysis Software Library) plotting package, included in the QUICK package, is a set of FORTRAN 77 compatible subroutines designed to facilitate the plotting of engineering data by allowing programmers to write plotting device independent applications. Its universality lies in the number of plotting devices it puts at the user's disposal. The MASL package of routines has proved very useful and easy to work with, yielding good plots for most new users on the first or second try. The functions provided include routines for creating histograms, "wire mesh" surface plots and contour plots as well as normal graphs with a large variety of axis types. The library has routines for plotting on cartesian, polar, log, mercator, cyclic, calendar, and stereographic axes, and for performing automatic or explicit scaling. The lengths of the axes of a plot are completely under the control of the program using the library. Programs written to use the MASL

  13. Proteomics Quality Control: Quality Control Software for MaxQuant Results.

    PubMed

    Bielow, Chris; Mastrobuoni, Guido; Kempa, Stefan

    2016-03-01

    Mass spectrometry-based proteomics coupled to liquid chromatography has matured into an automatized, high-throughput technology, producing data on the scale of multiple gigabytes per instrument per day. Consequently, an automated quality control (QC) and quality analysis (QA) capable of detecting measurement bias, verifying consistency, and avoiding propagation of error is paramount for instrument operators and scientists in charge of downstream analysis. We have developed an R-based QC pipeline called Proteomics Quality Control (PTXQC) for bottom-up LC-MS data generated by the MaxQuant software pipeline. PTXQC creates a QC report containing a comprehensive and powerful set of QC metrics, augmented with automated scoring functions. The automated scores are collated to create an overview heatmap at the beginning of the report, giving valuable guidance also to nonspecialists. Our software supports a wide range of experimental designs, including stable isotope labeling by amino acids in cell culture (SILAC), tandem mass tags (TMT), and label-free data. Furthermore, we introduce new metrics to score MaxQuant's Match-between-runs (MBR) functionality by which peptide identifications can be transferred across Raw files based on accurate retention time and m/z. Last but not least, PTXQC is easy to install and use and represents the first QC software capable of processing MaxQuant result tables. PTXQC is freely available at https://github.com/cbielow/PTXQC. PMID:26653327
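
    The report's scoring idea, normalise heterogeneous per-file QC metrics to a common 0-1 scale so a single overview heatmap can flag problematic runs, can be pictured with a toy example; the metric names, thresholds and scoring functions below are invented and are not PTXQC's:

      metrics = {                      # hypothetical per-file QC measurements
          "run_01.raw": {"ms2_id_rate": 0.42, "mass_error_ppm": 1.2},
          "run_02.raw": {"ms2_id_rate": 0.11, "mass_error_ppm": 4.8},
      }

      def score_id_rate(v):    return min(v / 0.35, 1.0)        # assume >=35% is "good"
      def score_mass_error(v): return max(0.0, 1.0 - v / 5.0)   # 0 ppm best, 5 ppm worst

      scorers = {"ms2_id_rate": score_id_rate, "mass_error_ppm": score_mass_error}

      for raw_file, values in metrics.items():
          scores = {m: scorers[m](v) for m, v in values.items()}
          overall = sum(scores.values()) / len(scores)
          print(raw_file, {m: round(s, 2) for m, s in scores.items()},
                "overall=%.2f" % overall)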

  14. Measuring Ada as a software development technology in the Software Engineering Laboratory (SEL)

    NASA Technical Reports Server (NTRS)

    Agresti, W. W.

    1985-01-01

    An experiment is in progress to measure the effectiveness of Ada in the National Aeronautics and Space Administration/Goddard Space Flight Center flight dynamics software development environment. The experiment features the parallel development of software in FORTRAN and Ada. The experiment organization, objectives, and status are discussed. Experiences with an Ada training program and data from the development of a 5700-line Ada training exercise are reported.

  15. A methodology for collecting valid software engineering data

    NASA Technical Reports Server (NTRS)

    Basili, V. R.; Weiss, D. M.

    1984-01-01

    An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments (other NRL Reports). The application showed that the methodology was both feasible and useful.

  16. Software Engineering Laboratory (SEL) programmer workbench phase 1 evaluation

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Phase 1 of the SEL programmer workbench consists of the design of the following three components: a communications link, a command language processor, and a collection of software aids. A brief description, an evaluation, and recommendations are presented for each of these three components.

  17. A methodology for collecting valid software engineering data

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Weiss, David M.

    1983-01-01

    An effective data collection method for evaluating software development methodologies and for studying the software development process is described. The method uses goal-directed data collection to evaluate methodologies with respect to the claims made for them. Such claims are used as a basis for defining the goals of the data collection, establishing a list of questions of interest to be answered by data analysis, defining a set of data categorization schemes, and designing a data collection form. The data to be collected are based on the changes made to the software during development, and are obtained when the changes are made. To insure accuracy of the data, validation is performed concurrently with software development and data collection. Validation is based on interviews with those people supplying the data. Results from using the methodology show that data validation is a necessary part of change data collection. Without it, as much as 50% of the data may be erroneous. Feasibility of the data collection methodology was demonstrated by applying it to five different projects in two different environments. The application showed that the methodology was both feasible and useful.

  18. An approach to software quality assurance for robotic inspection systems

    SciTech Connect

    Kiebel, G.R.

    1993-01-01

    Software quality assurance (SQA) for robotic systems used in nuclear waste applications is vital to ensure that these systems operate safely and reliably and pose a minimum risk to the environment. This paper describes the approach to be taken for SQA for the control and data acquisition system for a robotic system that is being developed for remote surveillance and inspection of underground storage tanks (USTs) at the Hanford site under the sponsorship of the UST-Integrated Demonstration Program of the U.S. Department of Energy's (DOE's) Office of Technology Development (OTD). The robotic system is called the Light-Duty Utility Arm (LDUA) System. It has a multiaxis arm with a 2.74-m (9-ft) reach and 22.7-kg (50-lb) payload that is mounted on the end of a 13.7-m (45-ft) positioning mast. It is designed to enter a UST through an available 30.5-cm (12-in.) riser. A deployment vehicle carries the positioning equipment to insert the mast and arm into the riser and a containment enclosure to control contamination when the arm is in the tank or being transported from tank to tank. A set of interchangeable end effectors is carried on the end of the arm. These end effectors provide a wide range of observation and measurement functions, such as photographic and video inspection and recording, detailed surface mapping of the tank and surface of the waste, in situ chemical analysis of the waste, etc.

  19. Is Chinese Software Engineering Professionalizing or Not?: Specialization of Knowledge, Subjective Identification and Professionalization

    ERIC Educational Resources Information Center

    Yang, Yan

    2012-01-01

    Purpose: This paper aims to discuss the challenge for the classical idea of professionalism in understanding the Chinese software engineering industry after giving a close insight into the development of this industry as well as individual engineers with a psycho-societal perspective. Design/methodology/approach: The study starts with the general…

  20. Balancing Plan-Driven and Agile Methods in Software Engineering Project Courses

    NASA Astrophysics Data System (ADS)

    Boehm, Barry; Port, Dan; Winsor Brown, A.

    2002-09-01

    For the past 6 years, we have been teaching a two-semester software engineering project course. The students organize into 5-person teams and develop largely web-based electronic services projects for real USC campus clients. We have been using and evolving a method called Model- Based (System) Architecting and Software Engineering (MBASE) for use in both the course and in industrial applications. The MBASE Guidelines include a lot of documents. We teach risk-driven documentation: if it is risky to document something, and not risky to leave it out (e.g., GUI screen placements), leave it out. Even so, students tend to associate more documentation with higher grades, although our grading eventually discourages this. We are always on the lookout for ways to have students learn best practices without having to produce excessive documentation. Thus, we were very interested in analyzing the various emerging agile methods. We found that agile methods and milestone plan-driven methods are part of a “how much planning is enough?” spectrum. Both agile and plan-driven methods have home grounds of project characteristics where they clearly work best, and where the other will have difficulties. Hybrid agile/plan-driven approaches are feasible, and necessary for projects having a mix of agile and plan-driven home ground characteristics. Information technology trends are going more toward the agile methods' home ground characteristics of emergent requirements and rapid change, although there is a concurrent increase in concern with dependability. As a result, we are currently experimenting with risk-driven combinations of MBASE and agile methods, such as integrating requirements, test plans, peer reviews, and pair programming into “agile quality management.”

  1. The Use of Modeling for Flight Software Engineering on SMAP

    NASA Technical Reports Server (NTRS)

    Murray, Alexander; Jones, Chris G.; Reder, Leonard; Cheng, Shang-Wen

    2011-01-01

    The Soil Moisture Active Passive (SMAP) mission proposes to deploy an Earth-orbiting satellite with the goal of obtaining global maps of soil moisture content at regular intervals. Launch is currently planned for 2014. The spacecraft bus would be built at the Jet Propulsion Laboratory (JPL), incorporating both new avionics and hardware and software heritage from other JPL projects. [4] provides a comprehensive overview of the proposed mission.

  2. The DOE Meteorological Coordinating Council Perspective on the Application to Meteorological Software of DOE's Software Quality Assurance Requirements

    SciTech Connect

    Mazzola, Carl; Schalk, Walt; Glantz, Clifford S.

    2008-03-12

    Since 1994, the DOE Meteorological Coordinating Council (DMCC) has been addressing meteorological monitoring and meteorological applications program issues at U.S. Department of Energy/National Nuclear Security Administration (DOE/NNSA) sites and providing solutions to these issues. The fundamental objectives of the DMCC include promoting cost-effective meteorological support and facilitating the use of common meteorological methods, procedures, and standards at all DOE/NNSA sites. In 2005, the DOE established strict software quality assurance (SQA) requirements for safety software, including consequence assessment software used for hazard assessments and safety analyses. These evaluations often utilize meteorological data supplied by DOE/NNSA site-based meteorological programs. However, the DOE has not established SQA guidance for this type of safety-related meteorological software. To address this gap, the DMCC is developing this guidance for the use of both public- and private-sector organizations. The goal for the DMCC is to mimic the SQA requirements for safety software but allow a much greater degree of “grading” in determining exactly what specific activities are needed. The emphasis of the DMCC SQA guidelines is on three key elements: 1) design and implementation documentation, 2) configuration management, and 3) verification and validation testing. These SQA guidelines should provide owners and users of meteorological software with a fair degree of assurance that their software is reliable, documented, and tested without putting an undue burden on meteorological system software developers.

  3. Application of software quality assurance to a specific scientific code development task

    SciTech Connect

    Dronkers, J.J.

    1986-03-01

    This paper describes an application of software quality assurance to a specific scientific code development program. The software quality assurance program consists of three major components: administrative control, configuration management, and user documentation. The program attempts to be consistent with existing local traditions of scientific code development while at the same time providing a controlled process of development.

  4. Impacts of software and its engineering on the carbon footprint of ICT

    SciTech Connect

    Kern, Eva; Dick, Markus; Naumann, Stefan; Hiller, Tim

    2015-04-15

    The energy consumption of information and communication technology (ICT) is still increasing. Even though several solutions regarding the hardware side of Green IT exist, the software contribution to Green IT is not well investigated. The carbon footprint is one way to rate the environmental impacts of ICT. In order to get an impression of the induced CO₂ emissions of software, we will present a calculation method for the carbon footprint of a software product over its life cycle. We also offer an approach on how to integrate some aspects of carbon footprint calculation into software development processes and discuss impacts and tools regarding this calculation method. We thus show the relevance of energy measurements and the attention to impacts on the carbon footprint by software within Green Software Engineering.
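
    As a rough illustration of the life-cycle accounting idea described above (not the authors' actual calculation method), the following Python sketch sums assumed energy consumption for hypothetical development, distribution, usage, and end-of-life phases and converts the total to CO₂-equivalent emissions using an assumed grid emission factor; every number in it is a placeholder.

```python
# Illustrative life-cycle carbon footprint estimate for a software product.
# All phase energies and the emission factor are hypothetical placeholders,
# not values from the paper.

EMISSION_FACTOR_KG_PER_KWH = 0.4  # assumed grid carbon intensity

# Hypothetical energy consumption per life-cycle phase, in kWh
phase_energy_kwh = {
    "development": 1200.0,   # developer workstations, CI servers, test runs
    "distribution": 150.0,   # download and hosting infrastructure
    "usage": 8500.0,         # aggregated end-user devices over product lifetime
    "end_of_life": 30.0,     # data migration, decommissioning
}

def carbon_footprint_kg(energy_by_phase, factor=EMISSION_FACTOR_KG_PER_KWH):
    """Return total and per-phase CO2-equivalent emissions in kilograms."""
    per_phase = {phase: kwh * factor for phase, kwh in energy_by_phase.items()}
    return sum(per_phase.values()), per_phase

if __name__ == "__main__":
    total, breakdown = carbon_footprint_kg(phase_energy_kwh)
    for phase, kg in breakdown.items():
        print(f"{phase:>13}: {kg:8.1f} kg CO2e")
    print(f"{'total':>13}: {total:8.1f} kg CO2e")
```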

  5. Hypermedia for Training: A Software and Instructional Engineering Model.

    ERIC Educational Resources Information Center

    Cortinovis, Renato

    1992-01-01

    Presents an engineering environment oriented to the development of hypermedia applications for training. Training trends are described; requirements for a computer-based training (CBT) strategy are outlined; a hypermedia course structure is examined; microinstructional events (MIEs) are explained; and technology requirements and selection are…

  6. Engineering Play: A Cultural History of Children's Software

    ERIC Educational Resources Information Center

    Ito, Mizuko

    2009-01-01

    Today, computers are part of kids' everyday lives, used both for play and for learning. We envy children's natural affinity for computers, the ease with which they click in and out of digital worlds. Thirty years ago, however, the computer belonged almost exclusively to business, the military, and academia. In "Engineering Play," Mizuko Ito…

  7. Quality Assurance and Accreditation of Engineering Education in Jordan

    ERIC Educational Resources Information Center

    Aqlan, Faisal; Al-Araidah, Omar; Al-Hawari, Tarek

    2010-01-01

    This paper provides a study of the quality assurance and accreditation in the Jordanian higher education sector and focuses mainly on engineering education. It presents engineering education, accreditation and quality assurance in Jordan and considers the Jordan University of Science and Technology (JUST) for a case study. The study highlights the…

  8. Software engineering and Ada (Trademark) training: An implementation model for NASA

    NASA Technical Reports Server (NTRS)

    Legrand, Sue; Freedman, Glenn

    1988-01-01

    The choice of Ada for software engineering for projects such as the Space Station has resulted in government and industrial groups considering training programs that help workers become familiar with both a software culture and the intricacies of a new computer language. The questions of how much time it takes to learn software engineering with Ada, how much an organization should invest in such training, and how the training should be structured are considered. Software engineering is an emerging, dynamic discipline. It is defined by the author as the establishment and application of sound engineering environments, tools, methods, models, principles, and concepts combined with appropriate standards, guidelines, and practices to support computing which is correct, modifiable, reliable and safe, efficient, and understandable throughout the life cycle of the application. Neither the training programs needed, nor the content of such programs, have been well established. This study addresses the requirements for training for NASA personnel and recommends an implementation plan. A curriculum and a means of delivery are recommended. It is further suggested that a knowledgeable programmer may be able to learn Ada in 5 days, but that it takes 6 to 9 months to evolve into a software engineer who uses the language correctly and effectively. The curriculum and implementation plan can be adapted for each NASA Center according to the needs dictated by each project.

  9. Engine structures analysis software: Component Specific Modeling (COSMO)

    NASA Astrophysics Data System (ADS)

    McKnight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-08-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  10. Engine Structures Analysis Software: Component Specific Modeling (COSMO)

    NASA Technical Reports Server (NTRS)

    Mcknight, R. L.; Maffeo, R. J.; Schwartz, S.

    1994-01-01

    A component specific modeling software program has been developed for propulsion systems. This expert program is capable of formulating the component geometry as finite element meshes for structural analysis which, in the future, can be spun off as NURB geometry for manufacturing. COSMO currently has geometry recipes for combustors, turbine blades, vanes, and disks. Component geometry recipes for nozzles, inlets, frames, shafts, and ducts are being added. COSMO uses component recipes that work through neutral files with the Technology Benefit Estimator (T/BEST) program which provides the necessary base parameters and loadings. This report contains the users manual for combustors, turbine blades, vanes, and disks.

  11. Applying neural networks as software sensors for enzyme engineering.

    PubMed

    Linko, S; Zhu, Y H; Linko, P

    1999-04-01

    The on-line control of enzyme-production processes is difficult, owing to the uncertainties typical of biological systems and to the lack of suitable on-line sensors for key process variables. For example, intelligent methods to predict the end point of fermentation could be of great economic value. Computer-assisted control based on artificial-neural-network models offers a novel solution in such situations. Well-trained feedforward-backpropagation neural networks can be used as software sensors in enzyme-process control; their performance can be affected by a number of factors. PMID:10203774
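
    To make the "software sensor" idea concrete, the sketch below trains a small one-hidden-layer feedforward network with plain backpropagation (NumPy only) to predict a process variable from simulated on-line measurements; the synthetic data, network size, and learning rate are illustrative assumptions, not the authors' models.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "on-line measurements" (e.g., pH, temperature, dissolved O2) and a
# target process variable (e.g., enzyme activity); purely illustrative data.
X = rng.uniform(0.0, 1.0, size=(200, 3))
y = (0.6 * X[:, 0] + 0.3 * np.sin(3 * X[:, 1]) + 0.1 * X[:, 2]).reshape(-1, 1)

# One-hidden-layer feedforward network trained with plain backpropagation.
n_in, n_hidden, n_out = 3, 8, 1
W1 = rng.normal(0, 0.5, (n_in, n_hidden)); b1 = np.zeros(n_hidden)
W2 = rng.normal(0, 0.5, (n_hidden, n_out)); b2 = np.zeros(n_out)

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for epoch in range(5000):
    # forward pass
    h = sigmoid(X @ W1 + b1)
    y_hat = h @ W2 + b2            # linear output for regression
    err = y_hat - y
    # backward pass (mean squared error loss)
    grad_W2 = h.T @ err / len(X)
    grad_b2 = err.mean(axis=0)
    dh = (err @ W2.T) * h * (1 - h)
    grad_W1 = X.T @ dh / len(X)
    grad_b1 = dh.mean(axis=0)
    # gradient descent update
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

mse = float(np.mean((sigmoid(X @ W1 + b1) @ W2 + b2 - y) ** 2))
print(f"training MSE after 5000 epochs: {mse:.5f}")
```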

  12. Observer versus software engineer: The dawn of the armchair astronomers

    NASA Astrophysics Data System (ADS)

    Economou, F.; Jenness, T.; Delorey, K. K.; Cavanagh, B.; de Witt, S.; Allan, A.

    2004-07-01

    Traditional ground-based observatories have moved into a new era where the dominant consumers for their data products are astronomers who no longer come out to use the facility themselves. Using the JCMT and UKIRT as an example, we discuss the wide variety of software systems that are necessary for maintaining a high level of scientific involvement for the absentee observer. These include cradle-to-grave flexible scheduling support, advanced data reduction pipelines, VO integration and intelligent agent networks. Is the observational astronomer becoming an endangered species, and does this really matter?

  13. Civil Engineering: Improving the Quality of Life.

    ERIC Educational Resources Information Center

    One Feather, Sandra

    2002-01-01

    American Indian civil engineers describe the educational paths that led them to their engineering careers, applications of civil engineering in reservation communities, necessary job skills, opportunities afforded by internship programs, continuing education, and the importance of early preparation in math and science. Addresses of 12 resource Web…

  14. A Methodology to Evaluate Agent Oriented Software Engineering Techniques

    SciTech Connect

    Lin, Chia-En; Kavi, Krishna M.; Sheldon, Frederick T; Daley, Kristopher M; Abercrombie, Robert K

    2007-01-01

    Systems using software agents (or multi-agent systems, MAS) are becoming more popular within the development mainstream because, as the name suggests, an agent aims to handle tasks autonomously with intelligence. To benefit from autonomous control and reduced running costs, system functions are performed automatically. Agent-oriented considerations are being steadily accepted into the various software design paradigms. Agents may work alone, but most commonly, they cooperate toward achieving some application goal(s). MASs are components of systems that can be viewed as many individuals living in a society and working together. From an SE perspective, solving a problem should encompass problem realization, requirements analysis, architecture design, and implementation. These steps should be implemented within a life-cycle process that includes testing, verification, and reengineering to prove that the built system is sound. In this paper, we explore the various applications of agent-based systems categorized into different application domains. A baseline is developed herein to help us focus on the core of agent concepts throughout the comparative study and to investigate both the object-oriented and agent-oriented techniques that are available for constructing agent-based systems. In each respect, we address the conceptual background associated with these methodologies and how available tools can be applied within specific domains.

  15. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2003-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.
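
    The abstract's central modeling assumption, that labor hours depend mainly on thrust level through third- and fourth-order polynomials, can be illustrated with a short Python sketch; the coefficients, thrust values, and cost split below are hypothetical and are not the calibrated CEM polynomials.

```python
import numpy as np

# Hypothetical coefficients (highest power first) relating engine thrust level
# (in klbf) to estimated labor hours, in the spirit of the CEM's third- and
# fourth-order polynomial fits.  These are NOT the real CEM coefficients.
setup_labor_poly = np.poly1d([2.0e-4, -0.05, 6.0, 250.0])             # 3rd order
test_ops_labor_poly = np.poly1d([1.0e-6, -3.0e-4, 0.04, 8.0, 400.0])  # 4th order

def estimate_labor_hours(thrust_klbf, n_tests):
    """Crude labor estimate: one-time setup plus per-test operations labor."""
    setup = setup_labor_poly(thrust_klbf)
    per_test = test_ops_labor_poly(thrust_klbf)
    return setup + n_tests * per_test

for thrust in (50, 200, 500):
    hours = estimate_labor_hours(thrust, n_tests=10)
    print(f"thrust {thrust:>4} klbf, 10 tests -> ~{hours:,.0f} labor hours")
```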

  16. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merlon M.

    2004-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  17. Software for Estimating Costs of Testing Rocket Engines

    NASA Technical Reports Server (NTRS)

    Hines, Merion M.

    2002-01-01

    A high-level parametric mathematical model for estimating the costs of testing rocket engines and components at Stennis Space Center has been implemented as a Microsoft Excel program that generates multiple spreadsheets. The model and the program are both denoted, simply, the Cost Estimating Model (CEM). The inputs to the CEM are the parameters that describe particular tests, including test types (component or engine test), numbers and duration of tests, thrust levels, and other parameters. The CEM estimates anticipated total project costs for a specific test. Estimates are broken down into testing categories based on a work-breakdown structure and a cost-element structure. A notable historical assumption incorporated into the CEM is that total labor times depend mainly on thrust levels. As a result of a recent modification of the CEM to increase the accuracy of predicted labor times, the dependence of labor time on thrust level is now embodied in third- and fourth-order polynomials.

  18. Marooned on Mars: Mind-Spinning Books for Software Engineers

    NASA Technical Reports Server (NTRS)

    Clancey, William J.; Swanson, Keith (Technical Monitor)

    1999-01-01

    Dragonfly - NASA and the Crisis Aboard MIR (New York: HarperCollins Publishers), the story of the Russian-American misadventures on MIR. An expose with almost embarrassing detail about the inner-workings of Johnson Space Center in Houston, this book is best read with the JSC organization chart in hand. Here's the real world of engineering and life in extreme environments. It makes most other accounts of "requirements analysis" appear glib and simplistic. The book vividly portrays the sometimes harrowing experiences of the American astronauts in the web of Russian interpersonal relations and literally in the web of MIR's wiring. Burrough's exposition reveals how handling bureaucratic procedures and bulky facilities is as much a matter of moxie and goodwill as technical capability. Lessons from MIR showed NASA that getting to Mars required a different view of knowledge and improvisation-long-duration missions are not at all like the scripted and pre-engineered flights of Apollo or the Space Shuttle.

  19. Product Engineering Class in the Software Safety Risk Taxonomy for Building Safety-Critical Systems

    NASA Technical Reports Server (NTRS)

    Hill, Janice; Victor, Daniel

    2008-01-01

    When software safety requirements are imposed on legacy safety-critical systems, retrospective safety cases need to be formulated as part of recertifying the systems for further use, and risks must be documented and managed to give confidence for reusing the systems. The SEI Software Development Risk Taxonomy [4] focuses on general software development issues. It does not, however, cover all the safety risks. The Software Safety Risk Taxonomy [8] was developed to provide a construct for eliciting and categorizing software safety risks in a straightforward manner. In this paper, we present extended work on the taxonomy for safety that incorporates the additional issues inherent in the development and maintenance of safety-critical systems with software. An instrument called a Software Safety Risk Taxonomy Based Questionnaire (TBQ) is generated, containing questions addressing each safety attribute in the Software Safety Risk Taxonomy. Software safety risks are surfaced using the new TBQ and then analyzed. In this paper we give the definitions for the specialized Product Engineering Class within the Software Safety Risk Taxonomy. At the end of the paper, we present the tool known as the 'Legacy Systems Risk Database Tool' that is used to collect and analyze the data required to show traceability to a particular safety standard.

  20. Guidance and Control Software Project Data - Volume 4: Configuration Management and Quality Assurance Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes configuration management and quality assurance documents from the GCS project. Volume 4 contains six appendices: A. Software Accomplishment Summary for the Guidance and Control Software Project; B. Software Configuration Index for the Guidance and Control Software Project; C. Configuration Management Records for the Guidance and Control Software Project; D. Software Quality Assurance Records for the Guidance and Control Software Project; E. Problem Report for the Pluto Implementation of the Guidance and Control Software Project; and F. Support Documentation Change Reports for the Guidance and Control Software Project.

  1. An infrastructure for the creation of high end scientific and engineering software tools and applications

    SciTech Connect

    Drummond, L.A.; Marques, O.A.; Wilson, G.V.

    2003-04-01

    This document has been prepared as a response to the High End Computing Revitalization Task Force (HECRTF) call for white papers. Our main goal is to identify the mechanisms necessary for the design and implementation of an infrastructure to support the development of high-end scientific and engineering software tools and applications. This infrastructure will provide a plethora of software services to facilitate the efficient deployment of future HEC technology as well as collaborations among researchers and engineers across disciplines and institutions. In particular, we address the following points: the key software technologies that must be advanced to strengthen the foundation for developing new generations of HEC systems, and a software infrastructure for minimizing "time to solution" by users of HEC systems.

  2. Ethical education in software engineering: responsibility in the production of complex systems.

    PubMed

    Génova, Gonzalo; González, M Rosario; Fraga, Anabel

    2007-12-01

    Among the various contemporary schools of moral thinking, consequence-based ethics, as opposed to rule-based, seems to have a good acceptance among professionals such as software engineers. But naïve consequentialism is intellectually too weak to serve as a practical guide in the profession. Besides, the complexity of software systems makes it very hard to know in advance the consequences that will derive from professional activities in the production of software. Therefore, following the spirit of well-known codes of ethics such as the ACM/IEEE's, we advocate for a more solid position in the ethical education of software engineers, which we call 'moderate deontologism', that takes into account both rules and consequences to assess the goodness of actions, and at the same time pays an adequate consideration to the absolute values of human dignity. In order to educate responsible professionals, however, this position should be complemented with a pedagogical approach to virtue ethics. PMID:18066681

  3. ISO and software quality assurance - licensing and certification of software professionals

    SciTech Connect

    Hare, J.; Rodin, L.

    1997-11-01

    This report contains viewgraphs on the licensing and certification of software professionals. Discussed in this report are: certification programs; licensing programs; why become certified; certification as a condition of employment; certification requirements; and examination structures.

  4. Database Access Manager for the Software Engineering Laboratory (DAMSEL) user's guide

    NASA Technical Reports Server (NTRS)

    1990-01-01

    Operating instructions for the Database Access Manager for the Software Engineering Laboratory (DAMSEL) system are presented. Step-by-step instructions for performing various data entry and report generation activities are included. Sample sessions showing the user interface display screens are also included. Instructions for generating reports are accompanied by sample outputs for each of the reports. The document groups the available software functions by the classes of users that may access them.

  5. The concept of a flexible system for use in software engineering

    NASA Astrophysics Data System (ADS)

    Adamek, Marek; Mulawka, Jan

    2014-11-01

    This paper concerns one of the fundamental issues of software engineering. We consider the concept of software metrics based on attributes describing each line of a computer program's source code. Furthermore, we present a design for a flexible, plugin-based system that allows users to define and use such metrics. To show the practical utility of the solution, we implemented a basic version of the system and conducted an experimental verification. Some of the test results and final conclusions are also provided.
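
    A minimal sketch of the line-attribute idea (not the authors' plugin-based system): each source line is tagged with a few attributes, which are then aggregated into simple file-level metrics. The attribute set chosen here is an assumption for illustration.

```python
# Minimal sketch of a line-attribute metric: classify each source line, then
# aggregate the per-line attributes into file-level metrics.  The attribute
# set is illustrative, not the one defined in the paper.

def line_attributes(line: str) -> dict:
    stripped = line.strip()
    return {
        "blank": not stripped,
        "comment": stripped.startswith("#"),
        "code": bool(stripped) and not stripped.startswith("#"),
        "indent_depth": (len(line) - len(line.lstrip(" "))) // 4,
    }

def file_metrics(source: str) -> dict:
    attrs = [line_attributes(l) for l in source.splitlines()]
    code_lines = [a for a in attrs if a["code"]]
    comment_count = sum(a["comment"] for a in attrs)
    return {
        "total_lines": len(attrs),
        "code_lines": len(code_lines),
        "comment_lines": comment_count,
        "comment_ratio": comment_count / max(len(attrs), 1),
        "max_nesting": max((a["indent_depth"] for a in code_lines), default=0),
    }

sample = "def f(x):\n    # double the input\n    return 2 * x\n"
print(file_metrics(sample))
```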

  6. An Application of Intelligent Data Analysis Techniques to a Large Software Engineering Dataset

    NASA Astrophysics Data System (ADS)

    Cain, James; Counsell, Steve; Swift, Stephen; Tucker, Allan

    Within the development of large software systems, there is significant value in being able to predict changes. If we can predict the likely changes that a system will undergo, then we can estimate likely developer effort and allocate resources appropriately. Within object-oriented software development, these changes are often identified as refactorings. Very few studies have explored the prediction of refactorings on a wide scale. Within this paper we aim to do just this, by applying intelligent data analysis techniques to a uniquely large and comprehensive software engineering time series dataset. Our analysis shows extremely promising results, allowing us to predict the occurrence of future large changes.

  7. Enhancement/upgrade of Engine Structures Technology Best Estimator (EST/BEST) Software System

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin

    2003-01-01

    This report describes the work performed during the contract period and the capabilities included in the EST/BEST software system. The developed EST/BEST software system includes the integrated NESSUS, IPACS, COBSTRAN, and ALCCA computer codes required to perform the engine cycle mission and component structural analysis. Also, an interactive input generator for the NESSUS, IPACS, and COBSTRAN computer codes has been developed and integrated with the EST/BEST software system. The input generator allows the user to create input from scratch as well as edit existing input files interactively. Since it has been integrated with the EST/BEST software system, it enables the user to modify EST/BEST-generated files and perform the analysis to evaluate the benefits. Appendix A gives details of how to use the newly added features in the EST/BEST software system.

  8. The Role and Quality of Software Safety in the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found a number of traceability risks in the hazard reports that may impede verification of software and system safety.
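
    The counting scheme behind these percentages can be sketched in a few lines of Python; the hazard reports below are invented toy data, whereas the real study examined 154 Constellation hazard reports.

```python
# Hypothetical hazard-report data illustrating the counting scheme in the
# abstract; each report lists its causes and controls, flagged as
# software-related or not.
hazard_reports = [
    {"causes": [{"software": True}, {"software": False}],
     "controls": [{"software": True}, {"software": False}, {"software": False}]},
    {"causes": [{"software": False}],
     "controls": [{"software": False}]},
    {"causes": [{"software": True}, {"software": True}],
     "controls": [{"software": True}]},
]

causes = [c for r in hazard_reports for c in r["causes"]]
controls = [c for r in hazard_reports for c in r["controls"]]
sw_involved = [r for r in hazard_reports
               if any(c["software"] for c in r["causes"] + r["controls"])]

print(f"hazards with software involvement: "
      f"{100 * len(sw_involved) / len(hazard_reports):.0f}%")
print(f"software-related causes: "
      f"{100 * sum(c['software'] for c in causes) / len(causes):.0f}%")
print(f"software-based controls: "
      f"{100 * sum(c['software'] for c in controls) / len(controls):.0f}%")
```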

  9. Training Initiative for Embedded Software Engineers through Collaborative Research Project and Open Educational Course

    NASA Astrophysics Data System (ADS)

    Ishida, Rieko; Yamamoto, Masaki; Unagami, Tomoaki; Mori, Takao; Hond, Shinya; Ichiba, Toshiyuki; Takase, Hideki; Takada, Hiroaki

    The authors developed two types of human resource development programs, consisting of a) a one-year collaborative research project and b) an open education course, aimed at training embedded software engineers to high technical standards. In the collaborative-research-oriented approach, the authors train engineers through a research project at Nagoya University. In the open education course, the authors reflect and adopt the latest outcomes of the aforementioned research project in one- or two-day open education courses to educate engineers. This allowed the authors to design and provide several courses aimed at training engineers with the latest contents, reflecting the rapid technical advances in the embedded software industry. The courses were highly evaluated by the participants.

  10. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  11. The Need for V&V in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    V&V is currently performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  12. Development and application of software packages in welding engineering

    NASA Astrophysics Data System (ADS)

    Wei, Yan-Hong; Zhan, Xiao-Hong; Dong, Zhi-Bo

    2011-06-01

    In this paper, the weldability of selected materials is analyzed with a simple calculating program, and weldability testing data are shared through a database system. The welding procedures are designed with the help of expert systems, and the knowledge is shared among welding engineers. Database systems are used not only to prepare the welding documents but also to make the complex decision on whether a qualification test is necessary according to the existing procedure qualification records (PQRs) and manufacturing codes. Moreover, the artificial neural network (ANN) technique is proven to be one of the effective ways to predict the mechanical properties of welded joints when there are enough test data to train the models. Finally, the achievements in modeling the microstructure of welded joints are introduced, especially for solid-state transformation and grain growth in both the heat-affected zone (HAZ) and the welded molten pool.

  13. Environmental concept for engineering software on MIMD computers

    NASA Technical Reports Server (NTRS)

    Lopez, L. A.; Valimohamed, K.

    1989-01-01

    The issues related to developing an environment in which engineering systems can be implemented on MIMD machines are discussed. The problem is presented in terms of implementing the finite element method under such an environment. However, neither the concepts nor the prototype implementation environment are limited to this application. The topics discussed include: the ability to schedule and synchronize tasks efficiently; granularity of tasks; load balancing; and the use of a high level language to specify parallel constructs, manage data, and achieve portability. The objective of developing a virtual machine concept which incorporates solutions to the above issues leads to a design that can be mapped onto loosely coupled, tightly coupled, and hybrid systems.

  14. Software to model AXAF-I image quality

    NASA Technical Reports Server (NTRS)

    Ahmad, Anees; Feng, Chen

    1995-01-01

    A modular user-friendly computer program for the modeling of grazing-incidence type x-ray optical systems has been developed. This comprehensive computer software GRAZTRACE covers the manipulation of input data, ray tracing with reflectivity and surface deformation effects, convolution with x-ray source shape, and x-ray scattering. The program also includes the capabilities for image analysis, detector scan modeling, and graphical presentation of the results. A number of utilities have been developed to interface the predicted Advanced X-ray Astrophysics Facility-Imaging (AXAF-I) mirror structural and thermal distortions with the ray-trace. This software is written in FORTRAN 77 and runs on a SUN/SPARC station. An interactive command mode version and a batch mode version of the software have been developed.

  15. The development of an Ada programming support environment database: SEAD (Software Engineering and Ada Database), user's manual

    NASA Technical Reports Server (NTRS)

    Liaw, Morris; Evesson, Donna

    1988-01-01

    This is a manual for users of the Software Engineering and Ada Database (SEAD). SEAD was developed to provide an information resource to NASA and NASA contractors with respect to Ada-based resources and activities that are available or underway either in NASA or elsewhere in the worldwide Ada community. The sharing of such information will reduce the duplication of effort while improving quality in the development of future software systems. The manual describes the organization of the data in SEAD, the user interface from logging in to logging out, and concludes with a ten chapter tutorial on how to use the information in SEAD. Two appendices provide quick reference for logging into SEAD and using the keyboard of an IBM 3270 or VT100 computer terminal.

  16. Development of innovative computer software to facilitate the setup and computation of water quality index

    PubMed Central

    2013-01-01

    Background: Developing a water quality index which is used to convert the water quality dataset into a single number is the most important task of most water quality monitoring programmes. As the water quality index setup is based on different local obstacles, it is not feasible to introduce a definite water quality index to reveal the water quality level. Findings: In this study, an innovative software application, the Iranian Water Quality Index Software (IWQIS), is presented in order to facilitate calculation of a water quality index based on dynamic weight factors, which will help users to compute the water quality index in cases where some parameters are missing from the datasets. Conclusion: A dataset containing 735 water samples of drinking water quality in different parts of the country was used to show the performance of this software using different criteria parameters. The software proved to be an efficient tool to facilitate the setup of water quality indices based on flexible use of variables and water quality databases. PMID:24499556
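
    The "dynamic weight factors" idea, computing an index even when some parameters are missing from a sample, can be sketched as a weighted average whose weights are renormalized over the available parameters. This is an illustrative scheme with placeholder weights and sub-index values, not the actual IWQIS algorithm.

```python
# Illustrative water quality index with dynamic weights: when a parameter is
# missing from a sample, the remaining weights are renormalized so they still
# sum to 1.  Weights and sub-index values are placeholders, not IWQIS values.

BASE_WEIGHTS = {"pH": 0.2, "turbidity": 0.15, "nitrate": 0.25,
                "coliform": 0.3, "hardness": 0.1}

def water_quality_index(sub_indices: dict) -> float:
    """sub_indices maps parameter name -> 0..100 sub-index; missing keys allowed."""
    available = {p: w for p, w in BASE_WEIGHTS.items() if p in sub_indices}
    total_w = sum(available.values())
    if total_w == 0:
        raise ValueError("no known parameters supplied")
    return sum(sub_indices[p] * w / total_w for p, w in available.items())

complete = {"pH": 85, "turbidity": 70, "nitrate": 60, "coliform": 90, "hardness": 75}
missing_two = {"pH": 85, "nitrate": 60, "coliform": 90}   # turbidity, hardness absent

print(f"WQI (all parameters): {water_quality_index(complete):.1f}")
print(f"WQI (two missing):    {water_quality_index(missing_two):.1f}")
```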

  17. DEVELOPMENT OF A WINDOWS-BASED INDOOR AIR QUALITY SIMULATION SOFTWARE PACKAGE

    EPA Science Inventory

    A Microsoft Windows-based indoor air quality (IAQ) simulation software package has been developed and is currently undergoing small-scale beta test and quality assurance review. Tentatively named Simulation Tool Kit for Indoor Air Quality and Inhalation Exposure, or STKi for sho...

  18. Neural networks as "software sensors" in enzyme engineering.

    PubMed

    Linko, S; Zhu, Y H; Linko, P

    1998-12-13

    Industrial applications of enzyme technology are rapidly increasing. On-line control of enzyme production processes, however, is difficult owing to the uncertainties typical of biological systems and to the lack of suitable on-line sensors for key process variables and quality attributes. We demonstrate that well-trained feedforward backpropagation neural networks with one hidden layer can be employed to overcome such problems with no need for a priori knowledge of the relationships of the process variables involved. Neural network programs were written in Microsoft Visual C++ for Windows and implemented in a personal computer. The goodness of fit of the trained neural network to the reference data was determined by the coefficient of determination, R2. Case studies of beta-galactosidase, glucoamylase, lipase, and xylanase production processes will be used as examples. PMID:10075639

  19. Mapping modern software process engineering techniques onto an HEP development environment

    NASA Astrophysics Data System (ADS)

    Wellisch, J. P.

    2003-04-01

    One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means, in our context, to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment, with the goal of facilitating great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has now been progressing for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of the current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned and mistakes made, and demonstrate the benefits gained and the current status of the software processes established in CMS off-line software.

  20. A Framework for Evaluating the Software Product Quality of Pregnancy Monitoring Mobile Personal Health Records.

    PubMed

    Idri, Ali; Bachiri, Mariam; Fernández-Alemán, José Luis

    2016-03-01

    Stakeholders' needs and expectations are identified by means of software quality requirements, which have an impact on software product quality. In this paper, we present a set of requirements for mobile personal health records (mPHRs) for pregnancy monitoring, which have been extracted from literature and existing mobile apps on the market. We also use the ISO/IEC 25030 standard to suggest the requirements that should be considered during the quality evaluation of these mPHRs. We then go on to design a checklist in which we contrast the mPHRs for pregnancy monitoring requirements with software product quality characteristics and sub-characteristics in order to calculate the impact of these requirements on software product quality, using the ISO/IEC 25010 software product quality standard. The results obtained show that the requirements related to the user's actions and the app's features have the most impact on the external sub-characteristics of the software product quality model. The only sub-characteristic affected by all the requirements is Appropriateness of Functional suitability. The characteristic Operability is affected by 95% of the requirements while the lowest degrees of impact were identified for Compatibility (15%) and Transferability (6%). Lastly, the degrees of the impact of the mPHRs for pregnancy monitoring requirements are discussed in order to provide appropriate recommendations for the developers and stakeholders of mPHRs for pregnancy monitoring. PMID:26643080
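
    The impact calculation described above can be pictured as a boolean checklist of requirements versus ISO/IEC 25010 characteristics, from which the percentage of requirements affecting each characteristic is derived. The requirement names and markings in the sketch below are invented for illustration and are not the paper's data.

```python
# Toy requirements-vs-quality-characteristics checklist in the spirit of the
# paper's evaluation; the markings are invented, not the paper's results.
CHARACTERISTICS = ["Functional suitability", "Operability",
                   "Compatibility", "Transferability"]

# requirement -> set of ISO/IEC 25010 characteristics it impacts
checklist = {
    "record weight gain":        {"Functional suitability", "Operability"},
    "set appointment reminders": {"Functional suitability", "Operability"},
    "export data to clinician":  {"Functional suitability", "Compatibility"},
    "run on several platforms":  {"Functional suitability", "Transferability"},
}

n_req = len(checklist)
for ch in CHARACTERISTICS:
    affected = sum(ch in impacts for impacts in checklist.values())
    print(f"{ch:<25} impacted by {100 * affected / n_req:5.1f}% of requirements")
```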

  1. Sound quality prediction for engine-radiated noise

    NASA Astrophysics Data System (ADS)

    Liu, Hai; Zhang, Junhong; Guo, Peng; Bi, Fengrong; Yu, Hanzhengnan; Ni, Guangjian

    2015-05-01

    Diesel engine-radiated noise quality prediction is an important topic because engine noise has a significant impact on the overall vehicle noise. Sound quality prediction is based on subjective and objective evaluation of engine noise. The integrated satisfaction index (ISI) is proposed as a criterion for differentiating noise quality in the subjective evaluation, and five psychoacoustic parameters are selected for characterizing and analyzing the noise quality of the diesel engine objectively. The combination of support vector machines (SVM) and a genetic algorithm (GA) is proposed in order to establish a model for predicting the diesel engine-radiated noise quality under all operating conditions. The performance of the GA-SVM model is compared with the BP neural network model, and the results show that the mean relative error of the GA-SVM model is smaller than that of the BP neural network model. The importance rank of the sound quality metrics to the ISI is indicated by the non-parametric correlation analysis. This study suggests that the GA-SVM model is very useful for accurately predicting the diesel engine-radiated noise quality.
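
    A simplified Python sketch of the GA-SVM idea (a selection-and-mutation GA only, synthetic data, scikit-learn's SVR): a tiny genetic algorithm searches the SVR hyperparameters C and gamma to minimize the mean relative prediction error. The data, population size, and GA details are assumptions for illustration, not the authors' model.

```python
import numpy as np
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(1)

# Synthetic stand-in for the paper's data: five psychoacoustic parameters per
# operating condition and a subjective integrated satisfaction index (ISI).
X = rng.uniform(0, 1, size=(120, 5))
y = 3 + 2 * X[:, 0] - 1.5 * X[:, 1] + np.sin(4 * X[:, 2]) + 0.1 * rng.normal(size=120)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

def mean_relative_error(c, gamma):
    model = SVR(C=c, gamma=gamma).fit(X_tr, y_tr)
    pred = model.predict(X_te)
    return float(np.mean(np.abs(pred - y_te) / np.abs(y_te)))

# Very small genetic algorithm over (log10 C, log10 gamma): keep the best half
# of the population each generation and produce mutated children from it.
pop = rng.uniform([-1, -3], [3, 1], size=(12, 2))        # initial population
for generation in range(15):
    fitness = np.array([mean_relative_error(10**c, 10**g) for c, g in pop])
    parents = pop[np.argsort(fitness)[:6]]                # selection
    children = parents[rng.integers(0, 6, 6)] + rng.normal(0, 0.2, (6, 2))  # mutation
    pop = np.vstack([parents, children])

best_c, best_g = pop[np.argmin([mean_relative_error(10**c, 10**g) for c, g in pop])]
print(f"best C = {10**best_c:.3g}, gamma = {10**best_g:.3g}, "
      f"MRE = {mean_relative_error(10**best_c, 10**best_g):.3f}")
```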

  2. The software-cycle model for re-engineering and reuse

    NASA Technical Reports Server (NTRS)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  3. A Constrained and Guided Approach for Managing Software Engineering Course Projects

    ERIC Educational Resources Information Center

    Cheng, Y.-P.; Lin, J. M.-C.

    2010-01-01

    This paper documents several years of experimentation with a new approach to organizing and managing projects in a software engineering course. The initial failure and subsequent refinements that the new approach has been through since 2004 are described herein. The "constrained and guided" approach, as it is called, has helped to reduce project…

  4. Software Engineering as Seen through Its Research Literature: A Study in Co-Word Analysis.

    ERIC Educational Resources Information Center

    Coulter, Neal; Monarch, Ira; Konda, Suresh

    1998-01-01

    This empirical research demonstrates the effectiveness of content analysis to map the research literature of the software engineering discipline. Co-word analysis, which is related to cocitation analysis, is used to identify associations among indexing terms from the ACM (Association for Computing Machinery) Computing Classification System and to…

  5. Facilitating Constructive Alignment in Power Systems Engineering Education Using Free and Open-Source Software

    ERIC Educational Resources Information Center

    Vanfretti, L.; Milano, F.

    2012-01-01

    This paper describes how the use of free and open-source software (FOSS) can facilitate the application of constructive alignment theory in power systems engineering education by enabling the deep learning approach in power system analysis courses. With this aim, this paper describes the authors' approach in using the Power System Analysis Toolbox…

  6. Object-oriented Information System in the HELIOS Medical Software Engineering Environment.

    PubMed

    Jean, F C; Thelliez, T; Mascart, J J; Degoulet, P

    1992-01-01

    This paper presents the architecture of the Information System of HELIOS, a medical Software Engineering Environment. It is an object oriented framework for the development of medical applications which puts particular emphasis on tools and techniques favouring reuse of previous work and enhancing collaboration between developers. PMID:1482942

  7. Similarities and Differences in the Academic Education of Software Engineering and Architectural Design Professionals

    ERIC Educational Resources Information Center

    Hazzan, Orit; Karni, Eyal

    2006-01-01

    This article focuses on the similarities and differences in the academic education of software engineers and architects. The rationale for this work stems from our observation, each from the perspective of her or his own discipline, that these two professional design and development processes share some similarities. A pilot study was performed,…

  8. Selective Guide to Literature on Software Review Sources. Engineering Literature Guides, Number 8.

    ERIC Educational Resources Information Center

    Bean, Margaret H., Ed.

    This selective literature guide serves as a directory to software evaluation sources for all sizes of microcomputers. Information is provided on review sources and guides which deal with a variety of applications such as library, engineering, school, and business as well as a variety of systems, including DOS and CP/M. This document is intended to…

  9. Software Engineering Laboratory (SEL). Data base organization and user's guide, revision 1

    NASA Technical Reports Server (NTRS)

    Lo, P. S.; Wyckoff, D.; Page, J.; Mcgarry, F. E.

    1983-01-01

    The structure of the Software Engineering Laboratory (SEL) data base is described. It defines each data base file in detail and provides programmers and other users with information about how to access and use the data. Several data base reporting programs are also described.

  10. The Design and Evaluation of a Cryptography Teaching Strategy for Software Engineering Students

    ERIC Educational Resources Information Center

    Dowling, T.

    2006-01-01

    The present paper describes the design, implementation and evaluation of a cryptography module for final-year software engineering students. The emphasis is on implementation architectures and practical cryptanalysis rather than a standard mathematical approach. The competitive continuous assessment process reflects this approach and rewards…

  11. BIRP: Software for interactive search and retrieval of image engineering data

    NASA Technical Reports Server (NTRS)

    Arvidson, R. E.; Bolef, L. K.; Guinness, E. A.; Norberg, P.

    1980-01-01

    Better Image Retrieval Programs (BIRP), a set of programs for interactively sorting through and displaying a database, such as engineering data for images acquired by spacecraft, is described. An overview of the philosophy of BIRP design, the structure of BIRP data files, and examples that illustrate the capabilities of the software are provided.

  12. Teaching Software Engineering by Means of Computer-Game Development: Challenges and Opportunities

    ERIC Educational Resources Information Center

    Cagiltay, Nergiz Ercil

    2007-01-01

    Software-engineering education programs are intended to prepare students for a field that involves rapidly changing conditions and expectations. Thus, there is always a danger that the skills and the knowledge provided may soon become obsolete. This paper describes results and draws on experiences from the implementation of a computer…

  13. 3D Game-Based Learning System for Improving Learning Achievement in Software Engineering Curriculum

    ERIC Educational Resources Information Center

    Su, Chung-Ho; Cheng, Ching-Hsue

    2013-01-01

    The advancement of game-based learning has encouraged many related studies on how students can better learn course material through 3-dimensional virtual reality. To enhance software engineering learning, this paper develops a 3D game-based learning system to assist teaching and assess the students' motivation, satisfaction and learning…

  14. The software engineering journey: From a naive past into a responsible future

    SciTech Connect

    Chapa, S.K.

    1997-08-01

    All engineering fields experience growth, from early trial & error approaches, to disciplined approaches based on fundamental understanding. The field of software engineering is making the long and arduous journey, accomplished by evolution of thinking in many dimensions. This paper takes the reader along a trio of simultaneous evolutionary paths. First, the reader experiences evolution from a zero-risk mindset to a managed-risk mindset. Along this path, the reader observes three generations of security risk management and their implications for software system assurance. Next is a growth path from separate surety disciplines to an integrated systems surety approach. On the way, the reader visits safety, security, and dependability disciplines and peers into a future vision which coalesces them. The third and final evolutionary path explored here transitions the software engineering field from best practices to fundamental understanding. Along this road, the reader observes a framework for developing a "science behind the engineering", and methodologies for software surety analysis.

  15. Unisys' experience in software quality and productivity management of an existing system

    NASA Technical Reports Server (NTRS)

    Munson, John B.

    1988-01-01

    A summary of Quality Improvement techniques, implementation, and results in the maintenance, management, and modification of large software systems for the Space Shuttle Program's ground-based systems is provided.

  16. ARROWSMITH-P: A prototype expert system for software engineering management

    NASA Technical Reports Server (NTRS)

    Basili, Victor R.; Ramsey, Connie Loggia

    1985-01-01

    Although the field of software engineering is relatively new, it can benefit from the use of expert systems. Two prototype expert systems were developed to aid in software engineering management. Given the values for certain metrics, these systems will provide interpretations which explain any abnormal patterns of these values during the development of a software project. The two systems, which solve the same problem, were built using different methods, rule-based deduction and frame-based abduction. A comparison was done to see which method was better suited to the needs of this field. It was found that both systems performed moderately well, but the rule-based deduction system using simple rules provided more complete solutions than did the frame-based abduction system.
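
    As a rough illustration of the rule-based deduction style mentioned above, the sketch below encodes a couple of simple rules that map metric values to candidate interpretations. The metrics, thresholds, and rule texts are invented for illustration; they are not the rules used in ARROWSMITH-P.

      # Rule-based deduction sketch: each rule inspects project metrics and,
      # when its condition holds, asserts an interpretation. The metrics,
      # thresholds, and messages are invented, not those of ARROWSMITH-P.

      def rule_low_productivity(m):
          if m["loc_per_hour"] < 5 and m["effort_hours"] > 1000:
              return "Low productivity: effort is high relative to code produced."

      def rule_unstable_requirements(m):
          if m["requirement_changes"] > 20:
              return "Frequent requirement changes may explain abnormal rework."

      RULES = [rule_low_productivity, rule_unstable_requirements]

      def interpret(metrics):
          """Apply every rule and collect the interpretations that fire."""
          return [msg for rule in RULES if (msg := rule(metrics)) is not None]

      print(interpret({"loc_per_hour": 3.2,
                       "effort_hours": 1500,
                       "requirement_changes": 25}))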

  17. Software: our quest for excellence. Honoring 50 years of software history, progress, and process

    SciTech Connect

    1997-06-01

    The Software Quality Forum was established by the Software Quality Assurance (SQA) Subcommittee, which serves as a technical advisory group on software engineering and quality initiatives and issues for DOE's quality managers. The forum serves as an opportunity for all those involved in implementing SQA programs to meet and share ideas and concerns. Participation from managers, quality engineers, and software professionals provides an ideal environment for identifying and discussing issues and concerns. The interaction provided by the forum contributes to the realization of a shared goal--high-quality software products. Topics include: testing, software measurement, software surety, software reliability, SQA practices, assessments, software process improvement, certification and licensing of software professionals, CASE tools, software project management, inspections, and management's role in ensuring SQA. The bulk of this document consists of vugraphs. Selected papers have been indexed separately for inclusion in the Energy Science and Technology Database.

  18. Development of CFD Software To Support the Engineering of Lost Foam Pattern Blowing and Steaming

    SciTech Connect

    Dr. Kenneth A. Williams; Dr. Dale M. Snider

    2003-02-04

    This CFD project has led to a new commercial software package (Arena-flow-eps) for advanced engineering of lost foam pattern formation. Specifically, the new software models all fluid/particle/thermal phenomena during both the bead-blowing and the pattern-fusing cycles--within a single, integrated computational tool. Engineering analysis with Arena-flow-eps will enable foundries to obtain desirable foam pattern characteristics in a reliable, consistent manner, aided by an understanding of the fundamental fluid/thermal physics of the process. This will lead to significant reductions in casting scrap and energy usage, and will enable future castings to satisfy stringent requirements for high power density and low emissions in tomorrow's automotive and watercraft engines.

  19. The Application of V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design, and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  20. Model-based software engineering for an imaging CubeSat and its extrapolation to other missions

    NASA Astrophysics Data System (ADS)

    Mohammad, Atif; Straub, Jeremy; Korvald, Christoffer; Grant, Emanuel

    Small satellites, with their limited computational capabilities, require software engineering techniques that promote efficient use of spacecraft resources. A model-driven approach to software engineering is an excellent solution to this resource maximization challenge, as it facilitates visualization of the key solution processes and data elements.

  1. The Relationship between Job Satisfaction and Intent to Turnover among Software Engineers in the Information Technology Industry

    ERIC Educational Resources Information Center

    Agada, Chuks N.

    2013-01-01

    The focus of this study was to examine the relationship between job satisfaction and intent to turnover among software engineers in the information technology (IT) industry. The population that was analyzed in this study was software engineers in the IT industry to determine whether there is a relationship between job satisfaction and intent to…

  2. Quality Documentation and Records in Engineering Education.

    ERIC Educational Resources Information Center

    Sarin, Sanjiv

    This paper discusses the importance of documentation and records in the administration of an engineering program. The new trends in academic program evaluation for accreditation indicate an ever-increasing role for documented procedures and systematic data collection and reporting to support curriculum reform. The paper also recommends a…

  3. Software engineering in medical informatics: the academic hospital as learning environment.

    PubMed

    Prins, H; Cornet, R; van den Berg, F M; van der Togt, R; Abu-Hanna, A

    2002-01-01

    In 2001, the revised course Software Engineering was implemented in the Medical Informatics curriculum at the Academic Medical Center, Amsterdam. This 13-week, full-time course consists of three parts: internship, theory, and project. All parts are provided in a problem-oriented manner, with special attention to relevant skills such as project management, documentation, and presentation. During the internship, students observe how health care professionals at several hospital wards work and how information supply is organized. In the theory part, students study concepts and methods of software engineering by means of case descriptions and self-directed learning. During the project, they apply their acquired knowledge to an observed clinical information problem and complete several stages of the software engineering process. Evaluation by inquiry showed that, compared to other courses, students spent more time, and distributed their time more evenly, over the whole period of the course. In conjunction with theory, a combination of internship and project in a hospital seems to provide added value compared to a practical in a computer laboratory. The integration of software theory, clinical practice, and a problem-based approach contributed to the enthusiastic, intensive, and realistic way students learned about this important topic, which they might choose as a future profession. PMID:15460769

  4. Factors Associated With Sleep Quality Among Operating Engineers

    PubMed Central

    Choi, Seung Hee; Terrell, Jeffrey E.; Pohl, Joanne M.; Redman, Richard W.

    2016-01-01

    Blue collar workers generally report high job stress, are exposed to loud noises at work, and engage in many of the health behavioral factors that have been associated with poor sleep quality. However, the sleep quality of blue collar workers has not been studied extensively, and no studies have focused on Operating Engineers (heavy equipment operators), for whom daytime fatigue would create a high risk of accidents. Therefore, the purpose of this study was to determine variables associated with sleep quality among Operating Engineers. This was a cross-sectional survey design with a dependent variable of sleep quality and independent variables of personal and related health behavioral factors. A convenience sample of 498 Operating Engineers was recruited in 2008 from approximately 16,000 Operating Engineers in the entire State of Michigan. Linear regression was used to determine personal and related health behavior factors associated with sleep quality. Multivariate analyses showed that the personal factors related to poor sleep quality were younger age, female sex, higher pain, more medical comorbidities, and depressive symptoms, while the behavioral factor related to poor sleep quality was nicotine dependence. While sleep scores were similar to population norms, approximately 34% (n=143) showed interest in health services for sleep problems. While many personal factors are not changeable, interventions to improve sleep hygiene as well as interventions to treat pain, depression, and smoking may improve sleep quality, resulting in less absenteeism, fewer fatal work accidents, less use of sick leave, less work disability, fewer medical comorbidities, and lower subsequent mortality. PMID:23393021

  5. A Framework for Performing V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. Early discovery is important in order to minimize the cost and other impacts of correcting these errors. In order to provide early detection of errors, V&V is conducted in parallel with system development, often beginning with the concept phase. In reuse-based software engineering, however, decisions on the requirements, design and even implementation of domain assets can be made prior to beginning development of a specific system. In this case, V&V must be performed during domain engineering in order to have an impact on system development. This paper describes a framework for performing V&V within architecture-centric, reuse-based software engineering. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  6. Health care professional workstation: software system construction using DSSA scenario-based engineering process.

    PubMed

    Hufnagel, S; Harbison, K; Silva, J; Mettala, E

    1994-01-01

    This paper describes a new method for the evolutionary determination of user requirements and system specifications called the scenario-based engineering process (SEP). Health care professional workstations are critical components of large-scale health care system architectures. We suggest that domain-specific software architectures (DSSAs) be used to specify standard interfaces and protocols for reusable software components throughout those architectures, including workstations. We encourage the use of engineering principles and abstraction mechanisms. Engineering principles are flexible guidelines, adaptable to particular situations. Abstraction mechanisms are simplifications for management of complexity. We recommend object-oriented design principles, graphical structural specifications, and formal behavioral specifications of components. We give an ambulatory care scenario and associated models to demonstrate SEP. The scenario uses health care terminology and gives patients' and health care providers' system views. Our goal is to have a threefold benefit: (i) scenario view abstractions provide consistent interdisciplinary communications; (ii) hierarchical object-oriented structures provide useful abstractions for reuse, understandability, and long-term evolution; and (iii) SEP and health care DSSAs can be integrated into computer-aided software engineering (CASE) environments. These environments should support rapid construction and certification of individualized systems from reuse libraries. PMID:8125652

  7. Software Facilitates Sharing of Water Quality Data Worldwide

    NASA Technical Reports Server (NTRS)

    2015-01-01

    John Feighery was an environmental engineer at Johnson Space Center when a new, simplified version of the coliform bacteria test was developed for astronaut use on the International Space Station. Through his New York City-based mWater Foundation, Feighery is using the test to help rural communities monitor their water supplies for contamination. The organization has also developed a mobile phone app to make the information publicly available.

  8. Contemporary issues in HIM. Software engineering--what does it mean to you?

    PubMed

    Wear, L L

    1994-02-01

    There have been significant advances in the way we develop software in the last two decades. Many companies are using the new process-oriented approach to software development. Companies that use the new techniques and tools have reported improvements in both productivity and quality, but there are still companies developing software the way we did 30 years ago. If you saw the movie Jurassic Park, you saw the perfect way not to develop software. The programmer in the movie was the only person who knew the details of the system. No processes were followed, and there was no documentation. This was an absolutely perfect prescription for failure. Some of you are probably familiar with the term hacker, which describes a person who spends hours sitting at a terminal hacking out code. Hackers have created some outstanding software products, but with today's complex systems, most companies are trying to get away from their dependence on hackers. They are instead turning to the process-oriented approach. When selecting software vendors, don't just look at the functionality of a product. Try to determine how the vendor develops software, and determine if you are dealing with hackers or a process-driven company. In the long run, you should get better, more reliable products from the latter. PMID:10132037

  9. Environmental engineering education: examples of accreditation and quality assurance

    NASA Astrophysics Data System (ADS)

    Caporali, E.; Catelani, M.; Manfrida, G.; Valdiserri, J.

    2013-12-01

    Environmental engineers respond to the challenges posed by a growing population, intensifying land-use pressures, natural resources exploitation, and rapidly evolving technology. The environmental engineer must develop technically sound solutions within the framework of maintaining or improving environmental quality, complying with public policy, and optimizing the utilization of resources. The engineer provides system and component design, serves as a technical advisor in policy making and legal deliberations, develops management schemes for resources, and provides technical evaluations of systems. Through the current work of environmental engineers, individuals and businesses are able to understand how to coordinate society's interaction with the environment. There will always be a need for engineers who are able to integrate the latest technologies into systems that respond to the needs for food and energy while protecting natural resources. In general, environment-related challenges and problems need to be faced at the global level, leading to the globalization of the engineering profession, which requires not only the capacity to communicate in a common technical language but also the assurance of an adequate and common level of technical competences, knowledge, and understanding. In this framework, the Europe-based EUR-ACE (European Accreditation of Engineering Programmes) system, currently operated by ENAEE (European Network for Accreditation of Engineering Education), can represent the proper framework and accreditation system to provide a set of measures to assess the quality of engineering degree programmes in Europe and abroad. The application of the EUR-ACE accreditation model, and of the Italian National Degree Courses Accreditation System promoted by the Italian National Agency for the Evaluation of Universities and Research Institutes (ANVUR), to the Environmental Engineering Degree Courses at the University of Firenze is presented.

  10. A Stirling engine for use with lower quality fuels

    NASA Astrophysics Data System (ADS)

    Paul, Christopher J.

    There is increasing interest in using renewable fuels from biomass, or alternative fuels such as municipal waste, to reduce the need for fossil-based fuels. Due to the lower heating values and higher levels of impurities of these fuels, small-scale electricity generation is more problematic, and there are currently few technologically mature options for small-scale generation using lower quality fuels. Even though there are few manufacturers of Stirling engines, two centuries of development history offer significant guidance in developing a viable small-scale generator set for lower quality fuels. The history, development, and modeling of Stirling engines were reviewed to identify possible model and engine configurations. A Stirling engine model based on the finite-volume, ideal adiabatic model was developed. Flow dissipation losses are shown to need correcting, as they increase significantly at low mean engine pressure and high engine speed. The complete engine, including external components, was modeled. A simple yet effective method of evaluating the external heat transfer to the Stirling engine was created that can be used with any second-order Stirling engine model. A derivative of the General Motors Ground Power Unit 3 was designed. By significantly increasing heater, cooler, and regenerator size at the expense of increased dead volume, and by adding combustion gas recirculation, a generator set with good efficiency was designed.

  11. Software quality and process improvement in scientific simulation codes

    SciTech Connect

    Ambrosiano, J.; Webster, R.

    1997-11-01

    This report contains viewgraphs on the quest to develop better simulation code quality through process modeling and improvement. This study is based on the experience of the authors and interviews with ten subjects chosen from simulation code development teams at LANL. This study is descriptive rather than scientific.

  12. Development of Software to Model AXAF-I Image Quality

    NASA Technical Reports Server (NTRS)

    Geary, Joseph; Hawkins, Lamar; Ahmad, Anees; Gong, Qian

    1997-01-01

    This report describes work conducted on Delivery Order 181 between October 1996 and June 1997. During this period, software was written to: compute axial PSDs from RDOS AXAF-I mirror surface maps; plot axial surface errors and compute PSDs from HDOS "Big 8" axial scans; plot PSDs from FITS-format PSD files; plot band-limited RMS vs. axial and azimuthal position for multiple PSD files; combine and organize PSDs from multiple mirror surface measurements formatted as input to GRAZTRACE; modify GRAZTRACE to read FITS-formatted PSD files; evaluate AXAF-I test results; and improve and expand the capabilities of the GT x-ray mirror analysis package. During this period, work also began on a more user-friendly manual for the GT program, and improvements were made to the on-line help manual.
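
    For readers unfamiliar with the PSD computations referred to above, the sketch below estimates a one-sided power spectral density from an evenly sampled one-dimensional surface-height profile using a windowed FFT, then a band-limited RMS from it. It is a generic periodogram estimate under assumed normalization conventions; it does not reproduce the RDOS/HDOS file formats or the GRAZTRACE interface.

      import numpy as np

      def axial_psd(heights, dx):
          """One-sided PSD of an evenly sampled 1-D surface-height profile.
          heights: surface heights; dx: sample spacing along the scan.
          Normalization follows a common periodogram convention and is an
          assumption here (conventions vary between analysis packages)."""
          n = heights.size
          window = np.hanning(n)
          detrended = heights - heights.mean()
          spectrum = np.fft.rfft(detrended * window)
          psd = 2.0 * dx * np.abs(spectrum) ** 2 / (n * np.mean(window ** 2))
          freqs = np.fft.rfftfreq(n, d=dx)
          return freqs, psd

      # Example: synthetic 200 mm axial scan, 0.1 mm sampling, 1 cycle/mm ripple.
      x = np.arange(0.0, 200.0, 0.1)
      heights = 5e-3 * np.sin(2 * np.pi * 1.0 * x) + 1e-3 * np.random.randn(x.size)
      freqs, psd = axial_psd(heights, dx=0.1)
      band = (freqs >= 0.1) & (freqs <= 2.0)
      band_limited_rms = np.sqrt(np.sum(psd[band]) * (freqs[1] - freqs[0]))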

  13. Developing Engineering and Science Process Skills Using Design Software in an Elementary Education

    NASA Astrophysics Data System (ADS)

    Fusco, Christopher

    This paper examines the development of process skills through an engineering design approach to instruction in an elementary lesson that combines Science, Technology, Engineering, and Math (STEM). The study took place with 25 fifth graders in a public, suburban school district. Students worked in groups of five to design and construct model bridges based on research involving bridge building design software. The assessment was framed around individual student success as well as overall group processing skills. These skills were assessed through an engineering design packet rubric (student work), student surveys of learning gains, observation field notes, and pre- and post-assessment data. The results indicate that students can successfully utilize design software to inform the construction of model bridges, develop science process skills through problem-based learning, and understand academic concepts through a design project. The final result of this study shows that design engineering is effective for developing cooperative learning skills. The study suggests that an engineering program offered as an elective or as part of the mandatory curriculum could be beneficial for developing students' critical thinking and inter- and intra-personal skills, along with increasing their understanding and awareness of scientific phenomena. In conclusion, combining a design approach to instruction with STEM can increase efficiency in these areas, generate meaningful learning, and influence student attitudes throughout their education.

  14. Assess/Mitigate Risk through the Use of Computer-Aided Software Engineering (CASE) Tools

    NASA Technical Reports Server (NTRS)

    Aguilar, Michael L.

    2013-01-01

    The NASA Engineering and Safety Center (NESC) was requested to perform an independent assessment of the mitigation of the Constellation Program (CxP) Risk 4421 through the use of computer-aided software engineering (CASE) tools. With the cancellation of the CxP, the assessment goals were modified to capture lessons learned and best practices in the use of CASE tools. The assessment goal was to prepare the next program for the use of these CASE tools. The outcome of the assessment is contained in this document.

  15. Quality Management in Astronomical Software and Data Systems

    NASA Astrophysics Data System (ADS)

    Radziwill, N. M.

    2007-10-01

    As the demand for more sophisticated facilities increases, the complexity of the technical and organizational challenges faced by operational space- and ground-based telescopes also increases. In many organizations, funding tends not to be proportional to this trend, and steps must be taken to cultivate a lean environment in both development and operations to consistently do more with less. To facilitate this transition, an organization must be aware of how it can meet quality-related goals, such as reducing variation, improving productivity of people and systems, streamlining processes, ensuring compliance with requirements (scientific, organizational, project, or regulatory), and increasing user satisfaction. Several organizations are already on this path. Quality-based techniques for the efficient, effective development of new telescope facilities and maintenance of existing facilities are described.

  16. Concept document of the repository-based software engineering program: A constructive appraisal

    NASA Technical Reports Server (NTRS)

    1992-01-01

    A constructive appraisal of the Concept Document of the Repository-Based Software Engineering Program is provided. The Concept Document is designed to provide an overview of the Repository-Based Software Engineering (RBSE) Program. The Document should be brief and provide the context for reading subsequent requirements and product specifications. That is, all requirements to be developed should be traceable to the Concept Document. Applied Expertise's analysis of the Document was directed toward assuring that: (1) the Executive Summary provides a clear, concise, and comprehensive overview of the Concept (rewrite as necessary); (2) the sections of the Document make best use of the NASA 'Data Item Description' for concept documents; (3) the information contained in the Document provides a foundation for subsequent requirements; and (4) the document adequately: identifies the problem being addressed; articulates RBSE's specific role; specifies the unique aspects of the program; and identifies the nature and extent of the program's users.

  17. The component-based architecture of the HELIOS medical software engineering environment.

    PubMed

    Degoulet, P; Jean, F C; Engelmann, U; Meinzer, H P; Baud, R; Sandblad, B; Wigertz, O; Le Meur, R; Jagermann, C

    1994-12-01

    The constitution of highly integrated health information networks and the growth of multimedia technologies raise new challenges for the development of medical applications. We describe in this paper the general architecture of the HELIOS medical software engineering environment devoted to the development and maintenance of multimedia distributed medical applications. HELIOS is made of a set of software components, federated by a communication channel called the HELIOS Unification Bus. The HELIOS kernel includes three main components: the Analysis-Design Environment, the Object Information System and the Interface Manager. HELIOS services consist of a collection of toolkits providing the necessary facilities to medical application developers. They include Image Related services, a Natural Language Processor, a Decision Support System and Connection services. The project gives special attention to both object-oriented approaches and software reusability, which are considered crucial steps towards the development of more reliable, coherent and integrated applications. PMID:7882667

  18. Engineering in software testing: statistical testing based on a usage model applied to medical device development.

    PubMed

    Jones, P L; Swain, W T; Trammell, C J

    1999-01-01

    When a population is too large for exhaustive study, as is the case for all possible uses of a software system, a statistically correct sample must be drawn as a basis for inferences about the population. A Markov chain usage model is an engineering formalism that represents the population of possible uses for which a product is to be tested. In statistical testing of software based on a Markov chain usage model, the rich body of analytical results available for Markov chains provides numerous insights that can be used in both product development and test planning. A usage model is based on specifications rather than code, so insights that result from model building can inform product decisions in the early stages of a project when the opportunity to prevent problems is the greatest. Statistical testing based on a usage model provides a sound scientific basis for quantifying the reliability of software. PMID:10459417
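
    The sketch below shows the flavor of a Markov chain usage model: states represent user-visible uses of the software, arcs carry the expected usage probabilities, and random walks through the chain yield statistically representative test cases. The states and probabilities are invented for illustration and are not taken from the paper.

      import random

      # States and transition probabilities of a hypothetical usage model.
      USAGE_MODEL = {
          "Start":    [("Login", 1.0)],
          "Login":    [("Browse", 0.70), ("Search", 0.25), ("Exit", 0.05)],
          "Browse":   [("Search", 0.40), ("Checkout", 0.30), ("Exit", 0.30)],
          "Search":   [("Browse", 0.50), ("Checkout", 0.20), ("Exit", 0.30)],
          "Checkout": [("Exit", 1.0)],
      }

      def generate_test_case(model, start="Start", end="Exit"):
          """Random walk from start to end; the visited states are one test case."""
          path, state = [start], start
          while state != end:
              targets, weights = zip(*model[state])
              state = random.choices(targets, weights=weights)[0]
              path.append(state)
          return path

      for _ in range(3):
          print(" -> ".join(generate_test_case(USAGE_MODEL)))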

  19. Use of general purpose mechanical computer assisted engineering software in orthopaedic surgical planning: advantages and limitations.

    PubMed

    Sutherland, C J; Bresina, S J; Gayou, D E

    1994-01-01

    Two surgical plans were developed for an appropriately complex reconstructive orthopaedic surgery case. One plan was developed with customary methods using two-dimensional (2D) radiographs. The second plan was developed with general purpose mechanical computer assisted engineering (MCAE) software using x-ray computed tomography (CT) data. The limitations of each method are identified. To create a surgical plan using three-dimensional (3D) medical datasets and MCAE software, five necessary steps were identified: (a) data reduction; (b) contour extraction; (c) 3D model creation; (d) extraction of mass properties; (e) model idealization. The principal limitation of general purpose MCAE software is the lack of pre-processing modules with which to address the unique requirements of medical image datasets. PMID:7850738

  20. An Advanced Educational Program for Software Design Engineering at Graduate School of Information Science and Technology of Osaka University

    NASA Astrophysics Data System (ADS)

    Masuzawa, Toshimitsu; Inoue, Katsuro; Murakami, Koso; Fujiwara, Toru; Nishio, Shojiro

    This paper gives an overview of an advanced educational program for software design engineering currently conducted at the Graduate School of Information Science and Technology, Osaka University, under the grant “Initiatives for Attractive Education in Graduate Schools” from MEXT. Software design engineering is expected to play a critical role in the successful design of next-generation software systems. The aim of the program is to train young researchers in the latest design methodologies and give them practical design experience, so that they can pioneer the frontier of software design engineering. The program is conducted in collaboration with industry partners that have rich practical experience and face the engineering problems to be solved in developing next-generation software.

  1. Software Engineering Laboratory (SEL) database organization and user's guide, revision 2

    NASA Technical Reports Server (NTRS)

    Morusiewicz, Linda; Bristow, John

    1992-01-01

    The organization of the Software Engineering Laboratory (SEL) database is presented. Included are definitions and detailed descriptions of the database tables and views, the SEL data, and system support data. The mapping from the SEL and system support data to the base table is described. In addition, techniques for accessing the database through the Database Access Manager for the SEL (DAMSEL) system and via the ORACLE structured query language (SQL) are discussed.

  2. Configuration management plan. System definition and project development. Repository Based Software Engineering (RBSE) program

    NASA Technical Reports Server (NTRS)

    Mckay, Charles

    1991-01-01

    This is the configuration management Plan for the AdaNet Repository Based Software Engineering (RBSE) contract. This document establishes the requirements and activities needed to ensure that the products developed for the AdaNet RBSE contract are accurately identified, that proposed changes to the product are systematically evaluated and controlled, that the status of all change activity is known at all times, and that the product achieves its functional performance requirements and is accurately documented.

  3. Using a Formal Approach for Reverse Engineering and Design Recovery to Support Software Reuse

    NASA Technical Reports Server (NTRS)

    Gannod, Gerald C.

    2002-01-01

    This document describes third-year accomplishments and summarizes overall project accomplishments. Included as attachments are all published papers from year three. Note that the budget for this project was discontinued after year two, but a residual budget from year two allowed minimal continuance into year three. Accomplishments include initial investigations into log-file-based reverse engineering, service-based software reuse, and a source-to-XML generator.

  4. Domain ontologies in software engineering: use of Protégé with the EON architecture.

    PubMed

    Musen, M A

    1998-11-01

    Domain ontologies are formal descriptions of the classes of concepts and the relationships among those concepts that describe an application area. The Protégé software-engineering methodology provides a clear division between domain ontologies and domain-independent problem-solvers that, when mapped to domain ontologies, can solve application tasks. The Protégé approach allows domain ontologies to inform the total software-engineering process, and for ontologies to be shared among a variety of problem-solving components. We illustrate the approach by describing the development of EON, a set of middleware components that automate various aspects of protocol-directed therapy. Our work illustrates the organizing effect that domain ontologies can have on the software-development process. Ontologies, like all formal representations, have limitations in their ability to capture the semantics of application areas. Nevertheless, the capability of ontologies to encode clinical distinctions not usually captured by controlled medical terminologies provides significant advantages for developers and maintainers of clinical software applications. PMID:9865052

  5. Factors that Influence First-Career Choice of Undergraduate Engineers in Software Services Companies: A South Indian Experience

    ERIC Educational Resources Information Center

    Gokuladas, V. K.

    2010-01-01

    Purpose: The purpose of this paper is to identify how undergraduate engineering students differ in their perception about software services companies in India based on variables like gender, locations of the college and branches of engineering. Design/methodology/approach: Data obtained from 560 undergraduate engineering students who had the…

  6. SOFTWARE QUALITY ASSURANCE FOR EMERGENCY RESPONSE CONSEQUENCE ASSESSMENT MODELS AT DOE'S SAVANNAH RIVER SITE

    SciTech Connect

    Hunter, C

    2007-12-17

    The Savannah River National Laboratory's (SRNL) Atmospheric Technologies Group develops, maintains, and operates computer-based software applications for use in emergency response consequence assessment at DOE's Savannah River Site. These applications range from straightforward, stand-alone Gaussian dispersion models run with simple meteorological input to complex computational software systems with supporting scripts that simulate highly dynamic atmospheric processes. A software quality assurance program has been developed to ensure appropriate lifecycle management of these software applications. This program was designed to meet fully the overall structure and intent of SRNL's institutional software QA programs, yet remain sufficiently practical to achieve the necessary level of control in a cost-effective manner. A general overview of this program is described.

  7. Space Shuttle Program Primary Avionics Software System (PASS) Success Legacy - Quality and Reliability Data

    NASA Technical Reports Server (NTRS)

    Orr, James K.; Peltier, Daryl

    2010-01-01

    This slide presentation reviews the avionics software system on board the space shuttle, with particular emphasis on quality and reliability. The Primary Avionics Software System (PASS) provides automatic and fly-by-wire control of critical shuttle systems and executes in redundant computers. Charts show the number of space shuttle flights vs. time, PASS's development history, and other indicators of the reliability of the system's development. The system's reliability is also compared to predicted reliability.

  8. Implementation of Oak Ridge National Laboratory Software Quality Assurance Requirements for COMSOL 3.4

    SciTech Connect

    Freels, James D

    2008-01-01

    It is desirable for nuclear safety-related calculations to be performed at the Oak Ridge National Laboratory (ORNL) using COMSOL. The Department of Energy (DOE) has mandated that ORNL will incorporate software quality assurance (SQA) with special attention to nuclear safety-related software applications. The author has developed a procedure for implementing these DOE-mandated SQA requirements on nuclear safety-related software applicable to the High Flux Isotope Reactor (HFIR). This paper will describe how this procedure will be implemented for COMSOL so that nuclear safety-related calculations may be performed.

  9. Measuring the impact of computer resource quality on the software development process and product

    NASA Technical Reports Server (NTRS)

    Mcgarry, Frank; Valett, Jon; Hall, Dana

    1985-01-01

    The availability and quality of computer resources during the software development process were speculated to have a measurable, significant impact on the efficiency of the development process and the quality of the resulting product. Environment components such as the types of tools, machine responsiveness, and quantity of direct access storage may play a major role in the effort to produce the product and in its subsequent quality as measured by factors such as reliability and ease of maintenance. During the past six years, the NASA Goddard Space Flight Center has conducted experiments with software projects in an attempt to better understand the impact of software development methodologies, environments, and general technologies on the software process and product. Data were extracted and examined from nearly 50 software development projects, all related to support of satellite flight dynamics ground-based computations. The relationship between computer resources and the software development process and product, as exemplified by the subject NASA data, was examined. Based upon the results, a number of computer resource-related implications are provided.

  10. Analyst Tools and Quality Control Software for the ARM Data System

    SciTech Connect

    Moore, S.T.

    2004-12-14

    ATK Mission Research develops analyst tools and automated quality control software in order to assist the Atmospheric Radiation Measurement (ARM) Data Quality Office with their data inspection tasks. We have developed a web-based data analysis and visualization tool, called NCVweb, that allows for easy viewing of ARM NetCDF files. NCVweb, along with our library of sharable Interactive Data Language procedures and functions, allows even novice ARM researchers to be productive with ARM data with only minimal effort. We also contribute to the ARM Data Quality Office by analyzing ARM data streams, developing new quality control metrics, new diagnostic plots, and integrating this information into DQ HandS - the Data Quality Health and Status web-based explorer. We have developed several ways to detect outliers in ARM data streams and have written software to run in an automated fashion to flag these outliers.
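
    As a rough sketch of the kind of automated outlier flagging described above, the code below compares each sample of a time series against a rolling median and a median-absolute-deviation threshold. The window size and threshold are arbitrary placeholders, and this is not the Data Quality Office's actual algorithm.

      import numpy as np

      def flag_outliers(values, window=25, n_mads=5.0):
          """Return a boolean mask marking samples that deviate from a rolling
          median by more than n_mads robust standard deviations (MAD-based)."""
          flags = np.zeros(values.size, dtype=bool)
          half = window // 2
          for i in range(values.size):
              lo, hi = max(0, i - half), min(values.size, i + half + 1)
              segment = values[lo:hi]
              med = np.median(segment)
              mad = np.median(np.abs(segment - med)) or 1e-12  # guard zero MAD
              flags[i] = abs(values[i] - med) > n_mads * 1.4826 * mad
          return flags

      # Example: a smooth signal with two injected spikes.
      t = np.linspace(0.0, 10.0, 500)
      signal = np.sin(t) + 0.05 * np.random.randn(t.size)
      signal[[100, 350]] += 4.0
      print(np.where(flag_outliers(signal))[0])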

  11. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  12. Multidimensional optimization of fusion reactors using heterogeneous codes and engineering software

    NASA Astrophysics Data System (ADS)

    Hartwig, Zachary; Olynyk, Geoffrey; Whyte, Dennis

    2012-10-01

    Magnetic confinement fusion reactors are tightly coupled systems. The parameters under a designer's control, such as magnetic field, wall temperature, and blanket thickness, simultaneously affect the behavior, performance, and components of the reactor, leading to complex tradeoffs and design optimizations. In addition, the engineering analyses require non-trivial, self-consistent inputs, such as reactor geometry, to ensure high fidelity between the various physics and engineering design codes. We present a framework for analysis and multidimensional optimization of fusion reactor systems based on the coupling of heterogeneous codes and engineering software. While this approach is widely used in industry, most code-coupling efforts in fusion have been focused on plasma and edge physics. Instead, we use a simplified plasma model to concentrate on how fusion neutrons and heat transfer affect the design of the first wall, breeding blanket, and magnet systems. The framework combines solid modeling, neutronics, and engineering multiphysics codes and software, linked across Windows and Linux clusters. Initial results for optimizing the design of a compact, high-field tokamak reactor based on high-temperature demountable superconducting coils and a liquid blanket are presented.

  13. Digital reconstructed radiography quality control with software methods

    NASA Astrophysics Data System (ADS)

    Denis, Eloise; Beaumont, Stephane; Guedon, JeanPierre

    2005-04-01

    Nowadays, most treatments for external radiotherapy are prepared with Treatment Planning Systems (TPS), which use a virtual patient generated from a set of transverse slices acquired with a CT scanner while the patient is in the treatment position. In the first step of virtual simulation, the TPS is used to define a ballistic plan allowing good target coverage and the lowest irradiation of normal tissues. This optimisation of the treatment parameters with the TPS is realised with dedicated graphic tools that allow the user to: contour the target; expand the limits of the target to take into account contouring uncertainties, patient set-up errors, and movements of the target during treatment (internal movement of the target and external movement of the patient), as well as beam penumbra; determine beam orientations and define the dimensions and shapes of the beams; visualise beams on the patient's skin and calculate characteristic points to be tattooed on the patient to assist patient set-up before treatment; and calculate for each beam a Digital Reconstructed Radiography (DRR), which consists in projecting the 3D CT virtual patient and the beam limits onto a plane with a cone-beam geometry. These DRRs allow one to verify patient positioning during the treatment, essentially by bone structure alignment, by comparison with real radiographs acquired with the treatment X-ray source in the same geometric conditions (portal imaging). DRRs are therefore critical to the geometric accuracy of the treatment, and quality control of their computation is mandatory. Until now, this control has been realised with real test objects including special inclusions. This paper proposes the use of numerical test objects to control the quality of DRR calculation in terms of computation time, beam angle, divergence and magnification precision, and spatial and contrast resolutions. The main advantage of the proposed method is to avoid a real test object CT acquisition.
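
    To make the projection step concrete, the sketch below computes a much-simplified DRR by converting CT numbers to linear attenuation and summing along one axis of the volume (a parallel-beam approximation; a real TPS casts divergent cone-beam rays from the source through the isocentre). The HU-to-attenuation mapping and the synthetic phantom are illustrative assumptions.

      import numpy as np

      def parallel_drr(ct_volume_hu, mu_water=0.02):
          """Simplified DRR: HU -> linear attenuation, ray-sum along one axis,
          then map line integrals to image intensity. Parallel-beam only; a
          clinical DRR uses divergent cone-beam geometry."""
          mu = np.clip(mu_water * (1.0 + ct_volume_hu / 1000.0), 0.0, None)
          line_integrals = mu.sum(axis=1)          # integrate along one axis
          drr = 1.0 - np.exp(-line_integrals)      # crude film-like response
          return drr / drr.max()

      # Synthetic phantom: water cylinder with a denser rod standing in for bone.
      vol = np.full((64, 64, 64), -1000.0)         # air, in Hounsfield units
      zz, yy, xx = np.mgrid[0:64, 0:64, 0:64]
      vol[(yy - 32) ** 2 + (xx - 32) ** 2 < 24 ** 2] = 0.0     # water
      vol[(yy - 32) ** 2 + (xx - 40) ** 2 < 4 ** 2] = 700.0    # "bone"
      image = parallel_drr(vol)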

  14. Application of Domain Knowledge to Software Quality Assurance

    NASA Technical Reports Server (NTRS)

    Wild, Christian W.

    1997-01-01

    This work focused on capturing, using, and evolving a qualitative decision support structure across the life cycle of a project. The particular application of this study was towards business process reengineering and the representation of the business process in a set of Business Rules (BR). In this work, we defined a decision model which captured the qualitative decision deliberation process. It represented arguments both for and against proposed alternatives to a problem. It was felt that the subjective nature of many critical business policy decisions required a qualitative modeling approach similar to that of Lee and Mylopoulos. While previous work was limited almost exclusively to the decision capture phase, which occurs early in the project life cycle, we investigated the use of such a model during the later stages as well. One of our significant developments was the use of the decision model during the operational phase of a project. By operational phase, we mean the phase in which the system or set of policies which were earlier decided are deployed and put into practice. By making the decision model available to operational decision makers, they would have access to the arguments pro and con for a variety of actions and can thus make a more informed decision which balances the often conflicting criteria by which the value of an action is measured. We also developed the concept of a 'monitored decision' in which metrics of performance were identified during the decision-making process and used to evaluate the quality of that decision. It is important to monitor those decisions which seem at highest risk of not meeting their stated objectives. Operational decisions are also potentially high-risk decisions. Finally, we investigated the use of performance metrics for monitored decisions and audit logs of operational decisions in order to feed an evolutionary phase of the life cycle. During evolution, decisions are revisited, assumptions verified or refuted…

  15. The (mis)use of subjective process measures in software engineering

    NASA Technical Reports Server (NTRS)

    Valett, Jon D.; Condon, Steven E.

    1993-01-01

    A variety of measures are used in software engineering research to develop an understanding of the software process and product. These measures fall into three broad categories: quantitative, characteristics, and subjective. Quantitative measures are those to which a numerical value can be assigned, for example effort or lines of code (LOC). Characteristics describe the software process or product; they might include programming language or the type of application. While such factors do not provide a quantitative measurement of a process or product, they do help characterize them. Subjective measures (as defined in this study) are those that are based on the opinion or opinions of individuals; they are somewhat unique and difficult to quantify. Capturing subjective measure data typically involves development of some type of scale. For example, 'team experience' is one of the subjective measures that were collected and studied by the Software Engineering Laboratory (SEL). Certainly, team experience could have an impact on the software process or product; actually measuring a team's experience, however, is not a strictly mathematical exercise. Simply adding up each team member's years of experience appears inadequate. In fact, most researchers would agree that 'years' do not directly translate into 'experience.' Team experience must be defined subjectively and then a scale must be developed, e.g., high experience versus low experience; or high, medium, low experience; or a different or more granular scale. Using this type of scale, a particular team's overall experience can be compared with that of other teams in the development environment. Defining, collecting, and scaling subjective measures is difficult. First, precise definitions of the measures must be established. Next, choices must be made about whose opinions will be solicited to constitute the data. Finally, care must be given to defining the right scale and level of granularity for measurement.
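
    A minimal sketch of how a subjective measure might be turned into an ordinal scale is shown below. The criteria, weights, and cut points are invented for illustration; they are not the SEL's definition of team experience.

      def team_experience_level(ratings):
          """Map per-criterion ratings (1-5, e.g. from a survey) to an ordinal
          scale. Criteria, weights, and cut points are illustrative only."""
          weights = {"domain_knowledge": 0.4,
                     "language_familiarity": 0.3,
                     "similar_project_experience": 0.3}
          score = sum(weights[k] * ratings[k] for k in weights)
          if score >= 4.0:
              return "high"
          if score >= 2.5:
              return "medium"
          return "low"

      print(team_experience_level({"domain_knowledge": 4,
                                   "language_familiarity": 3,
                                   "similar_project_experience": 5}))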

  16. When Spreadsheets Become Software - Quality Control Challenges and Approaches - 13360

    SciTech Connect

    Fountain, Stefanie A.; Chen, Emmie G.; Beech, John F.; Wyatt, Elizabeth E.; Quinn, Tanya B.; Seifert, Robert W.

    2013-07-01

    As part of a preliminary waste acceptance criteria (PWAC) development, several commercial models were employed, including the Hydrologic Evaluation of Landfill Performance model (HELP) [1], the Disposal Unit Source Term - Multiple Species model (DUSTMS) [2], and the Analytical Transient One, Two, and Three-Dimensional model (AT123D) [3]. The results of these models were post-processed in MS Excel spreadsheets to convert the model results to alternate units, compare the groundwater concentrations to the groundwater concentration thresholds, and then to adjust the waste contaminant masses (based on average concentration over the waste volume) as needed in an attempt to achieve groundwater concentrations at the limiting point of assessment that would meet the compliance concentrations while maximizing the potential use of the landfill (i.e., maximizing the volume of projected waste being generated that could be placed in the landfill). During the course of the PWAC calculation development, one of the Microsoft (MS) Excel spreadsheets used to post-process the results of the commercial model packages grew to include more than 575,000 formulas across 18 worksheets. This spreadsheet was used to assess six base scenarios as well as nine uncertainty/sensitivity scenarios. The complexity of the spreadsheet resulted in the need for a rigorous quality control (QC) procedure to verify data entry and confirm the accuracy of formulas. (authors)
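
    One practical QC step for a workbook of this size is an automated formula inventory. The sketch below uses the openpyxl library to count formulas per worksheet and to flag numeric constants sitting directly below formula cells (a common symptom of a formula pasted over with a value). The file name and the heuristic are hypothetical; this is not the QC procedure used for the PWAC calculation.

      from collections import Counter
      from openpyxl import load_workbook

      # "pwac_postprocessing.xlsx" is a hypothetical file name.
      wb = load_workbook("pwac_postprocessing.xlsx", data_only=False)

      formula_counts = Counter()
      suspect_cells = []

      for ws in wb.worksheets:
          for row in ws.iter_rows():
              for cell in row:
                  if cell.data_type == "f":            # cell holds a formula
                      formula_counts[ws.title] += 1
                  elif isinstance(cell.value, (int, float)) and cell.row > 1:
                      # Constant directly below a formula cell in the same
                      # column: worth a manual look (possible pasted-over value).
                      above = ws.cell(row=cell.row - 1, column=cell.column)
                      if above.data_type == "f":
                          suspect_cells.append(f"{ws.title}!{cell.coordinate}")

      print(f"{sum(formula_counts.values())} formulas in {len(wb.worksheets)} sheets")
      for sheet, count in formula_counts.most_common():
          print(f"  {sheet}: {count}")
      print("Cells to review manually:", suspect_cells[:20])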

  17. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1986-01-01

    Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirements to build large, reliable, and maintainable software systems increase with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such approach is to build a software system or environment which directly supports the software engineering process; this is the aim of the SAGA project, which comprises the research necessary to design and build a software development environment that automates the software engineering process. Progress under SAGA is described.

  18. Designing the modern pump: engineering aspects of continuous subcutaneous insulin infusion software.

    PubMed

    Welsh, John B; Vargas, Steven; Williams, Gary; Moberg, Sheldon

    2010-06-01

    Insulin delivery systems attracted the efforts of biological, mechanical, electrical, and software engineers well before they were commercially viable. The introduction of the first commercial insulin pump in 1983 represents an enduring milestone in the history of diabetes management. Since then, pumps have become much more than motorized syringes and have assumed a central role in diabetes management by housing data on insulin delivery and glucose readings, assisting in bolus estimation, and interfacing smoothly with humans and compatible devices. Ensuring the integrity of the embedded software that controls these devices is critical to patient safety and regulatory compliance. As pumps and related devices evolve, software engineers will face challenges and opportunities in designing pumps that are safe, reliable, and feature-rich. The pumps and related systems must also satisfy end users, healthcare providers, and regulatory authorities. In particular, pumps that are combined with glucose sensors and appropriate algorithms will provide the basis for increasingly safe and precise automated insulin delivery, an essential step toward developing a fully closed-loop system. PMID:20515305

  19. Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications

    NASA Technical Reports Server (NTRS)

    OKeefe, Matthew (Editor); Kerr, Christopher L. (Editor)

    1998-01-01

    This report contains the abstracts and technical papers from the Second International Workshop on Software Engineering and Code Design in Parallel Meteorological and Oceanographic Applications, held June 15-18, 1998, in Scottsdale, Arizona. The purpose of the workshop is to bring together software developers in meteorology and oceanography to discuss software engineering and code design issues for parallel architectures, including Massively Parallel Processors (MPP's), Parallel Vector Processors (PVP's), Symmetric Multi-Processors (SMP's), Distributed Shared Memory (DSM) multi-processors, and clusters. Issues to be discussed include: (1) code architectures for current parallel models, including basic data structures, storage allocation, variable naming conventions, coding rules and styles, i/o and pre/post-processing of data; (2) designing modular code; (3) load balancing and domain decomposition; (4) techniques that exploit parallelism efficiently yet hide the machine-related details from the programmer; (5) tools for making the programmer more productive; and (6) the proliferation of programming models (F--, OpenMP, MPI, and HPF).

  20. Development of Management Methodology for Engineering Production Quality

    NASA Astrophysics Data System (ADS)

    Gorlenko, O.; Miroshnikov, V.; Borbatc, N.

    2016-04-01

    The authors of the paper propose four directions for developing a quality management methodology for engineering products that implements the requirements of the new international standard ISO 9001:2015: analysis of the organizational context taking stakeholders into account, the use of risk management, management of in-house knowledge, and assessment of enterprise activity according to effectiveness criteria.