Sample records for kaeri software verification

  1. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
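
    As a concrete illustration of the run-time monitoring idea described above, the minimal Python sketch below wraps a function in a monitor that checks an output invariant and substitutes a safe fallback on violation. The invariant, bound, and fallback are assumptions for illustration, not code from the survey.

    ```python
    # Minimal run-time monitor sketch (illustrative, not from the paper): check an
    # output invariant on every call and substitute a safe fallback on violation.

    def monitored(invariant, fallback):
        def wrap(fn):
            def inner(*args, **kwargs):
                result = fn(*args, **kwargs)
                if not invariant(result):
                    print(f"monitor: invariant violated by {fn.__name__}{args}")
                    return fallback
                return result
            return inner
        return wrap

    @monitored(invariant=lambda v: -180.0 <= v <= 180.0,  # hypothetical heading bound
               fallback=0.0)                               # safe default value
    def compute_heading(raw_radians):
        return raw_radians * 57.2958                       # radians to degrees

    print(compute_heading(0.5))    # 28.6479, passes the monitor
    print(compute_heading(100.0))  # exceeds the bound, fallback 0.0 returned
    ```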

  2. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and the reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.
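
    The static-analysis capability in item (2) can be illustrated with a small sketch. AVFS targeted FORTRAN and AED; the hypothetical re-creation below applies the same idea (flagging variables that are assigned but never read) to Python source via the standard ast module.

    ```python
    # Sketch of one AVFS-style static check (a hypothetical re-creation in Python):
    # flag variables that are assigned but never read, a simple inconsistency of
    # the kind item (2) describes for FORTRAN/AED sources.
    import ast

    source = """
    def f(a):
        unused = a * 2
        total = a + 1
        return total
    """

    tree = ast.parse(source)
    assigned, read = set(), set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Name):
            (assigned if isinstance(node.ctx, ast.Store) else read).add(node.id)

    for name in sorted(assigned - read):
        print(f"warning: '{name}' is assigned but never read")
    ```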

  3. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...

  4. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  5. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  6. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison between the new output file and the old output file. Any difference between the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
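
    The install-verification pattern described here (run the bundled sample problems, then compare fresh output against stored reference output) can be sketched as follows. The runner command and file layout are assumptions for illustration, not the actual MCNP packaging.

    ```python
    # Sketch of the install-verification pattern the abstract describes: run each
    # sample problem, then compare fresh output with stored reference output.
    # The runner command and paths are hypothetical.
    import filecmp
    import subprocess
    from pathlib import Path

    failures = []
    for case in sorted(Path("sample_problems").glob("case*.inp")):
        new_out = case.with_suffix(".out.new")
        ref_out = case.with_suffix(".out.ref")
        subprocess.run(["./mcnp_runner", str(case), str(new_out)], check=True)
        if not filecmp.cmp(new_out, ref_out, shallow=False):
            failures.append(case.name)  # a diff flags the case for review, not a confirmed bug

    print("verification errors:", failures or "none")
    ```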

  7. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
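
    A minimal illustration of the Horn-clause encoding SeaHorn builds on, written directly against Z3's fixedpoint (Spacer) interface rather than through SeaHorn itself: the loop `x = 0; while x < 10: x += 1` is encoded as constrained Horn clauses and the bad state `x > 10` is queried.

    ```python
    # Constrained Horn clause sketch in Z3's fixedpoint (Spacer) engine.
    # Program: x = 0; while x < 10: x += 1  -- prove x <= 10 afterwards.
    from z3 import Fixedpoint, Function, Ints, IntSort, BoolSort

    fp = Fixedpoint()
    fp.set(engine="spacer")

    x, xp = Ints("x xp")
    inv = Function("inv", IntSort(), BoolSort())    # loop invariant predicate
    err = Function("err", BoolSort())               # reachability of a bad state

    fp.register_relation(inv)
    fp.register_relation(err)
    fp.declare_var(x, xp)

    fp.rule(inv(0))                                  # initial state
    fp.rule(inv(xp), [inv(x), x < 10, xp == x + 1])  # loop transition
    fp.rule(err(), [inv(x), x > 10])                 # property violation

    print(fp.query(err()))  # 'unsat' means the bad state is unreachable
    ```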

  8. 78 FR 1162 - Cardiovascular Devices; Reclassification of External Cardiac Compressor

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

    ... safety and electromagnetic compatibility; For devices containing software, software verification... electromagnetic compatibility; For devices containing software, software verification, validation, and hazard... electrical components, appropriate analysis and testing must validate electrical safety and electromagnetic...

  9. Orbit attitude processor. STS-1 bench program verification test plan

    NASA Technical Reports Server (NTRS)

    Mcclain, C. R.

    1980-01-01

    A plan for the static verification of the STS-1 ATT PROC ORBIT software requirements is presented. The orbit version of the SAPIENS bench program is used to generate the verification data. A brief discussion of the simulation software and flight software modules is presented along with a description of the test cases.

  10. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From the standpoint of software requirements methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  11. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

Foo Kune, Denis [Saint Paul, MN]; Mahadevan, Karthikeyan [Mountain View, CA]

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
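
    A toy rendition of the recursive protocol: each verified node becomes the verifier of its unverified neighbors, so every timing-based signature is measured across a single hop, and a failed node halts verification downstream until another path is found. The topology and the signature check below are invented stand-ins.

    ```python
    # Illustrative sketch of the patent's recursive one-hop scheme (topology and
    # signature check are hypothetical).
    def check_signature(verifier, subject):
        return subject != "n3"  # pretend node n3 fails its timing-based signature

    def verify_network(graph, start):
        verified, frontier, failed = {start}, [start], set()
        while frontier:
            verifier = frontier.pop()
            for neighbor in graph[verifier]:
                if neighbor in verified or neighbor in failed:
                    continue  # each node is tested exactly once
                if check_signature(verifier, neighbor):
                    verified.add(neighbor)
                    frontier.append(neighbor)  # neighbor now verifies its own neighbors
                else:
                    failed.add(neighbor)       # downstream halts; another path may reach them
        return verified, failed

    graph = {"v": ["n1"], "n1": ["n2", "n3"], "n2": ["n4"],
             "n3": ["n5"], "n4": [], "n5": []}
    print(verify_network(graph, "v"))  # n5 stays unverified: only path runs through n3
    ```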

  12. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  13. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
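
    Self-composition reduces a two-run property to a property of one program paired with a renamed copy of itself. The paper applies the technique deductively to RC4; the sketch below instead checks the classic non-interference instance with Z3 on a toy combiner: equal public inputs must yield equal outputs for any two secret keys.

    ```python
    # Self-composition on a toy combiner (not the paper's RC4 code): pair the
    # program with a renamed copy sharing the public input, and ask Z3 whether
    # the two copies can disagree on the output for different secret keys.
    from z3 import BitVec, Solver

    def step(pub, key):                     # toy keystream step
        return pub ^ (key & 0xF0)

    pub = BitVec("pub", 8)
    k1, k2 = BitVec("k1", 8), BitVec("k2", 8)

    s = Solver()
    s.add(step(pub, k1) != step(pub, k2))   # violation of the 2-safety property
    print(s.check())                        # sat: the secret key influences output
    print(s.model())                        # a concrete witness of the leak
    ```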

ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT; ENVIRONMENTAL DECISION SUPPORT SOFTWARE; ENVIRONMENTAL SOFTWARE SITEPRO VERSION 2.0

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  15. Security Verification Techniques Applied to PatchLink COTS Software

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer

    2006-01-01

    Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
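
    The property-based-testing half of the approach can be sketched with the hypothesis library on a stand-in function (the PatchLink agent itself is not public): a security property is stated over all inputs and the tester searches for a violating input.

    ```python
    # Property-based testing sketch in the spirit of the UC Davis instrument.
    # The sanitizer under test is a hypothetical stand-in.
    from hypothesis import given, strategies as st

    def sanitize(arg: str) -> str:          # hypothetical target under test
        return "".join(c for c in arg if c.isalnum() or c in "._-")

    @given(st.text())
    def test_no_shell_metacharacters(arg):
        # Property: no shell metacharacter ever survives sanitization.
        assert not set(sanitize(arg)) & set(";|&$`<>")

    test_no_shell_metacharacters()  # hypothesis drives many random, shrunk inputs
    ```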

  16. A hardware-software system for the automation of verification and calibration of oil metering units secondary equipment

    NASA Astrophysics Data System (ADS)

    Boyarnikov, A. V.; Boyarnikova, L. V.; Kozhushko, A. A.; Sekachev, A. F.

    2017-08-01

The article considers the process of verification (calibration) of the secondary equipment of oil metering units. The purpose of the work is to increase the reliability and reduce the complexity of this process by developing a hardware-software system that provides automated verification and calibration. The hardware part of the system switches the measuring channels of the verified controller and the reference channels of the calibrator according to the specified algorithm. The developed software controls the switching of channels, sets values on the calibrator, reads the measured data from the controller, calculates errors, and compiles protocols. The system can be used to check controllers of the secondary equipment of oil metering units in automatic verification mode (with an open communication protocol) or in semi-automatic verification mode (without one). A distinctive feature of the approach is a universal signal switch operating under software control, which can be configured for various verification (calibration) methods and thus covers the entire range of metering-unit secondary-equipment controllers. Automatic verification with this hardware-software system shortens the verification time by a factor of 5-10 and increases the reliability of measurements by excluding the influence of the human factor.
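
    A skeleton of the automated verification loop just described, with a simulated rig standing in for the switch, calibrator, and controller interfaces; the reference points and tolerance are assumptions.

    ```python
    # Sketch of the automated set-reference / read-measurement / compute-error
    # loop; device drivers are simulated, and the tolerance is hypothetical.
    import random

    class SimulatedRig:
        """Stands in for the switch + calibrator + controller stack."""
        def set_reference(self, channel, value):
            self.value = value                          # calibrator output
        def read_measurement(self, channel):
            return self.value + random.gauss(0, 0.02)   # controller reading + noise

    def verify_channel(rig, channel, tolerance=0.1):    # tolerance in % of span
        protocol = []
        for ref in [0.0, 25.0, 50.0, 75.0, 100.0]:      # % of channel span
            rig.set_reference(channel, ref)
            measured = rig.read_measurement(channel)
            error = measured - ref
            protocol.append((ref, round(measured, 3), round(error, 3),
                             abs(error) <= tolerance))
        return protocol

    for row in verify_channel(SimulatedRig(), channel=1):
        print(row)  # (reference, measured, error, pass/fail) protocol rows
    ```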

  17. Guidance and Control Software Project Data - Volume 3: Verification Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.

  18. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDFs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
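
    A worked example of the Richardson extrapolation and GCI calculations the package automates, on made-up solution values from three systematically refined grids:

    ```python
    # Estimate the observed order of accuracy and the Grid Convergence Index
    # from the same quantity computed on three refined grids (values invented).
    import math

    f_coarse, f_medium, f_fine = 0.9712, 0.9621, 0.9597
    r = 2.0                                               # grid refinement ratio

    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1)   # Richardson extrapolation
    e_rel = abs((f_fine - f_medium) / f_fine)
    gci_fine = 1.25 * e_rel / (r**p - 1)                  # Fs = 1.25 safety factor

    print(f"observed order p = {p:.2f}")
    print(f"extrapolated value = {f_exact:.5f}")
    print(f"GCI (fine grid) = {100 * gci_fine:.3f}%")
    ```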

  19. An ontology based trust verification of software license agreement

    NASA Astrophysics Data System (ADS)

    Lu, Wenhuan; Li, Xiaoqing; Gan, Zengqin; Wei, Jianguo

    2017-08-01

When users install or download software, they are presented with a large document stating rights and obligations, which many people lack the patience to read or understand. This can make users distrust the software. In this paper, we propose an ontology-based trust verification for software license agreements. First, we propose an ontology model for the domain of software license agreements, constructed by the proposed methodology from copyright laws and 30 software license agreements. The License Ontology can act as part of a generalized copyright-law knowledge model and can also serve as a visualization of software licenses. Based on this ontology, a software-license-oriented text summarization approach is proposed, whose performance shows that it improves the accuracy of software license summarization. Based on the summarization, the underlying purpose of the software license can be explicitly explored for trust verification.

  20. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope is described which is an Ada-verification environment. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
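
    The flavor of the binary search example, transcribed here as runtime contracts (asserted pre- and postconditions) rather than the machine-checked Larch/Penelope proofs the paper describes:

    ```python
    # Binary search with its specification stated as executable contracts:
    # precondition (sorted input), postconditions (correct index, or certain absence).
    def binary_search(xs, key):
        assert all(xs[i] <= xs[i + 1] for i in range(len(xs) - 1)), "pre: sorted"
        lo, hi = 0, len(xs) - 1
        while lo <= hi:
            mid = (lo + hi) // 2
            if xs[mid] < key:
                lo = mid + 1
            elif xs[mid] > key:
                hi = mid - 1
            else:
                assert xs[mid] == key          # post, found branch
                return mid
        assert key not in xs                   # post, not-found branch
        return -1

    print(binary_search([1, 3, 5, 8, 13], 8))   # 3
    print(binary_search([1, 3, 5, 8, 13], 7))   # -1
    ```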

  1. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort...to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has

  2. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. 15. SUBJECT TERMS Formal Methods, Software Verification, Model-Based...Domain specific languages (DSLs) drive both implementation and formal verification

  3. Program Model Checking as a New Trend

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Visser, Willem; Clancy, Daniel (Technical Monitor)

    2002-01-01

    This paper introduces a special section of STTT (International Journal on Software Tools for Technology Transfer) containing a selection of papers that were presented at the 7th International SPIN workshop, Stanford, August 30 - September 1, 2000. The workshop was named SPIN Model Checking and Software Verification, with an emphasis on model checking of programs. The paper outlines the motivation for stressing software verification, rather than only design and model verification, by presenting the work done in the Automated Software Engineering group at NASA Ames Research Center within the last 5 years. This includes work in software model checking, testing like technologies and static analysis.
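
    A bare-bones explicit-state model checker of the kind SPIN popularized: breadth-first exploration of a transition system, checking a safety property at every reachable state. The two-counter system below is purely illustrative.

    ```python
    # Explicit-state safety model checking by exhaustive breadth-first search.
    from collections import deque

    def successors(state):                 # interleaving of two incrementing processes
        x, y = state
        yield ((x + 1) % 4, y)
        yield (x, (y + 1) % 4)

    def safe(state):                       # safety property: never x == y == 3
        return state != (3, 3)

    def model_check(init):
        seen, queue = {init}, deque([init])
        while queue:
            s = queue.popleft()
            if not safe(s):
                return f"violation reached: {s}"
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    queue.append(t)
        return f"safe; {len(seen)} states explored"

    print(model_check((0, 0)))             # reports the reachable violation (3, 3)
    ```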

  4. Software Model Checking Without Source Code

    NASA Technical Reports Server (NTRS)

    Chaki, Sagar; Ivers, James

    2009-01-01

    We present a framework, called AIR, for verifying safety properties of assembly language programs via software model checking. AIR extends the applicability of predicate abstraction and counterexample guided abstraction refinement to the automated verification of low-level software. By working at the assembly level, AIR allows verification of programs for which source code is unavailable-such as legacy and COTS software-and programs that use features-such as pointers, structures, and object-orientation-that are problematic for source-level software verification tools. In addition, AIR makes no assumptions about the underlying compiler technology. We have implemented a prototype of AIR and present encouraging results on several non-trivial examples.

  5. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. An umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  6. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  7. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  8. Software development for airborne radar

    NASA Astrophysics Data System (ADS)

    Sundstrom, Ingvar G.

    Some aspects for development of software in a modern multimode airborne nose radar are described. First, an overview of where software is used in the radar units is presented. The development phases-system design, functional design, detailed design, function verification, and system verification-are then used as the starting point for the discussion. Methods, tools, and the most important documents are described. The importance of video flight recording in the early stages and use of a digital signal generators for performance verification is emphasized. Some future trends are discussed.

  9. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of using the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus particularly on increasing the number of software test tools and on assessing cost effectiveness.

  10. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  11. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  12. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, is discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  13. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation.' The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  14. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  15. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  16. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided with special attention on real time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill formed process synchronization and deadlock also are detected statically.
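
    One of the static checks mentioned, deadlock detection, can be sketched as a cycle search over a wait-for graph between processes. The graph below is a hypothetical extract, not MUST or HAL/S output.

    ```python
    # Static deadlock check sketch: a cycle in the wait-for graph means a set of
    # processes that can block each other forever.
    def find_cycle(wait_for):
        visiting, done = set(), set()
        def dfs(node, path):
            visiting.add(node)
            for nxt in wait_for.get(node, []):
                if nxt in visiting:
                    return path + [nxt]            # back edge closes a cycle
                if nxt not in done and (c := dfs(nxt, path + [nxt])):
                    return c
            visiting.discard(node)
            done.add(node)
            return None
        for start in wait_for:
            if start not in done and (c := dfs(start, [start])):
                return c
        return None

    wait_for = {"P1": ["P2"], "P2": ["P3"], "P3": ["P1"], "P4": []}
    print("potential deadlock:", find_cycle(wait_for))  # P1 -> P2 -> P3 -> P1
    ```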

Virtual Platform for SEE Robustness Verification of Bootloader Embedded Software on Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.

    2013-05-01

Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. Even when real hardware is available, the verification of software fault tolerance mechanisms can be difficult, since real faulty situations must be brought about systematically and artificially, which can be impossible on real hardware. To solve this problem, the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This way it is possible to run the exact same target binary software as runs on the physical system, in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification processes of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.
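
    A toy version of such a fault-injection campaign: run the same routine repeatedly on an emulated memory image, flip one bit per run, and record which faults the software's own checks catch. Everything below is a stand-in for the real LEON2 virtual model.

    ```python
    # Toy fault-injection campaign sketch: single-event upsets are simulated by
    # flipping one bit of a "memory image" per run; the routine's own checksum
    # check is the fault-detection mechanism under test.
    import random

    golden = [0x12, 0x34, 0x56, 0x78]          # pristine boot image (invented)
    image_checksum = sum(golden) & 0xFF

    def boot_routine(image):
        if (sum(image) & 0xFF) != image_checksum:   # software fault detection
            raise RuntimeError("checksum mismatch, fall back to safe mode")
        return "booted"

    caught, runs = 0, 100
    for _ in range(runs):
        image = list(golden)
        byte, bit = random.randrange(len(image)), random.randrange(8)
        image[byte] ^= 1 << bit                 # inject a single-event upset
        try:
            boot_routine(image)
        except RuntimeError:
            caught += 1
    print(f"{caught}/{runs} injected faults detected")
    ```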

  18. Using software security analysis to verify the secure socket layer (SSL) protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2004-01-01

The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers, among its capabilities, formal verification of software security properties through the use of model based verification (MBV). [1,2,3,4,5,6] MBV is a formal approach to software assurance that combines analysis of software, via abstract models, with technology, such as model checkers, that provides automation of the mechanical portions of the analysis process. This paper will discuss: the need for formal analysis to assure software systems with respect to software security and why testing alone cannot provide it; the means by which MBV with a Flexible Modeling Framework (FMF) accomplishes the necessary analysis task; and an example of FMF-style MBV in the verification of properties over the Secure Socket Layer (SSL) communication protocol as a demonstration.

  19. The FoReVer Methodology: A MBSE Framework for Formal Verification

    NASA Astrophysics Data System (ADS)

    Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald

    2013-08-01

The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has been addressed so far through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering process (MBSSE), derived in the System and Software Functional Requirement Techniques study (SSFRT), focused on the application of model based engineering technologies to support the space system and software development processes, from mission level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical, and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.

How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  1. A methodology for model-based development and automated verification of software for aerospace systems

    NASA Astrophysics Data System (ADS)

    Martin, L.; Schatalov, M.; Hagner, M.; Goltz, U.; Maibaum, O.

Today's software for aerospace systems typically is very complex. This is due to the increasing number of features as well as the high demand for safety, reliability, and quality. This complexity also leads to significantly higher software development costs. To handle the software complexity, a structured development process is necessary. Additionally, compliance with relevant standards for quality assurance is a mandatory concern. To assure high software quality, techniques for verification are necessary. Besides traditional techniques like testing, automated verification techniques like model checking are becoming more popular. The latter examine the whole state space and, consequently, result in full test coverage. Nevertheless, despite the obvious advantages, this technique is still rarely used for the development of aerospace systems. In this paper, we propose a tool-supported methodology for the development and formal verification of safety-critical software in the aerospace domain. The methodology relies on the V-Model and defines a comprehensive work flow for model-based software development as well as automated verification in compliance with the European standard series ECSS-E-ST-40C. Furthermore, our methodology supports the generation and deployment of code. For tool support we use the SCADE Suite (Esterel Technologies), an integrated design environment that covers all the requirements for our methodology. The SCADE Suite is well established in the avionics and defense, rail transportation, energy, and heavy equipment industries. For evaluation purposes, we apply our approach to an up-to-date case study of the TET-1 satellite bus. In particular, the attitude and orbit control software is considered. The behavioral models for the subsystem are developed, formally verified, and optimized.

  2. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  3. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  4. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  5. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

An ultrasonic flaw detection instrument with a remote control interface is researched, and an automatic verification system is developed. By using extensible markup language to build the instruction-set and data-analysis-method database in the system software, a controllable design is realized that copes with the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of verification results. Operating results of the automatic verification system confirm the feasibility of the hardware and software architecture design and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
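
    The XML-described instruction-set idea can be sketched as follows: the commands a given flaw-detector model understands are data, not code, so supporting a new device protocol means adding a profile rather than changing the software. All tag, attribute, and command names below are invented.

    ```python
    # Sketch of an XML-driven device instruction set (all names hypothetical).
    import xml.etree.ElementTree as ET

    device_profile = """
    <device model="UT-200">
      <command name="set_gain"  template="GAIN {value:.1f}"/>
      <command name="read_peak" template="PEAK?"/>
    </device>
    """

    commands = {c.get("name"): c.get("template")
                for c in ET.fromstring(device_profile).iter("command")}

    def build(name, **params):
        return commands[name].format(**params)   # string sent to the instrument

    print(build("set_gain", value=24.5))         # GAIN 24.5
    print(build("read_peak"))                    # PEAK?
    ```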

  6. Runtime Verification of Pacemaker Functionality Using Hierarchical Fuzzy Colored Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza; Monadjemi, Amirhassan

    2017-02-01

Today, implanted medical devices are increasingly used by many patients with diverse health problems. However, several runtime problems and errors have been reported by the relevant organizations, some even resulting in patient death. One such device is the pacemaker, which helps the patient regulate the heartbeat by connecting to the cardiac vessels. The device is directed by its software, so any failure in this software causes a serious malfunction. Therefore, this study aims at a better way to monitor the device's software behavior in order to decrease the risk of failure. Accordingly, we supervise the runtime function and status of the software. Software verification means examining the limitations and needs of the system users against the running software. In this paper, a method to verify the pacemaker software, based on the fuzzy function of the device, is presented. The functional limitations of the device are identified and expressed as fuzzy rules, and the device is then verified against a hierarchical Fuzzy Colored Petri-net (FCPN) formed from the software limits. Building on our experience in previous studies using (1) Fuzzy Petri-nets (FPN) to verify insulin pumps, (2) Colored Petri-nets (CPN) to verify the pacemaker, and (3) a software agent with Petri-net-based knowledge to verify the pacemaker, this paper examines the runtime behavior of the pacemaker software with an HFCPN. This is a step forward compared to the earlier work. The HFCPN used here reduces complexity compared to the FPN and CPN of our previous studies. By presenting the Petri-net (PN) in hierarchical form, the verification runtime decreased by 90.61% compared to the earlier work. Since runtime verification requires an inference engine, we used the HFCPN to enhance the performance of the inference engine.

  7. Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems

    NASA Technical Reports Server (NTRS)

    Powell, John D.; Gilliam, David

    2004-01-01

The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.

  8. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type 1 censoring. The software was verified by reproducing results published by others.
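
    The core calculation the software performs, maximum-likelihood fitting of a two-parameter Weibull distribution to data with type 1 censoring, can be sketched in a few lines; the cycle counts below are made up, not the published gear data.

    ```python
    # MLE fit of a two-parameter Weibull to fatigue lives with censoring
    # (suspended tests). Data are invented; the method mirrors the
    # likelihood-based approach the document describes.
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_likelihood(params, t, failed):
        k, lam = params                        # shape, scale
        if k <= 0 or lam <= 0:
            return np.inf
        z = (t / lam) ** k
        log_pdf = np.log(k / lam) + (k - 1) * np.log(t / lam) - z   # failures
        log_surv = -z                                               # suspensions
        return -(log_pdf[failed].sum() + log_surv[~failed].sum())

    t = np.array([1.2e6, 2.3e6, 3.1e6, 4.0e6, 5.5e6, 6.0e6])   # cycles
    failed = np.array([True, True, True, True, False, False])  # False = censored

    res = minimize(neg_log_likelihood, x0=[1.5, 4.0e6], args=(t, failed),
                   method="Nelder-Mead")
    k_hat, lam_hat = res.x
    print(f"shape = {k_hat:.3f}, scale = {lam_hat:.3e} cycles")
    ```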

  9. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  10. Formal verification of medical monitoring software using Z language: a representative sample.

    PubMed

    Babamir, Seyed Morteza; Borhani, Mehdi

    2012-08-01

Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions is a physician's concern. As a result, verification of system behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software; therefore, software verification of modern medical systems has received attention. Such verification can be achieved by formal languages having mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study presents a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic's blood sugar.

  11. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  12. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  13. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2015-01-01

The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular for the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and check out LACC software for Launch Equipment Test Facility (LETF) testing.

  14. Software Tool Integrating Data Flow Diagrams and Petri Nets

    NASA Technical Reports Server (NTRS)

    Thronesbery, Carroll; Tavana, Madjid

    2010-01-01

    Data Flow Diagram - Petri Net (DFPN) is a software tool for analyzing other software to be developed. The full name of this program reflects its design, which combines the benefit of data-flow diagrams (which are typically favored by software analysts) with the power and precision of Petri-net models, without requiring specialized Petri-net training. (A Petri net is a particular type of directed graph, a description of which would exceed the scope of this article.) DFPN assists a software analyst in drawing and specifying a data-flow diagram, then translates the diagram into a Petri net, then enables graphical tracing of execution paths through the Petri net for verification, by the end user, of the properties of the software to be developed. In comparison with prior means of verifying the properties of software to be developed, DFPN makes verification by the end user more nearly certain, thereby making it easier to identify and correct misconceptions earlier in the development process, when correction is less expensive. After the verification by the end user, DFPN generates a printable system specification in the form of descriptions of processes and data.
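
    A minimal Petri-net execution of the kind DFPN lets end users trace: places hold token counts and a transition fires when every input place is marked. The two-transition net below is an invented stand-in for a translated data-flow diagram.

    ```python
    # Minimal Petri-net firing sketch (net and place names are invented).
    net = {
        "validate": {"in": ["raw_order"], "out": ["checked_order"]},
        "ship":     {"in": ["checked_order", "stock"], "out": ["shipped"]},
    }
    marking = {"raw_order": 1, "stock": 1, "checked_order": 0, "shipped": 0}

    def enabled(t):
        return all(marking[p] > 0 for p in net[t]["in"])

    def fire(t):
        assert enabled(t), f"{t} not enabled"
        for p in net[t]["in"]:
            marking[p] -= 1
        for p in net[t]["out"]:
            marking[p] += 1

    for t in ["validate", "ship"]:          # one execution path through the net
        fire(t)
        print(f"after {t}: {marking}")
    ```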

  15. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  16. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  17. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  18. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  19. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  1. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  2. Component Verification and Certification in NASA Missions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Penix, John; Norvig, Peter (Technical Monitor)

    2001-01-01

    Software development for NASA missions is a particularly challenging task. Missions are extremely ambitious scientifically, have very strict time frames, and must be accomplished with a maximum degree of reliability. Verification technologies must therefore be pushed far beyond their current capabilities. Moreover, reuse and adaptation of software architectures and components must be incorporated in software development within and across missions. This paper discusses NASA applications that we are currently investigating from these perspectives.

  3. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is being developed to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.
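
    EggPlant Functional scripts are written in SenseTalk, so the sketch below is only a language-neutral analogue of image-based verification, using the pyautogui library (its locateOnScreen call needs OpenCV for the confidence option, and depending on the version it either returns None or raises ImageNotFoundException when the image is absent); the screenshot file names are hypothetical.

```python
# Language-neutral sketch of image-based UI verification; not EggPlant itself.
import time
import pyautogui

def find(image):
    """Return the screen region matching the image, or None (handles both
    pyautogui behaviours: returning None or raising ImageNotFoundException)."""
    try:
        return pyautogui.locateOnScreen(image, confidence=0.9)  # needs OpenCV
    except pyautogui.ImageNotFoundException:
        return None

def verify_display(button_img, expected_img, timeout_s=10.0):
    """Click a control identified by its screenshot, then assert the display
    responds by showing the expected image within the timeout."""
    button = find(button_img)
    if button is None:
        return False
    pyautogui.click(pyautogui.center(button))
    deadline = time.time() + timeout_s
    while time.time() < deadline:          # polled image-based assertion
        if find(expected_img) is not None:
            return True
        time.sleep(0.5)
    return False

print(verify_display("eng_page_button.png", "eng_page_header.png"))
```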

  4. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  5. Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder

    NASA Technical Reports Server (NTRS)

    Lindsey, A. E.; Pecheur, Charles

    2004-01-01

    AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on applications of NASA's Livingstone model-based diagnosis system, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.

  6. Software for imaging phase-shift interference microscope

    NASA Astrophysics Data System (ADS)

    Malinovski, I.; França, R. S.; Couceiro, I. B.

    2018-03-01

    In recent years, an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). By its principle of operation, the instrument is an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colours as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different phase-evaluation methodology is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.

  7. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. By using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  8. Abstraction and Assume-Guarantee Reasoning for Automated Software Verification

    NASA Technical Reports Server (NTRS)

    Chaki, S.; Clarke, E.; Giannakopoulou, D.; Pasareanu, C. S.

    2004-01-01

    Compositional verification and abstraction are the key techniques to address the state explosion problem associated with model checking of concurrent software. A promising compositional approach is to prove properties of a system by checking properties of its components in an assume-guarantee style. This article proposes a framework for performing abstraction and assume-guarantee reasoning of concurrent C code in an incremental and fully automated fashion. The framework uses predicate abstraction to extract and refine finite state models of software and it uses an automata learning algorithm to incrementally construct assumptions for the compositional verification of the abstract models. The framework can be instantiated with different assume-guarantee rules. We have implemented our approach in the COMFORT reasoning framework and we show how COMFORT outperforms several previous software model checking approaches when checking safety properties of non-trivial concurrent programs.
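
    The assume-guarantee rule at the heart of this approach can be stated over trace sets: if M1 composed with assumption A satisfies P, and M2 satisfies A, then M1 composed with M2 satisfies P. The toy Python sketch below illustrates the rule with components as finite sets of traces over a shared alphabet and intersection as composition; this is a drastic simplification of the paper's automata and learning machinery.

```python
# Toy illustration of the assume-guarantee rule over finite trace sets.
# Components are finite sets of traces; composition is set intersection.

def satisfies(system, prop):
    return system <= prop            # every behaviour is allowed by the property

M1 = {("req", "ack"), ("req", "err")}
M2 = {("req", "ack")}                # the environment never produces "err"
P  = {("req", "ack")}                # property: requests are always acknowledged
A  = {("req", "ack")}                # assumption on M1's environment

# Rule: if  A || M1 |= P  and  M2 |= A,  then  M1 || M2 |= P.
premise1 = satisfies(M1 & A, P)
premise2 = satisfies(M2, A)
if premise1 and premise2:
    assert satisfies(M1 & M2, P)     # the conclusion is guaranteed by the rule
    print("property verified compositionally")
```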

  9. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from its usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. With long operational lifetimes, and being extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring its correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field as regards its application to the growing software content of space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of Software FMEA and FTA with fault-injection techniques. The case study described herein is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, onboard software, SFMEA, SFTA, fault injection. This work is being performed under the STADY project (Applied Static And Dynamic Verification of Critical Software), ESA/ESTEC Contract No. 15751/02/NL/LvH.

  10. Providing an empirical basis for optimizing the verification and testing phases of software development

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1992-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low/high fault density components so that the testing/verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents an alternative approach for constructing such models that is intended to fulfill specific software engineering needs (i.e., dealing with partial/incomplete information and creating models that are easy to interpret). Our approach to classification is as follows: (1) measure the software system to be considered, and (2) build multivariate stochastic models for prediction. We present experimental results obtained by classifying FORTRAN components developed at NASA/GSFC into two fault density classes: low and high. We also evaluate the accuracy of the model and the insights it provides into the software process.
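
    As a hedged sketch of step (2), the snippet below fits a simple multivariate classifier that separates low from high fault-density components using static measures; the paper builds multivariate stochastic models, so logistic regression and the metric values here are merely illustrative stand-ins.

```python
# Illustrative fault-density classification from static component measures.
import numpy as np
from sklearn.linear_model import LogisticRegression

# Rows: components; columns (invented): size (SLOC), cyclomatic complexity,
# fan-out. Labels: 0 = low fault density, 1 = high fault density.
X = np.array([[120, 4, 2], [900, 23, 11], [300, 9, 4],
              [1500, 31, 14], [80, 3, 1], [700, 18, 9]])
y = np.array([0, 1, 0, 1, 0, 1])

model = LogisticRegression().fit(X, y)
candidate = np.array([[1100, 27, 12]])    # a new component to triage
print("probability of high fault density:",
      model.predict_proba(candidate)[0, 1])
```

    A component flagged as likely high fault density would then receive a disproportionate share of the testing and verification budget, which is exactly the concentration-of-effort strategy the abstract argues for.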

  11. A Verification Method of Inter-Task Cooperation in Embedded Real-time Systems and its Evaluation

    NASA Astrophysics Data System (ADS)

    Yoshida, Toshio

    In the software development process for embedded real-time systems, the design of the task cooperation process is very important. The cooperating process of such tasks is specified by task cooperation patterns. Adoption of unsuitable task cooperation patterns has a fatal influence on system performance, quality, and extendibility. In order to prevent repetitive work caused by a shortage of task cooperation performance, it is necessary to verify task cooperation patterns in an early software development stage. However, it is very difficult to verify task cooperation patterns in an early development stage, where task program codes are not yet complete. Therefore, we propose a verification method using task skeleton program codes and a real-time kernel that has a function for recording all events during software execution, such as system calls issued by task program codes, external interrupts, and timer interrupts. In order to evaluate the proposed verification method, we applied it to the software development process of a mechatronics control system.

  12. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g., boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
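
    The core check behind such loop-invariant discovery is inductiveness: the candidate must hold initially, be preserved by the loop body, and imply the postcondition on exit. The sketch below runs that check with the z3 SMT solver for a trivial counting loop; z3 is a stand-in here, since the framework itself is built on Java PathFinder and symbolic execution.

```python
# Checking that a candidate loop invariant is inductive, via z3.
from z3 import Int, Implies, And, Not, Solver, unsat

x, x1 = Int("x"), Int("x1")
inv = lambda v: And(v >= 0, v <= 10)    # candidate invariant for: while (x < 10) x++

def valid(formula):
    s = Solver()
    s.add(Not(formula))                 # valid iff the negation is unsatisfiable
    return s.check() == unsat

init      = valid(Implies(x == 0, inv(x)))                             # base case
inductive = valid(Implies(And(inv(x), x < 10, x1 == x + 1), inv(x1)))  # preserved
post      = valid(Implies(And(inv(x), Not(x < 10)), x == 10))          # exit => post

print(init, inductive, post)   # True True True: the invariant verifies the loop
```

    If the inductiveness check fails, an invariant-strengthening loop of the kind the abstract describes would refine the candidate and try again.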

  13. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  14. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, DECISION FX, INC., GROUNDWATER FX

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, DECISION FX, INC. SAMPLING FX

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  17. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project on aviation safety risk was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, leaving some errors undetected until encountered by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  18. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1976-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program assembly control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools are described, as well as the program test plans and their implementation on the various simulators. Failure effects analysis and the creation of special failure generating software for testing purposes are described.

  19. The 25 kW power module evolution study. Part 3: Conceptual design for power module evolution. Volume 6: WBS and dictionary

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Program elements of the power module (PM) system are identified, structured, and defined according to the planned work breakdown structure. Efforts required to design, develop, manufacture, test, check out, launch, and operate a protoflight assembled 25 kW, 50 kW, and 100 kW PM include the preparation and delivery of related software, government-furnished equipment, space support equipment, ground support equipment, launch site verification software, orbital verification software, and all related data items.

  20. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems, and is coupled to a number of other codes including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners and other Fluidity users an easy-to-use platform to collaborate and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification and large parallel tests. Included in these tests are build tests on HPC systems, including local and UK National HPC services. Testing the code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparison to analytical solutions using the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development and advocate similar procedures for other scientific code applications.
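
    The "gold standard" comparison mentioned above can be made concrete with a small example: manufacture a solution, derive the forcing term, solve numerically, and assert the expected convergence order. The sketch below does this for -u'' = f on [0, 1] with second-order finite differences; it illustrates the method of manufactured solutions generally and is not Fluidity's actual test harness.

```python
# Method of manufactured solutions for -u'' = f, u(0) = u(1) = 0.
import numpy as np

def solve_poisson(n):
    """Second-order finite differences; returns max error vs. the exact solution."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)
    f = np.pi**2 * np.sin(np.pi * x)          # manufactured from u = sin(pi x)
    A = (np.diag(2 * np.ones(n)) - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x)))

e_coarse, e_fine = solve_poisson(50), solve_poisson(100)
order = np.log2(e_coarse / e_fine)            # should approach 2 as h halves
assert order > 1.9, "scheme lost its second-order convergence"
print(f"observed convergence order: {order:.2f}")
```

    Run on every commit, a test like this turns verification into the continuous process the abstract advocates: a regression that silently degrades the discretization order fails the build immediately.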

  1. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification establishes whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describe four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References: Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pp., Sandia National Laboratories, Albuquerque, NM.

  2. Cosimulation of embedded system using RTOS software simulator

    NASA Astrophysics Data System (ADS)

    Wang, Shihao; Duan, Zhigang; Liu, Mingye

    2003-09-01

    Embedded system design often employs co-simulation to verify the system's function; one efficient software verification tool is the Instruction Set Simulator (ISS). As a full functional model of the target CPU, an ISS interprets the instructions of the embedded software step by step, which is usually time-consuming since it simulates at a low level. Hence the ISS often becomes the bottleneck of co-simulation in a complicated system. In this paper, a new software verification tool, the RTOS software simulator (RSS), is presented, and the mechanism of its operation is described in full detail. In the RSS method, the RTOS API is extended and a hardware simulator driver is adopted to handle data exchange and synchronization between the two simulators.

  3. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  4. The Design and Evaluation of Class Exercises as Active Learning Tools in Software Verification and Validation

    ERIC Educational Resources Information Center

    Wu, Peter Y.; Manohar, Priyadarshan A.; Acharya, Sushil

    2016-01-01

    It is well known that interesting questions can stimulate thinking and invite participation. Class exercises are designed to make use of questions to engage students in active learning. In a project toward building a community skilled in software verification and validation (SV&V), we critically review and further develop course materials in…

  5. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.

    The U.S.-Mexico Border Program is sponsored ...

  6. Apollo experience report: Guidance and control systems. Engineering simulation program

    NASA Technical Reports Server (NTRS)

    Gilbert, D. W.

    1973-01-01

    The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.

  7. Using Penelope to assess the correctness of NASA Ada software: A demonstration of formal methods as a counterpart to testing

    NASA Technical Reports Server (NTRS)

    Eichenlaub, Carl T.; Harper, C. Douglas; Hird, Geoffrey

    1993-01-01

    Life-critical applications warrant a higher level of software reliability than has yet been achieved. Since it is not certain that traditional methods alone can provide the required ultra reliability, new methods should be examined as supplements or replacements. This paper describes a mathematical counterpart to the traditional process of empirical testing. ORA's Penelope verification system is demonstrated as a tool for evaluating the correctness of Ada software. Grady Booch's Ada calendar utility package, obtained through NASA, was specified in the Larch/Ada language. Formal verification in the Penelope environment established that many of the package's subprograms met their specifications. In other subprograms, failed attempts at verification revealed several errors that had escaped detection by testing.

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, ENVIRONMENTAL DECISION SUPPORT SOFTWARE, UNIVERSITY OF TENNESSEE RESEARCH CORPORATION, SPATIAL ANALYSIS AND DECISION ASSISTANCE (SADA)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  9. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    NASA Technical Reports Server (NTRS)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  10. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process - a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL's modern scientific software development framework, MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5's capability and extends the analysis capability for all reactor system simulation scenarios.

  11. Knowledge based system verification and validation as related to automation of space station subsystems: Rationale for a knowledge based system lifecycle

    NASA Technical Reports Server (NTRS)

    Richardson, Keith; Wong, Carla

    1988-01-01

    The role of verification and validation (V and V) in software has been to support and strengthen the software lifecycle and to ensure that the resultant code meets the standards of the requirements documents. Knowledge Based System (KBS) V and V should serve the same role, but the KBS lifecycle is ill-defined. The rationale of a simple form of the KBS lifecycle is presented, including accommodation to certain critical KBS differences from software development.

  12. Hosted Services for Advanced V and V Technologies: An Approach to Achieving Adoption without the Woes of Usage

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.

    2003-01-01

    Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.

  13. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen techniques in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible are also documented.

  14. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Software quality is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software is characterized by no wearout and costly redundancy, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to represent the number and types of failures occurring during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives, and methods to estimate the expected number of fixes required, are also presented.
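
    As an illustrative stand-in for such a stop-testing criterion (the paper derives its own statistical model), the sketch below uses the standard Goel-Okumoto reliability-growth model, in which the expected number of failures found by testing time t is m(t) = a(1 - e^(-bt)), so the expected number remaining is a*e^(-bt); the fitted parameters and test schedule are invented.

```python
# Stop-testing criterion from a Goel-Okumoto NHPP model (illustrative values).
import math

def expected_remaining(a, b, t):
    """Remaining faults = a - m(t) = a * exp(-b t)."""
    return a * math.exp(-b * t)

a, b = 120.0, 0.015          # fitted: total expected faults, detection rate
for week, hours in [(4, 160), (8, 320), (12, 480)]:
    rem = expected_remaining(a, b, hours)
    print(f"week {week}: ~{rem:.1f} faults expected to remain")
    if rem < 1.0:
        print("reliability objective met; testing may stop")
```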

  15. Land surface Verification Toolkit (LVT)

    NASA Technical Reports Server (NTRS)

    Kumar, Sujay V.

    2017-01-01

    LVT is a framework developed to provide an automated, consolidated environment for systematic land surface model evaluation. It includes support for a range of in-situ, remote-sensing, and other model and reanalysis products, and supports the analysis of outputs from various LIS subsystems, including LIS-DA, LIS-OPT, and LIS-UE. Note: The Land Information System Verification Toolkit (LVT) is a NASA software tool designed to enable the evaluation, analysis and comparison of outputs generated by the Land Information System (LIS). The LVT software is released under the terms and conditions of the NASA Open Source Agreement (NOSA) Version 1.1 or later.

  16. Behavioral biometrics for verification and recognition of malicious software agents

    NASA Astrophysics Data System (ADS)

    Yampolskiy, Roman V.; Govindaraju, Venu

    2008-04-01

    Homeland security requires technologies capable of positive and reliable identification of humans for law enforcement, government, and commercial applications. As artificially intelligent agents improve in their abilities and become a part of our everyday life, the possibility of using such programs for undermining homeland security increases. Virtual assistants, shopping bots, and game playing programs are used daily by millions of people. We propose applying statistical behavior modeling techniques, developed by us for the recognition of humans, to the identification and verification of intelligent and potentially malicious software agents. Our experimental results demonstrate the feasibility of such methods for both artificial agent verification and even for recognition purposes.
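
    A minimal sketch of behaviour-based verification, assuming a much simpler model than the authors': build an n-gram frequency profile of an agent's action stream and accept the agent only if the profile is sufficiently similar to an enrolled one. The action names and threshold are invented.

```python
# N-gram behaviour profile with cosine-similarity verification (illustrative).
from collections import Counter
import math

def profile(actions, n=2):
    """Frequency profile of consecutive n-grams in the agent's action stream."""
    return Counter(zip(*(actions[i:] for i in range(n))))

def cosine(p, q):
    dot = sum(p[g] * q[g] for g in p)
    norm = (math.sqrt(sum(v * v for v in p.values()))
            * math.sqrt(sum(v * v for v in q.values())))
    return dot / norm if norm else 0.0

enrolled = profile(["open", "search", "add", "pay", "open", "search"])
session  = profile(["open", "search", "add", "pay"])
THRESHOLD = 0.7                        # tuned on held-out sessions in practice
print("verified" if cosine(enrolled, session) >= THRESHOLD else "rejected")
```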

  17. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  18. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) to expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one will discuss the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one will also give an overview of some common analysis techniques that are applied when performing V&V of software. All of these materials will be presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  19. Logic Model Checking of Time-Periodic Real-Time Systems

    NASA Technical Reports Server (NTRS)

    Florian, Mihai; Gamble, Ed; Holzmann, Gerard

    2012-01-01

    In this paper we report on the work we performed to extend the logic model checker SPIN with built-in support for the verification of periodic, real-time embedded software systems, as commonly used in aircraft, automobiles, and spacecraft. We first extended the SPIN verification algorithms to model priority based scheduling policies. Next, we added a library to support the modeling of periodic tasks. This library was used in a recent application of the SPIN model checker to verify the engine control software of an automobile, to study the feasibility of software triggers for unintended acceleration events.
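
    SPIN models are written in Promela, so the sketch below is only a Python analogue of the periodic, priority-scheduled semantics being verified: tasks are released on their periods, the highest-priority ready task runs each tick, and an assertion flags any deadline miss over one hyperperiod. The task set is made up; shorter period means higher priority (rate-monotonic).

```python
# Discrete-time analogue of priority-based scheduling of periodic tasks.
# Each task: (name, period == deadline, worst-case execution time).
tasks = [("engine_ctrl", 5, 2), ("telemetry", 10, 3), ("logging", 20, 4)]
tasks.sort(key=lambda t: t[1])           # rate-monotonic priority order

hyperperiod = 20
remaining = {name: 0 for name, _, _ in tasks}
for tick in range(hyperperiod):
    for name, period, wcet in tasks:
        if tick % period == 0:
            # The previous job must be done before its deadline (= period).
            assert remaining[name] == 0, f"{name} missed its deadline at t={tick}"
            remaining[name] = wcet       # release a new job
    for name, _, _ in tasks:             # run the highest-priority ready job
        if remaining[name] > 0:
            remaining[name] -= 1
            break
print("no deadline misses over one hyperperiod")
```

    A model checker explores all interleavings rather than one simulated trace, which is exactly what the SPIN extension described above adds for this class of systems.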

  20. Formal Verification of Large Software Systems

    NASA Technical Reports Server (NTRS)

    Yin, Xiang; Knight, John

    2010-01-01

    We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.

  1. A dedicated software application for treatment verification with off-line PET/CT imaging at the Heidelberg Ion Beam Therapy Center

    NASA Astrophysics Data System (ADS)

    Chen, W.; Bauer, J.; Kurz, C.; Tessonnier, T.; Handrack, J.; Haberer, T.; Debus, J.; Parodi, K.

    2017-01-01

    We present the workflow of the offline PET-based range verification method used at the Heidelberg Ion Beam Therapy Center, detailing the functionalities of an in-house developed software application, SimInterface14, with which the range analysis is performed. Moreover, we introduce the design of a decision support system that assesses uncertainties and supports physicians in decision making for plan adaptation.

  2. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  3. Simulated Order Verification and Medication Reconciliation during an Introductory Pharmacy Practice Experience.

    PubMed

    Metzger, Nicole L; Chesson, Melissa M; Momary, Kathryn M

    2015-09-25

    Objective. To create, implement, and assess a simulated medication reconciliation and an order verification activity using hospital training software. Design. A simulated patient with medication orders and home medications was built into existing hospital training software. Students in an institutional introductory pharmacy practice experience (IPPE) reconciled the patient's medications and determined whether or not to verify the inpatient orders based on his medical history and laboratory data. After reconciliation, students identified medication discrepancies and documented their rationale for rejecting inpatient orders. Assessment. For a 3-year period, the majority of students agreed the simulation enhanced their learning, taught valuable clinical decision-making skills, integrated material from previous courses, and stimulated their interest in institutional pharmacy. Overall feedback from student evaluations about the IPPE also was favorable. Conclusion. Use of existing hospital training software can affordably simulate the pharmacist's role in order verification and medication reconciliation, as well as improve clinical decision-making.
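
    A minimal sketch of the reconciliation step the students performed: compare the home medication list against the inpatient orders and flag both omissions and new or changed orders for pharmacist review. The drug strings and the simple exact-match comparison are illustrative only; real reconciliation matches on drug, dose, route, and frequency.

```python
# Toy medication reconciliation by set comparison (illustrative only).
home_meds = {"lisinopril 10 mg daily", "metformin 500 mg BID",
             "aspirin 81 mg daily"}
inpatient_orders = {"lisinopril 10 mg daily", "metformin 1000 mg BID",
                    "heparin 5000 units q8h"}

omitted = home_meds - inpatient_orders          # home meds not ordered
new_or_changed = inpatient_orders - home_meds   # orders with no home match

for med in sorted(omitted):
    print(f"REVIEW: home medication not ordered -> {med}")
for med in sorted(new_or_changed):
    print(f"REVIEW: new or changed inpatient order -> {med}")
```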

  4. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
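
    The verification-plan structure described above translates naturally into plain data types. The sketch below mirrors the paper's terminology (Verification Requirement, Success Criteria, Method, Level, Owner, and Activities grouped into Events) as Python dataclasses; the field names follow the abstract, while the example requirement content is invented.

```python
# Data-type sketch of the LSST-style verification plan (example content invented).
from dataclasses import dataclass, field
from enum import Enum

class Method(Enum):
    INSPECTION = "inspection"
    ANALYSIS = "analysis"
    DEMONSTRATION = "demonstration"
    TEST = "test"

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    methods: list          # one or more Method values
    level: str             # e.g. component, subsystem, system
    owner: str

@dataclass
class VerificationEvent:
    """A collection of activities that can be executed concurrently."""
    name: str
    activities: list = field(default_factory=list)

plan = VerificationPlan(
    requirement_id="REQ-0042",                       # hypothetical
    verification_requirement="Measure tracking jitter on sky",
    success_criteria="RMS jitter below the specified bound",
    methods=[Method.TEST, Method.ANALYSIS],
    level="system",
    owner="telescope I&T team",
)
event = VerificationEvent("early commissioning run", [plan])
print(event)
```

    Holding the plan as structured data is what enables the traceability the paper describes: each activity can be mapped mechanically onto a scheduled, costed task in the project management system.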

  5. Reflight certification software design specifications

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The PDSS/IMC Software Design Specification for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) is contained. The PDSS/IMC is to be used for checkout and verification of the IMC flight hardware and software by NASA/MSFC.

  6. Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.

    PubMed

    Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L

    1995-01-01

    This work presents the verification of an on-line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification was made in the form of an intercomparison with two independent thermoluminescent dosimetry (TLD) systems under TBI conditions.
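
    A hedged sketch of the signal-to-dose conversion such a system performs: subtract the dark reading, apply a calibration factor, and multiply by correction factors, as is typical for diode dosimetry. The factor names and numbers below are placeholders, not the authors' calibration chain.

```python
# Illustrative diode-signal-to-dose conversion (placeholder factors).
def diode_to_dose(signal_nC, dark_nC, cal_cGy_per_nC, corrections):
    """Dose = net signal x calibration factor x product of correction factors
    (e.g. temperature, SSD, field size)."""
    net = signal_nC - dark_nC
    k = 1.0
    for c in corrections.values():
        k *= c
    return net * cal_cGy_per_nC * k

dose = diode_to_dose(
    signal_nC=52.4, dark_nC=0.3, cal_cGy_per_nC=2.31,
    corrections={"temperature": 1.012, "ssd": 0.985, "field_size": 1.004},
)
print(f"absorbed dose: {dose:.1f} cGy")
```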

  7. Automatic programming for critical applications

    NASA Technical Reports Server (NTRS)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Usually, execution performance is an expected requirement in a software development process. Unfortunately, the verification and maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear-plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  8. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to the finding of synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary for obtaining sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  9. Specification, Synthesis, and Verification of Software-based Control Protocols for Fault-Tolerant Space Systems

    DTIC Science & Technology

    2016-08-16

    Report AFRL-RV-PS-TR-2016-0112, Air Force Research Laboratory, Space Vehicles Directorate (AFRL/RVSV), Kirtland AFB, NM 87117-5776.

  10. Open-Source Software in Computational Research: A Case Study

    DOE PAGES

    Syamlal, Madhava; O'Brien, Thomas J.; Benyahia, Sofiane; ...

    2008-01-01

    A case study of the open-source (OS) development of the computational research software MFIX, used for multiphase computational fluid dynamics simulations, is presented here. The verification and validation steps required for constructing modern computational software and the advantages of OS development in those steps are discussed. The infrastructure used for enabling the OS development of MFIX is described. The impact of OS development on computational research and education in gas-solids flow, as well as the dissemination of information to other areas such as geophysical and volcanology research, is demonstrated. This study shows that the advantages of OS development were realized in the case of MFIX: verification by many users, which enhances software quality; the use of software as a means for accumulating and exchanging information; and the facilitation of peer review of the results of computational research.

  11. Cassini's Test Methodology for Flight Software Verification and Operations

    NASA Technical Reports Server (NTRS)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft comprises various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort, from design and development through operations. As planned, major modifications to certain FSW functions were designed, tested, verified, and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  12. WFF TOPEX Software Documentation Overview, May 1999. Volume 2

    NASA Technical Reports Server (NTRS)

    Brooks, Ronald L.; Lee, Jeffrey

    2003-01-01

    This document provides an overview of software development activities and the resulting products and procedures developed by the TOPEX Software Development Team (SWDT) at Wallops Flight Facility, in support of the WFF TOPEX Engineering Assessment and Verification efforts.

  13. Engineering of the LISA Pathfinder mission—making the experiment a practical reality

    NASA Astrophysics Data System (ADS)

    Warren, Carl; Dunbar, Neil; Backler, Mike

    2009-05-01

    LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.

  14. Advanced software techniques for data management systems. Volume 1: Study of software aspects of the phase B space shuttle avionics system

    NASA Technical Reports Server (NTRS)

    Martin, F. H.

    1972-01-01

    An overview of the executive system design task is presented. The flight software executive system, software verification, phase B baseline avionics system review, higher order languages and compilers, and computer hardware features are also discussed.

  15. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software, including automated verification tools, software modeling, testing techniques, structured programming, and management techniques, is presented. This final report contains the results of this investigation, an analysis of each technique, and the definition of a methodology for producing reliable software.

  16. SAGA: A project to automate the management of software production systems

    NASA Technical Reports Server (NTRS)

    Campbell, Roy H.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.

    1987-01-01

    The Software Automation, Generation and Administration (SAGA) project is investigating the design and construction of practical software engineering environments for developing and maintaining aerospace systems and applications software. The research includes the practical organization of the software lifecycle, configuration management, software requirements specifications, executable specifications, design methodologies, programming, verification, validation and testing, version control, maintenance, the reuse of software, software libraries, documentation, and automated management.

  17. 7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.

  18. NASA software specification and evaluation system design, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The research to develop methods for reducing the effort expended in software development and verification is reported. The development of a formal software requirements methodology, a formal specifications language, a programming language, a language preprocessor, and code analysis tools is discussed.

  19. Proceedings of the Second NASA Formal Methods Symposium

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar (Editor)

    2010-01-01

    This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.

  20. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper will present the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  1. Development of a software safety process and a case study of its use

    NASA Technical Reports Server (NTRS)

    Knight, John C.

    1993-01-01

    The goal of this research is to continue the development of a comprehensive approach to software safety and to evaluate the approach with a case study. The case study is a major part of the project, and it involves the analysis of a specific safety-critical system from the medical equipment domain. The particular application being used was selected because of the availability of a suitable candidate system. We consider the results to be generally applicable and in no way particularly limited by the domain. The research is concentrating on issues raised by the specification and verification phases of the software lifecycle since they are central to our previously-developed rigorous definitions of software safety. The theoretical research is based on our framework of definitions for software safety. In the area of specification, the main topics being investigated are the development of techniques for building system fault trees that correctly incorporate software issues and the development of rigorous techniques for the preparation of software safety specifications. The research results are documented. Another area of theoretical investigation is the development of verification methods tailored to the characteristics of safety requirements. Verification of the correct implementation of the safety specification is central to the goal of establishing safe software. The empirical component of this research is focusing on a case study in order to provide detailed characterizations of the issues as they appear in practice, and to provide a testbed for the evaluation of various existing and new theoretical results, tools, and techniques. The Magnetic Stereotaxis System is summarized.

  2. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
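
    To picture what such a translation does, consider this minimal sketch (illustrative only; the function names are invented, and this is not the PVS generator or its intermediate language): a recursive, specification-style definition is paired with the kind of iterative code a generator might emit, and the two are cross-checked on sample inputs.

    ```python
    # Illustrative sketch of specification-to-code translation (hypothetical
    # names; not the PVS code generator described in the paper).

    def gcd_spec(a: int, b: int) -> int:
        """Declarative, recursion-based 'specification' of gcd."""
        return a if b == 0 else gcd_spec(b, a % b)

    def gcd_generated(a: int, b: int) -> int:
        """What a code generator might emit: an equivalent iterative version."""
        while b != 0:
            a, b = b, a % b
        return a

    # Verification-style cross-check: the generated code must agree with the
    # specification on a set of test inputs (a cheap stand-in for proof).
    for x in range(1, 50):
        for y in range(50):
            assert gcd_generated(x, y) == gcd_spec(x, y)
    print("generated code agrees with specification on all sampled inputs")
    ```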

  3. Optimizing IV and V for Mature Organizations

    NASA Technical Reports Server (NTRS)

    Fuhman, Christopher

    2003-01-01

    NASA intends for its future software development agencies to have at least a Level 3 rating in the Carnegie Mellon University Capability Maturity Model (CMM). The CMM has built-in Verification and Validation (V&V) processes that support higher software quality. Independent Verification and Validation (IV&V) of software developed by mature agencies can therefore be more effective than for software developed by less mature organizations. How is independent V&V different with respect to the maturity of an organization? Knowing a priori the maturity of an organization's processes, how can IV&V planners better identify areas of need, choose IV&V activities, and so on? The objective of this research is to provide a complementary set of guidelines and criteria to assist the planning of IV&V activities on a project using a priori knowledge of the measurable levels of maturity of the organization developing the software.

  4. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1975-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program Assembly Control, simulator configuration control, erasable-memory load generation, change procedures, and anomaly reporting are discussed. The primary verification tools--the all-digital simulator, the hybrid simulator, and the Iron Bird simulator--are described, as well as the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program, in which 42 flights, totaling 58 hours of flight time, were successfully made without any DFCS in-flight software or hardware failures.

  5. 78 FR 47804 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ..., ``Configuration Management Plans for Digital Computer Software used in Safety Systems of Nuclear Power Plants... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., Reviews, and Audits for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This...

  6. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    DTIC Science & Technology

    2016-06-01

    The Crowd Sourced Formal Verification (CSFV) program built games that recast FV problems into puzzles to make these problems more accessible, increasing the manpower available to construct FV proofs. This effort supported the CSFV program by hosting the games on a public website and analyzing the gameplay for efficiency in producing FV proofs. Subject terms: crowd source, software, formal verification, games.

  7. Space shuttle orbiter avionics software: Post review report for the entry FACI (First Article Configuration Inspection). [including orbital flight tests integrated system]

    NASA Technical Reports Server (NTRS)

    Markos, H.

    1978-01-01

    Status of the computer programs dealing with space shuttle orbiter avionics is reported. Specific topics covered include: delivery status; SSW software; SM software; DL software; GNC software; level 3/4 testing; level 5 testing; performance analysis; SDL readiness for entry first article configuration inspection; and verification assessment.

  8. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis on model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  9. Using Colored Stochastic Petri Net (CS-PN) software for protocol specification, validation, and evaluation

    NASA Technical Reports Server (NTRS)

    Zenie, Alexandre; Luguern, Jean-Pierre

    1987-01-01

    The specification, verification, validation, and evaluation steps that make up the CS-PN software workflow are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request, or couple (transaction, granule), treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to deduce a verification, validation, and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of stochastic Petri nets, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules, and messages.
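
    As a rough illustration of the place/transition firing rule that Petri-net tools of this kind build on, the following minimal sketch (plain, uncolored, untimed nets; all names invented) encodes a grant/release fragment reminiscent of the protocol's granule handling:

    ```python
    # Minimal place/transition Petri net: enabling and firing rules only.
    # Illustrative sketch, not the CS-PN software; colored and stochastic
    # extensions would attach data to tokens and rates to firings.

    marking = {"request_pending": 1, "granule_free": 1, "granted": 0}

    transitions = {
        # name: (input places consumed, output places produced)
        "grant": ({"request_pending": 1, "granule_free": 1}, {"granted": 1}),
        "release": ({"granted": 1}, {"granule_free": 1}),
    }

    def enabled(name):
        inputs, _ = transitions[name]
        return all(marking[p] >= n for p, n in inputs.items())

    def fire(name):
        assert enabled(name), f"transition {name} is not enabled"
        inputs, outputs = transitions[name]
        for p, n in inputs.items():
            marking[p] -= n
        for p, n in outputs.items():
            marking[p] += n

    fire("grant")
    print(marking)   # {'request_pending': 0, 'granule_free': 0, 'granted': 1}
    fire("release")
    print(marking)   # {'request_pending': 0, 'granule_free': 1, 'granted': 0}
    ```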

  10. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  11. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to (1) ensure that it can be run without user intervention and (2) ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library’s release is described in detail.

  12. Final Report - Regulatory Considerations for Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Wilkinson, Chris; Lynch, Jonathan; Bharadwaj, Raj

    2013-01-01

    This report documents the findings of a preliminary research study into new approaches to the software design assurance of adaptive systems. We suggest a methodology to overcome the software validation and verification difficulties posed by the underlying assumption of non-adaptive software in the requirements-based testing verification methods of RTCA/DO-178B and C. An analysis of the relevant RTCA/DO-178B and C objectives is presented, showing the reasons for the difficulties that arise in showing satisfaction of the objectives, along with suggested additional means by which they could be satisfied. We suggest that the software design assurance problem for adaptive systems is principally one of developing correct and complete high-level requirements and system-level constraints that define the necessary system functional and safety properties to assure the safe use of adaptive systems. We show how analytical techniques such as model-based design, mathematical modeling, and formal or formal-like methods can be used to validate the high-level functional and safety requirements, establish necessary constraints, and provide the verification evidence for the satisfaction of requirements and constraints that supplements conventional testing. Finally, the report identifies the follow-on research topics needed to implement this methodology.

  13. Formal Validation of Aerospace Software

    NASA Astrophysics Data System (ADS)

    Lesens, David; Moy, Yannick; Kanig, Johannes

    2013-08-01

    Any single error in critical software can have catastrophic consequences. Even though failures are usually not advertised, some software bugs have become famous, such as the error in the MIM-104 Patriot. For space systems, experience shows that software errors are a serious concern: more than half of all satellite failures from 2000 to 2003 involved software. To address this concern, this paper addresses the use of formal verification of software developed in Ada.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF URBAN RUNOFF MODELS

    EPA Science Inventory

    This paper will present the verification process and available results of the XP-SWMM modeling system produced by XP-Software, conducted under the USEPA's ETV Program. Wet weather flow (WWF) models are used throughout the US for the evaluation of storm and combined sewer systems. M...

  15. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  16. Tethered satellite system dynamics and control review panel and related activities, phase 3

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Two major tests of the Tethered Satellite System (TSS) engineering and flight units were conducted to demonstrate the functionality of the hardware and software. Deficiencies in the hardware/software integration tests (HSIT) led to a recommendation for more testing to be performed. Selected problem areas of tether dynamics were analyzed, including verification of the severity of skip rope oscillations, verification or comparison runs to explore dynamic phenomena observed in other simulations, and data generation runs to explore the performance of the time domain and frequency domain skip rope observers.

  17. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... standards. (f) The reviewer shall analyze all Fault Tree Analyses (FTA), Failure Mode and Effects... for each product vulnerability cited by the reviewer; (4) Identification of any documentation or... not properly followed; (6) Identification of the software verification and validation procedures, as...

  18. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current-generation aviation software, which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  19. Geometrical verification system using Adobe Photoshop in radiotherapy.

    PubMed

    Ishiyama, Hiromichi; Suzuki, Koji; Niino, Keiji; Hosoya, Takaaki; Hayakawa, Kazushige

    2005-02-01

    Adobe Photoshop is used worldwide and is useful for comparing portal films with simulation films. It is possible to scan images and then view them simultaneously with this software. The purpose of this study was to assess the accuracy of a geometrical verification system using Adobe Photoshop. We prepared the following two conditions for verification. Under one condition, films were hung on light boxes, and examiners measured distances between the isocenter on simulation films and that on portal films by adjusting the bony structures. Under the other condition, films were scanned into a computer and displayed using Adobe Photoshop, and examiners measured distances between the isocenter on simulation films and those on portal films by adjusting the bony structures. To obtain control data, lead balls were used as fiducial points for matching the films accurately. The errors, defined as the differences between the control data and the measurement data, were assessed. Errors of the data obtained using Adobe Photoshop were significantly smaller than those of the data obtained from films on light boxes (p < 0.007). The geometrical verification system using Adobe Photoshop is available on any PC with this software and is useful for improving the accuracy of verification.

  1. Runtime Verification in Context : Can Optimizing Error Detection Improve Fault Diagnosis

    NASA Technical Reports Server (NTRS)

    Dwyer, Matthew B.; Purandare, Rahul; Person, Suzette

    2010-01-01

    Runtime verification has primarily been developed and evaluated as a means of enriching the software testing process. While many researchers have pointed to its potential applicability in online approaches to software fault tolerance, there has been a dearth of work exploring the details of how that might be accomplished. In this paper, we describe how a component-oriented approach to software health management exposes the connections between program execution, error detection, fault diagnosis, and recovery. We identify both research challenges and opportunities in exploiting those connections. Specifically, we describe how recent approaches to reducing the overhead of runtime monitoring aimed at error detection might be adapted to reduce the overhead and improve the effectiveness of fault diagnosis.
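
    A minimal sketch of the kind of monitor in question (the property and names are invented, not the authors' framework): a finite-state observer consumes each event the program emits and reports a violation, which a software health-management layer could then hand to fault diagnosis.

    ```python
    # Sketch of a runtime monitor for the property "every 'open' is followed
    # by a 'close' before the next 'open'". Invented example property; not
    # the component-oriented health-management framework of the paper.

    class OpenCloseMonitor:
        def __init__(self):
            self.open_seen = False

        def step(self, event: str) -> bool:
            """Consume one event; return False on a property violation."""
            if event == "open":
                if self.open_seen:
                    return False          # two opens without a close
                self.open_seen = True
            elif event == "close":
                if not self.open_seen:
                    return False          # close without a matching open
                self.open_seen = False
            return True

    monitor = OpenCloseMonitor()
    trace = ["open", "close", "open", "open"]   # last event violates
    for i, event in enumerate(trace):
        if not monitor.step(event):
            print(f"violation detected at event {i}: {event}")
            break
    ```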

  2. Transmutation Fuel Performance Code Thermal Model Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculations agree with those of the commercial software ABAQUS (Version 6.4-4). This report outlines the verification methodology, the code input, and the calculation results.

  3. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
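
    The cross-checking step can be pictured with a small sketch (invented data and tolerance, standing in for the kind of test data such a library would archive):

    ```python
    # Illustrative cross-check of two simulators' results on one benchmark.
    # The trajectories and tolerance are invented.

    import math

    def cross_check(results_a, results_b, abs_tol=1e-3):
        """Return the list of sample indices where the two codes disagree."""
        assert len(results_a) == len(results_b)
        return [i for i, (a, b) in enumerate(zip(results_a, results_b))
                if abs(a - b) > abs_tol]

    # Pretend output: joint angle vs. time from two different programs.
    t = [0.01 * k for k in range(100)]
    code_a = [math.sin(x) for x in t]
    code_b = [math.sin(x) + 5e-4 for x in t]          # small constant offset

    mismatches = cross_check(code_a, code_b, abs_tol=1e-3)
    print("codes agree" if not mismatches else f"disagree at {mismatches}")
    ```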

  4. Case Studies for Enhancing Student Engagement and Active Learning in Software V&V Education

    ERIC Educational Resources Information Center

    Manohar, Priyadarshan A.; Acharya, Sushil; Wu, Peter; Hansen, Mary; Ansari, Ali; Schilling, Walter

    2015-01-01

    Two critical problems facing the software (S/W) industry today are the lack of appreciation of the full benefits that can be derived from Software Verification and Validation (V&V) and an associated problem of shortage of adequately trained V&V practitioners. To address this situation, the software V&V course curriculum at the author's…

  5. Proposal for hierarchical description of software systems

    NASA Technical Reports Server (NTRS)

    Thauboth, H.

    1973-01-01

    The programming of digital computers has developed into a new dimension of difficulty, because computer hardware has become so powerful that more complex applications are entrusted to computers. The costs of software development, verification, and maintenance are outpacing those of the hardware, and the trend is toward further increases in the sophistication of computer applications and, consequently, of software. To obtain better visibility into software systems and to improve their structure for better testing, verification, and maintenance, a clear but rigorous description and documentation of software is needed. The purpose of this report is to extend present methods in order to obtain documentation that better reflects the interplay between the various components and functions of a software system at different levels of detail, without losing precision of expression. This is done through the use of block diagrams, sequence diagrams, and cross-reference charts. In the appendices, examples from an actual large software system, the Marshall System for Aerospace Systems Simulation (MARSYAS), are presented. The proposed documentation structure is compatible with automating the updating of significant portions of the documentation for better software change control.

  6. Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions

    DTIC Science & Technology

    2012-07-01

    Supporting the Use of CERT® Secure Coding Standards in DoD Acquisitions. Tim Morrow (Software Engineering Institute); Robert Seacord (Software Engineering Institute). CMU/SEI-2012-TN-016. Related practices: Team Software Process (TSP); Capability Maturity Model Integration (CMMI) [Davis 2009].

  7. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  8. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described, which provides a context in which formal verification can fit into the industrial development of Ada software.

  9. Precise and Scalable Static Program Analysis of NASA Flight Software

    NASA Technical Reports Server (NTRS)

    Brat, G.; Venet, A.

    2005-01-01

    Recent NASA mission failures (e.g., Mars Polar Lander and Mars Climate Orbiter) illustrate the importance of having an efficient verification and validation process for such systems. One software error, as simple as it may be, can cause the loss of an expensive mission, or lead to budget overruns and crunched schedules. Unfortunately, traditional verification methods cannot guarantee the absence of errors in software systems. Therefore, we have developed the CGS static program analysis tool, which can exhaustively analyze large C programs. CGS analyzes the source code and identifies statements in which arrays are accessed out of bounds, or pointers are used outside the memory region they should address. This paper gives a high-level description of CGS and its theoretical foundations. It also reports on the use of CGS on real NASA software systems used in Mars missions (from Mars Pathfinder to Mars Exploration Rover) and on the International Space Station.
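
    The class of error CGS targets can be illustrated with a toy interval-style bounds check (a sketch only; CGS itself performs abstract interpretation over large C programs):

    ```python
    # Toy interval analysis for array bounds, in the spirit of (but far
    # simpler than) the abstract interpretation performed by CGS.

    def check_loop_access(lo, hi, array_len):
        """Indices range over the interval [lo, hi]; flag out-of-bounds."""
        if lo < 0 or hi >= array_len:
            return f"possible out-of-bounds access: index in [{lo}, {hi}], " \
                   f"valid range [0, {array_len - 1}]"
        return "access proven in bounds"

    # Models: for (i = 0; i <= N; i++) buf[i] = 0;   with buf of length N.
    N = 10
    print(check_loop_access(0, N, N))       # off-by-one -> flagged
    print(check_loop_access(0, N - 1, N))   # corrected loop -> in bounds
    ```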

  10. Developing interpretable models with optimized set reduction for identifying high risk software components

    NASA Technical Reports Server (NTRS)

    Briand, Lionel C.; Basili, Victor R.; Hetmanski, Christopher J.

    1993-01-01

    Applying equal testing and verification effort to all parts of a software system is not very efficient, especially when resources are limited and scheduling is tight. Therefore, one needs to be able to differentiate low from high fault-frequency components so that testing and verification effort can be concentrated where needed. Such a strategy is expected to detect more faults and thus improve the resulting reliability of the overall system. This paper presents the Optimized Set Reduction approach for constructing such models, intended to fulfill specific software engineering needs. Our approach to classification is to measure the software system and build multivariate stochastic models for predicting high-risk system components. We present experimental results obtained by classifying Ada components into two classes: likely or not likely to generate faults during system and acceptance test. Also, we evaluate the accuracy of the model and the insights it provides into the error-making process.
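
    The flavor of such classification can be suggested with a deliberately simple sketch (invented metrics, thresholds, and component names; the Optimized Set Reduction method itself learns patterns from data rather than applying fixed rules):

    ```python
    # Minimal illustration of sorting components into low/high fault risk
    # from software metrics. Invented thresholds and data.

    components = [
        # (name, cyclomatic complexity, changes in last release)
        ("nav_filter", 42, 17),
        ("telemetry_fmt", 7, 2),
        ("mode_logic", 23, 9),
    ]

    def risk_class(complexity, churn, c_max=20, churn_max=10):
        """Flag a component when any metric exceeds its threshold."""
        return "high" if complexity > c_max or churn > churn_max else "low"

    for name, complexity, churn in components:
        print(f"{name}: {risk_class(complexity, churn)} risk")
    ```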

  11. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.

  12. Formal verification of software-based medical devices considering medical guidelines.

    PubMed

    Daw, Zamira; Cleaveland, Rance; Vetter, Marcus

    2014-01-01

    Software-based devices have increasingly become an important part of several clinical scenarios. Due to their critical impact on human life, medical devices have very strict safety requirements. It is therefore necessary to apply verification methods to ensure that the safety requirements are met. Verification of software-based devices is commonly limited to the verification of their internal elements without considering the interaction that these elements have with other devices as well as the application environment in which they are used. Medical guidelines define clinical procedures, which contain the necessary information to completely verify medical devices. The objective of this work was to incorporate medical guidelines into the verification process in order to increase the reliability of the software-based medical devices. Medical devices are developed using the model-driven method deterministic models for signal processing of embedded systems (DMOSES). This method uses unified modeling language (UML) models as a basis for the development of medical devices. The UML activity diagram is used to describe medical guidelines as workflows. The functionality of the medical devices is abstracted as a set of actions that is modeled within these workflows. In this paper, the UML models are verified using the UPPAAL model-checker. For this purpose, a formalization approach for the UML models using timed automaton (TA) is presented. A set of requirements is verified by the proposed approach for the navigation-guided biopsy. This shows the capability for identifying errors or optimization points both in the workflow and in the system design of the navigation device. In addition to the above, an open source eclipse plug-in was developed for the automated transformation of UML models into TA models that are automatically verified using UPPAAL. The proposed method enables developers to model medical devices and their clinical environment using clinical workflows as one UML diagram. Additionally, the system design can be formally verified automatically.
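
    At the heart of such model checking is a reachability question over the state space of the workflow model. The following toy sketch (an invented mini-workflow; a real tool such as UPPAAL additionally handles clock constraints on timed automata) checks one liveness and one safety requirement by graph search:

    ```python
    # Sketch of the reachability check underlying model checking of a
    # clinical workflow. Invented states; not the DMOSES tooling.

    from collections import deque

    transitions = {
        "idle":        ["plan_biopsy"],
        "plan_biopsy": ["navigate"],
        "navigate":    ["verify_pose", "navigate"],
        "verify_pose": ["sample", "navigate"],
        "sample":      ["done"],
        "done":        [],
    }

    def reachable(start, target, blocked=frozenset()):
        """Breadth-first search over the workflow's state graph."""
        seen, queue = {start}, deque([start])
        while queue:
            state = queue.popleft()
            if state == target:
                return True
            for nxt in transitions.get(state, []):
                if nxt not in seen and nxt not in blocked:
                    seen.add(nxt)
                    queue.append(nxt)
        return False

    # Requirement: every path to "sample" passes through "verify_pose".
    # Standard check: block "verify_pose" and ask if "sample" is reachable.
    assert reachable("idle", "sample")                               # liveness
    assert not reachable("idle", "sample", blocked={"verify_pose"})  # safety
    print("both workflow requirements verified")
    ```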

  13. Verification of operation of the actuator control system using the integration of the B&R Automation Studio software with a virtual model of the actuator system

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller. The way the actuators operate is verified against the logic relationships assumed in the control system. The actuators of the considered control system were three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of the electro-pneumatic control system's operation was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using a data-exchange tool in the form of an OPC server.

  14. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither of them alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting for not only formal verification and program testing, but also the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (failure probabilities of 10^-4 or lower). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a certain confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
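
    For scale, a standard zero-failure reliability-demonstration bound (a textbook calculation, not a result from this work) shows why purely statistical testing at these levels is so expensive: to claim a per-demand failure probability p <= 10^-4 with confidence C = 0.99 after observing no failures, the number of independent tests n must satisfy

    \[
    (1-p)^n \le 1 - C
    \quad\Longrightarrow\quad
    n \ge \frac{\ln(1-C)}{\ln(1-p)} = \frac{\ln 0.01}{\ln(1-10^{-4})} \approx 46{,}050,
    \]

    which is precisely the cost that crediting qualitative V&V evidence is intended to reduce.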

  15. Verification of floating-point software

    NASA Technical Reports Server (NTRS)

    Hoover, Doug N.

    1990-01-01

    Floating point computation presents a number of problems for formal verification. Should one treat the actual details of floating point operations, accept them as imprecisely defined, or ignore round-off error altogether and behave as if floating point operations are perfectly accurate? There is the further problem that a numerical algorithm usually only approximately computes some mathematical function, and we often do not know just how good the approximation is, even in the absence of round-off error. ORA has developed a theory of asymptotic correctness which allows one to verify floating point software with minimal entanglement in these problems. This theory and its implementation in the Ariel C verification system are described. The theory is illustrated using a simple program which finds a zero of a given function by bisection. This paper is presented in viewgraph form.
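
    A generic version of that bisection example can be sketched as follows (not the verified Ariel code): in exact arithmetic the loop halves the bracket every iteration, while in floating point it terminates on a tolerance or when the midpoint is no longer strictly inside the bracket, which is exactly the gap asymptotic correctness is designed to bridge.

    ```python
    # Generic bisection zero-finder (illustrative; not the verified Ariel
    # example). Assumes f is continuous and f(lo), f(hi) have opposite signs.

    def bisect(f, lo, hi, tol=1e-12, max_iter=200):
        flo = f(lo)
        assert flo * f(hi) <= 0.0, "root must be bracketed"
        for _ in range(max_iter):
            mid = 0.5 * (lo + hi)
            if hi - lo <= tol or mid in (lo, hi):   # floating-point floor hit
                return mid
            if flo * f(mid) <= 0.0:
                hi = mid
            else:
                lo, flo = mid, f(mid)
        return 0.5 * (lo + hi)

    # The result is only asymptotically correct: it approximates a zero of f
    # to within the final bracket width, not exactly.
    root = bisect(lambda x: x * x - 2.0, 0.0, 2.0)
    print(root)   # ~1.4142135623730951
    ```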

  16. Software architecture standard for simulation virtual machine, version 2.0

    NASA Technical Reports Server (NTRS)

    Sturtevant, Robert; Wessale, William

    1994-01-01

    The Simulation Virtual Machine (SVM) is an Ada architecture which eases the effort involved in real-time software maintenance and sustaining engineering. The Software Architecture Standard defines the infrastructure from which all the simulation models are built. SVM was developed for and used in the Space Station Verification and Training Facility.

  17. Cooperative GN&C development in a rapid prototyping environment. [flight software design for space vehicles

    NASA Technical Reports Server (NTRS)

    Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim

    1993-01-01

    The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals, NCAD established two hardware/software facilities that take an avionics design project from initial inception through high-fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block-diagram form and then automatically generate source code from these diagrams. A high-fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, conduct system-level verification, performance, and acceptance testing, and perform mission verification using reconfigurable and mission-unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6-DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.

  18. Recovering from "amnesia" brought about by radiation. Verification of the "Over the air" (OTA) application software update mechanism On-Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, Antonio; Sánchez Prieto, Sebastián; Rodriguez Polo, Oscar; Parra Espada, Pablo

    Computer memories are not supposed to forget, but they do. Because of the proximity of the Sun, it is mandatory from the Solar Orbiter boot software perspective to look out for permanent memory errors resulting from single-event latch-up (SEL) failures in application binaries stored in EEPROM and in their SDRAM deployment areas. In this situation, the last line of defense established by FDIR mechanisms is the capability of the boot software to provide an accurate report of memory damage and to perform an application software update that avoids the damaged locations by flashing the EEPROM with a new binary. This paper describes the verification of the OTA EEPROM firmware update procedure of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on board Solar Orbiter. Since the maximum number of rewrites on real EEPROM is limited and permanent memory faults cannot easily be injected in real hardware, the verification was accomplished by using a LEON2 virtual platform (Leon2ViP) with fault-injection capabilities and real SpaceWire interfaces, developed by the Space Research Group (SRG) of the University of Alcalá. This way it is possible to run exactly the same target binary software as would run on the real ICU platform. Furthermore, this virtual hardware-in-the-loop (VHIL) approach makes it possible to communicate with Electrical Ground Support Equipment (EGSE) through real SpaceWire interfaces in an agile, controlled, and deterministic environment.
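
    Two of the acceptance checks such a boot loader must make can be sketched abstractly (invented data layout and structures; the real EPD boot software, fault model, and image format are far more involved):

    ```python
    # Abstract sketch of two OTA-update acceptance checks: (1) the proposed
    # flash placement avoids all reported bad EEPROM regions, and (2) the
    # received image matches its expected CRC. Invented layout and data.

    import zlib

    bad_regions = [(0x2000, 0x20FF), (0x8400, 0x85FF)]   # damaged byte ranges

    def placement_ok(base, size):
        """True if [base, base+size) overlaps no reported bad region."""
        end = base + size - 1
        return all(end < lo or base > hi for lo, hi in bad_regions)

    def image_ok(image: bytes, expected_crc: int):
        """True if the received binary arrived intact."""
        return zlib.crc32(image) == expected_crc

    image = bytes(range(256)) * 4                 # 1 KiB stand-in binary
    crc = zlib.crc32(image)

    print(placement_ok(0x2000, len(image)))       # False: overlaps a bad region
    print(placement_ok(0x4000, len(image)))       # True: clear of damage
    print(image_ok(image, crc))                   # True: arrived intact
    ```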

  19. Computer Program User’s Manual for FIREFINDER Digital Topographic Data Verification Library Dubbing System. Volume II. Dubbing.

    DTIC Science & Technology

    1982-01-29

    Computer Program User's Manual for FIREFINDER Digital Topographic Data Verification Library Dubbing System, Volume II, Dubbing; 29 January 1982. Keywords: software library; FIREFINDER; dubbing. This manual describes the computer

  20. NASA software specification and evaluation system: Software verification/validation techniques

    NASA Technical Reports Server (NTRS)

    1977-01-01

    NASA software requirement specifications were used in the development of a system for validating and verifying computer programs. The software specification and evaluation system (SSES) provides for the effective and efficient specification, implementation, and testing of computer software programs. The system as implemented will produce structured FORTRAN or ANSI FORTRAN programs, but the principles upon which SSES is designed allow it to be easily adapted to other high order languages.

  1. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for the development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile approach in testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The basic premise behind agile software development and testing is that work is iterative and developed incrementally. Agile testing follows an iterative development methodology in which requirements and solutions evolve through collaboration between cross-functional teams. Testing and developing incrementally allows for increased features and enhanced value in each release. This value can be seen throughout the T&V team processes that are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than in a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. The SLS flight software teams want to continue uncovering better ways of developing software in an efficient and project-beneficial manner. Through agile testing, there has been increased value through individuals and interactions over processes and tools, improved customer collaboration, and improved responsiveness to change through controlled planning. The presentation will describe the agile testing methodology as adopted by the SLS FSW test and verification team at Marshall Space Flight Center.

  2. SEPAC software configuration control plan and procedures, revision 1

    NASA Technical Reports Server (NTRS)

    1981-01-01

    SEPAC Software Configuration Control Plan and Procedures are presented. The objective of software configuration control is to establish a process for maintaining configuration control of the SEPAC software, beginning with the baselining of SEPAC Flight Software Version 1 and encompassing the integration and verification tests through Spacelab Level IV Integration. The procedures are designed to provide a simplified but complete configuration control process. The intent is to require a minimum amount of paperwork but provide total traceability of the SEPAC software.

  3. Formal Analysis of the Remote Agent Before and After Flight

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Lowry, Mike; Park, SeungJoon; Pecheur, Charles; Penix, John; Visser, Willem; White, Jon L.

    2000-01-01

    This paper describes two separate efforts that used the SPIN model checker to verify deep space autonomy flight software. The first effort occurred at the beginning of a spiral development process and found five concurrency errors early in the design cycle that the developers acknowledge would not have been found through testing. This effort required a substantial manual modeling effort involving both abstraction and translation from the prototype LISP code to the PROMELA language used by SPIN. This experience and others led to research to address the gap between formal method tools and the development cycle used by software developers. The Java PathFinder tool which directly translates from Java to PROMELA was developed as part of this research, as well as automatic abstraction tools. In 1999 the flight software flew on a space mission, and a deadlock occurred in a sibling subsystem to the one which was the focus of the first verification effort. A second quick-response "cleanroom" verification effort found the concurrency error in a short amount of time. The error was isomorphic to one of the concurrency errors found during the first verification effort. The paper demonstrates that formal methods tools can find concurrency errors that indeed lead to loss of spacecraft functions, even for the complex software required for autonomy. Second, it describes progress in automatic translation and abstraction that eventually will enable formal methods tools to be inserted directly into the aerospace software development cycle.
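
    The kind of error at issue can be illustrated with a tiny explicit-state search (a toy two-process, two-lock model invented here; SPIN performs this style of exhaustive interleaving exploration on PROMELA models, with far stronger reduction techniques):

    ```python
    # Toy explicit-state search for a deadlock in a two-process, two-lock
    # model with opposite acquisition order (the classic deadlock).

    ORDER = {0: ("A", "B"), 1: ("B", "A")}   # each process's lock order

    def moves(state):
        pcs, held = state
        for p in (0, 1):
            if pcs[p] < 2:                               # acquiring phase
                lock = ORDER[p][pcs[p]]
                if lock not in held:                     # lock is free
                    new_held = dict(held); new_held[lock] = p
                    new_pcs = list(pcs); new_pcs[p] += 1
                    yield (tuple(new_pcs), new_held)
            elif pcs[p] == 2:                            # release both locks
                new_held = {l: o for l, o in held.items() if o != p}
                new_pcs = list(pcs); new_pcs[p] = 3      # done
                yield (tuple(new_pcs), new_held)

    def find_deadlock():
        start = ((0, 0), {})
        stack, seen = [start], set()
        while stack:
            pcs, held = state = stack.pop()
            key = (pcs, tuple(sorted(held.items())))
            if key in seen:
                continue
            seen.add(key)
            successors = list(moves(state))
            if not successors and pcs != (3, 3):         # stuck, not finished
                return state
            stack.extend(successors)
        return None

    print("deadlock:", find_deadlock())
    # e.g. ((1, 1), {'B': 1, 'A': 0}): each process holds one lock and
    # waits forever for the other.
    ```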

  4. Application of Lightweight Formal Methods to Software Security

    NASA Technical Reports Server (NTRS)

    Gilliam, David P.; Powell, John D.; Bishop, Matt

    2005-01-01

    Formal specification and verification of security has proven a challenging task. No single method has proven feasible. Instead, an integrated approach that combines several formal techniques can increase confidence in the verification of software security properties. Such an approach, which specifies security properties in a library that can be reused by two instruments, and the instruments' methodologies developed for the National Aeronautics and Space Administration (NASA) at the Jet Propulsion Laboratory (JPL), are described herein. The Flexible Modeling Framework (FMF) is a model-based verification instrument that uses Promela and the SPIN model checker. The Property Based Tester (PBT) uses TASPEC and a Test Execution Monitor (TEM). They are used to reduce vulnerabilities and unwanted exposures in software during the development and maintenance life cycles.

  5. Guidance and Control Software Project Data - Volume 1: Planning Documents

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J. (Editor)

    2008-01-01

    The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the planning documents from the GCS project. Volume 1 contains five appendices: A. Plan for Software Aspects of Certification for the Guidance and Control Software Project; B. Software Development Standards for the Guidance and Control Software Project; C. Software Verification Plan for the Guidance and Control Software Project; D. Software Configuration Management Plan for the Guidance and Control Software Project; and E. Software Quality Assurance Activities.

  6. Validation and verification of a virtual environment for training naval submarine officers

    NASA Astrophysics Data System (ADS)

    Zeltzer, David L.; Pioch, Nicholas J.

    1996-04-01

    A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation, and the derivation of application requirements, are briefly described. This is followed by a discussion of the implementation of the prototype. This implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular, and with respect to VE applications in general.

  7. A system verification platform for high-density epiretinal prostheses.

    PubMed

    Chen, Kuanfu; Lo, Yi-Kai; Yang, Zhi; Weiland, James D; Humayun, Mark S; Liu, Wentai

    2013-06-01

    Retinal prostheses have restored light perception to people worldwide who have poor or no vision as a consequence of retinal degeneration. To advance the quality of visual stimulation for retinal implant recipients, a higher number of stimulation channels is expected in the next generation of retinal prostheses, which poses a great challenge to system design and verification. This paper presents a system verification platform dedicated to the development of retinal prostheses. The system includes primary processing, dual-band power and data telemetry, a high-density stimulator array, and two methods for output verification. End-to-end system validation and individual functional block characterization can be achieved with this platform through visual inspection and software analysis. Custom-built software running on the computers also provides a good way to test new features before they are realized by the ICs. Real-time visual feedback through the video displays makes it easy to monitor and debug the system. The characterization of the wireless telemetry and the demonstration of the visual display are reported in this paper using a 256-channel retinal prosthetic IC as an example.

  8. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.

  9. SU-F-T-494: A Multi-Institutional Study of Independent Dose Verification Using Golden Beam Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Itano, M; Yamazaki, T; Tachibana, R

    Purpose: In general, the beam data of an individual linac is measured for an independent dose verification software program, and the verification is performed as a secondary check. In this study, independent dose verification using golden beam data was compared to that using the individual linac's beam data. Methods: Six institutions participated, and three different beam data sets were prepared. One was individually measured data (Original Beam Data, OBD). The others were generated from all measurements of the same linac model (Model-GBD) and of all linac models (All-GBD). The three beam data sets were registered to the independent verification software program for each institute. Subsequently, patient plans for eight sites (brain, head and neck, lung, esophagus, breast, abdomen, pelvis and bone) were analyzed using the verification program to compare doses calculated using the three different beam data sets. Results: 1116 plans were collected from the six institutes. Compared to the OBD-based calculation, the variation using the Model-GBD-based and the All-GBD-based calculations was 0.0 ± 0.3% and 0.0 ± 0.6%, respectively. The maximum variations were 1.2% and 2.3%, respectively. The plans with variation over 1% had reference points located away from the central axis, with or without a physical wedge. Conclusion: The confidence limit (2SD) using the Model-GBD and the All-GBD was within 0.6% and 1.2%, respectively. Thus, the use of golden beam data may be feasible for independent verification. In addition, verification using golden beam data provides quality assurance of planning from the viewpoint of an audit. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
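
    As a rough illustration of the statistics quoted above, the per-plan deviations can be summarized by their mean, standard deviation, and a 2SD confidence limit. The sketch below uses made-up deviation values and one common convention for the confidence limit (|mean| + 2SD); the study's exact definition is not given in the abstract:

      # Sketch: per-plan percent deviations of golden-beam-data dose from
      # OBD-based dose. Values are illustrative, not the study's data.
      import statistics

      deviations = [0.2, -0.1, 0.4, 0.0, -0.3, 0.1, 0.5, -0.2]

      mean = statistics.mean(deviations)
      sd = statistics.stdev(deviations)
      confidence_limit = abs(mean) + 2 * sd   # one common CL convention
      print(f"mean = {mean:+.2f}%, SD = {sd:.2f}%, "
            f"CL (|mean|+2SD) = {confidence_limit:.2f}%")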

  10. Expert system verification and validation guidelines/workshop task. Deliverable no. 1: ES V/V guidelines

    NASA Technical Reports Server (NTRS)

    French, Scott W.

    1991-01-01

    The goals are to show that verifying and validating a software system is a required part of software development and has a direct impact on the software's design and structure. Workshop tasks are given in the areas of statistics, integration/system test, unit and architectural testing, and a traffic controller problem.

  11. On-Orbit Software Analysis

    NASA Technical Reports Server (NTRS)

    Moran, Susanne I.

    2004-01-01

    The On-Orbit Software Analysis Research Infusion Project was done by Intrinsyx Technologies Corporation (Intrinsyx) at the National Aeronautics and Space Administration (NASA) Ames Research Center (ARC). The Project was a joint collaborative effort between NASA Codes IC and SL, Kestrel Technology (Kestrel), and Intrinsyx. The primary objectives of the Project were: discovery and verification of software program properties and dependencies; detection and isolation of software defects across different versions of software; and compilation of historical data and technical expertise for future applications.

  12. V&V Within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and Validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission-critical software. V&V is a systems engineering discipline that evaluates the software in a systems context, and is currently applied during the development of a specific application system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  13. Autonomy Software: V&V Challenges and Characteristics

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Visser, Willem

    2006-01-01

    The successful operation of unmanned air vehicles requires software with a high degree of autonomy. Only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission and safety critical: failures caused by flaws in the software can not only jeopardize the mission, but could also endanger human life (e.g., a crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planners, constraint solvers, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we present major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges compared to the development of "traditional" safety-critical software. We discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. Results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.

  14. Highly efficient simulation environment for HDTV video decoder in VLSI design

    NASA Astrophysics Data System (ADS)

    Mao, Xun; Wang, Wei; Gong, Huimin; He, Yan L.; Lou, Jian; Yu, Lu; Yao, Qingdong; Pirsch, Peter

    2002-01-01

    With the increasing complexity of VLSI designs, especially SoC (System on Chip) implementations of an MPEG-2 video decoder with HDTV scalability, simulation and verification of the full design, even at the behavioral level in HDL, often prove very slow and costly, and it is difficult to perform full verification until late in the design process. These tasks therefore become the bottleneck of the HDTV video decoder design procedure and strongly influence its time-to-market. In this paper, the architecture of the hardware/software interface of an HDTV video decoder is studied, and a Hardware-Software Mixed Simulation (HSMS) platform is proposed to find and correct errors in the early design stage, based on the MPEG-2 video decoding algorithm. The application of HSMS to the target system is achieved by employing several of the introduced approaches. Those approaches speed up the simulation and verification task without decreasing performance.

  15. Consistent model driven architecture

    NASA Astrophysics Data System (ADS)

    Niepostyn, Stanisław J.

    2015-09-01

    The goal of MDA is to produce software systems from abstract models in a way where human interaction is restricted to a minimum. These abstract models are based on the UML language. However, the semantics of UML models is defined in natural language, so verification of the consistency of these diagrams is needed in order to identify errors in requirements at an early stage of the development process. The verification of consistency is difficult due to the semi-formal nature of UML diagrams. We propose automatic verification of the consistency of a series of UML diagrams originating from abstract models, implemented with our consistency rules. This Consistent Model Driven Architecture approach enables us to automatically generate complete workflow applications from consistent and complete models developed from abstract models (e.g., a Business Context Diagram). Therefore, our method can be used to check the practicability (feasibility) of software architecture models.

  16. Inertial Upper Stage (IUS) software analysis

    NASA Technical Reports Server (NTRS)

    Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.

    1979-01-01

    The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories, is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness of software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.

  17. Software Maintenance Exercises for a Software Engineering Project Course

    DTIC Science & Technology

    1989-02-01

    What is program style and how can it be measured? Program style has been defined as a "followed convention with respect to punctuation, capitalization, and typographic arrangement and display." DASC is a software tool that takes a syntactically... Software Specifications: A Framework; CM-12 Software Metrics; CM-13 Introduction to Software Verification and Validation; CM-14 Intellectual Property Protection for...

  18. Improvement of Computer Software Quality through Software Automated Tools.

    DTIC Science & Technology

    1986-08-31

    The requirement for increased emphasis on software quality assurance has led to the creation of various methods of verification and validation. Experience... The result was a vast array of methods, systems, languages and automated tools to assist in the process. Given that the primary role of quality assurance is... Unfortunately, there is no single method, tool or technique that can ensure accurate, reliable and cost-effective software. Therefore, government and industry...

  19. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled "Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts." The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) develop software tools to support code verification analysis; 2) document standard definitions of code verification test problems; and 3) perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
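
    A typical assessment of the "error behavior of algorithms" mentioned in objective 3 is the observed order of accuracy: run a test problem with a known (often manufactured) exact solution on two grids related by a refinement ratio r, then compare the observed convergence rate with the algorithm's formal order. A minimal sketch, with illustrative error values:

      # Sketch of a standard code-verification calculation: observed order
      # of accuracy from discretization errors on two grids.
      import math

      def observed_order(error_coarse, error_fine, refinement_ratio):
          """p = log(e_coarse / e_fine) / log(r)"""
          return math.log(error_coarse / error_fine) / math.log(refinement_ratio)

      # e.g., errors against an exact (manufactured) solution; halving the
      # grid spacing cut the error by roughly a factor of four
      e_h, e_h2 = 4.0e-3, 1.0e-3
      p = observed_order(e_h, e_h2, 2.0)
      print(f"observed order ~= {p:.2f} (expect ~2 for a second-order scheme)")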

  20. Patient-specific IMRT verification using independent fluence-based dose calculation software: experimental benchmarking and initial clinical experience.

    PubMed

    Georg, Dietmar; Stock, Markus; Kroupa, Bernhard; Olofsson, Jörgen; Nyholm, Tufve; Ahnesjö, Anders; Karlsson, Mikael

    2007-08-21

    Experimental methods are commonly used for patient-specific intensity-modulated radiotherapy (IMRT) verification. The purpose of this study was to investigate the accuracy and performance of independent dose calculation software (denoted as 'MUV' (monitor unit verification)) for patient-specific quality assurance (QA). 52 patients receiving step-and-shoot IMRT were considered. IMRT plans were recalculated by the treatment planning systems (TPS) in a dedicated QA phantom, in which an experimental 1D and 2D verification (0.3 cm³ ionization chamber; films) was performed. Additionally, an independent dose calculation was performed. The fluence-based algorithm of MUV accounts for collimator transmission, rounded leaf ends, tongue-and-groove effect, backscatter to the monitor chamber and scatter from the flattening filter. The dose calculation utilizes a pencil beam model based on a beam quality index. DICOM RT files from patient plans, exported from the TPS, were directly used as patient-specific input data in MUV. For composite IMRT plans, average deviations in the high dose region between ionization chamber measurements and point dose calculations performed with the TPS and MUV were 1.6 +/- 1.2% and 0.5 +/- 1.1% (1 S.D.). The dose deviations between MUV and TPS slightly depended on the distance from the isocentre position. For individual intensity-modulated beams (total 367), an average deviation of 1.1 +/- 2.9% was determined between calculations performed with the TPS and with MUV, with maximum deviations up to 14%. However, absolute dose deviations were mostly less than 3 cGy. Based on the current results, we aim to apply a confidence limit of 3% (with respect to the prescribed dose) or 6 cGy for routine IMRT verification. For off-axis points at distances larger than 5 cm and for low dose regions, we consider 5% dose deviation or 10 cGy acceptable. The time needed for an independent calculation compares very favourably with the net time for an experimental approach. The physical effects modelled in the dose calculation software MUV allow accurate dose calculations in individual verification points. Independent calculations may be used to replace experimental dose verification once the IMRT programme is mature.
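
    The acceptance criteria quoted above translate directly into a small decision rule. The sketch below encodes the 3%/6 cGy routine tolerance and the relaxed 5%/10 cGy criterion for far off-axis or low-dose points; the function name and argument conventions are illustrative, not from the paper:

      # Sketch of the acceptance rule described in the abstract.
      def within_tolerance(dev_percent, dev_cgy, off_axis_cm=0.0, low_dose=False):
          if off_axis_cm > 5.0 or low_dose:
              # relaxed criterion for far off-axis or low-dose points
              return abs(dev_percent) <= 5.0 or abs(dev_cgy) <= 10.0
          # routine criterion: 3% of prescribed dose or 6 cGy
          return abs(dev_percent) <= 3.0 or abs(dev_cgy) <= 6.0

      print(within_tolerance(2.4, 5.0))                   # True: routine point
      print(within_tolerance(4.2, 8.0, off_axis_cm=7.0))  # True: relaxed rule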

  1. Comparison between In-house developed and Diamond commercial software for patient specific independent monitor unit calculation and verification with heterogeneity corrections.

    PubMed

    Kuppusamy, Vijayalakshmi; Nagarajan, Vivekanandan; Jeevanandam, Prakash; Murugan, Lavanya

    2016-02-01

    This study aimed to compare two different monitor unit (MU) or dose verification software systems in volumetric modulated arc therapy (VMAT), using a modified Clarkson's integration technique for 6 MV photon beams. An in-house Excel spreadsheet-based monitor unit verification calculation (MUVC) program and PTW's DIAMOND secondary check software (SCS), version 6, were used as a secondary check to verify the monitor units (MU) or dose calculated by the treatment planning system (TPS). In this study 180 patients were grouped into 61 head and neck, 39 thorax and 80 pelvic sites. Verification plans were created using the PTW OCTAVIUS 4D phantom and also measured using the 729 detector array, with the isocentre as the point of measurement for each field. In the analysis of 154 clinically approved VMAT plans with the isocentre at a region above -350 HU, using heterogeneity corrections, the in-house spreadsheet-based MUVC program and DIAMOND SCS showed good agreement with the TPS. The overall average percentage deviations for all sites were (-0.93 ± 1.59)% and (1.37 ± 2.72)% for the in-house spreadsheet-based MUVC program and DIAMOND SCS, respectively. For 26 clinically approved VMAT plans with the isocentre at a region below -350 HU, both the in-house spreadsheet-based MUVC program and DIAMOND SCS showed higher variations. It can be concluded that for patient-specific quality assurance (QA), the in-house Excel spreadsheet-based MUVC program and DIAMOND SCS can be used as a simple and fast complement to measurement-based verification for plans with the isocentre at a region above -350 HU. Copyright © 2016 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
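
    Clarkson-style calculations average scatter contributions over sectors drawn around the calculation point. A highly simplified sketch of that sector integration is shown below; a real MU check uses commissioned scatter data and many additional corrections, and the sar() lookup here is a made-up placeholder:

      # Highly simplified sketch of Clarkson-style sector integration, the
      # technique the in-house spreadsheet adapts for VMAT MU checks.
      import math

      def sar(radius_cm):
          # placeholder scatter-air-ratio lookup (illustrative, not beam data)
          return 0.03 * (1.0 - math.exp(-radius_cm / 5.0))

      def mean_sar(field_radii_cm):
          """Average SAR over N equal sectors, each with its own radius to
          the aperture edge as seen from the calculation point."""
          return sum(sar(r) for r in field_radii_cm) / len(field_radii_cm)

      # radii to the aperture edge in 10-degree sectors (36 values), invented
      radii = [4.0 + 0.5 * math.sin(math.radians(10 * i)) for i in range(36)]
      print(f"sector-averaged SAR = {mean_sar(radii):.4f}")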

  2. Verification Testing: Meet User Needs Figure of Merit

    NASA Technical Reports Server (NTRS)

    Kelly, Bryan W.; Welch, Bryan W.

    2017-01-01

    Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; among other issues, the data necessary to make an informed decision were not available, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements and rationale for the different components of VV&C. The intention is that people receiving M&S software can clearly understand, and have data from, the past development effort. This in turn allows people who have not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform verification on several MATLAB (registered trademark of The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts. This is intended to prevent the code from attempting incorrect or impossible calculations. Additionally, this project looks at the code generally and notes inconsistencies, redundancies, and other aspects that may become problematic or slow down the code's run time. Certain scripts lacking documentation will also be commented and cataloged.
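
    The verification items listed above (limits of operation, units, datatypes) amount to input contracts that can be checked at the top of each script. A minimal sketch in Python (the project's scripts are MATLAB, and the parameter names and limits here are invented for illustration):

      # Sketch of the kind of input checks such a verification effort
      # catalogs: limits of operation, units, and expected datatype.
      def check_inputs(frequency_ghz, elevation_deg):
          if not isinstance(frequency_ghz, (int, float)):
              raise TypeError("frequency_ghz must be numeric")
          if not 1.0 <= frequency_ghz <= 50.0:    # assumed limit of operation
              raise ValueError("frequency_ghz outside verified range [1, 50] GHz")
          if not -90.0 <= elevation_deg <= 90.0:
              raise ValueError("elevation_deg must be in [-90, 90] degrees")

      check_inputs(26.5, 10.0)     # passes silently
      # check_inputs(26.5, 100.0)  # would raise: impossible geometry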

  3. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  4. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  5. Decision Engines for Software Analysis Using Satisfiability Modulo Theories Solvers

    NASA Technical Reports Server (NTRS)

    Bjorner, Nikolaj

    2010-01-01

    The area of software analysis, testing and verification is now undergoing a revolution thanks to the use of automated and scalable support for logical methods. A well-recognized premise is that at the core of software analysis engines is invariably a component using logical formulas for describing states and transformations between system states. The process of using this information for discovering and checking program properties (including such important properties as safety and security) amounts to automatic theorem proving. In particular, theorem provers that directly support common software constructs offer a compelling basis. Such provers are commonly called satisfiability modulo theories (SMT) solvers. Z3 is a state-of-the-art SMT solver developed at Microsoft Research. It can be used to check the satisfiability of logical formulas over one or more theories such as arithmetic, bit-vectors, lists, records and arrays. The talk describes some of the technology behind modern SMT solvers, including the solver Z3. Z3 is currently mainly targeted at solving problems that arise in software analysis and verification. It has been applied in various contexts, such as systems for dynamic symbolic simulation (Pex, SAGE, Vigilante), program verification and extended static checking (Spec#/Boogie, VCC, HAVOC), software model checking (Yogi, SLAM), model-based design (FORMULA), security protocol code (F7), and program run-time analysis and invariant generation (VS3). We will describe how it integrates support for a variety of theories that arise naturally in the context of the applications. There are several new promising avenues, and the talk will touch on some of these and the challenges related to SMT solvers.
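
    A small example of the basic SMT workflow: describe a program state as constraints over integers and ask the solver whether any state satisfies them. This uses Z3's publicly available Python bindings (the z3-solver package) rather than any interface specific to the talk:

      # Minimal satisfiability check with Z3's Python bindings
      # (pip install z3-solver).
      from z3 import Ints, Solver, sat

      x, y = Ints("x y")
      s = Solver()
      s.add(x > 0, y > x, x + y == 10)   # constraints describing a state
      if s.check() == sat:
          print(s.model())               # e.g., [x = 1, y = 9]
      else:
          print("no state satisfies the constraints")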

  6. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety in at design time, as in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus also help reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.
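
    One of the run-time techniques mentioned, runtime assurance, is often structured as a monitor that checks a safety envelope every control step and reverts to a trusted fallback when the advanced controller violates it. A minimal sketch with invented controllers and an invented envelope:

      # Sketch of a runtime-assurance pattern: envelope check plus fallback.
      def advanced_controller(state):
          return 2.5 * state          # untrusted, possibly aggressive command

      def fallback_controller(state):
          return 0.5 * state          # simple, verified command

      def safe_step(state, limit=1.0):
          cmd = advanced_controller(state)
          if abs(cmd) > limit:        # safety envelope violated
              cmd = fallback_controller(state)
          return cmd

      print(safe_step(0.3))   # advanced command stays within the envelope
      print(safe_step(0.9))   # envelope violated: fallback takes over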

  7. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (IIT-A-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the Arizona NHEXAS project and Border study at the Illinois Institute of Technology (IIT) site. Keywords: computers; s...

  8. Software Independent Verification and Validation (SIV&V) Simplified

    DTIC Science & Technology

    2006-12-01

    Configuration Item; I/O Input/Output; I2V2 Independent Integrated Verification and Validation; IBM International Business Machines; ICD Interface...; IPT Integrated Product Team; IRS Interface Requirements Specification; ISD Integrated System Diagram; ITD Integrated Test Description; ITP... programming languages such as COBOL (Common Business Oriented Language) (CODASYL committee, 1960) and FORTRAN (FORmula TRANslator) (IBM, 1952)...

  9. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  10. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  11. A new plan-scoring method using normal tissue complication probability for personalized treatment plan decisions in prostate cancer

    NASA Astrophysics Data System (ADS)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie; Chang, Kyung Hwan

    2018-01-01

    The aim of this study was to derive a new plan-scoring index using normal tissue complication probabilities to verify different plans in the selection of personalized treatment. Plans for 12 patients treated with tomotherapy were used to compare scoring for ranking. Dosimetric and biological indexes were analyzed for the plans for a clearly distinguishable group (n = 7) and a similar group (n = 12), using treatment plan verification software that we developed. The quality factor (QF) of our support software for treatment decisions was consistent with the final treatment plan for the clearly distinguishable group (average QF = 1.202, 100% match rate, n = 7), but less so for the similar group (average QF = 1.058, 33% match rate, n = 12). Therefore, we propose a normal tissue complication probability (NTCP)-based plan-scoring index for verification of different plans in personalized treatment-plan selection. Scoring using the new QF showed a 100% match rate (average NTCP QF = 1.0420). The NTCP-based QF scoring method was adequate for obtaining biological verification quality and organ-risk savings using the treatment-planning decision-support software we developed for prostate cancer.
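
    The abstract does not specify which NTCP model underlies the scoring index, but a widely used choice is the Lyman-Kutcher-Burman (LKB) form driven by a generalized equivalent uniform dose (gEUD). The sketch below shows that combination with purely illustrative DVH values and published-style parameters:

      # Sketch: LKB-style NTCP from a gEUD. Illustrative only; not the
      # authors' model or parameters.
      import math

      def lkb_ntcp(eud, td50, m):
          """NTCP = Phi((EUD - TD50) / (m * TD50)), Phi = std normal CDF."""
          t = (eud - td50) / (m * td50)
          return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))

      def gEUD(doses, volumes, a):
          """Generalized EUD from a differential DVH (volumes sum to 1)."""
          return sum(v * d ** a for d, v in zip(doses, volumes)) ** (1.0 / a)

      eud = gEUD([60.0, 65.0, 70.0], [0.2, 0.5, 0.3], a=8.0)  # serial organ
      print(f"EUD = {eud:.1f} Gy, NTCP = {lkb_ntcp(eud, td50=76.9, m=0.13):.3f}")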

  12. Proceedings of the Twenty-Third Annual Software Engineering Workshop

    NASA Technical Reports Server (NTRS)

    1999-01-01

    The Twenty-third Annual Software Engineering Workshop (SEW) provided 20 presentations designed to further the goals of the Software Engineering Laboratory (SEL) of NASA-GSFC. The presentations were selected for their creativity. The sessions, held on 2-3 December 1998, centered on the SEL, Experimentation, Inspections, Fault Prediction, Verification and Validation, and Embedded Systems and Safety-Critical Systems.

  13. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but non-identically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. This model also shows the difficulty inherent in statistical verification of very highly reliable software such as that used by digital avionics in commercial aircraft.
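
    Concretely, if N faults are present initially and each contributes hazard phi, the i-th inter-failure time is exponential with rate (N - i + 1) * phi, the Jelinski-Moranda special case of this order-statistics model. A simulation-only sketch with illustrative parameters (the paper concerns inference, which is not shown here):

      # Sketch: simulate failure times under the Jelinski-Moranda form of
      # the order-statistics model. Parameters are illustrative.
      import random

      def simulate_failure_times(n_faults=20, phi=0.01, seed=1):
          random.seed(seed)
          t, times = 0.0, []
          for i in range(1, n_faults + 1):
              rate = (n_faults - i + 1) * phi  # fewer faults left, lower rate
              t += random.expovariate(rate)
              times.append(t)
          return times

      times = simulate_failure_times()
      print(f"first failure at t={times[0]:.1f}, last at t={times[-1]:.1f}")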

  14. Verification and Validation of the Malicious Activity Simulation Tool (MAST) for Network Administrator Training and Evaluation

    DTIC Science & Technology

    2012-03-01

    ...to sell fake antivirus software; Gammima, which was used to steal gaming login information; and Zeus, which was used to steal banking information... [Table-of-contents fragments: Viruses; Proof of Concept of Software Training Using Malware Mimics; Software; COMPOSE CG-71 Virtual Machines; Integrated Shipboard Network System]

  15. Shuttle avionics software development trials: Tribulations and successes, the backup flight system

    NASA Technical Reports Server (NTRS)

    Chevers, E. S.

    1985-01-01

    The development and verification of the Backup Flight System software (BFS) is discussed. The approach taken for the BFS was to develop a very simple and straightforward software program and then test it in every conceivable manner. The result was a program that contained approximately 12,000 full words, including ground checkout and the built-in test program for the computer. To perform verification, a series of tests was defined using the actual flight-type hardware and simulated flight conditions. Then simulated flights were flown and detailed performance analysis was conducted. The intent of most BFS tests was to demonstrate that a stable flightpath could be obtained after engagement from an anomalous initial condition. The extension of the BFS to meet the requirements of the orbital flight test phase is also described.

  16. GlassForm

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2011-09-16

    GlassForm is a software tool for generating preliminary waste glass formulas for a given waste stream. The software is useful because it reduces the number of verification melts required to develop a suitable additive composition. The software includes property models that calculate glass properties of interest from the chemical composition of the waste glass. The software includes property models for glass viscosity, electrical conductivity, glass transition temperature, and leach resistance as measured by the 7-day product consistency test (PCT).

  17. Production of Reliable Flight Crucial Software: Validation Methods Research for Fault Tolerant Avionics and Control Systems Sub-Working Group Meeting

    NASA Technical Reports Server (NTRS)

    Dunham, J. R. (Editor); Knight, J. C. (Editor)

    1982-01-01

    The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software is considered. Thirteen software development projects are discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.

  18. Simulator predicts transient flow for Malaysian subsea pipeline

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Inayat-Hussain, A.A.; Ayob, M.S.; Zain, A.B.M.

    1996-04-15

    In a step towards acquiring in-house capability in multiphase flow technology, Petronas Research and Scientific Services Sdn. Bhd., Kuala Lumpur, has developed two-phase flow simulation software for analyzing slow gas-condensate transient flow. Unlike its general-purpose contemporaries -- TACITE, OLGA, Traflow (OGJ, Jan. 3, 1994, p. 42; OGJ, Jan. 10, 1994, p. 52), and PLAC (AEA Technology, U.K.) -- ABASs is dedicated software for the slow transient flows generated during pigging operations in the Duyong network, offshore Malaysia. This network links the Duyong and Bekok fields to the onshore gas terminal (OGT) on the east coast of peninsular Malaysia. It predicts the steady-state pressure drop vs. flow rates, the condensate volume in the network, the pigging dynamics including the volume of the produced slug, and the condensate build-up following pigging. The predictions of ABASs have been verified against field data obtained from the Duyong network. Presented here is an overview of the development, verification, and application of the ABASs software. Field data are presented for verification of the software, and several operational scenarios are simulated using the software. The field data and simulation study documented here will provide software users and developers with a further set of results on which to benchmark their own software and two-phase pipeline operating guidelines.

  19. FEBio: finite elements for biomechanics.

    PubMed

    Maas, Steve A; Ellis, Benjamin J; Ateshian, Gerard A; Weiss, Jeffrey A

    2012-01-01

    In the field of computational biomechanics, investigators have primarily used commercial software that is neither geared toward biological applications nor sufficiently flexible to follow the latest developments in the field. This lack of a tailored software environment has hampered research progress, as well as dissemination of models and results. To address these issues, we developed the FEBio software suite (http://mrl.sci.utah.edu/software/febio), a nonlinear implicit finite element (FE) framework, designed specifically for analysis in computational solid biomechanics. This paper provides an overview of the theoretical basis of FEBio and its main features. FEBio offers modeling scenarios, constitutive models, and boundary conditions, which are relevant to numerous applications in biomechanics. The open-source FEBio software is written in C++, with particular attention to scalar and parallel performance on modern computer architectures. Software verification is a large part of the development and maintenance of FEBio, and to demonstrate the general approach, the description and results of several problems from the FEBio Verification Suite are presented and compared to analytical solutions or results from other established and verified FE codes. An additional simulation is described that illustrates the application of FEBio to a research problem in biomechanics. Together with the pre- and postprocessing software PREVIEW and POSTVIEW, FEBio provides a tailored solution for research and development in computational biomechanics.

  20. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, i.e., determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
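
    For a single decision, MC/DC requires, for each condition, a pair of tests that differ only in that condition and flip the decision's outcome. The brute-force sketch below enumerates such independence pairs for an example decision (the expression is invented for illustration):

      # Sketch: enumerate MC/DC independence pairs over a truth table.
      from itertools import product

      def decision(a, b, c):
          return a and (b or c)        # example decision, not from the paper

      conditions = ["a", "b", "c"]
      for i, name in enumerate(conditions):
          pairs = []
          for vec in product([False, True], repeat=3):
              flipped = list(vec)
              flipped[i] = not flipped[i]   # toggle only this condition
              if decision(*vec) != decision(*flipped):
                  pairs.append((vec, tuple(flipped)))
          # each pair is found twice (once from each endpoint)
          print(f"condition {name}: {len(pairs) // 2} independence pair(s)")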

  1. Analysis of key technologies for virtual instruments metrology

    NASA Astrophysics Data System (ADS)

    Liu, Guixiong; Xu, Qingui; Gao, Furong; Guan, Qiuju; Fang, Qiang

    2008-12-01

    Virtual instruments (VIs) require metrological verification when applied as measuring instruments. Owing to their software-centered architecture, metrological evaluation of VIs includes two aspects: measurement functions and software characteristics. The complexity of software imposes difficulties on the metrological testing of VIs. Key approaches and technologies for the metrological evaluation of virtual instruments are investigated and analyzed in this paper. The principal issue is the evaluation of measurement uncertainty. The nature and regularity of measurement uncertainty caused by software and algorithms can be evaluated by modeling, simulation, analysis, testing and statistics, with the support of the powerful computing capability of the PC. Another concern is the evaluation of software features such as the correctness, reliability, stability, security and real-time behavior of VIs. Technologies from the software engineering, software testing and computer security domains can be used for these purposes. For example, a variety of black-box testing, white-box testing and modeling approaches can be used to evaluate the reliability of modules, components, applications and the whole VI software. The security of a VI can be assessed by methods like vulnerability scanning and penetration analysis. In order to enable metrology institutions to perform metrological verification of VIs efficiently, an automatic metrological tool for the above validation is essential. Based on technologies of numerical simulation, software testing and system benchmarking, a framework for the automatic tool is proposed in this paper. Investigation of the implementation of existing automatic tools that perform calculation of measurement uncertainty, software testing and security assessment demonstrates the feasibility of the proposed automatic framework.
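
    The simulation-based evaluation of measurement uncertainty mentioned above is commonly done by Monte Carlo propagation (in the spirit of GUM Supplement 1): sample the inputs from their uncertainty distributions, push each sample through the VI's measurement function, and report the spread of the results. A minimal sketch with an invented measurement function:

      # Sketch: Monte Carlo propagation of input uncertainty through a
      # VI computation. The measurement function is illustrative.
      import random, statistics

      def measurement(v, r):            # example VI computation: P = V^2 / R
          return v * v / r

      random.seed(0)
      samples = [measurement(random.gauss(10.0, 0.05),   # V: 10 V, u = 0.05 V
                             random.gauss(50.0, 0.10))   # R: 50 ohm, u = 0.10
                 for _ in range(50_000)]
      print(f"P = {statistics.mean(samples):.4f} W, "
            f"u(P) = {statistics.stdev(samples):.4f} W")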

  2. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  3. Verified compilation of Concurrent Managed Languages

    DTIC Science & Technology

    2017-11-01

    ...designs for compiler intermediate representations that facilitate mechanized proofs and verification; and (d) a realistic case study that combines these ideas to prove the correctness of a state-of-the-art concurrent garbage collector. Even though concurrency is a pervasive part of modern software and hardware systems, it has often been ignored in safety-critical system designs...

  4. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    This user's guide for the verification and validation (V&V) tools for the Automated Engineering Design (AED) language was written specifically to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run that uses the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, current descriptions of the AED V&V tools are included to augment the information in NASA TM-84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 print file.

  5. An Empirical Verification of a-priori Learning Models on Mailing Archives in the Context of Online Learning Activities of Participants in Free/Libre Open Source Software (FLOSS) Communities

    ERIC Educational Resources Information Center

    Mukala, Patrick; Cerone, Antonio; Turini, Franco

    2017-01-01

    Free/Libre Open Source Software (FLOSS) environments are increasingly dubbed learning environments where practical software engineering skills can be acquired. Numerous studies have extensively investigated how knowledge is acquired in these environments through a collaborative learning model that defines a learning process. Such a learning…

  6. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
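
    The bit-for-bit evaluation mentioned above is typically paired with a tolerance-based fallback for cases where a compiler or platform change makes exact equality unattainable. A minimal sketch of those two comparison modes (not LIVVkit's actual API):

      # Sketch of regression comparison: bit-for-bit equality against a
      # reference output, with a tolerance-based fallback.
      import numpy as np

      def compare(model_out, reference, rtol=1e-12):
          if np.array_equal(model_out, reference):
              return "bit-for-bit match"
          if np.allclose(model_out, reference, rtol=rtol):
              return f"within tolerance (rtol={rtol})"
          return "FAIL: differences exceed tolerance"

      ref = np.array([1.0, 2.0, 3.0])
      print(compare(ref.copy(), ref))         # bit-for-bit match
      print(compare(ref * (1 + 1e-13), ref))  # within tolerance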

  7. Abstract for 1999 Rational Software User Conference

    NASA Technical Reports Server (NTRS)

    Dunphy, Julia; Rouquette, Nicolas; Feather, Martin; Tung, Yu-Wen

    1999-01-01

    We develop spacecraft fault-protection software at NASA/JPL. Challenges exemplified by our task: 1) high-quality systems - need for extensive validation & verification; 2) multi-disciplinary context - involves experts from diverse areas; 3) embedded systems - must adapt to external practices, notations, etc.; and 4) development pressures - NASA's mandate of "better, faster, cheaper".

  8. The Software Maturity Matrix: A Software Performance Metric

    DTIC Science & Technology

    2003-01-28

    ...are for managing; use them! Unused measurements have the same value as last night's unused hotel room or an empty airline seat. Be prepared to... standard measurements are implicit; organization standard verification is implicit; organization standard SMM training can be the basis of an...

  9. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  10. Interface Generation and Compositional Verification in JavaPathfinder

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina

    2009-01-01

    We present a novel algorithm for interface generation of software components. Given a component, our algorithm uses learning techniques to compute a permissive interface representing legal usage of the component. Unlike our previous work, this algorithm does not require knowledge about the component's environment. Furthermore, in contrast to other related approaches, our algorithm computes permissive interfaces even in the presence of non-determinism in the component. Our algorithm is implemented in the JavaPathfinder model checking framework for UML statechart components. We have also added support for automated assume-guarantee style compositional verification in JavaPathfinder, using component interfaces. We report on the application of the presented approach to the generation of interfaces for flight software components.

  11. Large-scale Rectangular Ruler Automated Verification Device

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces an automated verification device for large-scale rectangular rulers, consisting of a photoelectric autocollimator, a self-designed mechanical drive carriage, and an automatic data acquisition system. The mechanical design covers the optical axis, the drive, the fixture, and the wheels. The control system design covers hardware and software: the hardware is based on a single-chip microcontroller system, and the software handles the photoelectric autocollimator and the automatic data acquisition process. The device can acquire perpendicularity measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  12. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment throughout the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  13. Top down, bottom up structured programming and program structuring

    NASA Technical Reports Server (NTRS)

    Hamilton, M.; Zeldin, S.

    1972-01-01

    New design and programming techniques for shuttle software are presented. Based on previous Apollo experience, recommendations are made to apply top-down structured programming techniques to shuttle software. New software verification techniques for large software systems are recommended. HAL, the higher-order language selected for the shuttle flight code, is discussed and found to be adequate for implementing these techniques. Recommendations are made to apply a workable combination of top-down and bottom-up methods in the management of shuttle software. Program structuring is discussed as relevant to both programming and management techniques.

  14. SU-E-T-435: Development and Commissioning of a Complete System for In-Vivo Dosimetry and Range Verification in Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samuel, D; Testa, M; Park, Y

    Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields of current practice. We have developed and commissioned an integrated system, with hardware, software and workflow protocols, to provide a complete solution simultaneously for both in-vivo dosimetry and range verification in proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, separable into three groups for flexibility in application. A special amplifier was developed to capture the extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam's distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5 mm was achieved in homogeneous phantoms, and a precision of 2 mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.

  15. Validation of a Quality Management Metric

    DTIC Science & Technology

    2000-09-01

    A quality management metric (QMM) was used to measure the performance of ten software managers on Department of Defense (DoD) software development programs. Informal verification and validation of the metric compared the QMM score to an overall program success score for the entire program and yielded a positive correlation. The results of applying the QMM can be used to characterize the quality of software management and can serve as a template to improve software management performance. Future work includes further refining the QMM and applying the QMM scores to provide feedback

  16. Seven Processes that Enable NASA Software Engineering Technologies

    NASA Technical Reports Server (NTRS)

    Housch, Helen; Godfrey, Sally

    2011-01-01

    This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired, and maintained as specified in the NPR 7150.2A requirement. The requirement is to ensure that all software be appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management, and (1) Planning & Monitoring. Each process is described, along with the group(s) responsible for it.

  17. Evaluation of verification and testing tools for FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Smith, K. A.

    1980-01-01

    Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and both are recommended for use in testing FORTRAN programs.
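
    As an illustration of the kind of check a static analyzer such as DAVE performs, the sketch below flags variables that are assigned but never read. It operates on Python source via the standard ast module purely for brevity; DAVE itself analyzed FORTRAN.

      # Sketch of a DAVE-style static check: report variables assigned but
      # never used. Illustrative only; not DAVE's actual implementation.
      import ast

      source = """
      x = 1
      y = 2
      print(x)
      """

      tree = ast.parse(source)
      assigned, used = set(), set()
      for node in ast.walk(tree):
          if isinstance(node, ast.Name):
              if isinstance(node.ctx, ast.Store):
                  assigned.add(node.id)
              elif isinstance(node.ctx, ast.Load):
                  used.add(node.id)

      for name in sorted(assigned - used):
          print(f"warning: '{name}' assigned but never used")  # reports: y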

  18. Testing of Hand-Held Mine Detection Systems

    DTIC Science & Technology

    2015-01-08

    ITOP 04-2-5208 for guidance on software testing. Testing software is necessary to ensure that safety is designed into the software algorithm, and that...sensor verification areas or target lanes. F.2. TESTING OBJECTIVES. a. Testing objectives will impact the test design. Some examples of...overall safety, performance, and reliability of the system. It describes activities necessary to ensure safety is designed into the system under test

  19. Joint Logistics Commanders’ Biennial Software Workshop (4th) Orlando II: Solving the PDSS (Post Deployment Software Support) Challenge Held in Orlando, Florida on 27-29 January 87. Volume 2. Proceedings

    DTIC Science & Technology

    1987-06-01

    described the state of maturity of software engineering as being equivalent to the state of maturity of Civil Engineering before Pythagoras invented the...formal verification languages, theorem provers or secure configuration management tools would have to be maintained and used in the PDSS Center to

  20. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    PubMed

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
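
    For instance, one of the software-engineering methods the authors point to, unit testing, might look as follows for a hypothetical three-state Markov cohort model; the states, transition probabilities, and checks are illustrative, not drawn from the paper.

      # Illustrative unit tests for a hypothetical Markov cohort model:
      # transition rows must sum to 1 and the cohort must be conserved.
      import numpy as np

      P = np.array([[0.90, 0.08, 0.02],   # Well -> Well/Sick/Dead (hypothetical)
                    [0.00, 0.85, 0.15],
                    [0.00, 0.00, 1.00]])

      def test_rows_sum_to_one():
          assert np.allclose(P.sum(axis=1), 1.0)

      def test_cohort_conserved():
          cohort = np.array([1000.0, 0.0, 0.0])
          for _ in range(40):              # 40 model cycles
              cohort = cohort @ P
              assert np.isclose(cohort.sum(), 1000.0)

      test_rows_sum_to_one()
      test_cohort_conserved()
      print("model verification checks passed")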

  1. Independent verification and validation for Space Shuttle flight software

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The Committee for Review of Oversight Mechanisms for Space Shuttle Software was asked by the National Aeronautics and Space Administration's (NASA) Office of Space Flight to determine the need to continue independent verification and validation (IV&V) for Space Shuttle flight software. The Committee found that the current IV&V process is necessary to maintain NASA's stringent safety and quality requirements for man-rated vehicles. Therefore, the Committee does not support NASA's plan to eliminate funding for the IV&V effort in fiscal year 1993. The Committee believes that the Space Shuttle software development process is not adequate without IV&V and that elimination of IV&V as currently practiced will adversely affect the overall quality and safety of the software, both now and in the future. Furthermore, the Committee was told that no organization within NASA has the expertise or the manpower to replace the current IV&V function in a timely fashion, nor will building this expertise elsewhere necessarily reduce cost. Thus, the Committee does not recommend moving IV&V functions to other organizations within NASA unless the current IV&V is maintained for as long as it takes to build comparable expertise in the replacing organization.

  2. Knowledge-based system verification and validation

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C.

    1990-01-01

    The objective of this task is to develop and evaluate a methodology for verification and validation (V&V) of knowledge-based systems (KBS) for space station applications with high reliability requirements. The approach consists of three interrelated tasks. The first task is to evaluate the effectiveness of various validation methods for space station applications. The second task is to recommend requirements for KBS V&V for Space Station Freedom (SSF). The third task is to recommend modifications to the SSF to support the development of KBSs using effective software engineering and validation techniques. To accomplish the first task, three complementary techniques will be evaluated: (1) Sensitivity Analysis (Worcester Polytechnic Institute); (2) Formal Verification of Safety Properties (SRI International); and (3) Consistency and Completeness Checking (Lockheed AI Center). During FY89 and FY90, each contractor will independently demonstrate the use of his technique on the fault detection, isolation, and reconfiguration (FDIR) KBS for the manned maneuvering unit (MMU), a rule-based system implemented in LISP. During FY91, the application of each of the techniques to other knowledge representations and KBS architectures will be addressed. After evaluation of the results of the first task and examination of Space Station Freedom V&V requirements for conventional software, a comprehensive KBS V&V methodology will be developed and documented. Development of highly reliable KBSs cannot be accomplished without effective software engineering methods. Using the results of current in-house research to develop and assess software engineering methods for KBSs, as well as assessment of techniques being developed elsewhere, an effective software engineering methodology for space station KBSs will be developed, and modification of the SSF to support these tools and methods will be addressed.

  3. Specification and Verification of Medical Monitoring System Using Petri-nets.

    PubMed

    Majma, Negar; Babamir, Seyed Morteza

    2014-07-01

    To monitor patient behavior, data are collected from the patient's body by a medical monitoring device, and the output is calculated by embedded software. Incorrect calculations may endanger the patient's life if the software fails to meet the patient's requirements. Accordingly, the veracity of the software's behavior is a matter of concern in medicine; moreover, the data collected from the patient's body are fuzzy. Some methods have already dealt with monitoring medical monitoring devices; however, model-based monitoring of the fuzzy computations of such devices has received less attention. The present paper synthesizes a fuzzy Petri-net (FPN) model to verify the behavior of a sample medical monitoring device, a continuous infusion insulin (INS) device, because the Petri-net (PN) is a formal and visual method for verifying software behavior. The device is worn by diabetic patients, and the software calculates the INS dose and makes an injection decision. The input and output of the infusion INS software are not crisp in the real world; therefore, we represent them as fuzzy variables. We then use an FPN instead of a crisp PN to model the fuzzy variables. The paper follows three steps to synthesize an FPN for verification of the infusion INS device: (1) definition of fuzzy variables, (2) definition of fuzzy rules, and (3) design of the FPN model to verify the software behavior.
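
    A minimal sketch of the FPN idea follows; the membership function, rule, and certainty factor are hypothetical illustrations, not the authors' actual insulin model.

      # Sketch of fuzzy Petri-net style rule firing. All values hypothetical.
      def high_glucose(g_mg_dl):
          """Fuzzy membership: degree to which glucose is 'high'."""
          return min(1.0, max(0.0, (g_mg_dl - 120.0) / 80.0))

      def fire(truth_degrees, certainty_factor):
          """FPN transition: output degree = min of input degrees * rule CF."""
          return min(truth_degrees) * certainty_factor

      # Rule (hypothetical): IF glucose is high THEN increase dose (CF = 0.9)
      degree = fire([high_glucose(180.0)], certainty_factor=0.9)
      print(f"degree of 'increase dose' recommendation: {degree:.2f}")  # 0.68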

  4. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  5. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).

  6. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  7. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowell, Michael W

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
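
    The comparison step of such an automated verification reduces to checking computed benchmark values against stored references within tolerances, however the values are produced (e.g., driven through LiveLink™ for MATLAB® or the COMSOL API). A generic sketch, with hypothetical benchmark names and numbers:

      # Sketch of the automated-comparison step of installation verification.
      # Benchmark names, reference values, and tolerances are hypothetical.
      references = {"steady_conduction_T_max": (373.15, 1e-3),  # (value, rtol)
                    "fin_heat_flux_W":         (42.7,   1e-2)}

      def verify(results):
          failures = []
          for name, (ref, tol) in references.items():
              got = results[name]
              if abs(got - ref) > tol * abs(ref):
                  failures.append(f"{name}: got {got}, expected {ref} (rtol={tol})")
          return failures

      problems = verify({"steady_conduction_T_max": 373.16,
                         "fin_heat_flux_W": 41.9})
      print("PASS" if not problems else "\n".join(problems))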

  8. Methodology for Software Reliability Prediction. Volume 2.

    DTIC Science & Technology

    1987-11-01

    The overall acquisition program shall include the resources, schedule, management, structure, and controls necessary to ensure that specified AD...Independent Verification/Validation - Programming Team Structure - Educational Level of Team Members - Experience Level of Team Members * Methods Used...Prediction or Estimation Parameter Supported: Software Characteristics 3. Objectives: Structured programming studies and Government ... procurement

  9. Critical Software for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Preden, Antonio; Kaschner, Jens; Rettig, Felix; Rodriggs, Michael

    2017-01-01

    The NASA Orion vehicle that will fly to the Moon in the coming years is propelled along its mission by the European Service Module (ESM), developed by ESA and its prime contractor Airbus Defense and Space. This paper describes the development of the Propulsion Drive Electronics (PDE) software that provides the interface between the propulsion hardware of the European Service Module and the Orion flight computers, and highlights the challenges that were faced during development. In particular, the aspects specific to human spaceflight in an international cooperation are presented, such as compliance with both European and US standards and the software's criticality classification in the highest category, A. An innovative aspect of the PDE SW is its Time-Triggered Ethernet interface with the Orion flight computers, which has never before been flown on any European spacecraft. Finally, the verification aspects are presented, applying the most exacting quality requirements defined in the European Cooperation for Space Standardization (ECSS) standards, such as structural coverage analysis of the object code and recourse to an independent software verification and validation activity carried out in parallel by a different team.

  10. CaveMan Enterprise version 1.0 Software Validation and Verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hart, David

    The U.S. Department of Energy Strategic Petroleum Reserve stores crude oil in caverns solution-mined in salt domes along the Gulf Coast of Louisiana and Texas. The CaveMan software program has been used since the late 1990s as one tool to analyze pressure measurements monitored at each cavern. The purpose of this monitoring is to catch potential cavern integrity issues as soon as possible. The CaveMan software was written in Microsoft Visual Basic, and embedded in a Microsoft Excel workbook; this method of running the CaveMan software is no longer sustainable. As such, a new version called CaveMan Enterprise has been developed. CaveMan Enterprise version 1.0 does not have any changes to the CaveMan numerical models. CaveMan Enterprise represents, instead, a change from desktop-managed workbooks to an enterprise framework, moving data management into coordinated databases and porting the numerical modeling codes into the Python programming language. This document provides a report of the code validation and verification testing.
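
    Validation of such a port typically includes regression comparisons between the legacy and new implementations. A sketch of that comparison step, with hypothetical file and column names rather than the actual CaveMan test artifacts:

      # Sketch of a legacy-vs-port regression comparison for numerical output.
      # File and column names are hypothetical.
      import csv, math

      def load(path):
          with open(path, newline="") as f:
              return [dict(row) for row in csv.DictReader(f)]

      def compare(legacy_rows, new_rows, column, rtol=1e-6):
          mismatches = 0
          for old, new in zip(legacy_rows, new_rows):
              a, b = float(old[column]), float(new[column])
              if not math.isclose(a, b, rel_tol=rtol):
                  mismatches += 1
          return mismatches

      legacy = load("caveman_legacy_pressures.csv")       # hypothetical export
      ported = load("caveman_enterprise_pressures.csv")   # hypothetical export
      print("mismatches:", compare(legacy, ported, column="predicted_psi"))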

  11. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    Lithography at the 1X-nm technology node uses SMO-ILT, NTD, and other complex patterns. Therefore, in mask defect inspection, defect verification becomes more difficult because many nuisance defects are detected in aggressive mask features. One key technology in mask manufacturing is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is difficult to use for verifying a hundred defects or more. We previously reported the capability of defect verification based on lithography simulation with a SEM system whose architecture and software correlate excellently for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.

  12. Active Mirror Predictive and Requirements Verification Software (AMP-ReVS)

    NASA Technical Reports Server (NTRS)

    Basinger, Scott A.

    2012-01-01

    This software is designed to predict large active mirror performance at various stages in the fabrication lifecycle of the mirror. It was developed for 1-meter class powered mirrors for astronomical purposes, but is extensible to other geometries. The package accepts finite element model (FEM) inputs and laboratory measured data for large optical-quality mirrors with active figure control. It computes phenomenological contributions to the surface figure error using several built-in optimization techniques. These phenomena include stresses induced in the mirror by the manufacturing process and the support structure, the test procedure, high spatial frequency errors introduced by the polishing process, and other process-dependent deleterious effects due to light-weighting of the mirror. Then, depending on the maturity of the mirror, it either predicts the best surface figure error that the mirror will attain, or it verifies that the requirements for the error sources have been met once the best surface figure error has been measured. The unique feature of this software is that it ties together physical phenomenology with wavefront sensing and control techniques and various optimization methods including convex optimization, Kalman filtering, and quadratic programming to both generate predictive models and to do requirements verification. This software combines three distinct disciplines: wavefront control, predictive models based on FEM, and requirements verification using measured data in a robust, reusable code that is applicable to any large optics for ground and space telescopes. The software also includes state-of-the-art wavefront control algorithms that allow closed-loop performance to be computed. It allows for quantitative trade studies to be performed for optical systems engineering, including computing the best surface figure error under various testing and operating conditions. After the mirror manufacturing process and testing have been completed, the software package can be used to verify that the underlying requirements have been met.

  13. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Brantley

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria were compared to simulations using the commercial CFD software ANSYS Fluent® and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability.
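
    The quantitative core of a mesh refinement study is the observed order of convergence, estimated from errors on systematically refined meshes. A sketch with hypothetical error values:

      # Sketch of estimating observed order of convergence from a refinement
      # study. Element sizes and error values are hypothetical.
      import math

      h = [0.04, 0.02, 0.01]           # element sizes, refinement ratio r = 2
      err = [1.6e-2, 4.1e-3, 1.0e-3]   # e.g., L2 error vs. an exact solution

      for k in range(len(h) - 1):
          p = math.log(err[k] / err[k + 1]) / math.log(h[k] / h[k + 1])
          print(f"levels {k}->{k + 1}: observed order p = {p:.2f}")
      # p near 2 supports a correctly implemented second-order scheme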

  14. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model-specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  15. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model-specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.
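
    The bit-for-bit evaluation at the heart of such regression testing can be illustrated with the following sketch; this is not LIVVkit's actual API, and the file names are hypothetical.

      # Sketch of a bit-for-bit regression check between model output arrays.
      import numpy as np

      def bit_for_bit(test, ref):
          if np.array_equal(test, ref):
              return "bit-for-bit: PASS"
          diff = np.abs(test - ref)
          idx = np.unravel_index(np.argmax(diff), diff.shape)
          return f"bit-for-bit: FAIL (max |diff| = {diff.max():.3e} at {idx})"

      ref = np.load("thickness_reference.npy")   # hypothetical reference data
      test = np.load("thickness_test.npy")       # output of the current build
      print(bit_for_bit(test, ref))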

  16. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume I.

    DTIC Science & Technology

    1981-04-30

    However, SREM was not designed to harmonize these kinds of problems. Rather, it is a tool to investigate the logic of the processing specified in the... design. Supporting programs were also conducted to perform basic research into such areas as software reliability and static and dynamic validation techniques...development. - Maintain requirements development independent of the target machine and the eventual software design. - Allow for easy response to

  17. Technology Transfer Challenges for High-Assurance Software Engineering Tools

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.

    2003-01-01

    In this paper, we describe our experience with the challenges that we are currently facing in our effort to develop advanced software verification and validation tools. We categorize these challenges into several areas: cost-benefit modeling, tool usability, customer application domain, and organizational issues. We provide examples of challenges in each area and identify open research issues in areas which limit our ability to transfer high-assurance software engineering tools into practice.

  18. YIP Formal Synthesis of Software-Based Control Protocols for Fractionated,Composable Autonomous Systems

    DTIC Science & Technology

    2016-07-08

    Systems Using Automata Theory and Barrier Certificates. We developed a sound but incomplete method for the computational verification of specifications...method merges ideas from automata-based model checking with those from control theory including so-called barrier certificates and optimization-based... “Automata theory meets barrier certificates: Temporal logic verification of nonlinear systems,” IEEE Transactions on Automatic Control, 2015. [J2] R

  19. Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)

    DTIC Science & Technology

    2016-03-01

    up game Binary Fission, which was deployed during Phase Two of CHEKOFV. Xylem: The Code of Plants is a casual game for players using mobile ...there are the design and engineering challenges of building a game infrastructure that integrates verification technology with crowd participation...the backend processes that annotate the originating software. Allowing players to construct their own equations opened up the flexibility to receive

  20. A Tool for Verification and Validation of Neural Network Based Adaptive Controllers for High Assurance Systems

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Schumann, Johann

    2004-01-01

    High reliability of mission- and safety-critical software systems has been identified by NASA as a high-priority technology challenge. We present an approach for the performance analysis of a neural network (NN) in an advanced adaptive control system. This problem is important in the context of safety-critical applications that require certification, such as flight software in aircraft. We have developed a tool to measure the performance of the NN during operation by calculating a confidence interval (error bar) around the NN's output. Our tool can be used during pre-deployment verification as well as for monitoring the network performance during operation. The tool has been implemented in Simulink, and simulation results on an F-15 aircraft are presented.
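
    One common way to obtain such an error bar, shown here only as a sketch, is the spread of an ensemble of networks evaluated on the same input; the paper's tool computes its confidence interval within Simulink, and the toy model below is purely illustrative.

      # Sketch of an ensemble-based error bar around a network output.
      # The 'network' here is a toy stand-in, not the paper's adaptive NN.
      import numpy as np

      rng = np.random.default_rng(0)

      def nn_output(x, w):
          """Stand-in for a trained network: a tiny one-parameter-pair model."""
          return np.tanh(x * w[0]) * w[1]

      weights = [rng.normal(size=2) for _ in range(20)]  # ensemble of 20 nets
      x = 0.8
      outputs = np.array([nn_output(x, w) for w in weights])
      mean, sigma = outputs.mean(), outputs.std(ddof=1)
      print(f"prediction: {mean:.3f} +/- {1.96 * sigma:.3f} (95% error bar)")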

  1. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  2. RELAP5-3D Resolution of Known Restart/Backup Issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesina, George L.; Anderson, Nolan A.

    2014-12-01

    The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a “verification file” that records double-precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.

  3. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as 3 or 4 on a 4-point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of detectable incidents divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense-in-depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
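
    The detectability metric itself is straightforward to compute; a sketch with hypothetical counts:

      # Detectability = detectable incidents / total incidents per failure mode.
      # Counts below are illustrative, not the study's data.
      incidents = {
          "patient positioning":        (74, 100),
          "prescription/documentation": (45, 50),
      }

      for mode, (detectable, total) in incidents.items():
          print(f"{mode}: detectability = {detectable / total:.0%}")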

  4. On the engineering of crucial software

    NASA Technical Reports Server (NTRS)

    Pratt, T. W.; Knight, J. C.; Gregory, S. T.

    1983-01-01

    The various aspects of the conventional software development cycle are examined. This cycle was the basis of the augmented approach contained in the original grant proposal. This cycle was found inadequate for crucial software development, and the justification for this opinion is presented. Several possible enhancements to the conventional software cycle are discussed. Software fault tolerance, a possible enhancement of major importance, is discussed separately. Formal verification using mathematical proof is considered. Automatic programming is a radical alternative to the conventional cycle and is discussed. Recommendations for a comprehensive approach are presented, and various experiments which could be conducted in AIRLAB are described.

  5. Tools for Embedded Computing Systems Software

    NASA Technical Reports Server (NTRS)

    1978-01-01

    A workshop was held to assess the state of tools for embedded systems software and to determine directions for tool development. A synopsis of the talk and the key figures of each workshop presentation, together with chairmen's summaries, are presented. The presentations covered four major areas: (1) tools and the software environment (development and testing); (2) tools and software requirements, design, and specification; (3) tools and language processors; and (4) tools and verification and validation (analysis and testing). The utility and contribution of existing tools and research results for the development and testing of embedded computing systems software are described and assessed.

  6. Using formal specification in the Guidance and Control Software (GCS) experiment. Formal design and verification technology for life critical systems

    NASA Technical Reports Server (NTRS)

    Weber, Doug; Jamsek, Damir

    1994-01-01

    The goal of this task was to investigate how formal methods could be incorporated into a software engineering process for flight-control systems under DO-178B and to demonstrate that process by developing a formal specification for NASA's Guidance and Controls Software (GCS) Experiment. GCS is software to control the descent of a spacecraft onto a planet's surface. The GCS example is simplified from a real example spacecraft, but exhibits the characteristics of realistic spacecraft control software. The formal specification is written in Larch.

  7. Advanced fingerprint verification software

    NASA Astrophysics Data System (ADS)

    Baradarani, A.; Taylor, J. R. B.; Severin, F.; Maev, R. Gr.

    2016-05-01

    We have developed a fingerprint software package that can be used in a wide range of applications, from law enforcement to public and private security systems, and to personal devices such as laptops, vehicles, and door locks. The software and processing units are a unique implementation of new and sophisticated algorithms that compete with the current best systems in the world. Development of the software package has been in line with the third generation of our ultrasonic fingerprinting machine [1]. Solid and robust performance is achieved in the presence of misplaced and low-quality fingerprints.

  8. Perpetual Model Validation

    DTIC Science & Technology

    2017-03-01

    models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to tackle the...considered using indirect models of software execution, for example memory access patterns, to check for security intrusions. Additional research ...deterioration, for example, no longer corresponds to the model used during verification time. Finally, the research looked at ways to combine hybrid systems

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: EVALUATION OF THE XP-SWMM STORMWATER WASTEWATER MANAGEMENT MODEL, VERSION 8.2, 2000, FROM XP SOFTWARE, INC.

    EPA Science Inventory

    XP-SWMM is a commercial software package used throughout the United States and around the world for simulation of storm, sanitary and combined sewer systems. It was designed based on the EPA Storm Water Management Model (EPA SWMM), but has enhancements and additional algorithms f...

  10. Design, Development and Delivery of Active Learning Tools in Software Verification & Validation Education

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan Anant; Wu, Peter; Maxim, Bruce; Hansen, Mary

    2018-01-01

    Active learning tools are critical in imparting real world experiences to the students within a classroom environment. This is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains with little to no training. However, there is a well-recognized need for the…

  11. A Software Assurance Framework for Mitigating the Risks of Malicious Software in Embedded Systems Used in Aircraft

    DTIC Science & Technology

    2011-09-01

    to show cryptographic signature # generation on a UNIX system # SHA=/bin/sha256 CSDB=/tmp/csdb CODEBASE=. touch "$CSDB" find "$CODEBASE" -type f...artifacts generated earlier. #! /bin/sh # # Demo program to show cryptographic signature # verification on a UNIX system # SHA=/bin/sha256 CSDB=/tmp

  12. Cleanroom Software Engineering Reference Model. Version 1.0.

    DTIC Science & Technology

    1996-11-01

    teams. It also serves as a baseline for continued evolution of Cleanroom practice. The scope of the CRM is software management, specification...addition to project staff, participants include management, peer organization representatives, and customer representatives as appropriate for...2 Review the status of the process with management, the project team, peer groups, and the customer. These verification activities include

  13. PDSS/IMC requirements and functional specifications

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The system (software and hardware) requirements for the Payload Development Support System (PDSS)/Image Motion Compensator (IMC) are provided. The PDSS/IMC system provides the capability for performing Image Motion Compensator Electronics (IMCE) flight software test, checkout, and verification and provides the capability for monitoring the IMC flight computer system during qualification testing for fault detection and fault isolation.

  14. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  15. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  16. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  17. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models; the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated enabling a more complete set of test cases than possible on flight hardware. SysML simulations provide access and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated such an approach is possible.

  18. Evaluation of an expert system for fault detection, isolation, and recovery in the manned maneuvering unit

    NASA Technical Reports Server (NTRS)

    Rushby, John; Crow, Judith

    1990-01-01

    The authors explore issues in the specification, verification, and validation of artificial intelligence (AI) based software, using a prototype fault detection, isolation and recovery (FDIR) system for the Manned Maneuvering Unit (MMU). They use this system as a vehicle for exploring issues in the semantics of C-Language Integrated Production System (CLIPS)-style rule-based languages, the verification of properties relating to safety and reliability, and the static and dynamic analysis of knowledge based systems. This analysis reveals errors and shortcomings in the MMU FDIR system and raises a number of issues concerning software engineering in CLIPS. The authors came to realize that the MMU FDIR system does not conform to conventional definitions of AI software, despite the fact that it was intended and indeed presented as an AI system. The authors discuss this apparent disparity and related questions such as the role of AI techniques in space and aircraft operations and the suitability of CLIPS for critical applications.

  19. A Verification System for Distributed Objects with Asynchronous Method Calls

    NASA Astrophysics Data System (ADS)

    Ahrendt, Wolfgang; Dylla, Maximilian

    We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY's characteristic concepts, such as dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.

  20. Goddard high resolution spectrograph science verification and data analysis

    NASA Technical Reports Server (NTRS)

    1992-01-01

    The data analysis performed to support the Orbital Verification (OV) and Science Verification (SV) of the GHRS was in the areas of the Digicon detector's performance and stability, wavelength calibration, and geomagnetically induced image motion. The results of the analyses are briefly described; detailed results are given in the form of attachments. Specialized software was developed for the analyses. Calibration files were formatted according to the specifications in a Space Telescope Science report. IRAS images of the Large Magellanic Cloud were restored using a blocked iterative algorithm. The algorithm works with the raw data scans without regridding or interpolating the data on an equally spaced image grid.

  1. Deductive Evaluation: Formal Code Analysis With Low User Burden

    NASA Technical Reports Server (NTRS)

    Di Vito, Ben L.

    2016-01-01

    We describe a framework for symbolically evaluating iterative C code using a deductive approach that automatically discovers and proves program properties. Although verification is not performed, the method can infer detailed program behavior. Software engineering work flows could be enhanced by this type of analysis. Floyd-Hoare verification principles are applied to synthesize loop invariants, using a library of iteration-specific deductive knowledge. When needed, theorem proving is interleaved with evaluation and performed on the fly. Evaluation results take the form of inferred expressions and type constraints for values of program variables. An implementation using PVS (Prototype Verification System) is presented along with results for sample C functions.
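
    The inductive step of such Floyd-Hoare reasoning can be discharged mechanically. The paper's framework uses PVS; purely as an illustration, the sketch below poses the same kind of check to the Z3 SMT solver (the z3-solver Python package) for a summation loop and a candidate invariant:

      # Loop under analysis (C):  s = 0; for (i = 0; i <= n; i++) s += i;
      # Candidate invariant, kept in integers:  2*s == i*(i-1)
      from z3 import Ints, Implies, And, Not, Solver, unsat

      s, i, n = Ints("s i n")

      def inv(s_val, i_val):
          return 2 * s_val == i_val * (i_val - 1)

      # Inductive step: if the invariant holds and the guard (i <= n) is true,
      # it still holds after the loop body executes (s += i; i += 1).
      step = Implies(And(inv(s, i), i <= n), inv(s + i, i + 1))

      solver = Solver()
      solver.add(Not(step))  # 'step' is valid iff its negation is unsatisfiable
      print("inductive step verified:", solver.check() == unsat)  # True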

  2. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian F.; Robertson, Amy N.

    2016-07-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  3. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  4. 6th Annual CMMI Technology Conference and User Group

    DTIC Science & Technology

    2006-11-17

    Operationally Oriented; Customer Focused Proven Approach – Level of Detail Beginner Decision Table (DT) is a tabular representation with tailoring options to...written to reflect the experience of the author Software Engineering led the process charge in the ’80s – Used Flowcharts – CASE tools – “data...Postponed PCR. Verification Steps • EPG configuration audits • EPG configuration status reports Flowcharts and Entry, Task, Verification and eXit

  5. Verification of cardiac mechanics software: benchmark problems and solutions for testing active and passive material behaviour.

    PubMed

    Land, Sander; Gurev, Viatcheslav; Arens, Sander; Augustin, Christoph M; Baron, Lukas; Blake, Robert; Bradley, Chris; Castro, Sebastian; Crozier, Andrew; Favino, Marco; Fastl, Thomas E; Fritz, Thomas; Gao, Hao; Gizzi, Alessio; Griffith, Boyce E; Hurtado, Daniel E; Krause, Rolf; Luo, Xiaoyu; Nash, Martyn P; Pezzuto, Simone; Plank, Gernot; Rossi, Simone; Ruprecht, Daniel; Seemann, Gunnar; Smith, Nicolas P; Sundnes, Joakim; Rice, J Jeremy; Trayanova, Natalia; Wang, Dafang; Jenny Wang, Zhinuo; Niederer, Steven A

    2015-12-08

    Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the particular anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry, anisotropic and spatially varying material properties similar to those seen in the left ventricle, and active contractile forces. The benchmark was solved by 11 different groups to generate consensus solutions, with typical differences in higher-resolution solutions at approximately 0.5%, and consistent results between linear, quadratic, and cubic finite elements as well as different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be effectively used in verification of future cardiac mechanics software.

  6. Fragility Analysis Methodology for Degraded Structures and Passive Components in Nuclear Power Plants - Illustrated using a Condensate Storage Tank

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C.

    2010-06-30

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. Since 2007, Brookhaven National Laboratory (BNL) has entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this collaborative endeavor is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. In the Year 1 scope of work, BNL collected and reviewed degradation occurrences in US NPPs and identified important aging characteristics needed for the seismic capability evaluations. This information is presented in the Annual Report for the Year 1 Task, identified as BNL Report-81741-2008 and also designated as KAERI/RR-2931/2008. The report presents results of the statistical and trending analysis of these data and compares the results to prior aging studies. In addition, the report provides a description of current U.S. regulatory requirements, regulatory guidance documents, generic communications, industry standards and guidance, and past research related to aging degradation of SSCs. In the Year 2 scope of work, BNL carried out a research effort to identify and assess degradation models for the long-term behavior of dominant materials that are determined to be risk-significant to NPPs. Multiple models have been identified for concrete, carbon and low-alloy steel, and stainless steel. These models are documented in the Annual Report for the Year 2 Task, identified as BNL Report-82249-2009 and also designated as KAERI/TR-3757/2009. This report describes the research effort performed by BNL for the Year 3 scope of work. The objective is for BNL to develop the seismic fragility capacity for a condensate storage tank with various degradation scenarios. The conservative deterministic failure margin method has been utilized for the undegraded case and has been modified to accommodate the degraded cases. A total of five seismic fragility analysis cases are described: (1) the undegraded case, (2) degraded stainless steel tank shell, (3) degraded anchor bolts, (4) anchorage concrete cracking, and (5) a perfect combination of the three degradation scenarios. Insights from these fragility analyses are also presented.
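
    The fragility curves produced by such analyses are conventionally lognormal, P(failure | a) = Φ(ln(a/Am)/βc). A sketch of the evaluation, with a hypothetical median capacity and composite uncertainty rather than the values computed for the condensate storage tank:

      # Sketch of evaluating a lognormal seismic fragility curve.
      # Am (median capacity, g) and beta_c (composite log-std) are hypothetical.
      from math import log
      from statistics import NormalDist

      Am, beta_c = 0.85, 0.42

      def fragility(a_g):
          """Conditional probability of failure at peak ground acceleration."""
          return NormalDist().cdf(log(a_g / Am) / beta_c)

      for a in (0.3, 0.5, 0.85, 1.2):
          print(f"PGA {a:.2f} g -> P(failure) = {fragility(a):.3f}")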

  7. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume II.

    DTIC Science & Technology

    1981-04-30

    Final Report: Applicability of SREM to the Verification of Management Information System Software Requirements, which was prepared for the Army. [Remainder of the indexed excerpt is unreadable OCR of tabular material.]

  8. MO-FG-202-01: A Fast Yet Sensitive EPID-Based Real-Time Treatment Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahmad, M; Nourzadeh, H; Neal, B

    2016-06-15

    Purpose: To create a real-time EPID-based treatment verification system which robustly detects treatment delivery and patient attenuation variations. Methods: Treatment plan DICOM files sent to the record-and-verify system are captured and utilized to predict EPID images for each planned control point using a modified GPU-based digitally reconstructed radiograph algorithm which accounts for the patient attenuation, source energy fluence, source size effects, and MLC attenuation. The DICOM and predicted images are utilized by our C++ treatment verification software, which compares EPID-acquired 1024×768 resolution frames acquired at ∼8.5 Hz from a Varian TrueBeam™ system. To maximize detection sensitivity, image comparisons determine (1) if radiation exists outside of the desired treatment field; (2) if radiation is lacking inside the treatment field; (3) if translations, rotations, and magnifications of the image are within tolerance. Acquisition was tested with known test fields and prior patient fields. Error detection was tested in real time and utilizing images acquired during treatment with another system. Results: The computational time of the prediction algorithms, for a patient plan with 350 control points and a 60×60×42 cm³ CT volume, is 2–3 minutes on CPU and <27 seconds on GPU for 1024×768 images. The verification software requires a maximum of ∼9 ms and ∼19 ms for 512×384 and 1024×768 resolution images, respectively, to perform image analysis and dosimetric validations. Typical variations in geometric parameters between reference and measured images are 0.32° for gantry rotation, 1.006 for the scaling factor, and 0.67 mm for translation. For excess out-of-field/missing in-field fluence, with masks extending 1 mm (at isocenter) from the detected aperture edge, the average total in-field area missing EPID fluence was 1.5 mm² and the out-of-field excess EPID fluence was 8 mm², both below error tolerances. Conclusion: A real-time verification software package, with an EPID image prediction algorithm, was developed. The system is capable of performing verifications between frame acquisitions and identifying the source(s) of any out-of-tolerance variations. This work was supported in part by Varian Medical Systems.
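
    The three per-frame checks lend themselves to simple mask arithmetic. A minimal sketch of the in-field/out-of-field fluence test, with hypothetical threshold values, pixel size, and function names (not the authors' implementation):

      import numpy as np

      PIXEL_AREA_MM2 = 0.336 ** 2   # hypothetical effective pixel area at isocenter

      def frame_check(measured, predicted, edge=0.10):
          """Return (missing in-field area, excess out-of-field area) in mm².

          `measured` and `predicted` are 2D EPID frames on a common scale;
          `edge` defines the detected aperture as a fraction of the predicted
          maximum. Thresholds and names are illustrative, not the system's.
          """
          aperture = predicted > edge * predicted.max()
          missing_in = aperture & (measured < 0.5 * predicted)
          excess_out = ~aperture & (measured > edge * predicted.max())
          return missing_in.sum() * PIXEL_AREA_MM2, excess_out.sum() * PIXEL_AREA_MM2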

  9. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system, in the sense that it takes more than one failure of the control program to cause the controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with a Markov model for each step.
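
    The "inertia" idea, that the controlled system fails only after several consecutive control-program failures, can be captured by a small Markov chain. A sketch under that assumption (the paper's actual model details are not given in the abstract):

      import numpy as np

      def system_failure_prob(p_fail, k, n_steps):
          """Probability the controlled system fails within `n_steps` control
          cycles, where it fails only after `k` consecutive control-program
          failures. States 0..k-1 count consecutive failures; state k is the
          absorbing system-failure state. Illustrative numbers only."""
          P = np.zeros((k + 1, k + 1))
          for s in range(k):
              P[s, 0] = 1 - p_fail      # program succeeds: counter resets
              P[s, s + 1] = p_fail      # program fails: counter increments
          P[k, k] = 1.0                 # absorbing failure state
          dist = np.zeros(k + 1)
          dist[0] = 1.0
          for _ in range(n_steps):
              dist = dist @ P
          return dist[k]

      print(system_failure_prob(p_fail=1e-3, k=2, n_steps=10_000))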

  10. Computer software documentation

    NASA Technical Reports Server (NTRS)

    Comella, P. A.

    1973-01-01

    A tutorial in the documentation of computer software is presented. It presents a methodology for achieving an adequate level of documentation as a natural outgrowth of the total programming effort commencing with the initial problem statement and definition and terminating with the final verification of code. It discusses the content of adequate documentation, the necessity for such documentation and the problems impeding achievement of adequate documentation.

  11. Software Testing and Verification in Climate Model Development

    NASA Technical Reports Server (NTRS)

    Clune, Thomas L.; Rood, RIchard B.

    2011-01-01

    Over the past 30 years most climate models have grown from relatively simple representations of a few atmospheric processes to a complex multi-disciplinary system. Computer infrastructure over that period has gone from punch-card mainframes to modern parallel clusters. Model implementations have become complex, brittle, and increasingly difficult to extend and maintain. Existing verification processes for model implementations rely almost exclusively upon some combination of detailed analysis of output from full climate simulations and system-level regression tests. In addition to being quite costly in terms of developer time and computing resources, these testing methodologies are limited in terms of the types of defects that can be detected, isolated and diagnosed. Mitigating these weaknesses of coarse-grained testing with finer-grained "unit" tests has been perceived as cumbersome and counter-productive. In the commercial software sector, recent advances in tools and methodology have led to a renaissance for systematic fine-grained testing. We discuss the availability of analogous tools for scientific software and examine benefits that similar testing methodologies could bring to climate modeling software. We describe the unique challenges faced when testing complex numerical algorithms and suggest techniques to minimize and/or eliminate the difficulties.
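
    As an illustration of the fine-grained "unit" tests the authors advocate, a kernel-level test can pin down a physical parameterization's basic properties independently of any full simulation. A hypothetical pytest-style example; the formula and tolerances below are illustrative, not from any particular climate model:

      import numpy as np

      def saturation_vapor_pressure(t_kelvin):
          """Tetens-style approximation over liquid water, in Pa (an
          illustrative stand-in for a real model kernel)."""
          t_c = t_kelvin - 273.15
          return 610.78 * np.exp(17.27 * t_c / (t_c + 237.3))

      def test_monotonic_in_temperature():
          # Physical sanity: pressure must increase with temperature.
          t = np.linspace(230.0, 310.0, 200)
          assert np.all(np.diff(saturation_vapor_pressure(t)) > 0)

      def test_known_reference_point():
          # ~611 Pa at 0 °C is a standard reference value.
          assert abs(saturation_vapor_pressure(273.15) - 610.78) < 1.0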

  12. Using Combined SFTA and SFMECA Techniques for Space Critical Software

    NASA Astrophysics Data System (ADS)

    Nicodemos, F. G.; Lahoz, C. H. N.; Abdala, M. A. D.; Saotome, O.

    2012-01-01

    This work addresses the combined Software Fault Tree Analysis (SFTA) and Software Failure Modes, Effects and Criticality Analysis (SFMECA) techniques applied to space critical software of satellite launch vehicles. The combined approach is under research as part of the Verification and Validation (V&V) efforts to increase software dependability and as future application in other projects under development at Instituto de Aeronáutica e Espaço (IAE). The applicability of such approach was conducted on system software specification and applied to a case study based on the Brazilian Satellite Launcher (VLS). The main goal is to identify possible failure causes and obtain compensating provisions that lead to inclusion of new functional and non-functional system software requirements.

  13. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    NASA Technical Reports Server (NTRS)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbo-fan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  14. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
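
    A minimal sketch of the kind of relational layout such a tool might use, here with SQLite standing in for the COTS networked database and entirely hypothetical table and column names:

      import sqlite3

      # Hypothetical minimal schema mirroring the information the RVC tracks:
      # requirements, traceability, verification planning, and compliance status.
      conn = sqlite3.connect(":memory:")
      conn.executescript("""
      CREATE TABLE requirement (
          req_id    TEXT PRIMARY KEY,
          text      TEXT NOT NULL,
          parent_id TEXT REFERENCES requirement(req_id)   -- traceability link
      );
      CREATE TABLE verification (
          ver_id           TEXT PRIMARY KEY,
          req_id           TEXT NOT NULL REFERENCES requirement(req_id),
          method           TEXT CHECK (method IN ('test','analysis','inspection','demonstration')),
          success_criteria TEXT,
          compliance       TEXT DEFAULT 'open'   -- e.g. open / verified / waived
      );
      """)
      conn.execute("INSERT INTO requirement VALUES ('ISWE-001', 'The welder shall ...', NULL)")
      conn.execute("INSERT INTO verification VALUES ('V-001', 'ISWE-001', 'test', 'Weld seam passes ...', 'open')")
      # A tailored compliance-status report is then a simple query:
      for row in conn.execute("""SELECT r.req_id, v.method, v.compliance
                                 FROM requirement r JOIN verification v USING (req_id)"""):
          print(row)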

  15. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
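
    One classical way to model the reliability growth being studied is the Jelinski-Moranda model, in which the failure rate is proportional to the number of remaining faults. A simulation sketch in that spirit (the experiment's actual statistical models may differ):

      import numpy as np

      def simulate_failure_times(n_faults=30, phi=0.02, seed=3):
          """Simulate cumulative failure times under the Jelinski-Moranda
          model: hazard phi*(remaining faults), each fix removing one fault.
          Parameter values are illustrative."""
          rng = np.random.default_rng(seed)
          times, t = [], 0.0
          for i in range(n_faults):
              rate = phi * (n_faults - i)
              t += rng.exponential(1.0 / rate)
              times.append(t)
          return np.array(times)

      # Inter-failure gaps lengthen as debugging removes faults.
      print(np.diff(simulate_failure_times())[:5])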

  16. Atmospheric transport modelling in support of CTBT verification—overview and basic concepts

    NASA Astrophysics Data System (ADS)

    Wotawa, Gerhard; De Geer, Lars-Erik; Denier, Philippe; Kalinowski, Martin; Toivonen, Harri; D'Amours, Real; Desiato, Franco; Issartel, Jean-Pierre; Langer, Matthias; Seibert, Petra; Frank, Andreas; Sloan, Craig; Yamazawa, Hiromi

    Under the provisions of the Comprehensive Nuclear-Test-Ban Treaty (CTBT), a global monitoring system comprising different verification technologies is currently being set up. The network will include 80 radionuclide (RN) stations distributed all over the globe that measure treaty-relevant radioactive species. While the seismic subsystem cannot distinguish between chemical and nuclear explosions, RN monitoring would provide the "smoking gun" of a possible treaty violation. Atmospheric transport modelling (ATM) will be an integral part of CTBT verification, since it provides a geo-temporal location capability for the RN technology. In this paper, the basic concept for the future ATM software system to be installed at the International Data Centre is laid out. The system is based on the operational computation of multi-dimensional source-receptor sensitivity fields for all RN samples by means of adjoint tracer transport modelling. While the source-receptor matrix methodology has already been applied in the past, the system that we suggest will be unique and unprecedented, since it is global, real-time and aims at uncovering source scenarios that are compatible with measurements. Furthermore, it has to deal with source dilution ratios that are by orders of magnitude larger than in typical transport model applications. This new verification software will need continuous scientific attention, and may well provide a prototype system for future applications in areas of environmental monitoring, emergency response and verification of other international agreements and treaties.
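
    For a finite grid, the source-receptor formulation reduces to linear algebra: measured concentrations c relate to a candidate source vector s through the sensitivity matrix M produced by the adjoint runs, c = M s. A toy sketch of recovering source scenarios compatible with measurements, with illustrative dimensions and a generic non-negative least-squares inversion (not the IDC's actual method):

      import numpy as np
      from scipy.optimize import nnls

      # Hypothetical setup: m samples (station, collection interval) and
      # n candidate (grid cell, release interval) sources.
      rng = np.random.default_rng(1)
      m, n = 6, 50
      M = rng.exponential(scale=1e-18, size=(m, n))   # very large dilution factors
      s_true = np.zeros(n)
      s_true[17] = 1e15                               # one hidden release (Bq)
      c = M @ s_true                                  # simulated measurements

      # Non-negative least squares yields a source scenario compatible
      # with the measurements.
      s_est, residual = nnls(M, c)
      print("strongest candidate cell:", int(np.argmax(s_est)))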

  17. Design of verification platform for wireless vision sensor networks

    NASA Astrophysics Data System (ADS)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transition from theoretical research on WVSNs to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module, designs a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve image acquisition, coding, and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for follow-up applications of WVSNs.

  18. Towards Certification of a Space System Application of Fault Detection and Isolation

    NASA Technical Reports Server (NTRS)

    Feather, Martin S.; Markosian, Lawrence Z.

    2008-01-01

    Advanced fault detection, isolation and recovery (FDIR) software is being investigated at NASA as a means to the improve reliability and availability of its space systems. Certification is a critical step in the acceptance of such software. Its attainment hinges on performing the necessary verification and validation to show that the software will fulfill its requirements in the intended setting. Presented herein is our ongoing work to plan for the certification of a pilot application of advanced FDIR software in a NASA setting. We describe the application, and the key challenges and opportunities it offers for certification.

  19. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  20. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.

  1. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, they have evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique case of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  2. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  3. Software safety - A user's practical perspective

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Corliss, Lloyd D.

    1990-01-01

    Software safety assurance philosophy and practices at NASA Ames are discussed. It is shown that, to be safe, software must be error-free. Software developments on two digital flight control systems and two ground facility systems are examined, including the overall system and software organization and function, the software-safety issues, and their resolution. The effectiveness of safety assurance methods is discussed, including conventional life-cycle practices, verification and validation testing, software safety analysis, and formal design methods. It is concluded (1) that a practical software safety technology does not yet exist, (2) that it is unlikely that a set of general-purpose analytical techniques can be developed for proving that software is safe, and (3) that successful software safety-assurance practices will have to take into account the detailed design processes employed and show that the software will execute correctly under all possible conditions.

  4. Software assurance standard

    NASA Technical Reports Server (NTRS)

    1992-01-01

    This standard specifies the software assurance program for the provider of software. It also delineates the assurance activities for the provider and the assurance data that are to be furnished by the provider to the acquirer. In any software development effort, the provider is the entity or individual that actually designs, develops, and implements the software product, while the acquirer is the entity or individual who specifies the requirements and accepts the resulting products. This standard specifies at a high level an overall software assurance program for software developed for and by NASA. Assurance includes the disciplines of quality assurance, quality engineering, verification and validation, nonconformance reporting and corrective action, safety assurance, and security assurance. The application of these disciplines during a software development life cycle is called software assurance. Subsequent lower-level standards will specify the specific processes within these disciplines.

  5. Unmanned Systems Safety Guide for DoD Acquisition

    DTIC Science & Technology

    2007-06-27

    Weapons release authorization validation. • Weapons release verification. • Weapons release abort/back-out, including clean-up or reset of weapons... conditions, clean room, stress) and other environments (e.g., software engineering environment, electromagnetic) related to system utilization. ... A solid or liquid energetic substance (or a mixture of substances) which is in itself capable... OUSD (AT&L) Systems and Software Engineering

  6. 75 FR 12811 - Petition for Waiver of Compliance

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-17

    ... verifying accuracy of the check sum and CRC values of all programmable elements used in the solid-state... software being used. This verification is done by comparing the parameters found on all programmable...

  7. Verification of Sulfate Attack Penetration Rates for Saltstone Disposal Unit Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Flach, G. P.

    Recent Special Analysis modeling of Saltstone Disposal Units considers sulfate attack on concrete and utilizes degradation rates estimated from Cementitious Barriers Partnership software simulations. This study provides an independent verification of those simulation results using an alternative analysis method and an independent characterization data source. The sulfate penetration depths estimated herein are similar to the best-estimate values in SRNL-STI-2013-00118 Rev. 2 and well below the nominal values subsequently used to define Saltstone Special Analysis base cases.

  8. TORMES-BEXUS 17 and 19: Precursor of the 6U CubeSat 3CAT-2

    NASA Astrophysics Data System (ADS)

    Carreno-Luengo, H.; Amezaga, A.; Bolet, A.; Vidal, D.; Jane, J.; Munoz, J. F.; Olive, R.; Camps, A.; Carola, J.; Catarino, N.; Hagenfeldt, M.; Palomo, P.; Cornara, S.

    2015-09-01

    3Cat-2 Assembly, Integration and Verification (AIV) activities on the Engineering Model (EM) and the Flight Model (FM) are currently being carried out. The Attitude Determination and Control System (ADCS) and Flight Software (FSW) validation campaigns will be performed at Universitat Politècnica de Catalunya (UPC) during the coming months. An analysis and verification of the key 3Cat-2 mission requirements has been performed. The main results are summarized in this work.

  9. A Mechanism of Modeling and Verification for SaaS Customization Based on TLA

    NASA Astrophysics Data System (ADS)

    Luan, Shuai; Shi, Yuliang; Wang, Haiyang

    With the gradual maturation of SOA and the rapid development of the Internet, SaaS has become a popular software service model. SaaS customization is usually subject to internal and external dependency relationships. This paper first introduces a method for modeling the customization process based on the Temporal Logic of Actions (TLA), and then proposes a verification algorithm to ensure that each step in customization will not cause unpredictable effects on the system and will follow the related rules defined by the SaaS provider.

  10. Expert system verification concerns in an operations environment

    NASA Technical Reports Server (NTRS)

    Goodwin, Mary Ann; Robertson, Charles C.

    1987-01-01

    The Space Shuttle community is currently developing a number of knowledge-based tools, primarily expert systems, to support Space Shuttle operations. It is proposed that anticipating and responding to the requirements of the operations environment will contribute to a rapid and smooth transition of expert systems from development to operations, and that the requirements for verification are critical to this transition. The paper identifies the requirements of expert systems to be used for flight planning and support and compares them to those of existing procedural software used for flight planning and support. It then explores software engineering concepts and methodology that can be used to satisfy these requirements, to aid the transition from development to operations and to support the operations environment during the lifetime of expert systems. Many of these are similar to those used for procedural software.

  11. Spot: A Programming Language for Verified Flight Software

    NASA Technical Reports Server (NTRS)

    Bocchino, Robert L., Jr.; Gamble, Edward; Gostelow, Kim P.; Some, Raphael R.

    2014-01-01

    The C programming language is widely used for programming space flight software and other safety-critical real time systems. C, however, is far from ideal for this purpose: as is well known, it is both low-level and unsafe. This paper describes Spot, a language derived from C for programming space flight systems. Spot aims to maintain compatibility with existing C code while improving the language and supporting verification with the SPIN model checker. The major features of Spot include actor-based concurrency, distributed state with message passing and transactional updates, and annotations for testing and verification. Spot also supports domain-specific annotations for managing spacecraft state, e.g., communicating telemetry information to the ground. We describe the motivation and design rationale for Spot, give an overview of the design, provide examples of Spot's capabilities, and discuss the current status of the implementation.

  12. EOSlib, Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Nathan; Menikoff, Ralph

    2017-02-03

    Equilibrium thermodynamics underpins many of the technologies used throughout theoretical physics, yet verification of the various theoretical models in the open literature remains challenging. EOSlib provides a single, consistent, verifiable implementation of these models, in a single, easy-to-use software package. It consists of three parts: a software library implementing various published equation-of-state (EOS) models; a database of fitting parameters for various materials for these models; and a number of useful utility functions for simplifying thermodynamic calculations such as computing Hugoniot curves or Riemann problem solutions. Ready availability of this library will enable reliable code-to-code testing of equation-of-state implementations, as well as a starting point for more rigorous verification work. EOSlib also provides a single, consistent API for its analytic and tabular EOS models, which simplifies the process of comparing models for a particular application.
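
    As an example of the kind of calculation such utilities automate, the Hugoniot curve for an ideal gas follows in closed form from the Rankine-Hugoniot energy jump condition. A sketch of that standard textbook result (not EOSlib's actual API):

      def hugoniot_pressure(v2, v1=1.0, p1=1.0, gamma=1.4):
          """Pressure on the shock Hugoniot of an ideal gas, derived from the
          energy jump e2 - e1 = 0.5*(p1 + p2)*(v1 - v2) with e = p*v/(gamma - 1).
          Units are arbitrary but consistent."""
          num = (gamma + 1.0) * v1 - (gamma - 1.0) * v2
          den = (gamma + 1.0) * v2 - (gamma - 1.0) * v1
          return p1 * num / den

      # Pressure rises steeply as compression approaches the ideal-gas
      # limit v2/v1 -> (gamma - 1)/(gamma + 1).
      for v2 in (0.9, 0.7, 0.5, 0.2):
          print(f"v2/v1 = {v2:.1f}  ->  p2/p1 = {hugoniot_pressure(v2):8.2f}")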

  13. System Testing of Ground Cooling System Components

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler Steven

    2014-01-01

    This internship focused primarily upon software unit testing of Ground Cooling System (GCS) components, one of the three types of tests (unit, integrated, and COTS/regression) utilized in software verification. Unit tests are used to test the software of necessary components before it is implemented into the hardware. A unit test exercises the control data, usage procedures, and operating procedures of a particular component to determine whether the program is fit for use. Three different files are used to build and run an efficient unit test: a Model Test file (.mdl), a Simulink SystemTest file (.test), and an autotest file (.m). The Model Test file includes the component that is being tested with the appropriate Discrete Physical Interface (DPI) for testing. The Simulink SystemTest file is used to test all of the requirements of the component. The autotest file checks that the component passes Model Advisor and system testing, and puts the results into the proper files. Once unit testing of the GCS components is complete, they can be implemented into the GCS schematic, and the software of the GCS model as a whole can be tested using integrated testing. Unit testing is a critical part of software verification; it allows more basic components to be tested before a model of higher fidelity is tested, making the testing process flow in an orderly manner.

  14. From Bridges and Rockets, Lessons for Software Systems

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael

    2004-01-01

    Although differences exist between building software systems and building physical structures such as bridges and rockets, enough similarities exist that software engineers can learn lessons from failures in traditional engineering disciplines. This paper draws lessons from two well-known failures the collapse of the Tacoma Narrows Bridge in 1940 and the destruction of the space shuttle Challenger in 1986 and applies these lessons to software system development. The following specific applications are made: (1) the verification and validation of a software system should not be based on a single method, or a single style of methods; (2) the tendency to embrace the latest fad should be overcome; and (3) the introduction of software control into safety-critical systems should be done cautiously.

  15. Seismic Fragility Analysis of a Condensate Storage Tank with Age-Related Degradations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, J.; Braverman, J.; Hofmayer, C

    2011-04-01

    The Korea Atomic Energy Research Institute (KAERI) is conducting a five-year research project to develop a realistic seismic risk evaluation system which includes the consideration of aging of structures and components in nuclear power plants (NPPs). The KAERI research project includes three specific areas that are essential to seismic probabilistic risk assessment (PRA): (1) probabilistic seismic hazard analysis, (2) seismic fragility analysis including the effects of aging, and (3) a plant seismic risk analysis. In 2007, Brookhaven National Laboratory (BNL) entered into a collaboration agreement with KAERI to support its development of seismic capability evaluation technology for degraded structures and components. The collaborative research effort is intended to continue over a five-year period. The goal of this collaboration endeavor is to assist KAERI in developing seismic fragility analysis methods that consider the potential effects of age-related degradation of structures, systems, and components (SSCs). The research results of this multi-year collaboration will be utilized as input to seismic PRAs. This report describes the research effort performed by BNL for the Year 4 scope of work. This report was developed as an update to the Year 3 report by incorporating a major supplement to the Year 3 fragility analysis. In the Year 4 research scope, an additional study was carried out to consider an additional degradation scenario, in which the three basic degradation scenarios, i.e., degraded tank shell, degraded anchor bolts, and cracked anchorage concrete, are combined in a non-perfectly correlated manner. A representative operational water level is used for this effort. Building on the same CDFM procedure implemented for the Year 3 tasks, a simulation method was applied using optimum Latin Hypercube samples to characterize the deterioration behavior of the fragility capacity as a function of age-related degradations. The results are summarized in Section 5 and Appendices G through I.
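
    A sketch of the optimum Latin Hypercube sampling step using SciPy's qmc module, with entirely hypothetical degradation variables and marginal distributions standing in for the report's actual parameters:

      import numpy as np
      from scipy.stats import qmc, norm

      # Hypothetical three-variable degradation state: shell thinning,
      # anchor-bolt area loss, and anchorage concrete crack depth, each
      # mapped from a unit Latin Hypercube sample to an assumed marginal.
      sampler = qmc.LatinHypercube(d=3, seed=42)
      u = sampler.random(n=100)                           # stratified unit samples
      shell_loss = 0.10 * u[:, 0]                         # up to 10% wall thinning
      bolt_loss = 0.20 * u[:, 1]                          # up to 20% bolt area loss
      crack = norm.ppf(u[:, 2], loc=2.0, scale=0.5)       # crack depth, mm

      # Each row is one degradation scenario fed to the CDFM-based
      # fragility calculation (not reproduced here).
      scenarios = np.column_stack([shell_loss, bolt_loss, crack])
      print(scenarios[:3])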

  16. On flattening filter‐free portal dosimetry

    PubMed Central

    Novais, Juan Castro; Molina López, María Yolanda; Maqueda, Sheila Ruiz

    2016-01-01

    Varian introduced (in 2010) the option of removing the flattening filter (FF) in their C‐Arm linacs for intensity‐modulated treatments. This mode, called flattening filter‐free (FFF), offers the advantage of a greater dose rate. Varian's "Portal Dosimetry" is an electronic portal imaging device (EPID)‐based tool for IMRT verification. This tool lacks the capability of verifying flattening filter‐free (FFF) modes due to saturation and the lack of an image prediction algorithm. (Note: the latest versions of this software and EPID correct these issues.) The objective of the present study is to investigate the feasibility of such verifications with the older versions of the software and EPID. By placing the EPID at a greater distance, the images can be acquired without saturation, yielding a linearity similar to the flattened mode. For the image prediction, a method was optimized based on the clinically used algorithm (analytical anisotropic algorithm (AAA)) over a homogeneous phantom. The depth inside the phantom and its electronic density were tailored. An application was developed to allow the conversion of a dose plane (in DICOM format) to Varian's custom format for Portal Dosimetry. The proposed method was used for the verification of test and clinical fields for the three qualities used in our institution for IMRT: 6X, 6FFF and 10FFF. The method developed yielded a positive verification (more than 95% of the points pass a 2%/2 mm gamma) for both the clinical and test fields. This method was also capable of "predicting" static and wedged fields. A workflow for the verification of FFF fields was developed. This method relies on the clinical algorithm used for dose calculation and is able to verify the FFF modes, as well as being useful for machine quality assurance. The procedure described does not require new hardware. This method could be used as a verification of Varian's Portal Dose Image Prediction. PACS number(s): 87.53.Kn, 87.55.T‐, 87.56.bd, 87.59.‐e PMID:27455487
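
    The 2%/2 mm gamma criterion used for the pass/fail decision can be stated compactly: a point passes if some nearby point of the evaluated dose lies within the combined dose-difference/distance-to-agreement ellipsoid. A brute-force sketch (illustrative and unoptimized; clinical tools are more elaborate):

      import numpy as np

      def gamma_pass_rate(ref, meas, spacing_mm, dd=0.02, dta_mm=2.0):
          """Global gamma pass rate (%) for two 2D dose planes on one grid.
          Dose difference is relative to the reference maximum; points below
          10% of the maximum are skipped, a common convention."""
          ny, nx = ref.shape
          yy, xx = np.mgrid[0:ny, 0:nx]
          norm_dose = dd * ref.max()
          search = int(np.ceil(3 * dta_mm / spacing_mm))   # limit search window
          passed = points = 0
          for i in range(ny):
              for j in range(nx):
                  if ref[i, j] < 0.1 * ref.max():
                      continue
                  sl = (slice(max(0, i - search), i + search + 1),
                        slice(max(0, j - search), j + search + 1))
                  r2 = ((yy[sl] - i) ** 2 + (xx[sl] - j) ** 2) * spacing_mm ** 2
                  d2 = (meas[sl] - ref[i, j]) ** 2
                  gamma2 = r2 / dta_mm ** 2 + d2 / norm_dose ** 2
                  points += 1
                  passed += gamma2.min() <= 1.0
          return 100.0 * passed / points

      # Identical planes pass trivially:
      plane = np.outer(np.hanning(40), np.hanning(40))
      print(gamma_pass_rate(plane, plane, spacing_mm=1.0))   # -> 100.0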

  17. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646
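
    A standard ingredient of the verification activities the review describes is a grid-refinement study: comparing solutions on systematically refined meshes yields an observed order of accuracy and a Richardson-extrapolated estimate of the grid-converged value. A minimal sketch of that arithmetic:

      import numpy as np

      def observed_order(f_coarse, f_medium, f_fine, r=2.0):
          """Observed order of accuracy from solutions on three grids with a
          constant refinement ratio r -- a routine code-verification check."""
          return np.log((f_coarse - f_medium) / (f_medium - f_fine)) / np.log(r)

      def richardson_extrapolate(f_medium, f_fine, p, r=2.0):
          """Estimate of the exact (grid-converged) value."""
          return f_fine + (f_fine - f_medium) / (r ** p - 1.0)

      # Example with a quantity converging at second order (h, h/2, h/4):
      f = [1.04, 1.01, 1.0025]
      p = observed_order(*f)
      print(f"observed order ~ {p:.2f}; "
            f"extrapolated ~ {richardson_extrapolate(f[1], f[2], p):.5f}")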

  18. Evolution of Software-Only-Simulation at NASA IV and V

    NASA Technical Reports Server (NTRS)

    McCarty, Justin; Morris, Justin; Zemerick, Scott

    2014-01-01

    Software-only simulations have been an emerging but quickly developing field throughout NASA. The NASA Independent Verification & Validation (IV&V) Independent Test Capability (ITC) team has been rapidly building a collection of simulators for a wide range of NASA missions. ITC specializes in full end-to-end simulations that enable developers, V&V personnel, and operators to test-as-you-fly. In four years, the team has delivered a wide variety of spacecraft simulations, ranging from low-complexity science missions such as the Global Precipitation Measurement (GPM) satellite and the Deep Space Climate Observatory (DSCOVR) to extremely complex missions such as the James Webb Space Telescope (JWST) and the Space Launch System (SLS). This paper describes the evolution of ITC's technologies and processes that have been utilized to design, implement, and deploy end-to-end simulation environments for various NASA missions. A comparison of mission simulators is presented, with focus on technology and lessons learned in complexity, hardware modeling, and continuous integration. The paper also describes the methods for executing the missions' unmodified flight software binaries (not cross-compiled) for verification and validation activities.

  19. Firefly: an optical lithographic system for the fabrication of holographic security labels

    NASA Astrophysics Data System (ADS)

    Calderón, Jorge; Rincón, Oscar; Amézquita, Ricardo; Pulido, Iván; Amézquita, Sebastián; Bernal, Andrés; Romero, Luis; Agudelo, Viviana

    2016-03-01

    This paper introduces Firefly, an optical lithography origination system that has been developed to produce holographic masters of high quality. This mask-less lithography system has a resolution of 418 nm half-pitch, and generates holographic masters with the optical characteristics required for security applications of level 1 (visual verification), level 2 (pocket reader verification) and level 3 (forensic verification). The holographic master constitutes the main core of the manufacturing process of security holographic labels used for the authentication of products and documents worldwide. Additionally, the Firefly is equipped with a software tool that allows holograms to be designed from graphics stored in bitmap formats. The software is capable of generating and configuring basic optical effects such as animation and color, as well as effects of high complexity such as Fresnel lenses, engraves and encrypted images, among others. The Firefly technology gathers together optical lithography, digital image processing and the most advanced control systems, making possible competitive equipment that challenges the best technologies in the industry of holographic generation around the world. In this paper, a general description of the origination system is provided as well as some examples of its capabilities.

  20. WebPrInSeS: automated full-length clone sequence identification and verification using high-throughput sequencing data.

    PubMed

    Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart

    2010-07-01

    High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.

  2. Integrated Software Health Management for Aircraft GN and C

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Mengshoel, Ole

    2011-01-01

    Modern aircraft rely heavily on dependable operation of many safety-critical software components. Despite careful design, verification and validation (V&V), on-board software can fail with disastrous consequences if it encounters problematic software/hardware interaction or must operate in an unexpected environment. We are using a Bayesian approach to monitor the software and its behavior during operation and provide up-to-date information about the health of the software and its components. The powerful reasoning mechanism provided by our model-based Bayesian approach makes reliable diagnosis of the root causes possible and minimizes the number of false alarms. Compilation of the Bayesian model into compact arithmetic circuits makes software health management (SWHM) feasible even on platforms with limited CPU power. We show initial results of SWHM on a small simulator of an embedded aircraft software system, where software and sensor faults can be injected.
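
    The health-monitoring idea can be illustrated with the smallest possible Bayesian model: a binary healthy/faulty state updated from discrepancy observations. The likelihoods and prior below are made-up numbers, not the paper's models:

      # Toy two-hypothesis health model: the software component is OK or faulty.
      # Each cycle we observe whether its output agrees with a redundant
      # estimate; likelihood values are illustrative.
      p_obs_given_ok, p_obs_given_fault = 0.99, 0.30   # P(agreement | state)
      posterior_ok = 0.999                             # prior health

      for agrees in [True, True, False, False, False]:
          l_ok = p_obs_given_ok if agrees else 1 - p_obs_given_ok
          l_ft = p_obs_given_fault if agrees else 1 - p_obs_given_fault
          num = l_ok * posterior_ok
          posterior_ok = num / (num + l_ft * (1 - posterior_ok))
          print(f"agrees={agrees!s:5}  P(healthy)={posterior_ok:.4f}")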

  3. SU-F-T-268: A Feasibility Study of Independent Dose Verification for Vero4DRT

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yamashita, M; Kokubo, M; Institute of Biomedical Research and Innovation, Kobe, Hyogo

    2016-06-15

    Purpose: Vero4DRT (Mitsubishi Heavy Industries Ltd.) has been released for a few years. The treatment planning system (TPS) of Vero4DRT is dedicated, so measurement has been the only method of dose verification. There have been no reports of independent dose verification using a Clarkson-based algorithm for Vero4DRT. An independent dose verification software program for general-purpose linacs using a modified Clarkson-based algorithm was adapted for Vero4DRT. In this study, we evaluated the accuracy of the independent dose verification program and the feasibility of the secondary check for Vero4DRT. Methods: iPlan (Brainlab AG) was used as the TPS. Pencil Beam Convolution was used as the dose calculation algorithm for IMRT and X-ray Voxel Monte Carlo was used for the others. Simple MU Analysis (SMU, Triangle Products, Japan) was used as the independent dose verification software program, in which CT-based dose calculation was performed using a modified Clarkson-based algorithm. In this study, 120 patients' treatment plans were collected in our institute. The treatments were performed using conventional irradiation for lung and prostate, SBRT for lung, and step-and-shoot IMRT for prostate. Doses from the TPS and the SMU were compared, and confidence limits (CLs, mean ± 2SD %) were compared to those from the general-purpose linac. Results: As for the CLs, conventional irradiation (lung, prostate), SBRT (lung) and IMRT (prostate) show 2.2 ± 3.5% (CL of the general-purpose linac: 2.4 ± 5.3%), 1.1 ± 1.7% (−0.3 ± 2.0%), 4.8 ± 3.7% (5.4 ± 5.3%) and −0.5 ± 2.5% (−0.1 ± 3.6%), respectively. The CLs for Vero4DRT are similar to those for the general-purpose linac. Conclusion: The independent dose verification for the new linac is clinically available as a secondary check, and we performed the check with a tolerance level similar to that of the general-purpose linac. This research is partially supported by the Japan Agency for Medical Research and Development (AMED).
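
    The Clarkson method referred to estimates scatter at a point in an irregular field by averaging a scatter table over angular sectors, using each sector's distance to the field edge. A minimal sketch with hypothetical scatter-air-ratio data (the SMU program's actual modified algorithm is not described in this abstract):

      import numpy as np

      def clarkson_scatter(radii_to_edge, sar_table):
          """Average scatter-air ratio over angular sectors (classic Clarkson
          summation). `radii_to_edge` gives, for each sector, the distance
          from the calculation point to the field edge; `sar_table` maps
          radius (cm) to SAR at the calculation depth. Data are illustrative."""
          radii, sar = sar_table
          return np.interp(radii_to_edge, radii, sar).mean()

      # Hypothetical SAR(r) lookup at 10 cm depth, and an irregular field
      # whose edge distance varies with sector angle:
      table = (np.array([0, 2, 4, 6, 8, 10]),
               np.array([0.0, 0.08, 0.14, 0.18, 0.21, 0.23]))
      angles = np.linspace(0, 2 * np.pi, 36, endpoint=False)
      edge_r = 5 + 2 * np.cos(angles)           # irregular aperture
      print(f"mean SAR = {clarkson_scatter(edge_r, table):.3f}")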

  4. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1996-01-01

    Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for the operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.

  5. Application of Compton-suppressed self-induced XRF to spent nuclear fuel measurement

    NASA Astrophysics Data System (ADS)

    Park, Se-Hwan; Jo, Kwang Ho; Lee, Seung Kyu; Seo, Hee; Lee, Chaehun; Won, Byung-Hee; Ahn, Seong-Kyu; Ku, Jeong-Hoe

    2017-11-01

    Self-induced X-ray fluorescence (XRF) is a technique by which the plutonium (Pu) content in spent nuclear fuel can be directly quantified. In the present work, this method successfully measured the plutonium/uranium (Pu/U) peak ratio of a pressurized water reactor (PWR) spent nuclear fuel at the Korea Atomic Energy Research Institute (KAERI) Post Irradiation Examination Facility (PIEF). In order to reduce the Compton background in the low-energy X-ray region, a Compton suppression system was additionally implemented. With this system, the spectrum's background level was reduced by a factor of approximately 2. This work shows that Compton-suppressed self-induced XRF can be effectively applied to Pu accounting in spent nuclear fuel.

  6. Integrating Requirements Engineering, Modeling, and Verification Technologies into Software and Systems Engineering

    DTIC Science & Technology

    2007-10-28

    Software Engineering, FASE'05, volume 3442 of Lecture Notes in Computer Science, pages 175–189. Springer, 2005. Andreas Bauer, Martin Leucker, and Jonathan ... Personnel receiving master's degrees: Markus Strohmeier, Gerrit Hanselmann, Jonathan Streit, Ernst Sassen (4 total). ... developed and documented mainly within the master's thesis by Jonathan Streit [Str06]: • Jonathan Streit. Development of a programming language like tem...

  7. Mining Program Source Code for Improving Software Quality

    DTIC Science & Technology

    2013-01-01

    conduct static verification on the software application under analysis to detect defects around APIs. ... Papers published in peer-reviewed journals that acknowledge ARO support ... Tao Xie, Suresh Thummalapenta, David Lo

  8. Customer Avionics Interface Development and Analysis (CAIDA): Software Developer for Avionics Systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Sherry L.

    2018-01-01

    The Customer Avionics Interface Development and Analysis (CAIDA) supports the testing of the Launch Control System (LCS), NASA's command and control system for the Space Launch System (SLS), Orion Multi-Purpose Crew Vehicle (MPCV), and ground support equipment. The objective of the semester-long internship was to support day-to-day operations of CAIDA and help prepare for verification and validation of CAIDA software.

  9. Specification Improvement Through Analysis of Proof Structure (SITAPS): High Assurance Software Development

    DTIC Science & Technology

    2016-02-01

    proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at http://www.cut-the-knot.org/pythagoras/ where 112... methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled... "assumption criticality" or "theorem root set size": SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that

  10. Parallel Software Model Checking

    DTIC Science & Technology

    2015-01-08

    checker. This project will explore this strategy to parallelize the generalized PDR algorithm for software model checking. It belongs to TF1 due to its ... focus on formal verification. Generalized PDR. Generalized Property Driven Reachability (GPDR) is an algorithm for solving HORN-SMT reachability...

  11. Simulating flow around scaled model of a hypersonic vehicle in wind tunnel

    NASA Astrophysics Data System (ADS)

    Markova, T. V.; Aksenov, A. A.; Zhluktov, S. V.; Savitsky, D. V.; Gavrilov, A. D.; Son, E. E.; Prokhorov, A. N.

    2016-11-01

    The prospective hypersonic HEXAFLY aircraft is considered in the present paper. In order to obtain the aerodynamic characteristics of a new design of the aircraft, experiments with a scaled model were carried out in a wind tunnel under different conditions. The runs were performed at different angles of attack, with and without hydrogen combustion in the scaled propulsion engine. However, the measured physical quantities do not provide all the information about the flowfield. Numerical simulation can complement the experimental data as well as reduce the number of wind tunnel experiments. Besides that, reliable CFD software can be used to calculate the aerodynamic characteristics of any possible design of the full-scale aircraft under different operating conditions. The reliability of the numerical predictions must be confirmed in a verification study of the software. The present work is aimed at numerical investigation of the flowfield around and inside the scaled model of the HEXAFLY-CIAM module under wind tunnel conditions. A cold run (without combustion) was selected for this study. The calculations are performed in the FlowVision CFD software. The flow characteristics are compared against the available experimental data. The verification study carried out confirms the capability of the FlowVision CFD software to calculate the flows discussed.

  12. SDO FlatSat Facility

    NASA Technical Reports Server (NTRS)

    Amason, David L.

    2008-01-01

    The goal of the Solar Dynamics Observatory (SDO) is to understand and, ideally, predict the solar variations that influence life and society. Its instruments will measure the properties of the Sun and will take high-definition images of the Sun every few seconds, all day every day. The FlatSat is a high-fidelity electrical and functional representation of the SDO spacecraft bus. It is a high-fidelity test bed for Integration & Test (I&T), flight software, and flight operations. For I&T purposes, FlatSat will be used to develop and dry-run electrical integration procedures, STOL test procedures, page displays, and the command and telemetry database. FlatSat will also serve as a platform for flight software acceptance and systems testing of the flight software system components, including the spacecraft main processors, power supply electronics, attitude control electronics, gimbal control electronics and the S-band communications card. FlatSat will also benefit the flight operations team through post-launch development and verification of flight software code and table updates, and verification of new and updated flight operations products. This document highlights the benefits of FlatSat; describes the building of FlatSat; provides FlatSat facility requirements, access roles and responsibilities; and discusses FlatSat mechanical and electrical integration and functional testing.

  13. TU-C-BRE-11: 3D EPID-Based in Vivo Dosimetry: A Major Step Forward Towards Optimal Quality and Safety in Radiation Oncology Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mijnheer, B; Mans, A; Olaciregui-Ruiz, I

    Purpose: To develop a 3D in vivo dosimetry method that is able to substitute pre-treatment verification in an efficient way, and to terminate treatment delivery if the online measured 3D dose distribution deviates too much from the predicted dose distribution. Methods: A back-projection algorithm has been further developed and implemented to enable automatic 3D in vivo dose verification of IMRT/VMAT treatments using a-Si EPIDs. New software tools were clinically introduced to allow automated image acquisition, to periodically inspect the record-and-verify database, and to automatically run the EPID dosimetry software. The comparison of the EPID-reconstructed and planned dose distribution is done offline to automatically raise alerts and to schedule actions when deviations are detected. Furthermore, a software package for online dose reconstruction was also developed. The RMS of the difference between the cumulative planned and reconstructed 3D dose distributions was used for triggering a halt of a linac. Results: The implementation of fully automated 3D EPID-based in vivo dosimetry was able to replace pre-treatment verification for more than 90% of the patient treatments. The process has been fully automated and integrated in our clinical workflow, where over 3,500 IMRT/VMAT treatments are verified each year. By optimizing the dose reconstruction algorithm and the I/O performance, the delivered 3D dose distribution is verified in less than 200 ms per portal image, which includes the comparison between the reconstructed and planned dose distribution. In this way it was possible to generate a trigger that can stop the irradiation at less than 20 cGy after introducing large delivery errors. Conclusion: The automatic offline solution facilitated the large-scale clinical implementation of 3D EPID-based in vivo dose verification of IMRT/VMAT treatments; the online approach has been successfully tested for various severe delivery errors.

  14. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼2.5 fps). After a portal image is acquired, the software seeks "hot spots" in the reconstructed 3D dose distribution. A hot spot is in this study defined as a 4 cm³ cube where the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexa-core Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
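
    The hot-spot criterion, a 4 cm³ cube whose average reconstructed dose exceeds the plan by both 20% and 50 cGy, amounts to a moving-average comparison over the dose grid. A sketch under an assumed voxel layout (names and voxel size are illustrative, not the authors' code):

      import numpy as np
      from scipy.ndimage import uniform_filter

      def find_hot_spots(recon, planned, voxel_mm=2.0, cube_cm3=4.0,
                         rel_tol=0.20, abs_tol_cgy=50.0):
          """Flag cube centers where the average reconstructed dose exceeds
          the planned dose by both `rel_tol` and `abs_tol_cgy`, per the
          criterion quoted above. Grid layout and voxel size are assumed."""
          side_vox = max(1, round((cube_cm3 ** (1 / 3)) * 10.0 / voxel_mm))
          mean_recon = uniform_filter(recon, size=side_vox)
          mean_plan = uniform_filter(planned, size=side_vox)
          return (mean_recon > (1 + rel_tol) * mean_plan) & \
                 (mean_recon - mean_plan > abs_tol_cgy)

      # Example: an injected overdose region triggers the halt condition.
      plan = np.full((40, 40, 40), 200.0)          # 200 cGy planned everywhere
      recon = plan.copy()
      recon[18:26, 18:26, 18:26] += 60.0           # simulated delivery error
      print(find_hot_spots(recon, plan).any())     # -> True (would halt linac)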

  15. SU-E-T-254: Development of a HDR-BT QA Tool for Verification of Source Position with Oncentra Applicator Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumazaki, Y; Miyaura, K; Hirai, R

    2015-06-15

    Purpose: To develop a High Dose Rate Brachytherapy (HDR-BT) quality assurance (QA) tool for verification of source position with Oncentra applicator modeling, and to report the results of radiation source positions with this tool. Methods: We developed a HDR-BT QA phantom and automated analysis software for verification of source position with Oncentra applicator modeling for the Fletcher applicator used in the MicroSelectron HDR system. This tool is intended for end-to-end tests that mimic the clinical 3D image-guided brachytherapy (3D-IGBT) workflow. The phantom is a 30 x 30 x 3 cm cuboid phantom with radiopaque markers, which are inserted into the phantom to evaluate applicator tips and reference source positions; positions are laterally shifted 10 mm from the applicator axis. The markers are lead-based and scatter radiation to expose the films. Gafchromic RTQA2 films are placed on the applicators. The phantom includes spaces to embed the applicators. The source position is determined as the distance between the exposed source position and the center position of two pairs of the first radiopaque markers. We generated a 3D-IGBT plan with applicator modeling. The first source position was 6 mm from the applicator tips, and the second source position was 10 mm from the first source position. Results: All source positions were consistent with the exposed positions within 1 mm for all Fletcher applicators using the in-house software. Moreover, the distance between source positions was in good agreement with the reference distance. The applicator offset, determined as the distance from the applicator tips at the first source position in the treatment planning system, was accurate. Conclusion: The source position accuracy of the applicator modeling used in 3D-IGBT was acceptable. This phantom and software will be useful as a HDR-BT QA tool for verification of source position with Oncentra applicator modeling.
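
    A hedged sketch of the core geometric check: the exposed source position is compared against the center of a radiopaque marker pair with a 1 mm tolerance. Coordinates and helper names are hypothetical.

```python
# Illustrative check of an HDR source position against a marker-pair reference;
# the 1 mm tolerance follows the abstract, all coordinates are invented.
import math

def marker_pair_center(m1, m2):
    """Midpoint of two marker coordinates (mm)."""
    return tuple((a + b) / 2.0 for a, b in zip(m1, m2))

def check_source_position(exposed, reference, tol_mm=1.0):
    """Return (passed, deviation in mm)."""
    deviation = math.dist(exposed, reference)
    return deviation <= tol_mm, deviation

# First dwell position expected 6 mm from the applicator tip, as in the plan.
reference = marker_pair_center((0.0, 10.0, 6.0), (0.0, -10.0, 6.0))
passed, dev = check_source_position((0.0, 0.3, 6.4), reference)
print(f"deviation = {dev:.2f} mm, pass = {passed}")
```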

  16. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on- and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, and hence there is no added instrumentation overhead caused by this approach. While post-mortem log analysis prevents the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
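
    The following toy Python monitor conveys the flavor of such data-parameterized trace rules (here: every dispatched command must eventually complete); it is an invented example and not the RuleR or LogScope API.

```python
# Toy trace monitor in the spirit of data-parameterized runtime-verification
# rules (invented example; not the RuleR/LogScope implementation).
def monitor(trace):
    """Check: every ("dispatch", cmd) must be followed by ("complete", cmd)."""
    pending = set()
    for event, arg in trace:
        if event == "dispatch":
            pending.add(arg)        # rule instance activated for this command
        elif event == "complete":
            pending.discard(arg)    # matching rule instance discharged
    return pending                  # anything left over violates the property

trace = [("dispatch", "CMD1"), ("dispatch", "CMD2"), ("complete", "CMD1")]
print("unmatched dispatches:", monitor(trace))   # -> {'CMD2'}
```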

  17. Development of the integrated control system for the microwave ion source of the PEFP 100-MeV proton accelerator

    NASA Astrophysics Data System (ADS)

    Song, Young-Gi; Seol, Kyung-Tae; Jang, Ji-Ho; Kwon, Hyeok-Jung; Cho, Yong-Sub

    2012-07-01

    The Proton Engineering Frontier Project (PEFP) 20-MeV proton linear accelerator is currently operating at the Korea Atomic Energy Research Institute (KAERI). The ion source of the 100-MeV proton linac needs at least a 100-hour operation time. To meet this goal, we have developed a microwave ion source that uses no filament. For the ion source, a remote control system has been developed using the Experimental Physics and Industrial Control System (EPICS) software framework. The control system consists of a Versa Module Europa (VME) system and EPICS-based embedded applications running on a VxWorks real-time operating system. The main purpose of the control system is to control and monitor the operational variables of the components remotely, and to protect operators from radiation exposure and the components from critical problems during beam extraction. We successfully performed operational tests of the control system and confirmed its safety functions during hardware operation.
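
    As a hedged illustration of this kind of EPICS-based remote monitoring, the sketch below uses the pyepics client library to watch a process variable and flag a protection condition; the PV name and the current limit are hypothetical, not PEFP values.

```python
# Hedged sketch of EPICS channel-access monitoring with pyepics.
# The PV name and threshold are invented for illustration.
import time
from epics import PV

MAX_BEAM_CURRENT_MA = 20.0  # assumed protection limit

def on_change(pvname=None, value=None, **kwargs):
    """Called by pyepics whenever the monitored PV updates."""
    if value is not None and value > MAX_BEAM_CURRENT_MA:
        print(f"{pvname} = {value:.1f} mA: interlock action would be taken here")

current = PV("ISRC:BEAM:CURRENT")   # hypothetical process-variable name
current.add_callback(on_change)

for _ in range(60):                 # keep the client alive so callbacks fire
    time.sleep(1.0)
```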

  18. Telescience Resource Kit (TReK)

    NASA Technical Reports Server (NTRS)

    Lippincott, Jeff

    2015-01-01

    Telescience Resource Kit (TReK) is one of the Huntsville Operations Support Center (HOSC) remote operations solutions. It can be used to monitor and control International Space Station (ISS) payloads from anywhere in the world. It comprises a suite of software applications and libraries that provide generic data system capabilities and access to HOSC services. The TReK software has been operational since 2000. A new cross-platform version of TReK is under development. The new software is being released in phases during the 2014-2016 timeframe. The TReK Release 3.x series of software is the original TReK software that has been operational since 2000. This software runs on Windows. It contains capabilities to support traditional telemetry and commanding using CCSDS (Consultative Committee for Space Data Systems) packets. The TReK Release 4.x series of software is the new cross-platform software. It runs on Windows and Linux. The new TReK software will support communication using standard IP protocols and traditional telemetry and commanding. All the software listed above is compatible and can be installed and run together on Windows. The new TReK software contains a suite of software that can be used by payload developers on the ground and onboard (TReK Toolkit). TReK Toolkit is a suite of lightweight libraries and utility applications for use onboard and on the ground. TReK Desktop is the full suite of TReK software, most useful on the ground. When TReK Desktop is released, the TReK installation program will provide the option to choose just the TReK Toolkit portion of the software or the full TReK Desktop suite. The ISS program is providing the TReK Toolkit software as a generic flight software capability offered as a standard service to payloads. TReK software verification was conducted during the April/May 2015 timeframe. Payload teams using the TReK software onboard can reference the TReK software verification. TReK will be demonstrated on-orbit running on an ISS-provided T61p laptop. Target timeframe: September 2015 - 2016. The on-orbit demonstration will collect benchmark metrics, and will be used in the future to provide live demonstrations during ISS Payload Conferences. Benchmark metrics and demonstrations will address the protocols described in SSP 52050-0047 Ku Forward section 3.3.7. (Associated term: CCSDS File Delivery Protocol (CFDP)).

  19. MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)

    DOT National Transportation Integrated Search

    1997-11-01

    The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool to provide a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...

  20. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation s Integrated Flight Test Strategy Low Earth Orbit Servicing Capability.

  1. System diagnostic builder

    NASA Technical Reports Server (NTRS)

    Nieten, Joseph L.; Burke, Roger

    1992-01-01

    The System Diagnostic Builder (SDB) is an automated software verification and validation tool using state-of-the-art Artificial Intelligence (AI) technologies. The SDB is used extensively by project BURKE at NASA-JSC as one component of a software re-engineering toolkit. The SDB is applicable to any government or commercial organization which performs verification and validation tasks. The SDB has an X-window interface, which allows the user to 'train' a set of rules for use in a rule-based evaluator. The interface has a window that allows the user to plot up to five data parameters (attributes) at a time. Using these plots and a mouse, the user can identify and classify a particular behavior of the subject software. Once the user has identified the general behavior patterns of the software, he can train a set of rules to represent his knowledge of that behavior. The training process builds rules and fuzzy sets to use in the evaluator. The fuzzy sets classify those data points not clearly identified as a particular classification. Once an initial set of rules is trained, each additional data set given to the SDB will be used by a machine learning mechanism to refine the rules and fuzzy sets. This is a passive process and, therefore, it does not require any additional operator time. The evaluation component of the SDB can be used to validate a single software system using some number of different data sets, such as a simulator. Moreover, it can be used to validate software systems which have been re-engineered from one language and design methodology to a totally new implementation.
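
    As a toy illustration of the rule-plus-fuzzy-set evaluation described above, the sketch below classifies a data point by its highest fuzzy membership; the class names and membership parameters are invented.

```python
# Toy fuzzy-membership classifier loosely following the SDB description;
# classes and set parameters are invented, not the SDB's trained rules.
def triangular(x, lo, peak, hi):
    """Triangular fuzzy membership in [0, 1]."""
    if x <= lo or x >= hi:
        return 0.0
    return (x - lo) / (peak - lo) if x < peak else (hi - x) / (hi - peak)

fuzzy_sets = {
    "nominal":  lambda v: triangular(v, 0.0, 5.0, 10.0),
    "degraded": lambda v: triangular(v, 8.0, 15.0, 22.0),
}

def classify(value):
    """Assign the class with the highest membership, if any."""
    label, score = max(((k, f(value)) for k, f in fuzzy_sets.items()),
                       key=lambda kv: kv[1])
    return label if score > 0 else "unclassified"

print(classify(9.0))   # overlapping region resolved by highest membership
```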

  2. A progress report on a NASA research program for embedded computer systems software

    NASA Technical Reports Server (NTRS)

    Foudriat, E. C.; Senn, E. H.; Will, R. W.; Straeter, T. A.

    1979-01-01

    The paper presents the results of the second stage of the Multipurpose User-oriented Software Technology (MUST) program. Four primary areas of activities are discussed: programming environment, HAL/S higher-order programming language support, the Integrated Verification and Testing System (IVTS), and distributed system language research. The software development environment is provided by the interactive software invocation system. The higher-order programming language (HOL) support chosen for consideration is HAL/S mainly because at the time it was one of the few HOLs with flight computer experience and it is the language used on the Shuttle program. The overall purpose of IVTS is to provide a 'user-friendly' software testing system which is highly modular, user controlled, and cooperative in nature.

  3. Building quality into medical product software design.

    PubMed

    Mallory, S R

    1993-01-01

    The software engineering and quality assurance disciplines are a requisite to the design of safe and effective software-based medical devices. It is in the areas of software methodology and process that the most beneficial application of these disciplines to software development can be made. Software is a product of complex operations and methodologies and is not amenable to the traditional electromechanical quality assurance processes. Software quality must be built in by the developers, with the software verification and validation engineers acting as the independent instruments for ensuring compliance with performance objectives and with development and maintenance standards. The implementation of a software quality assurance program is a complex process involving management support, organizational changes, and new skill sets, but the benefits are profound. Its rewards provide safe, reliable, cost-effective, maintainable, and manageable software, which may significantly speed the regulatory review process and therefore potentially shorten the overall time to market. The use of a trial project can greatly facilitate the learning process associated with the first-time application of a software quality assurance program.

  4. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to an XML file, a code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a... C. Bonine, M. Shing, T.W. Otani, "Computer-aided process and tools for mobile software acquisition," NPS, Monterey, CA, Tech. Rep. NPS-SE-13-C10P07R05-075, 2013. [21] C. Bonine, "Specification, validation and verification of mobile application behavior," M.S. thesis, Dept. Comp. Science, NPS

  5. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Doane, G. B., III; Glaese, J. R.; Tollison, D. K.; Howsman, T. G.; Curtis, S. (Editor); Banks, B.

    1984-01-01

    Control theory and design, dynamic system modelling, and simulation of test scenarios are the main ideas discussed. The overall effort is the achievement at Marshall Space Flight Center of a successful ground test experiment of a large space structure. A simplified planar model for ground test verification was developed. The elimination from that model of the uncontrollable rigid body modes was also examined. Also studied were the hardware/software aspects of computation speed.

  6. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  7. Hubble Space Telescope Fine Guidance Sensors Instrument Handbook, version 4.0

    NASA Technical Reports Server (NTRS)

    Holfeltz, S. T. (Editor)

    1994-01-01

    This is a revised version of the Hubble Space Telescope Fine Guidance Sensor Instrument Handbook. The main goal of this edition is to help the potential General Observer (GO) learn how to most efficiently use the Fine Guidance Sensors (FGS's). First, the actual performance of the FGS's as scientific instruments is reviewed. Next, each of the available operating modes of the FGS's is reviewed in turn. The status and findings of pertinent calibrations, including Orbital Verification, Science Verification, and Instrument Scientist Calibrations, are included, as well as the relevant data reduction software.

  8. Design, Development, and Automated Verification of an Integrity-Protected Hypervisor

    DTIC Science & Technology

    2012-07-16

    Hypervisors are a popular mechanism for implementing software virtualization. Since hypervisors execute at a very high privilege level, they must be secure. A fundamental security... using the CBMC model checker. CBMC verified XMHF's implementation (about 4700 lines of C code) in about 80 seconds using less than 2 GB of RAM.

  9. Model-based software design

    NASA Technical Reports Server (NTRS)

    Iscoe, Neil; Liu, Zheng-Yang; Feng, Guohui; Yenne, Britt; Vansickle, Larry; Ballantyne, Michael

    1992-01-01

    Domain-specific knowledge is required to create specifications, generate code, and understand existing systems. Our approach to automating software design is based on instantiating an application domain model with industry-specific knowledge and then using that model to achieve the operational goals of specification elicitation and verification, reverse engineering, and code generation. Although many different specification models can be created from any particular domain model, each specification model is consistent and correct with respect to the domain model.

  10. Scanning and Measuring Device for Diagnostic of Barrel Bore

    NASA Astrophysics Data System (ADS)

    Marvan, Ales; Hajek, Josef; Vana, Jan; Dvorak, Radim; Drahansky, Martin; Jankovych, Robert; Skvarek, Jozef

    The article discusses the design (mechanics, electronics, and software) of a robot for the diagnosis of barrels with calibers of 120 mm to 155 mm. This diagnostic device is intended primarily for experimental research and verification of appropriate methods and technologies for the diagnosis of main gun bores. The article also discusses the design of the sensors and software, data processing, and the reconstruction of images obtained by scanning the surface of the bore.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weinmann-Smith, Robert

    The identiFINDER2 is an easily portable handheld NaI gamma detector. The IAEA uses the safeguards version of the identiFINDER2 and calls it the HM-5. The HM-5 has built-in software to analyze the detection signal specifically for IAEA verification applications.

  12. Firing Room Remote Application Software Development

    NASA Technical Reports Server (NTRS)

    Liu, Kan

    2014-01-01

    The Engineering and Technology Directorate (NE) at National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer include the design, development, integration, and verification of the software and hardware in the firing rooms, in particular with the Mobile Launcher (ML) Launch Accessories subsystem. In addition, a Conversion Fusion project was created to show specific approved checkout and launch engineering data for public-friendly display purposes.

  13. V & V Within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1996-01-01

    Verification and validation (V&V) is used to increase the level of assurance of critical software, particularly that of safety-critical and mission critical software. This paper describes the working group's success in identifying V&V tasks that could be performed in the domain engineering and transition levels of reuse-based software engineering. The primary motivation for V&V at the domain level is to provide assurance that the domain requirements are correct and that the domain artifacts correctly implement the domain requirements. A secondary motivation is the possible elimination of redundant V&V activities at the application level. The group also considered the criteria and motivation for performing V&V in domain engineering.

  14. Software Development Technologies for Reactive, Real-Time, and Hybrid Systems: Summary of Research

    NASA Technical Reports Server (NTRS)

    Manna, Zohar

    1998-01-01

    This research is directed towards the implementation of a comprehensive deductive-algorithmic environment (toolkit) for the development and verification of high assurance reactive systems, especially concurrent, real-time, and hybrid systems. For this, we have designed and implemented the STCP (Stanford Temporal Prover) verification system. Reactive systems have an ongoing interaction with their environment, and their computations are infinite sequences of states. A large number of systems can be seen as reactive systems, including hardware, concurrent programs, network protocols, and embedded systems. Temporal logic provides a convenient language for expressing properties of reactive systems. A temporal verification methodology provides procedures for proving that a given system satisfies a given temporal property. The research covered necessary theoretical foundations as well as implementation and application issues.

  15. TU-C-17A-03: An Integrated Contour Evaluation Software Tool Using Supervised Pattern Recognition for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chen, H; Tan, J; Kavanaugh, J

    Purpose: Radiotherapy (RT) contours delineated either manually or semiautomatically require verification before clinical usage. Manual evaluation is very time consuming. A new integrated software tool using supervised pattern contour recognition was thus developed to facilitate this process. Methods: The contouring tool was developed using an object-oriented programming language, C#, and application programming interfaces, e.g., the Visualization Toolkit (VTK). The C# language served as the tool design basis. The Accord.NET scientific computing libraries were utilized for the required statistical data processing and pattern recognition, while VTK was used to build and render 3-D mesh models from critical RT structures in real time and with 360° visualization. Principal component analysis (PCA) was used for self-updating of the system with geometry variations of normal structures, based on physician-approved RT contours as a training dataset. The in-house supervised PCA-based contour recognition method was used for automatically evaluating contour normality/abnormality. The function for reporting the contour evaluation results was implemented using C# and the Windows Forms Designer. Results: The software input was RT simulation images and RT structures from commercial clinical treatment planning systems. Several abilities were demonstrated: automatic assessment of RT contours, file loading/saving of various modality medical images and RT contours, and generation/visualization of 3-D images and anatomical models. Moreover, it supported the 360° rendering of the RT structures in a multi-slice view, which allows physicians to visually check and edit abnormally contoured structures. Conclusion: This new software integrates the supervised learning framework with image processing and graphical visualization modules for RT contour verification. This tool has great potential for facilitating treatment planning with the assistance of an automatic contour evaluation module, avoiding unnecessary manual verification for physicians/dosimetrists. In addition, its nature as a compact and stand-alone tool allows for future extensibility to include additional functions for physicians' clinical needs.
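
    A minimal sketch of the PCA-based normality screening idea: fit principal components on physician-approved contour features and flag new contours whose reconstruction error is large. Feature vectors, dimensions, and the threshold are assumptions, not the authors' Accord.NET implementation.

```python
# Hedged sketch of PCA-based contour abnormality screening.
import numpy as np

def fit_pca(X, k):
    """X: (n_contours, n_features) matrix of approved contour features."""
    mu = X.mean(axis=0)
    _, _, Vt = np.linalg.svd(X - mu, full_matrices=False)
    return mu, Vt[:k]                      # mean and top-k principal axes

def reconstruction_error(x, mu, components):
    z = components @ (x - mu)              # project into the PCA subspace
    x_hat = mu + components.T @ z          # reconstruct from k components
    return float(np.linalg.norm(x - x_hat))

rng = np.random.default_rng(0)
train = rng.normal(size=(50, 30))          # stand-in for approved contours
mu, comps = fit_pca(train, k=5)
err = reconstruction_error(rng.normal(size=30), mu, comps)
print("abnormal" if err > 6.0 else "normal", f"(error {err:.2f})")
```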

  16. GCS plan for software aspects of certification

    NASA Technical Reports Server (NTRS)

    Shagnea, Anita M.; Lowman, Douglas S.; Withers, B. Edward

    1990-01-01

    As part of the Guidance and Control Software (GCS) research project being sponsored by NASA to evaluate the failure processes of software, standard industry software development procedures are being employed. To ensure that these procedures are authentic, the guidelines outlined in the Radio Technical Commission for Aeronautics document RTCA/DO-178A, Software Considerations in Airborne Systems and Equipment Certification, were adopted. A major aspect of these guidelines is proper documentation. As such, this report, the plan for software aspects of certification, was produced in accordance with DO-178A. An overview is given of the GCS research project, including the goals of the project, project organization, and project schedules. It also specifies the plans for all aspects of the project which relate to the certification of the GCS implementations developed under a NASA contract. These plans include decisions made regarding the software specification, accuracy requirements, configuration management, implementation development and verification, and the development of the GCS simulator.

  17. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinh, Nam; Athe, Paridhi; Jones, Christopher

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  18. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam, K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying in the PVS language a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  19. VALIDATION OF ANSYS FINITE ELEMENT ANALYSIS SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    HAMM, E.R.

    2003-06-27

    This document provides a record of the verification and validation of the ANSYS Version 7.0 software that is installed on selected CH2M HILL computers. The issues addressed include: software verification, installation, validation, configuration management, and error reporting. The ANSYS® computer program is a large-scale, multi-purpose finite element program which may be used for solving several classes of engineering analyses. The analysis capabilities of ANSYS Full Mechanical Version 7.0 installed on selected CH2M HILL Hanford Group (CH2M HILL) Intel processor based computers include the ability to solve static and dynamic structural analyses, steady-state and transient heat transfer problems, mode-frequency and buckling eigenvalue problems, static or time-varying magnetic analyses, and various types of field and coupled-field applications. The program contains many special features which allow nonlinearities or secondary effects to be included in the solution, such as plasticity, large strain, hyperelasticity, creep, swelling, large deflections, contact, stress stiffening, temperature dependency, material anisotropy, and thermal radiation. The ANSYS program has been in commercial use since 1970, and has been used extensively in the aerospace, automotive, construction, electronic, energy services, manufacturing, nuclear, plastics, oil, and steel industries.

  20. Simulation-To-Flight (STF-1): A Mission to Enable CubeSat Software-Based Validation and Verification

    NASA Technical Reports Server (NTRS)

    Morris, Justin; Zemerick, Scott; Grubb, Matt; Lucas, John; Jaridi, Majid; Gross, Jason N.; Ohi, Nicholas; Christian, John A.; Vassiliadis, Dimitris; Kadiyala, Anand

    2016-01-01

    The Simulation-to-Flight 1 (STF-1) CubeSat mission aims to demonstrate how legacy simulation technologies may be adapted for flexible and effective use on missions using the CubeSat platform. These technologies, named NASA Operational Simulator (NOS), have demonstrated significant value on several missions such as James Webb Space Telescope, Global Precipitation Measurement, Juno, and Deep Space Climate Observatory in the areas of software development, mission operations/training, verification and validation (V&V), test procedure development and software systems check-out. STF-1 will demonstrate a highly portable simulation and test platform that allows seamless transition of mission development artifacts to flight products. This environment will decrease development time of future CubeSat missions by lessening the dependency on hardware resources. In addition, through a partnership between NASA GSFC, the West Virginia Space Grant Consortium and West Virginia University, the STF-1 CubeSat will host payloads for three secondary objectives that aim to advance engineering and physical-science research in the areas of navigation systems of small satellites, provide useful data for understanding magnetosphere-ionosphere coupling and space weather, and verify the performance and durability of III-V Nitride-based materials.

  1. Optimum-AIV: A planning and scheduling system for spacecraft AIV

    NASA Technical Reports Server (NTRS)

    Arentoft, M. M.; Fuchs, Jens J.; Parrod, Y.; Gasquet, Andre; Stader, J.; Stokes, I.; Vadon, H.

    1991-01-01

    A project undertaken for the European Space Agency (ESA) is presented. The project is developing a knowledge based software system for planning and scheduling of activities for spacecraft assembly, integration, and verification (AIV). The system extends into the monitoring of plan execution and the plan repair phase. The objectives are to develop an operational kernel of a planning, scheduling, and plan repair tool, called OPTIMUM-AIV, and to provide facilities which will allow individual projects to customize the kernel to suit its specific needs. The kernel shall consist of a set of software functionalities for assistance in initial specification of the AIV plan, in verification and generation of valid plans and schedules for the AIV activities, and in interactive monitoring and execution problem recovery for the detailed AIV plans. Embedded in OPTIMUM-AIV are external interfaces which allow integration with alternative scheduling systems and project databases. The current status of the OPTIMUM-AIV project, as of Jan. 1991, is that a further analysis of the AIV domain has taken place through interviews with satellite AIV experts, a software requirement document (SRD) for the full operational tool was approved, and an architectural design document (ADD) for the kernel excluding external interfaces is ready for review.

  2. Translating expert system rules into Ada code with validation and verification

    NASA Technical Reports Server (NTRS)

    Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam

    1991-01-01

    The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system into Ada code and which detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software which was used to test the resultant code is discussed. This testing was performed automatically using Monte-Carlo techniques based upon a constraint-based description of the required performance for the system.

  3. Overview of open resources to support automated structure verification and elucidation

    EPA Science Inventory

    Cheminformatics methods form an essential basis for providing analytical scientists with access to data, algorithms and workflows. There are an increasing number of free online databases (compound databases, spectral libraries, data repositories) and a rich collection of software...

  4. Assertion Mechanisms in Programming Languages.

    DTIC Science & Technology

    1979-11-01

    the Construction and Verification of Alphard Programs", IEEE Transactions on Software Engineering, vol. SE-2, no. 4, p. 253-265, 1976. [Zelkowitz]... be true at a point in program execution. The language designer has several options when considering the semantics of an assertion mechanism... Software Engineering, vol. SE-1, no. 2, p. 156-173, June 1975. [Hansen] G. J. Hansen, G. A. Shoults and J. D. Coinmeat, "Construction of a Transportable

  5. Operation of the Computer Software Management and Information Center (COSMIC)

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The major operational areas of the COSMIC center are described. Quantitative data on the software submittals, program verification, and evaluation are presented. The dissemination activities are summarized. Customer services and marketing activities of the center for the calendar year are described. Those activities devoted to the maintenance and support of selected programs are described. A Customer Information system, the COSMIC Abstract Recording System Project, and the COSMIC Microfiche Project are summarized. Operational cost data are summarized.

  6. Research of aerohydrodynamic and aeroelastic processes on PNRPU HPC system

    NASA Astrophysics Data System (ADS)

    Modorskii, V. Ya.; Shevelev, N. A.

    2016-10-01

    Research on aerohydrodynamic and aeroelastic processes with the High Performance Computing complex at PNRPU is actively conducted within the university's priority development area "Aviation engine and gas turbine technology". Work is carried out in two directions: the development and use of domestic software, and the use of well-known foreign licensed applied software packages. In addition, a third direction, associated with the verification of computational experiments through physical modeling on unique proprietary experimental installations, is being developed.

  7. NEAMS SOFTWARE V&V PLAN FOR THE MARMOT SOFTWARE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Michael R Tonks

    2014-03-01

    In order to ensure the accuracy and quality of the microstructure-based materials models being developed in conjunction with MARMOT simulations, MARMOT must undergo exhaustive verification and validation. Only after this process can we confidently rely on the MARMOT code to predict the microstructure evolution within the fuel. Therefore, in this report we lay out a V&V plan for the MARMOT code, highlighting where existing data could be used and where new data are required.

  8. Building confidence and credibility amid growing model and computing complexity

    NASA Astrophysics Data System (ADS)

    Evans, K. J.; Mahajan, S.; Veneziani, C.; Kennedy, J. H.

    2017-12-01

    As global Earth system models are developed to answer an ever-wider range of science questions, software products that provide robust verification, validation, and evaluation must evolve in tandem. Measuring the degree to which these new models capture past behavior, predict the future, and provide the certainty of predictions is becoming ever more challenging for reasons that are generally well known, yet are still challenging to address. Two specific and divergent needs for analysis of the Accelerated Climate Model for Energy (ACME) model - but with a similar software philosophy - are presented to show how a model developer-based focus can address analysis needs during expansive model changes to provide greater fidelity and execute on multi-petascale computing facilities. A-PRIME is a Python script-based quick-look overview of a fully-coupled global model configuration to determine quickly if it captures specific behavior before significant computer time and expense is invested. EVE is an ensemble-based software framework that focuses on verification of performance-based ACME model development, such as compiler or machine settings, to determine the equivalence of relevant climate statistics. The challenges and solutions for analysis of multi-petabyte output data are highlighted from the aspect of the scientist using the software, with the aim of fostering discussion and further input from the community about improving developer confidence and community credibility.
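
    As a hedged sketch of ensemble-based equivalence testing in the spirit of EVE, the example below compares a climate statistic across two ensembles (e.g., before and after a compiler change) with a two-sample Kolmogorov-Smirnov test; the data are synthetic.

```python
# Hedged sketch of ensemble-based statistical equivalence testing;
# the statistic, ensemble sizes, and values are invented stand-ins.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(42)
baseline = rng.normal(288.0, 0.15, size=30)   # e.g., global-mean temperature (K)
modified = rng.normal(288.0, 0.15, size=30)   # ensemble after the machine change

stat, p = ks_2samp(baseline, modified)
print(f"KS statistic {stat:.3f}, p = {p:.3f} ->",
      "statistically indistinguishable" if p > 0.05 else "investigate")
```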

  9. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete: they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction- and decomposition-based approaches [10, 12] as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
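
    A minimal sketch of the final equivalence-checking step with an off-the-shelf decision procedure, here the z3 SMT solver's Python API; the two "impact summaries" below are invented stand-ins for formulas produced by symbolic execution.

```python
# Hedged sketch: check two versions' impacted behaviors for logical
# equivalence with z3. The path conditions are invented examples.
from z3 import Ints, And, Or, Not, Solver, unsat

x, y = Ints("x y")
summary_v1 = Or(And(x > 0, y == x + 1), And(x <= 0, y == 0))
summary_v2 = Or(And(x > 0, y == 1 + x), And(x <= 0, y == 0))

s = Solver()
s.add(Not(summary_v1 == summary_v2))   # satisfiable iff the summaries differ
print("equivalent" if s.check() == unsat else "not equivalent")
```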

  10. A systematic approach to the Planck LFI end-to-end test and its application to the DPC Level 1 pipeline

    NASA Astrophysics Data System (ADS)

    Frailis, M.; Maris, M.; Zacchei, A.; Morisset, N.; Rohlfs, R.; Meharga, M.; Binko, P.; Türler, M.; Galeotta, S.; Gasparo, F.; Franceschi, E.; Butler, R. C.; D'Arcangelo, O.; Fogliani, S.; Gregorio, A.; Lowe, S. R.; Maggio, G.; Malaspina, M.; Mandolesi, N.; Manzato, P.; Pasian, F.; Perrotta, F.; Sandri, M.; Terenzi, L.; Tomasi, M.; Zonca, A.

    2009-12-01

    The Level 1 of the Planck LFI Data Processing Centre (DPC) is devoted to the handling of the scientific and housekeeping telemetry. It is a critical component of the Planck ground segment which has to strictly commit to the project schedule to be ready for the launch and flight operations. In order to guarantee the quality necessary to achieve the objectives of the Planck mission, the design and development of the Level 1 software has followed the ESA Software Engineering Standards. A fundamental step in the software life cycle is the verification and validation of the software. The purpose of this work is to show an example of procedures, test development and analysis successfully applied to a key software project of an ESA mission. We present the end-to-end validation tests performed on the Level 1 of the LFI-DPC, by detailing the methods used and the results obtained. Different approaches have been used to test the scientific and housekeeping data processing. Scientific data processing has been tested by injecting signals with known properties directly into the acquisition electronics, in order to generate a test dataset of real telemetry data and reproduce as much as possible nominal conditions. For the HK telemetry processing, validation software has been developed to inject known parameter values into a set of real housekeeping packets and to perform a comparison with the corresponding timelines generated by the Level 1. With the proposed validation and verification procedure, where the on-board and ground processing are viewed as a single pipeline, we demonstrated that the scientific and housekeeping processing of the Planck-LFI raw data is correct and meets the project requirements.
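
    The housekeeping validation strategy described above can be illustrated with a stand-in pipeline: inject known parameter values, decode them, and compare the resulting timelines against the injected reference. All names and the echo "pipeline" are hypothetical.

```python
# Hedged sketch of injection-based end-to-end validation; parameter names
# and the stand-in pipeline are invented, not the LFI-DPC Level 1 code.
injected = {
    "LFI_HK_TEMP_01": [20.0, 20.1, 20.2],
    "LFI_HK_VOLT_03": [5.0, 5.0, 5.1],
}

def level1_pipeline(packets):
    """Stand-in for the real decoding chain; here it just echoes the values."""
    return {name: list(values) for name, values in packets.items()}

timelines = level1_pipeline(injected)
for name, reference in injected.items():
    mismatches = [i for i, (a, b) in enumerate(zip(reference, timelines[name]))
                  if a != b]
    print(name, "OK" if not mismatches else f"mismatch at samples {mismatches}")
```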

  11. Energy Information Systems

    Science.gov Websites

    Assessment of automated measurement and verification (M&V) methods. Granderson, J., et al., Lawrence Berkeley National Laboratory. Performance Metrics and Objective Testing Methods for Energy Baseline Modeling Software.

  12. Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 1

    NASA Technical Reports Server (NTRS)

    Bernard, Douglas E. (Editor); Man, Guy K. (Editor)

    1989-01-01

    Conference topics included definition of tool requirements, advanced multibody component representation descriptions, model reduction, parallel computation, real time simulation, control design and analysis software, user interface issues, testing and verification, and applications to spacecraft, robotics, and aircraft.

  13. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checker for linear hybrid systems.

  14. Provably trustworthy systems.

    PubMed

    Klein, Gerwin; Andronick, June; Keller, Gabriele; Matichuk, Daniel; Murray, Toby; O'Connor, Liam

    2017-10-13

    We present recent work on building and scaling trustworthy systems with formal, machine-checkable proof from the ground up, including the operating system kernel, at the level of binary machine code. We first give a brief overview of the seL4 microkernel verification and how it can be used to build verified systems. We then show two complementary techniques for scaling these methods to larger systems: proof engineering, to estimate verification effort; and code/proof co-generation, for scalable development of provably trustworthy applications. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).

  15. ITK: enabling reproducible research and open science

    PubMed Central

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46. PMID:24600387

  16. ITK: enabling reproducible research and open science.

    PubMed

    McCormick, Matthew; Liu, Xiaoxiao; Jomier, Julien; Marion, Charles; Ibanez, Luis

    2014-01-01

    Reproducibility verification is essential to the practice of the scientific method. Researchers report their findings, which are strengthened as other independent groups in the scientific community share similar outcomes. In the many scientific fields where software has become a fundamental tool for capturing and analyzing data, this requirement of reproducibility implies that reliable and comprehensive software platforms and tools should be made available to the scientific community. The tools will empower them and the public to verify, through practice, the reproducibility of observations that are reported in the scientific literature. Medical image analysis is one of the fields in which the use of computational resources, both software and hardware, are an essential platform for performing experimental work. In this arena, the introduction of the Insight Toolkit (ITK) in 1999 has transformed the field and facilitates its progress by accelerating the rate at which algorithmic implementations are developed, tested, disseminated and improved. By building on the efficiency and quality of open source methodologies, ITK has provided the medical image community with an effective platform on which to build a daily workflow that incorporates the true scientific practices of reproducibility verification. This article describes the multiple tools, methodologies, and practices that the ITK community has adopted, refined, and followed during the past decade, in order to become one of the research communities with the most modern reproducibility verification infrastructure. For example, 207 contributors have created over 2400 unit tests that provide over 84% code line test coverage. The Insight Journal, an open publication journal associated with the toolkit, has seen over 360,000 publication downloads. The median normalized closeness centrality, a measure of knowledge flow, resulting from the distributed peer code review system was high, 0.46.

  17. Challenges and Demands on Automated Software Revision

    NASA Technical Reports Server (NTRS)

    Bonakdarpour, Borzoo; Kulkarni, Sandeep S.

    2008-01-01

    In the past three decades, automated program verification has undoubtedly been one of the most successful contributions of formal methods to software development. However, when verification of a program against a logical specification discovers bugs in the program, manual manipulation of the program is needed in order to repair it. Thus, in the face of the existence of numerous unverified and uncertified legacy software systems in virtually any organization, tools that enable engineers to automatically verify and subsequently fix existing programs are highly desirable. In addition, since requirements of software systems often evolve during the software life cycle, the issue of incomplete specification has become a customary fact in many design and development teams. Thus, automated techniques that revise existing programs according to new specifications are of great assistance to designers, developers, and maintenance engineers. As a result, incorporating program synthesis techniques, where an algorithm generates a program that is correct-by-construction, seems to be a necessity. The notion of manual program repair described above turns out to be even more complex when programs are integrated with large collections of sensors and actuators in hostile physical environments in the so-called cyber-physical systems. When such systems are safety/mission-critical (e.g., in avionics systems), it is essential that the system reacts to physical events such as faults, delays, signals, attacks, etc., so that the system specification is not violated. In fact, since it is impossible to anticipate all possible such physical events at design time, it is highly desirable to have automated techniques that revise programs with respect to newly identified physical events according to the system specification.

  18. A survey of program slicing for software engineering

    NASA Technical Reports Server (NTRS)

    Beck, Jon

    1993-01-01

    This research concerns program slicing, which is used as a tool for program maintenance of software systems. Program slicing decreases the level of effort required to understand and maintain complex software systems. It was first designed as a debugging aid, but it has since been generalized into various tools and extended to include program comprehension, module cohesion estimation, requirements verification, dead code elimination, and maintenance of several kinds of software systems, including reverse engineering, parallelization, portability, and reuse component generation. This paper seeks to address and define terminology, theoretical concepts, program representation, different program graphs, developments in static slicing, dynamic slicing, and semantics and mathematical models. Applications for conventional slicing are presented, along with a prognosis of future work in this field.
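
    A tiny backward-slicing sketch over an explicit dependence graph: starting from a slicing criterion, collect every statement it transitively depends on. The statements and dependence edges are invented for illustration.

```python
# Toy backward slice over a dependence graph (invented example).
from collections import defaultdict

# statement -> statements it depends on (data or control dependence)
deps = defaultdict(set, {
    "s4: print(z)": {"s3: z = x + y"},
    "s3: z = x + y": {"s1: x = input()", "s2: y = 2"},
})

def backward_slice(criterion):
    """All statements the criterion transitively depends on, plus itself."""
    sliced, stack = set(), [criterion]
    while stack:
        stmt = stack.pop()
        if stmt not in sliced:
            sliced.add(stmt)
            stack.extend(deps[stmt])
    return sliced

print(sorted(backward_slice("s4: print(z)")))
```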

  19. Application of industry-standard guidelines for the validation of avionics software

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Shagnea, Anita M.

    1990-01-01

    The application of industry standards to the development of avionics software is discussed, focusing on verification and validation activities. It is pointed out that the procedures that guide the avionics software development and testing process are under increased scrutiny. The DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, are used by the FAA for certifying avionics software. To investigate the effectiveness of the DO-178A guidelines for improving the quality of avionics software, guidance and control software (GCS) is being developed according to the DO-178A development method. It is noted that, due to the extent of the data collection and configuration management procedures, any phase in the life cycle of a GCS implementation can be reconstructed. Hence, a fundamental development and testing platform has been established that is suitable for investigating the adequacy of various software development processes. In particular, the overall effectiveness and efficiency of the development method recommended by the DO-178A guidelines are being closely examined.

  20. Shuttle avionics software trials, tribulations and success

    NASA Technical Reports Server (NTRS)

    Henderson, O. L.

    1985-01-01

    The early problems and the solutions developed to provide the required quality software needed to support the space shuttle engine development program are described. The decision to use a programmable digital control system on the space shuttle engine was primarily based upon the need for a flexible control system capable of supporting the total engine mission on a large complex pump fed engine. The mission definition included all control phases from ground checkout through post shutdown propellant dumping. The flexibility of the controller through reprogrammable software allowed the system to respond to the technical challenges and innovation required to develop both the engine and controller hardware. This same flexibility, however, placed a severe strain on the capability of the software development and verification organization. The overall development program required that the software facility accommodate significant growth in both the software requirements and the number of software packages delivered. This challenge was met by reorganization and evolution in the process of developing and verifying software.

  1. Separating essentials from incidentals: an execution architecture for real-time control systems

    NASA Technical Reports Server (NTRS)

    Dvorak, Daniel; Reinholtz, Kirk

    2004-01-01

    This paper describes an execution architecture that makes real-time control systems far more analyzable and verifiable through aggressive separation of concerns. The architecture separates two key software concerns: transformations of global state, as defined in pure functions; and sequencing/timing of transformations, as performed by an engine that enforces four prime invariants. The important advantage of this architecture, besides facilitating verification, is that it encourages formal specification of systems in a vocabulary that brings systems engineering closer to software engineering.
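
    A minimal sketch of the separation being described, assuming nothing beyond the abstract: state transformations are pure functions, and a small engine owns all sequencing. The two transformations and the single invariant checked here are illustrative stand-ins; the paper's four prime invariants are not enumerated in the abstract:

        # Pure transformations of a global state snapshot; the engine alone
        # decides ordering and enforces an invariant on every step.
        def update_position(state):
            return {**state, "pos": state["pos"] + state["vel"] * state["dt"]}

        def apply_drag(state):
            return {**state, "vel": state["vel"] * 0.99}

        def engine(state, transforms, steps):
            for _ in range(steps):
                for t in transforms:
                    new_state = t(state)
                    assert set(new_state) == set(state), "invariant: fixed state vocabulary"
                    state = new_state
            return state

        print(engine({"pos": 0.0, "vel": 1.0, "dt": 0.1}, [update_position, apply_drag], 3))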

  2. Programs for Testing an SSME-Monitoring System

    NASA Technical Reports Server (NTRS)

    Lang, Andre; Cecil, Jimmie; Heusinger, Ralph; Freestone, Kathleen; Blue, Lisa; Wilkerson, DeLisa; McMahon, Leigh Anne; Hall, Richard B.; Varnavas, Kosta; Smith, Keary; hide

    2007-01-01

    A suite of computer programs has been developed for special test equipment (STE) that is used in verification testing of the Health Management Computer Integrated Rack Assembly (HMCIRA), a ground-based system of analog and digital electronic hardware and software for "flight-like" testing for development of components of an advanced health-management system for the space shuttle main engine (SSME). The STE software enables the STE to simulate the analog input and the data flow of an SSME test firing from start to finish.

  3. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.

  4. The Application of V&V within Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward

    1996-01-01

    Verification and Validation (V&V) is performed during application development for many systems, especially safety-critical and mission-critical systems. The V&V process is intended to discover errors as early as possible during the development process, since early discovery minimizes the cost and other impacts of correcting these errors. In reuse-based software engineering, decisions on the requirements, design, and even implementation of domain assets can be made prior to beginning development of a specific system. In order to bring the effectiveness of V&V to bear within reuse-based software engineering, V&V must be incorporated within the domain engineering process.

  5. ESSAA: Embedded system safety analysis assistant

    NASA Technical Reports Server (NTRS)

    Wallace, Peter; Holzer, Joseph; Guarro, Sergio; Hyatt, Larry

    1987-01-01

    The Embedded System Safety Analysis Assistant (ESSAA) is a knowledge-based tool that can assist in identifying disaster scenarios in which embedded software issues hazardous control commands to the surrounding hardware. ESSAA is intended to work from outputs to inputs, as a complement to simulation and verification methods. Rather than treating the software in isolation, it examines the context in which the software is to be deployed. Given a specified disastrous outcome, ESSAA works from a qualitative, abstract model of the complete system to infer sets of environmental conditions and/or failures that could cause that outcome. The scenarios can then be examined in depth for plausibility using existing techniques.

  6. Simulation verification techniques study. Subsystem simulation validation techniques

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1974-01-01

    Techniques for validation of software modules which simulate spacecraft onboard systems are discussed. An overview of the simulation software hierarchy for a shuttle mission simulator is provided. A set of guidelines for the identification of subsystem/module performance parameters and critical performance parameters is presented. Various sources of reference data to serve as standards of performance for simulation validation are identified. Environment, crew station, vehicle configuration, and vehicle dynamics simulation software are briefly discussed from the point of view of their interfaces with subsystem simulation modules. A detailed presentation of results in the area of vehicle subsystem simulation modules is included. References, conclusions, and recommendations are also given.

  7. Copilot: Monitoring Embedded Systems

    NASA Technical Reports Server (NTRS)

    Pike, Lee; Wegmann, Nis; Niller, Sebastian; Goodloe, Alwyn

    2012-01-01

    Runtime verification (RV) is a natural fit for ultra-critical systems, where correctness is imperative. In ultra-critical systems, even if the software is fault-free, the inherent unreliability of commodity hardware and the adversity of operational environments mean that processing units (and their hosted software) are replicated, and fault-tolerant algorithms are used to compare the outputs. We investigate both software monitoring in distributed fault-tolerant systems and the implementation of fault-tolerance mechanisms using RV techniques. We describe the Copilot language and compiler, specifically designed for generating monitors for distributed, hard real-time systems. We also describe two case studies in which we generated Copilot monitors in avionics systems.
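
    Copilot itself is a Haskell DSL whose monitors compile to C; the Python sketch below only illustrates, under that caveat, the two ideas the abstract combines: majority-voting replicated outputs and monitoring a stream property. All names and numbers are made up:

        from collections import Counter

        def majority_vote(replica_outputs):
            # Fault-tolerance mechanism: accept the value most replicas agree on.
            value, count = Counter(replica_outputs).most_common(1)[0]
            return value if count > len(replica_outputs) // 2 else None

        def monitor(stream, low, high):
            # Runtime verification: flag samples violating a range property.
            return [(i, x) for i, x in enumerate(stream) if not (low <= x <= high)]

        voted = [majority_vote(sample) for sample in [(1.0, 1.0, 9.7), (2.1, 2.1, 2.1)]]
        print(voted)                     # [1.0, 2.1] -- the faulty replica is out-voted
        print(monitor(voted, 0.0, 5.0))  # []  -- no violations after voting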

  8. Simscape Modeling Verification in the Simulink Development Environment

    NASA Technical Reports Server (NTRS)

    Volle, Christopher E. E.

    2014-01-01

    The purpose of the Simulation Product Group of the Control and Data Systems division of the NASA Engineering branch at Kennedy Space Center is to provide a real-time model and simulation of the Ground Subsystems participating in vehicle launching activities. The simulation software is part of the Spaceport Command and Control System (SCCS) and is designed to support integrated launch operation software verification and console operator training. Using MathWorks Simulink tools, modeling engineers currently build models from custom-built blocks to accurately represent ground hardware. This is time consuming and costly due to the rigorous testing and peer reviews required for each custom-built block. Using MathWorks Simscape tools, modeling time can be reduced since no custom code would be developed. After careful research, the group concluded that it is feasible to use Simscape blocks in MATLAB's Simulink. My project this fall was to verify the accuracy of the Crew Access Arm model developed using Simscape tools running in the Simulink development environment.

  9. Version 2.0 Visual Sample Plan (VSP): UXO Module Code Description and Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gilbert, Richard O.; Wilson, John E.; O'Brien, Robert F.

    2003-05-06

    The Pacific Northwest National Laboratory (PNNL) is developing statistical methods for determining the amount of geophysical surveying conducted along transects (swaths) that is needed to achieve specified levels of confidence of finding target areas (TAs) of anomalous readings and possibly unexploded ordnance (UXO) at closed, transferring, and transferred (CTT) Department of Defense (DoD) ranges and other sites. The statistical methods developed by PNNL have been coded into the UXO module of the Visual Sample Plan (VSP) software, which is being developed by PNNL with support from the DoD, the U.S. Department of Energy (DOE), and the U.S. Environmental Protection Agency (EPA). (The VSP software and VSP Users Guide (Hassig et al., 2002) may be downloaded from http://dqo.pnl.gov/vsp.) This report describes and documents the statistical methods developed and the calculations and verification testing that have been conducted to verify that VSP's implementation of these methods is correct and accurate.
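
    To make the underlying question concrete, the toy Monte Carlo below estimates the chance that parallel transects a fixed distance apart traverse a circular target area; it is a hedged illustration of the kind of calculation involved, not VSP's actual statistical method:

        import random

        def detection_probability(spacing, target_radius, trials=100_000):
            hits = 0
            for _ in range(trials):
                center = random.uniform(0.0, spacing)  # target center between two transects
                if min(center, spacing - center) <= target_radius:
                    hits += 1                          # some transect crosses the target
            return hits / trials

        # For a line transect, geometry gives 2 * 10 / 30 = 0.667; the estimate should agree.
        print(detection_probability(spacing=30.0, target_radius=10.0))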

  10. A measurement system for large, complex software programs

    NASA Technical Reports Server (NTRS)

    Rone, Kyle Y.; Olson, Kitty M.; Davis, Nathan E.

    1994-01-01

    This paper describes measurement systems required to forecast, measure, and control activities for large, complex software development and support programs. Initial software cost and quality analysis provides the foundation for meaningful management decisions as a project evolves. In modeling the cost and quality of software systems, the relationship between the functionality, quality, cost, and schedule of the product must be considered. This explicit relationship is dictated by the criticality of the software being developed. This balance between cost and quality is a viable software engineering trade-off throughout the life cycle. Therefore, the ability to accurately estimate the cost and quality of software systems is essential to providing reliable software on time and within budget. Software cost models relate the product error rate to the percent of the project labor that is required for independent verification and validation. The criticality of the software determines which cost model is used to estimate the labor required to develop the software. Software quality models yield an expected error discovery rate based on the software size, criticality, software development environment, and the level of competence of the project and developers with respect to the processes being employed.

  11. Maintaining the Health of Software Monitors

    NASA Technical Reports Server (NTRS)

    Person, Suzette; Rungta, Neha

    2013-01-01

    Software health management (SWHM) techniques complement the rigorous verification and validation processes that are applied to safety-critical systems prior to their deployment. These techniques are used to monitor deployed software in its execution environment, serving as the last line of defense against the effects of a critical fault. SWHM monitors use information from the specification and implementation of the monitored software to detect violations, predict possible failures, and help the system recover from faults. Changes to the monitored software, such as adding new functionality or fixing defects, therefore, have the potential to impact the correctness of both the monitored software and the SWHM monitor. In this work, we describe how the results of a software change impact analysis technique, Directed Incremental Symbolic Execution (DiSE), can be applied to monitored software to identify the potential impact of the changes on the SWHM monitor software. The results of DiSE can then be used by other analysis techniques, e.g., testing, debugging, to help preserve and improve the integrity of the SWHM monitor as the monitored software evolves.

  12. TU-FG-BRB-05: A 3 Dimensional Prompt Gamma Imaging System for Range Verification in Proton Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Draeger, E; Chen, H; Polf, J

    2016-06-15

    Purpose: To report on the initial developments of a clinical 3-dimensional (3D) prompt gamma (PG) imaging system for proton radiotherapy range verification. Methods: The new imaging system under development consists of a prototype Compton camera to measure PG emission during proton beam irradiation and software to reconstruct, display, and analyze 3D images of the PG emission. For initial test of the system, PGs were measured with a prototype CC during a 200 cGy dose delivery with clinical proton pencil beams (ranging from 100 MeV – 200 MeV) to a water phantom. Measurements were also carried out with the CC placedmore » 15 cm from the phantom for a full range 150 MeV pencil beam and with its range shifted by 2 mm. Reconstructed images of the PG emission were displayed by the clinical PG imaging software and compared to the dose distributions of the proton beams calculated by a commercial treatment planning system. Results: Measurements made with the new PG imaging system showed that a 3D image could be reconstructed from PGs measured during the delivery of 200 cGy of dose, and that shifts in the Bragg peak range of as little as 2 mm could be detected. Conclusion: Initial tests of a new PG imaging system show its potential to provide 3D imaging and range verification for proton radiotherapy. Based on these results, we have begun work to improve the system with the goal that images can be produced from delivery of as little as 20 cGy so that the system could be used for in-vivo proton beam range verification on a daily basis.« less

  13. Data Packages for the Hanford Immobilized Low Activity Tank Waste Performance Assessment 2001 Version [SEC 1 THRU 5]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    MANN, F.M.

    Data package supporting the 2001 Immobilized Low-Activity Waste Performance Analysis. Geology, hydrology, geochemistry, facility, waste form, and dosimetry data based on recent investigation are provided. Verification and benchmarking packages for selected software codes are provided.

  14. Home | Simulation Research

    Science.gov Websites

    The Simulation Research Group specializes in the research, development, and deployment of software for building and control energy systems, including the Spawn of EnergyPlus next-generation simulation engine and the OpenBuildingControl tools that support control design, deployment, and verification of building controls.

  15. Simulation based mask defect repair verification and disposition

    NASA Astrophysics Data System (ADS)

    Guo, Eric; Zhao, Shirley; Zhang, Skin; Qian, Sandy; Cheng, Guojie; Vikram, Abhishek; Li, Ling; Chen, Ye; Hsiang, Chingyun; Zhang, Gary; Su, Bo

    2009-10-01

    As the industry moves towards sub-65nm technology nodes, mask inspection, with increased sensitivity and shrinking critical defect size, catches more and more nuisance and false defects. Increased defect counts pose great challenges in post-inspection defect classification and disposition: which defects are real, and among the real defects, which should be repaired, and how should the post-repair defects be verified. In this paper, we address the challenges in mask defect verification and disposition, in particular in post-repair defect verification, with an efficient methodology using SEM mask defect images and optical inspection mask defect images (the latter only for verification of phase- and transmission-related defects). We demonstrate the flow using programmed mask defects in a sub-65nm technology node design. In total, 20 types of defects were designed, including defects found in typical real circuit environments, with 30 different sizes designed for each type. An SEM image was taken for each programmed defect after the test mask was made. Selected defects were repaired and SEM images from the test mask were taken again. Wafers were printed with the test mask before and after repair as defect printability references. A software tool, SMDD (Simulation-based Mask Defect Disposition), has been used in this study. The software is used to extract edges from the mask SEM images and convert them into polygons saved in GDSII format. The converted polygons from the SEM images are then filled with the correct tone to form mask patterns and merged back into the original GDSII design file. This merge is for the purpose of contour simulation, since the SEM images normally cover only a small area (~1 μm) and accurate simulation requires including a larger area for optical proximity effects. With a lithography process model, the resist contour of the area of interest (AOI, the area surrounding a mask defect) can be simulated. If such a complicated model is not available, a simple optical model can be used to obtain the simulated aerial image intensity in the AOI. With built-in contour analysis functions, the SMDD software can easily compare the contour (or intensity) differences between the defect pattern and the normal pattern. With user-provided judging criteria, the software can easily disposition the defect based on contour comparison. In addition, process sensitivity properties, like MEEF and NILS, can be readily obtained in the AOI with a lithography model, making mask defect disposition criteria more intelligent.

  16. NASA software documentation standard software engineering program

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as Standard) can be applied to the documentation of all NASA software. This Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. This basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  17. The New Payload Handling System for the German On-Orbit Verification Satellite TET with the Sensor Bus as Example for Payloads

    NASA Astrophysics Data System (ADS)

    Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.

    2008-08-01

    The technology verification satellite TET (Technologie-Erprobungsträger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support the German space industry and research facilities in the on-orbit verification of satellite technologies. TET is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite bus is based on the successfully operated satellite BIRD, together with a newly developed payload platform whose payload handling system is called NVS (Nutzlastversorgungssystem). The NVS comprises three major parts: the power supply, the processor boards, and the I/O interfaces. The NVS is realized as several PCBs in Europe format, connected to each other via an integrated backplane; the payloads are connected to the NVS by front connectors. This paper describes the concept, architecture, and hardware/software of the NVS. Phase B of the project was successfully finished last year.

  18. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710, Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data that met the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  19. Are Earth System model software engineering practices fit for purpose? A case study.

    NASA Astrophysics Data System (ADS)

    Easterbrook, S. M.; Johns, T. C.

    2009-04-01

    We present analysis and conclusions from a case study of the culture and practices of scientists at the Met Office and Hadley Centre working on the development of software for climate and Earth System models using the MetUM infrastructure. The study examined how scientists think about software correctness, prioritize their requirements in making changes, and develop a shared understanding of the resulting models. We conclude that highly customized techniques, driven strongly by scientific research goals, have evolved for the verification and validation of such models. In a formal software engineering context these represent costly, but invaluable, software integration tests with considerable benefits. The software engineering practices seen also exhibit recognisable features of both agile and open source software development projects: self-organisation of teams consistent with a meritocracy rather than top-down organisation, extensive use of informal communication channels, and software developers who are generally also users and science domain experts. We draw some general conclusions on whether these practices work well, and what new software engineering challenges may lie ahead as Earth System models become ever more complex and petascale computing becomes the norm.

  20. Experimental Evaluation of Verification and Validation Tools on Martian Rover Software

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume; Giannakopoulou, Dimitra; Goldberg, Allen; Havelund, Klaus; Lowry, Mike; Pasareanu, Corina; Venet, Arnaud; Visser, Willem

    2003-01-01

    To achieve its science objectives in deep space exploration, NASA needs science platform vehicles that autonomously make control decisions in a time frame that excludes intervention from Earth-based controllers. Round-trip light time is one significant factor motivating autonomy capability; another is the need to reduce ground support operations cost. An unsolved problem potentially impeding the adoption of autonomy capability is the verification and validation of such software systems, which exhibit far more behaviors (and hence distinct execution paths in the software) than is typical in current deep-space platforms. Hence the need for a study to benchmark advanced verification and validation (V&V) tools on representative autonomy software. The objective of the study was to assess the maturity of different technologies, to provide data indicative of potential synergies between them, and to identify gaps in the technologies with respect to the challenge of autonomy V&V. The study consisted of two parts: first, a set of relatively independent case studies of different tools on the same autonomy code; second, a carefully controlled experiment with human participants on a subset of these technologies. This paper describes the second part of the study. Overall, nearly four hundred hours of data on human use of three different advanced V&V tools were accumulated, with a control group that used conventional testing methods. The experiment simulated four independent V&V teams debugging three successive versions of an executive controller for a Martian rover. Defects were carefully seeded into the three versions based on a profile of defects from CVS logs that occurred in the actual development of the executive controller. The rest of the document is structured as follows. In sections 2 and 3, we respectively describe the tools used in the study and the rover software that was analyzed. In section 4 the methodology for the experiment is described; this includes the code preparation, seeding of defects, participant training, and experimental setup. Next we give a qualitative overview of how the experiment went from the point of view of each technology: model checking (section 5), static analysis (section 6), runtime analysis (section 7), and testing (section 8). The final section gives some preliminary quantitative results on how the tools compared.

  1. Automated Analysis of Stateflow Models

    NASA Technical Reports Server (NTRS)

    Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier

    2017-01-01

    Stateflow is a widely used modeling framework for embedded and cyber-physical systems where control software interacts with physical processes. In this work, we present a framework for fully automated safety verification of Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
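
    The safety-verification step can be pictured as a reachability check over the compiled state machine: explore every transition and confirm no unsafe state is reachable. The sketch below is a hand-rolled toy, not the paper's logic-based engine, and the model it checks is invented:

        def reachable(initial, transitions):
            seen, frontier = {initial}, [initial]
            while frontier:
                state = frontier.pop()
                for nxt in transitions.get(state, ()):
                    if nxt not in seen:
                        seen.add(nxt)
                        frontier.append(nxt)
            return seen

        transitions = {"idle": ["arming"], "arming": ["armed", "idle"], "armed": ["firing"]}
        unsafe = {"firing_while_idle"}
        assert reachable("idle", transitions).isdisjoint(unsafe), "safety property violated"
        print("safety property holds on this model")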

  2. Seismology software: state of the practice

    NASA Astrophysics Data System (ADS)

    Smith, W. Spencer; Zeng, Zheng; Carette, Jacques

    2018-05-01

    We analyzed the state of practice for software development in the seismology domain by comparing 30 software packages on four aspects: product, implementation, design, and process. We found room for improvement in most seismology software packages. The principal areas of concern include a lack of adequate requirements and design specification documents, a lack of test data to assess reliability, a lack of examples to get new users started, and a lack of technological tools to assist with managing the development process. To assist going forward, we provide recommendations for a document-driven development process that includes a problem statement, development plan, requirement specification, verification and validation (V&V) plan, design specification, code, V&V report, and a user manual. We also provide advice on tool use, including issue tracking, version control, code documentation, and testing tools.

  3. Consolidated View on Space Software Engineering Problems - An Empirical Study

    NASA Astrophysics Data System (ADS)

    Silva, N.; Vieira, M.; Ricci, D.; Cotroneo, D.

    2015-09-01

    Independent software verification and validation (ISVV) has been a key process for engineering quality assessment for decades and is considered in several international standards. The European Space Agency (ESA) "ISVV Guide" is used in the European space market to drive ISVV tasks and plans and to select applicable tasks and techniques. Software artefacts have room for improvement, judging by the number of issues found during ISVV tasks. This article presents an analysis of a large set of ISVV issues originating from three different ESA missions, amounting to more than 1000 issues. The study presents the main types, triggers, and impacts of the ISVV issues found and sets the path for a global software engineering improvement based on the most common deficiencies identified in space projects.

  5. Instrument Systems Analysis and Verification Facility (ISAVF) users guide

    NASA Technical Reports Server (NTRS)

    Davis, J. F.; Thomason, J. O.; Wolfgang, J. L.

    1985-01-01

    The ISAVF facility is primarily an interconnected system of computers, special-purpose real-time hardware, and associated generalized software systems, which permits instrument system analysts, design engineers, and instrument scientists to perform trade-off studies, specification development, instrument modeling, and verification of instrument hardware performance. It is not the intent of the ISAVF to duplicate or replace existing special-purpose facilities such as the Code 710 Optical Laboratories or the Code 750 Test and Evaluation facilities. The ISAVF will provide data acquisition and control services for these facilities, as needed, using remote computer stations attached to the main ISAVF computers via dedicated communication lines.

  6. The PLAID graphics analysis impact on the space program

    NASA Technical Reports Server (NTRS)

    Nguyen, Jennifer P.; Wheaton, Aneice L.; Maida, James C.

    1994-01-01

    An ongoing project design often requires visual verification at various stages. These requirements are critically important because the subsequent phases of the project might depend on the complete verification of a particular stage. Currently, several software packages at JSC provide such simulation capabilities. We present the simulation capabilities of the PLAID modeling system used in the Flight Crew Support Division for human factors analyses. We summarize ongoing studies in kinematics, lighting, and EVA activities, and discuss various applications in the mission planning of current Space Shuttle flights and the assembly sequence of Space Station Freedom, with emphasis on the redesign effort.

  7. Verification of statistical method CORN for modeling of microfuel in the case of high grain concentration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Chukbar, B. K., E-mail: bchukbar@mail.ru

    Two methods of modeling a double-heterogeneity fuel are studied: the deterministic positioning and the statistical method CORN of the MCU software package. The effect of the distribution of microfuel in a pebble bed on the calculation results is studied. The results of verification of the statistical method CORN for microfuel concentrations up to 170 cm⁻³ in a pebble bed are presented. The admissibility of homogenizing the microfuel coating with the graphite matrix is studied. The dependence of the reactivity on the relative location of fuel and graphite spheres in a pebble bed is found.

  8. Independent Verification and Validation (IV and V) Criteria

    NASA Technical Reports Server (NTRS)

    McGill, Kenneth

    2000-01-01

    The purpose of this appendix is to establish quantifiable criteria for determining whether IV&V should be applied to a given software development. Since IV&V should begin in the Formulation Subprocess of a project, the process here described is based on metrics which are available before project approval.

  9. Automating Nuclear-Safety-Related SQA Procedures with Custom Applications

    DOE PAGES

    Freels, James D.

    2016-01-01

    Nuclear safety-related procedures are rigorous for good reason. Small design mistakes can quickly turn into unwanted failures. Researchers at Oak Ridge National Laboratory worked with COMSOL to define a simulation app that automates the software quality assurance (SQA) verification process and provides results in less than 24 hours.

  10. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure, and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations, and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. Finally, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
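
    A minimal sketch of the sequential-verification comparison itself, assuming a fixed suite of cases whose outputs are archived per code version; the case names and the zero tolerance are illustrative, not RELAP5-3D's machinery:

        import numpy as np

        def sequential_verify(old_results, new_results, rtol=0.0, atol=0.0):
            # Compare per-case outputs of version N+1 against version N.
            return [case for case in old_results
                    if not np.allclose(new_results[case], old_results[case],
                                       rtol=rtol, atol=atol)]

        old = {"case_restart": np.array([1.0, 2.0]), "case_coupled": np.array([3.0, 4.0])}
        new = {"case_restart": np.array([1.0, 2.0]), "case_coupled": np.array([3.0, 4.1])}
        print(sequential_verify(old, new))  # ['case_coupled'] -> an unintended change to investigate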

  11. Current status of 3D EPID-based in vivo dosimetry in The Netherlands Cancer Institute

    NASA Astrophysics Data System (ADS)

    Mijnheer, B.; Olaciregui-Ruiz, I.; Rozendaal, R.; Spreeuw, H.; van Herk, M.; Mans, A.

    2015-01-01

    3D in vivo dose verification using a-Si EPIDs is performed routinely in our institution for almost all RT treatments. The EPID-based 3D dose distribution is reconstructed using a back-projection algorithm and compared with the planned dose distribution using 3D gamma evaluation. Dose-reconstruction and gamma-evaluation software runs automatically, and deviations outside the alert criteria are immediately available and investigated, in combination with inspection of cone-beam CT scans. The implementation of our 3D EPID-based in vivo dosimetry approach was able to replace pre-treatment verification for more than 90% of the patient treatments. Clinically relevant deviations could be detected for approximately 1 out of 300 patient treatments (IMRT and VMAT). Most of these errors were patient-related anatomical changes or deviations from the routine clinical procedure, and would not have been detected by pre-treatment verification. Moreover, 3D EPID-based in vivo dose verification is a fast and accurate tool to assure the safe delivery of RT treatments. It provides clinically more useful information and is less time consuming than pre-treatment verification measurements. Automated 3D in vivo dosimetry is therefore a prerequisite for large-scale implementation of patient-specific quality assurance of RT treatments.
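
    The gamma evaluation mentioned above combines a dose-difference and a distance-to-agreement criterion; a point passes when some nearby reference point is close in both. A brute-force 1-D sketch with the common 3 mm / 3% criteria follows; the profiles and numbers are illustrative, not the clinical implementation:

        import numpy as np

        def gamma_index(ref_dose, eval_dose, coords, dta_mm=3.0, dose_frac=0.03):
            dose_crit = dose_frac * ref_dose.max()
            gammas = []
            for xm, dm in zip(coords, eval_dose):
                terms = ((coords - xm) / dta_mm) ** 2 + ((ref_dose - dm) / dose_crit) ** 2
                gammas.append(np.sqrt(terms.min()))
            return np.array(gammas)

        x = np.arange(0.0, 50.0, 1.0)                # positions in mm
        planned = np.exp(-((x - 25.0) / 10.0) ** 2)  # toy dose profile
        measured = planned * 1.02                    # 2% systematic offset
        passing = gamma_index(planned, measured, x) <= 1.0
        print(f"gamma pass rate: {passing.mean():.1%}")  # ~100% for a 2% offset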

  12. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias, and is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods.
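
    The core of the correction can be sketched as inverse-probability weighting: each verified subject is up-weighted by one over the probability that a subject with that screen result was verified. The paper embeds such weights in a GEE to compare assays; the toy below (made-up counts and verification rates) only corrects sensitivity and specificity:

        def corrected_accuracy(records, p_verify):
            # records: (screen_result, disease_status) pairs for verified subjects only.
            w = {s: 1.0 / p for s, p in p_verify.items()}
            tp = sum(w[s] for s, d in records if s == 1 and d == 1)
            fn = sum(w[s] for s, d in records if s == 0 and d == 1)
            tn = sum(w[s] for s, d in records if s == 0 and d == 0)
            fp = sum(w[s] for s, d in records if s == 1 and d == 0)
            return tp / (tp + fn), tn / (tn + fp)  # (sensitivity, specificity)

        # All screen-positives verified; only 10% of screen-negatives verified.
        verified = [(1, 1)] * 90 + [(1, 0)] * 10 + [(0, 1)] * 2 + [(0, 0)] * 98
        sens, spec = corrected_accuracy(verified, p_verify={1: 1.0, 0: 0.10})
        print(f"sensitivity {sens:.3f}, specificity {spec:.3f}")  # naive sensitivity would be 90/92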

  13. Closing the Certification Gaps in Adaptive Flight Control Software

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2008-01-01

    Over the last five decades, extensive research has been performed to design and develop adaptive control systems for aerospace systems and other applications where the capability to change controller behavior at different operating conditions is highly desirable. Although adaptive flight control has been partially implemented through the use of gain-scheduled control, truly adaptive control systems using learning algorithms and on-line system identification methods have not seen commercial deployment. The reason is that the certification process for adaptive flight control software for use in the national airspace has not yet been decided. The purpose of this paper is to examine the gaps between the state-of-the-art methodologies used to certify conventional (i.e., non-adaptive) flight control system software and what will likely be needed to satisfy FAA airworthiness requirements. These gaps include the lack of a certification plan or process guide, the need to develop verification and validation tools and methodologies to analyze adaptive controller stability and convergence, and the development of metrics to evaluate adaptive controller performance at off-nominal flight conditions. This paper presents the major certification gap areas, a description of the current state of the verification methodologies, and the further research efforts that will likely be needed to close the gaps remaining in current certification practices. It is envisioned that closing the gaps will require certain advances in simulation methods, comprehensive methods to determine learning algorithm stability and convergence rates, the development of performance metrics for adaptive controllers, the application of formal software assurance methods, the application of on-line software monitoring tools for adaptive controller health assessment, and the development of a certification case for adaptive system safety of flight.

  14. Fault Management Architectures and the Challenges of Providing Software Assurance

    NASA Technical Reports Server (NTRS)

    Savarino, Shirley; Fitz, Rhonda; Fesq, Lorraine; Whitman, Gerek

    2015-01-01

    Fault Management (FM) is focused on safety, the preservation of assets, and maintaining the desired functionality of the system. How FM is implemented varies among missions. Common to most missions is system complexity due to a need to establish a multi-dimensional structure across hardware, software and spacecraft operations. FM is necessary to identify and respond to system faults, mitigate technical risks and ensure operational continuity. Generally, FM architecture, implementation, and software assurance efforts increase with mission complexity. Because FM is a systems engineering discipline with a distributed implementation, providing efficient and effective verification and validation (V&V) is challenging. A breakout session at the 2012 NASA Independent Verification & Validation (IV&V) Annual Workshop titled "V&V of Fault Management: Challenges and Successes" exposed this issue in terms of V&V for a representative set of architectures. NASA's Software Assurance Research Program (SARP) has provided funds to NASA IV&V to extend the work performed at the Workshop session in partnership with NASA's Jet Propulsion Laboratory (JPL). NASA IV&V will extract FM architectures across the IV&V portfolio and evaluate the data set, assess visibility for validation and test, and define software assurance methods that could be applied to the various architectures and designs. This SARP initiative focuses efforts on FM architectures from critical and complex projects within NASA. The identification of particular FM architectures and associated V&V/IV&V techniques provides a data set that can enable improved assurance that a system will adequately detect and respond to adverse conditions. Ultimately, results from this activity will be incorporated into the NASA Fault Management Handbook providing dissemination across NASA, other agencies and the space community. This paper discusses the approach taken to perform the evaluations and preliminary findings from the research.

  15. Fault Management Architectures and the Challenges of Providing Software Assurance

    NASA Technical Reports Server (NTRS)

    Savarino, Shirley; Fitz, Rhonda; Fesq, Lorraine; Whitman, Gerek

    2015-01-01

    The satellite system's Fault Management (FM) is focused on safety, the preservation of assets, and maintaining the desired functionality of the system. How FM is implemented varies among missions. Common to most is system complexity due to a need to establish a multi-dimensional structure across hardware, software, and operations. This structure is necessary to identify and respond to system faults, mitigate technical risks, and ensure operational continuity. These architecture, implementation, and software assurance efforts increase with mission complexity. Because FM is a systems engineering discipline with a distributed implementation, providing efficient and effective verification and validation (V&V) is challenging. A breakout session at the 2012 NASA Independent Verification & Validation (IV&V) Annual Workshop titled "V&V of Fault Management: Challenges and Successes" exposed these issues in terms of V&V for a representative set of architectures. NASA IV&V is funded by NASA's Software Assurance Research Program (SARP) in partnership with NASA's Jet Propulsion Laboratory (JPL) to extend the work performed at the Workshop session. NASA IV&V will extract FM architectures across the IV&V portfolio and evaluate the data set for robustness, assess visibility for validation and test, and define software assurance methods that could be applied to the various architectures and designs. This work focuses efforts on FM architectures from critical and complex projects within NASA. The identification of particular FM architectures, visibility, and associated V&V/IV&V techniques provides a data set that can enable higher assurance that a satellite system will adequately detect and respond to adverse conditions. Ultimately, results from this activity will be incorporated into the NASA Fault Management Handbook, providing dissemination across NASA, other agencies, and the satellite community. This paper discusses the approach taken to perform the evaluations and preliminary findings from the research, including identification of FM architectures, visibility observations, and the methods utilized for V&V/IV&V.

  16. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  17. Software risk management through independent verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John R.; Zhou, Tong C.; Wood, Ralph

    1995-01-01

    Software project managers need tools to estimate and track project goals in a continuous fashion before, during, and after development of a system. In addition, they need an ability to compare the current project status with past project profiles to validate management intuition, identify problems, and then direct appropriate resources to the sources of problems. This paper describes a measurement-based approach to calculating the risk inherent in meeting project goals that leverages past project metrics and existing estimation and tracking models. We introduce the IV&V Goal/Questions/Metrics model, explain its use in the software development life cycle, and describe our attempts to validate the model through the reverse engineering of existing projects.

  18. Software IV and V Research Priorities and Applied Program Accomplishments Within NASA

    NASA Technical Reports Server (NTRS)

    Blazy, Louis J.

    2000-01-01

    The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high-performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance and early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and that requirements are met or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing the programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.

  19. COVERT: A Framework for Finding Buffer Overflows in C Programs via Software Verification

    DTIC Science & Technology

    2010-08-01

    is greater than the allocated size of B. In the case of a type-safe language or a language with runtime bounds checking (such as Java), an overflow...leads either to a (compile-time) type error or a (runtime) exception. In such languages , a buffer overflow can lead to a denial of service attack (i.e...of current and legacy software is written in unsafe languages (such as C or C++) that allow buffers to be overflowed with impunity. For reasons such as

  20. Automated Test for NASA CFS

    NASA Technical Reports Server (NTRS)

    McComas, David C.; Strege, Susanne L.; Carpenter, Paul B.; Hartman, Randy

    2015-01-01

    The core Flight System (cFS) is a flight software (FSW) product line developed by the Flight Software Systems Branch (FSSB) at NASA's Goddard Space Flight Center (GSFC). The cFS uses compile-time configuration parameters to implement variable requirements to enable portability across embedded computing platforms and to implement different end-user functional needs. The verification and validation of these requirements is proving to be a significant challenge. This paper describes the challenges facing the cFS and the results of a pilot effort to apply EXB Solution's testing approach to the cFS applications.
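
    One way to picture the configuration-coverage problem is a behavioral test parameterized over a matrix of build options. The sketch below uses pytest for illustration; the configuration names and the round-trip requirement are hypothetical, not the cFS or EXB interfaces:

        import itertools
        import struct

        import pytest

        # Hypothetical requirement: a command counter must round-trip through a
        # packed header under every supported (table size, endianness) build.
        CONFIGS = list(itertools.product([16, 64], ["<", ">"]))

        @pytest.mark.parametrize("table_size,endian", CONFIGS)
        def test_counter_round_trips(table_size, endian):
            for counter in (0, 1, table_size - 1, 0xFFFF):
                packed = struct.pack(endian + "H", counter)
                assert struct.unpack(endian + "H", packed)[0] == counter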

  1. Formal Verification Toolkit for Requirements and Early Design Stages

    NASA Technical Reports Server (NTRS)

    Badger, Julia M.; Miller, Sheena Judson

    2011-01-01

    Efficient flight software development from natural language requirements needs an effective way to test designs earlier in the software design cycle. A method to automatically derive logical safety constraints and the design state space from natural language requirements is described. The constraints can then be checked using a logical consistency checker and also be used in a symbolic model checker to verify the early design of the system. This method was used to verify a hybrid control design for the suit ports on NASA Johnson Space Center's Space Exploration Vehicle against safety requirements.

  2. Software technology testbed softpanel prototype

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The following subject areas are covered: analysis of using Ada for the development of real-time control systems for the Space Station; analysis of the functionality of the Application Generator; analysis of the User Support Environment criteria; analysis of the SSE tools and procedures which are to be used for the development of ground/flight software for the Space Station; analysis of the CBATS tutorial (an Ada tutorial package); analysis of Interleaf; analysis of the Integration, Test and Verification process of the Space Station; analysis of the DMS on-orbit flight architecture; analysis of the simulation architecture.

  3. "Test" is a Four Letter Word

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pope, G M

    2005-05-03

    For a number of years I had the pleasure of teaching Testing Seminars all over the world and meeting and learning from others in our field. Over a twelve-year period, I always asked the following questions to Software Developers, Test Engineers, and Managers who took my two- or three-day seminar on Software Testing: 'When was the first time you heard the word test?' 'Where were you when you first heard the word test?' 'Who said the word test?' 'How did the word test make you feel?' Most of the thousands of responses were similar to 'It was my third grade teacher at school, and I felt nervous and afraid'. Now there were a few exceptions like 'It was my third grade teacher, and I was happy and excited to show how smart I was'. But by and large, my informal survey found that 'testing' is a word to which most people attach negative meanings, based on its historical context. So why is this important to those of us in the software development business? Because I have found that a preponderance of software developers do not get real excited about hearing that the software they just wrote is going to be 'tested' by the Test Group. Typical reactions I have heard over the years run from 'I'm sure there is nothing wrong with the software, so go ahead and test it, better you find defects than our customers' to these extremes: 'There is no need to test my software because there is nothing wrong with it'. 'You are not qualified to test my software because you don't know as much as I do about it'. 'If any Test Engineers come into our office again to test our software we will throw them through the third floor window'. So why is there such a strong negative reaction to testing? It is primitive. It goes back to grade school for many of us. It is a negative word that conjures up negative emotions. In other words, 'test' is a four letter word. How many of us associate 'Joy' with 'Test'? Not many. It is hard for most of us to reprogram associations learned at an early age. So what can we do about it (short of hypnotic therapy for software developers)? Well, one concept I have used (and still use) is to not call testing 'testing'. Call it something else. Ever wonder why most of the Independent Software Testing groups are called Software Quality Assurance groups? Now you know. Software Quality Assurance is not such a negatively charged phrase, even though Software Quality Assurance is much more than simply testing. It was a real blessing when the concept of Validation and Verification came about for software. Now I define Validation to mean assuring that the product produced does the right thing (usually what the customer wants it to do), and Verification to mean that the product was built the right way (in accordance with some good design principles and practices). So I have deliberately called the System Test Group the Verification and Validation Group, or V&V Group, as a way of avoiding the negative image problem. I remember once having a conversation with a developer colleague who said, in the heat of battle, that it was fine to V&V his code, just don't test it! Once again, V&V includes many things besides testing, but it just doesn't sound like an onerous thing to do to software. In my current job, working at a highly regarded national laboratory with world-renowned physicists, I have again encountered the negativity about testing software. Except here they don't take kindly to Software Quality Assurance or Software Verification and Validation either.
After all, software is just a trivial tool to automate algorithms that implement physics models. Testing, SQA, and V&V take time and get in the way of completing ground-breaking science experiments. So I have again had to change the name of software testing to something less negative in the physics world. I found (the hard way) that if I requested more time to do software experimentation, the physicists' resistance melted. And so the conversation continues: 'We have time to run more software experiments. Just don't waste any time testing the software'! In case the concept of not calling testing 'testing' appeals to you, and there may be an opportunity for you to take the sting out of the name at your place of employment, I have compiled a table of things that testing could be called besides 'testing'. Of course, we can embellish this by adding some good-sounding prefixes and suffixes as well. To come up with alternate names for testing, pick a word from columns A, B, and C in the table below. For instance, Unified Acceptance Trials (A2,B7,C3) or Tailored Observational Demonstration (A6,B5,C5) or Agile Criteria Scoring (A3,B8,C8) or Rapid Requirement Proof (A1,B9,C7) or Satisfaction Assurance (B10,C1). You can probably think of some additional combinations appropriate for your industry.

  4. 76 FR 17407 - Proposed Subsequent Arrangement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-29

    ... United States of America and the Argentine Republic Concerning Peaceful Uses of Nuclear Energy. DATES... originally obtained by KAERI from U.S. Department of Energy's National Nuclear Security Administration Y-12... DEPARTMENT OF ENERGY Proposed Subsequent Arrangement AGENCY: Office of Nonproliferation and...

  5. 77 FR 34367 - Proposed Subsequent Arrangement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-06-11

    ... reactors, and a research reactor, at the Post Irradiation Examination Facility (PIEF), the Irradiated.../2011, ``Post-Irradiation Examination and R&D Programs Using Irradiated Fuels at KAERI,'' dated June... fuel elements for post-irradiation examination and for research, development and manufacture of DUPIC...

  6. Software Fault Tolerance: A Tutorial

    NASA Technical Reports Server (NTRS)

    Torres-Pomales, Wilfredo

    2000-01-01

    Because of our present inability to produce error-free software, software fault tolerance is and will continue to be an important consideration in software systems. The root cause of software design errors is the complexity of the systems. Compounding the problems in building correct software is the difficulty in assessing the correctness of software for highly complex systems. After a brief overview of the software development processes, we note how hard-to-detect design faults are likely to be introduced during development and how software faults tend to be state-dependent and activated by particular input sequences. Although component reliability is an important quality measure for system level analysis, software reliability is hard to characterize and the use of post-verification reliability estimates remains a controversial issue. For some applications software safety is more important than reliability, and fault tolerance techniques used in those applications are aimed at preventing catastrophes. Single version software fault tolerance techniques discussed include system structuring and closure, atomic actions, inline fault detection, exception handling, and others. Multiversion techniques are based on the assumption that software built differently should fail differently and thus, if one of the redundant versions fails, it is expected that at least one of the other versions will provide an acceptable output. Recovery blocks, N-version programming, and other multiversion techniques are reviewed.
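
    The recovery-block scheme mentioned above is simple enough to sketch. The following Python fragment is illustrative only and is not from the tutorial; the routines primary and alternate and the acceptance test are invented for the example.

        # Minimal recovery-block sketch: run the primary routine, check its
        # output with an acceptance test, and fall back to alternates when
        # the test fails or the routine raises.
        def recovery_block(inputs, routines, acceptance_test):
            """Return the first result that passes the acceptance test."""
            for routine in routines:
                try:
                    result = routine(inputs)
                except Exception:
                    continue  # a crash is treated like a failed test
                if acceptance_test(inputs, result):
                    return result
            raise RuntimeError("all alternates failed the acceptance test")

        # Invented example: two square-root routines with a cheap check.
        def primary(x):
            return x ** 0.5

        def alternate(x):  # slower bisection fallback, deliberately different
            lo, hi = 0.0, max(1.0, x)
            for _ in range(100):
                mid = (lo + hi) / 2
                lo, hi = (mid, hi) if mid * mid < x else (lo, mid)
            return lo

        accept = lambda x, r: abs(r * r - x) < 1e-6
        print(recovery_block(2.0, [primary, alternate], accept))

    The key design point, in line with the multiversion assumption stated above, is that the alternates should be built differently from the primary so that they are unlikely to fail on the same inputs.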

  7. SU-F-J-72: A Clinical Usable Integrated Contouring Quality Evaluation Software for Radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jiang, S; Dolly, S; Cai, B

    Purpose: To introduce the Auto Contour Evaluation (ACE) software, a clinically usable, user-friendly, efficient, all-in-one toolbox for automatically identifying common contouring errors in radiotherapy treatment planning using supervised machine learning techniques. Methods: ACE is developed with C# using the Microsoft .NET Framework and Windows Presentation Foundation (WPF) for elegant GUI design and smooth GUI transition animations through the integration of graphics engines and high dots-per-inch (DPI) settings on modern high-resolution monitors. The industry-standard software design pattern, the Model-View-ViewModel (MVVM) pattern, was chosen as the major architecture of ACE for its neat coding structure, deep modularization, easy maintainability, and seamless communication with other clinical software. ACE consists of 1) a patient data importing module integrated with the clinical patient database server, 2) a module that simultaneously displays 2D DICOM images and RT structures, 3) a 3D RT structure visualization module using the Visualization Toolkit (VTK) library, and 4) a contour evaluation module using supervised pattern recognition algorithms to detect contouring errors and display detection results. ACE relies on supervised learning algorithms to handle all image processing and data processing jobs. Implementations of the related algorithms are powered by the Accord.NET scientific computing library for better efficiency and effectiveness. Results: ACE can take a patient's CT images and RT structures from commercial treatment planning software via direct user input or from the patient database. All functionalities, including 2D and 3D image visualization and RT contour error detection, have been demonstrated with real clinical patient cases. Conclusion: ACE implements supervised learning algorithms and combines image processing and graphical visualization modules for RT contour verification. ACE has great potential for automated radiotherapy contouring quality verification. Structured with the MVVM pattern, it is highly maintainable and extensible, and supports smooth connections with other clinical software tools.

  8. Real-Time Extended Interface Automata for Software Testing Cases Generation

    PubMed Central

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interfaces between software components are particularly important due to the large number of complex interactions, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and controlling test inputs. This paper presents real-time extended interface automata (RTEIA), which add clearer and more detailed temporal information through the use of time words. We also establish an input interface automaton for every input in order to solve the problems of input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of RTEIA and the test case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system. PMID:24892080
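
    The flavor of a timed interface check is easy to convey with a toy sketch. The automaton below is not the paper's RTEIA formalism, just a generic state machine whose transitions carry timing guards; the states, events, and bounds are invented.

        # Toy interface automaton with timing guards on transitions.
        transitions = {
            # (state, event) -> (next_state, (min_sec, max_sec))
            ("idle",    "brake_cmd"): ("braking", (0.0, 0.05)),
            ("braking", "release"):   ("idle",    (0.1, 5.0)),
        }

        def step(state, event, elapsed):
            """Advance if the event is enabled and its timing guard holds."""
            key = (state, event)
            if key not in transitions:
                return state, False        # unexpected input: test fails
            nxt, (lo, hi) = transitions[key]
            return (nxt, True) if lo <= elapsed <= hi else (state, False)

        print(step("idle", "brake_cmd", 0.02))   # ('braking', True)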

  9. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange (APEX) Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper surveys the current ISS attitude planning process and its associated requirements, goals, documentation, and software tools, and shows how a software tool could simplify and automate many of the planning actions that occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  10. Intelligent Tools for Planning Knowledge base Development and Verification

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  11. Static and Completion Analysis for Planning Knowledge Base Development and Verification

    NASA Technical Reports Server (NTRS)

    Chien, Steve A.

    1996-01-01

    A key obstacle hampering fielding of AI planning applications is the considerable expense of developing, verifying, updating, and maintaining the planning knowledge base (KB). Planning systems must be able to compare favorably in terms of software lifecycle costs to other means of automation such as scripts or rule-based expert systems.

  12. A Teamwork-Oriented Air Traffic Control Simulator

    DTIC Science & Technology

    2006-06-01

    ...the software development methodology of this work, this chapter is viewed as the acquisition phase of this model. ... because the different controllers working in these phases usually ... traditional operation such as scaling the airport and personalizing the working environment. 4. Pilot Specification ...

  13. U.S. Army Aeromedical Research Laboratory Annual Progress Report FY 1986

    DTIC Science & Technology

    1986-10-01

    ...universities and businesses which parallels the research requirements of the laboratories under the USAMRDC command. Because of the scientific manpower ... Software is being written to allow double-entry verification of data. 2) Small Business Innovation Research: each year, in compliance with the Small ...

  14. A Software Hub for High Assurance Model-Driven Development and Analysis

    DTIC Science & Technology

    2007-01-23

    verification of UML models in TLPVS. In Thomas Baar, Alfred Strohmeier, Ana Moreira, and Stephen J. Mellor, editors, UML 2004 - The Unified Modeling...volume 3785 of Lecture Notes in Computer Science, pages 52–65, Manchester, UK, Nov 2005. Springer. [GH04] Günter Graw and Peter Herrmann. Transformation

  15. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for the modeling and verification of time properties. Drawing on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. Under the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also propose a generating algorithm for temporal logic formulas, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results show that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system.
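
    The real-time properties such a generator emits are typically bounded-response formulas. An illustrative example (not taken from the paper) in metric temporal logic notation:

        \Box \, ( \mathit{cmd\_received} \rightarrow \Diamond_{\le d} \, \mathit{actuator\_response} )

    read: at all times, whenever a command is received, the actuator must respond within the deadline d.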

  16. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification

    PubMed Central

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for the modeling and verification of time properties. Drawing on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. Under the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also propose a generating algorithm for temporal logic formulas, which can automatically extract real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled the simplified flight control system of a UAV to check its real-time properties. The results show that the framework can be used to create the system model, as well as to precisely analyze and verify the real-time reliability of the UAV flight control system. PMID:27918594

  17. 242A Distributed Control System Year 2000 Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEATS, M.C.

    1999-08-31

    This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance. It documents the test results obtained by acceptance testing as directed by procedure HNF-2695. This verification procedure documents the initial testing and evaluation of potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary and the calendar adjustments needed for the leap year. Baseline system performance data will be recorded using the current, as-is operating system software. Data will also be collected for operating system software that has been modified to correct year 2000 problems. This verification procedure is intended to be generic, such that it may be performed on any D/3 (GSE Process Solutions, Inc.) distributed control system that runs with the VMS (Digital Equipment Corporation) operating system. This test may be run on simulation or production systems depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test. These outages are expected to last about 10 minutes each.

  18. 78 FR 72072 - Proposed Subsequent Arrangement

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-02

    ... Uses of Nuclear Energy Between the United States of America and the European Atomic Energy Community... Nuclear Security Administration, Department of Energy. Telephone: 202-586-3806 or email: Sean.Oehlbert... program. KAERI originally obtained the material from the U.S. Department of Energy/National Nuclear...

  19. Feasibility study on dosimetry verification of volumetric-modulated arc therapy-based total marrow irradiation.

    PubMed

    Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K

    2013-03-04

    The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A plan strategy similar to published studies was adopted: the PTV was divided into head and neck, chest, and pelvic regions, with a separate plan for each, composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study was to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
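
    The gamma evaluation used above combines a dose-difference criterion with a distance-to-agreement (DTA) criterion; a point passes when the composite index is at most 1. Below is a minimal one-dimensional Python sketch of the test; clinical tools work on interpolated 2D/3D grids, and the profiles here are toy data.

        import numpy as np

        def gamma_1d(x, measured, computed, dd=0.05, dta=3.0):
            """Gamma index per measured point; values <= 1 pass
            (5% dose difference, 3 mm distance to agreement)."""
            gammas = []
            for xm, dm in zip(x, measured):
                g2 = ((x - xm) / dta) ** 2 + ((computed - dm) / (dd * dm)) ** 2
                gammas.append(np.sqrt(g2.min()))
            return np.array(gammas)

        x = np.linspace(0, 100, 201)                    # position (mm)
        measured = 2.0 * np.exp(-((x - 50) / 30) ** 2)  # toy profile (Gy)
        computed = 1.02 * measured                      # 2% systematic offset
        g = gamma_1d(x, measured, computed)
        print(f"passing rate: {100 * (g <= 1).mean():.1f}%")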

  20. Validation and Verification of LADEE Models and Software

    NASA Technical Reports Server (NTRS)

    Gundy-Burlet, Karen

    2013-01-01

    The Lunar Atmosphere Dust Environment Explorer (LADEE) mission will orbit the Moon in order to measure the density, composition, and time variability of the lunar dust environment. The ground-side and onboard flight software for the mission is being developed using a model-based software methodology. In this technique, models of the spacecraft and flight software are developed in a graphical dynamics modeling package. Flight software requirements are prototyped and refined using the simulated models. After the model is shown to work as desired in this simulation framework, C-code software is automatically generated from the models. The generated software is then tested in real-time Processor-in-the-Loop and Hardware-in-the-Loop test beds. Travelling Road Show test beds were used for early integration tests with payloads and other subsystems. Traditional techniques for verifying computational science models are used to characterize the spacecraft simulation. A lightweight set of formal methods analysis, static analysis, formal inspection, and code coverage analyses is utilized to further reduce defects in the onboard flight software artifacts. These techniques are applied early and often in the development process, iteratively increasing the capabilities of the software and the fidelity of the vehicle models and test beds.

  1. Formal Analysis of BPMN Models Using Event-B

    NASA Astrophysics Data System (ADS)

    Bryans, Jeremy W.; Wei, Wei

    The use of business process models has gone far beyond documentation purposes. In the development of business applications, they can play the role of an artifact on which high level properties can be verified and design errors can be revealed in an effort to reduce overhead at later software development and diagnosis stages. This paper demonstrates how formal verification may add value to the specification, design and development of business process models in an industrial setting. The analysis of these models is achieved via an algorithmic translation from the de-facto standard business process modeling language BPMN to Event-B, a widely used formal language supported by the Rodin platform which offers a range of simulation and verification technologies.
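
    The kind of design error such an analysis reveals can be illustrated with a toy token-game over a BPMN-like flow graph, checking that the process always completes with exactly one token at the end event. This Python sketch only simulates; the paper's approach translates the model to Event-B so that such properties can be proved in Rodin. The task names are invented.

        from collections import Counter

        # A parallel split produces one token per outgoing flow; the
        # synchronizing join fires only when both tokens have arrived.
        flows = {
            "start":        ["split"],
            "split":        ["check_credit", "check_stock"],
            "check_credit": ["join"],
            "check_stock":  ["join"],
            "join":         ["end"],
        }
        join_needs = {"join": 2}   # tokens required before the join fires

        def simulate():
            tokens = Counter({"start": 1})
            while True:
                fired = False
                for node, count in list(tokens.items()):
                    need = join_needs.get(node, 1)
                    if node in flows and count >= need:
                        tokens[node] -= need
                        for succ in flows[node]:
                            tokens[succ] += 1
                        fired = True
                if not fired:
                    return +tokens   # keep only positive counts

        print(simulate())   # Counter({'end': 1}): sound completion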

  2. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results

    NASA Technical Reports Server (NTRS)

    Burken, John J.; Larson, Richard R.

    2009-01-01

    F-15 IFCS project goals are: a) demonstrate control approaches that can efficiently optimize aircraft performance in both normal and failure conditions ([A] and [B] failures); b) advance neural-network-based flight control technology for new aerospace system designs with a pilot in the loop. Gen II objectives include: a) implement and fly a direct adaptive neural-network-based flight controller; b) demonstrate the ability of the system to adapt to simulated system failures: 1) suppress transients associated with failure, 2) re-establish sufficient control and handling of the vehicle for safe recovery; c) provide flight experience for development of verification and validation processes for flight-critical neural network software.

  3. Research on radiation exposure from CT part of hybrid camera and diagnostic CT

    NASA Astrophysics Data System (ADS)

    Solný, Pavel; Zimák, Jaroslav

    2014-11-01

    Research on radiation exposure from the CT component of hybrid cameras was conducted in seven different Departments of Nuclear Medicine (DNM). Processed data and effective dose (E) estimations led to the idea of phantom verification and comparison of absorbed doses with software estimations. Anonymized data from about 100 examinations from each DNM were gathered. The acquired data were processed with dose estimation programs (ExPACT, ImPACT, ImpactDose) with respect to the type of examination and examination procedures. Individual effective doses were calculated using the listed programs. Preserving the same procedure in the dose estimation process allows us to compare the resulting E. Differences and disproportions observed during dose estimation led to the idea of verifying the estimated E. Consequently, two different sets of about 100 TLD 100H detectors were calibrated for measurement inside the Alderson RANDO anthropomorphic phantom. Standard examination protocols were examined using the 2-slice CT component of a hybrid SPECT/CT system. Moreover, phantom exposure from a body examination protocol was also verified for 32-slice and 64-slice diagnostic CT scanners. The absorbed dose (DT,R) measured using TLD detectors was compared with the software estimates of equivalent dose (HT) computed by the E estimation software. Although the limited number of cavities for detectors enabled measurement only within the lung, liver, thyroid, and spleen-pancreas regions, some basic comparison is possible.
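
    The quantities being compared are tied together by the standard ICRP relations: the equivalent dose to a tissue T is the radiation-weighted absorbed dose, and the effective dose is the tissue-weighted sum

        H_T = \sum_R w_R \, D_{T,R} , \qquad E = \sum_T w_T \, H_T ,

    where w_R is the radiation weighting factor (unity for the photons relevant here) and w_T is the ICRP tissue weighting factor, so TLD measurements of D_{T,R} in individual organs can be checked directly against the software's H_T values.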

  4. TransFit: Finite element analysis data fitting software

    NASA Technical Reports Server (NTRS)

    Freeman, Mark

    1993-01-01

    The Advanced X-Ray Astrophysics Facility (AXAF) mission support team has made extensive use of geometric ray tracing to analyze the performance of AXAF developmental and flight optics. One important aspect of this performance modeling is the incorporation of finite element analysis (FEA) data into the surface deformations of the optical elements. TransFit is software designed for fitting FEA data of Wolter I optical surface distortions with a continuous surface description that can then be used by SAO's analytic ray tracing software, currently OSAC (Optical Surface Analysis Code). The improved capabilities of TransFit over previous methods include bicubic spline fitting of FEA data to accommodate higher-spatial-frequency distortions, visualization of the fitted data for assessing the quality of fit, the ability to accommodate input data from three FEA codes plus other standard formats, and options for aligning the model coordinate system with the ray trace coordinate system. TransFit uses the AnswerGarden graphical user interface (GUI) to edit input parameters and then accesses routines written in PV-WAVE, C, and FORTRAN to allow the user to interactively create, evaluate, and modify the fit. The topics covered include an introduction to TransFit (requirements, design philosophy, and implementation); design specifics (modules, parameters, fitting algorithms, and data displays); a procedural example; verification of performance; future work; and appendices on online help and ray trace results of the verification section.
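
    For readers unfamiliar with the technique, bicubic spline fitting of gridded deformation data is a one-liner in modern tooling. The sketch below uses SciPy rather than anything from TransFit itself, and fits a toy deformation map instead of real FEA output:

        import numpy as np
        from scipy.interpolate import RectBivariateSpline

        x = np.linspace(0.0, 1.0, 20)           # axial coordinate
        y = np.linspace(0.0, 2 * np.pi, 36)     # azimuthal coordinate
        X, Y = np.meshgrid(x, y, indexing="ij")
        z = 1e-6 * np.sin(3 * Y) * X ** 2       # toy deformation map

        spline = RectBivariateSpline(x, y, z, kx=3, ky=3)  # bicubic fit
        print(spline(0.5, np.pi))               # evaluate anywhere on grid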

  5. Some key considerations in evolving a computer system and software engineering support environment for the space station program

    NASA Technical Reports Server (NTRS)

    Mckay, C. W.; Bown, R. L.

    1985-01-01

    The space station data management system involves networks of computing resources that must work cooperatively and reliably over an indefinite life span. This program requires a long schedule of modular growth and an even longer period of maintenance and operation. The development and operation of space station computing resources will involve a spectrum of systems and software life cycle activities distributed across a variety of hosts, an integration, verification, and validation host with test bed, and distributed targets. The requirement for the early establishment and use of an appropriate Computer Systems and Software Engineering Support Environment is identified. This environment will support the research and development productivity challenges presented by the space station computing system.

  6. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decisions for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) the conduct of tradeoff and sensitivity analyses; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating systems, software configuration management, and the software development environment facility.

  7. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions, and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents on a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process aided the specialists to consider a wider range of usage scenarios and to identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself.
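
    One of the simplest checks the graph formalism enables is reachability: a step that cannot be reached from the entry point signals an omission or a broken cross-reference. Below is a small sketch using the networkx library; the step names are invented, and the paper's own analysis is richer than this:

        import networkx as nx

        G = nx.DiGraph()
        G.add_edges_from([
            ("screen_patient", "confirm_diagnosis"),
            ("confirm_diagnosis", "administer_dose"),
            ("administer_dose", "monitor_reaction"),
            ("record_adverse_event", "notify_physician"),  # orphaned branch
        ])

        reachable = nx.descendants(G, "screen_patient") | {"screen_patient"}
        print("steps never reached:", set(G.nodes) - reachable)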

  8. Framework for Small-Scale Experiments in Software Engineering: Guidance and Control Software Project: Software Engineering Case Study

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.

    1998-01-01

    Software is becoming increasingly significant in today's critical avionics systems. To achieve safe, reliable software, government regulatory agencies such as the Federal Aviation Administration (FAA) and the Department of Defense mandate the use of certain software development methods. However, little scientific evidence exists to show a correlation between software development methods and product quality. Given this lack of evidence, a series of experiments has been conducted to understand why and how software fails. The Guidance and Control Software (GCS) project is the latest in this series. The GCS project is a case study of the Requirements and Technical Concepts for Aviation RTCA/DO-178B guidelines, Software Considerations in Airborne Systems and Equipment Certification. All civil transport airframe and equipment vendors are expected to comply with these guidelines in building systems to be certified by the FAA for use in commercial aircraft. For the case study, two implementations of a guidance and control application were developed to comply with the DO-178B guidelines for Level A (critical) software. The development included the requirements, design, coding, verification, configuration management, and quality assurance processes. This paper discusses the details of the GCS project and presents the results of the case study.

  9. Guidelines for preparing software user documentation

    NASA Technical Reports Server (NTRS)

    Miller, Diane F.

    1987-01-01

    Clear, easy-to-use software user's manuals make strong demands on special technical communication techniques. Principles and guidelines are given for analyzing the audience and dealing with the wide-ranging backgrounds of potential users. Types of information to be included in a complete manual are suggested, with a technique for creating a user-oriented rather than process-oriented organization. Accuracy verification is emphasized. Simple tips are given for formatting for quick comprehension and reference, for deciding on packaging, for creating helpful illustrations and examples, and for setting up clear and consistent conventions. Simple guidelines are offered for writing clearly and concisely and for editing.

  10. Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd; Schumann, Johann

    2004-01-01

    We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, SPASS, and E-SETHEO.
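
    For orientation, a Hoare-style proof obligation pairs a precondition and a postcondition around a statement and reduces to a first-order formula the ATP must discharge. A toy example (not one of the paper's obligations):

        \{ x \ge 0 \} \;\; y := x + 1 \;\; \{ y > 0 \}
        \quad \leadsto \quad x \ge 0 \Rightarrow x + 1 > 0

    It is formulas of this shape, generated by the thousands from safety properties, that the preprocessing and simplification stages prepare for the prover.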

  11. A Practical Tutorial on Modified Condition/Decision Coverage

    NASA Technical Reports Server (NTRS)

    Hayhurst, Kelly J.; Veerhusen, Dan S.; Chilenski, John J.; Rierson, Leanna K.

    2001-01-01

    This tutorial provides a practical approach to assessing modified condition/decision coverage (MC/DC) for aviation software products that must comply with regulatory guidance for DO-178B level A software. The tutorial's approach to MC/DC is a 5-step process that allows a certification authority or verification analyst to evaluate MC/DC claims without the aid of a coverage tool. In addition to the MC/DC approach, the tutorial addresses factors to consider in selecting and qualifying a structural coverage analysis tool, tips for reviewing life cycle data related to MC/DC, and pitfalls common to structural coverage analysis.
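
    MC/DC requires showing, for each condition in a decision, a pair of tests in which only that condition changes and the decision outcome changes with it. For a decision such as A and (B or C), the independence pairs can be found by brute force; the sketch below illustrates the criterion itself, not the tutorial's 5-step manual process:

        from itertools import product

        decision = lambda a, b, c: a and (b or c)

        for i, name in enumerate("ABC"):
            for vec in product([True, False], repeat=3):
                flipped = list(vec)
                flipped[i] = not flipped[i]
                if decision(*vec) != decision(*flipped):
                    print(f"{name}: {vec} vs {tuple(flipped)}")
                    break   # one independence pair per condition suffices

    For three conditions this yields a minimal test set of four vectors, in line with the usual N+1 rule of thumb for MC/DC.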

  12. SCaN Testbed Software Development and Lessons Learned

    NASA Technical Reports Server (NTRS)

    Kacpura, Thomas J.; Varga, Denise M.

    2012-01-01

    National Aeronautics and Space Administration (NASA) has developed an on-orbit, adaptable, Software Defined Radio (SDR)/Space Telecommunications Radio System (STRS)-based testbed facility to conduct a suite of experiments to advance technologies, reduce risk, and enable future mission capabilities on the International Space Station (ISS). The SCaN Testbed Project will provide NASA, industry, other Government agencies, and academic partners the opportunity to develop and field communications, navigation, and networking technologies in the laboratory and space environment based on reconfigurable SDR platforms and the STRS Architecture. The SDRs are a new technology for NASA, and the support infrastructure they require is different from that of legacy, fixed-function radios. SDRs offer the ability to reconfigure on-orbit communications by changing software for new waveforms and operating systems to enable new capabilities or fix anomalies, which was not previously an option. They are not stand-alone devices, but require a new approach to effectively control them and flow data, and extensive software must be developed to utilize the full potential of these reconfigurable platforms. The paper focuses on development, integration, and testing as related to the avionics processor system, and on the software required to command, control, monitor, and interact with the SDRs, as well as the other communication payload elements. An extensive effort was required to develop the flight software and meet the NASA requirements for software quality and safety. The flight avionics must be radiation tolerant, and these processors have limited capability in comparison to terrestrial counterparts. A big challenge was that there are three SDRs onboard, and interfacing with multiple SDRs simultaneously complicated the effort. The effort also included ground software, which is a key element both for commanding the payload and for displaying data created by the payload. The verification of the software was an extensive effort. The challenges of specifying a suitable test matrix for reconfigurable systems that offer numerous configurations are highlighted. Since flight system testing requires methodical, controlled testing that limits risk, a nearly identical ground system was required to develop the software and write verification procedures before installation and testing on the flight system. The development of the SCaN Testbed was an accelerated effort to meet launch constraints, and this paper discusses tradeoffs made to balance needed software functionality while maintaining the schedule. Future upgrades are discussed that optimize the avionics and allow experimenters to utilize the SCaN Testbed's full potential.

  13. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

    MASOES is an agent-based architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, at both the individual and the collective levels, that favor the analysis of self-organizing and emergent phenomena without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design, in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what MASOES proposes for modeling self-organizing and emergent systems and to the principles of the wisdom-of-crowds paradigm and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested by modeling a community of free software developers that works in the bazaar style, as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.
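
    The FCM machinery underlying the method is compact: concept activations are repeatedly updated from the weighted influences of the other concepts and squashed through a threshold function. A generic sketch follows; the weight matrix is invented, not taken from the MASOES case studies:

        import numpy as np

        W = np.array([[0.0,  0.6, -0.4],
                      [0.3,  0.0,  0.5],
                      [0.0, -0.7,  0.0]])   # W[j, i]: influence of j on i

        sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))

        A = np.array([0.8, 0.2, 0.5])       # initial concept activations
        for _ in range(20):                 # iterate toward a fixed point
            A = sigmoid(W.T @ A)
        print(A)                            # converged activation pattern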

  14. Simple method to verify OPC data based on exposure condition

    NASA Astrophysics Data System (ADS)

    Moon, James; Ahn, Young-Bae; Oh, Sey-Young; Nam, Byung-Ho; Yim, Dong Gyu

    2006-03-01

    In a world where sub-100 nm lithography tools are everyday household items for device makers, device shrinkage is proceeding at a rate no one could have imagined. With the device shrinking at such a high rate, the demand placed on Optical Proximity Correction (OPC) is like never before. To meet this demand with respect to the shrink rate of the device, more aggressive OPC tactics are involved. Aggressive OPC tactics are a must for sub-100 nm lithography technology, but they ultimately leave greater room for OPC error and increase the complexity of the OPC data. Until now, Optical Rule Check (ORC) or Design Rule Check (DRC) was used to verify this complex OPC error, but each of these methods has its pros and cons. ORC verification of OPC data is rather accurate process-wise, but inspection of a full-chip device requires a lot of money (computers, software, ...) and patience (run time). DRC has no such disadvantage, but the accuracy of the verification is a total downfall process-wise. In this study, we were able to create a new method for OPC data verification that combines the best of both the ORC and DRC verification methods. We created a method that inspects the biasing of the OPC data with respect to the illumination condition of the process involved. This new method of verification was applied to the 80 nm tech ISOLATION and GATE layers of a 512M DRAM device and showed accuracy equivalent to ORC inspection with the run time of DRC verification.

  15. Using Statechart Assertion for the Formal Validation and Verification of a Real-Time Software System: A Case Study

    DTIC Science & Technology

    2011-03-01

    could be an entry point into a repeated task (or thread). The following example uses binary semaphores. The VxWorks operating system utilizes binary semaphores via system calls: semTake and semGive. These semaphores are used primarily for mutual exclusion to protect resources from being accessed

  16. High-Assurance Spiral

    DTIC Science & Technology

    2017-11-01

    Cyber-physical systems ... physical processes that interact in intricate manners. This makes verification of the software complex and unwieldy. In this report, an approach towards ... resulting implementations. Subject terms: cyber-physical systems, formal guarantees, code generation.

  17. Air Force Research Laboratory Technology Milestones 2007

    DTIC Science & Technology

    2007-01-01

    ...use of software models to drive development of component-based systems and lightweight domain-specific specification and verification technology. ...

  18. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ..., national, or international standards. (f) The reviewer shall analyze all Fault Tree Analyses (FTA), Failure... cited by the reviewer; (4) Identification of any documentation or information sought by the reviewer...) Identification of the hardware and software verification and validation procedures for the PTC system's safety...

  19. Model-Based Engineering for Supply Chain Risk Management

    DTIC Science & Technology

    2015-09-30

    Expanded use of commercial components has increased the complexity of system assurance ... verification. Model-based engineering (MBE) offers a means to design, develop, analyze, and maintain a complex system architecture. Architecture Analysis ...

  20. OHD/HL - RFCDEV/HVT: 2nd charter

    Science.gov Websites

    ...hydrologists, and managers using existing software (IVP and EVS) and to support the development of CHPS-VS, as well as RFC case studies using these standards. The RFC case studies will include the analyses of capabilities developed at OHD and at the RFCs to produce the standard verification products using IVP, EVS, and ...

  1. Payload crew training complex simulation engineer's handbook

    NASA Technical Reports Server (NTRS)

    Shipman, D. L.

    1984-01-01

    The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.

  2. Virtual Satellite

    NASA Technical Reports Server (NTRS)

    Hammrs, Stephan R.

    2008-01-01

    Virtual Satellite (VirtualSat) is a computer program that creates an environment that facilitates the development, verification, and validation of flight software for a single spacecraft or for multiple spacecraft flying in formation. In this environment, enhanced functionality and autonomy of the navigation, guidance, and control systems of a spacecraft are provided by a virtual satellite, that is, a computational model that simulates the dynamic behavior of the spacecraft. Within this environment, it is possible to execute any associated software, the development of which could benefit from knowledge of, and possible interaction (typically, exchange of data) with, the virtual satellite. Examples of associated software include programs for simulating spacecraft power and thermal-management systems. This environment is independent of the flight hardware that will eventually host the flight software, making it possible to develop the software simultaneously with, or even before, the hardware is delivered. Optionally, by use of interfaces included in VirtualSat, real hardware can be used instead of being simulated. The flight software, coded in the C or C++ programming language, is compilable and loadable into VirtualSat without any special modifications. Thus, VirtualSat can serve as a relatively inexpensive software test bed for the development, testing, integration, and post-launch maintenance of spacecraft flight software.

  3. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Hughes, Mark S.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plum Brook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which provides transparency of the software application layers to the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses the system's technical design, issues encountered, and the status of Stennis development and deployment.

  4. GeMS: an advanced software package for designing synthetic genes.

    PubMed

    Jayaraj, Sebastian; Reid, Ralph; Santi, Daniel V

    2005-01-01

    A user-friendly, advanced software package for gene design is described. The software comprises an integrated suite of programs, also provided as stand-alone tools, that automatically perform the following tasks in gene design: restriction site prediction, codon optimization for any expression host, restriction site inclusion and exclusion, separation of long sequences into synthesizable fragments, Tm and stem-loop determinations, optimal oligonucleotide component design, and design verification/error-checking. The output is a complete design report and a list of optimized oligonucleotides to be prepared for subsequent gene synthesis. The user interface accommodates both inexperienced and experienced users: for inexperienced users, explanatory notes are provided such that detailed instructions are not necessary; for experienced users, a streamlined interface is provided without such notes. The software has been extensively tested in the design and successful synthesis of over 400 kb of genes, many of which exceeded 5 kb in length.
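
    As a flavor of the kinds of calculation such a package automates, the simplest oligonucleotide melting-temperature estimate is the Wallace rule, Tm = 2(A+T) + 4(G+C) degrees C for short oligos. GeMS itself presumably uses more sophisticated nearest-neighbor models, so the sketch below is only illustrative:

        def wallace_tm(seq):
            """Wallace-rule Tm for short (< ~14 nt) oligonucleotides."""
            seq = seq.upper()
            at = seq.count("A") + seq.count("T")
            gc = seq.count("G") + seq.count("C")
            return 2 * at + 4 * gc

        print(wallace_tm("ATGCGTACGTTA"))   # 34 (degrees C)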

  5. Software control and system configuration management - A process that works

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  6. Marshall Space Flight Center Ground Systems Development and Integration

    NASA Technical Reports Server (NTRS)

    Wade, Gina

    2016-01-01

    Ground Systems Development and Integration performs a variety of tasks in support of the Mission Operations Laboratory (MOL) and other Center and Agency projects. These tasks include various systems engineering processes such as performing system requirements development, system architecture design, integration, verification and validation, software development, and sustaining engineering of mission operations systems that has evolved the Huntsville Operations Support Center (HOSC) into a leader in remote operations for current and future NASA space projects. The group is also responsible for developing and managing telemetry and command configuration and calibration databases. Personnel are responsible for maintaining and enhancing their disciplinary skills in the areas of project management, software engineering, software development, software process improvement, telecommunications, networking, and systems management. Domain expertise in the ground systems area is also maintained and includes detailed proficiency in the areas of real-time telemetry systems, command systems, voice, video, data networks, and mission planning systems.

  7. Space station dynamics, attitude control and momentum management

    NASA Technical Reports Server (NTRS)

    Sunkel, John W.; Singh, Ramen P.; Vengopal, Ravi

    1989-01-01

    The Space Station Attitude Control System software test-bed provides a rigorous environment for the design, development and functional verification of GN and C algorithms and software. The approach taken for the simulation of the vehicle dynamics and environmental models using a computationally efficient algorithm is discussed. The simulation includes capabilities for docking/berthing dynamics, prescribed motion dynamics associated with the Mobile Remote Manipulator System (MRMS) and microgravity disturbances. The vehicle dynamics module interfaces with the test-bed through the central Communicator facility which is in turn driven by the Station Control Simulator (SCS) Executive. The Communicator addresses issues such as the interface between the discrete flight software and the continuous vehicle dynamics, and multi-programming aspects such as the complex flow of control in real-time programs. Combined with the flight software and redundancy management modules, the facility provides a flexible, user-oriented simulation platform.

  8. Knowledge-based assistance in costing the space station DMS

    NASA Technical Reports Server (NTRS)

    Henson, Troy; Rone, Kyle

    1988-01-01

    The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
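
    The algorithmic core of any such costing tool is a parametric effort model. The sketch below uses the published basic-COCOMO form (effort = a * KLOC**b person-months) purely as a stand-in; the IBM SCE methodology and SCEAT's knowledge-based assistants are proprietary and are not reproduced here:

        COCOMO_ORGANIC = (2.4, 1.05)   # published basic-COCOMO coefficients

        def effort_pm(kloc, coeffs=COCOMO_ORGANIC):
            a, b = coeffs
            return a * kloc ** b

        print(f"{effort_pm(50):.0f} person-months for 50 KLOC")  # ~146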

  9. Toward an integrated software platform for systems pharmacology

    PubMed Central

    Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki

    2013-01-01

    Understanding complex biological systems requires the extensive support of computational tools. This is particularly true for systems pharmacology, which aims to understand the action of drugs and their interactions in a systems context. Computational models play an important role, as they can be viewed as an explicit representation of biological hypotheses to be tested. A series of software and data resources are used for model development and verification, and for exploring possible behaviors of biological systems that may not be feasible or cost-effective to probe by experiment. Software platforms play a dominant role in supporting creativity and productivity and have transformed many industries; these techniques can be applied to biology as well. Establishing an integrated software platform will be the next important step in the field. PMID:24150748

  10. Software control and system configuration management: A systems-wide approach

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1984-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to insure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  11. Computer modeling in the practice of acoustical consulting: An evolving variety of uses from marketing and diagnosis through design to eventually research

    NASA Astrophysics Data System (ADS)

    Madaras, Gary S.

    2002-05-01

    The use of computer modeling as a marketing, diagnosis, design, and research tool in the practice of acoustical consulting is discussed. From the time it is obtained, the software can be used as an effective marketing tool. It is not until the software basics are learned and some amount of testing and verification occurs that the software can be used as a tool for diagnosing the acoustics of existing rooms. A greater understanding of the output types and formats as well as experience in interpreting the results is required before the software can be used as an efficient design tool. Lastly, it is only after repetitive use as a design tool that the software can be used as a cost-effective means of conducting research in practice. The discussion is supplemented with specific examples of actual projects provided by various consultants within multiple firms. Focus is placed on the use of CATT-Acoustic software and predicting the room acoustics of large performing arts halls as well as other public assembly spaces.

  12. Experiences in improving the state of the practice in verification and validation of knowledge-based systems

    NASA Technical Reports Server (NTRS)

    Culbert, Chris; French, Scott W.; Hamilton, David

    1994-01-01

    Knowledge-based systems (KBSs) are in general use in a wide variety of domains, both commercial and government. As reliance on these types of systems grows, the need to assess their quality and validity reaches critical importance. As with any software, the reliability of a KBS can be directly attributed to the application of disciplined programming and testing practices throughout the development life cycle. However, there are some essential differences between conventional software and KBSs, both in construction and use. These differences affect the verification and validation (V&V) process and the development of techniques to handle them, and their recognition is the basis of considerable ongoing research in this field. For the past three years, IBM (Federal Systems Company - Houston) and the Software Technology Branch (STB) of NASA/Johnson Space Center have been working to improve the 'state of the practice' in V&V of knowledge-based systems. This work was motivated by the need to maintain NASA's ability to produce high quality software while taking advantage of new KBS technology. To date, the primary accomplishment has been the development and teaching of a four-day workshop on KBS V&V. With the hope of improving the impact of these workshops, we also worked directly with NASA KBS projects to employ concepts taught in the workshop. This paper describes two projects that were part of this effort. In addition to describing each project, this paper describes problems encountered and solutions proposed in each case, with particular emphasis on implications for transferring KBS V&V technology beyond the NASA domain.

  13. Optical Testing and Verification Methods for the James Webb Space Telescope Integrated Science Instrument Module Element

    NASA Technical Reports Server (NTRS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; et al.

    2016-01-01

    NASA's James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations and associated scripts, management of hardware and software limits and constraints, tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated for in alignment. We describe how these innovative methods for test planning, execution, and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned from this successful effort in their future testing for other programs.

  14. Neutron Source Facility Training Simulator Based on EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Young Soo; Wei, Thomas Y.; Vilim, Richard B.

    A plant operator training simulator is developed for training plant operators as well as for design verification of the plant control system (PCS) and plant protection system (PPS) for the Kharkov Institute of Physics and Technology Neutron Source Facility. The simulator provides the operator interface for the whole plant, including the sub-critical assembly coolant loop, target coolant loop, secondary coolant loop, and other facility systems. The operator interface is implemented based on the Experimental Physics and Industrial Control System (EPICS), a comprehensive software development platform for distributed control systems. Since its development at Argonne National Laboratory, it has been widely adopted in the experimental physics community, e.g. for control of accelerator facilities. This work is the first implementation for a nuclear facility. The main parts of the operator interface are the plant control panel and plant protection panel. The development involved implementation of the process variable database, sequence logic, and graphical user interface (GUI) for the PCS and PPS utilizing EPICS and related software tools, e.g. the sequencer for sequence logic and Control System Studio (CSS-BOY) for the graphical user interface. For functional verification of the PCS and PPS, a plant model is interfaced, which is a physics-based model of the facility coolant loops implemented as a numerical computer code. The training simulator was tested and demonstrated its effectiveness in various plant operation sequences, e.g. start-up, shut-down, maintenance, and refueling. It was also tested for verification of the plant protection system under various trip conditions.
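
    As a rough illustration of how an EPICS-based simulator like this can be exercised programmatically, the sketch below scripts a trip-condition check over Channel Access, assuming the pyepics client library; the PV names and the trip setpoint are hypothetical placeholders, not taken from the Kharkov facility.

        # Minimal sketch of scripting a plant-protection trip check against
        # an EPICS IOC, assuming the pyepics client library. The PV names
        # (SIM:LOOP1:FLOW, SIM:PPS:TRIP) are hypothetical, not from the paper.
        import time
        from epics import caget, caput

        def verify_low_flow_trip(trip_setpoint=0.8, timeout=5.0):
            """Drive coolant flow below the setpoint and confirm the PPS trips."""
            caput("SIM:LOOP1:FLOW", trip_setpoint * 0.9, wait=True)  # inject fault
            deadline = time.time() + timeout
            while time.time() < deadline:
                if caget("SIM:PPS:TRIP") == 1:   # protection logic asserted
                    return True
                time.sleep(0.1)
            return False

        if __name__ == "__main__":
            print("trip verified:", verify_low_flow_trip())

    A test harness of this shape can iterate over every trip condition in the PPS logic and record which ones assert within their required response time.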

  15. Optical testing and verification methods for the James Webb Space Telescope Integrated Science Instrument Module element

    NASA Astrophysics Data System (ADS)

    Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.

    2016-09-01

    NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), which contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated for in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned from this successful effort in their future testing for other programs.

  16. Design of a verifiable subset for HAL/S

    NASA Technical Reports Server (NTRS)

    Browne, J. C.; Good, D. I.; Tripathi, A. R.; Young, W. D.

    1979-01-01

    An attempt to evaluate the applicability of program verification techniques to an existing programming language, HAL/S, is discussed. HAL/S is a general purpose high level language designed to accommodate the software needs of the NASA Space Shuttle project. Its diversity of features for scientific computing, concurrent and real-time programming, and error handling is discussed. The criteria by which features were evaluated for inclusion into the verifiable subset are described. Individual features of HAL/S are examined with respect to these criteria, and justification for the omission of various features from the subset is provided. Conclusions drawn from the research are presented along with recommendations made for the use of HAL/S with respect to the area of program verification.

  17. Concept Verification Test - Evaluation of Spacelab/Payload operation concepts

    NASA Technical Reports Server (NTRS)

    Mcbrayer, R. O.; Watters, H. H.

    1977-01-01

    The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory, a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and a Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed: two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.

  18. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers.

    PubMed

    Rosso, Osvaldo A; Ospina, Raydonal; Frery, Alejandro C

    2016-01-01

    We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups.
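
    For readers unfamiliar with this feature pipeline, the sketch below computes one of the six descriptors, the normalized Shannon entropy of Bandt and Pompe ordinal patterns, and feeds per-coordinate features to a One-Class SVM; the embedding dimension, the synthetic random-walk "signatures", and the SVM parameters are illustrative assumptions, not the authors' settings.

        # Sketch of the described pipeline: Bandt-Pompe ordinal patterns ->
        # normalized Shannon entropy -> One-Class SVM. Embedding dimension
        # D=4 and all data below are illustrative choices.
        from itertools import permutations
        import numpy as np
        from sklearn.svm import OneClassSVM

        def permutation_entropy(x, D=4):
            """Normalized Shannon entropy of Bandt-Pompe ordinal patterns."""
            patterns = list(permutations(range(D)))
            counts = dict.fromkeys(patterns, 0)
            for i in range(len(x) - D + 1):
                counts[tuple(np.argsort(x[i:i + D]))] += 1
            p = np.array([c for c in counts.values() if c > 0], dtype=float)
            p /= p.sum()
            return -(p * np.log(p)).sum() / np.log(len(patterns))

        # One entropy feature per coordinate trace; the real method adds
        # statistical complexity and Fisher information the same way.
        genuine = [np.cumsum(np.random.randn(500, 2), axis=0) for _ in range(20)]
        X = [[permutation_entropy(s[:, 0]), permutation_entropy(s[:, 1])]
             for s in genuine]
        clf = OneClassSVM(nu=0.1, gamma="scale").fit(X)
        print(clf.predict(X[:3]))   # +1 = accepted as genuine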

  19. Methodology evaluation: Effects of independent verification and integration on one class of application

    NASA Technical Reports Server (NTRS)

    Page, J.

    1981-01-01

    The effects of an independent verification and integration (V and I) methodology on one class of application are described. Resource profiles are discussed. The development environment is reviewed. Seven measures are presented to test the hypothesis that V and I improve the development process and the product. The V and I methodology provided: (1) a decrease in requirements ambiguities and misinterpretation; (2) no decrease in design errors; (3) no decrease in the cost of correcting errors; (4) a decrease in the cost of system and acceptance testing; (5) an increase in early discovery of errors; (6) no improvement in the quality of software put into operation; and (7) a decrease in productivity and an increase in cost.

  20. Automatic documentation system extension to multi-manufacturers' computers and to measure, improve, and predict software reliability

    NASA Technical Reports Server (NTRS)

    Simmons, D. B.

    1975-01-01

    The DOMONIC system has been modified to run on the Univac 1108 and the CDC 6600 as well as the IBM 370 computer system. The DOMONIC monitor system has been implemented to gather data which can be used to optimize the DOMONIC system and to predict the reliability of software developed using DOMONIC. The areas of quality metrics, error characterization, program complexity, program testing, validation and verification are analyzed. A software reliability model for estimating program completion levels, and another on which to base system acceptance, have been developed. The DAVE system, which performs flow analysis and error detection, has been converted from the University of Colorado CDC 6400/6600 computer to the IBM 360/370 computer system for use with the DOMONIC system.

  1. Assuring NASA's Safety and Mission Critical Software

    NASA Technical Reports Server (NTRS)

    Deadrick, Wesley

    2015-01-01

    What is IV&V? Independent Verification and Validation (IV&V) is an objective examination of safety and mission critical software processes and products. Independence rests on three key parameters: technical independence, managerial independence, and financial independence. The NASA IV&V perspective asks whether the system's software will do what it is supposed to do, not do what it is not supposed to do, and respond as expected under adverse conditions. Systems engineering determines whether the right system has been built and whether it has been built correctly. IV&V technical approaches are aligned with IEEE 1012, captured in a Catalog of Methods, and span the full project lifecycle. The IV&V Assurance Strategy is the IV&V project's strategy for providing mission assurance; it is driven by the specific needs of an individual project, implemented via an Assurance Design, and communicated via Assurance Statements.

  2. NASA Software Documentation Standard

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The NASA Software Documentation Standard (hereinafter referred to as "Standard") is designed to support the documentation of all software developed for NASA; its goal is to provide a framework and model for recording the essential information needed throughout the development life cycle and maintenance of a software system. The NASA Software Documentation Standard can be applied to the documentation of all NASA software. The Standard is limited to documentation format and content requirements. It does not mandate specific management, engineering, or assurance standards or techniques. This Standard defines the format and content of documentation for software acquisition, development, and sustaining engineering. Format requirements address where information shall be recorded and content requirements address what information shall be recorded. This Standard provides a framework to allow consistency of documentation across NASA and visibility into the completeness of project documentation. The basic framework consists of four major sections (or volumes). The Management Plan contains all planning and business aspects of a software project, including engineering and assurance planning. The Product Specification contains all technical engineering information, including software requirements and design. The Assurance and Test Procedures contains all technical assurance information, including Test, Quality Assurance (QA), and Verification and Validation (V&V). The Management, Engineering, and Assurance Reports is the library and/or listing of all project reports.

  3. First CLIPS Conference Proceedings, volume 1

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The proceedings of the first conference on CLIPS (the C Language Integrated Production System), hosted by the NASA-Lyndon B. Johnson Space Center in August 1990, are presented. Articles included engineering applications, intelligent tutors and training, intelligent software engineering, automated knowledge acquisition, network applications, verification and validation, enhancements to CLIPS, space shuttle quality control/diagnosis applications, space shuttle and real-time applications, and medical, biological, and agricultural applications.

  4. 15 CFR Supplement No. 1 to Part 730 - Information Collection Requirements Under the Paperwork Reduction Act: OMB Control Numbers

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Triangular Transactions Involving Commodities Covered by a U.S. Import Certificate § 748.10(e). 0694-0012... Delivery Verification Certificate §§ 748.13 and 762.2(b). 0694-0017 International Import Certificate § 748... Assurance Requirement of License Exception TSR (Technology and Software Under Restriction) §§ 740.3(d) and...

  5. Independent Verification and Validation Program

    NASA Technical Reports Server (NTRS)

    Bailey, Brandon T.

    2015-01-01

    Presentation to be given to European Space Agency counterparts to give an overview of NASA's IVV Program and the layout and structure of the Software Testing and Research laboratory maintained at IVV. Seeking STI-ITAR review due to the international audience. Most of the information has been presented to public audiences in the past, with some variations on data, or is in the public domain.

  6. [The study of noninvasive ventilator impeller based on ANSYS].

    PubMed

    Hu, Zhaoyan; Lu, Pan; Xie, Haiming; Zhou, Yaxu

    2011-06-01

    An impeller plays a significant role in a non-invasive ventilator. This paper presents a model of an impeller for a noninvasive ventilator established with the software SolidWorks. The feasibility of the model was studied based on ANSYS, and the stress and strain of the impeller under external loads were then analyzed. The results of the analysis provide verification for the reliable design of impellers.

  7. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development of new design concepts, such as active structures, and new tools, such as a combined structure and control optimization algorithm, and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements, and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters, and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  8. Agile deployment and code coverage testing metrics of the boot software on-board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Parra, Pablo; da Silva, Antonio; Polo, Óscar R.; Sánchez, Sebastián

    2018-02-01

    In this day and age, successful embedded critical software needs agile and continuous development and testing procedures. This paper presents the overall testing and code coverage metrics obtained during the unit testing procedure carried out to verify the correctness of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on-board Solar Orbiter. The ICU boot software is a critical part of the project, so its verification should be addressed at an early development stage; any test case missed in this process may affect the quality of the overall on-board software. According to the European Cooperation for Space Standardization (ECSS) standards, testing this kind of critical software must cover 100% of the source code statement and decision paths. This leads to the complete testing of the fault tolerance and recovery mechanisms that have to resolve every possible memory corruption or communication error brought about by the space environment. The introduced procedure enables fault injection from the beginning of the development process and makes it possible to fulfill the exacting code coverage demands on the boot software.
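
    To make the coverage requirement concrete, here is a minimal, hypothetical sketch (in Python rather than the flight code's language) of the style of test that full decision coverage demands: a fault is injected into a redundant boot-image selection routine so that the recovery branch, and not just the nominal path, is executed.

        # Illustrative sketch (not the EPD flight code) of a fault-injection
        # unit test: a boot loader falls back to a redundant image when a
        # checksum fails, and the test deliberately corrupts the primary
        # image to force the recovery branch.
        import zlib

        def select_boot_image(primary: bytes, backup: bytes,
                              crc_primary: int, crc_backup: int) -> bytes:
            """Return the first image whose CRC32 matches; raise if both fail."""
            if zlib.crc32(primary) == crc_primary:
                return primary
            if zlib.crc32(backup) == crc_backup:   # recovery branch
                return backup
            raise RuntimeError("both boot images corrupted")

        def test_fallback_on_corrupted_primary():
            good = b"\x01" * 64
            crc = zlib.crc32(good)
            corrupted = b"\xff" + good[1:]         # injected bit-flip fault
            assert select_boot_image(corrupted, good, crc, crc) == good

        test_fallback_on_corrupted_primary()
        print("recovery branch exercised")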

  9. Model Transformation for a System of Systems Dependability Safety Case

    NASA Technical Reports Server (NTRS)

    Murphy, Judy; Driskell, Stephen B.

    2010-01-01

    Software plays an increasingly larger role in all aspects of NASA's science missions. This has been extended to the identification, management, and control of faults which affect safety-critical functions and, by extension, the overall success of the mission. Traditionally, the analyses of fault identification, management, and control are hardware based. Due to the increasing complexity of systems, there has been a corresponding increase in the complexity of fault management software. The NASA Independent Verification & Validation (IV&V) program is creating processes and procedures to identify and incorporate safety-critical software requirements along with corresponding software faults so that potential hazards may be mitigated. This "Specific to Generic ... A Case for Reuse" paper describes the phases of a dependability and safety study which identifies a new process to create a foundation for reusable assets. These assets support the identification and management of specific software faults and their transformation from specific to generic software faults. This approach also has applications to other systems outside of the NASA environment. This paper addresses how a mission-specific dependability and safety case is being transformed to a generic dependability and safety case which can be reused for any type of space mission, with an emphasis on software fault conditions.

  10. Reprocessing Close Range Terrestrial and Uav Photogrammetric Projects with the Dbat Toolbox for Independent Verification and Quality Control

    NASA Astrophysics Data System (ADS)

    Murtiyoso, A.; Grussenmeyer, P.; Börlin, N.

    2017-11-01

    Photogrammetry has recently seen a rapid increase in many applications, thanks to developments in computing power and algorithms. Furthermore, with the democratisation of UAVs (Unmanned Aerial Vehicles), close range photogrammetry has seen more and more use due to the easier capability to acquire aerial close range images. In terms of photogrammetric processing, many commercial software solutions exist in the market that offer results from user-friendly environments. However, in most commercial solutions, a black-box approach to photogrammetric calculations is often used. This is understandable in light of the proprietary nature of the algorithms, but it may pose a problem if the results need to be validated in an independent manner. In this paper, the Damped Bundle Adjustment Toolbox (DBAT) developed for Matlab was used to reprocess some photogrammetric projects that were processed using the commercial software Agisoft Photoscan. Several scenarios were run in order to assess the performance of DBAT in reprocessing terrestrial and UAV close range photogrammetric projects under several self-calibration configurations. Results show that DBAT managed to reprocess PS projects and generate metrics which can be useful for project verification.

  11. Upgrades at the NASA Langley Research Center National Transonic Facility

    NASA Technical Reports Server (NTRS)

    Paryz, Roman W.

    2012-01-01

    Several projects have been completed or are nearing completion at the NASA Langley Research Center (LaRC) National Transonic Facility (NTF). The addition of a Model Flow-Control/Propulsion Simulation test capability to the NTF provides a unique, transonic, high-Reynolds number test capability that is well suited for research in propulsion airframe integration studies, circulation control high-lift concepts, powered lift, and cruise separation flow control. A 1992 vintage Facility Automation System (FAS) that performs the control functions for tunnel pressure, temperature, Mach number, model position, safety interlock and supervisory controls was replaced using current, commercially available components. This FAS upgrade also involved a design study for the replacement of the facility Mach measurement system and the development of a software-based simulation model of NTF processes and control systems. The FAS upgrades were validated by a post upgrade verification wind tunnel test. The data acquisition system (DAS) upgrade project involves the design, purchase, build, integration, installation and verification of a new DAS by replacing several early 1990's vintage computer systems with state of the art hardware/software. This paper provides an update on the progress made in these efforts. See reference 1.

  12. [Quality of life and adherence to nebulised antibiotic therapy using a new device in non-cystic fibrosis bronchiectasis].

    PubMed

    Gulini, Martina; Prados, Concepción; Pérez, Amparo; Romero, David; Feliz, Darwin; Gómez Carrera, Luis; Cabanillas, Juan José; Barbero, Javier; Alvarez-Sala, Rodolfo

    2012-01-01

    Analysis of psychological variables (quality of life, depression and anxiety, subjective perception of compliance, level of satisfaction with the electronic device, and objective verification) in order to study therapeutic adherence in non-cystic fibrosis bronchiectasis (NCFB) colonised by Pseudomonas aeruginosa. A cross-sectional descriptive study was conducted on 22 stable NCFB patients colonised by Pseudomonas aeruginosa (two groups of 11 subjects: GROUP 1, with the Pari LC Plus nebuliser device, and GROUP 2, with the I-neb device), both nebulising sodium colistimethate. Several variables were obtained, such as self-perceived health (St George questionnaire), state of mind (HADS scale), and subjective perception of therapeutic compliance, with objective verification using the device software. There were no significant differences in questionnaire results. Therapeutic adherence in GROUP 2 could be proved objectively (up to 60%) using specific software, and the excellent tolerability was shown using a specific questionnaire created for the study. These types of aerosol devices demonstrate a high level of therapeutic adherence that can be measured objectively and differs from what the patients subjectively indicate. Copyright © 2011 Elsevier España, S.L. All rights reserved.

  13. STAR-CCM+ Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pointer, William David

    2016-09-30

    The commercial Computational Fluid Dynamics (CFD) code STAR-CCM+ provides general purpose finite volume method solutions for fluid dynamics and energy transport. This document defines plans for verification and validation (V&V) of the base code and models implemented within the code by the Consortium for Advanced Simulation of Light water reactors (CASL). The software quality assurance activities described herein are part of the overall software life cycle defined in the CASL Software Quality Assurance (SQA) Plan [Sieger, 2015]. STAR-CCM+ serves as the principal foundation for development of an advanced predictive multi-phase boiling simulation capability within CASL. The CASL Thermal Hydraulics Methods (THM) team develops advanced closure models required to describe the subgrid-resolution behavior of secondary fluids or fluid phases in multiphase boiling flows within the Eulerian-Eulerian framework of the code. These include wall heat partitioning models that describe the formation of vapor on the surface and the forces that define bubble/droplet dynamic motion. The CASL models are implemented as user coding or field functions within the general framework of the code. This report defines procedures and requirements for V&V of the multi-phase CFD capability developed by CASL THM. Results of V&V evaluations will be documented in a separate STAR-CCM+ V&V assessment report. This report is expected to be a living document and will be updated as additional validation cases are identified and adopted as part of the CASL THM V&V suite.

  14. Machine-assisted verification of latent fingerprints: first results for nondestructive contact-less optical acquisition techniques with a CWL sensor

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Kiltz, Stefan; Krapyvskyy, Dmytro; Dittmann, Jana; Vielhauer, Claus; Leich, Marcus

    2011-11-01

    A machine-assisted analysis of traces from crime scenes might be possible with the advent of new high-resolution, non-destructive, contact-less acquisition techniques for latent fingerprints. This requires reliable techniques for the automatic extraction of fingerprint features from latent and exemplar fingerprints for matching purposes using pattern recognition approaches. Therefore, we evaluate the NIST Biometric Image Software for the feature extraction and verification of contact-lessly acquired latent fingerprints to determine potential error rates. Our exemplary test setup includes 30 latent fingerprints from 5 people in two test sets that are acquired from different surfaces using a chromatic white light sensor. The first test set includes 20 fingerprints on two different surfaces. It is used to determine the feature extraction performance. The second test set includes one latent fingerprint on 10 different surfaces and an exemplar fingerprint to determine the verification performance. The utilized sensing technique does not require a physical or chemical visibility enhancement of the fingerprint residue, thus the original trace remains unaltered for further investigations. No particular feature extraction and verification techniques have yet been applied to such data. Hence, we see the need for appropriate algorithms that are suitable to support forensic investigations.
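
    A minimal sketch of such a pipeline is shown below, assuming the NBIS command-line tools mindtct (minutiae extraction to .xyt files) and bozorth3 (minutiae matching) are installed and on PATH; the file names are placeholders, and supported input image formats depend on the NBIS build.

        # Sketch of an NBIS-based extraction/verification pipeline, assuming
        # the mindtct and bozorth3 tools from the NIST Biometric Image
        # Software are installed; all file names here are placeholders.
        import subprocess

        def extract_minutiae(image_path: str, out_root: str) -> str:
            """Run mindtct; it writes out_root.xyt with minutiae coordinates."""
            subprocess.run(["mindtct", image_path, out_root], check=True)
            return out_root + ".xyt"

        def match_score(probe_xyt: str, gallery_xyt: str) -> int:
            """bozorth3 prints an integer similarity score to stdout."""
            out = subprocess.run(["bozorth3", probe_xyt, gallery_xyt],
                                 check=True, capture_output=True, text=True)
            return int(out.stdout.split()[0])

        latent = extract_minutiae("latent_cwl_scan.wsq", "latent")
        exemplar = extract_minutiae("exemplar.wsq", "exemplar")
        print("score:", match_score(latent, exemplar))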

  15. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
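
    The comparison step lends itself to a compact sketch: using placeholder dose arrays, the code below checks the mean dose in the target volume and the mean and near-maximum dose D2 in the nontarget volume receiving at least 10 cGy, as described above; the 5% tolerance and the 98th-percentile definition of D2 (the dose exceeded in only 2% of voxels) are assumptions for illustration.

        # Sketch of the per-frame comparison: mean dose in the target, and
        # mean dose plus near-maximum dose D2 in the non-target volume
        # receiving >= 10 cGy. Arrays stand in for planned and
        # EPID-reconstructed dose; the 5% tolerance is illustrative.
        import numpy as np

        def verify(planned, reconstructed, target_mask, tol=0.05):
            """Return True if reconstructed metrics stay within +/-tol of plan."""
            nontarget = (~target_mask) & (planned >= 10.0)   # cGy
            metrics = {
                "target_mean":    (planned[target_mask].mean(),
                                   reconstructed[target_mask].mean()),
                "nontarget_mean": (planned[nontarget].mean(),
                                   reconstructed[nontarget].mean()),
                "nontarget_D2":   (np.percentile(planned[nontarget], 98),
                                   np.percentile(reconstructed[nontarget], 98)),
            }
            return all(abs(r - p) <= tol * p for p, r in metrics.values())

        rng = np.random.default_rng(0)
        plan = rng.uniform(0, 200, (64, 64, 64))
        mask = plan > 150
        ok = verify(plan, plan * rng.normal(1.0, 0.01, plan.shape), mask)
        print("within tolerance:", ok)   # halting logic would key off this flag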

  16. SU-F-I-11: Software Development for 4D-CBCT Research of Real-Time-Image Gated Spot Scanning Proton Therapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fujii, T; Fujii, Y; Shimizu, S

    Purpose: To acquire correct information about the inside of the body during patient positioning for Real-time-image Gated spot scanning Proton Therapy (RGPT), use of the tomographic image at the exhale phase of patient respiration obtained from 4-dimensional cone beam CT (4D-CBCT) has been desired. We developed software named "Image Analysis Platform" for 4D-CBCT research which has a technique to segment projection images based on the 3D marker position in the body. The 3D marker position can be obtained by using the two-axes CBCT system at Hokkaido University Hospital Proton Therapy Center. Performance verification of the software was implemented. Methods: The software retrospectively calculates the 3D marker position by using matching positions on pairs of projection images obtained by the two-axes fluoroscopy mode of the CBCT system. Log data of the 3D marker tracking are output after tracking. By linking the log data and the gantry-angle file of the projection images, all projection images are equally segmented into five spatial phases according to the marker's 3D position in the SI direction and saved to the specified phase folder. Segmented projection images are used for CBCT reconstruction of each phase. As performance verification of the software, a test of the segmented projection images was implemented for a sample CT phantom (Catphan) image acquired by the two-axes fluoroscopy mode of CBCT. A dummy marker was added to the images, with its motion modeled in 3D space as a sin^4 wave function with amplitude 10.0 mm/5.0 mm/0 mm and cycle 4 s/4 s/0 s in the SI/AP/RL directions. Results: The marker was tracked within 0.58 mm accuracy in 3D for all images, and it was confirmed that all projection images were segmented and saved to the correct phase folders. Conclusion: We developed software for 4D-CBCT research which can segment projection images based on the 3D marker position. It will be helpful for creating high quality 4D-CBCT reconstruction images for RGPT.
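
    The segmentation step can be sketched compactly: given the tracked SI positions, each projection is assigned to one of five amplitude bins. The log layout and the synthetic sin^4 trace below are assumptions for illustration, not the actual system's format.

        # Sketch of amplitude-based phase sorting: each projection gets one
        # of five bins from the tracked marker SI position, then feeds a
        # per-phase CBCT reconstruction. The synthetic trace is illustrative.
        import numpy as np

        def segment_by_si_position(si_positions, n_phases=5):
            """Return a phase index (0..n_phases-1) for every projection."""
            si = np.asarray(si_positions, dtype=float)
            edges = np.linspace(si.min(), si.max(), n_phases + 1)
            # digitize maps values on the top edge past the last bin, so clip
            return np.clip(np.digitize(si, edges) - 1, 0, n_phases - 1)

        si_track = 10.0 * np.sin(np.linspace(0, 4 * np.pi, 360)) ** 4  # sin^4, mm
        phases = segment_by_si_position(si_track)
        for k in range(5):
            print(f"phase {k}: {np.sum(phases == k)} projections")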

  17. Clouds and the Earth's Radiant Energy System (CERES) Visualization Single Satellite Footprint (SSF) Plot Generator

    NASA Technical Reports Server (NTRS)

    Barsi, Julia A.

    1995-01-01

    The first Clouds and the Earth's Radiant Energy System (CERES) instrument will be launched in 1997 to collect data on the Earth's radiation budget. The data retrieved from the satellite will be processed through twelve subsystems. The Single Satellite Footprint (SSF) plot generator software was written to assist scientists in the early stages of CERES data analysis, producing two-dimensional plots of the footprint radiation and cloud data generated by one of the subsystems. Until the satellite is launched, however, software developers need verification tools to check their code. This plot generator will aid programmers by geolocating algorithm results on a global map.

  18. Independent calculation-based verification of IMRT plans using a 3D dose-calculation engine.

    PubMed

    Arumugam, Sankar; Xing, Aitang; Goozee, Gary; Holloway, Lois

    2013-01-01

    Independent monitor unit verification of intensity-modulated radiation therapy (IMRT) plans requires detailed 3-dimensional (3D) dose verification. The aim of this study was to investigate using a 3D dose engine in a second commercial treatment planning system (TPS) for this task, facilitated by in-house software. Our department has XiO and Pinnacle TPSs, both with IMRT planning capability and modeled for an Elekta-Synergy 6MV photon beam. These systems allow the transfer of computed tomography (CT) data and RT structures between them but do not allow IMRT plans to be transferred. To provide this connectivity, an in-house computer programme was developed to convert radiation therapy prescription (RTP) files as generated by many planning systems into either XiO or Pinnacle IMRT file formats. Utilization of the technique and software was assessed by transferring 14 IMRT plans from XiO and Pinnacle onto the other system and performing 3D dose verification. The accuracy of the conversion process was checked by comparing the 3D dose matrices and dose volume histograms (DVHs) of structures for the recalculated plan on the same system. The developed software successfully transferred IMRT plans generated by 1 planning system into the other. Comparison of planning target volume (PTV) DVHs for the original and recalculated plans showed good agreement; a maximum difference of 2% in mean dose, -2.5% in D95, and 2.9% in V95 was observed. Similarly, a DVH comparison of organs at risk showed a maximum difference of +7.7% between the original and recalculated plans for structures in both high- and medium-dose regions. However, for structures in low-dose regions (less than 15% of prescription dose) a difference in mean dose up to +21.1% was observed between XiO and Pinnacle calculations. A dose matrix comparison of original and recalculated plans in the XiO and Pinnacle TPSs was performed using gamma analysis with 3%/3mm criteria. The mean and standard deviation of pixels passing gamma tolerance for XiO-generated IMRT plans were 96.1 ± 1.3, 96.6 ± 1.2, and 96.0 ± 1.5 in the axial, coronal, and sagittal planes, respectively. Corresponding results for Pinnacle-generated IMRT plans were 97.1 ± 1.5, 96.4 ± 1.2, and 96.5 ± 1.3 in the axial, coronal, and sagittal planes, respectively. © 2013 American Association of Medical Dosimetrists.
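
    Since the gamma test carries the quantitative weight here, a simplified global 2D gamma sketch may help: it brute-forces the combined dose-difference/distance-to-agreement search with numpy (edge wrap-around from np.roll is ignored for brevity), and the synthetic dose planes are placeholders rather than clinical data.

        # Simplified sketch of a 3%/3mm global gamma test: for each reference
        # pixel, minimize the combined dose-difference / distance-to-agreement
        # metric over nearby evaluated pixels.
        import numpy as np

        def gamma_pass_rate(ref, ev, spacing=1.0, dd=0.03, dta=3.0):
            """Fraction of pixels with gamma <= 1 (global dose normalization)."""
            norm = dd * ref.max()
            r = int(np.ceil(dta / spacing))      # search radius in pixels
            gam = np.full_like(ref, np.inf, dtype=float)
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    dist2 = (dy * spacing) ** 2 + (dx * spacing) ** 2
                    if dist2 > dta ** 2:
                        continue
                    shifted = np.roll(np.roll(ev, dy, axis=0), dx, axis=1)
                    g = (shifted - ref) ** 2 / norm ** 2 + dist2 / dta ** 2
                    gam = np.minimum(gam, g)     # keep best squared gamma
            return float((np.sqrt(gam) <= 1.0).mean())

        rng = np.random.default_rng(1)
        plan = rng.uniform(10, 100, (80, 80))
        print(f"pass rate: {100 * gamma_pass_rate(plan, plan * 1.01):.1f}%")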

  19. Common Data Acquisition Systems (DAS) Software Development for Rocket Propulsion Test (RPT) Test Facilities - A General Overview

    NASA Technical Reports Server (NTRS)

    Hebert, Phillip W., Sr.; Hughes, Mark S.; Davis, Dawn M.; Turowski, Mark P.; Holladay, Wendy T.; Marshall, Peggy L.; Duncan, Michael E.; Morris, Jon A.; Franzl, Richard W.

    2012-01-01

    The advent of the commercial space launch industry and NASA's more recent resumption of operation of Stennis Space Center's large test facilities after thirty years of contractor control resulted in a need for non-proprietary data acquisition system (DAS) software to support government and commercial testing. The software is designed for modularity and adaptability to minimize the software development effort for current and future data systems. An additional benefit of the software's architecture is its ability to easily migrate to other testing facilities, thus providing future commonality across Stennis. Adapting the software to other Rocket Propulsion Test (RPT) Centers such as MSFC, White Sands, and Plumbrook Station would provide additional commonality and help reduce testing costs for NASA. Ultimately, the software provides the government with unlimited rights and guarantees privacy of data to commercial entities. The project engaged all RPT Centers and NASA's Independent Verification & Validation facility to enhance product quality. The design consists of a translation layer, which provides transparency of the software application layers to the underlying hardware regardless of test facility location, and a flexible and easily accessible database. This presentation addresses the system technical design, issues encountered, and the status of Stennis' development and deployment.
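
    The translation-layer concept can be illustrated with a small sketch: application code sees one channel interface while facility-specific backends encapsulate the hardware, which is what lets the same software migrate between test stands. The class names below are hypothetical, not from the Stennis system.

        # Sketch of a hardware translation layer: the application talks to
        # one abstract channel interface; facility-specific backends hide
        # the acquisition hardware. Class names are illustrative.
        from abc import ABC, abstractmethod

        class DaqChannel(ABC):
            @abstractmethod
            def read(self) -> float: ...

        class PxiChannel(DaqChannel):
            def __init__(self, slot: int, chan: int):
                self.slot, self.chan = slot, chan
            def read(self) -> float:
                return 0.0          # a real backend would call the PXI driver

        class SimulatedChannel(DaqChannel):
            def __init__(self, value: float):
                self.value = value
            def read(self) -> float:
                return self.value   # same app code runs with no hardware

        def log_scan(channels: list[DaqChannel]) -> list[float]:
            return [c.read() for c in channels]   # hardware-agnostic layer

        print(log_scan([SimulatedChannel(3.14), SimulatedChannel(2.72)]))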

  20. International collaboration on used fuel disposition crystalline rocks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Yifeng; Gardner, Payton; Kim, Geon-Young

    Active participation in international R&D is crucial for achieving the UFD long-term goals of conducting "experiments to fill data needs and confirm advanced modeling approaches" (by 2015) and of having a "robust modeling and experimental basis for evaluation of multiple disposal system options" (by 2020). DOE's Office of Nuclear Energy (NE) and its Office of Used Fuel Disposition Research and Development (UFD) have developed a strategic plan to advance cooperation with international partners. The international collaboration on the evaluation of crystalline disposal media at Sandia National Laboratories (SNL) in FY16 focused on the following four activities: (1) thermal-hydrologic-mechanical-chemical modeling of single fracture evolution; (2) simulations of flow and transport in the Bedrichov Tunnel, Czech Republic; (3) completion of streaming potential testing at the Korea Atomic Energy Research Institute (KAERI); and (4) technical data exchange with KAERI on thermal-hydrologic-mechanical (THM) properties and specifications of bentonite buffer materials. The first two activities are part of the Development of Coupled Models and their Validation against Experiments (DECOVALEX-2015) project.

  1. Current and anticipated uses of thermal hydraulic codes in Korea

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kim, Kyung-Doo; Chang, Won-Pyo

    1997-07-01

    In Korea, the current uses of thermal hydraulic codes are categorized into 3 areas. The first application is in designing both nuclear fuel and the NSSS. The codes have usually been introduced based on the technology transfer programs agreed between KAERI and the foreign vendors. Another area is in supporting plant operations and licensing by the utility. The third category is research. In this area, assessments and some applications to safety issue resolutions are the major activities using best estimate thermal hydraulic codes such as RELAP5/MOD3 and CATHARE2. Recently, KEPCO plans to couple thermal hydraulic codes with a neutronics code for the design of an evolutionary type reactor by 2004. KAERI also plans to develop its own best estimate thermal hydraulic code; however, its application range is different from that of the code KEPCO is developing. Considering these activities, it is anticipated that use of a best estimate thermal hydraulic analysis code developed in Korea may be possible in the area of safety evaluation within 10 years.

  2. Evidence Arguments for Using Formal Methods in Software Certification

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Pai, Ganesh

    2013-01-01

    We describe a generic approach for automatically integrating the output generated from a formal method/tool into a software safety assurance case, as an evidence argument, by (a) encoding the underlying reasoning as a safety case pattern, and (b) instantiating it using the data produced from the method/tool. We believe this approach not only improves the trustworthiness of the evidence generated from a formal method/tool, by explicitly presenting the reasoning and mechanisms underlying its genesis, but also provides a way to gauge the suitability of the evidence in the context of the wider assurance case. We illustrate our work by application to a real example, an unmanned aircraft system, where we invoke a formal code analysis tool from its autopilot software safety case, automatically transform the verification output into an evidence argument, and then integrate it into the former.

  3. Automated Test Environment for a Real-Time Control System

    NASA Technical Reports Server (NTRS)

    Hall, Ronald O.

    1994-01-01

    An automated environment with hardware-in-the-loop has been developed by Rocketdyne Huntsville for test of a real-time control system. The target system of application is the man-rated real-time system which controls the Space Shuttle Main Engines (SSME). The primary use of the environment is software verification and validation, but it is also useful for evaluation and analysis of SSME avionics hardware and mathematical engine models. It provides a test bed for the integration of software and hardware. The principles and skills upon which it operates may be applied to other target systems, such as those requiring hardware-in-the-loop simulation and control system development. Potential applications are in problem domains demanding highly reliable software systems requiring testing to formal requirements and verifying successful transition to/from off-nominal system states.

  4. The Role and Quality of Software Safety in the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Layman, Lucas; Basili, Victor R.; Zelkowitz, Marvin V.

    2010-01-01

    In this study, we examine software safety risk in the early design phase of the NASA Constellation spaceflight program. Obtaining an accurate, program-wide picture of software safety risk is difficult across multiple, independently-developing systems. We leverage one source of safety information, hazard analysis, to provide NASA quality assurance managers with information regarding the ongoing state of software safety across the program. The goal of this research is two-fold: 1) to quantify the relative importance of software with respect to system safety; and 2) to quantify the level of risk presented by software in the hazard analysis. We examined 154 hazard reports created during the preliminary design phase of three major flight hardware systems within the Constellation program. To quantify the importance of software, we collected metrics based on the number of software-related causes and controls of hazardous conditions. To quantify the level of risk presented by software, we created a metric scheme to measure the specificity of these software causes. We found that 49-70% of hazardous conditions in the three systems could be caused by software, or software was involved in the prevention of the hazardous condition. We also found that 12-17% of the 2,013 hazard causes involved software, and that 23-29% of all causes had a software control. Furthermore, 10-12% of all controls were software-based. There is potential for inaccuracy in these counts, however, as software causes are not consistently scoped, and the presence of software in a cause or control is not always clear. The application of our software specificity metrics also identified risks in the hazard reporting process. In particular, we found that a number of traceability risks in the hazard reports may impede verification of software and system safety.

  5. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing.

    PubMed

    Yassin, Ali A

    2014-01-01

    Now, the security of digital images is considered more and more essential, and the fingerprint plays a main role in the world of imaging. Furthermore, fingerprint recognition is a scheme of biometric verification that applies pattern recognition techniques to the image of an individual's fingerprint. In the cloud environment, an adversary has the ability to intercept information, which must therefore be secured from eavesdroppers. Unfortunately, encryption and decryption functions are slow and often costly. Fingerprint techniques require extra hardware and software, and can be defeated by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are being verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint together with the discrete wavelet transform to obtain a new scheme of fingerprint verification. Our proposed scheme can overcome those problems; it incurs little cost, reduces the computational requirements for huge volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that our proposed scheme performs well in user fingerprint verification.
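
    A minimal sketch of the partial-encryption idea follows, assuming the PyWavelets package: only the low-frequency approximation band of a one-level 2D DWT is masked, so the cipher touches roughly a quarter of the coefficients. The XOR keystream stands in for a real cipher, and rounding the band to integers is a simplification of this sketch, not the paper's method.

        # Sketch of DWT-based partial encryption, assuming PyWavelets: only
        # the approximation band cA is masked; detail bands pass through.
        import numpy as np
        import pywt

        def mask_band(band, key):
            """XOR the (rounded) band with a keyed stream; XOR also unmasks."""
            ks = np.random.default_rng(key).integers(0, 1 << 16, band.shape)
            return (np.round(band).astype(np.int64) ^ ks).astype(float)

        def partial_encrypt(image, key=1234, wavelet="haar"):
            cA, detail = pywt.dwt2(image.astype(float), wavelet)
            return pywt.idwt2((mask_band(cA, key), detail), wavelet)

        img = np.random.default_rng(0).integers(0, 256, (128, 128)).astype(float)
        enc = partial_encrypt(img)
        print("image:", img.shape, "-> encrypted:", enc.shape)

    Because XOR with the same keystream is its own inverse, the holder of the key can unmask the approximation band before reconstruction, while an eavesdropper sees an image whose coarse structure is scrambled.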

  6. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing

    PubMed Central

    Yassin, Ali A.

    2014-01-01

    Now, the security of digital images is considered more and more essential, and the fingerprint plays a main role in the world of imaging. Furthermore, fingerprint recognition is a scheme of biometric verification that applies pattern recognition techniques to the image of an individual's fingerprint. In the cloud environment, an adversary has the ability to intercept information, which must therefore be secured from eavesdroppers. Unfortunately, encryption and decryption functions are slow and often costly. Fingerprint techniques require extra hardware and software, and can be defeated by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are being verified at the same time, the mechanism becomes slow. In this paper, we employ partial encryption of the user's fingerprint together with the discrete wavelet transform to obtain a new scheme of fingerprint verification. Our proposed scheme can overcome those problems; it incurs little cost, reduces the computational requirements for huge volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that our proposed scheme performs well in user fingerprint verification. PMID:27355051

  7. Design for Verification: Using Design Patterns to Build Reliable Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Koga, Dennis (Technical Monitor)

    2003-01-01

    Components so far have been mainly used in commercial software development to reduce time to market. While some effort has been spent on formal aspects of components, most of this was done in the context of programming language or operating system framework integration. As a consequence, increased reliability of composed systems is mainly regarded as a side effect of more rigorous testing of pre-fabricated components. In contrast to this, Design for Verification (D4V) puts the focus on component-specific property guarantees, which are used to design systems with high reliability requirements. D4V components are domain-specific design pattern instances with well-defined property guarantees and usage rules, which are suitable for automatic verification. The guaranteed properties are explicitly used to select components according to key system requirements. The D4V hypothesis is that the same general architecture and design principles leading to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the limitations of conventional reliability assurance measures, such as too large a state space or too many execution paths.

  8. SPR Hydrostatic Column Model Verification and Validation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
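
    The core hydrostatic bookkeeping behind such a model is simple to sketch: the wellhead pressure equals the pressure at a reference interface minus the weight of the fluid columns stacked above it, so a slow interface movement (a small leak) shows up as a wellhead pressure drift. The densities, heights, and interface pressure below are illustrative values, not SPR data.

        # Sketch of the hydrostatic relation: wellhead pressure is the
        # interface pressure minus the weight of the stacked fluid columns.
        # All numbers are illustrative, not SPR values.
        G = 9.81  # m/s^2

        def wellhead_pressure(p_interface_pa, columns):
            """columns: list of (density kg/m^3, height m), top to bottom."""
            return p_interface_pa - G * sum(rho * h for rho, h in columns)

        # nitrogen column over crude oil, above an interface at depth
        cols = [(150.0, 300.0), (850.0, 300.0)]   # (rho, h) per fluid segment
        p_int = 9.0e6                             # Pa at the oil/brine interface
        print(f"wellhead pressure: {wellhead_pressure(p_int, cols)/1e6:.2f} MPa")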

  9. Using ICT techniques for improving mechatronic systems' dependability

    NASA Astrophysics Data System (ADS)

    Miron, Emanuel; Silva, João P. M. A.; Machado, José; Olaru, Dumitru; Prisacaru, Gheorghe

    2013-10-01

    The use of analysis techniques such as simulation and formal verification for industrial controllers is complex in an industrial context. This complexity is due to the fact that such techniques sometimes require high investment in specifically skilled human resources with sufficient theoretical knowledge in those domains. This paper aims, mainly, to show that it is possible to obtain a timed automata model for formal verification purposes from the CAD model of a mechanical component. This systematic approach can be used by companies for the analysis of industrial controller programs. For this purpose, the paper discusses the best way to systematize these procedures; it describes only the first step of a complex process and promotes a discussion of the main difficulties that can be found, along with a possible way to handle them. A library for formal verification purposes is obtained from original 3D CAD models using a Software as a Service (SaaS) platform; SaaS has nowadays become a common delivery model for many applications because it is typically accessed by users via the internet.

  10. Real time radiotherapy verification with Cherenkov imaging: development of a system for beamlet verification

    NASA Astrophysics Data System (ADS)

    Pogue, B. W.; Krishnaswamy, V.; Jermyn, M.; Bruza, P.; Miao, T.; Ware, William; Saunders, S. L.; Andreozzi, J. M.; Gladstone, D. J.; Jarvis, L. A.

    2017-05-01

    Cherenkov imaging has been shown to allow near real time imaging of the beam entrance and exit on patient tissue, with the appropriate intensified camera and associated image processing. A dedicated system has been developed for research into full torso imaging of whole breast irradiation, where the dual camera system captures the beam shape for all beamlets used in this treatment protocol. Particularly challenging verification measurements exist in dynamic wedge, field-in-field, and boost delivery, and the system was designed to capture these as they are delivered. Two intensified CMOS (ICMOS) cameras were developed and mounted in a breast treatment room, and pilot studies for intensity and stability were completed. Software tools to contour the treatment area have been developed and are being tested prior to initiation of the full trial. At present, it is possible to record delivery of individual beamlets as small as a single MLC leaf thickness, and readout at 20 frames per second is achieved. Statistical analysis of system repeatability and stability is presented, as well as pilot human studies.

  11. What are the ultimate limits to computational techniques: verifier theory and unverifiability

    NASA Astrophysics Data System (ADS)

    Yampolskiy, Roman V.

    2017-09-01

    Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.

  12. Program Model Checking: A Practitioner's Guide

    NASA Technical Reports Server (NTRS)

    Pressburger, Thomas T.; Mansouri-Samani, Masoud; Mehlitz, Peter C.; Pasareanu, Corina S.; Markosian, Lawrence Z.; Penix, John J.; Brat, Guillaume P.; Visser, Willem C.

    2008-01-01

    Program model checking is a verification technology that uses state-space exploration to evaluate large numbers of potential program executions. Program model checking provides improved coverage over testing by systematically evaluating all possible test inputs and all possible interleavings of threads in a multithreaded system. Model-checking algorithms use several classes of optimizations to reduce the time and memory requirements for analysis, as well as heuristics for meaningful analysis of partial areas of the state space. Our goal in this guidebook is to assemble, distill, and demonstrate emerging best practices for applying program model checking. We offer it as a starting point and introduction for those who want to apply model checking to software verification and validation. The guidebook will not discuss any specific tool in great detail, but we provide references for specific tools.
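
    To ground the idea, here is a toy explicit-state model checker (not one of the tools the guidebook references): it breadth-first-explores every interleaving of two lock-protected threads and checks a mutual-exclusion safety property, which is exactly the systematic evaluation of thread interleavings described above.

        # Toy explicit-state model checker: BFS over all reachable states
        # and thread interleavings, checking a mutual-exclusion property.
        from collections import deque

        # State: (pc0, pc1, lock) with pc in {"idle", "trying", "critical"}
        def successors(state):
            pcs, lock = list(state[:2]), state[2]
            for t in (0, 1):                       # all thread interleavings
                pc = pcs[t]
                if pc == "idle":
                    yield nxt(pcs, t, "trying", lock)
                elif pc == "trying" and not lock:  # acquire only if free
                    yield nxt(pcs, t, "critical", True)
                elif pc == "critical":
                    yield nxt(pcs, t, "idle", False)

        def nxt(pcs, t, pc, lock):
            out = list(pcs); out[t] = pc
            return (out[0], out[1], lock)

        def check(init):
            seen, queue = {init}, deque([init])
            while queue:
                s = queue.popleft()
                if s[0] == s[1] == "critical":     # safety violation
                    return False, len(seen)
                for s2 in successors(s):
                    if s2 not in seen:
                        seen.add(s2); queue.append(s2)
            return True, len(seen)                 # property holds everywhere

        print(check(("idle", "idle", False)))      # (True, #states explored)

    Real tools add the optimizations the guidebook discusses, such as partial-order reduction and state hashing, to keep the explored set tractable.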

  13. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers

    PubMed Central

    Ospina, Raydonal; Frery, Alejandro C.

    2016-01-01

    We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. PMID:27907014

  14. Mission Control Center (MCC) System Specification for the Shuttle Orbital Flight Test (OFT) Timeframe

    NASA Technical Reports Server (NTRS)

    1976-01-01

    System specifications to be used by the mission control center (MCC) for the shuttle orbital flight test (OFT) time frame are described. The three support systems discussed are the communication interface system (CIS), the data computation complex (DCC), and the display and control system (DCS), all of which may interface with, and share processing facilities with, other applications processing supporting current MCC programs. The MCC shall provide centralized control of the space shuttle OFT from launch through orbital flight, entry, and landing until the Orbiter comes to a stop on the runway. This control shall include the functions of vehicle management in the areas of hardware configuration (verification), flight planning, communication and instrumentation configuration management, trajectory, software and consumables, payloads management, flight safety, and verification of test conditions/environment.

  15. Specification and Verification of Secure Concurrent and Distributed Software Systems

    DTIC Science & Technology

    1992-02-01

    primitive search strategies work for operating systems that contain relatively few operations. As the number of operations increases, so does the...others have granted him access to, etc. The burden of security falls on the operating system, although appropriate hardware support can minimize the...Guttag, J. Horning, and R. Levin. Synchronization primitives for a multiprocessor: a formal specification. Symposium on Operating System Principles

  16. Safety in Numbers

    DTIC Science & Technology

    2010-11-27

    analysis and verification. While at Wisconsin, Dr. Gopan was awarded the CISCO fellowship for two consecutive years. Mr. John Phillips has many years...using short (56-bit) keys for encryption (e.g., with DES or RC5) [45]. Today, it is used to understand protein folding [10]. IBM's World Community...Bicocca. Dipartimento di Informatica, Sistemistica e Comunicazione. Laboratorio di Test e Analisi del Software, Milano. Technical Report LTA:2004:05

  17. Fly-by-light technology development plan

    NASA Technical Reports Server (NTRS)

    Todd, J. R.; Williams, T.; Goldthorpe, S.; Hay, J.; Brennan, M.; Sherman, B.; Chen, J.; Yount, Larry J.; Hess, Richard F.; Kravetz, J.

    1990-01-01

    The driving factors and developments which make fly-by-light (FBL) technology viable are discussed. Documentation, analyses, and recommendations are provided on the major issues pertinent to facilitating the U.S. implementation of commercial FBL aircraft before the turn of the century. Areas of particular concern include ultra-reliable computing (hardware/software); electromagnetic environment (EME); verification and validation; optical techniques; life-cycle maintenance; and the basis and procedures for certification.

  18. SU-D-BRC-03: Development and Validation of an Online 2D Dose Verification System for Daily Patient Plan Delivery Accuracy Check

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, J; Hu, W; Xing, Y

    Purpose: All plan verification systems for particle therapy are designed to do plan verification before treatment. However, the actual dose distributions during patient treatment are not known. This study develops an online 2D dose verification tool to check the daily dose delivery accuracy. Methods: A Siemens particle treatment system with a modulated scanning spot beam is used in our center. In order to do online dose verification, we made a program to reconstruct the delivered 2D dose distributions based on the daily treatment log files and depth dose distributions. From the log files we can get the focus size, position, and particle number for each spot. A gamma analysis is used to compare the reconstructed dose distributions with the dose distributions from the TPS to assess the daily dose delivery accuracy. To verify the dose reconstruction algorithm, we compared the reconstructed dose distributions to dose distributions measured using a PTW 729XDR ion chamber matrix for 13 real patient plans. We then analyzed 100 treatment beams (58 carbon and 42 proton) for prostate, lung, ACC, NPC and chordoma patients. Results: For algorithm verification, the gamma passing rate was 97.95% for the 3%/3mm and 92.36% for the 2%/2mm criteria. For patient treatment analysis, the results were 97.7%±1.1% and 91.7%±2.5% for carbon and 89.9%±4.8% and 79.7%±7.7% for proton using the 3%/3mm and 2%/2mm criteria, respectively. The reason for the lower passing rate for the proton beam is that the focus size deviations were larger than for the carbon beam. The average focus size deviations were −14.27% and −6.73% for proton and −5.26% and −0.93% for carbon in the x and y directions, respectively. Conclusion: The verification software meets our requirements to check for daily dose delivery discrepancies. Such tools can enhance the current treatment plan and delivery verification processes and improve the safety of clinical treatments.
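
    The core metric here is the gamma analysis. As a minimal sketch (assuming both dose maps share one grid and using a global dose criterion; the brute-force search and the absence of sub-pixel interpolation are simplifications of ours), a 2D gamma passing rate with, e.g., 3%/3mm criteria can be computed as follows:

      import numpy as np

      def gamma_pass_rate(d_eval, d_ref, spacing_mm, dd=0.03, dta_mm=3.0, cutoff=0.10):
          """Percentage of reference points with gamma <= 1 (global normalization)."""
          norm = dd * d_ref.max()                      # dose-difference criterion
          search = int(np.ceil(dta_mm / spacing_mm))   # search radius in pixels
          ny, nx = d_ref.shape
          passed = total = 0
          for y in range(ny):
              for x in range(nx):
                  if d_ref[y, x] < cutoff * d_ref.max():
                      continue                         # skip the low-dose region
                  total += 1
                  best = np.inf
                  for dy in range(-search, search + 1):
                      for dx in range(-search, search + 1):
                          yy, xx = y + dy, x + dx
                          if 0 <= yy < ny and 0 <= xx < nx:
                              g2 = ((dy * dy + dx * dx) * spacing_mm**2) / dta_mm**2 \
                                   + ((d_eval[yy, xx] - d_ref[y, x]) / norm) ** 2
                              best = min(best, g2)
                  passed += best <= 1.0
          return 100.0 * passed / total

      # Synthetic check: a Gaussian spot shifted by 1 mm should largely pass 3%/3mm.
      y, x = np.mgrid[0:50, 0:50]
      ref = np.exp(-((x - 25.0)**2 + (y - 25.0)**2) / 100.0)
      ev = np.exp(-((x - 26.0)**2 + (y - 25.0)**2) / 100.0)
      print(f"gamma pass rate: {gamma_pass_rate(ev, ref, spacing_mm=1.0):.1f}%")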

  19. Software Validation via Model Animation

    NASA Technical Reports Server (NTRS)

    Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.

    2015-01-01

    This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
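
    The comparison step itself is straightforward to sketch. In the minimal example below, two hypothetical functions stand in for the PVSio-animated model and the fielded implementation (here a trivially simple kinematic position; the names, input ranges, and tolerance are assumptions of ours); both are evaluated on a random test suite and required to agree up to the tolerance.

      import random

      def model_position(x0, v, t):      # stand-in for the formal-model output
          return x0 + v * t

      def impl_position(x0, v, t):       # stand-in for the fielded implementation,
          return x0 + v * t              # e.g. called through a test harness

      def animate_and_compare(n=10_000, tol=1e-9, seed=42):
          rng = random.Random(seed)
          worst = 0.0
          for _ in range(n):
              x0, v, t = (rng.uniform(-1e4, 1e4) for _ in range(3))
              err = abs(model_position(x0, v, t) - impl_position(x0, v, t))
              worst = max(worst, err)
              assert err <= tol, f"model/implementation disagree by {err} on ({x0}, {v}, {t})"
          return worst

      print(f"max deviation over the random suite: {animate_and_compare():.2e}")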

  20. Advanced software integration: The case for ITV facilities

    NASA Technical Reports Server (NTRS)

    Garman, John R.

    1990-01-01

    The array of technologies and methodologies involved in the development and integration of avionics software has moved almost as rapidly as computer technology itself. Future avionics systems involve major advances and risks in the following areas: (1) Complexity; (2) Connectivity; (3) Security; (4) Duration; and (5) Software engineering. From an architectural standpoint, the systems will be much more distributed, involve session-based user interfaces, and have the layered architectures typified in the layers-of-abstraction concepts popular in networking. The NASA Space Station Freedom will typify the highly distributed nature of software development itself. Systems composed of independent components developed in parallel must be bound by rigid standards and interfaces, and by clean requirements and specifications. Avionics software provides a challenge in that it cannot be flight tested until the first time it literally flies. It is the binding of requirements for such an integration environment into the advances and risks of future avionics systems that forms the basis of the presented concept and the basic Integration, Test, and Verification concept within the development and integration life cycle of Space Station Mission and Avionics systems.
