Sample records for verification test problems

  1. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code be evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary--but not sufficient--step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.
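
    Item (5)'s strong-sense benchmark pairs an exact solution with an explicit acceptance criterion, and in verification practice such criteria are often stated as an observed order of convergence. The sketch below shows that calculation; the error values and the first-order tolerance are hypothetical placeholders, not data from the report.

    ```python
    import math

    def observed_order(errors, refinement_ratio=2.0):
        """Observed convergence order between successive grids:
        p = log(E_coarse / E_fine) / log(r)."""
        return [math.log(ec / ef) / math.log(refinement_ratio)
                for ec, ef in zip(errors, errors[1:])]

    # Hypothetical L1 errors on grids refined by a factor of 2 each time.
    errors = [4.0e-2, 2.1e-2, 1.05e-2]
    orders = observed_order(errors)
    print(orders)  # ~[0.93, 1.0], consistent with a first-order method

    # A simple acceptance criterion: observed order within 0.2 of theory.
    assert all(abs(p - 1.0) < 0.2 for p in orders)
    ```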

  2. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  3. Standardized Definitions for Code Verification Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code, and to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.
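
    A standardized definition of this kind is essentially a structured record: enough fields to reproduce the problem setup in any code. The sketch below shows one hypothetical shape such a record could take; the schema and field names are ours, not from the document or from ExactPack, though the Sod initial state shown is the standard one.

    ```python
    from dataclasses import dataclass, field

    @dataclass
    class VerificationProblem:
        """Hypothetical standardized definition of a code verification
        test problem: enough information to set it up in a physics code."""
        name: str
        geometry: str                 # e.g. "planar", "cylindrical", "spherical"
        domain: tuple                 # spatial extent, e.g. (0.0, 1.0)
        end_time: float
        gamma: float                  # ideal-gas adiabatic index
        initial_state: dict = field(default_factory=dict)
        boundary_conditions: dict = field(default_factory=dict)

    sod = VerificationProblem(
        name="Sod shock tube", geometry="planar", domain=(0.0, 1.0),
        end_time=0.2, gamma=1.4,
        initial_state={"left":  {"rho": 1.0,   "p": 1.0, "u": 0.0},
                       "right": {"rho": 0.125, "p": 0.1, "u": 0.0}},
        boundary_conditions={"x0": "transmissive", "x1": "transmissive"})
    print(sod.name, sod.end_time)
    ```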

  4. Verification and benchmark testing of the NUFT computer code

    NASA Astrophysics Data System (ADS)

    Lee, K. H.; Nitao, J. J.; Kulshrestha, A.

    1993-10-01

    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasianalytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.
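
    As an illustration of the kind of closed-form comparison such verification testing relies on (not necessarily one of the seven problems in the report), the sketch below checks a numerical temperature profile against the classical semi-infinite-solid conduction solution; the parameter values and the stand-in "numerical" output are hypothetical.

    ```python
    import math

    def semi_infinite_temperature(x, t, alpha, T0, Ts):
        """Analytical temperature in a semi-infinite solid whose surface is
        held at Ts for t > 0 (initially uniform at T0):
        T = Ts + (T0 - Ts) * erf(x / (2*sqrt(alpha*t)))."""
        return Ts + (T0 - Ts) * math.erf(x / (2.0 * math.sqrt(alpha * t)))

    # Hypothetical check of a numerical profile against the exact solution.
    alpha, T0, Ts, t = 1.0e-6, 20.0, 100.0, 3600.0   # SI units
    xs = [0.01 * i for i in range(1, 11)]
    exact = [semi_infinite_temperature(x, t, alpha, T0, Ts) for x in xs]
    numerical = [Te + 0.05 for Te in exact]          # stand-in code output
    err = max(abs(n - e) for n, e in zip(numerical, exact))
    print(f"max abs error = {err:.3f} K")            # acceptance: err < 0.1 K
    ```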

  5. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su-Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  6. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification determines whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process, with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describe four essential elements of high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References: Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
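
    A representative closed-form solution of the kind such QA suites compare against is the Ogata-Banks solution for 1-D advection-dispersion; whether it appears in PFLOTRAN's suite is not stated here, and the parameter values below are hypothetical.

    ```python
    import math

    def ogata_banks(x, t, v, D, c0):
        """Ogata-Banks solution for 1-D advection-dispersion with a constant
        concentration c0 injected at x = 0 into an initially clean column."""
        a = (x - v * t) / (2.0 * math.sqrt(D * t))
        b = (x + v * t) / (2.0 * math.sqrt(D * t))
        return 0.5 * c0 * (math.erfc(a) + math.exp(v * x / D) * math.erfc(b))

    # Concentration profile at t = 1 day (hypothetical parameters).
    v, D, c0, t = 1.0, 0.1, 1.0, 1.0
    for x in (0.5, 1.0, 1.5):
        print(x, round(ogata_banks(x, t, v, D, c0), 4))
    ```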

  7. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose

    2018-03-01

    Presented in this document is a small portion of the tests that exist in the Sierra/SolidMechanics (Sierra/SM) verification test suite. Most of these tests are run nightly with the Sierra/SM code suite, and the results of each test are checked against the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and a comparison of the Sierra/SM code results to the analytic solution are provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra/SM Example Problems Manual. Note that many other verification tests exist in the Sierra/SM test suite but have not yet been included in this manual.

  8. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel Jose

    Presented in this document is a small portion of the tests that exist in the Sierra/SolidMechanics (Sierra/SM) verification test suite. Most of these tests are run nightly with the Sierra/SM code suite, and the results of each test are checked against the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and a comparison of the Sierra/SM code results to the analytic solution are provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra/SM Example Problems Manual. Note that many other verification tests exist in the Sierra/SM test suite but have not yet been included in this manual.

  9. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  10. Enhanced verification test suite for physics simulation codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, James R.; Brock, Jerry S.; Brandon, Scott T.

    2008-09-01

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations.

  11. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. A clearer understanding of how verification is itself a political problem, and of how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  12. The Sedov Blast Wave as a Radial Piston Verification Test

    DOE PAGES

    Pederson, Clark; Brown, Bart; Morgan, Nathaniel

    2016-06-22

    The Sedov blast wave is of great utility as a verification problem for hydrodynamic methods. The typical implementation uses an energized cell of finite dimensions to represent the energy point source. We avoid this approximation by directly finding the effects of the energy source as a boundary condition (BC). Furthermore, the proposed method transforms the Sedov problem into an outward moving radial piston problem with a time-varying velocity. A portion of the mesh adjacent to the origin is removed and the boundaries of this hole are forced with the velocities from the Sedov solution. This verification test is implemented on two types of meshes, and convergence is shown. Our results from the typical initial condition (IC) method and the new BC method are compared.
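
    A minimal sketch of the boundary-forcing idea follows: the velocity applied at the edge of the excised hole is interpolated in time from a precomputed Sedov solution. The interpolation scheme and the tabulated values are illustrative assumptions, not the authors' implementation.

    ```python
    import bisect

    def boundary_velocity(t, table_t, table_u):
        """Time-varying piston velocity applied at the edge of the excised
        hole around the origin, linearly interpolated from a precomputed
        Sedov solution sampled at the hole radius (table values hypothetical)."""
        i = min(max(bisect.bisect_left(table_t, t), 1), len(table_t) - 1)
        w = (t - table_t[i - 1]) / (table_t[i] - table_t[i - 1])
        return table_u[i - 1] + w * (table_u[i] - table_u[i - 1])

    # Hypothetical table: the boundary velocity decays in time as the
    # blast wave moves away from the origin.
    t_tab = [0.1, 0.2, 0.4, 0.8]
    u_tab = [2.0, 1.3, 0.8, 0.5]
    print(boundary_velocity(0.3, t_tab, u_tab))  # 1.05
    ```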

  13. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization, and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
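
    Convergence rates such as those estimated here are commonly obtained by a least-squares fit of log(error) against log(mesh size) over several refinements. The sketch below shows that fit on hypothetical error data; it is not output from HIGRAD.

    ```python
    import math

    def fit_rate(hs, errors):
        """Least-squares slope of log(error) vs log(h): the estimated
        convergence rate across several mesh resolutions."""
        xs = [math.log(h) for h in hs]
        ys = [math.log(e) for e in errors]
        n = len(xs)
        xbar, ybar = sum(xs) / n, sum(ys) / n
        num = sum((x - xbar) * (y - ybar) for x, y in zip(xs, ys))
        den = sum((x - xbar) ** 2 for x in xs)
        return num / den

    # Hypothetical errors from three refinements of one test problem.
    print(round(fit_rate([0.04, 0.02, 0.01], [8.0e-3, 2.1e-3, 5.5e-4]), 2))
    # ~1.93: consistent with a second-order scheme
    ```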

  14. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

    The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large in size so that they address the same basic problems associated with the design, fabrication, assembly, and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests, and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management, and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterizations, and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications that include mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  15. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for automatic verification of the proposed benchmarks. The verification has shown that literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
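
    The paper's expression library is C++; the toy Python analog below shows the interval-estimation idea it automates, bounding a function's range over a box. The two-operation Interval class is a deliberate simplification, not the authors' library.

    ```python
    class Interval:
        """Minimal interval arithmetic: enough for polynomial-style
        expressions (a toy analog of the paper's C++ library)."""
        def __init__(self, lo, hi):
            self.lo, self.hi = lo, hi
        def __add__(self, o):
            return Interval(self.lo + o.lo, self.hi + o.hi)
        def __mul__(self, o):
            ps = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
            return Interval(min(ps), max(ps))
        def __repr__(self):
            return f"[{self.lo}, {self.hi}]"

    # Enclosure of f(x, y) = x*x + x*y on the box x in [1, 2], y in [-1, 1].
    x, y = Interval(1.0, 2.0), Interval(-1.0, 1.0)
    print(x * x + x * y)  # [-1.0, 6.0]: encloses the true range [0, 6]
    ```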

  16. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  17. Verification of cardiac mechanics software: benchmark problems and solutions for testing active and passive material behaviour.

    PubMed

    Land, Sander; Gurev, Viatcheslav; Arens, Sander; Augustin, Christoph M; Baron, Lukas; Blake, Robert; Bradley, Chris; Castro, Sebastian; Crozier, Andrew; Favino, Marco; Fastl, Thomas E; Fritz, Thomas; Gao, Hao; Gizzi, Alessio; Griffith, Boyce E; Hurtado, Daniel E; Krause, Rolf; Luo, Xiaoyu; Nash, Martyn P; Pezzuto, Simone; Plank, Gernot; Rossi, Simone; Ruprecht, Daniel; Seemann, Gunnar; Smith, Nicolas P; Sundnes, Joakim; Rice, J Jeremy; Trayanova, Natalia; Wang, Dafang; Jenny Wang, Zhinuo; Niederer, Steven A

    2015-12-08

    Models of cardiac mechanics are increasingly used to investigate cardiac physiology. These models are characterized by a high level of complexity, including the particular anisotropic material properties of biological tissue and the actively contracting material. A large number of independent simulation codes have been developed, but a consistent way of verifying the accuracy and replicability of simulations is lacking. To aid in the verification of current and future cardiac mechanics solvers, this study provides three benchmark problems for cardiac mechanics. These benchmark problems test the ability to accurately simulate pressure-type forces that depend on the deformed object's geometry; anisotropic and spatially varying material properties similar to those seen in the left ventricle; and active contractile forces. The benchmark was solved by 11 different groups to generate consensus solutions, with typical differences in higher-resolution solutions at approximately 0.5%, and consistent results between linear, quadratic and cubic finite elements as well as different approaches to simulating incompressible materials. Online tools and solutions are made available to allow these tests to be effectively used in verification of future cardiac mechanics software.

  18. Verification testing of the PKI collector at Sandia National Laboratories, Albuquerque, New Mexico

    NASA Technical Reports Server (NTRS)

    Hauger, J. S.; Pond, S. L.

    1982-01-01

    Verification testing of a solar collector was undertaken prior to its operation as part of an industrial process heat plant at Capitol Concrete Products in Topeka, Kansas. Testing was performed at a control plant installed at Sandia National Laboratory, Albuquerque, New Mexico (SNLA). Early results show that plant performance is even better than anticipated and far in excess of test criteria. Overall plant efficiencies of 65 to 80 percent were typical during hours of good insolation. A number of flaws and imperfections were detected during operability testing, the most important being a problem in elevation drive alignment due to a manufacturing error. All problems were corrected as they occurred and the plant, with over 40 hours of operation, is currently continuing operability testing in a wholly-automatic mode.

  19. Verification testing of the PKI collector at Sandia National Laboratories, Albuquerque, New Mexico

    NASA Astrophysics Data System (ADS)

    Hauger, J. S.; Pond, S. L.

    1982-07-01

    Verification testing of a solar collector was undertaken prior to its operation as part of an industrial process heat plant at Capitol Concrete Products in Topeka, Kansas. Testing was performed at a control plant installed at Sandia National Laboratory, Albuquerque, New Mexico (SNLA). Early results show that plant performance is even better than anticipated and far in excess of test criteria. Overall plant efficiencies of 65 to 80 percent were typical during hours of good insolation. A number of flaws and imperfections were detected during operability testing, the most important being a problem in elevation drive alignment due to a manufacturing error. All problems were corrected as they occurred and the plant, with over 40 hours of operation, is currently continuing operability testing in a wholly-automatic mode.

  20. TOUGH Simulations of the Updegraff's Set of Fluid and Heat Flow Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, G.J.; Pruess

    1992-11-01

    The TOUGH code [Pruess, 1987] for two-phase flow of water, air, and heat in permeable media has been exercised on a suite of test problems originally selected and simulated by C. D. Updegraff [1989]. These include five 'verification' problems for which analytical or numerical solutions are available, and three 'validation' problems that model laboratory fluid and heat flow experiments. All problems could be run without any code modifications (*). Good and efficient numerical performance, as well as accurate results, were obtained throughout. Additional code verification and validation problems from the literature are briefly summarized, and suggestions are given for proper applications of TOUGH and related codes.

  1. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
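
    A library of this kind ultimately automates tolerance-based cross-checks between programs. The sketch below shows such a check; the tolerance values and the joint-angle histories attributed to the codes are hypothetical.

    ```python
    def cross_check(traj_a, traj_b, rtol=1e-3, atol=1e-6):
        """Point-wise comparison of two codes' time histories for the same
        benchmark problem; returns the indices where they disagree."""
        bad = []
        for i, (a, b) in enumerate(zip(traj_a, traj_b)):
            if abs(a - b) > atol + rtol * max(abs(a), abs(b)):
                bad.append(i)
        return bad

    # Hypothetical joint-angle histories from two multibody codes.
    dads   = [0.00, 0.10, 0.19, 0.27, 0.34]
    discos = [0.00, 0.10, 0.19, 0.28, 0.34]
    print(cross_check(dads, discos))  # [3]: flag the sample that disagrees
    ```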

  2. Verification of Algebra Step Problems: A Chronometric Study of Human Problem Solving. Technical Report No. 253. Psychology and Education Series.

    ERIC Educational Resources Information Center

    Matthews, Paul G.; Atkinson, Richard C.

    This paper reports an experiment designed to test theoretical relations among fast problem solving, more complex and slower problem solving, and research concerning fundamental memory processes. Using a cathode ray tube, subjects were presented with propositions of the form "Y is in list X" which they memorized. In later testing they were asked to…

  3. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  4. Sierra/Aria 4.48 Verification Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal Fluid Development Team

    Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite, and the results of each test are checked under mesh refinement against the correct analytic result. For each of the tests presented in this document, the test setup, derivation of the analytic solution, and comparison of the code results to the analytic solution are provided. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems.

  5. Sierra/SolidMechanics 4.46 Example Problems Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose

    Presented in this document are tests that exist in the Sierra/SolidMechanics example problem suite, which is a subset of the Sierra/SM regression and performance test suite. These examples showcase common and advanced code capabilities. A wide variety of other regression and verification tests exist in the Sierra/SM test suite that are not included in this manual.

  6. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the adoption of integrated modular avionics (IMA). While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a modular platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are difficult to isolate, so IMA verification faces a critical problem: how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but exhaustively testing a large, highly integrated avionics system is impractical. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the number of test processes and improving efficiency, and consequently lowering the cost of IMA system integration.

  7. Testing Dialog-Verification of SIP Phones with Single-Message Denial-of-Service Attacks

    NASA Astrophysics Data System (ADS)

    Seedorf, Jan; Beckers, Kristian; Huici, Felipe

    The Session Initiation Protocol (SIP) is widely used for signaling in multimedia communications. However, many SIP implementations are still in their infancy and vulnerable to malicious messages. We investigate flaws in the SIP implementations of eight phones, showing that the deficient verification of SIP dialogs further aggravates the problem by making it easier for attacks to succeed. Our results show that the majority of the phones we tested are susceptible to these attacks.
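
    The dialog verification the paper examines amounts to matching an incoming request against stored dialog state (Call-ID and From/To tags, per RFC 3261) before acting on it. The sketch below is a simplified illustration of that check, not code from any of the tested phones; the message and dialog values are hypothetical.

    ```python
    def belongs_to_dialog(msg, dialog):
        """Accept an in-dialog request (e.g., a BYE) only if its Call-ID and
        From/To tags match an established dialog (cf. RFC 3261); phones that
        skip this check can be torn down by a single forged message."""
        return (msg.get("Call-ID") == dialog["call_id"]
                and msg.get("from_tag") == dialog["remote_tag"]
                and msg.get("to_tag") == dialog["local_tag"])

    dialog = {"call_id": "a84b4c76e66710", "remote_tag": "1928301774",
              "local_tag": "as83kd9bs"}
    forged_bye = {"Call-ID": "a84b4c76e66710", "from_tag": "guess",
                  "to_tag": "as83kd9bs"}
    print(belongs_to_dialog(forged_bye, dialog))  # False: reject the BYE
    ```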

  8. Verification assessment of piston boundary conditions for Lagrangian simulation of compressible flow similarity solutions

    DOE PAGES

    Ramsey, Scott D.; Ivancic, Philip R.; Lilieholm, Jennifer F.

    2015-12-10

    This work is concerned with the use of similarity solutions of the compressible flow equations as benchmarks or verification test problems for finite-volume compressible flow simulation software. In practice, this effort can be complicated by the infinite spatial/temporal extent of many candidate solutions or “test problems.” Methods can be devised with the intention of ameliorating this inconsistency with the finite nature of computational simulation; the exact strategy will depend on the code and problem archetypes under investigation. For example, self-similar shock wave propagation can be represented in Lagrangian compressible flow simulations as rigid boundary-driven flow, even if no such “piston” is present in the counterpart mathematical similarity solution. The purpose of this work is to investigate in detail the methodology of representing self-similar shock wave propagation as a piston-driven flow in the context of various test problems featuring simple closed-form solutions of infinite spatial/temporal extent. The closed-form solutions allow for the derivation of similarly closed-form piston boundary conditions (BCs) for use in Lagrangian compressible flow solvers. Finally, the consequences of utilizing these BCs (as opposed to directly initializing the self-similar solution in a computational spatial grid) are investigated in terms of common code verification analysis metrics (e.g., shock strength/position errors and global convergence rates).

  9. Verification assessment of piston boundary conditions for Lagrangian simulation of compressible flow similarity solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D.; Ivancic, Philip R.; Lilieholm, Jennifer F.

    This work is concerned with the use of similarity solutions of the compressible flow equations as benchmarks or verification test problems for finite-volume compressible flow simulation software. In practice, this effort can be complicated by the infinite spatial/temporal extent of many candidate solutions or “test problems.” Methods can be devised with the intention of ameliorating this inconsistency with the finite nature of computational simulation; the exact strategy will depend on the code and problem archetypes under investigation. For example, self-similar shock wave propagation can be represented in Lagrangian compressible flow simulations as rigid boundary-driven flow, even if no such “piston” is present in the counterpart mathematical similarity solution. The purpose of this work is to investigate in detail the methodology of representing self-similar shock wave propagation as a piston-driven flow in the context of various test problems featuring simple closed-form solutions of infinite spatial/temporal extent. The closed-form solutions allow for the derivation of similarly closed-form piston boundary conditions (BCs) for use in Lagrangian compressible flow solvers. Finally, the consequences of utilizing these BCs (as opposed to directly initializing the self-similar solution in a computational spatial grid) are investigated in terms of common code verification analysis metrics (e.g., shock strength/position errors and global convergence rates).

  10. Tethered satellite system dynamics and control review panel and related activities, phase 3

    NASA Technical Reports Server (NTRS)

    1991-01-01

    Two major tests of the Tethered Satellite System (TSS) engineering and flight units were conducted to demonstrate the functionality of the hardware and software. Deficiencies in the hardware/software integration tests (HSIT) led to a recommendation for more testing to be performed. Selected problem areas of tether dynamics were analyzed, including verification of the severity of skip rope oscillations, verification or comparison runs to explore dynamic phenomena observed in other simulations, and data generation runs to explore the performance of the time domain and frequency domain skip rope observers.

  11. Computational-hydrodynamic studies of the Noh compressible flow problem using non-ideal equations of state

    NASA Astrophysics Data System (ADS)

    Honnell, Kevin; Burnett, Sarah; Yorke, Chloe'; Howard, April; Ramsey, Scott

    2017-06-01

    The Noh problem is a classic verification problem in the field of compressible flows. Simple to conceptualize, it is nonetheless difficult for numerical codes to predict correctly, making it an ideal code-verification test bed. In its original incarnation, the fluid is a simple ideal gas; once validated, however, these codes are often used to study highly non-ideal fluids and solids. In this work, the classic Noh problem is extended beyond the commonly-studied polytropic ideal gas to more realistic equations of state (EOS), including the stiff gas, the Noble-Abel gas, and the Carnahan-Starling hard-sphere fluid, thus enabling verification studies to be performed on more physically-realistic fluids. Exact solutions are compared with numerical results obtained from the Lagrangian hydrocode FLAG, developed at Los Alamos. For these more realistic EOSs, the simulation errors decreased in magnitude both at the origin and at the shock, but also spread more broadly about these points compared to the ideal EOS. The overall spatial convergence rate remained first order.
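
    For reference, the commonly-studied ideal-gas baseline that this work generalizes has a simple closed form. The sketch below implements it; the geometry factors follow the standard Noh solution, and the sampled points are illustrative (this is not the paper's non-ideal extension).

    ```python
    def noh_exact(r, t, dim, gamma=5.0 / 3.0, rho0=1.0, u0=-1.0):
        """Exact ideal-gas Noh solution (density, velocity, pressure) at
        radius r > 0 and time t > 0, for dim = 1, 2, 3 (planar,
        cylindrical, spherical); rho0, u0 give the uniform initial state."""
        shock_speed = 0.5 * (gamma - 1.0) * abs(u0)
        if r < shock_speed * t:                       # shocked, stagnant core
            rho = rho0 * ((gamma + 1.0) / (gamma - 1.0)) ** dim
            return rho, 0.0, 0.5 * (gamma - 1.0) * rho * u0 * u0
        # Unshocked in-flowing gas, geometrically compressed for dim > 1.
        rho = rho0 * (1.0 - u0 * t / r) ** (dim - 1)
        return rho, u0, 0.0

    print(noh_exact(0.1, 0.6, dim=3))  # core: (64.0, 0.0, 21.333...)
    print(noh_exact(0.5, 0.6, dim=3))  # pre-shock: (4.84, -1.0, 0.0)
    ```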

  12. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) Develop software tools to support code verification analysis; 2) Document standard definitions of code verification test problems; and 3) Perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.

  13. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (e.g., sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy-to-apply and accurate method for addressing verification bias in studies of multiple screening methods.
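
    The paper's method is a weighted generalized estimating equation model; as a simpler illustration of the same inverse-probability-weighting idea, the sketch below applies a Begg-Greenes-style correction that is valid when verification depends only on the screening result. The study counts are hypothetical.

    ```python
    def corrected_accuracy(n_pos, n_neg, v_pos, v_neg, d_pos, d_neg):
        """Begg-Greenes-style correction for verification bias when the
        chance of gold-standard verification depends only on the screening
        result. n_*: screened +/-; v_*: number verified; d_*: verified
        subjects found diseased. (Simpler than the paper's weighted GEE.)"""
        p_d_given_pos = d_pos / v_pos          # P(D+ | T+), from verified
        p_d_given_neg = d_neg / v_neg          # P(D+ | T-), from verified
        tp = n_pos * p_d_given_pos             # reweighted counts
        fn = n_neg * p_d_given_neg
        fp = n_pos * (1.0 - p_d_given_pos)
        tn = n_neg * (1.0 - p_d_given_neg)
        return tp / (tp + fn), tn / (tn + fp)  # sensitivity, specificity

    # Hypothetical study: all screen-positives verified, 10% of negatives.
    se, sp = corrected_accuracy(n_pos=200, n_neg=1800, v_pos=200,
                                v_neg=180, d_pos=80, d_neg=9)
    print(f"sensitivity={se:.3f} specificity={sp:.3f}")
    # Naive verified-only sensitivity would be 80/89 ~ 0.90: badly inflated.
    ```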

  14. Verification bias an underrecognized source of error in assessing the efficacy of medical imaging.

    PubMed

    Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B

    2011-03-01

    Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias," and is a common problem in imaging research. The purpose of our study is to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present and searched for author recognition of verification bias in the design. During 3 years, these journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was respectively 36.4%, 23.4%, 29.5%, and 13.4% for AJR, Academic Radiology, Radiology, and EJR. The total fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged.

  15. Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows

    DOE PAGES

    Xia, Yidong; Wang, Chuanjin; Luo, Hong; ...

    2015-12-15

    Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent for this test problem suite is to provide baseline comparison data that demonstrates the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models were used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these turbulence models in Hydra-TH. Where possible, we have attempted some form of solution verification to identify sensitivities in the solution methods, and to suggest best practices when using the Hydra-TH code.

  16. Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong; Wang, Chuanjin; Luo, Hong

    Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent for this test problem suite is to provide baseline comparison data that demonstrates the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models were used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these turbulence models in Hydra-TH. Where possible, we have attempted some form of solution verification to identify sensitivities in the solution methods, and to suggest best practices when using the Hydra-TH code.

  17. Verification Tests for Sierra/SM's Reproducing Kernel Particle Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giffin, Brian D.

    2015-09-01

    This report seeks to verify the proper implementation of RKPM within Sierra by comparing the results from several basic example problems executed with RKPM against the analytical and FEM solutions for these same problems. This report was compiled as a summer student intern project.

  18. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon

    The RELAP-7 code verification and validation activities are ongoing under the code assessment plan proposed in the previous document (INL-EXT-16-40015). Among the list of V&V test problems in the ‘RELAP-7 code V&V RTM (Requirements Traceability Matrix)’, the RELAP-7 7-equation model has been tested with additional demonstration problems and the results of these tests are reported in this document. In this report, we describe the testing process, the test cases that were conducted, and the results of the evaluation.

  20. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
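
    Run-time monitoring of the kind surveyed here wraps a function with an invariant check and a safe fallback. The sketch below is a generic illustration; the control law, invariant, and fallback value are hypothetical, not from any flight system.

    ```python
    def monitored(check, fallback):
        """Run-time monitor: wrap a flight-software function with an output
        invariant; on violation, log and return a safe fallback value."""
        def wrap(fn):
            def inner(*args):
                out = fn(*args)
                if not check(out):
                    print(f"monitor: {fn.__name__} violated invariant")
                    return fallback
                return out
            return inner
        return wrap

    @monitored(check=lambda duty: 0.0 <= duty <= 1.0, fallback=0.0)
    def heater_duty_cycle(temp_k):
        # Hypothetical control law; drifts out of range for cold readings.
        return (283.0 - temp_k) / 20.0

    print(heater_duty_cycle(278.0))  # 0.25: within the invariant
    print(heater_duty_cycle(243.0))  # 2.0 caught; fallback 0.0 returned
    ```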

  1. Bias in estimating accuracy of a binary screening test with differential disease verification

    PubMed Central

    Brinton, John T.; Ringham, Brandy M.; Glueck, Deborah H.

    2011-01-01

    Sensitivity, specificity, positive and negative predictive value are typically used to quantify the accuracy of a binary screening test. In some studies it may not be ethical or feasible to obtain definitive disease ascertainment for all subjects using a gold standard test. When a gold standard test cannot be used, an imperfect reference test that is less than 100% sensitive and specific may be used instead. In breast cancer screening, for example, follow-up for cancer diagnosis is used as an imperfect reference test for women for whom it is not possible to obtain gold standard results. This incomplete ascertainment of true disease, or differential disease verification, can result in biased estimates of accuracy. In this paper, we derive the apparent accuracy values for studies subject to differential verification. We determine how the bias is affected by the accuracy of the imperfect reference test, the percentage of subjects who receive the imperfect reference test rather than the gold standard, the prevalence of the disease, and the correlation between the results of the screening test and the imperfect reference test. It is shown that designs with differential disease verification can yield biased estimates of accuracy. Estimates of sensitivity in cancer screening trials may be substantially biased. However, careful design decisions, including selection of the imperfect reference test, can help to minimize bias. A hypothetical breast cancer screening study is used to illustrate the problem. PMID:21495059

  2. Status of BOUT fluid turbulence code: improvements and verification

    NASA Astrophysics Data System (ADS)

    Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.

    2006-10-01

    BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand the code results better and to gain more confidence in it motivated investing effort in rigorous verification of BOUT. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.

  3. A new technique for measuring listening and reading literacy in developing countries

    NASA Astrophysics Data System (ADS)

    Greene, Barbara A.; Royer, James M.; Anzalone, Stephen

    1990-03-01

    One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high ability students in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.

  4. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  5. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy (Compiler); Kim, Youngkwang; Conway, Claire (Compiler); Conway, Darrel

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  6. Verification of ANSYS Fluent and OpenFOAM CFD platforms for prediction of impact flow

    NASA Astrophysics Data System (ADS)

    Tisovská, Petra; Peukert, Pavel; Kolář, Jan

    The main goal of the article is a verification of the heat transfer coefficient numerically predicted by two CFD platforms, ANSYS Fluent and OpenFOAM, on the problem of impact flow from a 2D nozzle. Various mesh parameters and solver settings were tested under several boundary conditions and compared to known experimental results. The best solver setting, suitable for further optimization of more complex geometries, is identified.

  7. Comprehensive test ban negotiations

    NASA Astrophysics Data System (ADS)

    Grab, G. Allen; Heckrotte, Warren

    1983-10-01

    Although it has been a stated policy goal of American and Soviet leaders since 1958 (with the exception of Ronald Reagan), the world today is still without a Comprehensive Test Ban Treaty. Throughout their history, test ban negotiations have been plagued by a number of persistent problems. Chief among these is East-West differences on the verification question, with the United States concerned about the problem of possible Soviet cheating and the USSR concerned about the protection of its national sovereignty. In addition, internal bureaucratic politics have played a major role in preventing the successful conclusion of an agreement. Despite these problems, the superpowers have concluded several significant partial measures: a brief (1958-1961) total moratorium on nuclear weapons tests; the Limited Test Ban Treaty of 1963, banning tests in the air, water and outer space; the Threshold Test Ban Treaty of 1974 (150 KT limit on underground explosions); and the Peaceful Nuclear Explosions Treaty of 1976 (150 KT limit on individual PNEs). Today, the main U.S. objections to a CTBT come from the nuclear weapons laboratories, the Department of Energy, and the Pentagon, who all stress the issues of stockpile reliability and verification. Those who remain committed to a CTBT emphasize the potential political leverage it offers in checking both horizontal and vertical proliferation.

  8. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  9. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems, and perform validation on the formal RMP specifications. The validation analysis helped identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  10. APPLICATION OF STEEL PIPE PILE LOADING TESTS TO DESIGN VERIFICATION OF FOUNDATION OF THE TOKYO GATE BRIDGE

    NASA Astrophysics Data System (ADS)

    Saitou, Yutaka; Kikuchi, Yoshiaki; Kusakabe, Osamu; Kiyomiya, Osamu; Yoneyama, Haruo; Kawakami, Taiji

    Steel pipe sheet pile foundations with large-diameter steel pipe sheet piles were used for the foundation of the main pier of the Tokyo Gate Bridge. However, for large-diameter steel pipe piles, the bearing mechanism, including the pile tip plugging effect, is still unclear due to a lack of practical examinations, even though loading tests were performed on the Trans-Tokyo Bay Highway. In light of these problems, static pile loading tests in both the vertical and horizontal directions, a dynamic loading test, and cone penetration tests were conducted to determine proper design parameters of the ground for the foundations. Design parameters were determined rationally based on the test results, and a rational design verification was obtained from this research.

  11. Simulation and Analysis of Converging Shock Wave Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D.; Shashkov, Mikhail J.

    2012-06-21

    Results and analysis pertaining to the simulation of the Guderley converging shock wave test problem (and associated code verification hydrodynamics test problems involving converging shock waves) in the LANL ASC radiation-hydrodynamics code xRAGE are presented. One-dimensional (1D) spherical and two-dimensional (2D) axi-symmetric geometric setups are utilized and evaluated in this study, as is an instantiation of the xRAGE adaptive mesh refinement capability. For the 2D simulations, a 'Surrogate Guderley' test problem is developed and used to obviate subtleties inherent to the true Guderley solution's initialization on a square grid, while still maintaining a high degree of fidelity to the original problem, and minimally straining the general credibility of associated analysis and conclusions.

  12. BIGHORN Computational Fluid Dynamics Theory, Methodology, and Code Verification & Validation Benchmark Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong; Andrs, David; Martineau, Richard Charles

    This document presents the theoretical background for a hybrid finite-element / finite-volume fluid flow solver, namely BIGHORN, based on the Multiphysics Object Oriented Simulation Environment (MOOSE) computational framework developed at the Idaho National Laboratory (INL). An overview of the numerical methods used in BIGHORN is given, followed by a presentation of the formulation details. The document begins with the governing equations for compressible fluid flow, with an outline of the requisite constitutive relations. A second-order finite volume method used for solving compressible fluid flow problems is presented next, as is a Pressure-Corrected Implicit Continuous-fluid Eulerian (PCICE) formulation for time integration. Although the multi-fluid formulation is not yet fully developed, BIGHORN has been designed to handle multi-fluid problems. Due to the flexibility of the underlying MOOSE framework, BIGHORN is quite extensible and can accommodate both multi-species and multi-phase formulations. This document also presents a suite of verification & validation benchmark test problems for BIGHORN. The intent of this suite is to provide baseline comparison data that demonstrates the performance of the BIGHORN solution methods on problems that vary in complexity from laminar to turbulent flows. Wherever possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using BIGHORN.

  13. Is it Code Imperfection or 'garbage in Garbage Out'? Outline of Experiences from a Comprehensive Adr Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR (advection-diffusion-reaction) equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear, so numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the context of computational PDEs, verification is not a well-defined procedure with a clear path: verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution, and even test results need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. Code verification is therefore something of an art, in which innovative methods and case-based tricks are very common. This study presents the full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections, and we convey our experiences in finding several errors that were not detectable with routine verification techniques. The test suite includes hundreds of unit tests and system tests, increasing gradually in complexity from simple cases to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. We thereby provide objective, quantitative values, as opposed to subjective, qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. Testing starts from a simple case of unidirectional advection, proceeds through bidirectional advection and tidal flow, and builds up to nonlinear cases, with tests designed to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests, which compare the results' observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study, and appropriate remedies, are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
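
    As a concrete illustration of the convergence-order metric described above, the observed order of accuracy can be estimated from error norms measured on successively refined grids. A minimal Python sketch, assuming a uniform refinement ratio and externally computed error norms (a generic check, not the authors' test harness):

        import numpy as np

        def observed_order(errors, ratio=2.0):
            """Estimate the observed order of accuracy from error norms on a
            sequence of grids, each refined by `ratio` (h, h/2, h/4, ...)."""
            errors = np.asarray(errors, dtype=float)
            # p = log(e_coarse / e_fine) / log(refinement ratio)
            return np.log(errors[:-1] / errors[1:]) / np.log(ratio)

        # Example: error norms roughly quartering per refinement (2nd order).
        print(observed_order([4.0e-2, 1.0e-2, 2.6e-3]))  # ~[2.0, 1.94]

    The observed orders are then compared against the formal order of the discretization; a persistent mismatch on smooth benchmark problems is the classic symptom of a coding error rather than an algorithmic limitation.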

  14. A Single-Boundary Accumulator Model of Response Times in an Addition Verification Task

    PubMed Central

    Faulkenberry, Thomas J.

    2017-01-01

    Current theories of mathematical cognition offer competing accounts of the interplay between encoding and calculation in mental arithmetic. Additive models propose that manipulations of problem format do not interact with the cognitive processes used in calculation. Alternatively, interactive models suppose that format manipulations have a direct effect on calculation processes. In the present study, we tested these competing models by fitting participants' RT distributions in an arithmetic verification task with a single-boundary accumulator model (the shifted Wald distribution). We found that in addition to providing a more complete description of RT distributions, the accumulator model afforded a potentially more sensitive test of format effects. Specifically, we found that format affected drift rate, which implies that problem format has a direct impact on calculation processes. These data give further support for an interactive model of mental arithmetic. PMID:28769853
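
    The shifted Wald is the inverse Gaussian density with an added shift (non-decision time) parameter, so RT distributions of this kind can be fit with SciPy's generic maximum-likelihood routine. A minimal sketch, assuming simulated response times and SciPy's parameterization (not the authors' fitting code):

        import numpy as np
        from scipy import stats

        # Simulated response times (s): inverse Gaussian shifted by ~0.3 s.
        rts = stats.invgauss.rvs(mu=0.5, loc=0.3, scale=1.0, size=500,
                                 random_state=np.random.default_rng(0))

        # Maximum-likelihood fit of the shifted Wald (= shifted inverse
        # Gaussian): mu relates to drift rate, scale to the decision
        # boundary, and loc to the shift (non-decision time).
        mu, loc, scale = stats.invgauss.fit(rts)
        print(f"mu={mu:.3f}, shift={loc:.3f}, scale={scale:.3f}")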

  15. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lawrence, Chris C.; Flaska, Marek; Pozzi, Sara A.

    2016-08-14

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty-verification challenges.

  16. Warhead verification as inverse problem: Applications of neutron spectrum unfolding from organic-scintillator measurements

    NASA Astrophysics Data System (ADS)

    Lawrence, Chris C.; Febbraro, Michael; Flaska, Marek; Pozzi, Sara A.; Becchetti, F. D.

    2016-08-01

    Verification of future warhead-dismantlement treaties will require detection of certain warhead attributes without the disclosure of sensitive design information, and this presents an unusual measurement challenge. Neutron spectroscopy—commonly eschewed as an ill-posed inverse problem—may hold special advantages for warhead verification by virtue of its insensitivity to certain neutron-source parameters like plutonium isotopics. In this article, we investigate the usefulness of unfolded neutron spectra obtained from organic-scintillator data for verifying a particular treaty-relevant warhead attribute: the presence of high-explosive and neutron-reflecting materials. Toward this end, several improvements on current unfolding capabilities are demonstrated: deuterated detectors are shown to have superior response-matrix condition to that of standard hydrogen-based scintillators; a novel data-discretization scheme is proposed which removes important detector nonlinearities; and a technique is described for re-parameterizing the unfolding problem in order to constrain the parameter space of solutions sought, sidestepping the inverse problem altogether. These improvements are demonstrated with trial measurements and verified using accelerator-based time-of-flight calculation of reference spectra. Then, a demonstration is presented in which the elemental compositions of low-Z neutron-attenuating materials are estimated to within 10%. These techniques could have direct application in verifying the presence of high-explosive materials in a neutron-emitting test item, as well as in other treaty verification challenges.

  17. The escape of high explosive products: An exact-solution problem for verification of hydrodynamics codes

    DOE PAGES

    Doebling, Scott William

    2016-10-22

    This paper documents the escape of high explosive (HE) products problem. The problem, first presented by Fickett & Rivard, tests the implementation and numerical behavior of a high explosive detonation and energy release model and its interaction with an associated compressible hydrodynamics simulation code. The problem simulates the detonation of a finite-length, one-dimensional piece of HE that is driven by a piston from one end and adjacent to a void at the other end. The HE equation of state is modeled as a polytropic ideal gas. The HE detonation is assumed to be instantaneous with an infinitesimal reaction zone. Via judicious selection of the material specific heat ratio, the problem has an exact solution with linear characteristics, enabling a straightforward calculation of the physical variables as a function of time and space. Lastly, implementation of the exact solution in the Python code ExactPack is discussed, as are verification cases for the exact solution code.

  18. Real-Time Ada Problem Study

    DTIC Science & Technology

    1989-03-24

    Specified Test Verification Matrix ... Test Generation Assistance ... Maintenance ... lack of intimate knowledge of how the runtime links to the compiler-generated code. Furthermore, the runtime must meet a rigorous set of tests to ensure ... projects, and is not provided. Along with the library, a set of tests should be provided to verify the accuracy of the library after changes have been

  19. Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking

    NASA Technical Reports Server (NTRS)

    Cavada, Roberto; Pecheur, Charles

    2003-01-01

    This document reports on the activities carried out during a four-week visit by Roberto Cavada to the NASA Ames Research Center. The main goal was to test the practical applicability of the proposed framework, in which a diagnosability problem is reduced to a Symbolic Model Checking problem. Section 2 contains a brief explanation of the major techniques currently used in Symbolic Model Checking, and of how these techniques can be tuned to obtain good performance when using Model Checking tools. Diagnosability is performed on large, structured models of real plants; Section 3 describes how these plants are modeled, and how models can be simplified to improve the performance of Symbolic Model Checkers. Section 4 reports scalability results: three test cases are briefly presented, and several parameters and techniques were applied to those test cases in order to produce comparison tables, together with a comparison between several Model Checkers. Section 5 summarizes the application of diagnosability verification to a real application, where several properties were tested and the results highlighted. Finally, Section 6 draws some conclusions and outlines future lines of research.

  20. PERFORMANCE TESTING OF AIR CLEANING PRODUCTS

    EPA Science Inventory

    The paper discusses the application of the Environmental Technology Verification (ETV) Program for products that clean ventilation air to the problem of protecting buildings from chemical and biological attack. This program is funded by the U.S. Environmental Protection Agency und...

  1. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems, since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  2. NAS Grid Benchmarks. 1.0

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob; Frumkin, Michael; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We provide a paper-and-pencil specification of a benchmark suite for computational grids. It is based on the NAS (NASA Advanced Supercomputing) Parallel Benchmarks (NPB) and is called the NAS Grid Benchmarks (NGB). NGB problems are presented as data flow graphs encapsulating an instance of a slightly modified NPB task in each graph node, which communicates with other nodes by sending/receiving initialization data. Like NPB, NGB specifies several different classes (problem sizes). In this report we describe classes S, W, and A, and provide verification values for each. The implementor has the freedom to choose any language, grid environment, security model, fault tolerance/error correction mechanism, etc., as long as the resulting implementation passes the verification test and reports the turnaround time of the benchmark.

  3. Assessment of a Hybrid Continuous/Discontinuous Galerkin Finite Element Code for Geothermal Reservoir Simulations

    DOE PAGES

    Xia, Yidong; Podgorney, Robert; Huang, Hai

    2016-03-17

    FALCON (“Fracturing And Liquid CONvection”) is a hybrid continuous/discontinuous Galerkin finite element geothermal reservoir simulation code based on the MOOSE (“Multiphysics Object-Oriented Simulation Environment”) framework being developed and used for multiphysics applications. In the present work, a suite of verification and validation (“V&V”) test problems for FALCON was defined to meet the design requirements, and solved in the interest of enhanced geothermal system (“EGS”) design. The intent of this test problem suite is to provide baseline comparison data that demonstrates the performance of the FALCON solution methods. The simulation problems vary in complexity from single thermal or mechanical processes to coupled thermo-hydro-mechanical processes in geological porous media. Numerical results obtained by FALCON agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these capabilities in FALCON. Where possible, some form of solution verification has been attempted to identify sensitivities in the solution methods and to suggest best practices when using the FALCON code.

  4. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  5. Students' Use of Technological Tools for Verification Purposes in Geometry Problem Solving

    ERIC Educational Resources Information Center

    Papadopoulos, Ioannis; Dagdilelis, Vassilios

    2008-01-01

    Despite its importance in mathematical problem solving, verification receives rather little attention by the students in classrooms, especially at the primary school level. Under the hypotheses that (a) non-standard tasks create a feeling of uncertainty that stimulates the students to proceed to verification processes and (b) computational…

  6. Compressive sensing using optimized sensing matrix for face verification

    NASA Astrophysics Data System (ADS)

    Oey, Endra; Jeffry; Wongso, Kelvin; Tommy

    2017-12-01

    Biometrics offers a solution to problems that arise with password-based data access: passwords may be forgotten, and recalling many different passwords is difficult. With biometrics, the physical characteristics of a person can be captured and used in the identification process. In this research, facial biometrics is used in the verification process to determine whether a user has the authority to access data; it was chosen for its low-cost implementation and reasonably accurate identification results. The face verification system adopted in this research is based on the Compressive Sensing (CS) technique, which reduces the dimensionality of, and encrypts, the facial test image by representing it as a sparse signal. The encrypted data can be reconstructed using a sparse coding algorithm; two such algorithms, Orthogonal Matching Pursuit (OMP) and Iteratively Reweighted Least Squares-ℓp (IRLS-ℓp), are compared in this face verification research. The reconstructed sparse signals are then compared, via the Euclidean norm, with the user's sparse signal previously stored in the system to determine the validity of the facial test image. With a non-optimized sensing matrix, the system accuracy obtained in this research is 99% for IRLS with a face verification response time of 4.917 seconds, and 96.33% for OMP with a response time of 0.4046 seconds; with an optimized sensing matrix, the accuracy is 99% for IRLS with a response time of 13.4791 seconds, and 98.33% for OMP with a response time of 3.1571 seconds.
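
    A minimal sketch of the verification step as described, i.e. sparse recovery with OMP followed by a Euclidean-norm comparison against the enrolled user's sparse code. The dictionary, sensing matrix, sparsity level, and acceptance threshold below are all illustrative assumptions, not values from the paper:

        import numpy as np
        from sklearn.linear_model import OrthogonalMatchingPursuit

        rng = np.random.default_rng(1)
        n_pixels, n_atoms, n_meas = 1024, 200, 128
        D = rng.standard_normal((n_pixels, n_atoms))    # face dictionary
        Phi = rng.standard_normal((n_meas, n_pixels))   # sensing matrix

        def sparse_code(image, k=10):
            """Recover a k-sparse representation of a compressed image."""
            y = Phi @ image                             # CS measurement
            omp = OrthogonalMatchingPursuit(n_nonzero_coefs=k)
            omp.fit(Phi @ D, y)
            return omp.coef_

        enrolled = sparse_code(D[:, 3] + 0.01 * rng.standard_normal(n_pixels))
        probe = sparse_code(D[:, 3] + 0.05 * rng.standard_normal(n_pixels))
        dist = np.linalg.norm(probe - enrolled)         # Euclidean norm
        print("accept" if dist < 0.5 else "reject")     # threshold assumed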

  7. Vibroacoustic Response of Pad Structures to Space Shuttle Launch Acoustic Loads

    NASA Technical Reports Server (NTRS)

    Margasahayam, R. N.; Caimi, Raoul E.

    1995-01-01

    This paper presents a deterministic theory for the random vibration problem of predicting the response of structures in the low-frequency range (0 to 20 hertz) of launch transients. Also presented are some innovative ways to characterize noise, along with highlights of the ongoing test-analysis correlation effort, the Verification Test Article (VETA) project.

  8. Expert system verification and validation guidelines/workshop task. Deliverable no. 1: ES V/V guidelines

    NASA Technical Reports Server (NTRS)

    French, Scott W.

    1991-01-01

    The goals are to show that verifying and validating a software system is a required part of software development and has a direct impact on the software's design and structure. Workshop tasks are given in the areas of statistics, integration/system test, unit and architectural testing, and a traffic controller problem.

  9. Problems experienced and envisioned for dynamical physical systems

    NASA Technical Reports Server (NTRS)

    Ryan, R. S.

    1985-01-01

    The use of high-performance systems, the trend in future space systems, naturally leads to lower margins and a higher sensitivity to parameter variations and, therefore, to more problems with dynamical physical systems. To circumvent dynamic problems in these systems, appropriate design, verification analyses, and tests must be planned and conducted. The basic design goal is to define the problem before it occurs. The primary approach for meeting this goal is a good understanding and review of the problems experienced in the past in terms of the system under design. This paper reviews many of the dynamic problems experienced in space systems design and operation, categorizes them as to causes, and envisions future program implications, developing recommendations for analysis and test approaches.

  10. Verification of MCNP6.2 for Nuclear Criticality Safety Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-05-10

    Several suites of verification/validation benchmark problems were run in early 2017 to verify that the new production release of MCNP6.2 performs correctly for nuclear criticality safety (NCS) applications. MCNP6.2 results for several NCS validation suites were compared to the results from MCNP6.1 [1] and MCNP6.1.1 [2]. MCNP6.1 is the production version of MCNP® released in 2013, and MCNP6.1.1 is the update released in 2014. MCNP6.2 includes all of the standard features for NCS calculations that have been available for the past 15 years, along with new features for sensitivity-uncertainty based methods for NCS validation [3]. Results from the benchmark suites were compared with results from previous verification testing [4-8]. Criticality safety analysts should consider testing MCNP6.2 on their particular problems and validation suites. No further development of MCNP5 is planned. MCNP6.1 is now 4 years old, and MCNP6.1.1 is now 3 years old. In general, released versions of MCNP are supported only for about 5 years, due to resource limitations. All future MCNP improvements, bug fixes, user support, and new capabilities are targeted only to MCNP6.2 and beyond.

  11. Litmus tests for verification of feeding tube location in infants: evaluation of their clinical use.

    PubMed

    Nyqvist, Kerstin Hedberg; Sorell, Annette; Ewald, Uwe

    2005-04-01

    To examine the clinical use of litmus paper tests for the assessment of aspirates in infants. In connection with establishing a programme for home care of infants requiring tube feeding, with parents as the infants' carers, nurses identified the need for a research-based method for verification of feeding tube position as a complement to other methods. In adult care, the litmus paper test is commonly used when visual inspection is not sufficient for assessment of aspirates obtained from feeding tubes. Observational study. Nurses performed litmus tests for verification of feeding tube location in a convenience sample of 60 infants born at a gestational age (GA) of 24-42 weeks. The presence/absence and volumes of aspirates were recorded, as well as positive/negative litmus test reactions, and analyses of the association between test results and the infants' GA, postmenstrual age, and postnatal age at the time of the tests were conducted. Data were obtained from 2970 tube feeds. Aspirates were present on 1840 occasions (62%). A higher proportion of infants with absence of aspirates were born at a GA below 32 weeks. A positive reaction occurred in 97% of the tests, in volumes between 0.01 and 22 ml. Birth at a GA below 32 weeks and respiratory problems were associated with negative tests. The high ratio of positive litmus reactions at all maturational levels supports the bedside analysis of pH in gastric aspirates for verification of feeding tube location. Application of pH indicator paper is recommended as a complementary method for assessment of aspirates from feeding tubes.

  12. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
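
    A minimal sketch of the pipeline the abstract describes (PCA for dimensionality reduction, then a small neural network classifying each frame as tumor-in-aperture or not), using scikit-learn stand-ins; the image sizes, labels, and network shape are assumptions, not the authors' implementation:

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier
        from sklearn.pipeline import make_pipeline

        rng = np.random.default_rng(2)
        # Stand-ins for DRR training images with simulated tumor shifts:
        # rows are flattened images; y = 1 (in aperture) or 0 (outside).
        X = rng.standard_normal((400, 64 * 64))
        y = rng.integers(0, 2, size=400)

        model = make_pipeline(
            PCA(n_components=20),
            MLPClassifier(hidden_layer_sizes=(16,), max_iter=500))
        model.fit(X, y)

        # At treatment time each incoming cine EPID frame is classified;
        # a 0 ("outside the aperture") would trigger gating or an alarm.
        frame = rng.standard_normal((1, 64 * 64))
        print(model.predict(frame))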

  13. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each piece of software used to generate the nuclear data libraries are discussed, and specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree that ensures (1) it can be run without user intervention and (2) the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library's release is described in detail.

  14. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  15. Inspection and Verification of Domain Models with PlanWorks and Aver

    NASA Technical Reports Server (NTRS)

    Bedrax-Weiss, Tania; Frank, Jeremy; Iatauro, Michael; McGann, Conor

    2006-01-01

    When developing a domain model, it seems natural to bring the traditional informal tools of inspection and verification, debuggers and automated test suites, to bear upon the problems that will inevitably arise. Debuggers that allow inspection of registers and memory and stepwise execution have been a staple of software development of all sorts from the very beginning. Automated testing has repeatedly proven its considerable worth, to the extent that an entire design philosophy (Test Driven Development) has been developed around the writing of tests. Unfortunately, while not entirely without their uses, the limitations of these tools and the nature of the complexity of models and the underlying planning systems make the diagnosis of certain classes of problems and the verification of their solutions difficult or impossible. Debuggers provide a good local view of executing code, allowing a fine-grained look at algorithms and data. This view is, however, usually only at the level of the current scope in the implementation language, and the data-inspection capabilities of most debuggers usually consist of on-line print statements. More modern graphical debuggers offer a sort of tree view of data structures, but even this is too low-level and is often inappropriate for the kinds of structures created by planning systems. For instance, goal or constraint networks are at best awkward when visualized as trees, and any non-structural link between data structures, as through a lookup table, isn't captured at all. Further, while debuggers have powerful breakpointing facilities that are suitable for finding specific algorithmic errors, they have little use in the diagnosis of modeling errors.

  16. Verification of a Constraint Force Equation Methodology for Modeling Multi-Body Stage Separation

    NASA Technical Reports Server (NTRS)

    Tartabini, Paul V.; Roithmayr, Carlos; Toniolo, Matthew D.; Karlgaard, Christopher; Pamadi, Bandu N.

    2008-01-01

    This paper discusses the verification of the Constraint Force Equation (CFE) methodology and its implementation in the Program to Optimize Simulated Trajectories II (POST2) for multibody separation problems using three specially designed test cases. The first test case involves two rigid bodies connected by a fixed joint; the second case involves two rigid bodies connected with a universal joint; and the third test case is that of Mach 7 separation of the Hyper-X vehicle. For the first two cases, the POST2/CFE solutions compared well with those obtained using industry standard benchmark codes, namely AUTOLEV and ADAMS. For the Hyper-X case, the POST2/CFE solutions were in reasonable agreement with the flight test data. The CFE implementation in POST2 facilitates the analysis and simulation of stage separation as an integral part of POST2 for seamless end-to-end simulations of launch vehicle trajectories.

  17. Crowd Sourced Formal Verification-Augmentation (CSFV-A)

    DTIC Science & Technology

    2016-06-01

    The Crowd Sourced Formal Verification (CSFV) program built games that recast FV problems into puzzles to make these problems more accessible, increasing the manpower available to construct FV proofs. This effort supported the CSFV program by hosting the games on a public website, and analyzed the gameplay for efficiency in providing FV proofs.

  18. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating has become a requirement. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.

  19. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V&V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack; V&V packages for both ANSYS and MCNP are included. Description of verification run(s): This software requires that it be compiled specifically for the machine on which it is to be used. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison of the new output files against the old output files. Any difference between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
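
    The verification scheme described (run the bundled sample problems, then compare the new outputs against stored reference outputs) is easy to sketch; the file layout below is hypothetical, and, as the record notes, a flagged difference means "inspect further", not necessarily a defect:

        import filecmp
        from pathlib import Path

        def verify_install(new_dir, ref_dir):
            """Compare each sample-problem output with its reference file."""
            failures = []
            for ref in Path(ref_dir).glob("*.out"):
                new = Path(new_dir) / ref.name
                if not new.exists() or not filecmp.cmp(new, ref, shallow=False):
                    failures.append(ref.name)
            return failures

        diffs = verify_install("runs/new", "runs/reference")  # paths assumed
        if diffs:
            print("verification differences (inspect outputs):", diffs)
        else:
            print("all sample problems match the reference outputs")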

  20. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  1. Model verification of large structural systems

    NASA Technical Reports Server (NTRS)

    Lee, L. T.; Hasselman, T. K.

    1977-01-01

    A methodology was formulated, and a general computer code implemented for processing sinusoidal vibration test data to simultaneously make adjustments to a prior mathematical model of a large structural system, and resolve measured response data to obtain a set of orthogonal modes representative of the test model. The derivation of estimator equations is shown along with example problems. A method for improving the prior analytic model is included.

  2. Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong, E-mail: yidong.xia@inl.gov; Wang, Chuanjin; Luo, Hong

    Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent for this test problem suite is to provide baseline comparison data that demonstrates the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models were used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these turbulence models in Hydra-TH. Where possible, some form of solution verification has been attempted to identify sensitivities in the solution methods, and suggest best practices when using the Hydra-TH code. -- Highlights: •We performed a comprehensive study to verify and validate the turbulence models in Hydra-TH. •Hydra-TH delivers 2nd-order grid convergence for the incompressible Navier–Stokes equations. •Hydra-TH can accurately simulate the laminar boundary layers. •Hydra-TH can accurately simulate the turbulent boundary layers with RANS turbulence models. •Hydra-TH delivers high-fidelity LES capability for simulating turbulent flows in confined space.

  3. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    Since the verification and test programme is a significant part of the spacecraft development life cycle in terms of cost and time, discussions within project teams very often aim to optimize the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedule and cost is producing a dialectic between programme management and design authorities aimed at optimizing the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The model philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects, including test requirements and facilities, as shown in Fig. 1 (from ECSS-E-10: model philosophy, verification and test programme definition). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory, and others, in the presence of more recurrent designs, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance, and on the apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances. The verification of project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. Qualification and Acceptance).

  4. Speaker verification using committee neural networks.

    PubMed

    Reddy, Narender P; Buch, Ojas A

    2003-10-01

    Security is a major problem in web-based or remote access to databases. In the present study, the technique of committee neural networks was developed for speech-based speaker verification. Speech data from the designated speaker and several imposters were obtained, several parameters were extracted in the time and frequency domains, and these were fed to neural networks. Several neural networks were trained, and the five best performing networks were recruited into the committee. The committee decision was based on majority voting of the member networks, and the committee opinion was evaluated with further testing data. The committee correctly identified the designated speaker in 100% of the cases (50 out of 50) and rejected imposters in 100% of the cases (150 out of 150). The committee decision was not unanimous in the majority of the cases tested.
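
    A minimal sketch of the committee idea: train several candidate networks, recruit the best performers on held-out data, and decide by majority vote. The feature extraction and the member-selection criterion here are assumptions (scikit-learn stand-ins, not the authors' networks):

        import numpy as np
        from sklearn.model_selection import train_test_split
        from sklearn.neural_network import MLPClassifier

        rng = np.random.default_rng(3)
        X = rng.standard_normal((300, 12))   # stand-in time/frequency features
        y = rng.integers(0, 2, size=300)     # 1 = designated speaker, 0 = imposter
        X_tr, X_val, y_tr, y_val = train_test_split(X, y, random_state=0)

        # Train several candidates; recruit the five best into the committee.
        candidates = [MLPClassifier(hidden_layer_sizes=(8,), max_iter=500,
                                    random_state=s).fit(X_tr, y_tr)
                      for s in range(10)]
        committee = sorted(candidates, key=lambda m: m.score(X_val, y_val))[-5:]

        def committee_verdict(x):
            votes = [int(m.predict(x.reshape(1, -1))[0]) for m in committee]
            return int(sum(votes) >= 3)      # majority of the five members

        print(committee_verdict(X_val[0]))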

  5. Software engineering and automatic continuous verification of scientific software

    NASA Astrophysics Data System (ADS)

    Piggott, M. D.; Hill, J.; Farrell, P. E.; Kramer, S. C.; Wilson, C. R.; Ham, D.; Gorman, G. J.; Bond, T.

    2011-12-01

    Software engineering of scientific code is challenging for a number of reasons, including pressure to publish and a lack of awareness of the pitfalls of software engineering by scientists. The Applied Modelling and Computation Group at Imperial College is a diverse group of researchers that employ best-practice software engineering methods whilst developing open source scientific software. Our main code is Fluidity - a multi-purpose computational fluid dynamics (CFD) code that can be used for a wide range of scientific applications, from earth-scale mantle convection, through basin-scale ocean dynamics, to laboratory-scale classic CFD problems - and is coupled to a number of other codes, including nuclear radiation and solid modelling. Our software development infrastructure consists of a number of free tools that could be employed by any group that develops scientific code, and has been developed over a number of years with many lessons learnt. A single code base is developed by over 30 people, for which we use Bazaar for revision control, making good use of its strong branching and merging capabilities. Using features of Canonical's Launchpad platform, such as code review, blueprints for designing features, and bug reporting, gives the group, partners, and other Fluidity users an easy-to-use platform to collaborate, and allows the induction of new members of the group into an environment where software development forms a central part of their work. The code repository is coupled to an automated test and verification system which performs over 20,000 tests, including unit tests, short regression tests, code verification, and large parallel tests; among these are build tests on HPC systems, including local and UK National HPC services. Testing code in this manner leads to a continuous verification process, not a discrete event performed once development has ceased. Much of the code verification is done via the "gold standard" of comparisons to analytical solutions via the method of manufactured solutions. By developing and verifying code in tandem we avoid a number of pitfalls in scientific software development, and we advocate similar procedures for other scientific code applications.
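
    The "gold standard" mentioned, the method of manufactured solutions, amounts to choosing a smooth solution, deriving the forcing term symbolically, and then checking that the code converges to the chosen solution at the formal order. A minimal sketch of the forcing-term derivation for a 1D viscous Burgers-type equation (illustrative only, not Fluidity's own machinery):

        import sympy as sp

        x, t, nu = sp.symbols("x t nu")

        # 1. Manufacture a smooth solution.
        u = sp.sin(sp.pi * x) * sp.exp(-t)

        # 2. Substitute it into the PDE residual u_t + u*u_x - nu*u_xx
        #    and read off the source term f that makes u an exact solution.
        f = sp.diff(u, t) + u * sp.diff(u, x) - nu * sp.diff(u, x, 2)
        print(sp.simplify(f))

        # 3. Run the code with source f and initial/boundary data taken
        #    from u, then confirm the error converges at the formal order.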

  6. Space Shuttle Plume and Plume Impingement Study

    NASA Technical Reports Server (NTRS)

    Tevepaugh, J. A.; Penny, M. M.

    1977-01-01

    The influence of the propulsion system exhaust plumes on vehicle performance and control characteristics, a complex function of vehicle geometry, propulsion system geometry, engine operating conditions, and vehicle flight trajectory, was investigated. Analytical support of the plume technology test program was directed at the following problem areas: (1) definition of the full-scale exhaust plume characteristics; (2) application of appropriate similarity parameters; and (3) analysis of wind tunnel test data. Verification of the two-phase plume and plume impingement models was directed toward the definition of the full-scale exhaust plume characteristics and the separation motor impingement problem.

  7. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods can achieve satisfactory verification on analytical or numerical structures, most of them encounter problems when applied to real-world structures under varying environments. Damage detection methods that directly extract damage features from periodically sampled dynamic time history response measurements are desirable, but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure, proposed in the first part, have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage for real-scale structures experiencing ambient excitations and varying environmental conditions.

  8. Rule Systems for Runtime Verification: A Short Tutorial

    NASA Astrophysics Data System (ADS)

    Barringer, Howard; Havelund, Klaus; Rydeheard, David; Groce, Alex

    In this tutorial, we introduce two rule-based systems for on-line and off-line trace analysis, RuleR and LogScope. RuleR is a conditional rule-based system, which has a simple and easily implemented algorithm for effective runtime verification, and into which one can compile a wide range of temporal logics and other specification formalisms used for runtime verification. Specifications can be parameterized with data, or even with specifications, allowing for temporal logic combinators to be defined. We outline a number of simple syntactic extensions of core RuleR that can lead to further conciseness of specification while still enabling easy and efficient implementation. RuleR is implemented in Java and we will demonstrate its ease of use in monitoring Java programs. LogScope is a derivation of RuleR adding a simple, very user-friendly temporal logic. It was developed in Python, specifically to support testing of spacecraft flight software for NASA's 2011 Mars mission MSL (Mars Science Laboratory). The system has been applied by test engineers to the analysis of log files generated by running the flight software. Detailed logging is already part of the system design approach, so this approach adds no instrumentation overhead. While post-mortem log analysis precludes the autonomous reaction to problems possible with traditional runtime verification, it provides a powerful tool for test automation. A new system is being developed that integrates features from both RuleR and LogScope.
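
    To give a flavor of rule-based log analysis of this kind, here is a toy monitor checking an "every request is eventually answered" temporal rule over a recorded event trace. This is a conceptual sketch only; it uses neither the RuleR nor the LogScope syntax, and the event names are invented:

        def check_response_rule(trace, request="CMD_DISPATCH",
                                response="CMD_COMPLETE"):
            """Each `request` event must eventually be matched by a
            `response` event carrying the same id (a temporal rule)."""
            pending = set()
            for event, cmd_id in trace:
                if event == request:
                    pending.add(cmd_id)
                elif event == response:
                    pending.discard(cmd_id)
            return pending   # ids never completed = rule violations

        trace = [("CMD_DISPATCH", 1), ("CMD_COMPLETE", 1), ("CMD_DISPATCH", 2)]
        print(check_response_rule(trace))  # {2}: dispatched, never completed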

  9. Microscopy as a statistical, Rényi-Ulam, half-lie game: a new heuristic search strategy to accelerate imaging.

    PubMed

    Drumm, Daniel W; Greentree, Andrew D

    2017-11-07

    Finding a fluorescent target in a biological environment is a common and pressing microscopy problem. This task is formally analogous to the canonical search problem. In ideal (noise-free, truthful) search problems, the well-known binary search is optimal. The case of half-lies, where one of two responses to a search query may be deceptive, introduces a richer Rényi-Ulam problem and is particularly relevant to practical microscopy. We analyse microscopy in the context of Rényi-Ulam games and half-lies, developing a new family of heuristics. We show the cost of insisting on verification by positive result in search algorithms; for the zero-half-lie case, bisection with verification incurs a 50% penalty in the average number of queries required. The optimal partitioning of search spaces directly following verification in the presence of random half-lies is determined. Trisection with verification is shown to be the most efficient heuristic of the family in a majority of cases.
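
    As a toy illustration of the cost of "verification by positive result" in the zero-half-lie case, here is bisection in which every positive answer is repeated once to confirm it; counting queries shows the overhead relative to plain binary search. This is an illustrative sketch, not the authors' heuristic:

        def verified_bisection(target, lo, hi):
            """Locate `target` in [lo, hi) by bisection, repeating each
            query that answers 'yes' to verify it; returns (pos, queries)."""
            queries = 0
            while hi - lo > 1:
                mid = (lo + hi) // 2
                queries += 1                 # ask: is target in [lo, mid)?
                if lo <= target < mid:
                    queries += 1             # verify the positive answer
                    hi = mid
                else:
                    lo = mid
            return lo, queries

        print(verified_bisection(37, 0, 128))  # (37, 11): 7 asks + 4 verifies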

  10. 242A Distributed Control System Year 2000 Acceptance Test Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    TEATS, M.C.

    1999-08-31

    This report documents acceptance test results for the 242-A Evaporator distributed control system upgrade to D/3 version 9.0-2 for year 2000 compliance, obtained by acceptance testing as directed by procedure HNF-2695. The verification procedure documents the initial testing and evaluation of potential 242-A Distributed Control System (DCS) operating difficulties across the year 2000 boundary, and of the calendar adjustments needed for the leap year. Baseline system performance data are recorded using the current, as-is operating system software; data are also collected for operating system software that has been modified to correct year 2000 problems. The verification procedure is intended to be generic, such that it may be performed on any D/3™ (GSE Process Solutions, Inc.) distributed control system that runs with the VMS™ (Digital Equipment Corporation) operating system. The test may be run on simulation or production systems, depending upon facility status. On production systems, DCS outages will occur nine times throughout performance of the test; these outages are expected to last about 10 minutes each.

  11. Modal Test/Analysis Correlation of Space Station Structures Using Nonlinear Sensitivity

    NASA Technical Reports Server (NTRS)

    Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan

    1992-01-01

    The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlation. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.
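
    The nonlinear sensitivities referred to are derivatives of the eigensolution with respect to model parameters. For a mass-normalized mode of the generalized eigenproblem, the standard first-order eigenvalue derivative (a textbook relation, given here for orientation rather than as the authors' exact formulation) is, in LaTeX notation:

        \mathbf{K}\boldsymbol{\phi}_i = \lambda_i \mathbf{M}\boldsymbol{\phi}_i,
        \qquad \boldsymbol{\phi}_i^{\mathsf{T}}\mathbf{M}\boldsymbol{\phi}_i = 1,
        \qquad
        \frac{\partial \lambda_i}{\partial p} =
        \boldsymbol{\phi}_i^{\mathsf{T}}
        \left(\frac{\partial \mathbf{K}}{\partial p}
              - \lambda_i \frac{\partial \mathbf{M}}{\partial p}\right)
        \boldsymbol{\phi}_i

    with corresponding (more involved) expressions for the eigenvector derivatives; these are the quantities updated at each optimizer iteration in a perturbation scheme of the kind described.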

  13. Simulation of Laboratory Tests of Steel Arch Support

    NASA Astrophysics Data System (ADS)

    Horyl, Petr; Šňupárek, Richard; Maršálek, Pavel; Pacześniowski, Krzysztof

    2017-03-01

    The total load-bearing capacity of steel arch yielding supports for roadways is among their most important characteristics. These values can be obtained in two ways: experimental measurements in a specialized laboratory or computer modelling by FEM. Experimental measurements are significantly more expensive and more time-consuming; however, they provide the verification needed to tune a computer model properly. This verification was carried out successfully by the cooperating workplaces of GIG Katowice, VSB-Technical University of Ostrava, and the Institute of Geonics ASCR. The present article discusses the conditions and results of this verification for static problems. The output is a tuned computer model, which may be used in further calculations to obtain the load-bearing capacity of other types of steel arch supports. The effects of changes in other parameters, such as the material properties of the steel, the magnitude of torques, and friction coefficient values, on the behaviour of the investigated steel arch supports can then be determined relatively quickly.

  14. Urine sampling and collection system optimization and testing

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Geating, J. A.; Koesterer, M. G.

    1975-01-01

    A Urine Sampling and Collection System (USCS) engineering model was developed to provide for the automatic collection, volume sensing and sampling of urine from each micturition. The purpose of the engineering model was to demonstrate verification of the system concept. The objective of the optimization and testing program was to update the engineering model, to provide additional performance features and to conduct system testing to determine operational problems. Optimization tasks were defined as modifications to minimize system fluid residual and addition of thermoelectric cooling.

  15. A Verification-Driven Approach to Traceability and Documentation for Auto-Generated Mathematical Software

    NASA Technical Reports Server (NTRS)

    Denney, Ewen W.; Fischer, Bernd

    2009-01-01

    Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.

  16. Circuitbot

    DTIC Science & Technology

    2016-03-01

    constraints problem. Game rules described valid moves, allowing the player to generate a memory graph supporting improved C program verification. Subject terms: formal verification, static analysis, abstract interpretation, pointer analysis, fixpoint iteration.

  17. CASL Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mousseau, Vincent Andrew; Dinh, Nam

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy's (DOE's) CASL program in support of milestone CASL.P13.02.

  18. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
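
    To make the MC/DC metric mentioned above concrete: each condition in a decision must be shown to independently affect the decision's outcome, witnessed by a pair of tests that differ only in that condition. A minimal sketch in Python (the decision expression is a hypothetical illustration, not taken from the paper):

      from itertools import product

      def decision(a, b, c):
          # Hypothetical decision derived from a high-level requirement
          return a and (b or c)

      # For each condition, list "independence pairs": test vectors differing
      # only in that condition whose decision outcomes differ.
      for i, name in enumerate(["a", "b", "c"]):
          pairs = []
          for v in product([False, True], repeat=3):
              w = list(v); w[i] = not w[i]; w = tuple(w)
              if v < w and decision(*v) != decision(*w):
                  pairs.append((v, w))
          print(name, "independence pairs:", pairs)

    An MC/DC-adequate test suite must contain at least one such pair per condition, which for n conditions typically requires on the order of n+1 tests rather than 2^n.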

  19. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  20. 40 CFR 1066.420 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Pre-test verification procedures and pre-test data collection. 1066.420 Section 1066.420 Protection of Environment ENVIRONMENTAL PROTECTION... Test § 1066.420 Pre-test verification procedures and pre-test data collection. (a) Follow the...

  1. 40 CFR 1066.420 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Pre-test verification procedures and pre-test data collection. 1066.420 Section 1066.420 Protection of Environment ENVIRONMENTAL PROTECTION... Test § 1066.420 Pre-test verification procedures and pre-test data collection. (a) Follow the...

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  3. Environmental Technology Verification: Supplement to Test/QA Plan for Biological and Aerosol Testing of General Ventilation Air Cleaners; Bioaerosol Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Air Cleaners

    EPA Science Inventory

    The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...

  4. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dunlop, W H

    It is my pleasure to be here today to participate in this Conference. My thanks to the organizers for preparing such an interesting agenda on a very difficult topic. My effort in preparing my presentation was performed under the auspices of the U.S. Department of Energy by University of California, Lawrence Livermore National Laboratory under Contract W-7405-Eng-48. And as many of you know, Lawrence Livermore National Laboratory is now, as of Oct 1st, under contract to the Lawrence Livermore National Security LLC. There has been a long history of how to view verification of arms control agreements. The basis for verification during the days of SALT was that verification would be based on each country's national technical means. For treaties dealing with strategic missiles this worked well, as the individual items subject to verification were of such a size that they were visible to the National Technical Means available at the time. And it was felt that the counting of missiles and launchers could be verified by our National Technical Means. For nuclear testing treaties the use of seismic measurements developed into a capability that was reasonably robust for all but the smallest of nuclear tests. However, once we had the Threshold Test Ban Treaty, there was a significant problem in that the fidelity of the measurements was not sufficient to determine if a test was slightly above the 150 kt limit or slightly below the 150 kt limit. This led some in the US to believe that the Soviet Union was not living up to the TTBT agreement. An on-site verification protocol was negotiated in 1988 and 1989 that allowed the US to make hydrodynamic yield measurements on Soviet tests above 50 kt yield and regional seismic measurements on all tests above 35 kt of yield, and the Soviets to make the same type of measurements on US tests to ensure that they were not over 150 kt. These on-site measurements were considered reasonably intrusive. Again, the measurement capability was not perfect and it was expected that occasionally there might be a verification measurement that was slightly above 150 kt. But the accuracy was much improved over the earlier seismic measurements. In fact some of this improvement was because, as part of this verification protocol, the US and Soviet Union provided the yields of several past tests to improve seismic calibrations. This actually helped provide a much needed calibration for the seismic measurements. It was also accepted that, since nuclear tests were to a large extent R&D related, occasionally there might be a test that was slightly above 150 kt, as you could not always predict the yield with high accuracy in advance of the test. While one could hypothesize that the Soviets could do a test at some other location than their test sites, if it were even a small fraction of 150 kt it would clearly be observed and would be a violation of the treaty. So the issue of clandestine tests of significance was easily covered for this particular treaty.

  5. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. Many of the remaining problems were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
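
    In this setting, verifying a code against an analytic benchmark reduces to comparing the Monte Carlo estimate, with its statistical uncertainty, to the exact solution under a stated acceptance criterion. A minimal sketch, with purely illustrative numbers (not results from the suite):

      def verify_keff(computed, sigma, exact, n_sigmas=3.0):
          """Accept when the Monte Carlo k-effective agrees with the
          analytic benchmark within n_sigmas standard deviations."""
          return abs(computed - exact) <= n_sigmas * sigma

      print(verify_keff(computed=0.99968, sigma=0.00012, exact=1.0))  # True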

  6. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  7. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  8. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) For...

  9. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  10. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  11. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    EPA Science Inventory

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  12. Life sciences laboratory breadboard simulations for shuttle

    NASA Technical Reports Server (NTRS)

    Taketa, S. T.; Simmonds, R. C.; Callahan, P. X.

    1975-01-01

    Breadboard simulations of life sciences laboratory concepts for conducting bioresearch in space were undertaken as part of the concept verification testing program. Breadboard simulations were conducted to test concepts of, and scope problems associated with, bioresearch support equipment and facility requirements and their operational integration for conducting manned research in earth orbital missions. It emphasized requirements, functions, and procedures for candidate research on crew members (simulated) and subhuman primates, and on typical radioisotope studies in rats, a rooster, and plants.

  13. Monitoring and verification R&D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs, and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  14. Test/QA Plan for Verification of Leak Detection and Repair Technologies

    EPA Science Inventory

    The purpose of the leak detection and repair (LDAR) test and quality assurance plan is to specify procedures for a verification test applicable to commercial LDAR technologies. The purpose of the verification test is to evaluate the performance of participating technologies in b...

  15. Test/QA Plan (TQAP) for Verification of Semi-Continuous Ambient Air Monitoring Systems

    EPA Science Inventory

    The purpose of the semi-continuous ambient air monitoring technology (or MARGA) test and quality assurance plan is to specify procedures for a verification test applicable to commercial semi-continuous ambient air monitoring technologies. The purpose of the verification test is ...

  16. Kleene Algebra and Bytecode Verification

    DTIC Science & Technology

    2016-04-27

    computing the star (Kleene closure) of a matrix of transfer functions. In this paper we show how this general framework applies to the problem of Java ...bytecode verification. We show how to specify transfer functions arising in Java bytecode verification in such a way that the Kleene algebra operations...potentially improve the performance over the standard worklist algorithm when a small cutset can be found. Key words: Java, bytecode, verification, static

  17. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  18. Final Report - Regulatory Considerations for Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Wilkinson, Chris; Lynch, Jonathan; Bharadwaj, Raj

    2013-01-01

    This report documents the findings of a preliminary research study into new approaches to the software design assurance of adaptive systems. We suggest a methodology to overcome the software validation and verification difficulties posed by the underlying assumption of non-adaptive software in the requirements-based testing verification methods in RTCA/DO-178B and C. An analysis of the relevant RTCA/DO-178B and C objectives is presented showing the reasons for the difficulties that arise in showing satisfaction of the objectives and suggested additional means by which they could be satisfied. We suggest that the software design assurance problem for adaptive systems is principally one of developing correct and complete high level requirements and system level constraints that define the necessary system functional and safety properties to assure the safe use of adaptive systems. We show how analytical techniques such as model based design, mathematical modeling and formal or formal-like methods can be used to both validate the high level functional and safety requirements, establish necessary constraints and provide the verification evidence for the satisfaction of requirements and constraints that supplements conventional testing. Finally the report identifies the follow-on research topics needed to implement this methodology.

  19. Environmental Technology Verification Report for Abraxis Ecologenia® 17β-Estradiol (E2) Microplate Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    EPA Science Inventory

    This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...

  20. Test/QA Plan for Verification of Cavity Ringdown Spectroscopy Systems for Ammonia Monitoring in Stack Gas

    EPA Science Inventory

    The purpose of the cavity ringdown spectroscopy (CRDS) technology test and quality assurance plan is to specify procedures for a verification test applicable to commercial cavity ringdown spectroscopy technologies. The purpose of the verification test is to evaluate the performa...

  1. 40 CFR 86.1849-01 - Right of entry.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... entity who conducts or causes to be conducted in-use verification or in-use confirmatory testing under... where any such certification or in-use verification or in-use confirmatory testing or any procedures or... test vehicle used for certification, in-use verification or in-use confirmatory testing which is being...

  2. Markov Chains For Testing Redundant Software

    NASA Technical Reports Server (NTRS)

    White, Allan L.; Sjogren, Jon A.

    1990-01-01

    Preliminary design developed for validation experiment that addresses problems unique to assuring extremely high quality of multiple-version programs in process-control software. Approach takes into account inertia of controlled system, in the sense that it takes more than one failure of the control program to cause the controlled system to fail. Verification procedure consists of two steps: experimentation (numerical simulation) and computation, with Markov model for each step.
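
    A minimal sketch of the kind of Markov computation involved, assuming a three-state chain in which a single program failure leaves the controlled system safe (all transition probabilities below are hypothetical placeholders, not the paper's model):

      import numpy as np

      p_fail, p_recover = 1e-4, 0.9   # hypothetical per-cycle probabilities
      # States: 0 = healthy, 1 = one version failed (system still safe),
      #         2 = system failure (absorbing)
      P = np.array([
          [1 - p_fail, p_fail,                         0.0],
          [p_recover,  (1 - p_recover) * (1 - p_fail), (1 - p_recover) * p_fail],
          [0.0,        0.0,                            1.0],
      ])
      assert np.allclose(P.sum(axis=1), 1.0)

      steps = 10_000  # control cycles in the mission window
      dist = np.array([1.0, 0.0, 0.0]) @ np.linalg.matrix_power(P, steps)
      print(f"P(system failure within {steps} cycles) = {dist[2]:.3e}")

    The experimentation step estimates transition probabilities like p_fail by numerical simulation; the computation step then evaluates the chain as above.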

  3. 76 FR 50164 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...

  4. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information is provided on the development, qualification and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performances, the efficiencies and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  5. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  6. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  7. Verification of performance specifications of a molecular test: cystic fibrosis carrier testing using the Luminex liquid bead array.

    PubMed

    Lacbawan, Felicitas L; Weck, Karen E; Kant, Jeffrey A; Feldman, Gerald L; Schrijver, Iris

    2012-01-01

    The number of clinical laboratories introducing various molecular tests to their existing test menu is continuously increasing. Prior to offering a US Food and Drug Administration-approved test, it is necessary that performance characteristics of the test, as claimed by the company, are verified before the assay is implemented in a clinical laboratory. To provide an example of the verification of a specific qualitative in vitro diagnostic test: cystic fibrosis carrier testing using the Luminex liquid bead array (Luminex Molecular Diagnostics, Inc, Toronto, Ontario). The approach used by an individual laboratory for verification of a US Food and Drug Administration-approved assay is described. Specific verification data are provided to highlight the stepwise verification approach undertaken by a clinical diagnostic laboratory. Protocols for verification of in vitro diagnostic assays may vary between laboratories. However, all laboratories must verify several specific performance specifications prior to implementation of such assays for clinical use. We provide an example of an approach used for verifying performance of an assay for cystic fibrosis carrier screening.

  8. 40 CFR 1065.920 - PEMS Calibrations and verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... verification. The verification consists of operating an engine over a duty cycle in the laboratory and... by laboratory equipment as follows: (1) Mount an engine on a dynamometer for laboratory testing...

  9. An Investigation into Solution Verification for CFD-DEM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fullmer, William D.; Musser, Jordan

    This report presents a study of the convergence behavior of the computational fluid dynamics-discrete element method (CFD-DEM), specifically National Energy Technology Laboratory’s (NETL) open source MFiX code (MFiX-DEM) with a diffusion based particle-to-continuum filtering scheme. In particular, this study focused on determining if the numerical method had a solution in the high-resolution limit where the grid size is smaller than the particle size. To address this uncertainty, fixed particle beds of two primary configurations were studied: i) fictitious beds where the particles are seeded with a random particle generator, and ii) instantaneous snapshots from a transient simulation of an experimentally relevant problem. Both problems considered a uniform inlet boundary and a pressure outflow. The CFD grid was refined from a few particle diameters down to 1/6th of a particle diameter. The pressure drop between two vertical elevations, averaged across the bed cross-section, was considered as the system response quantity of interest. A least-squares regression method was used to extrapolate the grid-dependent results to an approximate “grid-free” solution in the limit of infinite resolution. The results show that the diffusion based scheme does yield a converging solution. However, the convergence is more complicated than encountered in simpler, single-phase flow problems, showing strong oscillations and, at times, oscillations superimposed on top of globally non-monotonic behavior. The challenging convergence behavior highlights the importance of using at least four grid resolutions in solution verification problems so that (over-determined) regression-based extrapolation methods may be applied to approximate the grid-free solution. The grid-free solution is very important in solution verification and VVUQ exercises in general, as the difference between it and the reference solution largely determines the numerical uncertainty. By testing different randomized particle configurations of the same general problem (for the fictitious case) or different instances of freezing a transient simulation, the numerical uncertainties appeared to be on the same order of magnitude as ensemble or time averaging uncertainties. By testing different drag laws, almost all cases studied show that model form uncertainty in this one, very important closure relation was larger than the numerical uncertainty, at least with a reasonable CFD grid, roughly five particle diameters. In this study, the diffusion width (filtering length scale) was mostly set at a constant of six particle diameters. A few exploratory tests were performed to show that similar convergence behavior was observed for diffusion widths greater than approximately two particle diameters. However, this subject was not investigated in great detail because determining an appropriate filter size is really a validation question which must be determined by comparison to experimental or highly accurate numerical data. Future studies are being considered targeting solution verification of transient simulations as well as validation of the filter size with direct numerical simulation data.
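
    The regression-based extrapolation described above can be sketched as a least-squares fit of a power-law error ansatz over four or more grid resolutions. A minimal sketch (the grid sizes and pressure drops below are hypothetical placeholders):

      import numpy as np
      from scipy.optimize import curve_fit

      h = np.array([3.0, 1.5, 0.75, 0.375])       # grid size in particle diameters
      dp = np.array([412.0, 431.0, 438.5, 441.2])  # pressure drop response

      def model(h, dp_inf, C, p):
          # Power-law discretization-error ansatz: dp(h) = dp_inf + C * h**p
          return dp_inf + C * h**p

      (dp_inf, C, p), _ = curve_fit(model, h, dp, p0=(dp[-1], -10.0, 1.0))
      print(f"grid-free estimate {dp_inf:.1f}, observed order {p:.2f}")
      # A numerical-uncertainty estimate is often based on |dp(h_finest) - dp_inf|.

    With four data points and three parameters the fit is over-determined, which is exactly why at least four resolutions are recommended.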

  10. Applications of Bayesian Procrustes shape analysis to ensemble radar reflectivity nowcast verification

    NASA Astrophysics Data System (ADS)

    Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang

    2016-07-01

    This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
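
    For orientation, the classical (non-Bayesian) full Procrustes superposition underlying such shape-based ensemble means can be sketched as follows; this is a simplification and does not reproduce the paper's Bayesian machinery:

      import numpy as np

      def full_procrustes(X, Y):
          """Superimpose landmark shape Y onto X (both k x d arrays):
          remove translation and scale, then find the least-squares
          rotation via the SVD (it may include a reflection)."""
          Xc = X - X.mean(0); Xc /= np.linalg.norm(Xc)
          Yc = Y - Y.mean(0); Yc /= np.linalg.norm(Yc)
          U, s, Vt = np.linalg.svd(Yc.T @ Xc)
          return s.sum() * (Yc @ U @ Vt)   # optimal scale times rotated Y

      def procrustes_mean(shapes, iters=10):
          """Iteratively re-align ensemble members to the current mean;
          the result keeps morphology that pixelwise averaging smears out."""
          mean = shapes[0] - shapes[0].mean(0)
          for _ in range(iters):
              mean = np.mean([full_procrustes(mean, S) for S in shapes], axis=0)
          return mean

    Each precipitation object would first be reduced to a set of landmark points (e.g. boundary samples) before such an alignment is applied.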

  11. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  12. Perspectives of human verification via binary QRS template matching of single-lead and 12-lead electrocardiogram.

    PubMed

    Krasteva, Vessela; Jekova, Irena; Schmid, Ramun

    2018-01-01

    This study aims to validate the 12-lead electrocardiogram (ECG) as a biometric modality based on two straightforward binary QRS template matching characteristics. Different perspectives of the human verification problem are considered, regarding the optimal lead selection and stability over sample size, gender, age, and heart rate (HR). A clinical 12-lead resting ECG database, including a population of 460 subjects with two-session recordings (>1 year apart), is used. Cost-effective strategies for extraction of personalized QRS patterns (100 ms) and binary template matching estimate similarity in the time scale (matching time) and dissimilarity in the amplitude scale (mismatch area). The two-class person verification task, taking the decision to validate or to reject the subject identity, is managed by linear discriminant analysis (LDA). Non-redundant LDA models for different lead configurations (I, II, III, aVR, aVL, aVF, V1-V6) are trained on the first half of 230 subjects by stepwise feature selection until maximization of the area under the receiver operating characteristic curve (ROC AUC). The operating point on the training ROC at equal error rate (EER) is tested on the independent dataset (second half of 230 subjects) to report unbiased validation of test-ROC AUC and true verification rate (TVR = 100-EER). The test results are further evaluated in groups by sample size, gender, age, and HR. The optimal QRS pattern projection for the single-lead ECG biometric modality is found in the frontal plane sector (60°-0°) with best (Test-AUC/TVR) for lead II (0.941/86.8%) and a slight accuracy drop for -aVR (-0.017/-1.4%) and I (-0.01/-1.5%). Chest ECG leads have degrading accuracy from V1 (0.885/80.6%) to V6 (0.799/71.8%). The multi-lead ECG improves verification: 6-chest (0.97/90.9%), 6-limb (0.986/94.3%), 12-lead (0.995/97.5%). The QRS pattern matching model shows stable performance for verification of 10 to 230 individuals; insignificant degradation of TVR in women (1.2-3.6%), adults ≥70 years (3.7%), younger subjects <40 years (1.9%), HR <60 bpm (1.2%), HR >90 bpm (3.9%); and no degradation for HR change (0 to >20 bpm).
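
    A loose reading of the two matching characteristics (our illustration; the record does not define them precisely) treats matching time as time-domain agreement of binarized QRS patterns and mismatch area as accumulated amplitude difference:

      import numpy as np

      def qrs_match_features(template, beat, thr=0.0):
          """Toy binary QRS template matching: 'template' and 'beat' are
          equal-length amplitude arrays over a 100 ms QRS window."""
          agree = (template > thr) == (beat > thr)
          matching_time = agree.mean()                    # similarity, time scale
          mismatch_area = np.abs(template - beat).sum()   # dissimilarity, amplitude scale
          return matching_time, mismatch_area

    The two features would then feed the LDA verifier described above.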

  13. A Multifunctional Interface Method for Coupling Finite Element and Finite Difference Methods: Two-Dimensional Scalar-Field Problems

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.

    2002-01-01

    A multifunctional interface method with capabilities for variable-fidelity modeling and multiple method analysis is presented. The methodology provides an effective capability by which domains with diverse idealizations can be modeled independently to exploit the advantages of one approach over another. The multifunctional method is used to couple independently discretized subdomains, and it is used to couple the finite element and the finite difference methods. The method is based on a weighted residual variational method and is presented for two-dimensional scalar-field problems. A verification test problem and a benchmark application are presented, and the computational implications are discussed.
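
    In generic form, a weighted residual treatment of the interface enforces continuity between independently discretized subdomains weakly. One common statement (a standard hybrid variational form given for orientation, not necessarily the paper's exact functional) is, in LaTeX notation:

      \int_{\Gamma} \delta\lambda \, (u_1 - u_2) \, d\Gamma = 0, \qquad \int_{\Gamma} \lambda \, (\delta u_1 - \delta u_2) \, d\Gamma = 0

    where u_1 and u_2 are the finite element and finite difference solutions on either side of the interface \Gamma, and \lambda is a multiplier (flux-like) field whose discretization controls how strictly continuity is imposed.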

  14. Quantum adiabatic machine learning

    NASA Astrophysics Data System (ADS)

    Pudenz, Kristen L.; Lidar, Daniel A.

    2013-05-01

    We develop an approach to machine learning and anomaly detection via quantum adiabatic evolution. This approach consists of two quantum phases, with some amount of classical preprocessing to set up the quantum problems. In the training phase we identify an optimal set of weak classifiers, to form a single strong classifier. In the testing phase we adiabatically evolve one or more strong classifiers on a superposition of inputs in order to find certain anomalous elements in the classification space. Both the training and testing phases are executed via quantum adiabatic evolution. All quantum processing is strictly limited to two-qubit interactions so as to ensure physical feasibility. We apply and illustrate this approach in detail to the problem of software verification and validation, with a specific example of the learning phase applied to a problem of interest in flight control systems. Beyond this example, the algorithm can be used to attack a broad class of anomaly detection problems.
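
    The training phase described above amounts to selecting a sparse subset of weak classifiers, which on adiabatic hardware is typically cast as a quadratic binary optimization. A minimal sketch of such a formulation (our illustration with hypothetical data, not the authors' exact Hamiltonian):

      import numpy as np

      rng = np.random.default_rng(0)
      N, S = 8, 50                                 # weak classifiers, samples
      h = rng.choice([-1.0, 1.0], size=(N, S))     # hypothetical weak outputs
      y = rng.choice([-1.0, 1.0], size=S)          # hypothetical labels
      lam = 0.2                                    # sparsity penalty

      # Minimize sum_s (sum_i w_i h_i(s)/N - y_s)^2 + lam * sum_i w_i, w in {0,1}^N.
      # Folding linear terms into the diagonal (w_i^2 = w_i) gives the QUBO matrix:
      Q = (h @ h.T) / N**2 + np.diag(lam - 2.0 * (h @ y) / N)

      def energy(bits):
          w = np.array(bits, dtype=float)
          return w @ Q @ w   # constant term sum_s y_s^2 dropped

      # Brute-force the tiny instance; the adiabatic processor replaces this step.
      candidates = ([(b >> i) & 1 for i in range(N)] for b in range(2 ** N))
      print("selected weak classifiers:", min(candidates, key=energy))

    The restriction to two-qubit interactions mentioned in the abstract corresponds to keeping the objective quadratic in the binary weights w_i.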

  15. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques that are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  16. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
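
    A minimal sketch of the stratification idea (our reading of the general approach, not the authors' exact estimator), assuming arrays T (test result), X (covariates), V (verified indicator), and D (disease status, meaningful only where V == 1):

      import numpy as np
      from sklearn.linear_model import LogisticRegression

      def corrected_se_sp(T, X, V, D, n_strata=5):
          joint = {}
          for t in (0, 1):
              m = T == t
              # Propensity of verification, fit separately in each test group
              ps = LogisticRegression().fit(X[m], V[m]).predict_proba(X[m])[:, 1]
              edges = np.quantile(ps, np.linspace(0, 1, n_strata + 1))
              s_idx = np.clip(np.searchsorted(edges, ps, side="right") - 1,
                              0, n_strata - 1)
              p_d = 0.0
              for s in range(n_strata):
                  in_s = s_idx == s
                  ver = in_s & (V[m] == 1)
                  if ver.any():
                      # MAR within stratum: verified disease rate estimates the stratum rate
                      p_d += in_s.mean() * D[m][ver].mean()
              joint[t] = p_d * m.mean()               # estimate of P(D=1, T=t)
          se = joint[1] / (joint[0] + joint[1])       # P(T=1 | D=1)
          comp = {t: (T == t).mean() - joint[t] for t in (0, 1)}
          sp = comp[0] / (comp[0] + comp[1])          # P(T=0 | D=0)
          return se, sp

    Homogeneous propensity within a stratum makes the verified subjects approximately representative of that stratum, which is what removes the bias.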

  17. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  18. Joint ETV/NOWATECH test plan for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    The joint test plan is the implementation of a test design developed for verification of the performance of an environmental technology following the NOWATECH ETV method. The verification is a joint verification with the US EPA ETV scheme and the Advanced Monitoring Systems Cent...

  19. CMOS VLSI Layout and Verification of a SIMD Computer

    NASA Technical Reports Server (NTRS)

    Zheng, Jianqing

    1996-01-01

    A CMOS VLSI layout and verification of a 3 x 3 processor parallel computer has been completed. The layout was done using the MAGIC tool and the verification using HSPICE. Suggestions for expanding the computer into a million processor network are presented. Many problems that might be encountered when implementing a massively parallel computer are discussed.

  20. Delamination Assessment Tool for Spacecraft Composite Structures

    NASA Astrophysics Data System (ADS)

    Portela, Pedro; Preller, Fabian; Wittke, Henrik; Sinnema, Gerben; Camanho, Pedro; Turon, Albert

    2012-07-01

    Fortunately, only a few cases are known where failure of spacecraft structures due to undetected damage has resulted in the loss of a spacecraft and launcher mission. However, several problems related to damage tolerance, and in particular delamination of composite materials, have been encountered during structure development of various ESA projects and qualification testing. To avoid such costly failures during development, launch or service of spacecraft, launcher and reusable launch vehicle structures, a comprehensive damage tolerance verification approach is needed. In 2009, the European Space Agency (ESA) initiated an activity called “Delamination Assessment Tool” which is led by the Portuguese company HPS Lda and includes academic and industrial partners. The goal of this study is the development of a comprehensive damage tolerance verification approach for launcher and reusable launch vehicle (RLV) structures, addressing analytical and numerical methodologies, material-, subcomponent- and component testing, as well as non-destructive inspection. The study includes a comprehensive review of current industrial damage tolerance practice resulting from ECSS and NASA standards, the development of new Best Practice Guidelines for analysis, test and inspection methods, and the validation of these with a real industrial case study. The paper describes the main findings of this activity so far and presents a first iteration of a Damage Tolerance Verification Approach, which includes the introduction of novel analytical and numerical tools at an industrial level. This new approach is being put to the test using real industrial case studies provided by the industrial partners, MT Aerospace, RUAG Space and INVENT GmbH.

  1. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  2. Limitations in learning: How treatment verifications fail and what to do about it?

    PubMed

    Richardson, Susan; Thomadsen, Bruce

    The purposes of this study were: to provide dialog on why classic incident learning systems have been insufficient for patient safety improvements, to discuss failures in treatment verification, and to provide context for the reasons and lessons that can be learned from these failures. Historically, incident learning in brachytherapy has been performed via database mining, which might include reading of event reports and incidents followed by incorporating verification procedures to prevent similar incidents. A description of both classic event reporting databases and current incident learning and reporting systems is given. Real examples of treatment failures, based on firsthand knowledge, are presented to evaluate the effectiveness of verification; these failures are described and analyzed by outlining potential pitfalls and problems. Databases and incident learning systems can be limited in value and fail to provide enough detail for physicists seeking process improvement. Four examples of treatment verification failures experienced firsthand by experienced brachytherapy physicists are described. These include both underverification and oververification of various treatment processes. Database mining is an insufficient method to effect substantial improvements in the practice of brachytherapy. New incident learning systems are still immature and being tested. Instead, a new method of shared learning and implementation of changes must be created.

  3. Virtual Platform for SEE Robustness Verification of Bootloader Embedded Software on Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.

    2013-05-01

    Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. But even when real hardware is available, the verification of software fault tolerance mechanisms can be difficult, since real faulty situations must be systematically and artificially brought about, which can be impossible on real hardware. To solve this problem the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This way it is possible to run the exact same target binary software as runs on the physical system in a more controlled and deterministic environment, allowing a stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, in order to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification processes of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.

  4. Test-state approach to the quantum search problem

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sehrawat, Arun; Nguyen, Le Huy; Graduate School for Integrative Sciences and Engineering, National University of Singapore, Singapore 117597

    2011-05-15

    The search for 'a quantum needle in a quantum haystack' is a metaphor for the problem of finding out which one of a permissible set of unitary mappings - the oracles - is implemented by a given black box. Grover's algorithm solves this problem with quadratic speedup as compared with the analogous search for 'a classical needle in a classical haystack'. Since the outcome of Grover's algorithm is probabilistic - it gives the correct answer with high probability, not with certainty - the answer requires verification. For this purpose we introduce specific test states, one for each oracle. These test states can also be used to realize 'a classical search for the quantum needle' which is deterministic - it always gives a definite answer after a finite number of steps - and 3.41 times as fast as the purely classical search. Since the test-state search and Grover's algorithm look for the same quantum needle, the average number of oracle queries of the test-state search is the classical benchmark for Grover's algorithm.
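
    For scale, the query counts behind the quoted speedups can be summarized as follows (standard results for N equally likely oracles; the factor 3.41 for the test-state search is taken from the record, not derived here), in LaTeX notation:

      \bar{q}_{\mathrm{classical}} \approx \frac{N+1}{2}, \qquad q_{\mathrm{Grover}} \approx \frac{\pi}{4}\sqrt{N}

    so the deterministic test-state search sits between the two, improving on the classical average by a constant factor rather than by the quadratic Grover factor.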

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR TREATMENT OF WASTEWATER GENERATED DURING DECONTAMINATION ACTIVITIES - ULTRASTRIP SYSTEMS, INC., MOBILE EMERGENCY FILTRATION SYSTEM (MEFS) - 04/14/WQPC-HS

    EPA Science Inventory

    Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  7. Use of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments Across International Borders -Test/QA Plan

    EPA Science Inventory

    The Environmental Technology Verification (ETV) – Environmental and Sustainable Technology Evaluations (ESTE) Program conducts third-party verification testing of commercially available technologies that may accomplish environmental program management goals. In this verification...

  8. PERFORMANCE VERIFICATION TEST FOR FIELD-PORTABLE MEASUREMENTS OF LEAD IN DUST

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program (www.epa.gov/etv) conducts performance verification tests of technologies used for the characterization and monitoring of contaminated media. The program exists to provide high-quali...

  9. VERIFICATION TESTING OF HIGH-RATE MECHANICAL INDUCTION MIXERS FOR CHEMICAL DISINFECTANTS

    EPA Science Inventory

    This paper describes the results of verification testing of mechanical induction mixers for dispersion of chemical disinfectants in wet-weather flow (WWF) conducted under the U.S. Environmental Protection Agency's Environmental Technology Verification (ETV) WWF Pilot Program. Th...

  10. Gaia challenging performances verification: combination of spacecraft models and test results

    NASA Astrophysics Data System (ADS)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on end-to-end performance verification. As far as possible, performance is verified by end-to-end tests on ground (i.e. before launch). However, the most challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists that can reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used either to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the tests feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  11. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy Low Earth Orbit Servicing Capability.

  12. Space Shuttle Tail Service Mast Concept Verification

    NASA Technical Reports Server (NTRS)

    Uda, R. T.

    1976-01-01

    Design studies and analyses were performed to describe the loads and dynamics of the space shuttle tail service masts (TSMs). Of particular interest are the motion and interaction of the umbilical carrier plate, lanyard system, vacuum jacketed hoses, latches, links, and masthead. A development test rig was designed and fabricated to obtain experimental data. The test program is designed to (1) verify the theoretical dynamics calculations, (2) prove the soundness of design concepts, and (3) elucidate problem areas (if any) in the design of mechanisms and structural components. Design, fabrication, and initiation of TSM development testing at Kennedy Space Center are described.

  13. Towards composition of verified hardware devices

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

    1991-01-01

    Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness through mathematical proof. Hardware verification research has focused on device verification and has largely ignored the verification of system composition. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.

  14. Verification of floating-point software

    NASA Technical Reports Server (NTRS)

    Hoover, Doug N.

    1990-01-01

    Floating point computation presents a number of problems for formal verification. Should one treat the actual details of floating point operations, accept them as imprecisely defined, or ignore round-off error altogether and behave as if floating point operations are perfectly accurate? There is the further problem that a numerical algorithm usually only approximately computes some mathematical function, and we often do not know just how good the approximation is, even in the absence of round-off error. ORA has developed a theory of asymptotic correctness which allows one to verify floating point software with minimal entanglement in these problems. This theory and its implementation in the Ariel C verification system are described. The theory is illustrated using a simple program which finds a zero of a given function by bisection. This paper is presented in viewgraph form.
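
    For reference, a bisection routine of the kind the abstract alludes to looks like the following minimal sketch (an illustration of the verified program's behavior, not ORA's actual Ariel C subject code). The floating-point subtleties are visible even here: the loop terminates on an interval-width tolerance rather than on f(mid) == 0, which round-off may never produce.

      # Minimal bisection sketch (illustrative; not the Ariel C subject program).
      def bisect(f, lo, hi, tol=1e-12):
          """Find x in [lo, hi] with f(x) ~= 0, assuming f(lo) and f(hi) differ in sign."""
          flo = f(lo)
          if flo * f(hi) > 0:
              raise ValueError("f must change sign on [lo, hi]")
          while hi - lo > tol:
              mid = lo + (hi - lo) / 2        # avoids overflow of (lo + hi) / 2
              if flo * f(mid) <= 0:
                  hi = mid                    # root lies in [lo, mid]
              else:
                  lo, flo = mid, f(mid)       # root lies in [mid, hi]
          return lo + (hi - lo) / 2

      print(bisect(lambda x: x * x - 2.0, 0.0, 2.0))   # ~1.4142135623730951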

  15. Acceptance and commissioning of a treatment planning system based on Monte Carlo calculations.

    PubMed

    Lopez-Tarjuelo, J; Garcia-Molla, R; Juan-Senabre, X J; Quiros-Higueras, J D; Santos-Serra, A; de Marco-Blancas, N; Calzada-Feliu, S

    2014-04-01

    The Monaco Treatment Planning System (TPS), based on a virtual energy fluence model of the photon beam head components of the linac and a dose computation engine using the Monte Carlo (MC) algorithm X-Ray Voxel MC (XVMC), was tested before being put into clinical use. An Elekta Synergy with a 6 MV beam was characterized using routine equipment. After the machine's model was installed, a set of functionality, geometric, dosimetric and data transfer tests was performed. The dosimetric tests included dose calculations in water, heterogeneous phantoms and Intensity Modulated Radiation Therapy (IMRT) verifications. Data transfer tests were run for every imaging device, TPS and the electronic medical record linked to Monaco. Functionality and geometric tests ran properly. Dose calculations in water were in accordance with measurements: in 95% of cases, differences were at most 1.9%. Dose calculation in heterogeneous media showed the expected results found in the literature. IMRT verification with an ionization chamber led to dose differences lower than 2.5% for points inside a standard gradient. When a 2-D array was used, all the fields passed the γ (3%, 3 mm) test with at least 90-95% of points succeeding, and the majority of the fields had between 95% and 100% of points succeeding. Data transfer caused problems that had to be solved by changing our workflow. In general, the tests led to satisfactory results. Monaco performance complied with published international recommendations and scored highly in the dosimetric domain. However, the problems detected when the TPS was put to work with our current equipment showed that this kind of product must be completely commissioned, without neglecting data workflow, before treating the first patient.
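
    The γ (3%, 3 mm) criterion combines a dose-difference tolerance with a distance-to-agreement tolerance: a point passes if some nearby point of the compared distribution lies within the combined ellipsoid. A brute-force 2-D version of this standard test is sketched below with synthetic dose grids (a generic illustration, not the commercial analysis software used in the study).

      # Simple brute-force global 2-D gamma-index pass rate, 3%/3 mm
      # (generic illustration of the test named above, not the clinical tool).
      import numpy as np

      def gamma_pass_rate(ref, eval_, spacing_mm=1.0, dd=0.03, dta_mm=3.0):
          ny, nx = ref.shape
          ys, xs = np.mgrid[0:ny, 0:nx]
          norm = dd * ref.max()                    # global dose criterion
          passed = 0
          for iy in range(ny):
              for ix in range(nx):
                  dist2 = ((ys - iy) ** 2 + (xs - ix) ** 2) * spacing_mm ** 2
                  dose2 = (eval_ - ref[iy, ix]) ** 2
                  gamma2 = dist2 / dta_mm ** 2 + dose2 / norm ** 2
                  passed += gamma2.min() <= 1.0    # best agreement over all points
          return 100.0 * passed / (ny * nx)

      ref = np.random.default_rng(1).random((20, 20)) + 1.0
      eval_ = ref * 1.01                           # 1% uniform dose difference
      print(f"{gamma_pass_rate(ref, eval_):.1f}% of points pass")  # expect 100%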

  16. DYNA3D/ParaDyn Regression Test Suite Inventory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lin, Jerry I.

    2016-09-01

    The following table constitutes an initial assessment of feature coverage across the regression test suite used for DYNA3D and ParaDyn. It documents the regression test suite at the time of preliminary release 16.1 in September 2016. The columns of the table represent groupings of functionalities, e.g., material models. Each problem in the test suite is represented by a row in the table. All features exercised by the problem are denoted by a check mark (√) in the corresponding column. The definition of “feature” has not been subdivided to its smallest unit of user input, e.g., algorithmic parameters specific to a particular type of contact surface. This represents a judgment to provide code developers and users a reasonable impression of feature coverage without expanding the width of the table by several multiples. All regression testing is run in parallel, typically with eight processors, except for problems involving features only available in serial mode. Many are strictly regression tests, acting as a check that the codes continue to produce adequately repeatable results as development unfolds, compilers change, and platforms are replaced. A subset of the tests represents true verification problems that have been checked against analytical or other benchmark solutions. Users are welcome to submit documented problems for inclusion in the test suite, especially if the problems heavily exercise, and depend upon, features that are currently underrepresented.

  17. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  18. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  19. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  20. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  1. VERIFICATION TESTING OF HIGH-RATE MECHANICAL INDUCTION MIXERS FOR CHEMICAL DISINFECTANTS, Oregon

    EPA Science Inventory

    This paper describes the results of verification testing of mechanical induction mixers for dispersion of chemical disinfectants in wet-weather flow (WWF) conducted under the U.S. Environmental Protection Agency's Environmental Technology Verification (ETV) WWF Pilot Program. Th...

  2. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    NASA Astrophysics Data System (ADS)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work proposes a systematic model with a holistic view of all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to secure the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as the latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
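
    The integrity half of the problem can be illustrated with a hash-chained custody log, sketched below. This is only an illustration of the general idea; PKIDEV itself relies on PKI digital signatures and secure time-stamping rather than the bare hashes and local clock used here.

      # Minimal hash-chained custody log (illustration of the integrity idea;
      # PKIDEV uses PKI signatures and secure time-stamping instead).
      import hashlib, json, time

      def add_entry(log, evidence_bytes, note):
          prev = log[-1]["entry_hash"] if log else "0" * 64
          entry = {
              "time": time.time(),        # a trusted timestamp in the real model
              "evidence_sha256": hashlib.sha256(evidence_bytes).hexdigest(),
              "note": note,
              "prev_hash": prev,          # chains this entry to the previous one
          }
          entry["entry_hash"] = hashlib.sha256(
              json.dumps(entry, sort_keys=True).encode()).hexdigest()
          log.append(entry)

      def verify_log(log):
          prev = "0" * 64
          for e in log:
              body = {k: v for k, v in e.items() if k != "entry_hash"}
              expected = hashlib.sha256(
                  json.dumps(body, sort_keys=True).encode()).hexdigest()
              if e["prev_hash"] != prev or e["entry_hash"] != expected:
                  return False
              prev = e["entry_hash"]
          return True

      log = []
      add_entry(log, b"disk image bytes ...", "acquired at scene")
      add_entry(log, b"disk image bytes ...", "transferred to lab")
      print(verify_log(log))   # True; tampering with any entry breaks the chain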

  3. Comparison of Military and Commercial Design-to-Cost Aircraft Procurement and Operational Support Practices

    DTIC Science & Technology

    1975-07-01

    the product, including its operational and maintenance requirements. However, there are many other program elements that are equally critical, i.e... customer needs into meaningful, practical requirements which can be met by the designer, verified in the product and used effectively by the operator. The... and government) and H- Production Verification Testing Requirements, spread over longer delivery periods, causing problems of shop load, high cash

  4. Bayesian Estimation of Combined Accuracy for Tests with Verification Bias

    PubMed Central

    Broemeling, Lyle D.

    2011-01-01

    This presentation emphasizes the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects do not undergo the gold standard. The approach is Bayesian, where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. The accuracy of two combined binary tests is estimated employing either the “believe the positive” or the “believe the negative” rule, and the true and false positive fractions for each rule are computed for the two tests. In order to perform the analysis, the missing-at-random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of combined test accuracy for ordinal tests. PMID:26859487
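
    As a toy illustration of the conjugate machinery involved (not the paper's actual model, which additionally corrects for the verification bias), the sketch below draws posterior samples for the true positive fraction of two binary tests combined under the "believe the positive" rule, using a Beta prior and counts from verified diseased subjects only; restricting to verified subjects is justified only under the missing-at-random assumption.

      # Toy Beta-Binomial posterior for the "believe the positive" rule
      # (illustrative only; the paper's model also handles verification bias).
      import numpy as np

      rng = np.random.default_rng(0)

      # Hypothetical counts among *verified* diseased subjects:
      # the combined test is positive if test 1 OR test 2 is positive.
      n_diseased = 80          # verified subjects with disease
      n_combined_pos = 68      # of those, positive under believe-the-positive

      # Beta(1, 1) prior -> Beta(1 + successes, 1 + failures) posterior
      tpf_samples = rng.beta(1 + n_combined_pos,
                             1 + n_diseased - n_combined_pos,
                             size=100_000)

      lo, hi = np.percentile(tpf_samples, [2.5, 97.5])
      print(f"posterior mean TPF = {tpf_samples.mean():.3f}, "
            f"95% credible interval = ({lo:.3f}, {hi:.3f})")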

  5. Development and testing of the infrared radiometer for the Mariner Venus/Mercury 1973 spacecraft

    NASA Technical Reports Server (NTRS)

    Clarke, T. C.

    1975-01-01

    The science objectives, development history, functional description, and testing of the Mariner Venus/Mercury 1973 infrared radiometer are discussed. Included in the functional description section is a thorough discussion of the IRR optical system, electronic operation, and thermal control. Signal development and its conversion to engineering units are traced, starting with the radiant space object, passing through the IRR optics and electronics, and culminating with data number development and interpretation. The test program section includes discussion of IRR calibration and alignment verification. Finally, the problems and failures encountered by the IRR during the period of its development and testing are reviewed.

  6. Development of a preprototype vapor compression distillation water recovery subsystem

    NASA Technical Reports Server (NTRS)

    Johnson, K. L.

    1978-01-01

    The activities involved in the design, development, and test of a preprototype vapor compression distillation water recovery subsystem are described. This subsystem, part of a larger regenerative life support evaluation system, is designed to recover usable water from urine, urinal rinse water, and concentrated shower and laundry brine collected from three space vehicle crewmen for a period of 180 days without resupply. Details of preliminary design and testing as well as component developments are included. Trade studies, considerations leading to concept selections, problems encountered, and test data are also presented. The rework of existing hardware, subsystem development including computer programs, assembly verification, and comprehensive baseline test results are discussed.

  7. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  8. Apollo experience report: Voice communications techniques and performance

    NASA Technical Reports Server (NTRS)

    Dabbs, J. H.; Schmidt, O. L.

    1972-01-01

    The primary performance requirement of the spaceborne Apollo voice communications system is percent word intelligibility, which is related to other link/channel parameters. The effect of percent word intelligibility on voice channel design and a description of the verification procedures are included. Development and testing performance problems and the techniques used to solve them are also discussed. Voice communications performance requirements should be comprehensive and easily verifiable; the total system must be considered in component design; and the necessity of voice processing and its associated effects on noise, distortion, and cross talk should be examined carefully.

  9. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops

    EPA Pesticide Factsheets

    This generic verification protocol provides a detailed method to conduct and report results from a verification test of pesticide application technologies that can be used to evaluate these technologies for their potential to reduce spray drift.

  10. Verification of performance specifications for a US Food and Drug Administration-approved molecular microbiology test: Clostridium difficile cytotoxin B using the Becton, Dickinson and Company GeneOhm Cdiff assay.

    PubMed

    Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C

    2012-01-01

    US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests formerly performed for research purposes only into much wider use in clinical microbiology laboratories. To provide an example of laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B compared with the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.

  11. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone, owing to the intensive and customized process necessary to verify each individual component model, and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Reduction of Nitrogen in Domestic Wastewater from Individual Residential Homes. BioConcepts, Inc. ReCip® RTS ~ 500 System

    EPA Science Inventory

    Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...

  13. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  14. Optimization and Verification of Droplet Digital PCR Event-Specific Methods for the Quantification of GM Maize DAS1507 and NK603.

    PubMed

    Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir

    2018-05-01

    In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining popularity. Among many other applications, this technique can be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform; however, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of the maize events DAS1507 and NK603 to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification" (2011), Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The digital PCR methods performed equally well as or better than the qPCR methods. The optimized ddPCR methods confirmed their suitability for GMO determination in food and feed.
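
    Absolute quantification in ddPCR rests on Poisson statistics over the droplet partitions: if a fraction p of droplets is positive, the mean number of target copies per droplet is λ = -ln(1 - p). The sketch below shows this standard calculation with made-up droplet counts and a nominal droplet volume; the numbers are illustrative, not from the study.

      # Standard ddPCR Poisson quantification (illustrative numbers).
      import math

      def copies_per_ul(n_positive, n_total, droplet_volume_ul=0.85e-3):
          p = n_positive / n_total
          lam = -math.log(1.0 - p)          # mean copies per droplet (Poisson)
          return lam / droplet_volume_ul    # copies per microliter of reaction

      event = copies_per_ul(n_positive=2150, n_total=15000)      # transgene assay
      reference = copies_per_ul(n_positive=8400, n_total=15000)  # taxon-specific assay

      print(f"GM content ~ {100 * event / reference:.2f}% (copy/copy ratio)")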

  15. Verification of a Remaining Flying Time Prediction System for Small Electric Aircraft

    NASA Technical Reports Server (NTRS)

    Hogge, Edward F.; Bole, Brian M.; Vazquez, Sixto L.; Celaya, Jose R.; Strom, Thomas H.; Hill, Boyd L.; Smalling, Kyle M.; Quach, Cuong C.

    2015-01-01

    This paper addresses the problem of building trust in online predictions of a battery powered aircraft's remaining available flying time. A set of ground tests is described that make use of a small unmanned aerial vehicle to verify the performance of remaining flying time predictions. The algorithm verification procedure described here uses a fully functional vehicle that is restrained to a platform for repeated run-to-functional-failure experiments. The vehicle under test is commanded to follow a predefined propeller RPM profile in order to create battery demand profiles similar to those expected in flight. The fully integrated aircraft is repeatedly operated until the charge stored in powertrain batteries falls below a specified lower-limit. The time at which the lower-limit on battery charge is crossed is then used to measure the accuracy of remaining flying time predictions. Accuracy requirements are considered in this paper for an alarm that warns operators when remaining flying time is estimated to fall below a specified threshold.

  16. Spatial Evaluation and Verification of Earthquake Simulators

    NASA Astrophysics Data System (ADS)

    Wilson, John Max; Yoder, Mark R.; Rundle, John B.; Turcotte, Donald L.; Schultz, Kasey W.

    2017-06-01

    In this paper, we address the problem of verifying earthquake simulators with observed data. Earthquake simulators are a class of computational simulations which attempt to mirror the topological complexity of fault systems on which earthquakes occur. In addition, the physics of friction and elastic interactions between fault elements are included in these simulations. Simulation parameters are adjusted so that natural earthquake sequences are matched in their scaling properties. Physically based earthquake simulators can generate many thousands of years of simulated seismicity, allowing for a robust capture of the statistical properties of large, damaging earthquakes that have long recurrence time scales. Verification of simulations against currently observed earthquake seismicity is necessary, and following past simulator and forecast-model verification methods, we address the challenges of applying spatial forecast verification to simulators; namely, that simulator outputs are confined to the modeled faults, while observed earthquake epicenters often occur off of known faults. We present two methods for addressing this discrepancy: a simplistic approach whereby observed earthquakes are shifted to the nearest fault element, and a smoothing method based on the power laws of the epidemic-type aftershock sequence (ETAS) model, which distributes the seismicity of each simulated earthquake over the entire test region at a rate that decays with epicentral distance. To test these methods, a receiver operating characteristic plot was produced by comparing the rate maps to observed m>6.0 earthquakes in California since 1980. We found that the nearest-neighbor mapping produced poor forecasts, while the ETAS power-law method produced rate maps that agreed reasonably well with observations.
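
    A minimal version of the power-law smoothing idea is sketched below: each simulated event's unit rate is spread over a grid with a kernel that decays as a power of epicentral distance, then normalized so the event still contributes a total rate of one. The kernel form and exponent here are generic assumptions, not the paper's fitted ETAS parameters.

      # Generic power-law rate smoothing over a grid (assumed kernel, not the
      # paper's fitted ETAS parameters).
      import numpy as np

      def smoothed_rate_map(event_xy, grid_x, grid_y, d0=5.0, q=1.5):
          """Spread each event's unit rate over the grid with a power-law kernel."""
          gx, gy = np.meshgrid(grid_x, grid_y)
          rate = np.zeros_like(gx, dtype=float)
          for (ex, ey) in event_xy:
              d = np.hypot(gx - ex, gy - ey)
              kernel = (1.0 + d / d0) ** (-q)   # decays with epicentral distance
              rate += kernel / kernel.sum()     # each event contributes total rate 1
          return rate

      events = [(30.0, 40.0), (32.5, 41.0)]     # simulated epicenters (km)
      rmap = smoothed_rate_map(events, np.arange(0, 100, 1.0), np.arange(0, 100, 1.0))
      print(rmap.sum())   # ~2.0: total simulated rate is preserved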

  17. Voltage verification unit

    DOEpatents

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  18. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders

    EPA Science Inventory

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  19. Offline signature verification using convolution Siamese network

    NASA Astrophysics Data System (ADS)

    Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin

    2018-04-01

    This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods, which treat feature extraction and metric learning as two independent stages, we adopt a deep-learning-based framework that combines the two stages and can be trained end-to-end. Experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
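
    The core of such an approach is a shared embedding network trained with a distance-based loss on genuine/forged pairs. The sketch below is a generic contrastive-loss Siamese setup in PyTorch, written as an assumed illustration of the idea rather than the authors' architecture or hyperparameters.

      # Generic Siamese network with contrastive loss (an assumed illustration,
      # not the paper's architecture or training setup).
      import torch
      import torch.nn as nn
      import torch.nn.functional as F

      class Embedder(nn.Module):
          def __init__(self):
              super().__init__()
              self.conv = nn.Sequential(
                  nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
                  nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2))
              self.fc = nn.Linear(32 * 16 * 16, 64)   # for 64x64 input images

          def forward(self, x):
              return self.fc(self.conv(x).flatten(1))

      def contrastive_loss(z1, z2, same, margin=1.0):
          # same = 1 for a genuine pair, 0 for a (genuine, forgery) pair
          d = F.pairwise_distance(z1, z2)
          return (same * d.pow(2) +
                  (1 - same) * F.relu(margin - d).pow(2)).mean()

      net = Embedder()
      a, b = torch.randn(8, 1, 64, 64), torch.randn(8, 1, 64, 64)
      labels = torch.randint(0, 2, (8,)).float()
      loss = contrastive_loss(net(a), net(b), labels)
      loss.backward()   # at verification time, threshold the embedding distance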

  20. Verifiable fault tolerance in measurement-based quantum computation

    NASA Astrophysics Data System (ADS)

    Fujii, Keisuke; Hayashi, Masahito

    2017-09-01

    Quantum systems, in general, cannot be simulated efficiently by a classical computer, and hence are useful for solving certain mathematical problems and simulating quantum many-body systems. This also implies, unfortunately, that verification of the output of a quantum system is not trivial, since predicting the output is exponentially hard. A further problem is that quantum systems are very sensitive to noise and thus need error correction. Here, we propose a framework for verification of the output of fault-tolerant quantum computation in a measurement-based model. In contrast to existing analyses of fault tolerance, we do not assume any noise model on the resource state; instead, an arbitrary resource state is tested using only single-qubit measurements to verify whether or not the output of measurement-based quantum computation on it is correct. Verifiability is achieved by a constant-time repetition of the original measurement-based quantum computation in appropriate measurement bases. Since full characterization of quantum noise is exponentially hard for large-scale quantum computing systems, our framework provides an efficient way to practically verify experimental quantum error correction.

  1. Measurement of a True VO2max during a Ramp Incremental Test Is Not Confirmed by a Verification Phase.

    PubMed

    Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H

    2018-01-01

    The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (VO2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (VO2) during a RI test to that obtained during a verification phase aimed to confirm attainment of VO2max. Sixty-one healthy males [31 older (O), 65 ± 5 yrs; 30 younger (Y), 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant-load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest VO2 after the RI test (39.8 ± 11.5 mL·kg⁻¹·min⁻¹) and the verification phase (40.1 ± 11.2 mL·kg⁻¹·min⁻¹) were not different (p = 0.33) and were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or by the intensity of the verification phase. Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg⁻¹·min⁻¹, not different from 0) and a precision of ±1.56 mL·kg⁻¹·min⁻¹ between measures. This study indicated that a verification phase does not reveal an under-estimation of VO2max derived from a RI test, in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal VO2 elicited during the RI test and the verification phase. Thus a verification phase does not add any validation to the determination of VO2max. Therefore, the recommendation that a verification phase should become a gold-standard procedure, although initially appealing, is not supported by the experimental data.
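
    The Bland-Altman statistics quoted above are straightforward to compute: the bias is the mean of the paired differences and the limits of agreement are bias ± 1.96 SD. The sketch below shows the generic calculation on made-up paired VO2 values (not the study's data).

      # Generic Bland-Altman agreement statistics (made-up data, not the study's).
      import numpy as np

      ri  = np.array([38.2, 45.1, 52.0, 33.4, 41.8])   # VO2 from ramp test
      ver = np.array([38.0, 45.5, 51.6, 33.9, 41.7])   # VO2 from verification phase

      diff = ver - ri
      bias = diff.mean()                   # systematic difference between methods
      sd = diff.std(ddof=1)
      loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # 95% limits of agreement

      print(f"bias = {bias:.2f} mL/kg/min, limits of agreement = "
            f"({loa[0]:.2f}, {loa[1]:.2f})")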

  2. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    PubMed Central

    Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-01-01

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem. PMID:29597286

  3. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    PubMed

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  4. Development of syntax of intuition-based learning model in solving mathematics problems

    NASA Astrophysics Data System (ADS)

    Yeni Heryaningsih, Nok; Khusna, Hikmatul

    2018-01-01

    The aim of the research was to produce a syntax for an Intuition-Based Learning (IBL) model for solving mathematics problems, to improve students' mathematics achievement, that is valid, practical, and effective. The subjects of the research were two classes of grade XI students of SMAN 2 Sragen, Central Java. The research was of the Research and Development (R&D) type. The development process adopted the Plomp and Borg & Gall development models: a preliminary investigation step, a design step, a realization step, and an evaluation and revision step. The development steps were as follows: (1) In the preliminary investigation step, information was collected and theories were studied about intuition, learning model development, student conditions, and topic analysis; (2) A syntax was designed that could bring up intuition in solving mathematics problems, and research instruments were then designed, with several phases that could bring up intuition: a preparation phase, an incubation phase, an illumination phase, and a verification phase; (3) The syntax of the Intuition-Based Learning model that had been designed was realized as the first draft; (4) The first draft was validated by the validators; (5) The syntax of the Intuition-Based Learning model was tested in classrooms to determine its effectiveness; (6) A Focus Group Discussion (FGD) was conducted to evaluate the results of testing the syntax in the classrooms, and the syntax of the IBL model was then revised. The result of the research was a syntax of the IBL model for solving mathematics problems that is valid, practical, and effective. The syntax of the IBL model in the classroom was: (1) Opening with apperception, motivation, and building students' positive perceptions; (2) The teacher explains the material generally; (3) Group discussion of the material; (4) The teacher gives students mathematics problems; (5) Individual exercises to solve mathematics problems, with steps that could bring up students' intuition: preparation, incubation, illumination, and verification; (6) Closure with a review of what students have learned or the assignment of homework.

  5. The influence of cardiorespiratory fitness on strategic, behavioral, and electrophysiological indices of arithmetic cognition in preadolescent children

    PubMed Central

    Moore, R. Davis; Drollette, Eric S.; Scudder, Mark R.; Bharij, Aashiv; Hillman, Charles H.

    2014-01-01

    The current study investigated the influence of cardiorespiratory fitness on arithmetic cognition in forty 9–10 year old children. Measures included a standardized mathematics achievement test to assess conceptual and computational knowledge, self-reported strategy selection, and an experimental arithmetic verification task (including small and large addition problems), which afforded the measurement of event-related brain potentials (ERPs). No differences in math achievement were observed as a function of fitness level, but all children performed better on math concepts relative to math computation. Higher fit children reported using retrieval more often to solve large arithmetic problems, relative to lower fit children. During the arithmetic verification task, higher fit children exhibited superior performance for large problems, as evidenced by greater d' scores, while all children exhibited decreased accuracy and longer reaction time for large relative to small problems, and incorrect relative to correct solutions. On the electrophysiological level, modulations of early (P1, N170) and late ERP components (P3, N400) were observed as a function of problem size and solution correctness. Higher fit children exhibited selective modulations for N170, P3, and N400 amplitude relative to lower fit children, suggesting that fitness influences symbolic encoding, attentional resource allocation and semantic processing during arithmetic tasks. The current study contributes to the fitness-cognition literature by demonstrating that the benefits of cardiorespiratory fitness extend to arithmetic cognition, which has important implications for the educational environment and the context of learning. PMID:24829556

  6. Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2002-01-01

    Ground-based testing is a critical and costly part of component, assembly, and system verification of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold-operation optical throughput are supplemented by segments for analytical verification of specific structural, thermal, and optical parameters. Drawing on integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.

  7. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include exposure to extreme temperatures, temperature cycling, thermal-balance testing, and thermal-vacuum testing.

  8. Space telescope observatory management system preliminary test and verification plan

    NASA Technical Reports Server (NTRS)

    Fritz, J. S.; Kaldenbach, C. F.; Williams, W. B.

    1982-01-01

    The preliminary plan for the Space Telescope Observatory Management System Test and Verification (TAV) is provided. Methodology, test scenarios, test plans and procedure formats, schedules, and the TAV organization are included. Supporting information is provided.

  9. Space shuttle main engine controller assembly, phase C-D. [with lagging system design and analysis

    NASA Technical Reports Server (NTRS)

    1973-01-01

    System design and system analysis and simulation are slightly behind schedule, while design verification testing has improved. Input/output circuit design has improved, but digital computer unit (DCU) and mechanical design continue to lag. Part procurement was impacted by delays in printed circuit board and assembly drawing releases. These delays are the result of problems in generating suitable printed-circuit artwork for the very complex and high-density multilayer boards.

  10. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA

    2011-01-25

    A recursive verification protocol that reduces the time variance due to delays in the network, by putting the subject node at most one hop from the verifier node, provides an efficient manner of testing wireless sensor nodes. Since the software signatures are time based, recursive testing gives a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and the process continues until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
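
    The hop-by-hop structure of the protocol can be sketched as a traversal in which each verified node becomes the verifier for the next, and a failed node forces a detour. The code below is a schematic of that control flow over an in-memory graph; check_signature is a hypothetical stand-in for the patent's time-based software attestation.

      # Schematic of hop-by-hop recursive verification over a sensor graph.
      # check_signature() is a hypothetical stand-in for the time-based check.
      def check_signature(verifier, subject):
          return subject not in {"n3"}      # pretend node n3 fails attestation

      def verify_network(graph, root):
          verified, failed, frontier = {root}, set(), [root]
          while frontier:
              verifier = frontier.pop()
              for subject in graph[verifier]:          # one hop away: low variance
                  if subject in verified or subject in failed:
                      continue                         # never test a node twice
                  if check_signature(verifier, subject):
                      verified.add(subject)
                      frontier.append(subject)         # subject becomes a verifier
                  else:
                      failed.add(subject)              # downstream goes via others
          return verified, failed

      graph = {"n0": ["n1", "n2"], "n1": ["n3"], "n2": ["n3", "n4"],
               "n3": ["n5"], "n4": ["n5"], "n5": []}
      ok, bad = verify_network(graph, "n0")
      print(sorted(ok), sorted(bad))   # n5 is still reached via n4, avoiding n3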

  11. Vortex generator design for aircraft inlet distortion as a numerical optimization problem

    NASA Technical Reports Server (NTRS)

    Anderson, Bernhard H.; Levy, Ralph

    1991-01-01

    Aerodynamic compatibility of aircraft/inlet/engine systems is a difficult design problem for aircraft that must operate in many different flight regimes. Takeoff, subsonic cruise, supersonic cruise, transonic maneuvering, and high-altitude loiter each place different constraints on inlet design. Vortex generators, small wing-like sections mounted on the inside surfaces of the inlet duct, are used to control flow separation and engine-face distortion. The design of vortex generator installations in an inlet is defined as a problem addressable by numerical optimization techniques. A performance parameter is suggested to account for both inlet distortion and total pressure loss at a series of design flight conditions. The resulting optimization problem is difficult, since some of the design parameters take on integer values. If numerical procedures could be used to reduce multimillion-dollar development test programs to a small set of verification tests, numerical optimization could have a significant impact on both the cost and the elapsed time of designing new aircraft.
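
    The mixed-integer character of the problem (an integer number of vortex generators combined with continuous or discretized placement parameters) is illustrated by the toy search below. The objective function is invented for illustration; it is not the paper's performance parameter.

      # Toy mixed-integer design search (invented objective, not the paper's
      # performance parameter).
      import itertools

      def performance(n_vg, angle_deg):
          # invented stand-in: distortion falls with count, pressure loss rises
          distortion = 1.0 / (1 + 0.8 * n_vg) + 0.002 * (angle_deg - 16) ** 2
          pressure_loss = 0.01 * n_vg + 0.0005 * angle_deg
          return distortion + pressure_loss     # combined figure of merit

      candidates = itertools.product(range(0, 13), [8, 12, 16, 20, 24])
      best = min(candidates, key=lambda c: performance(*c))
      print(f"best: {best[0]} vortex generators at {best[1]} deg, "
            f"J = {performance(*best):.4f}")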

  12. DECHADE: DEtecting slight Changes with HArd DEcisions in Wireless Sensor Networks

    NASA Astrophysics Data System (ADS)

    Ciuonzo, D.; Salvo Rossi, P.

    2018-07-01

    This paper focuses on the problem of change detection through a Wireless Sensor Network (WSN) whose nodes report only binary decisions (on the presence/absence of a certain event to be monitored), due to bandwidth/energy constraints. The resulting problem can be modelled as testing the equality of samples drawn from independent Bernoulli probability mass functions, when the bit probabilities under both hypotheses are not known. Both One-Sided (OS) and Two-Sided (TS) tests are considered, with reference to: (i) identical bit probability (a homogeneous scenario), (ii) different per-sensor bit probabilities (a non-homogeneous scenario) and (iii) regions with identical bit probability (a block-homogeneous scenario) for the observed samples. The goal is to provide a systematic framework collecting a plethora of viable detectors (designed via theoretically founded criteria) which can be used for each instance of the problem. Finally, verification of the derived detectors in two relevant WSN-related problems is provided to show the appeal of the proposed framework.
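
    In the homogeneous two-sided case, testing whether the bit probability changed reduces to testing equality of two Bernoulli samples, for example with a generalized likelihood ratio statistic. The sketch below implements that textbook test; it is a generic construction, not one of the detectors derived in the paper.

      # Textbook GLRT for equality of two Bernoulli samples (generic; not the
      # paper's detectors).
      import math

      def bernoulli_loglik(k, n, p):
          if p in (0.0, 1.0):               # degenerate boundary cases
              return 0.0 if k == n * p else -math.inf
          return k * math.log(p) + (n - k) * math.log(1.0 - p)

      def glrt_statistic(k1, n1, k2, n2):
          """2 * log-likelihood ratio; ~chi-square(1) under H0: p1 == p2."""
          p1, p2 = k1 / n1, k2 / n2          # unrestricted MLEs
          p0 = (k1 + k2) / (n1 + n2)         # pooled MLE under H0
          return 2.0 * (bernoulli_loglik(k1, n1, p1) + bernoulli_loglik(k2, n2, p2)
                        - bernoulli_loglik(k1, n1, p0) - bernoulli_loglik(k2, n2, p0))

      # 100 sensor decisions before vs. after a suspected change:
      t = glrt_statistic(k1=12, n1=100, k2=27, n2=100)
      print(f"GLRT statistic = {t:.2f}")   # compare to a chi-square(1) threshold, e.g. 3.84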

  13. 76 FR 75878 - Information Collection Being Submitted to the Office of Management and Budget for Review and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ...: 3060-0329. Title: Section 2.955, Equipment Authorization-Verification (Retention of Records). Form No.... Section 2.955 describes for each equipment device subject to verification, the responsible party, as shown... performing the verification testing. The Commission may request additional information regarding the test...

  14. Orbit attitude processor. STS-1 bench program verification test plan

    NASA Technical Reports Server (NTRS)

    Mcclain, C. R.

    1980-01-01

    A plan for the static verification of the STS-1 ATT PROC ORBIT software requirements is presented. The orbit version of the SAPIENS bench program is used to generate the verification data. A brief discussion of the simulation software and flight software modules is presented along with a description of the test cases.

  15. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications NOx and N2O... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  16. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications NOx and N2O... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  17. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications NOx and N2O... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  18. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications NOx and N2O... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  19. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops (Version 1.4)

    EPA Science Inventory

    This generic verification protocol provides a detailed method for conducting and reporting results from verification testing of pesticide application technologies. It can be used to evaluate technologies for their potential to reduce spray drift, hence the term “drift reduction t...

  20. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    NASA Astrophysics Data System (ADS)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on several problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code has been performed for two analytical problems. The first is a point heat source with exponentially decreasing heat output; the second is a linear heat source with similar behavior. The analytical solutions have been obtained by the authors. These problems were chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste. Verification was performed for several meshes with different resolutions. Good convergence between analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in rock at a depth of several hundred meters, with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste heats the containers, engineered safety barriers and host rock. Maximum temperatures and the corresponding times of their establishment have been determined.
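
    For the first benchmark type, a closed-form reference can be built by superposing the conduction Green's function over the decaying source history. The sketch below numerically evaluates that convolution for a point source with exponentially decaying power in an infinite medium; the material values are placeholders, and the construction is the generic textbook one rather than the authors' derivation.

      # Temperature rise from a point heat source with exponentially decaying
      # power in an infinite medium, by convolving the instantaneous-source
      # Green's function (generic construction; placeholder material values).
      import math
      from scipy.integrate import quad

      rho_c = 2.0e6       # volumetric heat capacity, J/(m^3*K)  (placeholder)
      alpha = 1.0e-6      # thermal diffusivity, m^2/s           (placeholder)
      Q0    = 1000.0      # initial source power, W              (placeholder)
      lam   = math.log(2) / (30 * 365.25 * 86400)    # e.g. ~30-year half-life

      def dT(r, t):
          def integrand(tau):
              s = t - tau                              # age of the heat pulse
              g = math.exp(-r * r / (4 * alpha * s)) / (4 * math.pi * alpha * s) ** 1.5
              return Q0 * math.exp(-lam * tau) * g / rho_c
          val, _ = quad(integrand, 0.0, t, limit=200)
          return val

      t = 10 * 365.25 * 86400                          # 10 years, in seconds
      print(f"dT at r = 5 m after 10 years: {dT(5.0, t):.3f} K")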

  1. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH - PERKIN-ELMER PHOTOVAC, INC. VOYAGOR

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. Reports document the performa...

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - GAS CHROMATOGRAPH/MASS SPECTROMETER - INFICON, INC. HAPSITE

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH - SENTEX SYSTEMS, INC. SCENTOGRAPH PLUS II

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  5. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media ( tested May 2007)

    EPA Science Inventory

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  6. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  7. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy, and economical way. A description of the architecture of the application and of the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation that distributes Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures, and dose distribution files in DICOM format. Tests have been designed to determine, for the different tasks, the most suitable type of virtual machine from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models, and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default types for the Worker Roles and the Reducer Role, respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when the uncertainty requirements were relaxed to 4%. Advantages such as high computational power, scalability, easy access, and a pay-per-use model make cloud-based Monte Carlo solutions like the one presented in this work an important step toward solving the long-standing problem of truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
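    A generic sketch of the reduce step in such a map/reduce Monte Carlo scheme (plain Python, not CloudMC's actual code): each worker returns a batch dose tally, and the reducer combines them into a mean dose and its statistical uncertainty.

```python
import numpy as np

def reduce_batches(batch_doses):
    """Combine per-batch dose arrays (one per worker) into a mean dose and
    its relative standard uncertainty -- the usual batch-statistics
    reduction in Monte Carlo dose verification."""
    d = np.stack(batch_doses)            # shape (n_batches, n_voxels)
    n = d.shape[0]
    mean = d.mean(axis=0)
    sem = d.std(axis=0, ddof=1) / np.sqrt(n)   # standard error of the mean
    with np.errstate(divide="ignore", invalid="ignore"):
        rel_unc = np.where(mean > 0, sem / mean, 0.0)
    return mean, rel_unc

# Toy example: 10 batches of a 3-voxel dose tally
rng = np.random.default_rng(0)
batches = [1.0 + 0.05 * rng.standard_normal(3) for _ in range(10)]
mean, rel = reduce_batches(batches)
print(mean, rel)   # add batches until 2*rel < 0.02 for a 2% (2-sigma) goal
```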

  8. General Dynamic (GD) Launch Waveform On-Orbit Performance Report

    NASA Technical Reports Server (NTRS)

    Briones, Janette C.; Shalkhauser, Mary Jo

    2014-01-01

    The purpose of this report is to present the results from the GD SDR on-orbit performance testing using the launch waveform over TDRSS. The tests include the evaluation of well-tested waveform modes, the operation of RF links that are expected to have high margins, the verification of forward and return link operation (including full duplex), the verification of non-coherent operational modes, and the verification of the radio's at-launch operational frequencies. This report also outlines the launch waveform tests conducted and comparisons to the results obtained from ground testing.

  9. Verification Testing: Meet User Needs Figure of Merit

    NASA Technical Reports Server (NTRS)

    Kelly, Bryan W.; Welch, Bryan W.

    2017-01-01

    Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable the test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision was not available, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements, and rationale for the different components of VV&C. The intention was that this would allow people receiving M&S software to clearly understand it and have data from the past development effort, which in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform Verification on several MATLAB® (The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts. This is intended to prevent the code from attempting incorrect or impossible calculations. Additionally, this project will look at the code generally and note inconsistencies, redundancies, and other aspects that may become problematic or slow down the code's run time. Scripts lacking documentation will also be commented and cataloged.
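    A minimal sketch of what such input verification can look like (Python rather than MATLAB; the input names, limits, and units are hypothetical, not those of the actual scripts):

```python
# Hypothetical specification table for one script's inputs:
# name -> (expected dtype, valid range, units)
INPUT_SPEC = {
    "frequency_GHz": (float, (1.0, 60.0), "GHz"),
    "elevation_deg": (float, (0.0, 90.0), "degrees"),
    "num_samples":   (int,   (1, 1_000_000), "count"),
}

def verify_inputs(**kwargs):
    """Reject calls that would force the code to attempt incorrect or
    impossible calculations: unknown name, wrong datatype, or
    out-of-range value."""
    for name, value in kwargs.items():
        if name not in INPUT_SPEC:
            raise KeyError(f"unknown input: {name}")
        dtype, (lo, hi), units = INPUT_SPEC[name]
        if not isinstance(value, dtype):
            raise TypeError(f"{name}: expected {dtype.__name__}, "
                            f"got {type(value).__name__}")
        if not (lo <= value <= hi):
            raise ValueError(f"{name}={value} outside [{lo}, {hi}] {units}")

verify_inputs(frequency_GHz=26.5, elevation_deg=10.0, num_samples=512)  # ok
```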

  10. EOSlib, Version 3

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Woods, Nathan; Menikoff, Ralph

    2017-02-03

    Equilibrium thermodynamics underpins many of the technologies used throughout theoretical physics, yet verification of the various theoretical models in the open literature remains challenging. EOSlib provides a single, consistent, verifiable implementation of these models in a single, easy-to-use software package. It consists of three parts: a software library implementing various published equation-of-state (EOS) models; a database of fitting parameters for various materials for these models; and a number of useful utility functions for simplifying thermodynamic calculations such as computing Hugoniot curves or Riemann problem solutions. Ready availability of this library will enable reliable code-to-code testing of equation-of-state implementations, as well as a starting point for more rigorous verification work. EOSlib also provides a single, consistent API for its analytic and tabular EOS models, which simplifies the process of comparing models for a particular application.

  11. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  12. Formal verification of an oral messages algorithm for interactive consistency

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1992-01-01

    The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.

  13. Verification and Optimal Control of Context-Sensitive Probabilistic Boolean Networks Using Model Checking and Polynomial Optimization

    PubMed Central

    Hiraishi, Kunihiko

    2014-01-01

    One of the significant topics in systems biology is to develop control theory for gene regulatory networks (GRNs). In typical control of GRNs, expression of some genes is inhibited (or activated) by manipulating external stimuli and the expression of other genes. Control theory of GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, and gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for control theory of GRNs. PMID:24587766
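    A toy sketch of how a context-sensitive PBN evolves (Python; the genes, update rules, and probabilities are illustrative, not the WNT5A network from the paper):

```python
import random

# Toy CS-PBN with 3 genes and 2 contexts (constituent Boolean networks).
contexts = [
    {  # context 0: one Boolean update function per gene
        0: lambda x: x[1] and not x[2],
        1: lambda x: x[0] or x[2],
        2: lambda x: not x[0],
    },
    {  # context 1
        0: lambda x: x[2],
        1: lambda x: x[0] and x[1],
        2: lambda x: x[1] or not x[2],
    },
]
q = 0.1   # probability of switching context at each step
p = 0.01  # per-gene random perturbation probability

def step(state, ctx):
    if random.random() < q:                        # context switch
        ctx = random.randrange(len(contexts))
    nxt = [int(contexts[ctx][i](state)) for i in range(len(state))]
    nxt = [b ^ (random.random() < p) for b in nxt]  # gene perturbation
    return nxt, ctx

state, ctx = [1, 0, 1], 0
for _ in range(5):
    state, ctx = step(state, ctx)
    print(state, ctx)
```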

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PORTABLE GAS CHROMATOGRAPH ELECTRONIC SENSOR TECHNOLOGY MODEL 4100

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. As part of this program, the...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHOTOACOUSTIC SPECTROPHOTOMETER INNOVA AIR TECH INSTRUMENTS MODEL 1312 MULTI-GAS MONITOR

    EPA Science Inventory

    The U.S. Environmental Protection Agency, through the Environmental Technology Verification Program, is working to accelerate the acceptance and use of innovative technologies that improve the way the United States manages its environmental problems. This report documents demons...

  16. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described, incorporating verification tests, validation benchmarks, continuous integration, and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of the 2D Euler equations, the 3D Navier-Stokes equations, and turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results for laminar and turbulent flow past a NACA 0012 airfoil and the ONERA M6 wing are validated against experimental and numerical data.
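    A minimal sketch of the MMS workflow on a model problem (Python with SymPy; a 1D advection-diffusion equation stands in for the Euler and Navier-Stokes systems actually verified in the paper):

```python
import sympy as sp

# Method of Manufactured Solutions: choose a smooth exact solution,
# substitute it into the PDE, and use the residual as a source term.
x, t = sp.symbols("x t")
a, nu = sp.Rational(1), sp.Rational(1, 10)   # advection speed, viscosity

u_exact = sp.sin(sp.pi * x) * sp.cos(t)      # manufactured solution

# PDE: u_t + a*u_x - nu*u_xx = s  =>  s is the manufactured source term
s = sp.diff(u_exact, t) + a * sp.diff(u_exact, x) - nu * sp.diff(u_exact, x, 2)
print(sp.simplify(s))

# The solver is then run with source s and boundary/initial data taken from
# u_exact; the error norm of u_h - u_exact should shrink at the scheme's
# formal order under systematic grid refinement.
```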

  17. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  18. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  19. The Use of Remote Sensing Satellites for Verification in International Law

    NASA Astrophysics Data System (ADS)

    Hettling, J. K.

    The contribution addresses a very sensitive topic that is currently gaining significance and importance in the international community. It involves questions of international law as well as new developments and decisions in international politics. The paper begins with the meaning and current status of verification in international law, as well as the legal basis of satellite remote sensing in international treaties and resolutions. For the verification part, this means giving a definition of verification and naming its fields of application and the different means of verification. For the remote sensing part, it involves identifying the relevant provisions in the Outer Space Treaty and the United Nations General Assembly Principles on Remote Sensing. Furthermore, practical examples are examined: to what extent have remote sensing satellites been used to verify international obligations? Are there treaties which would profit considerably from the use of remote sensing satellites? In this respect, various examples can be considered, such as the ABM Treaty (even though it is now out of force), the SALT and START Agreements, the Chemical Weapons Convention, and the Comprehensive Test Ban Treaty. It is also noted that NGOs have started to verify international conventions; for example, Landmine Monitor is verifying the Mine-Ban Convention. Apart from verifying arms control and disarmament treaties, satellites can also strengthen the negotiation of peace agreements (such as the Dayton Peace Talks) and help prevent international conflicts from arising. Verification has played an increasingly prominent role in high-profile UN operations. Verification and monitoring can be applied to the whole range of elements that constitute a peace implementation process, from the military aspects through electoral monitoring and human rights monitoring, from negotiating an accord to finally monitoring it. Last but not least, the problem of enforcing international obligations needs to be addressed, especially the dependence of international law on the will of political leaders and their respective national interests.

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT; UV DISINFECTION FOR REUSE APPLICATION, AQUIONICS, INC. BERSONINLINE 4250 UV SYSTEM

    EPA Science Inventory

    Verification testing of the Aquionics, Inc. bersonInLine® 4250 UV System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills Wastewater Treatment Plant test site in Parsippany, New Jersey. Two full-scale reactors were mounted in series. T...

  1. A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems

    DTIC Science & Technology

    2016-03-01

    insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of ... licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional ... language requirements to testable (preferably machine-testable) specifications • Design of architectures that treat development and verification of

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: UV DISINFECTION FOR REUSE APPLICATIONS, ONDEO DEGREMONT, INC., AQUARAY® 40 HO VLS DISINFECTION SYSTEM

    EPA Science Inventory

    Verification testing of the Ondeo Degremont, Inc. Aquaray® 40 HO VLS Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Three reactor modules were m...

  3. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  4. To thine own self be true? Clarifying the effects of identity discrepancies on psychological distress and emotions.

    PubMed

    Kalkhoff, Will; Marcussen, Kristen; Serpe, Richard T

    2016-07-01

    After many years of research across disciplines, it remains unclear whether people are more motivated to seek appraisals that accurately match self-views (self-verification) or are as favorable as possible (self-enhancement). Within sociology, mixed findings in identity theory have fueled the debate. A problem here is that a commonly employed statistical approach does not take into account the direction of a discrepancy between how we see ourselves and how we think others see us in terms of a given identity, yet doing so is critical for determining which self-motive is at play. We offer a test of three competing models of identity processes, including a new "mixed motivations" model where self-verification and self-enhancement operate simultaneously. We compare the models using the conventional statistical approach versus response surface analysis. The latter method allows us to determine whether identity discrepancies involving over-evaluation are as distressing as those involving under-evaluation. We use nationally representative data and compare results across four different identities and multiple outcomes. The two statistical approaches lead to the same conclusions more often than not and mostly support identity theory and its assumption that people seek self-verification. However, response surface tests reveal patterns that are mistaken as evidence of self-verification by conventional procedures, especially for the spouse identity. We also find that identity discrepancies have different effects on distress and self-conscious emotions (guilt and shame). Our findings have implications not only for research on self and identity across disciplines, but also for many other areas of research that incorporate these concepts and/or use difference scores as explanatory variables. Copyright © 2016 Elsevier Inc. All rights reserved.

  5. Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon

    NASA Astrophysics Data System (ADS)

    Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen

    Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. First, we define the concept space of a group of candidate hyponymy relations. Second, we analyze the concept space and define a set of hyponymy features based on the space structure. We then use these features to verify candidate hyponymy relations. Experimental results show that the method provides adequate verification of hyponymy.

  6. What Sensing Tells Us: Towards a Formal Theory of Testing for Dynamical Systems

    NASA Technical Reports Server (NTRS)

    McIlraith, Sheila; Scherl, Richard

    2005-01-01

    Just as actions can have indirect effects on the state of the world, so too can sensing actions have indirect effects on an agent's state of knowledge. In this paper, we investigate "what sensing actions tell us", i.e., what an agent comes to know indirectly from the outcome of a sensing action, given knowledge of its actions and state constraints that hold in the world. To this end, we propose a formalization of the notion of testing within a dialect of the situation calculus that includes knowledge and sensing actions. Realizing this formalization requires addressing the ramification problem for sensing actions. We formalize simple tests as sensing actions. Complex tests are expressed in the logic programming language Golog. We examine what it means to perform a test, and how the outcome of a test affects an agent's state of knowledge. Finally, we propose automated reasoning techniques for test generation and complex-test verification, under certain restrictions. The work presented in this paper is relevant to a number of application domains including diagnostic problem solving, natural language understanding, plan recognition, and active vision.

  7. Verification of component mode techniques for flexible multibody systems

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted into the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control (MMVC) Laboratory plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data were to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  8. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  9. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  10. Collaborative Localization and Location Verification in WSNs

    PubMed Central

    Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang

    2015-01-01

    Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming to solve the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
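    A minimal sketch of the virtual-force idea (Python; the force law, gains, and geometry are illustrative, not the paper's exact model): a node iteratively moves so that its distances to the anchors match the measured ranges.

```python
import numpy as np

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true_pos = np.array([3.0, 4.0])
ranges = np.linalg.norm(anchors - true_pos, axis=1)  # ideal measurements

pos = np.array([5.0, 5.0])        # initial guess
step = 0.1                        # refinement gain
for _ in range(200):
    force = np.zeros(2)
    for a, r in zip(anchors, ranges):
        v = pos - a
        d = np.linalg.norm(v)
        # each anchor pulls/pushes the node along the connecting line
        force += (r - d) * (v / d)
    pos += step * force           # incremental refinement
print(pos)                        # converges near (3, 4)
```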

  11. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment. In this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
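    The ingredients of CDV can be sketched outside any HDL. The toy Python below pairs constrained-random stimulus with a self-checking scoreboard and coverage bins; a stand-in saturating adder plays the DUT. This is a conceptual analogy, not UVM itself:

```python
import random

def dut_saturating_add(a, b):          # device under test (toy)
    return min(a + b, 255)

def reference_model(a, b):             # golden model for the scoreboard
    return min(a + b, 255)

coverage = {"no_saturation": 0, "saturation": 0, "corner_zero": 0}

for _ in range(1000):
    a, b = random.randint(0, 255), random.randint(0, 255)   # legal stimuli
    got, want = dut_saturating_add(a, b), reference_model(a, b)
    assert got == want, f"scoreboard mismatch: {a}+{b} -> {got} != {want}"
    # coverage monitors: which functional situations were exercised?
    coverage["saturation" if a + b > 255 else "no_saturation"] += 1
    if a == 0 or b == 0:
        coverage["corner_zero"] += 1

print(coverage)   # non-exercised bins reveal holes in the test plan
```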

  12. RELAP5-3D Resolution of Known Restart/Backup Issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesina, George L.; Anderson, Nolan A.

    2014-12-01

    The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a "verification file" that records double-precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
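    A sketch of how such a verification file can drive regression testing (Python; the file format and tolerance policy are hypothetical, not RELAP5-3D's):

```python
import math

def read_verification_file(path):
    """Parse 'name value' pairs of double-precision key-variable sums
    (hypothetical format)."""
    sums = {}
    with open(path) as f:
        for line in f:
            name, value = line.split()
            sums[name] = float(value)
    return sums

def compare(old_path, new_path, rel_tol=0.0):
    """Flag any key-variable sum that changed between two code versions;
    with rel_tol=0.0 the comparison demands bit-for-bit agreement."""
    old, new = read_verification_file(old_path), read_verification_file(new_path)
    return [(k, old[k], new.get(k))
            for k in old
            if k not in new
            or not math.isclose(old[k], new[k], rel_tol=rel_tol, abs_tol=0.0)]

# e.g. diffs = compare("base_run.sums", "updated_run.sums"); empty == no drift
```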

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yashchuk, V.V.; Conley, R.; Anderson, E.H.

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  14. The linear Boltzmann equation in slab geometry - Development and verification of a reliable and efficient solution

    NASA Technical Reports Server (NTRS)

    Stamnes, K.; Lie-Svendsen, O.; Rees, M. H.

    1991-01-01

    The linear Boltzmann equation can be cast in a form mathematically identical to the radiation-transport equation. A multigroup procedure is used to reduce the energy (or velocity) dependence of the transport equation to a series of one-speed problems. Each of these one-speed problems is equivalent to the monochromatic radiative-transfer problem, and existing software is used to solve this problem in slab geometry. The numerical code conserves particles in elastic collisions. Generic examples are provided to illustrate the applicability of this approach. Although this formalism can, in principle, be applied to a variety of test particle or linearized gas dynamics problems, it is particularly well-suited to study the thermalization of suprathermal particles interacting with a background medium when the thermal motion of the background cannot be ignored. Extensions of the formalism to include external forces and spherical geometry are also feasible.
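    For reference, each one-speed (group g) problem produced by the multigroup reduction takes, in slab geometry, the standard form below, which is formally identical to the monochromatic radiative-transfer equation (textbook notation, not quoted from the paper):

```latex
\mu\,\frac{\partial \psi_g}{\partial x}(x,\mu)
  + \sigma_{t,g}(x)\,\psi_g(x,\mu)
  = \frac{\sigma_{s,g}(x)}{2}\int_{-1}^{1} p_g(\mu,\mu')\,\psi_g(x,\mu')\,\mathrm{d}\mu'
  + S_g(x,\mu)
```

    Here \psi_g is the group angular flux, \mu the direction cosine, \sigma_{t,g} and \sigma_{s,g} the total and scattering cross sections, p_g the scattering phase function, and S_g the group source; existing radiative-transfer software solves exactly this form.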

  15. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities for the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Laboratory (SLAC). The LSST Camera, the sole instrument for LSST and currently under construction, comprises a 3.2-gigapixel imager and a three-element corrector with a 3.5-degree-diameter field of view. LSST Camera integration and test will take place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for integration and test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of the integration and verification hardware systems will be discussed, highlighting some of the anticipated technical challenges.

  16. Verification of EPA's " Preliminary remediation goals for radionuclides" (PRG) electronic calculator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stagich, B. H.

    The U.S. Environmental Protection Agency (EPA) requested an external, independent verification study of their “Preliminary Remediation Goals for Radionuclides” (PRG) electronic calculator. The calculator provides information on establishing PRGs for radionuclides at Comprehensive Environmental Response, Compensation, and Liability Act (CERCLA) sites with radioactive contamination (Verification Study Charge, Background). These risk-based PRGs set concentration limits using carcinogenic toxicity values under specific exposure conditions (PRG User’s Guide, Section 1). The purpose of this verification study is to ascertain that the computer code has no inherent numerical problems with obtaining solutions, as well as to ensure that the equations are programmed correctly.

  17. Implementation of Precision Verification Solvents on the External Tank

    NASA Technical Reports Server (NTRS)

    Campbell, M.

    1998-01-01

    This paper presents the Implementation of Precision Verification Solvents on the External Tank. The topics include: 1) Background; 2) Solvent Usages; 3) TCE (Trichloroethylene) Reduction; 4) Solvent Replacement Studies; 5) Implementation; 6) Problems Occurring During Implementation; and 7) Future Work. This paper is presented in viewgraph form.

  18. Hardware acceleration and verification of systems designed with hardware description languages (HDL)

    NASA Astrophysics Data System (ADS)

    Wisniewski, Remigiusz; Wegrzyn, Marek

    2005-02-01

    Hardware description languages (HDLs) allow creating bigger and bigger designs nowadays. The size of prototyped systems very often exceeds a million gates, so the verification process for such designs takes several hours or even days. This problem can be addressed by hardware acceleration of simulation.

  19. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification. (c...

  20. Idaho out-of-service verification field operational test

    DOT National Transportation Integrated Search

    2000-02-01

    The Out-of-Service Verification Field Operational Test Project was initiated in 1994. The purpose of the project was to test the feasibility of using sensors and a computerized tracking system to augment the ability of inspectors to monitor and contr...

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    V Yashchuk; R Conley; E Anderson

    Verification of the reliability of metrology data from high quality X-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [1,2] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [5]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize X-ray microscopes. Corresponding work with X-ray microscopes is in progress.

  2. Fabrication and verification testing of ETM 30 cm diameter ion thrusters

    NASA Technical Reports Server (NTRS)

    Collett, C.

    1977-01-01

    Engineering model designs and acceptance tests are described for the 800 and 900 series 30 cm electron bombardment thrusters. Modifications to the test console for a 1000 hr verification test were made. The 10,000 hr endurance test of the S/N 701 thruster is described, and post-test analysis results are included.

  3. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    EPA Science Inventory

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test Systems, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  4. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers the verification and protocol validation for distributed computer and communication systems using a computer aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, inclusive of the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  5. Environmental Testing Campaign and Verification of Satellite Deimos-2 at INTA

    NASA Astrophysics Data System (ADS)

    Hernandez, Daniel; Vazquez, Mercedes; Anon, Manuel; Olivo, Esperanza; Gallego, Pablo; Morillo, Pablo; Parra, Javier; Capraro; Luengo, Mar; Garcia, Beatriz; Villacorta, Pablo

    2014-06-01

    In this paper the environmental test campaign and verification of the DEIMOS-2 (DM2) satellite are presented and described. DM2 will be ready for launch in 2014. Firstly, a short description of the satellite is presented, including its physical characteristics and intended optical performance. DEIMOS-2 is a LEO earth-observation satellite that will provide high-resolution imaging services for agriculture, civil protection, environmental issues, disaster monitoring, climate change, urban planning, cartography, security, and intelligence. Then, the verification and test campaign carried out on the SM and FM models at INTA is described, including mechanical tests for the SM and climatic, mechanical, and electromagnetic compatibility tests for the FM. In addition, this paper includes centre-of-gravity and moment-of-inertia measurements for both models, and other verification activities carried out to ensure the satellite's health during launch and its in-orbit performance.

  6. Results from an Independent View on The Validation of Safety-Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.

    2013-08-01

    The independent verification and validation (IV&V) process has been key for decades and is considered in several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the testers' work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages/disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, MIRATECH CORPORATION GECO 3001 AIR/FUEL RATIO CONTROLLER

    EPA Science Inventory

    Details on the verification test design, measurement test procedures, and Quality Assurance/Quality Control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...

  8. Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round

    EPA Science Inventory

    Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round. Changes reflect performance of second round of testing at new location and with various changes to personnel. Additional changes reflect general improvements to the Version 1 test/QA...

  9. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  10. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  11. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  12. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  13. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  14. Built-in-Test Verification Techniques

    DTIC Science & Technology

    1987-02-01

    report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was ... Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was ... two year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    EPA Science Inventory

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE UV DISINFECTION OF SECONDARY EFFLUENTS, SUNTEC, INC. MODEL LPX200 DISINFECTION SYSTEM - 03/09/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, SEPTITECH, INC. MODEL 400 SYSTEM - 02/04/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SeptiTech Model 400 System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing was u...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, BIO-MICROBICS, INC., MODEL RETROFAST® 0.375

    EPA Science Inventory

    Verification testing of the Bio-Microbics RetroFAST® 0.375 System to determine the reduction of nitrogen in residential wastewater was conducted over a twelve-month period at the Mamquam Wastewater Technology Test Facility, located at the Mamquam Wastewater Treatment Plant. The R...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, AQUAPOINT, INC. BIOCLERE MODEL 16/12 - 02/02/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Aquapoint, Inc. (AQP) BioclereTM Model 16/12 was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  1. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  2. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  3. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  4. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  5. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
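    For intuition, the classic Begg and Greenes (1983) single-study correction, which assumes verification depends only on the index-test result, can be sketched as follows (Python). This is the older approach that Bayesian methods of this kind generalize, not the authors' meta-analysis model:

```python
def begg_greenes(n_pos, n_neg, v_pos_d, v_pos_nd, v_neg_d, v_neg_nd):
    """n_pos/n_neg: all patients testing positive/negative on the index test.
    v_*: verified counts by test result (pos/neg) and disease status (d/nd)."""
    p_d_pos = v_pos_d / (v_pos_d + v_pos_nd)   # P(D | T+), from verified subset
    p_d_neg = v_neg_d / (v_neg_d + v_neg_nd)   # P(D | T-), from verified subset
    tp, fn = n_pos * p_d_pos, n_neg * p_d_neg  # redistribute to all tested
    fp, tn = n_pos * (1 - p_d_pos), n_neg * (1 - p_d_neg)
    sens = tp / (tp + fn)
    spec = tn / (tn + fp)
    return sens, spec

# Toy data: 100 T+ patients (80 verified), 200 T- patients (40 verified)
print(begg_greenes(n_pos=100, n_neg=200,
                   v_pos_d=60, v_pos_nd=20, v_neg_d=4, v_neg_nd=36))
```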

  6. Test/QA Plan for Verification of Ozone Indicator Cards

    EPA Science Inventory

    This verification test will address ozone indicator cards (OICs) that provide short-term semi-quantitative measures of ozone concentration in ambient air. Testing will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Tec...

  7. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
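    A second-order convergence claim of this kind rests on the standard observed-order computation; a minimal sketch (Python, with made-up error norms, not the paper's data):

```python
import math

def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
    """Observed order of convergence from error norms on two successively
    refined grids (or time steps)."""
    return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

# Illustrative error norms against a manufactured solution at h, h/2, h/4
errors = [4.0e-3, 1.01e-3, 2.52e-4]
for ec, ef in zip(errors, errors[1:]):
    print(f"p = {observed_order(ec, ef):.2f}")   # ~2.0 for second order
```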

  8. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  9. Numerical Tests for the Problem of U-Pu Fuel Burnup in Fuel Rod and Polycell Models Using the MCNP Code

    NASA Astrophysics Data System (ADS)

    Muratov, V. G.; Lopatkin, A. V.

    An important aspect in the verification of the engineering techniques used in the safety analysis of MOX-fuelled reactors is the preparation of test calculations to determine nuclide composition variations under irradiation, and the analysis of burnup problem errors resulting from various factors, such as the effect of nuclear data uncertainties on nuclide concentration calculations. So far, no universally recognized tests have been devised. A calculation technique has been developed for solving the problem using up-to-date calculation tools and the latest versions of nuclear data libraries. Initially, in 1997, a code was drawn up in an effort under ISTC Project No. 116 to calculate the burnup in one VVER-1000 fuel rod using the MCNP code. Later on, the authors developed a computation technique that allows calculating fuel burnup in models of a fuel rod, a fuel assembly, or the whole reactor. It became possible to apply it to fuel burnup in all types of nuclear reactors and subcritical blankets.
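    The underlying burnup step is the solution of the Bateman equations dN/dt = AN over each irradiation interval; a toy sketch (Python/SciPy, illustrative three-nuclide chain under constant flux, not U-Pu data):

```python
import numpy as np
from scipy.linalg import expm

phi = 1.0e14                          # neutron flux, n/cm^2/s
sigma = np.array([1.0e-24, 2.0e-24])  # capture cross sections, cm^2
lam3 = 1.0e-8                         # decay constant of nuclide 3, 1/s

# Chain: N1 --(capture)--> N2 --(capture)--> N3 --(decay)--> out
A = np.array([
    [-sigma[0] * phi,             0.0,   0.0],
    [ sigma[0] * phi, -sigma[1] * phi,   0.0],
    [ 0.0,             sigma[1] * phi, -lam3],
])

N0 = np.array([1.0e22, 0.0, 0.0])     # initial concentrations, 1/cm^3
t = 3.0e7                             # ~1 year burnup step, s
N = expm(A * t) @ N0                  # matrix-exponential depletion step
print(N)   # to be compared against an MCNP-based burnup calculation
```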

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinh, Nam; Athe, Paridhi; Jones, Christopher

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of the required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM), which, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  11. Numerical simulation of an elastic structure behavior under transient fluid flow excitation

    NASA Astrophysics Data System (ADS)

    Afanasyeva, Irina N.; Lantsova, Irina Yu.

    2017-01-01

    This paper deals with the verification of a numerical technique for modeling fluid-structure interaction (FSI) problems. The configuration consists of an incompressible viscous fluid around an elastic structure in a channel; the external flow is laminar. Parametric calculations are performed using the software packages ANSYS CFX and ANSYS Mechanical. Different mesh deformation parameters and solver controls (time step, under-relaxation factor, number of iterations per coupling step) were tested. The results are presented in tables and plots in comparison with reference data.
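
    The coupling controls mentioned in this record (under-relaxation factor, iterations per coupling step) belong to a partitioned FSI scheme in which fluid and structural solvers exchange interface data until convergence within each time step. The sketch below shows that iteration generically; fluid_solve and solid_solve are hypothetical stand-ins, not the ANSYS CFX or ANSYS Mechanical APIs.

        import numpy as np

        def fluid_solve(u_interface, dt):
            # Placeholder for the fluid solver's role: interface loads from displacement
            return -0.5 * u_interface

        def solid_solve(load, dt):
            # Placeholder for the structural solver's role: displacement from loads
            return 0.8 * load + 0.1

        def coupled_step(u_s, dt, omega=0.5, tol=1e-8, max_iter=50):
            """One implicit coupling step: alternate solvers and under-relax the
            interface displacement update until the interface residual converges."""
            for it in range(max_iter):
                u_new = solid_solve(fluid_solve(u_s, dt), dt)
                if np.max(np.abs(u_new - u_s)) < tol:
                    break
                u_s = u_s + omega * (u_new - u_s)   # omega: under-relaxation factor
            return u_s, it

        u, iters = coupled_step(np.zeros(4), dt=1e-3)
        print(u, iters)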

  12. Modeling human response errors in synthetic flight simulator domain

    NASA Technical Reports Server (NTRS)

    Ntuen, Celestine A.

    1992-01-01

    This paper presents a control theoretic approach to modeling human response errors (HRE) in the flight simulation domain. The human pilot is modeled as a supervisor of a highly automated system. The synthesis uses the theory of optimal control pilot modeling for integrating the pilot's observation error and the error due to the simulation model (experimental error). Methods for solving the HRE problem are suggested. Experimental verification of the models will be tested in a flight quality handling simulation.

  13. Missile and Space Systems Reliability versus Cost Trade-Off Study

    DTIC Science & Technology

    1983-01-01

    [OCR fragments from the report documentation page; the legible portions follow] ... reliability problems, which has the real bearing on program effectiveness. A well planned and funded reliability effort can prevent or ferret out ... failure analysis, and the incorporation and verification of design corrections to prevent recurrence of failures. ... A test plan shall be ...

  14. Hawking radiation in an electromagnetic waveguide?

    PubMed

    Schützhold, Ralf; Unruh, William G

    2005-07-15

    It is demonstrated that the propagation of electromagnetic waves in an appropriately designed waveguide is (for large wavelengths) analogous to that within a curved space-time--such as around a black hole. As electromagnetic radiation (e.g., microwaves) can be controlled, amplified, and detected with present-day technology much more easily than, for example, sound, we propose a setup for the experimental verification of the Hawking effect. Apart from experimentally testing this striking prediction, this would facilitate the investigation of the trans-Planckian problem.

  15. Interface COMSOL-PHREEQC (iCP), an efficient numerical framework for the solution of coupled multiphysics and geochemistry

    NASA Astrophysics Data System (ADS)

    Nardi, Albert; Idiart, Andrés; Trinchero, Paolo; de Vries, Luis Manuel; Molinero, Jorge

    2014-08-01

    This paper presents the development, verification and application of an efficient interface, denoted as iCP, which couples two standalone simulation programs: the general purpose Finite Element framework COMSOL Multiphysics® and the geochemical simulator PHREEQC. The main goal of the interface is to maximize the synergies between the aforementioned codes, providing a numerical platform that can efficiently simulate a wide range of multiphysics problems coupled with geochemistry. iCP is written in Java and uses the IPhreeqc C++ dynamic library and the COMSOL Java-API. Given the large computational requirements of the aforementioned coupled models, special emphasis has been placed on numerical robustness and efficiency. To this end, the geochemical reactions are solved in parallel by balancing the computational load over multiple threads. First, a benchmark exercise is used to test the reliability of iCP regarding flow and reactive transport. Then, a large scale thermo-hydro-chemical (THC) problem is solved to show the code capabilities. The results of the verification exercise are successfully compared with those obtained using PHREEQC, and the application case demonstrates the scalability of the approach on a large scale model, at least up to 32 threads.
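
    The parallelization strategy described here, with transport solved globally and cell-wise chemistry balanced over threads, follows the usual sequential operator-splitting pattern. The sketch below shows that pattern only; transport_step and react_cell are hypothetical placeholders, not the COMSOL Java-API or IPhreeqc calls used by iCP.

        from concurrent.futures import ThreadPoolExecutor
        import numpy as np

        def transport_step(conc, dt):
            # Placeholder for the global transport solve (COMSOL's role in iCP)
            return conc

        def react_cell(cell_conc):
            # Placeholder for per-cell geochemical equilibration (PHREEQC's role)
            return cell_conc

        def splitting_step(conc, dt, n_threads=8):
            """One operator-splitting step: transport first, then chemistry, with
            the embarrassingly parallel chemistry mapped over a thread pool."""
            conc = transport_step(conc, dt)
            with ThreadPoolExecutor(max_workers=n_threads) as pool:
                conc = np.array(list(pool.map(react_cell, conc)))
            return conc

        conc = np.random.rand(10000, 5)          # 10,000 cells x 5 aqueous species
        conc = splitting_step(conc, dt=3600.0)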

  16. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  17. The Second NASA Formal Methods Workshop 1992

    NASA Technical Reports Server (NTRS)

    Johnson, Sally C. (Compiler); Holloway, C. Michael (Compiler); Butler, Ricky W. (Compiler)

    1992-01-01

    The primary goal of the workshop was to bring together formal methods researchers and aerospace industry engineers to investigate new opportunities for applying formal methods to aerospace problems. The first part of the workshop was tutorial in nature. The second part of the workshop explored the potential of formal methods to address current aerospace design and verification problems. The third part of the workshop involved on-line demonstrations of state-of-the-art formal verification tools. Also, a detailed survey was filled in by the attendees; the results of the survey are compiled.

  18. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    EPA Pesticide Factsheets

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA’s Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies through third-party verification of their performance.

  19. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to the model's relation to the real world and its intended use. It has been argued that, because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimentation with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inference) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  20. An analysis of random projection for changeable and privacy-preserving biometric verification.

    PubMed

    Wang, Yongjin; Plataniotis, Konstantinos N

    2010-10-01

    Changeability and privacy protection are important factors for widespread deployment of biometrics-based verification systems. This paper presents a systematic analysis of a random-projection (RP)-based method for addressing these problems. The employed method transforms biometric data using a random matrix with each entry an independent and identically distributed Gaussian random variable. The similarity- and privacy-preserving properties, as well as the changeability of the biometric information in the transformed domain, are analyzed in detail. Specifically, RP on both high-dimensional image vectors and dimensionality-reduced feature vectors is discussed and compared. A vector translation method is proposed to improve the changeability of the generated templates. The feasibility of the introduced solution is well supported by detailed theoretical analyses. Extensive experimentation on a face-based biometric verification problem shows the effectiveness of the proposed method.
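
    The transform analyzed in this record is compact enough to state directly: project the biometric feature vector with a matrix of i.i.d. Gaussian entries, which approximately preserves distances (Johnson-Lindenstrauss) while letting a compromised template be revoked by re-drawing the matrix. A minimal numpy sketch with invented dimensions and data:

        import numpy as np

        rng = np.random.default_rng(seed=42)   # the seed plays the role of a revocable key
        d, k = 1024, 128                       # original and projected dimensions (invented)

        R = rng.normal(0.0, 1.0 / np.sqrt(k), size=(k, d))   # i.i.d. Gaussian RP matrix

        x = rng.random(d)                      # enrolled feature vector (placeholder data)
        y = rng.random(d)                      # probe feature vector (placeholder data)

        # Pairwise distance is approximately preserved in the projected domain
        print(np.linalg.norm(x - y), np.linalg.norm(R @ x - R @ y))

        # Revocation/changeability: discard the seed and issue a fresh R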

  1. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  2. Proceedings of the Sixth NASA Langley Formal Methods (LFM) Workshop

    NASA Technical Reports Server (NTRS)

    Rozier, Kristin Yvonne (Editor)

    2008-01-01

    Today's verification techniques are hard-pressed to scale with the ever-increasing complexity of safety critical systems. Within the field of aeronautics alone, we find the need for verification of algorithms for separation assurance, air traffic control, auto-pilot, Unmanned Aerial Vehicles (UAVs), adaptive avionics, automated decision authority, and much more. Recent advances in formal methods have made verifying more of these problems realistic. Thus we need to continually re-assess what we can solve now and identify the next barriers to overcome. Only through an exchange of ideas between theoreticians and practitioners from academia to industry can we extend formal methods for the verification of ever more challenging problem domains. This volume contains the extended abstracts of the talks presented at LFM 2008: The Sixth NASA Langley Formal Methods Workshop held on April 30 - May 2, 2008 in Newport News, Virginia, USA. The topics of interest that were listed in the call for abstracts were: advances in formal verification techniques; formal models of distributed computing; planning and scheduling; automated air traffic management; fault tolerance; hybrid systems/hybrid automata; embedded systems; safety critical applications; safety cases; accident/safety analysis.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, F. R. MAHONEY & ASSOC., AMPHIDROME SYSTEM FOR SINGLE FAMILY HOMES - 02/05/WQPC-SWP

    EPA Science Inventory

    Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Borne, MA. Sanitary Sewerage from the base residential housing w...

  4. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease.
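
    The estimator's core idea can be sketched compactly: weight each verified subject by the inverse of its estimated verification probability, then form the weighted proportion of correctly ordered triples. The toy implementation below (a brute-force triple loop on synthetic data) illustrates the IPW weighting only; the paper's estimator additionally supplies U-statistic asymptotics and a jackknife variance, omitted here.

        import numpy as np

        def ipw_vus(t, d, v, pi):
            """IPW estimate of P(T1 < T2 < T3) across three disease stages.

            t: test results; d: disease stage (1-3), used only where verified;
            v: verification indicator; pi: estimated P(verified | test, covariates).
            """
            w = v / pi                                  # IPW weights, zero if unverified
            idx = [np.where((v == 1) & (d == s))[0] for s in (1, 2, 3)]
            num = den = 0.0
            for i in idx[0]:
                for j in idx[1]:
                    for k in idx[2]:
                        wt = w[i] * w[j] * w[k]
                        den += wt
                        if t[i] < t[j] < t[k]:
                            num += wt
            return num / den

        rng = np.random.default_rng(1)
        n = 90
        d = rng.integers(1, 4, n)                            # true stage (synthetic)
        t = d + rng.normal(0.0, 0.7, n)                      # test tracks stage, with noise
        pi = np.clip(0.3 + 0.2 * (t - t.min()), 0.05, 1.0)   # verification depends on t
        v = (rng.random(n) < pi).astype(int)
        print(ipw_vus(t, d, v, pi))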

  5. Monocular precrash vehicle detection: features and classifiers.

    PubMed

    Sun, Zehang; Bebis, George; Miller, Ronald

    2006-07-01

    Robust and reliable vehicle detection from images acquired by a moving vehicle (i.e., on-road vehicle detection) is an important problem with applications to driver assistance systems and autonomous, self-guided vehicles. The focus of this work is on the issues of feature extraction and classification for rear-view vehicle detection. Specifically, by treating the problem of vehicle detection as a two-class classification problem, we have investigated several different feature extraction methods such as principal component analysis, wavelets, and Gabor filters. To evaluate the extracted features, we have experimented with two popular classifiers, neural networks and support vector machines (SVMs). Based on our evaluation results, we have developed an on-board real-time monocular vehicle detection system that is capable of acquiring grey-scale images, using Ford's proprietary low-light camera, achieving an average detection rate of 10 Hz. Our vehicle detection algorithm consists of two main steps: a multiscale driven hypothesis generation step and an appearance-based hypothesis verification step. During the hypothesis generation step, image locations where vehicles might be present are extracted. This step uses multiscale techniques not only to speed up detection, but also to improve system robustness. The appearance-based hypothesis verification step verifies the hypotheses using Gabor features and SVMs. The system has been tested in Ford's concept vehicle under different traffic conditions (e.g., structured highway, complex urban streets, and varying weather conditions), illustrating good performance.

  6. Verification of Java Programs using Symbolic Execution and Invariant Generation

    NASA Technical Reports Server (NTRS)

    Pasareanu, Corina; Visser, Willem

    2004-01-01

    Software verification is recognized as an important and difficult problem. We present a novel framework, based on symbolic execution, for the automated verification of software. The framework uses annotations in the form of method specifications and loop invariants. We present a novel iterative technique that uses invariant strengthening and approximation for discovering these loop invariants automatically. The technique handles different types of data (e.g. boolean and numeric constraints, dynamically allocated structures and arrays) and it allows for checking universally quantified formulas. Our framework is built on top of the Java PathFinder model checking toolset and it was used for the verification of several non-trivial Java programs.
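
    The initiation/preservation/postcondition obligations behind such invariant-based verification are easy to show on a toy loop. The sketch below uses the Z3 SMT solver's Python bindings (our choice of tool; the paper's framework is built on Java PathFinder) to check that 0 <= x <= n is an inductive invariant for the loop `x = 0; while x < n: x += 1`, assuming n >= 0.

        from z3 import Int, Implies, And, Not, Solver, unsat

        x, xp, n = Int('x'), Int('xp'), Int('n')
        inv = And(0 <= x, x <= n)                 # candidate loop invariant
        inv_next = And(0 <= xp, xp <= n)          # invariant over the post-state x'

        def valid(formula):
            s = Solver()
            s.add(Not(formula))                   # valid iff the negation is unsat
            return s.check() == unsat

        # Initiation: the invariant holds on loop entry
        print(valid(Implies(And(x == 0, n >= 0), inv)))
        # Preservation: one iteration (guard x < n, update x' = x + 1) keeps it
        print(valid(Implies(And(inv, x < n, xp == x + 1), inv_next)))
        # Postcondition: invariant plus negated guard implies x == n on exit
        print(valid(Implies(And(inv, Not(x < n)), x == n)))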

  7. Model-Based Verification and Validation of Spacecraft Avionics

    NASA Technical Reports Server (NTRS)

    Khan, M. Omair; Sievers, Michael; Standley, Shaun

    2012-01-01

    Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system-level V&V using modeling and simulation, and to use scarce hardware testing time to validate models, as has been the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated, enabling a more complete set of test cases than is possible on flight hardware. SysML simulations provide access to and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated that such an approach is possible.

  8. Center for Extended Magnetohydrodynamics Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramos, Jesus

    This researcher participated in the DOE-funded Center for Extended Magnetohydrodynamics Modeling (CEMM), a multi-institutional collaboration led by the Princeton Plasma Physics Laboratory with Dr. Stephen Jardin as the overall Principal Investigator. This project developed advanced simulation tools to study the non-linear macroscopic dynamics of magnetically confined plasmas. The collaborative effort focused on the development of two large numerical simulation codes, M3D-C1 and NIMROD, and their application to a wide variety of problems. Dr. Ramos was responsible for theoretical aspects of the project, deriving consistent sets of model equations applicable to weakly collisional plasmas and devising test problems for verification of the numerical codes. This activity was funded for twelve years.

  9. Certification and verification for Northrup model NSC-01-0732 fresnel lens concentrating solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Structural analysis and certification of the collector system is presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.

  10. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion, or source terms, as well as non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. We then use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests that check the portions of the code piece by piece. Example checks in the suite start with a simple case of unidirectional advection, proceed to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
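
    In either MMS or MES verification, the decisive computation is the same: evaluate the error norm against the analytical solution on successively refined grids and confirm that the observed order of accuracy approaches the scheme's formal order. A minimal sketch with invented error values:

        import numpy as np

        def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
            """Observed order of accuracy from error norms on two grid levels."""
            return np.log(e_coarse / e_fine) / np.log(refinement_ratio)

        # Placeholder L2 error norms from three grids, mesh size halved each time
        errors = [4.1e-3, 1.05e-3, 2.7e-4]
        for e_c, e_f in zip(errors, errors[1:]):
            print(observed_order(e_c, e_f))   # should tend to 2.0 for a 2nd-order scheme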

  11. Test/QA Plan For Verification Of Anaerobic Digester For Energy Production And Pollution Prevention

    EPA Science Inventory

    The ETV-ESTE Program conducts third-party verification testing of commercially available technologies that improve the environmental conditions in the U.S. A stakeholder committee of buyers and users of such technologies guided the development of this test on anaerobic digesters...

  12. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    The current philosophy of the GSFC regarding environmental verification of Shuttle payloads is reviewed. In the structures area, increased emphasis will be placed on the use of analysis for design verification, with selective testing performed as necessary. Furthermore, as a result of recent cost optimization analysis, the multitier test program will presumably give way to a comprehensive test program at the major payload subassembly level after adequate workmanship at the component level has been verified. In the thermal vacuum area, thought is being given to modifying the approaches used for conventional spacecraft.

  13. Overview of the TOPEX/Poseidon Platform Harvest Verification Experiment

    NASA Technical Reports Server (NTRS)

    Morris, Charles S.; DiNardo, Steven J.; Christensen, Edward J.

    1995-01-01

    An overview is given of the in situ measurement system installed on Texaco's Platform Harvest for verification of the sea level measurement from the TOPEX/Poseidon satellite. The prelaunch error budget suggested that the total root mean square (RMS) error due to measurements made at this verification site would be less than 4 cm. The actual error budget for the verification site is within these original specifications. However, evaluation of the sea level data from three measurement systems at the platform has resulted in unexpectedly large differences between the systems. Comparison of the sea level measurements from the different tide gauge systems has led to a better understanding of the problems of measuring sea level in relatively deep ocean. As of May 1994, the Platform Harvest verification site has successfully supported 60 TOPEX/Poseidon overflights.

  14. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    This paper documents the escape of high explosive (HE) products problem. The problem, first presented by Fickett & Rivard, tests the implementation and numerical behavior of a high explosive detonation and energy release model and its interaction with an associated compressible hydrodynamics simulation code. The problem simulates the detonation of a finite-length, one-dimensional piece of HE that is driven by a piston from one end and adjacent to a void at the other end. The HE equation of state is modeled as a polytropic ideal gas. The HE detonation is assumed to be instantaneous with an infinitesimal reaction zone. Via judicious selection of the material specific heat ratio, the problem has an exact solution with linear characteristics, enabling a straightforward calculation of the physical variables as a function of time and space. Lastly, implementation of the exact solution in the Python code ExactPack is discussed, as are verification cases for the exact solution code.
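
    Using such a benchmark in practice amounts to evaluating the exact solution on the simulation grid and computing a discrete error norm against the code output. The sketch below shows that comparison with a trivial placeholder profile standing in for the ExactPack implementation; the gamma = 3 value is our reading of the "judicious" specific heat ratio that linearizes the characteristics.

        import numpy as np

        GAMMA = 3.0   # assumed specific heat ratio yielding linear characteristics

        def exact_solution(x, t):
            # Placeholder profile; the actual Fickett & Rivard solution lives in ExactPack
            return np.ones_like(x)

        def l1_error(x, rho_code, t):
            """Cell-averaged L1 error norm of the code output against the benchmark."""
            dx = x[1] - x[0]
            return float(np.sum(np.abs(rho_code - exact_solution(x, t))) * dx)

        x = np.linspace(0.0, 1.0, 200)
        rho_code = np.ones_like(x)                 # stand-in for simulation output
        print(l1_error(x, rho_code, t=0.5))        # 0.0 for the trivial placeholder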

  16. Formulating face verification with semidefinite programming.

    PubMed

    Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S

    2007-11-01

    This paper presents a unified solution to three open problems in face verification with subspace learning techniques: selection of the verification threshold, automatic determination of the subspace dimension, and deduction of the feature fusing weights. In contrast to previous algorithms, which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints that kindred pairs have similarity larger than the threshold and inhomogeneous pairs have similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
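
    The optimization at the heart of the ASV formulation, learning a positive semidefinite similarity metric matrix subject to threshold constraints on kindred and inhomogeneous pairs, can be re-created in miniature with a generic SDP modeling tool. The cvxpy toy below (slack-softened so that random data stay feasible) is our re-creation under those assumptions, not the authors' solver or data:

        import cvxpy as cp
        import numpy as np

        rng = np.random.default_rng(0)
        d, t = 8, 1.0                                   # feature dim and threshold (invented)

        # Toy pairs: kindred = noisy copies, inhomogeneous = unrelated vectors
        kindred = [(x, x + 0.1 * rng.standard_normal(d)) for x in rng.random((20, d))]
        inhomog = [(rng.random(d), rng.random(d)) for _ in range(20)]

        M = cp.Variable((d, d), PSD=True)               # similarity metric matrix (SMM)
        s = cp.Variable(len(kindred) + len(inhomog), nonneg=True)   # slack variables

        cons = [x @ M @ y >= t - s[i] for i, (x, y) in enumerate(kindred)]
        cons += [x @ M @ y <= t + s[len(kindred) + j]
                 for j, (x, y) in enumerate(inhomog)]

        prob = cp.Problem(cp.Minimize(cp.sum(s) + 0.01 * cp.norm(M, 'fro')), cons)
        prob.solve()
        print(prob.status, prob.value)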

  17. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is being developed to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure that the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way to code and to simulate user inputs in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.

  18. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  19. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  20. Options and Risk for Qualification of Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)

    2002-01-01

    Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification, the system-level risk implications will be developed. The paper will also explore the implications of analysis versus test for various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for the structuring of a verification program based on cost, risk, and value return. A successful verification program should establish controls and define the objectives of the verification compliance program. Finally, the paper will address the political and programmatic factors which may impact options for system verification.

  1. Environmental Technology Verification--Baghouse Filtration Products: GE Energy QG061 Filtration Media (Tested September 2008)

    EPA Science Inventory

    This report reviews the filtration and pressure drop performance of GE Energy's QG061 filtration media. Environmental Technology Verification (ETV) testing of this technology/product was conducted during a series of tests in September 2008. The objective of the ETV Program is to ...

  2. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
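
    The Weibull statistics at the center of the guideline can be illustrated with the weakest-link scaling law that transfers coupon-level strength data to a full-scale part. All parameter values below are invented for illustration:

        import numpy as np

        def weibull_failure_probability(sigma, sigma0, m, V, V0):
            """Weakest-link failure probability with volume scaling:
            P_f = 1 - exp(-(V / V0) * (sigma / sigma0) ** m)."""
            return 1.0 - np.exp(-(V / V0) * (sigma / sigma0) ** m)

        # Characteristic strength (MPa), Weibull modulus, reference volume (m^3)
        sigma0, m, V0 = 300.0, 10.0, 1.0e-6

        # The same 150 MPa stress is far riskier on a part 1000x the coupon volume
        print(weibull_failure_probability(150.0, sigma0, m, V=V0, V0=V0))
        print(weibull_failure_probability(150.0, sigma0, m, V=1.0e-3, V0=V0))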

  3. An Efficient Location Verification Scheme for Static Wireless Sensor Networks.

    PubMed

    Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok

    2017-01-24

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and their requirements, such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.
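
    The geometric intuition of the mutually-shared region can be shown with plain distance checks: a location claim consistent with the protocol must fall inside the lens where the radio disks of the participating nodes overlap. The toy check below (circular ranges, 2-D coordinates) is our simplification, not the full MSRLV protocol:

        import math

        def claim_in_mutual_region(claimed, verifier_a, verifier_b, radio_range):
            """True if the claimed point lies in the intersection of both
            verifiers' radio disks; both heard the claimant, so a truthful
            claim must lie inside this mutually-shared region."""
            return (math.dist(claimed, verifier_a) <= radio_range and
                    math.dist(claimed, verifier_b) <= radio_range)

        print(claim_in_mutual_region((3.0, 4.0), (0.0, 0.0), (5.0, 0.0), 6.0))  # True
        print(claim_in_mutual_region((9.0, 9.0), (0.0, 0.0), (5.0, 0.0), 6.0))  # False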

  4. An Efficient Location Verification Scheme for Static Wireless Sensor Networks

    PubMed Central

    Kim, In-hwan; Kim, Bo-sung; Song, JooSeok

    2017-01-01

    In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution to these situations or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and their requirements, such as special hardware, a dedicated verifier, and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between the location claimant and the verifier for location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors. PMID:28125007

  5. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  6. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  7. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  8. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  9. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  10. Improvement of energy efficiency: the use of thermography and air-tightness test in verification of thermal performance of school buildings

    NASA Astrophysics Data System (ADS)

    Kauppinen, Timo; Siikanen, Sami

    2011-05-01

    The improvement of energy efficiency is a key issue since the energy performance of buildings directive came into force in European Union countries. The city of Kuopio participates in a project in which different tools are used, generated, and tested to improve the energy efficiency of public buildings. The project includes two schools of the same type, one consuming much more heating energy than the other. In this paper, the results of thermography in normal conditions and under a 50 Pa pressure drop are presented, as well as the results of a remotely controlled air-tightness test of the buildings. Thermography combined with the air-tightness test clearly showed the reasons for the differences in specific heating energy consumption; the measurements also revealed problems in the performance of the ventilation system. Thermography, air-tightness testing, and other supporting measurements can be used together to solve energy loss problems, provided the measurements are carried out in a proper way.
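
    The 50 Pa test mentioned above yields the building's air-change rate at 50 Pa, n50 = Q50 / V, the basic airtightness figure on which two same-type buildings can be compared. A minimal calculation with invented numbers:

        def n50(q50_m3_per_h, internal_volume_m3):
            """Air changes per hour at a 50 Pa pressure difference: n50 = Q50 / V."""
            return q50_m3_per_h / internal_volume_m3

        # Placeholder blower-door results for two school buildings of the same type
        print(n50(q50_m3_per_h=18000.0, internal_volume_m3=12000.0))   # 1.5 1/h, tighter
        print(n50(q50_m3_per_h=54000.0, internal_volume_m3=12000.0))   # 4.5 1/h, leakier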

  11. Automatic programming for critical applications

    NASA Technical Reports Server (NTRS)

    Loganantharaj, Raj L.

    1988-01-01

    The important phases of a software life cycle include verification and maintenance. Execution performance is usually an expected requirement in a software development process. Unfortunately, the verification and maintenance of programs are the time-consuming and frustrating aspects of software engineering. Verification cannot be waived for programs used in critical applications such as military, space, and nuclear-plant systems. As a consequence, synthesis of programs from specifications, an alternative way of developing correct programs, is becoming popular. What is understood by automatic programming has changed with our expectations. At present, the goal of automatic programming is the automation of the programming process. Specifically, it means the application of artificial intelligence to software engineering in order to define techniques and create environments that help in the creation of high-level programs. The automatic programming process may be divided into two phases: the problem acquisition phase and the program synthesis phase. In the problem acquisition phase, an informal specification of the problem is transformed into an unambiguous specification, while in the program synthesis phase such a specification is further transformed into a concrete, executable program.

  12. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high-modal-density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity, discovered in the course of the test, necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test-verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.
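
    A standard tool for the test-analysis correlation task this report describes (though not named in the abstract) is the Modal Assurance Criterion, which scores the match between measured and analytical mode shapes. A minimal sketch on synthetic mode shapes:

        import numpy as np

        def mac(phi_test, phi_model):
            """Modal Assurance Criterion matrix; columns are mode shapes.
            Entries near 1 indicate well-correlated test/model mode pairs."""
            num = np.abs(phi_test.T @ phi_model) ** 2
            den = np.outer(np.sum(phi_test ** 2, axis=0), np.sum(phi_model ** 2, axis=0))
            return num / den

        rng = np.random.default_rng(7)
        phi_t = rng.standard_normal((30, 5))                # 5 test modes at 30 DOFs (toy)
        phi_m = phi_t + 0.05 * rng.standard_normal((30, 5)) # slightly perturbed model modes
        print(np.round(mac(phi_t, phi_m), 2))               # diagonal near 1 for matched modes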

  13. VERIFYING THE VOC CONTROL PERFORMANCE OF BIOREACTORS

    EPA Science Inventory

    The paper describes the verification testing approach used to collect high-quality, peer-reviewed data on the performance of bioreaction-based technologies for the control of volatile organic compounds (VOCs). The verification protocol that describes the approach for these tests ...

  14. BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR

    EPA Science Inventory

    The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION--GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    Under EPA's Environmental Technology Verification Program, Research Triangle Institute (RTI) will operate the Air Pollution Control Technology Center to verify the filtration efficiency and bioaerosol inactivation efficiency of heating, ventilation and air conditioning air cleane...

  16. Certification and verification for Calmac flat plate solar collector

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information used in the certification and verification of the Calmac Flat Plate Collector is presented. Contained are such items as test procedures and results, information on materials used, installation, operation, and maintenance manuals, and other information pertaining to the verification and certification.

  17. THIRD PARTY TECHNOLOGY PERFORMANCE VERIFICATION DATA FROM A STAKEHOLDER-DRIVEN TECHNOLOGY TESTING PROGRAM

    EPA Science Inventory

    The Greenhouse Gas (GHG) Technology Verification Center is one of 12 independently operated verification centers established by the U.S. Environmental Protection Agency. The Center provides third-party performance data to stakeholders interested in environmental technologies tha...

  18. 40 CFR 1066.275 - Daily dynamometer readiness verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.275 Daily... automated process for this verification procedure, perform this evaluation by setting the initial speed and... your dynamometer does not perform this verification with an automated process: (1) With the dynamometer...

  19. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGES

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  20. Real-Time Extended Interface Automata for Software Testing Cases Generation

    PubMed Central

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interfaces between software components are particularly important due to the large number of complex interactions, which requires traditional modeling languages to overcome existing shortcomings in temporal information description and test-input control. This paper presents the real-time extended interface automata (RTEIA), which add a clearer and more detailed description of temporal information through the application of time words. We also establish an input interface automaton for every input in order to handle input control and interface coverage flexibly when applied in the software testing field. Detailed definitions of the RTEIA and the test-case generation algorithm are provided in this paper. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system. PMID:24892080
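
    The flavor of interface automata extended with timing guards can be conveyed by a tiny run checker: a timed word is accepted only if every transition's clock constraint holds. The automaton below is an invented miniature loosely themed on the braking example, not the RTEIA definitions from the paper:

        # Transition table: (state, action) -> (next_state, guard on elapsed time)
        AUTOMATON = {
            ('idle',    'brake_cmd'):   ('braking', lambda dt: dt >= 0.0),
            ('braking', 'pressure_ok'): ('applied', lambda dt: dt <= 0.2),   # deadline
            ('applied', 'release'):     ('idle',    lambda dt: dt >= 0.05),
        }

        def accepts(timed_word, start='idle'):
            """Replay (action, timestamp) pairs, checking every timing guard."""
            state, last_t = start, 0.0
            for action, t in timed_word:
                key = (state, action)
                if key not in AUTOMATON:
                    return False                 # undefined interface action
                next_state, guard = AUTOMATON[key]
                if not guard(t - last_t):
                    return False                 # timing constraint violated
                state, last_t = next_state, t
            return True

        print(accepts([('brake_cmd', 0.0), ('pressure_ok', 0.15), ('release', 0.3)]))  # True
        print(accepts([('brake_cmd', 0.0), ('pressure_ok', 0.5)]))                     # False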

  1. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, no standardized and accepted means exists for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
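
    The basic classification-tree step this article builds on is mechanical: partition each input into classes, then combine classes across classifications into abstract test cases. The toy below shows only that base step; the enhancements the article adds for embedded systems (timing, analog signals, a hardware verification language) are beyond this sketch, and the classifications are invented:

        from itertools import product

        # Invented classifications (input partitions) for an automotive function
        classifications = {
            'vehicle_speed': ['zero', 'low', 'high'],
            'brake_pedal':   ['released', 'partial', 'full'],
            'road_surface':  ['dry', 'wet', 'icy'],
        }

        # Minimal combination rule: every tuple of classes is one abstract test case
        test_cases = [dict(zip(classifications, combo))
                      for combo in product(*classifications.values())]

        print(len(test_cases))    # 27 abstract cases before constraints prune them
        print(test_cases[0])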

  2. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined, and these requirements are correlated to the development demonstrations that provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  3. Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report

    NASA Technical Reports Server (NTRS)

    Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.

    2017-01-01

    This report provides an overview of, and results from, the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.

  4. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines

    EPA Science Inventory

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  5. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on Al for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., " Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence., 12, 2000, pp3 3 1-3 4 0 . [30] Gaschnig

  6. Fault Management Practice: A Roadmap for Improvement

    NASA Technical Reports Server (NTRS)

    Fesq, Lorraine M.; Oberhettinger, David

    2010-01-01

    Autonomous fault management (FM) is critical for deep space and planetary missions where the limited communication opportunities may prevent timely intervention by ground control. Evidence of pervasive architecture, design, and verification/validation problems with NASA FM engineering has been revealed both during technical reviews of spaceflight missions and in flight. These problems include FM design changes required late in the life-cycle, insufficient project insight into the extent of FM testing required, unexpected test results that require resolution, spacecraft operational limitations because certain functions were not tested, and in-flight anomalies and mission failures attributable to fault management. A recent NASA initiative has characterized the FM state-of-practice throughout the spacecraft development community and identified common NASA, DoD, and commercial concerns that can be addressed in the near term through the development of a FM Practitioner's Handbook and the formation of a FM Working Group. Initial efforts will focus on standardizing FM terminology, establishing engineering processes and tools, and training.

  7. Detecting agricultural to urban land use change from multi-temporal MSS digital data. [Salt Lake County, Utah

    NASA Technical Reports Server (NTRS)

    Ridd, M. K.; Merola, J. A.; Jaynes, R. A.

    1983-01-01

    Conversion of agricultural land to a variety of urban uses is a major problem along the Wasatch Front, Utah. Although LANDSAT MSS data is a relatively coarse tool for discriminating categories of change in urban-size plots, its availability prompts a thorough test of its power to detect change. The procedures being applied to a test area in Salt Lake County, Utah, where the land conversion problem is acute, are presented. The identities of land uses before and after conversion were determined, and digital procedures for doing so were compared. Several algorithms were compared, utilizing both raw data and preprocessed data. Verification of results involved high-quality color infrared photography and field observation. Two data sets were digitally registered, specific change categories were identified internally in the software, results were tabulated by computer, and change maps were printed at 1:24,000 scale.
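
    Post-classification change detection of the kind compared in this study reduces, at its core, to cross-tabulating two co-registered classified rasters. A numpy sketch with invented class codes and toy rasters, not the study's MSS data:

        import numpy as np

        CLASSES = ['agriculture', 'residential', 'commercial', 'water']

        # Two co-registered classified scenes (toy 4x4 rasters of class indices)
        before = np.array([[0, 0, 1, 3], [0, 0, 1, 3], [0, 1, 1, 3], [0, 0, 2, 3]])
        after  = np.array([[1, 1, 1, 3], [0, 2, 1, 3], [0, 1, 1, 3], [0, 0, 2, 3]])

        # Change matrix: rows = class before, columns = class after
        n = len(CLASSES)
        change = np.zeros((n, n), dtype=int)
        np.add.at(change, (before.ravel(), after.ravel()), 1)

        print(change)   # off-diagonal counts, e.g. agriculture -> residential, are change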

  8. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. The second mission, with its launch foreseen in 2018, is led by NASA, which provides the spacecraft, the launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 metres, collecting samples, and investigating them for signs of past and present life with exobiological experiments, as well as investigating the Mars water/geochemical environment. In this scenario, Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration, and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified only by analysis; consequently, a test campaign is defined including mechanical tests to simulate the entry loads, thermal tests in the Mars environment, and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) until the final verification close-out of the above requirements with the final verification reports.

  9. Apollo experience report: Guidance and control systems. Engineering simulation program

    NASA Technical Reports Server (NTRS)

    Gilbert, D. W.

    1973-01-01

    The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BAGHOUSE FILTRATION PRODUCTS—SOUTHERN FILTER MEDIA, LLC, PE-16/M-SPES FILTER SAMPLE

    EPA Science Inventory

    The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...

  11. Validation of mesoscale models

    NASA Technical Reports Server (NTRS)

    Kuo, Bill; Warner, Tom; Benjamin, Stan; Koch, Steve; Staniforth, Andrew

    1993-01-01

    The topics discussed include the following: verification of cloud prediction from the PSU/NCAR mesoscale model; results from MAPS/NGM verification comparisons and MAPS observation sensitivity tests to ACARS and profiler data; systematic errors and mesoscale verification for a mesoscale model; and the COMPARE Project and the CME.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT--BAGHOUSE FILTRATION PRODUCTS, W.L. GORE ASSOC., INC.

    EPA Science Inventory

    The U.S. Environmental Protection Agency Air Pollution Control Technology (APCT) Verification Center evaluates the performance of baghouse filtration products used primarily to control PM2.5 emissions. This verification statement summarizes the test results for W.L. Gore & Assoc....

  13. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Torque transducer verification and...

  14. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Torque transducer verification and...

  15. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  16. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  17. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  18. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    EPA Science Inventory

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  19. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    EPA Science Inventory

    This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION COATINGS AND COATING EQUIPMENT PROGRAM (ETV CCEP), FINAL TECHNOLOGY APPLICATIONS GROUP TAGNITE--TESTING AND QUALITY ASSURANCE PLAN (T/QAP)

    EPA Science Inventory

    The overall objective of the Environmental Testing and Verification Coatings and Coating Equipment Program is to verify pollution prevention and performance characteristics of coating technologies and make the results of the testing available to prospective coating technology use...

  1. ANDalyze Lead 100 Test Kit and AND1000 Fluorimeter Environmental Technology Verification Report and Statement

    EPA Science Inventory

    This report provides results for the verification testing of the Lead100/AND1000. The following is a description of the technology based on information provided by the vendor. The information provided below was not verified in this test. The ANDalyze Lead100/AND1000 was des...

  2. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
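
    The verification criteria surveyed above (reference limits, delta checks, concordance rules) lend themselves to a simple rule-engine illustration. The following Python sketch shows one possible autoverification pass; the analytes, limits, and delta-check threshold are hypothetical illustrations, not values reported by the surveyed laboratories.

```python
# Minimal autoverification sketch. All analytes, limits, and thresholds
# are hypothetical illustrations, not values from the surveyed laboratories.

# Verification limits per analyte (roughly analogous to reference ranges).
LIMITS = {
    "glucose": (2.5, 25.0),       # mmol/L
    "potassium": (2.5, 6.5),      # mmol/L
    "creatinine": (20.0, 800.0),  # umol/L
}

DELTA_FRACTION = 0.5  # flag results that moved >50% from the previous value


def autoverify(test, value, previous=None):
    """Return (released, reasons): release the result or hold it for review."""
    reasons = []
    lo, hi = LIMITS[test]
    if not lo <= value <= hi:
        reasons.append(f"{test}={value} outside verification limits [{lo}, {hi}]")
    if previous is not None and previous > 0:
        delta = abs(value - previous) / previous
        if delta > DELTA_FRACTION:
            reasons.append(f"delta check failed: {delta:.0%} change from {previous}")
    return (not reasons, reasons)


released, why = autoverify("potassium", 6.9, previous=4.1)
print(released, why)  # False: both the limit rule and the delta check trip
```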

  3. Elastic Instability of Members Having Sections Common in Aircraft Construction

    NASA Technical Reports Server (NTRS)

    Trayer, George W; March, H W

    1932-01-01

    Two fundamental problems of elastic stability are discussed in this report. In part one, formulas are given for calculating the critical stress at which a thin, outstanding flange of a compression member will either wrinkle into several waves or form into a single half wave and twist the member about its longitudinal axis. A mathematical study of the problem, which together with experimental work has led to these formulas, is given in an appendix. Results of tests substantiating the recommended formulas are also presented. In part two, the lateral buckling of beams is discussed. The results of a number of mathematical studies of this phenomenon have been published prior to this writing, but very little experimentally determined information relating to the problem has been available heretofore. Experimental verification of the mathematical deductions is supplied.

  4. Receiver operating characteristic (ROC) curves: review of methods with applications in diagnostic medicine

    NASA Astrophysics Data System (ADS)

    Obuchowski, Nancy A.; Bullen, Jennifer A.

    2018-04-01

    Receiver operating characteristic (ROC) analysis is a tool used to describe the discrimination accuracy of a diagnostic test or prediction model. While sensitivity and specificity are the basic metrics of accuracy, they have many limitations when characterizing test accuracy, particularly when comparing the accuracies of competing tests. In this article we review the basic study design features of ROC studies, illustrate sample size calculations, present statistical methods for measuring and comparing accuracy, and highlight commonly used ROC software. We include descriptions of multi-reader ROC study design and analysis, address frequently seen problems of verification and location bias, discuss clustered data, and provide strategies for testing endpoints in ROC studies. The methods are illustrated with a study of transmission ultrasound for diagnosing breast lesions.
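
    As a concrete illustration of the basic accuracy metrics this review builds on, the following Python sketch computes an empirical ROC curve and its area under the curve (AUC) by sweeping a decision threshold over test scores; the scores and labels are invented for illustration, and tied scores are not handled for simplicity.

```python
# Empirical ROC curve and trapezoidal AUC. Scores and labels are invented
# for illustration; assumes no tied scores for simplicity.

def roc_points(scores, labels):
    """(FPR, TPR) pairs obtained by sweeping the decision threshold."""
    ranked = sorted(zip(scores, labels), reverse=True)
    pos = sum(labels)
    neg = len(labels) - pos
    tp = fp = 0
    points = [(0.0, 0.0)]
    for _, label in ranked:
        if label:
            tp += 1
        else:
            fp += 1
        points.append((fp / neg, tp / pos))
    return points

def auc(points):
    """Area under the ROC curve by the trapezoidal rule."""
    return sum((x1 - x0) * (y0 + y1) / 2
               for (x0, y0), (x1, y1) in zip(points, points[1:]))

scores = [0.9, 0.8, 0.7, 0.55, 0.5, 0.4, 0.3, 0.2]  # model/test outputs
labels = [1,   1,   0,   1,    0,   1,   0,   0]    # true disease status
print(f"AUC = {auc(roc_points(scores, labels)):.3f}")  # 0.812 here
```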

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yashchuk, Valeriy V; Conley, Raymond; Anderson, Erik H

    Verification of the reliability of metrology data from high quality x-ray optics requires that adequate methods for test and calibration of the instruments be developed. For such verification for optical surface profilometers in the spatial frequency domain, a modulation transfer function (MTF) calibration method based on binary pseudo-random (BPR) gratings and arrays has been suggested [Proc. SPIE 7077-7 (2007); Opt. Eng. 47(7), 073602-1-5 (2008)] and proven to be an effective calibration method for a number of interferometric microscopes, a phase shifting Fizeau interferometer, and a scatterometer [Nucl. Instr. and Meth. A 616, 172-82 (2010)]. Here we describe the details of development of binary pseudo-random multilayer (BPRML) test samples suitable for characterization of scanning (SEM) and transmission (TEM) electron microscopes. We discuss the results of TEM measurements with the BPRML test samples fabricated from a WiSi2/Si multilayer coating with pseudo-randomly distributed layers. In particular, we demonstrate that significant information about the metrological reliability of the TEM measurements can be extracted even when the fundamental frequency of the BPRML sample is smaller than the Nyquist frequency of the measurements. The measurements demonstrate a number of problems related to the interpretation of the SEM and TEM data. Note that similar BPRML test samples can be used to characterize x-ray microscopes. Corresponding work with x-ray microscopes is in progress.

  6. AXAF-I Low Intensity-Low Temperature (LILT) Testing of the Development Verification Test (DVT) Solar Panel

    NASA Technical Reports Server (NTRS)

    Alexander, Doug; Edge, Ted; Willowby, Doug

    1998-01-01

    The planned orbit of the AXAF-I spacecraft will subject the spacecraft to both short eclipses (less than 30 minutes for solar and less than 2 hours for lunar) and long earth and lunar eclipses with a combined conjunctive duration of up to 3 to 4 hours. Lack of proper Electrical Power System (EPS) conditioning prior to eclipse may cause loss of mission. To avoid this problem for short eclipses, it is necessary to off-point the solar array prior to or at the beginning of the eclipse to reduce the battery state of charge (SOC). This yields less overcharge during the high charge currents at sun entry. For long lunar eclipses, solar array pointing and load scheduling must be tailored to the profile of the eclipse. The battery SOC, loads, and solar array current-voltage (I-V) characteristics must be known or predictable to maintain the bus voltage within an acceptable range. To address engineering concerns about the electrical performance of the AXAF-I solar array under Low Intensity and Low Temperature (LILT) conditions, Marshall Space Flight Center (MSFC) engineers undertook special testing of the AXAF-I Development Verification Test (DVT) solar panel in September-November 1997. In the test, the DVT panel was installed in a thermal vacuum chamber with a large view window and a mechanical "flapper door". The panel was "flash" tested with a Large Area Pulse Solar Simulator (LAPSS) at various fractional sun intensities and panel (solar cell) temperatures. The testing was unique with regard to the large size of the test article and the type of testing performed. The test setup, results, and lessons learned are presented.

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: Stormwater Source Area Treatment Device - Arkal Pressurized Stormwater Filtration System

    EPA Science Inventory

    Performance verification testing of the Arkal Pressurized Stormwater Filtration System was conducted under EPA's Environmental Technology Verification Program on a 5.5-acre parking lot and grounds of St. Mary's Hospital in Milwaukee, Wisconsin. The system consists of a water sto...

  8. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6282 Filtration Media (Tested March - April 2011)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  9. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6277 Filtration Media (Tested March 2011)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  10. Environmental Technology Verification: Baghouse filtration products--W.L. Gore & Associates L3650 filtration media (tested November--December 2009)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  11. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6262 Filtration Media (Tested March 2011)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  12. 47 CFR 25.132 - Verification of earth station antenna performance standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Verification of earth station antenna... Verification of earth station antenna performance standards. (a)(1) Except for applications for 20/30 GHz earth... the antenna manufacturer on representative equipment in representative configurations, and the test...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES HIGH EFFICIENCY MINI PLEAT

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  14. Numerical Simulation of the Fluid-Structure Interaction of a Surface Effect Ship Bow Seal

    NASA Astrophysics Data System (ADS)

    Bloxom, Andrew L.

    Numerical simulations of fluid-structure interaction (FSI) problems were performed in an effort to verify and validate a commercially available FSI tool. This tool uses an iterative partitioned coupling scheme between CD-adapco's STAR-CCM+ finite volume fluid solver and Simulia's Abaqus finite element structural solver to simulate the FSI response of a system. Preliminary verification and validation (V&V) work was carried out to understand the numerical behavior of the codes individually and together as an FSI tool. This included code order verification of the respective fluid and structural solvers with Couette-Poiseuille flow and Euler-Bernoulli beam theory, which confirmed the 2nd-order accuracy of the spatial discretizations used. Following that, a mixture of solution verifications and model calibrations was performed with the inclusion of the physics models implemented in the solution of the FSI problems. Solution verifications were completed for stand-alone fluid and structural models as well as for the coupled FSI solutions. These results re-confirmed the spatial order of accuracy for more complex flows and physics models and established the order of accuracy of the temporal discretizations. In lieu of a good material definition, model calibration was performed to reproduce the experimental results; this work used model calibration in both instances of hyperelastic materials presented in the literature as validation cases, because those materials had been defined as linear elastic. Calibrated, three-dimensional models of the bow seal on the University of Michigan bow seal test platform showed the ability to reproduce the experimental results qualitatively through averaging of the forces and seal displacements. These simulations represent the only current 3D results for this case. One significant result of this study is the ability to visualize the flow around the seal and to directly measure the seal resistances at varying cushion pressures, seal immersions, forward speeds, and seal materials. SES design analysis could greatly benefit from the inclusion of flexible seals in simulations, and this work is a positive step in that direction. In future work, the inclusion of more complex seal geometries and contact will further enhance the capability of this tool.
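
    The code-order verification described above typically reduces to computing an observed order of accuracy from solutions on systematically refined grids. A minimal Python sketch of that Richardson-style estimate follows; the solution values and refinement ratio are hypothetical, and monotone convergence is assumed.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from three systematically refined grids
    with constant refinement ratio r (classic Richardson-style estimate).
    Assumes monotone convergence, so the log argument is positive."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)

# Hypothetical resistance values from grids each refined by a factor of 2.
f3, f2, f1 = 1.0480, 1.0120, 1.0030  # coarse, medium, fine
p = observed_order(f3, f2, f1, r=2.0)
print(f"observed order ~ {p:.2f}")  # ~2.00, consistent with a 2nd-order scheme
```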

  15. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuitry solutions based on field programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical-system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing the most probable errors; unlike with mutants in high-level programming languages, the corresponding test case is effectively derived by comparing special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
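
    The core idea, mutating a reference description and deriving tests that distinguish specification from mutant, can be sketched compactly. The Python sketch below uses a toy combinational function as a stand-in for an HDL description; the mutation operators are simplified analogues of the "most probable errors" mentioned above, not the paper's actual operators.

```python
from itertools import product

# Reference ("specification") behavior of a toy 3-input combinational block,
# standing in for an HDL description.
def spec(a, b, c):
    return (a and b) or c

# Simplified mutants: each injects one likely design error by swapping or
# negating an operator (analogues of "most probable errors").
MUTANTS = [
    lambda a, b, c: (a or b) or c,       # and -> or
    lambda a, b, c: (a and b) and c,     # or -> and
    lambda a, b, c: (a and not b) or c,  # operand negated
]

def killing_tests(mutant):
    """Input vectors on which the mutant's output differs from the spec."""
    return [v for v in product([0, 1], repeat=3) if spec(*v) != mutant(*v)]

for i, m in enumerate(MUTANTS):
    # A test suite kills mutant i iff it exercises at least one of these inputs.
    print(f"mutant {i}: killed by {killing_tests(m)}")
```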

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ENVIROFUELS DIESEL FUEL CATALYZER FUEL ADDITIVE

    EPA Science Inventory

    EPA's Environmental Technology Verification Program has tested EnviroFuels diesel fuel additive, called the Diesel Fuel Catalyzer. EnviroFuels has stated that heavy-duty on and off road diesel engines are the intended market for the catalyzer. Preliminary tests conducted indicate...

  17. Test/QA Plan for Verification of Nitrate Sensors for Groundwater Remediation Monitoring

    EPA Science Inventory

    A submersible nitrate sensor is capable of collecting in-situ measurements of dissolved nitrate concentrations in groundwater. Although several types of nitrate sensors currently exist, this verification test will focus on submersible sensors equipped with a nitrate-specific ion...

  18. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF FOUR IMMUNOASSAY TEST KITS

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  19. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.
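
    The bookkeeping this process implies is often captured in a requirements verification matrix linking each requirement to its verification method and closure artifact. The Python sketch below is one possible minimal rendering; the requirement IDs and artifact names are hypothetical, not NASA data.

```python
# Minimal requirements verification matrix. Requirement IDs, methods, and
# artifact names are hypothetical illustrations, not NASA data.
VERIFICATION_METHODS = {"inspection", "analysis", "demonstration", "test"}

matrix = [
    {"req": "SYS-001", "method": "test",
     "artifact": "vibration-test-report-rev-B.pdf", "closed": True},
    {"req": "SYS-002", "method": "analysis",
     "artifact": "thermal-margin-analysis.pdf", "closed": True},
    {"req": "SYS-003", "method": "inspection", "artifact": None, "closed": False},
]

def open_requirements(rows):
    """Requirements still lacking conclusive closure artifacts."""
    assert all(row["method"] in VERIFICATION_METHODS for row in rows)
    return [row["req"] for row in rows if not (row["closed"] and row["artifact"])]

print(open_requirements(matrix))  # ['SYS-003'] still needs objective evidence
```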

  20. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on assessing cost effectiveness.

  1. Two-Black Box Concept for Warhead Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bates, Cameron Russell; Frame, Katherine Chiyoko; Mckigney, Edward Allen

    2017-03-06

    We have created a possible solution for meeting the requirements of certification/authentication while still employing complicated criteria. Technical solutions for protecting information from the host in an inspection environment need to be assessed by those with specific expertise, but LANL can still study the verification problem. The two-black-box framework developed provides another potential solution to the confidence vs. certification paradox.

  2. Repetition and comprehension of spoken sentences by reading-disabled children.

    PubMed

    Shankweiler, D; Smith, S T; Mann, V A

    1984-11-01

    The language problems of reading-disabled elementary school children are not confined to written language alone. These children often exhibit problems of ordered recall of verbal materials that are equally severe whether the materials are presented in printed or in spoken form. Sentences that pose problems of pronoun reference might be expected to place a special burden on short-term memory because close grammatical relationships obtain between words that are distant from one another. With this logic in mind, third-grade children with specific reading disability and classmates matched for age and IQ were tested on five sentence types, each of which poses a problem in assigning pronoun reference. On one occasion the children were tested for comprehension of the sentences by a forced-choice picture verification task. On a later occasion they received the same sentences as a repetition test. Good and poor readers differed significantly in immediate recall of the reflexive sentences, but not in comprehension of them as assessed by picture choice. It was suggested that the pictures provided cues which lightened the memory load, a possibility that could explain why the poor readers were not demonstrably inferior in comprehension of the sentences even though they made significantly more errors than the good readers in recalling them.

  3. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for the design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements, and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computational capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters, and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working on the verification of control concepts for large structures.

  4. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  5. Results of the ESA study on psychological selection of astronaut candidates for Columbus missions II: Personality assessment

    NASA Astrophysics Data System (ADS)

    Goeters, Klaus-Martin; Fassbender, Christoph

    A unique composition of personality assessment methods was applied to a group of 97 ESA scientists and engineers. This group is highly comparable to real astronaut candidates with respect to age and education. The tests used include personality questionnaires, problem solving in groups, and a projective technique. The study goals were: 1. verification of the psychometric qualities and applicability of the tests to the target group; 2. the search for culture-fair tests by which multi-national European groups can be examined; 3. identification of test methods by which the adaptability of candidates to the psycho-social stress of long-duration space flights can be assessed. Based on the empirical findings, a test battery was defined which can be used in the selection of ESA space personnel.

  6. Challenges in the Verification of Reinforcement Learning Algorithms

    NASA Technical Reports Server (NTRS)

    Van Wesel, Perry; Goodloe, Alwyn E.

    2017-01-01

    Machine learning (ML) is increasingly being applied to a wide array of domains from search engines to autonomous vehicles. These algorithms, however, are notoriously complex and hard to verify. This work looks at the assumptions underlying machine learning algorithms as well as some of the challenges in trying to verify ML algorithms. Furthermore, we focus on the specific challenges of verifying reinforcement learning algorithms. These are highlighted using a specific example. Ultimately, we do not offer a solution to the complex problem of ML verification, but point out possible approaches for verification and interesting research opportunities.

  7. Electronics systems test laboratory testing of shuttle communications systems

    NASA Technical Reports Server (NTRS)

    Stoker, C. J.; Bromley, L. K.

    1985-01-01

    Shuttle communications and tracking systems space to space and space to ground compatibility and performance evaluations are conducted in the NASA Johnson Space Center Electronics Systems Test Laboratory (ESTL). This evaluation is accomplished through systems verification/certification tests using orbiter communications hardware in conjunction with other shuttle communications and tracking external elements to evaluate end to end system compatibility and to verify/certify that overall system performance meets program requirements before manned flight usage. In this role, the ESTL serves as a multielement major ground test facility. The ESTL capability and program concept are discussed. The system test philosophy for the complex communications channels is described in terms of the major phases. Results of space to space and space to ground systems tests are presented. Several examples of the ESTL's unique capabilities to locate and help resolve potential problems are discussed in detail.

  8. Improved orbiter waste collection system study

    NASA Technical Reports Server (NTRS)

    Bastin, P. H.

    1984-01-01

    Design concepts for improved fecal waste collection both on the space shuttle orbiter and as a precursor for the space station are discussed. Inflight usage problems associated with the existing orbiter waste collection subsystem are considered. A basis was sought for the selection of an optimum waste collection system concept which may ultimately result in the development of an orbiter flight test article for concept verification and subsequent production of new flight hardware. Two concepts were selected for orbiter and are shown in detail. Additionally, one concept selected for application to the space station is presented.

  9. Verification approach for the Shuttle/Payload Contamination Evaluation computer program - Spacelab induced environment

    NASA Technical Reports Server (NTRS)

    Bareiss, L. E.

    1978-01-01

    The paper presents a compilation of the results of a systems-level Shuttle/payload contamination analysis and related computer modeling activities. The current technical assessment of the contamination problems anticipated during the Spacelab program is discussed, and recommendations are presented on contamination abatement designs and operational procedures based on experience gained in the field of contamination analysis and assessment, dating back to the pre-Skylab era. The ultimate test of the Shuttle/Payload Contamination Evaluation program will be through comparison of predictions with measured levels of contamination during actual flight.

  10. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  11. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  12. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  13. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  14. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  15. 46 CFR 131.513 - Verification of compliance with applicable stability requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Verification of compliance with applicable stability...) OFFSHORE SUPPLY VESSELS OPERATIONS Tests, Drills, and Inspections § 131.513 Verification of compliance with applicable stability requirements. (a) After loading but before departure, and at other times necessary to...

  16. 46 CFR 131.513 - Verification of compliance with applicable stability requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Verification of compliance with applicable stability...) OFFSHORE SUPPLY VESSELS OPERATIONS Tests, Drills, and Inspections § 131.513 Verification of compliance with applicable stability requirements. (a) After loading but before departure, and at other times necessary to...

  17. 46 CFR 131.513 - Verification of compliance with applicable stability requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Verification of compliance with applicable stability...) OFFSHORE SUPPLY VESSELS OPERATIONS Tests, Drills, and Inspections § 131.513 Verification of compliance with applicable stability requirements. (a) After loading but before departure, and at other times necessary to...

  18. Environmental Technology Verification Program Materials Management and Remediation Center Generic Protocol for Verification of In Situ Chemical Oxidation

    EPA Science Inventory

    The protocol provides generic procedures for implementing a verification test for the performance of in situ chemical oxidation (ISCO), focused specifically to expand the application of ISCO at manufactured gas plants with polyaromatic hydrocarbon (PAH) contamination (MGP/PAH) an...

  19. Verification of operation of the actuator control system using the integration the B&R Automation Studio software with a virtual model of the actuator system

    NASA Astrophysics Data System (ADS)

    Herbuś, K.; Ociepka, P.

    2017-08-01

    This work analyses a sequential control system of a machine for separating and grouping workpieces for processing. The problem considered concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller: the way the actuators operate is verified against the logic relationships assumed in the control system. The actuators of the considered control system were three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of operation of the electro-pneumatic control system was implemented in the Automation Studio software of the B&R company, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using a data exchange tool in the form of an OPC server.

  20. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  1. 40 CFR 1065.362 - Non-stoichiometric raw exhaust FID O2 interference verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... air source during testing, use zero air as the FID burner's air source for this verification. (4) Zero the FID analyzer using the zero gas used during emission testing. (5) Span the FID analyzer using a span gas that you use during emission testing. (6) Check the zero response of the FID analyzer using...

  2. 40 CFR 86.1847-01 - Manufacturer in-use verification and in-use confirmatory testing; submittal of information and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... laboratory equipment calibrations and verifications as prescribed by subpart B of this part or by good... in-use confirmatory testing; submittal of information and maintenance of records. 86.1847-01 Section... confirmatory testing; submittal of information and maintenance of records. (a) The manufacturer who conducts or...

  3. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  4. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) applications has forced a move towards higher-complexity integrated circuits supporting SoC designs. This increase in complexity calls for correspondingly sophisticated validation strategies, and has led researchers to develop various methodologies to address the problem, in essence bringing about dynamic verification, formal verification, and hybrid techniques. It is also very important to discover bugs in the infancy of the SoC verification process in order to reduce time consumed and achieve fast time to market. This paper therefore focuses on verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort toward fast time to market; OVM is thus proposed in this paper as the verification method for larger designs, to avoid bottlenecks in the validation platform.

  5. bcROCsurface: an R package for correcting verification bias in estimation of the ROC surface and its volume for continuous diagnostic tests.

    PubMed

    To Duc, Khanh

    2017-11-18

    Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to distorted conclusions about diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification bias-corrected ROC surface analysis. A Shiny web application for the correction of verification bias in estimation of the ROC surface is also developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/ .
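
    bcROCsurface itself is an R package, so as a language-neutral illustration of the underlying idea, the Python sketch below applies inverse-probability weighting to correct a two-class sensitivity estimate when disease status is verified for only some subjects. The data and verification probabilities are invented, and this is a generic illustration of bias correction, not the package's API.

```python
# Inverse-probability-weighted (IPW) correction for verification bias, shown
# for a two-class sensitivity estimate. All data are invented; this is a
# generic illustration, not the bcROCsurface (R) API.

# Each subject: (test_positive, verified, true_status or None, P(verified)).
subjects = [
    (1, True, 1, 0.9), (1, True, 1, 0.9), (1, True, 0, 0.9),
    (1, False, None, 0.9), (0, True, 1, 0.3), (0, True, 0, 0.3),
    (0, True, 0, 0.3), (0, False, None, 0.3), (0, False, None, 0.3),
]

def ipw_sensitivity(rows):
    """Weight each verified subject by 1 / P(verification)."""
    num = den = 0.0
    for test_pos, verified, status, p_verify in rows:
        if not verified:
            continue  # unverified subjects enter only through the weights
        w = 1.0 / p_verify
        if status == 1:
            den += w
            num += w * test_pos
    return num / den

# A naive estimate over verified subjects alone would give 2/3; reweighting
# restores the under-verified, test-negative diseased subjects.
print(f"bias-corrected sensitivity ~ {ipw_sensitivity(subjects):.2f}")  # 0.40
```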

  6. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    NASA Astrophysics Data System (ADS)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the Glass and Façade Technology Research Group at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  7. ETV REPORT AND VERIFICATION STATEMENT; EVALUATION OF LOBO LIQUIDS RINSE WATER RECOVERY SYSTEM

    EPA Science Inventory

    The Lobo Liquids Rinse Water Recovery System (Lobo Liquids system) was tested, under actual production conditions, processing metal finishing wastewater, at Gull Industries in Houston, Texas. The verification test evaluated the ability of the ion exchange (IX) treatment system t...

  8. Long-Term Pavement Performance Materials Characterization Program: Verification of Dynamic Test Systems with an Emphasis on Resilient Modulus

    DOT National Transportation Integrated Search

    2005-09-01

    This document describes a procedure for verifying a dynamic testing system (closed-loop servohydraulic). The procedure is divided into three general phases: (1) electronic system performance verification, (2) calibration check and overall system perf...

  9. Multimodal fusion of polynomial classifiers for automatic person recognition

    NASA Astrophysics Data System (ADS)

    Broun, Charles C.; Zhang, Xiaozheng

    2001-03-01

    With the prevalence of the information age, privacy and personalization are at the forefront of today's society. As such, biometrics are viewed as essential components of current evolving technological systems. Consumers demand unobtrusive and non-invasive approaches. In our previous work, we demonstrated a speaker verification system that meets these criteria. However, there are additional constraints for fielded systems. The required recognition transactions are often performed in adverse environments and across diverse populations, necessitating robust solutions. There are two significant problem areas in current-generation speaker verification systems. The first is the difficulty in acquiring clean audio signals in all environments without encumbering the user with a head-mounted close-talking microphone. Second, unimodal biometric systems do not work with a significant percentage of the population. To combat these issues, multimodal techniques are being investigated to improve system robustness to environmental conditions, as well as to improve overall accuracy across the population. We propose a multimodal approach that builds on our current state-of-the-art speaker verification technology. In order to maintain the transparent nature of the speech interface, we focus on optical sensing technology to provide the additional modality, giving us an audio-visual person recognition system. For the audio domain, we use our existing speaker verification system. For the visual domain, we focus on lip motion. This is chosen, rather than static face or iris recognition, because it provides dynamic information about the individual; in addition, the lip dynamics can aid speech recognition to provide liveness testing. The visual processing method makes use of both color and edge information, combined within a Markov random field (MRF) framework, to localize the lips. Geometric features are extracted and input to a polynomial classifier for the person recognition process. A late integration approach, based on a probabilistic model, is employed to combine the two modalities. The system is tested on the XM2VTS database combined with AWGN in the audio domain over a range of signal-to-noise ratios.
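
    A late-integration fusion of the kind described, in which per-modality match scores are combined under a probabilistic model, can be sketched in a few lines. The weights, threshold, and scores below are illustrative assumptions, not the values used in the authors' system.

```python
# Late-integration fusion of audio and visual verification scores. Scores are
# assumed to be per-modality log-likelihood ratios (claimant vs. impostor);
# weights and threshold are illustrative, not the authors' values.

def fuse(llr_audio, llr_visual, w_audio=0.6, w_visual=0.4):
    """Weighted sum of per-modality log-likelihood ratios."""
    return w_audio * llr_audio + w_visual * llr_visual

def accept(llr_audio, llr_visual, threshold=0.0):
    """Accept the identity claim when the fused score clears the threshold."""
    return fuse(llr_audio, llr_visual) > threshold

# Weak (noisy) audio evidence rescued by strong lip-dynamics evidence.
print(accept(llr_audio=-0.5, llr_visual=2.0))  # True: fused score 0.5 > 0.0
```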

  10. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high-voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. A life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate the fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  11. Bridge Health Monitoring Using a Machine Learning Strategy

    DOT National Transportation Integrated Search

    2017-01-01

    The goal of this project was to cast the SHM problem within a statistical pattern recognition framework. Techniques borrowed from speaker recognition, particularly speaker verification, were used as this discipline deals with problems very similar to...

  12. Hosted Services for Advanced V and V Technologies: An Approach to Achieving Adoption without the Woes of Usage

    NASA Technical Reports Server (NTRS)

    Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.

    2003-01-01

    Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software-developer expertise in tool application and results filtering, and improving integration with other development tools.

  13. Information verification and encryption based on phase retrieval with sparsity constraints and optical inference

    NASA Astrophysics Data System (ADS)

    Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang

    2017-01-01

    A novel optical information verification and encryption method is proposed based on the inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both POMs need to be authenticated before being used for decryption. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved with the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system, and there is no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this newly proposed method.

  14. Benchmarking on Tsunami Currents with ComMIT

    NASA Astrophysics Data System (ADS)

    Sharghi vand, N.; Kanoglu, U.

    2015-12-01

    There were no standards for the validation and verification of tsunami numerical models before the 2004 Indian Ocean tsunami; even so, a number of numerical models had been used for inundation mapping efforts, evaluation of critical structures, etc., without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which are used in the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental and field benchmark problems aimed at estimating maximum runup, which are widely accepted by the community. Recently, benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents, held on February 9-10, 2015 at Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on the validation and verification of tsunami numerical models for tsunami currents. Three of the benchmark problems were: current measurements of the Japan 2011 tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), a user-friendly interface, developed by NCTR, to the validated and verified Method of Splitting Tsunami (MOST) model (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316). The modeling results are compared with the required benchmark data, providing good agreement; results are discussed. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe).

  15. Prototype test article verification of the Space Station Freedom active thermal control system microgravity performance

    NASA Technical Reports Server (NTRS)

    Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.

    1993-01-01

    To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.

  16. 40 CFR 1065.545 - Verification of proportional flow control for batch sampling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... control for batch sampling. 1065.545 Section 1065.545 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.545 Verification of proportional flow control for batch sampling. For any...

  17. 42 CFR 493.1253 - Standard: Establishment and verification of performance specifications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Establishment and verification of..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS... of test results for the test system. (vi) Reference intervals (normal values). (vii) Any other...

  18. Hybrid Decompositional Verification for Discovering Failures in Adaptive Flight Control Systems

    NASA Technical Reports Server (NTRS)

    Thompson, Sarah; Davies, Misty D.; Gundy-Burlet, Karen

    2010-01-01

    Adaptive flight control systems hold tremendous promise for maintaining the safety of a damaged aircraft and its passengers. However, most currently proposed adaptive control methodologies rely on online learning neural networks (OLNNs), which necessarily have the property that the controller is changing during the flight. These changes tend to be highly nonlinear, and difficult or impossible to analyze using standard techniques. In this paper, we approach the problem with a variant of compositional verification. The overall system is broken into components. Undesirable behavior is fed backwards through the system. Components that can be solved explicitly using formal methods techniques for the ranges of safe and unsafe input bounds are treated as white-box components. The remaining black-box components are analyzed with heuristic techniques that try to predict a range of component inputs that may lead to unsafe behavior. The composition of these component inputs throughout the system leads to overall system test vectors that may elucidate the undesirable behavior.

  19. Formal Methods for Verification and Validation of Partial Specifications: A Case Study

    NASA Technical Reports Server (NTRS)

    Easterbrook, Steve; Callahan, John

    1997-01-01

    This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety-critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection, Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and that it is the process of formalization, rather than the end product, that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.

  20. Experimental Evaluation of a Planning Language Suitable for Formal Verification

    NASA Technical Reports Server (NTRS)

    Butler, Rick W.; Munoz, Cesar A.; Siminiceanu, Radu I.

    2008-01-01

    The marriage of model checking and planning faces two seemingly diverging alternatives: the need for a planning language expressive enough to capture the complexity of real-life applications, as opposed to a language simple yet robust enough to be amenable to exhaustive verification and validation techniques. In an attempt to reconcile these differences, we have designed an abstract plan description language, ANMLite, inspired by the Action Notation Modeling Language (ANML) [17]. We present the basic concepts of the ANMLite language as well as an automatic translator from ANMLite to the model checker SAL (Symbolic Analysis Laboratory) [7]. We discuss various aspects of specifying a plan in terms of constraints and explore the implications of choosing a robust logic behind the specification of constraints, rather than simply proposing a new planning language. Additionally, we provide an initial assessment of the efficiency of model checking to search for solutions of planning problems. To this end, we design a basic test benchmark and study the scalability of the generated SAL models in terms of plan complexity.

  1. Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena

    2010-01-01

    The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs' counterexample, a benchmark problem presented in the early 1980s to show the limitations of adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed-loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.

  2. NASA Formal Methods Workshop, 1990

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W. (Compiler)

    1990-01-01

    The workshop brought together researchers involved in the NASA formal methods research effort for detailed technical interchange and provided a mechanism for interaction with representatives from the FAA and the aerospace industry. The workshop also included speakers from industry to debrief the formal methods researchers on the current state of practice in flight critical system design, verification, and certification. The goals were: (1) to define and characterize the verification problem for ultra-reliable, life-critical flight control systems and the current state of practice in industry today; (2) to determine the proper role of formal methods in addressing these problems; and (3) to assess the state of the art and recent progress toward applying formal methods to this area.

  3. Control of operating parameters of laser ceilometers with the application of fiber optic delay line imitation

    NASA Astrophysics Data System (ADS)

    Kim, A. A.; Klochkov, D. V.; Konyaev, M. A.; Mihaylenko, A. S.

    2017-11-01

    The article considers the problem of control and verification of the basic performance parameters of laser ceilometers and describes an alternative method based on the use of a multi-length fiber optic delay line that simulates an atmospheric path. The results of the described experiment demonstrate the great potential of this method for inspection and verification procedures of laser ceilometers.

  4. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Phyllis C.

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity was representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release to recycle and reuse.

  5. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  6. 76 FR 60829 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... Authorization-Verification (Retention of Records). Form No.: N/A. Type of Review: Extension of a currently... verification, the responsible party, as shown in 47 CFR 2.909 shall maintain the records listed as follows: (1... laboratory, company, or individual performing the verification testing. The Commission may request additional...

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT STORMWATER MANAGEMENT INC., STORMFILTER SYSTEM WITH ZPG MEDIA

    EPA Science Inventory

    Verification testing of the Stormwater Management, Inc. StormFilter Using ZPG Filter Media was conducted on a 0.19 acre portion of the eastbound highway surface of Interstate 794, at an area commonly referred to as the "Riverwalk" site near downtown Milwaukee, Wisconsin...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: PROTOCOL FOR THE VERIFICATION OF GROUTING MATERIALS FOR INFRASTRUCTURE REHABILITATION AT THE UNIVERSITY OF HOUSTON - CIGMAT

    EPA Science Inventory

    This protocol was developed under the Environmental Protection Agency's Environmental Technology Verification (ETV) Program, and is intended to be used as a guide in preparing laboratory test plans for the purpose of verifying the performance of grouting materials used for infra...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, JCH FUEL SOLUTIONS, INC., JCH ENVIRO AUTOMATED FUEL CLEANING AND MAINTENANCE SYSTEM

    EPA Science Inventory

    The verification testing was conducted at the Cl facility in North Las Vegas, NV, on July 17 and 18, 2001. During this period, engine emissions, fuel consumption, and fuel quality were evaluated with contaminated and cleaned fuel.

    To facilitate this verification, JCH repre...

  11. Verification tests of durable TPS concepts

    NASA Technical Reports Server (NTRS)

    Shideler, J. L.; Webb, G. L.; Pittman, C. M.

    1984-01-01

    Titanium multiwall, superalloy honeycomb, and Advanced Carbon-carbon (ACC) multipost Thermal Protection System (TPS) concepts are being developed to provide durable protection for surfaces of future space transportation systems. Verification tests including thermal, vibration, acoustic, water absorption, lightning strike, and aerothermal tests are described. Preliminary results indicate that the three TPS concepts are viable up to a surface temperature in excess of 2300 F.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: PHASE 1-ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®

    EPA Science Inventory

    Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...

  13. Review of waste package verification tests. Semiannual report, October 1982-March 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soo, P.

    1983-08-01

    The current study is part of an ongoing task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled release performance objectives. Work covered in this report analyzes verification tests for borosilicate glass waste forms and bentonite- and zeolite-based packing materials (discrete backfills). 76 references.

  14. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singleton, Jr., Robert; Israel, Daniel M.; Doebling, Scott William

    For code verification, one compares the code output against known exact solutions. There are many standard test problems used in this capacity, such as the Noh and Sedov problems. ExactPack is a utility that integrates many of these exact solution codes into a common API (application program interface), and can be used as a stand-alone code or as a python package. ExactPack consists of python driver scripts that access a library of exact solutions written in Fortran or Python. The spatial profiles of the relevant physical quantities, such as the density, fluid velocity, sound speed, or internal energy, are returned at a time specified by the user. The solution profiles can be viewed and examined by a command line interface or a graphical user interface, and a number of analysis tools and unit tests are also provided. We have documented the physics of each problem in the solution library, and provided complete documentation on how to extend the library to include additional exact solutions. ExactPack's code architecture makes it easy to extend the solution-code library to include additional exact solutions in a robust, reliable, and maintainable manner.
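
    The driver pattern the abstract describes is easy to illustrate without reproducing ExactPack's actual module paths or class names. The self-contained sketch below mirrors that pattern with a stand-in solver for the planar Noh problem (cold gas, inflow velocity -1, gamma = 5/3), whose exact post-shock state is density 4, velocity 0, and pressure 4/3 behind a shock at x = t/3; the class and call signature are illustrative, not ExactPack's API.

        import numpy as np

        class PlanarNoh:
            """Exact planar Noh solution: u0 = -1, rho0 = 1, gamma = 5/3."""
            gamma = 5.0 / 3.0

            def __call__(self, x, t):
                shock = t / 3.0                  # shock position at time t
                behind = x <= shock
                density = np.where(behind, 4.0, 1.0)
                velocity = np.where(behind, 0.0, -1.0)
                pressure = np.where(behind, 4.0 / 3.0, 0.0)
                return density, velocity, pressure

        solver = PlanarNoh()
        x = np.linspace(0.0, 1.0, 11)
        rho, u, p = solver(x, t=0.6)             # profiles at the requested time
        print(rho)   # compare a hydro code's output against these exact values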

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, HVLP COATING EQUIPMENT, SHARPE MANUFACTURING COMPANY PLATINUM 2012 HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the Sharpe Platinum 2012 high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the Sharpe Platinum, which is designed for use in automotive refinishing. The test coating chosen by Sharpe Manufacturi...

  16. EPA/NSF ETV Equipment Verification Testing Plan for the Removal of Volatile Organic Chemical Contaminants by Adsorptive Media Processes

    EPA Science Inventory

    This document is the Environmental Technology Verification (ETV) Technology Specific Test Plan (TSTP) for evaluation of drinking water treatment equipment utilizing adsorptive media for synthetic organic chemical (SOC) removal. This TSTP is to be used within the structure provid...

  17. TEST DESIGN FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF ADD-ON NOX CONTROL UTILIZING OZONE INJECTION

    EPA Science Inventory

    The paper discusses the test design for environmental technology verification (ETV) of add-on nitrogen oxides (NOx) control utilizing ozone injection. (NOTE: ETV is an EPA-established program to enhance domestic and international market acceptance of new or improved commercially...

  18. Space station structures and dynamics test program

    NASA Technical Reports Server (NTRS)

    Moore, Carleton J.; Townsend, John S.; Ivey, Edward W.

    1987-01-01

    The design, construction, and operation of a low-Earth orbit space station poses unique challenges for development and implementation of new technology. The technology arises from the special requirement that the station be built and constructed to function in a weightless environment, where static loads are minimal and secondary to system dynamics and control problems. One specific challenge confronting NASA is the development of a dynamics test program for: (1) defining space station design requirements, and (2) identifying the characterizing phenomena affecting the station's design and development. A general definition of the space station dynamic test program, as proposed by MSFC, forms the subject of this report. The test proposal is a comprehensive structural dynamics program to be launched in support of the space station. The test program will help to define the key issues and/or problems inherent to large space structure analysis, design, and testing. Development of a parametric data base and verification of the math models and analytical analysis tools necessary for engineering support of the station's design, construction, and operation provide the impetus for the dynamics test program. The philosophy is to integrate dynamics into the design phase through extensive ground testing and analytical ground simulations of generic systems, prototype elements, and subassemblies. On-orbit testing of the station will also be used to define its capability.

  19. Fingerprint changes and verification failure among patients with hand dermatitis.

    PubMed

    Lee, Chew Kek; Chang, Choong Chor; Johar, Asmah; Puwira, Othman; Roshidah, Baba

    2013-03-01

    To determine the prevalence of fingerprint verification failure and to define and quantify the fingerprint changes associated with fingerprint verification failure. Case-control study. Referral public dermatology center. The study included 100 consecutive patients with clinical hand dermatitis involving the palmar distal phalanx of either thumb and 100 age-, sex-, and ethnicity-matched controls. Patients with an altered thumb print due to other causes and palmar hyperhidrosis were excluded. Fingerprint verification (pass/fail) and hand eczema severity index score. Twenty-seven percent of patients failed fingerprint verification compared with 2% of controls. Fingerprint verification failure was associated with a higher hand eczema severity index score (P < .001). The main fingerprint abnormalities were fingerprint dystrophy (42.0%) and abnormal white lines (79.5%). The number of abnormal white lines was significantly higher among the patients with hand dermatitis compared with controls (P = .001). Among the patients with hand dermatitis, the odds of failing fingerprint verification with fingerprint dystrophy was 4.01. The presence of broad lines and long lines was associated with greater odds of fingerprint verification failure (odds ratio [OR], 8.04; 95% CI, 3.56-18.17 and OR, 2.37; 95% CI, 1.31-4.27, respectively), while the presence of thin lines was protective against verification failure (OR, 0.45; 95% CI, 0.23-0.89). Fingerprint verification failure is a significant problem among patients with more severe hand dermatitis. It is mainly due to fingerprint dystrophy and abnormal white lines. Malaysian National Medical Research Register Identifier: NMRR-11-30-8226

  20. Automated Test Environment for a Real-Time Control System

    NASA Technical Reports Server (NTRS)

    Hall, Ronald O.

    1994-01-01

    An automated environment with hardware-in-the-loop has been developed by Rocketdyne Huntsville for test of a real-time control system. The target system of application is the man-rated real-time system which controls the Space Shuttle Main Engines (SSME). The primary use of the environment is software verification and validation, but it is also useful for evaluation and analysis of SSME avionics hardware and mathematical engine models. It provides a test bed for the integration of software and hardware. The principles and skills upon which it operates may be applied to other target systems, such as those requiring hardware-in-the-loop simulation and control system development. Potential applications are in problem domains demanding highly reliable software systems requiring testing to formal requirements and verifying successful transition to/from off-nominal system states.

  1. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE PAGES

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo; ...

    2017-09-01

    Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature–based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model. We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.

  2. Verification of Eulerian-Eulerian and Eulerian-Lagrangian simulations for turbulent fluid-particle flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Patel, Ravi G.; Desjardins, Olivier; Kong, Bo

    Here, we present a verification study of three simulation techniques for fluid–particle flows, including an Euler–Lagrange approach (EL) inspired by Jackson's seminal work on fluidized particles, a quadrature–based moment method based on the anisotropic Gaussian closure (AG), and the traditional two-fluid model. We perform simulations of two problems: particles in frozen homogeneous isotropic turbulence (HIT) and cluster-induced turbulence (CIT). For verification, we evaluate various techniques for extracting statistics from EL and study the convergence properties of the three methods under grid refinement. The convergence is found to depend on the simulation method and on the problem, with CIT simulations posing fewer difficulties than HIT. Specifically, EL converges under refinement for both HIT and CIT, but statistics exhibit dependence on the postprocessing parameters. For CIT, AG produces similar results to EL. For HIT, converging both TFM and AG poses challenges. Overall, extracting converged, parameter-independent Eulerian statistics remains a challenge for all methods.

  3. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, describing a master data-verification program that uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
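
    As a concrete illustration of the screen-file idea, the sketch below stores verification criteria in a small table and applies them to retrieved records, flagging values that fall outside range limits or change faster than a plausible rate. The parameter names and thresholds are hypothetical, not WATSTORE's actual criteria.

        import pandas as pd

        # the "screen file": verification criteria per parameter (illustrative)
        screen = {
            "stage_ft": {"min": 0.0, "max": 30.0, "max_step": 2.0},
            "flow_cfs": {"min": 0.0, "max": 5000.0, "max_step": 500.0},
        }

        def verify(df):
            # flag values outside range limits or jumping faster than max_step
            flags = pd.DataFrame(False, index=df.index, columns=df.columns)
            for col, crit in screen.items():
                flags[col] |= (df[col] < crit["min"]) | (df[col] > crit["max"])
                flags[col] |= df[col].diff().abs() > crit["max_step"]
            return flags

        data = pd.DataFrame({"stage_ft": [1.2, 1.3, 9.9, 1.4],
                             "flow_cfs": [100.0, 120.0, 110.0, 6000.0]})
        print(verify(data))   # True marks suspect values for manual review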

  4. Verification for measurement-only blind quantum computing

    NASA Astrophysics Data System (ADS)

    Morimae, Tomoyuki

    2014-06-01

    Blind quantum computing is a new secure quantum computing protocol where a client who does not have any sophisticated quantum technology can delegate her quantum computing to a server without leaking any privacy. It is known that a client who has only a measurement device can perform blind quantum computing [T. Morimae and K. Fujii, Phys. Rev. A 87, 050301(R) (2013), 10.1103/PhysRevA.87.050301]. It has been an open problem whether the protocol can enjoy verification, i.e., the ability of the client to check the correctness of the computing. In this paper, we propose a verification protocol for measurement-only blind quantum computing.

  5. Hardware proofs using EHDM and the RSRE verification methodology

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Sjogren, Jon A.

    1988-01-01

    Examined is a methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and suggestions are made for improving both the RSRE methodology and the EHDM system.

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF PARTICULATE CONTAMINANTS IN DRINKING WATER, AQUASOURCE M1A35 ULTRAFILTRATION MEMBRANE SYSTEM AT AQUA2000 RESEARCH CENTER - NSF 00/03/EPADW395

    EPA Science Inventory

    Verification testing of the Aquasource UF unit was conducted over two test periods at the Aqua2000 Research Center in San Diego, CA. The first test period, from 3/5 - 4/19/99, represented winter/spring conditions. The second test period, from 8/25 - 9/28/99, represented summer/fall...

  7. Gender verification testing in sport.

    PubMed

    Ferris, E A

    1992-07-01

    Gender verification testing in sport was first introduced in 1966 by the International Amateur Athletic Federation (IAAF) in response to fears that males with a physical advantage in terms of muscle mass and strength were cheating by masquerading as females in women's competition. It has led to unfair disqualifications of women athletes and untold psychological harm. The discredited sex chromatin test, which identifies only the sex chromosome component of gender and is therefore misleading, was abandoned in 1991 by the IAAF in favour of medical checks for all athletes, women and men, which preclude the need for gender testing. However, women athletes will still be tested at the Olympic Games at Albertville and Barcelona using the polymerase chain reaction (PCR) to amplify DNA sequences on the Y chromosome, which identifies genetic sex only. Gender verification testing may in time be abolished when the sporting community is fully cognizant of its scientific and ethical implications.

  8. Functions of social support and self-verification in association with loneliness, depression, and stress.

    PubMed

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.

  9. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own model of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.
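
    A minimal sketch of model-driven, rule-based verification in this spirit: the volume model is a plain data structure that users copy and modify, and the rules are declarative checks run against it. The field names and rules below are illustrative, not the prototype's actual schema.

        # volume model: a prototypical model users would copy and modify
        volume_model = {
            "volume_id": "TEST_VOL_01",
            "block_size": 2048,
            "directories": [{"name": "DATA", "files": 120}],
        }

        # rule base: each rule is a description plus a predicate over the model
        rules = [
            ("volume_id is non-empty", lambda v: bool(v["volume_id"])),
            ("block_size is 2048",     lambda v: v["block_size"] == 2048),
            ("directories hold files", lambda v: all(d["files"] > 0
                                                     for d in v["directories"])),
        ]

        failures = [msg for msg, check in rules if not check(volume_model)]
        print("PASS" if not failures else failures)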

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    EPA Science Inventory

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  12. Direct Verification of School Meal Applications with Medicaid Data: A Pilot Evaluation of Feasibility, Effectiveness and Costs

    ERIC Educational Resources Information Center

    Logan, Christopher W.; Cole, Nancy; Kamara, Sheku G.

    2010-01-01

    Purpose/Objectives: The Direct Verification Pilot tested the feasibility, effectiveness, and costs of using Medicaid and State Children's Health Insurance Program (SCHIP) data to verify applications for free and reduced-price (FRP) school meals instead of obtaining documentation from parents and guardians. Methods: The Direct Verification Pilot…

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, CUMMINS EMISSION SOLUTIONS AND CUMMINS FILTRATION DIESEL OXIDATION CATALYST AND CLOSED CRANKCASE VENTILATION SYSTEM

    EPA Science Inventory

    The U.S. EPA has created the Environmental Technology Verification (ETV) Program. ETV seeks to provide high-quality, peer-reviewed data on technology performance. The Air Pollution Control Technology (APCT) Verification Center, a center under the ETV Program, is operated by Res...

  14. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    NASA Technical Reports Server (NTRS)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  15. Evaluation of verification and testing tools for FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Smith, K. A.

    1980-01-01

    Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and are recommended for use in testing FORTRAN programs.

  16. Highway noise measurements for verification of prediction models

    DOT National Transportation Integrated Search

    1978-01-01

    Accurate prediction of highway noise has been a major problem for state highway departments. Many noise models have been proposed to alleviate this problem. Results contained in this report will be used to analyze some of these models, and to determi...

  17. Verification and classification bias interactions in diagnostic test accuracy studies for fine-needle aspiration biopsy.

    PubMed

    Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B

    2015-03-01

    Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias. Verification bias and classification bias are 2 sources of bias that commonly occur in diagnostic accuracy studies. Statistical methods are available to estimate the impact of these sources of bias when they occur alone. The impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios was generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios. Specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
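
    The interaction is easy to reproduce in a small Monte Carlo sketch: verification bias (test-positives are verified more often) combined with an imperfect reference standard (classification bias) inflates naive sensitivity and deflates specificity. All rates below are illustrative, not taken from the study.

        import numpy as np

        rng = np.random.default_rng(1)
        n, prev, se, sp = 200_000, 0.3, 0.85, 0.90

        disease = rng.random(n) < prev
        test_pos = np.where(disease, rng.random(n) < se, rng.random(n) > sp)

        # classification bias: the reference standard is itself imperfect
        ref_pos = np.where(disease, rng.random(n) < 0.95, rng.random(n) < 0.05)

        # verification bias: verify 90% of test-positives, 20% of negatives
        verified = rng.random(n) < np.where(test_pos, 0.9, 0.2)

        t, r = test_pos[verified], ref_pos[verified]
        naive_se = (t & r).sum() / r.sum()
        naive_sp = (~t & ~r).sum() / (~r).sum()
        print(f"true Se/Sp: {se:.2f}/{sp:.2f}  naive: {naive_se:.2f}/{naive_sp:.2f}")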

  18. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marleau, Peter; Brubaker, Erik; Deland, Sharon M.

    This report summarizes the discussion and conclusions reached during a tabletop exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP) presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but requires independent means to trust the authenticity of the reference warhead - a standard that may be difficult to achieve, which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.

  19. Control structural interaction testbed: A model for multiple flexible body verification

    NASA Technical Reports Server (NTRS)

    Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.

    1993-01-01

    Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple flexible body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground test effects limits the value of these tests. TRW has developed a building block approach whereby a combination of analysis, simulation, and test has replaced end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit level engineering and acceptance tests refine models, and subsystem level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. This approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) the approach permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware and not in the ground test equipment and environment.

  20. Application of a Near-Field Water Quality Model.

    DTIC Science & Technology

    1979-07-01

    (Recovered from extraction residue: the report's contents and figure list cover verification of centerline temperature decrease, lateral variation of constituents, and variation of plume width, along with results of varying the entrainment and other model coefficients and general plume characteristics.) The model assumes profile forms along the plume axis; these profile forms are then integrated within the basic conservation equations, reducing the problem to a one-dimensional formulation.

  1. PRISMATIC: Unified Hierarchical Probabilistic Verification Tool

    DTIC Science & Technology

    2011-09-01

    (Recovered from extraction fragments.) Application domains include security protocols, such as those for anonymity and quantum cryptography, and biological reaction pathways; PRISM is currently the leading probabilistic model checker. One fragment illustrates the compositional style of reasoning: under a stated assumption, the system as a whole will only deadlock and fail with a probability ≤ p/2, and the assumption allows the overall system verification problem to be partitioned into two parts. Another fragment notes that the PRISMATIC web service can run on any port using the standard HTTP protocol, so that multiple instances can respond to different requests.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DEVILBISS JGHV-531-46FF HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the DeVilbiss JGHV-531-46FF high-volume, low-pressure pressure-feed spray gun, hereafter referred to as the DeVilbiss JGHV, which is designed for use in industrial finishing. The test coating chosen by ITW Industrial Fi...

  3. 40 CFR 1065.308 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers not...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... which you sample and record gas-analyzer concentrations. (b) Measurement principles. This test verifies... appropriate frequency to prevent loss of information. This test also verifies that the measurement system... instructions. Adjust the measurement system as needed to optimize performance. Run this verification with the...

  4. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  5. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.
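
    A bit-for-bit evaluation reduces to exact array comparison between a test run and stored benchmark data. The sketch below shows that pattern; the variable names are illustrative, and LIVV itself operates on model output files rather than in-memory dictionaries.

        import numpy as np

        def bit4bit(bench, test):
            # per-variable status: exact match, numeric difference, or missing
            report = {}
            for name, ref in bench.items():
                if name not in test:
                    report[name] = "missing"
                elif np.array_equal(ref, test[name]):
                    report[name] = "bit-for-bit"
                else:
                    diff = np.abs(ref - test[name]).max()
                    report[name] = f"differs (max |diff| = {diff:.3e})"
            return report

        bench = {"thickness": np.ones((4, 4)), "velocity": np.zeros((4, 4))}
        test = {"thickness": np.ones((4, 4)), "velocity": np.full((4, 4), 1e-12)}
        print(bit4bit(bench, test))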

  6. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent

    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, the calibration and response time of these transmitters and sensors are now verified remotely, automatically, and hands-off; the verification covers more portions of the system and can be performed at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources on other equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also help with the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
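
    One common online-monitoring technique consistent with this description compares each redundant channel against a simple process estimate during steady operation and flags calibration drift beyond a tolerance. The sketch below shows that check in isolation; the channel values and tolerance are hypothetical, not ATR data.

        import numpy as np

        # rows: time samples during steady operation; cols: redundant channels
        readings = np.array([[500.1, 499.8, 500.0, 503.2],
                             [500.3, 499.9, 500.1, 503.4],
                             [500.2, 500.0, 500.2, 503.6]])
        tolerance = 1.5   # engineering units (hypothetical)

        estimate = readings.mean(axis=1, keepdims=True)  # simple process estimate
        offsets = (readings - estimate).mean(axis=0)     # average channel offset
        for ch, off in enumerate(offsets):
            status = "CALIBRATION DRIFT" if abs(off) > tolerance else "ok"
            print(f"channel {ch}: offset {off:+.2f} -> {status}")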

  7. Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul

    1995-01-01

    The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.

  8. Model verification of mixed dynamic systems. [POGO problem in liquid propellant rockets

    NASA Technical Reports Server (NTRS)

    Chrostowski, J. D.; Evensen, D. A.; Hasselman, T. K.

    1978-01-01

    A parameter-estimation method is described for verifying the mathematical model of mixed (combined interactive components from various engineering fields) dynamic systems against pertinent experimental data. The model verification problem is divided into two separate parts: defining a proper model and evaluating the parameters of that model. The main idea is to use differences between measured and predicted behavior (response) to automatically adjust the key parameters of a model so as to minimize response differences. To achieve the goal of modeling flexibility, the method combines the convenience of automated matrix generation with the generality of direct matrix input. The equations of motion are treated in first-order form, allowing for nonsymmetric matrices, modeling of general networks, and complex-mode analysis. The effectiveness of the method is demonstrated for an example problem involving a complex hydraulic-mechanical system.
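
    The core idea, adjusting key parameters to minimize measured-minus-predicted response differences, maps directly onto least-squares fitting. The sketch below applies it to a single-degree-of-freedom damped oscillator; the model and the synthetic "measured" data are illustrative, not the paper's hydraulic-mechanical example.

        import numpy as np
        from scipy.optimize import least_squares

        t = np.linspace(0.0, 5.0, 200)

        def response(params, t):
            zeta, wn = params                 # damping ratio, natural frequency
            wd = wn * np.sqrt(1.0 - zeta**2)  # damped frequency
            return np.exp(-zeta * wn * t) * np.cos(wd * t)

        true = (0.05, 6.0)
        measured = response(true, t) \
            + 0.01 * np.random.default_rng(2).standard_normal(t.size)

        # adjust (zeta, wn) to minimize the response differences
        fit = least_squares(lambda p: response(p, t) - measured,
                            x0=(0.2, 4.0), bounds=([1e-4, 1.0], [0.99, 20.0]))
        print(fit.x)   # recovered parameters, close to (0.05, 6.0)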

  9. Engineering large-scale agent-based systems with consensus

    NASA Technical Reports Server (NTRS)

    Bokma, A.; Slade, A.; Kerridge, S.; Johnson, K.

    1994-01-01

    The paper presents the consensus method for the development of large-scale agent-based systems. Systems can be developed as networks of knowledge-based agents (KBAs) that engage in a collaborative problem-solving effort. The method provides a comprehensive and integrated approach to the development of this type of system. This includes a systematic analysis of user requirements as well as a structured approach to generating a system design which exhibits the desired functionality. There is a direct correspondence between system requirements and design components. The benefits of this approach are that requirements are traceable into design components and code, thus facilitating verification. The use of the consensus method with two major test applications showed it to be successful and also provided valuable insight into problems typically associated with the development of large systems.

  10. A fully-implicit high-order system thermal-hydraulics model for advanced non-LWR safety analyses

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hu, Rui

    An advanced system analysis tool is being developed for advanced reactor safety analysis. This paper describes the underlying physics and numerical models used in the code, including the governing equations, the stabilization schemes, the high-order spatial and temporal discretization schemes, and the Jacobian Free Newton Krylov solution method. The effects of the spatial and temporal discretization schemes are investigated. Additionally, a series of verification test problems are presented to confirm the high-order schemes. Furthermore, it is demonstrated that the developed system thermal-hydraulics model can be strictly verified with the theoretical convergence rates, and that it performs very well for a wide range of flow problems with high accuracy, efficiency, and minimal numerical diffusion.
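
    Verifying a theoretical convergence rate follows a standard recipe: compute the error against an exact solution on successively refined grids and form the observed order of accuracy, p = log(e1/e2) / log(h1/h2). The sketch below applies it to a second-order central difference for u''; it illustrates the procedure, not the paper's thermal-hydraulics code.

        import numpy as np

        def error(n):
            # discrete u'' for u = sin(x) on [0, pi]; exact u'' = -sin(x)
            x = np.linspace(0.0, np.pi, n)
            h = x[1] - x[0]
            u = np.sin(x)
            d2u = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2
            return np.abs(d2u + np.sin(x[1:-1])).max(), h

        (e1, h1), (e2, h2) = error(65), error(129)
        p = np.log(e1 / e2) / np.log(h1 / h2)
        print(f"observed order of accuracy: {p:.2f}")   # ~2, as theory predicts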

  11. A fully-implicit high-order system thermal-hydraulics model for advanced non-LWR safety analyses

    DOE PAGES

    Hu, Rui

    2016-11-19

    An advanced system analysis tool is being developed for advanced reactor safety analysis. This paper describes the underlying physics and numerical models used in the code, including the governing equations, the stabilization schemes, the high-order spatial and temporal discretization schemes, and the Jacobian Free Newton Krylov solution method. The effects of the spatial and temporal discretization schemes are investigated. Additionally, a series of verification test problems are presented to confirm the high-order schemes. Furthermore, it is demonstrated that the developed system thermal-hydraulics model can be strictly verified with the theoretical convergence rates, and that it performs very well for a wide range of flow problems with high accuracy, efficiency, and minimal numerical diffusion.

  12. SU-E-T-490: Independent Three-Dimensional (3D) Dose Verification of VMAT/SBRT Using EPID and Cloud Computing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ding, A; Han, B; Bush, K

    Purpose: Dosimetric verification of VMAT/SBRT is currently performed on one or two planes in a phantom with either film or array detectors. A robust and easy-to-use 3D dosimetric tool has been sought since the advent of conformal radiation therapy. Here we present such a strategy: an independent 3D VMAT/SBRT plan verification system based on the combined use of an EPID and cloud-based Monte Carlo (MC) dose calculation. Methods: The 3D dosimetric verification proceeds in two steps. First, the plan was delivered with a high-resolution portable EPID mounted on the gantry, and the EPID-captured, gantry-angle-resolved VMAT/SBRT field images were converted into fluence by using the EPID pixel response function derived from MC simulations. The fluence was resampled and used as the input for an in-house developed Amazon cloud-based MC software to reconstruct the 3D dose distribution. The accuracy of the developed 3D dosimetric tool was assessed using a Delta4 phantom with various field sizes (square, circular, rectangular, and irregular MLC fields) and different patient cases. The method was applied to validate VMAT/SBRT plans using WFF and FFF photon beams (Varian TrueBeam STX). Results: It was found that the proposed method yielded results consistent with the Delta4 measurements. For points on the two detector planes, good agreement within 1.5% was found for all the testing fields. Patient VMAT/SBRT plan studies revealed a similar level of accuracy: average γ-index passing rates of 99.2 ± 0.6% (3mm/3%), 97.4 ± 2.4% (2mm/2%), and 72.6 ± 8.4% (1mm/1%). Conclusion: A valuable 3D dosimetric verification strategy has been developed for VMAT/SBRT plan validation. The technique provides a viable solution for a number of intractable dosimetry problems, such as small fields and plans with high dose gradients.
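
    The γ-index quoted in the results combines a dose-difference criterion with a distance-to-agreement (DTA) criterion. A minimal 1D version is sketched below for illustration; clinical tools evaluate it in 3D with interpolation, and the profiles here are synthetic.

        import numpy as np

        def gamma_1d(x_ref, d_ref, x_eval, d_eval, dta=3.0, dd=0.03):
            # global gamma: dta in mm, dd as a fraction of max reference dose
            dd_abs = dd * d_ref.max()
            dx = (x_eval[None, :] - x_ref[:, None]) / dta
            dD = (d_eval[None, :] - d_ref[:, None]) / dd_abs
            return np.sqrt(dx**2 + dD**2).min(axis=1)  # per reference point

        x = np.linspace(0.0, 100.0, 201)           # positions in mm
        ref = np.exp(-((x - 50.0) / 15.0) ** 2)    # reference dose profile
        ev = np.exp(-((x - 50.5) / 15.0) ** 2)     # slightly shifted delivery
        g = gamma_1d(x, ref, x, ev)
        print(f"3%/3mm passing rate: {100.0 * (g <= 1.0).mean():.1f}%")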

  13. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

    For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification acceptable.

  14. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests, and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository for validated program values.

  15. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for the planning and tracking of the verification programs.

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: TRITON SYSTEMS, LLC SOLID BOWL CENTRIFUGE, MODEL TS-5000

    EPA Science Inventory

    Verification testing of the Triton Systems, LLC Solid Bowl Centrifuge Model TS-5000 (TS-5000) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The TS-5000 was 48" in diameter and 30" deep, with a bowl capacity of 16 ft3. ...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF MICROBIOLOGICAL & PARTICULATE CONTAMINANTS IN DRINKING WATER: US FILTER 3M10C MICROFILTRATION MEMBRANE SYSTEM AT CHULA VISTA, CALIFORNIA

    EPA Science Inventory

    Verification testing of the US Filter 3M10C membrane system was conducted over a 44-day test period at the Aqua 2000 Research Center in Chula Vista, California. The test period extended from July 24, 2002 to September 5, 2002. The source water was a blend of Colorado River and ...

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF MICROBIOLOGICAL AND PARTICULATE CONTAMINANTS IN DRINKING WATER, HYDRANAUTICS HYDRACAP ULTRAFILTRATION MEMBRANE SYSTEM AT THE AQUA2000 RESEARCH CENTER - NSF 00/04/EPADW395

    EPA Science Inventory

    Verification testing of the Hydranautics HYDRAcap™ Ultrafiltration Membrane System (Hydranautics UF unit) was conducted over two test periods at the Aqua 2000 Research Center in San Diego, CA. The first test period, from 8/3/99-9/13/99, represented summer/fall conditions. The...

  19. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  20. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
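
    One building block of such automated test verification, sketched below under purely hypothetical names and tolerances (the MAP tooling itself is not described at this level in the abstract), is resampling simulation output onto the flight-software test timestamps and flagging out-of-tolerance samples:

        import numpy as np

        def compare_to_hifi(t_test, y_test, t_sim, y_sim, tol):
            """Interpolate simulation output onto test-data timestamps and
            return the worst deviation plus indices exceeding tolerance."""
            y_sim_resampled = np.interp(t_test, t_sim, y_sim)
            err = np.abs(np.asarray(y_test) - y_sim_resampled)
            return err.max(), np.flatnonzero(err > tol)

        # Made-up attitude-rate telemetry vs. simulation output (rad/s)
        t_sim = np.linspace(0.0, 10.0, 1001)
        y_sim = 0.01 * np.sin(t_sim)
        t_test = np.linspace(0.0, 10.0, 101)
        y_test = 0.01 * np.sin(t_test) \
            + 1e-5 * np.random.default_rng(0).standard_normal(101)

        worst, bad = compare_to_hifi(t_test, y_test, t_sim, y_sim, tol=5e-5)
        print(f"worst deviation {worst:.2e} rad/s, "
              f"{bad.size} samples out of tolerance")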

  1. Surface protection overview

    NASA Technical Reports Server (NTRS)

    Levine, S. R.

    1982-01-01

    A first-cut integrated environmental attack life prediction methodology for hot section components is addressed. The HOST program is concerned with oxidation and hot corrosion attack of metallic coatings as well as their degradation by interdiffusion with the substrate. The effects of the environment and coatings on creep/fatigue behavior are being addressed through a joint effort with the Fatigue sub-project. An initial effort will attempt to scope the problem of thermal barrier coating life prediction. Verification of models will be carried out through benchmark rig tests including a 4 atm. replaceable blade turbine and a 50 atm. pressurized burner rig.

  2. Measurement of Plastic Stress and Strain for Analytical Method Verification (MSFC Center Director's Discretionary Fund Project No. 93-08)

    NASA Technical Reports Server (NTRS)

    Price, J. M.; Steeve, B. E.; Swanson, G. R.

    1999-01-01

    The analytical prediction of stress, strain, and fatigue life at locations experiencing local plasticity is full of uncertainties. Much of this uncertainty arises from the material models and their use in the numerical techniques used to solve plasticity problems. Experimental measurements of actual plastic strains would allow the validity of these models and solutions to be tested. This memorandum describes how experimental plastic residual strain measurements were used to verify the results of a thermally induced plastic fatigue failure analysis of a space shuttle main engine fuel pump component.

  3. A validated non-linear Kelvin-Helmholtz benchmark for numerical hydrodynamics

    NASA Astrophysics Data System (ADS)

    Lecoanet, D.; McCourt, M.; Quataert, E.; Burns, K. J.; Vasil, G. M.; Oishi, J. S.; Brown, B. P.; Stone, J. M.; O'Leary, R. M.

    2016-02-01

    The non-linear evolution of the Kelvin-Helmholtz instability is a popular test for code verification. To date, most Kelvin-Helmholtz problems discussed in the literature are ill-posed: they do not converge to any single solution with increasing resolution. This precludes comparisons among different codes and severely limits the utility of the Kelvin-Helmholtz instability as a test problem. The lack of a reference solution has led various authors to assert the accuracy of their simulations based on ad hoc proxies, e.g. the existence of small-scale structures. This paper proposes well-posed two-dimensional Kelvin-Helmholtz problems with smooth initial conditions and explicit diffusion. We show that in many cases numerical errors/noise can seed spurious small-scale structure in Kelvin-Helmholtz problems. We demonstrate convergence to a reference solution using both ATHENA, a Godunov code, and DEDALUS, a pseudo-spectral code. Problems with constant initial density throughout the domain are relatively straightforward for both codes. However, problems with an initial density jump (which are the norm in astrophysical systems) exhibit rich behaviour and are more computationally challenging. In the latter case, ATHENA simulations are prone to an instability of the inner rolled-up vortex; this instability is seeded by grid-scale errors introduced by the algorithm, and disappears as resolution increases. Both ATHENA and DEDALUS exhibit late-time chaos. Inviscid simulations are riddled with extremely vigorous secondary instabilities which induce more mixing than simulations with explicit diffusion. Our results highlight the importance of running well-posed test problems with demonstrated convergence to a reference solution. To facilitate future comparisons, we include as supplementary material the resolved, converged solutions to the Kelvin-Helmholtz problems in this paper in machine-readable form.
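
    Demonstrating convergence to a reference solution boils down to measuring a norm of the difference between each run and the reference as resolution increases. A self-contained sketch, with a smooth shear-layer profile and synthetic per-resolution noise standing in for actual Athena or Dedalus output:

        import numpy as np

        def l2_error(field, reference):
            # Volume-averaged L2 difference; both fields on the reference grid.
            return np.sqrt(np.mean((field - reference) ** 2))

        # Synthetic stand-in: a smooth shear-layer profile as "reference",
        # with per-resolution noise mimicking shrinking discretization error.
        x = np.linspace(0.0, 1.0, 1024)
        reference = np.tanh((x - 0.5) / 0.05)
        for n in (128, 256, 512):
            run = reference + np.random.default_rng(n).normal(0.0, 1.0 / n, x.size)
            print(f"N={n}: L2 error {l2_error(run, reference):.2e}")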

  4. Seismic design verification of LMFBR structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-07-01

    The report provides an assessment of the seismic design verification procedures currently used for nuclear power plant structures, a comparison of dynamic test methods available, and conclusions and recommendations for future LMFBR structures.

  5. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    An ultrasonic flaw detector with a remote-control interface was studied and an automatic verification system for it was developed. By using extensible markup language (XML) to build the instruction-set protocol and the data-analysis method database, the system software achieves a controllable design and copes with the diversity of unreleased device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error-compensation method is proposed that performs the role of the fixed attenuator in traditional verification and improves the accuracy of the verification results. Operating results from the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity of test personnel.

  6. Verification of the numerical model of insert-type joint of scaffolding in relation to experimental research

    NASA Astrophysics Data System (ADS)

    Pieńko, Michał; Błazik-Borowa, Ewa

    2018-01-01

    This paper presents the problem of comparing the results of computer simulations with the results of laboratory tests. The subject of the study was the insert-type joint of scaffolding loaded with a bending moment. The research was carried out on real scaffolding elements. Due to the complexity of the connection, different friction coefficients and depths of wedge insertion were taken into account in the analysis. The aim of conducting the series of analyses was to determine the sensitivity of the model to these characteristics. Since laboratory tests were carried out on real samples, the surfaces involved in load transfer received no special preparation. This caused many problems in clearly defining how individual node elements behave under load. The analysis consists of two stages: a stage in which the connection is established (the wedge is inserted into the rosette), and a loading stage (the node is loaded by the bending moment).

  7. Cleanroom Garment Silicone Contamination

    NASA Technical Reports Server (NTRS)

    Geer, Wayne; Lepage, Colette

    2006-01-01

    The slide presentation reviews actions taken at Goddard Space Flight Center (GSFC) to eliminate contamination by silicone in clean rooms. Background information includes facilities and hardware affected by silicone contamination, a discussion of the negative aspects of silicone contamination, clean room garments, and how the problem was identified at GSFC. Actions taken by the GSFC Contamination Engineering Group and lessons learned are detailed. Results include: awareness of the silicone issue in laundry, increased infrastructure and support for the testing lab, establishment of protocols for garment verification, a closer relationship with the laundry and converter, strengthened specifications for laundry services and garments, testing of all consumables before use in clean rooms, and established procedures to identify and treat silicone found on face masks.

  8. Rapid Verification of Candidate Serological Biomarkers Using Gel-based, Label-free Multiple Reaction Monitoring

    PubMed Central

    Tang, Hsin-Yao; Beer, Lynn A.; Barnhart, Kurt T.; Speicher, David W.

    2011-01-01

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, the cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1-D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μl of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers. PMID:21726088

  9. Rapid verification of candidate serological biomarkers using gel-based, label-free multiple reaction monitoring.

    PubMed

    Tang, Hsin-Yao; Beer, Lynn A; Barnhart, Kurt T; Speicher, David W

    2011-09-02

    Stable isotope dilution-multiple reaction monitoring-mass spectrometry (SID-MRM-MS) has emerged as a promising platform for verification of serological candidate biomarkers. However, cost and time needed to synthesize and evaluate stable isotope peptides, optimize spike-in assays, and generate standard curves quickly becomes unattractive when testing many candidate biomarkers. In this study, we demonstrate that label-free multiplexed MRM-MS coupled with major protein depletion and 1D gel separation is a time-efficient, cost-effective initial biomarker verification strategy requiring less than 100 μL of serum. Furthermore, SDS gel fractionation can resolve different molecular weight forms of targeted proteins with potential diagnostic value. Because fractionation is at the protein level, consistency of peptide quantitation profiles across fractions permits rapid detection of quantitation problems for specific peptides from a given protein. Despite the lack of internal standards, the entire workflow can be highly reproducible, and long-term reproducibility of relative protein abundance can be obtained using different mass spectrometers and LC methods with external reference standards. Quantitation down to ~200 pg/mL could be achieved using this workflow. Hence, the label-free GeLC-MRM workflow enables rapid, sensitive, and economical initial screening of large numbers of candidate biomarkers prior to setting up SID-MRM assays or immunoassays for the most promising candidate biomarkers.

  10. MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Changho; Yang, Won Sik

    This paper presents the methods and performance of the MC2-3 code, a multigroup cross-section generation code for fast reactor analysis, developed to improve the resonance self-shielding and spectrum calculation methods of MC2-2 and to simplify the current multistep schemes for generating region-dependent broad-group cross sections. Using the basic neutron data from ENDF/B data files, MC2-3 solves the consistent P1 multigroup transport equation to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved at ultrafine (2082) or hyperfine (~400 000) group levels. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified temperatures. The pointwise cross sections are used directly in the hyperfine group calculation, whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for a two-dimensional whole-core problem to generate region-dependent broad-group cross sections. Verification tests have been performed using benchmark problems for various fast critical experiments, including Los Alamos National Laboratory critical assemblies; Zero-Power Reactor, Zero-Power Physics Reactor, and Bundesamt für Strahlenschutz experiments; the Monju start-up core; and the Advanced Burner Test Reactor. Verification and validation results with ENDF/B-VII.0 data indicated that eigenvalues from MC2-3/DIF3D agreed with Monte Carlo N-Particle (MCNP5) or VIM Monte Carlo solutions within 200 pcm, and regionwise one-group fluxes were in good agreement with Monte Carlo solutions.
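
    The group collapse at the core of any multigroup cross-section generation code is the flux-weighted average sigma_g = (integral over group g of sigma(E) phi(E) dE) / (integral over group g of phi(E) dE). A self-contained numerical sketch with a toy resonance and a 1/E slowing-down spectrum; this illustrates the collapse formula only, not MC2-3's self-shielding algorithm:

        import numpy as np

        def collapse(energy, sigma, flux, group_edges):
            """Flux-weighted collapse of a pointwise cross section onto
            broad groups: trapz(sigma*flux) / trapz(flux) per group."""
            out = []
            for lo, hi in zip(group_edges[:-1], group_edges[1:]):
                m = (energy >= lo) & (energy <= hi)
                out.append(np.trapz(sigma[m] * flux[m], energy[m]) /
                           np.trapz(flux[m], energy[m]))
            return np.array(out)

        e = np.logspace(0, 6, 20000)                              # energy grid, eV
        sigma = 10.0 + 500.0 / (1.0 + (e - 67.0) ** 2)            # toy resonance at 67 eV
        flux = 1.0 / e                                            # 1/E slowing-down spectrum
        print(collapse(e, sigma, flux, np.array([1.0, 1e2, 1e4, 1e6])))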

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: HVLP COATING EQUIPMENT, ITW AUTOMOTIVE REFINISHING, DEVILBISS GTI-600G, HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the DeVilbiss GTi-600G high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the DeVilbiss GTi, which is designed for use in automotive refinishing. The test coating chosen by ITW Automotive Refinis...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: HVLP COATING EQUIPMENT, ITW AUTOMOTIVE REFINISHING, DEVILBISS FLG-631-318 HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the DeVilbiss FLG-631-318 high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the DeVilbiss FLG, which is designed for use in automotive refinishing. The test coating chosen by ITW Automotive Refi...

  13. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process: a set of specially designed software models used to test RELAP-7.

  14. 76 FR 20536 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ... ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 75 [EPA-HQ-OAR-2009-0837; FRL-9280-9] RIN 2060-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing Correction In rule document 2011-6216 appearing on pages 17288-17325 in the issue of Monday, March 28, 2011...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE; PRACTICAL BEST MANAGEMENT OF GEORGIA, INC., CRYSTALSTREAM™ WATER QUALITY VAULT MODEL 1056

    EPA Science Inventory

    Verification testing of the Practical Best Management, Inc., CrystalStream™ stormwater treatment system was conducted over a 15-month period starting in March 2003. The system was installed at a test site in Griffin, Georgia, and served a drainage basin of approximately 4 ...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, PERFORMANCE TEST RESULTS FOR THE A AND A ENVIRONMENTAL SEALS' SEAL ASSIST SYSTEM (SAS), PHASE I--TECHNOLOGY VERIFICATION REPORT

    EPA Science Inventory

    The report presents results of tests determining the efficacy of A&A Environmental Seals, Inc.'s Seal Assist System (SAS) in preventing compressor rod packing leaks at natural gas compressor stations from escaping into the atmosphere. The SAS consists of an Emission Containment Glan...

  17. Arms Control Verification: 'Bridge' Theories and the Politics of Expediency.

    DTIC Science & Technology

    1983-04-01

    that the compliance verification dilemma, a uniquely American problem, creates a set of opportunities that are, in fact, among the principal reasons for...laws of the class struggle. While Americans were arguing among themselves about whether detente should involve political "linkage," the Chairman...required an equivalent American willingness to persevere indefinitely. But to generate that kind of fervor among the voting populace would have required

  18. Mathematics learning disabilities in girls with fragile X or Turner syndrome during late elementary school.

    PubMed

    Murphy, Melissa M; Mazzocco, Michèle M M

    2008-01-01

    The present study focuses on math and related skills among 32 girls with fragile X (n = 14) or Turner (n = 18) syndrome during late elementary school. Performance in each syndrome group was assessed relative to Full Scale IQ-matched comparison groups of girls from the general population (n = 32 and n = 89 for fragile X syndrome and Turner syndrome, respectively). Differences between girls with fragile X and their comparison group emerged on untimed arithmetic calculations, mastery of counting skills, and arithmetic problem verification accuracy. Relative to girls in the comparison group, girls with Turner syndrome did not differ on untimed arithmetic calculations or problem verification accuracy, but they had limited mastery of counting skills and longer response times to complete the problem verification task. Girls with fragile X or Turner syndrome also differed from their respective comparison groups on math-related abilities, including visual-spatial, working memory, and reading skills, and the associations between math and those related skills. Together, these findings support the notion that difficulty with math and related skills among girls with fragile X or Turner syndrome continues into late elementary school and that the profile of math and related skill difficulty distinguishes the two syndrome groups from each other.

  19. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

    In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO for patient positioning and setup verification has been assessed using a laser tracking device. The accuracy of calibration and image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall motion accuracy of the patient positioning and verification systems was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the optical tracking system (OTS) were found consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.

  20. Empirical evaluation of decision support systems: Needs, definitions, potential methods, and an example pertaining to waterfowl management

    USGS Publications Warehouse

    Sojda, R.S.

    2007-01-01

    Decision support systems are often not empirically evaluated, especially the underlying modelling components. This can be attributed to such systems necessarily being designed to handle complex and poorly structured problems and decision making. Nonetheless, evaluation is critical and should be focused on empirical testing whenever possible. Verification and validation, in combination, comprise such evaluation. Verification is ensuring that the system is internally complete, coherent, and logical from a modelling and programming perspective. Validation is examining whether the system is realistic and useful to the user or decision maker, and should answer the question: “Was the system successful at addressing its intended purpose?” A rich literature exists on verification and validation of expert systems and other artificial intelligence methods; however, no single evaluation methodology has emerged as preeminent. At least five approaches to validation are feasible. First, under some conditions, decision support system performance can be tested against a preselected gold standard. Second, real-time and historic data sets can be used for comparison with simulated output. Third, panels of experts can be judiciously used, but often are not an option in some ecological domains. Fourth, sensitivity analysis of system outputs in relation to inputs can be informative. Fifth, when validation of a complete system is impossible, examining major components can be substituted, recognizing the potential pitfalls. I provide an example of evaluation of a decision support system for trumpeter swan (Cygnus buccinator) management that I developed using interacting intelligent agents, expert systems, and a queuing system. Predicted swan distributions over a 13-year period were assessed against observed numbers. Population survey numbers and banding (ringing) studies may provide long term data useful in empirical evaluation of decision support.
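
    The second validation approach above, comparison of simulated output against real-time or historic data, reduces numerically to error statistics such as RMSE and bias. A sketch with made-up 13-year counts (the actual trumpeter swan data are not reproduced here):

        import numpy as np

        # Hypothetical yearly counts: observed vs. predicted by the DSS
        observed = np.array([112, 130, 125, 140, 155, 149, 160,
                             171, 165, 180, 178, 190, 201])
        predicted = np.array([108, 128, 131, 137, 150, 155, 158,
                              168, 170, 175, 182, 188, 196])

        rmse = np.sqrt(np.mean((predicted - observed) ** 2))   # typical error size
        bias = np.mean(predicted - observed)                   # systematic over/under
        print(f"RMSE {rmse:.1f} birds, bias {bias:+.1f} over {observed.size} years")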

  1. Summary of the 2014 Sandia V&V Challenge Workshop

    DOE PAGES

    Schroeder, Benjamin B.; Hu, Kenneth T.; Mullins, Joshua Grady; ...

    2016-02-19

    A discussion of the five responses to the 2014 Sandia Verification and Validation (V&V) Challenge Problem, presented within this special issue, is provided. Overviews of the challenge problem workshop, the workshop participants, and the problem statement are also included, along with brief summaries of the teams' responses to the challenge problem. Issues that arose across the responses and that are applicable to the general verification, validation, and uncertainty quantification (VVUQ) community are the main focus of this paper. The discussion is organized around a big-picture comparison of data and model usage, VVUQ activities, and the differentiating conceptual themes behind the teams' VVUQ strategies. Significant differences are noted in the teams' approaches to all VVUQ activities, and those deemed most relevant are discussed. Beyond the specific details of the VVUQ implementations, thematic concepts were found to differentiate the approaches; some of the major themes are discussed. Lastly, an encapsulation of the key contributions, the lessons learned, and advice for the future is presented.

  2. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
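
    The method of adjoint-weighted residuals has a compact linear-algebra analogue: for a linear problem Au = b and an output J(u) = gᵀu, solving the adjoint system Aᵀψ = g makes ψᵀr, with r = b - Au_approx, exactly the output error of an approximate solution. A toy sketch of that identity (a dense random system standing in for the paper's discretized Euler equations):

        import numpy as np

        rng = np.random.default_rng(0)
        n = 50
        A = np.diag(np.full(n, 2.0)) + 0.1 * rng.standard_normal((n, n))
        b = rng.standard_normal(n)
        g = rng.standard_normal(n)                 # output weights: J(u) = g @ u

        u_exact = np.linalg.solve(A, b)
        u_coarse = u_exact + 1e-3 * rng.standard_normal(n)   # "discretization" error

        psi = np.linalg.solve(A.T, g)              # adjoint solve
        residual = b - A @ u_coarse
        dJ_est = psi @ residual                    # adjoint-weighted residual estimate
        dJ_true = g @ (u_exact - u_coarse)         # true output error
        print(dJ_est, dJ_true)                     # agree to rounding error

    For the linear case the estimate is exact; for nonlinear discretized PDEs it holds to first order, which is what drives the adjoint-based refinement criterion.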

  3. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  4. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria were proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g., intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.

  5. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    NASA Technical Reports Server (NTRS)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-1A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  6. Development of a pilot-scale kinetic extruder feeder system and test program. Phase II. Verification testing. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-01-12

    This report describes the work done under Phase II, the verification testing of the Kinetic Extruder. The main objective of the test program was to determine failure modes and wear rates. Only minor auxiliary equipment malfunctions were encountered. Wear rates indicate useful life expectancy of from 1 to 5 years for wear-exposed components. Recommendations are made for adapting the equipment for pilot plant and commercial applications. 3 references, 20 figures, 12 tables.

  7. Assessment of test methods for evaluating effectiveness of cleaning flexible endoscopes.

    PubMed

    Washburn, Rebecca E; Pietsch, Jennifer J

    2018-06-01

    Strict adherence to each step of reprocessing is imperative to removing potentially infectious agents. Multiple methods for verifying proper reprocessing exist; however, each presents challenges and limitations, and best practice within the industry has not been established. Our goal was to evaluate endoscope cleaning verification tests, with particular interest in the evaluation of the manual cleaning step. The results of the cleaning verification tests were compared with microbial culturing to see if a positive cleaning verification test would be predictive of microbial growth. This study was conducted at 2 high-volume endoscopy units within a multisite health care system. Each of the 90 endoscopes was tested for adenosine triphosphate, protein, microbial growth via agar plate, and rapid gram-negative culture via assay. The endoscopes were tested in 3 locations: the instrument channel, the control knob, and the elevator mechanism. The analysis showed a substantial level of agreement between protein detection post-manual cleaning and protein detection post-high-level disinfection at the control head for scopes sampled sequentially. This study suggests that if protein is detected post-manual cleaning, there is a significant likelihood that protein will also be detected post-high-level disinfection. It also indicates that a cleaning verification test is not predictive of microbial growth. Copyright © 2018 Association for Professionals in Infection Control and Epidemiology, Inc. Published by Elsevier Inc. All rights reserved.
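
    Agreement between two binary detection outcomes, as reported here, is conventionally quantified with Cohen's kappa, where 0.61 to 0.80 counts as "substantial" on the Landis-Koch scale. A minimal sketch with made-up outcomes for 10 scopes (not the study's data):

        import numpy as np

        def cohens_kappa(a, b):
            # Agreement beyond chance between two binary test outcomes,
            # e.g., protein detected post-manual-clean vs. post-HLD.
            a, b = np.asarray(a, bool), np.asarray(b, bool)
            po = np.mean(a == b)                          # observed agreement
            pe = a.mean() * b.mean() + (1 - a.mean()) * (1 - b.mean())
            return (po - pe) / (1 - pe)

        post_manual = [1, 0, 1, 1, 0, 0, 1, 0, 1, 0]      # hypothetical outcomes
        post_hld    = [1, 0, 1, 0, 0, 0, 1, 0, 1, 0]
        print(f"kappa = {cohens_kappa(post_manual, post_hld):.2f}")  # 0.80: substantial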

  8. Design verification test matrix development for the STME thrust chamber assembly

    NASA Technical Reports Server (NTRS)

    Dexter, Carol E.; Elam, Sandra K.; Sparks, David L.

    1993-01-01

    This report presents the results of the test matrix development for design verification at the component level for the National Launch System (NLS) space transportation main engine (STME) thrust chamber assembly (TCA) components including the following: injector, combustion chamber, and nozzle. A systematic approach was used in the development of the minimum recommended TCA matrix resulting in a minimum number of hardware units and a minimum number of hot fire tests.

  9. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Doane, G. B., III; Glaese, J. R.; Tollison, D. K.; Howsman, T. G.; Curtis, S. (Editor); Banks, B.

    1984-01-01

    Control theory and design, dynamic system modelling, and simulation of test scenarios are the main ideas discussed. The overall effort aims at achieving, at Marshall Space Flight Center, a successful ground test experiment of a large space structure. A simplified planar model for ground test verification was developed. The elimination from that model of the uncontrollable rigid-body modes was also examined. Hardware and software aspects of computation speed were also studied.

  10. Risk Mitigation Testing with the BepiColombo MPO SADA

    NASA Astrophysics Data System (ADS)

    Zemann, J.; Heinrich, B.; Skulicz, A.; Madsen, M.; Weisenstein, W.; Modugno, F.; Althaus, F.; Panhofer, T.; Osterseher, G.

    2013-09-01

    A Solar Array (SA) Drive Assembly (SADA) for the BepiColombo mission is being developed and qualified at RUAG Space Zürich (RSSZ). The system consists of the Solar Array Drive Mechanism (SADM) and the Solar Array Drive Electronics (SADE), which is subcontracted to RUAG Space Austria (RSA). This paper deals with the risk mitigation activities and the lessons learned from this development. In particular, the following topics, substantiated by breadboard (BB) test results, are addressed in detail:
    • Slipring BB test: verification of lifetime and electrical performance of carbon brush technology
    • Potentiometer BB tests: focus on lifetime verification (>650,000 revolutions) and the accuracy requirement
    • SADM EM BB test: subcomponent (front-bearing and gearbox) characterization; complete test campaign equivalent to the QM test
    • EM SADM/SADE combined test: verification of combined performance (accuracy, torque margin) and micro-vibration testing of the SADA system
    • SADE BB test: parameter optimization; test campaign equivalent to the QM test
    The main improvements identified in the frame of BB testing and already implemented in the SADM EM/QM and SADE EQM are:
    • Improved preload device for gearbox
    • Improved motor ball-bearing assembly
    • Position sensor improvements
    • Calibration process for potentiometer
    • SADE motor controller optimization to achieve the required running smoothness
    • Overall improvement of test equipment.

  11. Hydrodynamics-induced variability in the USP apparatus II dissolution test.

    PubMed

    Baxter, Jennifer L; Kukura, Joseph; Muzzio, Fernando J

    2005-03-23

    The USP tablet dissolution test is an analytical tool used for the verification of drug release processes and formulation selection within the pharmaceutical industry. Given the strong impact of this test, it is surprising that operating conditions and testing devices have been selected empirically. In fact, the flow phenomena in the USP test have received little attention in the past. An examination of the hydrodynamics in the USP apparatus II shows that the device is highly vulnerable to mixing problems that can affect testing performance and consistency. Experimental and computational techniques reveal that the flow field within the device is not uniform, and dissolution results can vary dramatically with the position of the tablet within the vessel. Specifically, computations predict sharp variations in the shear along the bottom of the vessel where the tablet is most likely to settle. Experiments in which the tablet location was carefully controlled reveal that the variation of shear within the testing device can affect the measured dissolution rate.

  12. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scribner, R.A.

    Sea-launched cruise missiles (SLCMs) present some particularly striking problems for both national security and arms control. These small, dual-purpose, difficult-to-detect weapons pose formidable challenges for verification in any scheme that attempts to limit rather than eliminate them. Conventionally armed SLCMs offer the navies of both superpowers important offensive and defensive capabilities. Nuclear-armed, long-range, land-attack SLCMs, on the other hand, seem to pose destabilizing threats and otherwise have questionable value, despite strong US support for extensive deployment of them. If these weapons are not constrained, their deployment could circumvent gains which might be made in agreements directly reducing strategic nuclear weapons. This paper reviews the technology and planned deployments of SLCMs, reviews the verification schemes which have been discussed and are being investigated to try to deal with the problem, and examines the proposed need for and possible uses of SLCMs. It presents an overview of the problem technically, militarily, and politically.

  13. Certification of NIST Room Temperature Low-Energy and High-Energy Charpy Verification Specimens

    PubMed Central

    Lucon, Enrico; McCowan, Chris N.; Santoyo, Ray L.

    2015-01-01

    The possibility for NIST to certify Charpy reference specimens for testing at room temperature (21 °C ± 1 °C) instead of −40 °C was investigated by performing 130 room-temperature tests from five low-energy and four high-energy lots of steel on the three master Charpy machines located in Boulder, CO. The statistical analyses performed show that in most cases the variability of results (i.e., the experimental scatter) is reduced when testing at room temperature. For eight out of the nine lots considered, the observed variability was lower at 21 °C than at −40 °C. The results of this study will allow NIST to satisfy requests for room-temperature Charpy verification specimens that have been received from customers for several years: testing at 21 °C removes from the verification process the operator’s skill in transferring the specimen in a timely fashion from the cooling bath to the impact position, and puts the focus back on the machine performance. For NIST, it also reduces the time and cost for certifying new verification lots. For one of the low-energy lots tested with a C-shaped hammer, we experienced two specimens jamming, which yielded unusually high values of absorbed energy. For both specimens, the signs of jamming were clearly visible. For all the low-energy lots investigated, jamming is slightly more likely to occur at 21 °C than at −40 °C, since at room temperature low-energy samples tend to remain in the test area after impact rather than exiting in the opposite direction of the pendulum swing. In the evaluation of a verification set, any jammed specimen should be removed from the analyses. PMID:26958453

  14. Certification of NIST Room Temperature Low-Energy and High-Energy Charpy Verification Specimens.

    PubMed

    Lucon, Enrico; McCowan, Chris N; Santoyo, Ray L

    2015-01-01

    The possibility for NIST to certify Charpy reference specimens for testing at room temperature (21 °C ± 1 °C) instead of -40 °C was investigated by performing 130 room-temperature tests from five low-energy and four high-energy lots of steel on the three master Charpy machines located in Boulder, CO. The statistical analyses performed show that in most cases the variability of results (i.e., the experimental scatter) is reduced when testing at room temperature. For eight out of the nine lots considered, the observed variability was lower at 21 °C than at -40 °C. The results of this study will allow NIST to satisfy requests for room-temperature Charpy verification specimens that have been received from customers for several years: testing at 21 °C removes from the verification process the operator's skill in transferring the specimen in a timely fashion from the cooling bath to the impact position, and puts the focus back on the machine performance. For NIST, it also reduces the time and cost for certifying new verification lots. For one of the low-energy lots tested with a C-shaped hammer, we experienced two specimens jamming, which yielded unusually high values of absorbed energy. For both specimens, the signs of jamming were clearly visible. For all the low-energy lots investigated, jamming is slightly more likely to occur at 21 °C than at -40 °C, since at room temperature low-energy samples tend to remain in the test area after impact rather than exiting in the opposite direction of the pendulum swing. In the evaluation of a verification set, any jammed specimen should be removed from the analyses.
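
    A minimal sketch of the variability comparison behind these conclusions, using the coefficient of variation of absorbed energy for one hypothetical lot at both temperatures (the numbers below are made up, not NIST data):

        import numpy as np

        # Hypothetical absorbed-energy results (J) for one low-energy lot
        e_room = np.array([15.9, 16.2, 16.0, 15.8, 16.1, 16.0, 15.9, 16.1])  # 21 C
        e_cold = np.array([15.5, 16.4, 15.8, 16.3, 15.6, 16.5, 15.7, 16.2])  # -40 C

        for label, e in (("21 C", e_room), ("-40 C", e_cold)):
            cv = 100.0 * e.std(ddof=1) / e.mean()      # sample coefficient of variation
            print(f"{label}: mean {e.mean():.2f} J, CV {cv:.2f}%")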

  15. 40 CFR 1065.369 - H2O, CO, and CO2 interference verification for photoacoustic alcohol analyzers.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct these other measurements to test the compensation algorithms during the analyzer...

  16. 78 FR 1162 - Cardiovascular Devices; Reclassification of External Cardiac Compressor

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

    ... safety and electromagnetic compatibility; For devices containing software, software verification... electromagnetic compatibility; For devices containing software, software verification, validation, and hazard... electrical components, appropriate analysis and testing must validate electrical safety and electromagnetic...

  17. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time was ∼180 s per treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s of irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
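
    The two reported comparison metrics have a compact numerical form: the mean dose over a region of interest and the near-maximum dose D2, i.e., the dose exceeded by only 2% of the voxels. A self-contained sketch on synthetic dose grids (the array sizes, the 1% noise level, and the ROI definitions are illustrative assumptions, not the clinical system):

        import numpy as np

        def dose_stats(dose, mask):
            # Mean dose and near-maximum dose D2 (98th percentile, i.e., the
            # dose received by the hottest 2% of voxels) inside an ROI mask.
            d = dose[mask]
            return d.mean(), np.percentile(d, 98.0)

        # Toy 3D dose grids: planned vs. reconstructed-from-EPID
        rng = np.random.default_rng(1)
        planned = rng.gamma(2.0, 1.0, size=(40, 40, 40))
        reconstructed = planned * (1.0 + rng.normal(0.0, 0.01, planned.shape))
        target = planned > np.percentile(planned, 90)    # stand-in target volume
        nontarget = ~target & (planned > 0.1)            # stand-in non-target ROI

        for name, roi in (("target", target), ("non-target", nontarget)):
            m_p, d2_p = dose_stats(planned, roi)
            m_r, d2_r = dose_stats(reconstructed, roi)
            print(f"{name}: mean dev {100*(m_r-m_p)/m_p:+.2f}%, "
                  f"D2 dev {100*(d2_r-d2_p)/d2_p:+.2f}%")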

  18. New Physical Optics Method for Curvilinear Refractive Surfaces and its Verification in the Design and Testing of W-band Dual-Aspheric Lenses

    DTIC Science & Technology

    2013-10-01

    A. Altintas and V. Yurchenko, EEE Department, Bilkent University, Ankara.

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF PARTICULATE CONTAMINANTS IN DRINKING WATER: POLYMEM UF 120 S2 ULTRAFILTRATION MEMBRANE MODULE, LUXEMBURG, WISCONSIN

    EPA Science Inventory

    Verification testing of the Polymem UF120 S2 Ultrafiltration Membrane Module was conducted over a 46-day period at the Green Bay Water Utility Filtration Plant, Luxemburg, Wisconsin. The ETV testing described herein was funded in conjunction with a 12-month membrane pilot study f...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: WATTS PREMIER M-SERIES M-15,000 REVERSE OSMOSIS TREATMENT SYSTEM

    EPA Science Inventory

    Verification testing of the Watts Premier M-Series M-15,000 RO Treatment System was conducted over a 31-day period from April 26, 2004, through May 26, 2004. This test was conducted at the Coachella Valley Water District (CVWD) Well 7802 in Thermal, California. The source water...
