Sample records for validation analytical verification

  1. The MCNP6 Analytic Criticality Benchmark Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.

    2016-06-16

    Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
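
    A verification suite of this kind ultimately reduces to comparing each computed multiplication factor against its exact analytical value within the reported statistical uncertainty. The Python sketch below illustrates that bookkeeping; the benchmark names, k-effective values, and the three-sigma acceptance criterion are illustrative placeholders, not values or criteria from the actual VERIFICATION_KEFF suite.

      # Exact analytical k-effective for each benchmark problem (illustrative values).
      benchmarks = {
          "bare-sphere-1group": 1.000000,
          "slab-2group": 0.997520,
      }

      # Monte Carlo results: (computed k-eff, 1-sigma statistical uncertainty), illustrative.
      mc_results = {
          "bare-sphere-1group": (0.99985, 0.00020),
          "slab-2group": (0.99806, 0.00030),
      }

      def verify(benchmarks, mc_results, n_sigma=3.0):
          """Flag problems whose computed k-eff differs from the exact value by more
          than n_sigma statistical uncertainties."""
          failures = []
          for name, k_exact in benchmarks.items():
              k_mc, sigma = mc_results[name]
              if abs(k_mc - k_exact) > n_sigma * sigma:
                  failures.append((name, k_mc, k_exact, sigma))
          return failures

      for name, k_mc, k_exact, sigma in verify(benchmarks, mc_results):
          print(f"FAIL {name}: k_mc = {k_mc:.5f} vs k_exact = {k_exact:.6f} (sigma = {sigma:.5f})")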

  2. [Validation and verification of microbiology methods].

    PubMed

    Camaró-Sala, María Luisa; Martínez-García, Rosana; Olmos-Martínez, Piedad; Catalá-Cuenca, Vicente; Ocete-Mochón, María Dolores; Gimeno-Cardona, Concepción

    2015-01-01

    Clinical microbiologists should ensure, to the maximum level allowed by scientific and technical development, the reliability of their results. This implies that, in addition to meeting the technical criteria that ensure their validity, tests must be performed under conditions that allow comparable results to be obtained regardless of the laboratory that performs them. In this sense, the use of recognized and accepted reference methods is the most effective tool to provide these guarantees. Activities related to the verification and validation of analytical methods have become very important, as there is continuous development and updating of techniques, increasingly complex analytical equipment, and an interest among professionals in ensuring the quality of processes and results. The definitions of validation and verification are described, along with the different types of validation/verification, the types of methods, and the level of validation necessary depending on the degree of standardization. The situations in which validation/verification is mandatory and/or recommended are discussed, including those particularly related to validation in microbiology. The importance of promoting the use of reference strains and standard controls in microbiology is stressed, as is the importance of participation in External Quality Assessment programs to demonstrate technical competence. Emphasis is placed on how to calculate some of the parameters required for validation/verification, such as accuracy and precision. The development of these concepts can be found in the SEIMC microbiological procedure number 48: «Validation and verification of microbiological methods» (www.seimc.org/protocols/microbiology). Copyright © 2013 Elsevier España, S.L.U. y Sociedad Española de Enfermedades Infecciosas y Microbiología Clínica. All rights reserved.
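
    As an illustration of the kind of calculation the abstract refers to, the Python sketch below computes accuracy (as percent recovery against a reference value) and precision (as a coefficient of variation) from replicate measurements. The replicate data, reference value, and choice of these particular estimators are assumptions for illustration only, not taken from the SEIMC procedure.

      from statistics import mean, stdev

      replicates = [98.2, 101.5, 99.8, 100.9, 97.6]   # replicate results (illustrative units)
      reference_value = 100.0                          # assigned or reference value

      recovery_pct = 100.0 * mean(replicates) / reference_value    # accuracy as % recovery
      cv_pct = 100.0 * stdev(replicates) / mean(replicates)        # precision as % CV

      print(f"accuracy (recovery): {recovery_pct:.1f} %")
      print(f"precision (CV):      {cv_pct:.1f} %")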

  3. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, it has evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique application of computer-assisted software engineering (CASE) tools to reverse engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  4. Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2002-01-01

    Ground-based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. With integral teamwork by planners, analysts, and test personnel, however, test segments can be included that validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold-operation optical throughput are supplemented by segments for analytical verification of specific structural, thermal, and optical parameters. Drawing on integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.

  5. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment through operation to final retirement. In recent years, interest in this concept has also been growing for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and finally retirement of the method. Regulatory bodies have also increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉, "The Analytical Procedure Lifecycle", for integration into the USP. Furthermore, growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design based method development results in increased method robustness. This reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, which strongly contributes to reduced costs of the method over its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  6. Optimization and Verification of Droplet Digital PCR Event-Specific Methods for the Quantification of GM Maize DAS1507 and NK603.

    PubMed

    Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir

    2018-05-01

    In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials to be used as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform. However, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL working group on "Method Verification" (2011), Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The digital PCR methods performed as well as or better than the qPCR methods. The optimized ddPCR methods thus confirm their suitability for GMO determination in food and feed.
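
    For context, droplet dPCR quantification typically rests on a Poisson correction of the positive-droplet fraction, with GMO content then reported as the ratio of event-specific to reference-gene copies. The Python sketch below shows that generic calculation; the droplet counts, droplet volume, and assay labels are illustrative assumptions and do not reproduce the validated DAS1507/NK603 methods.

      import math

      def copies_per_microliter(positive, total, droplet_volume_nl=0.85):
          """Poisson-corrected target concentration: lambda = -ln(1 - p) copies per
          droplet, divided by the droplet volume (here in nanolitres)."""
          p = positive / total
          lam = -math.log(1.0 - p)
          return lam / (droplet_volume_nl * 1.0e-3)     # copies per microlitre

      event_copies = copies_per_microliter(positive=1450, total=15000)   # transgene assay
      ref_copies = copies_per_microliter(positive=6100, total=15200)     # taxon reference assay

      gmo_content_pct = 100.0 * event_copies / ref_copies
      print(f"estimated GMO content: {gmo_content_pct:.2f} % (copy/copy ratio)")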

  7. High Fidelity Modeling of Field-Reversed Configuration (FRC) Thrusters (Briefing Charts)

    DTIC Science & Technology

    2017-05-24

    Briefing charts on verification and validation of FRC thruster models. Verification: asymptotic models and analytical solutions yield exact convergence tests ("converged math → irrelevant solutions?"; one must be aware of valid assumption regions). Validation: a fluids example using Stokes flow. MARTIN, SOUSA, TRAN (AFRL/RQRS). DISTRIBUTION A - APPROVED FOR PUBLIC RELEASE.

  8. Analytical methodology for safety validation of computer controlled subsystems. Volume 1 : state-of-the-art and assessment of safety verification/validation methodologies

    DOT National Transportation Integrated Search

    1995-09-01

    This report describes the development of a methodology designed to assure that a sufficiently high level of safety is achieved and maintained in computer-based systems which perform safety critical functions in high-speed rail or magnetic levitation ...

  9. Development, Verification and Validation of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for radiative heating has become a requirement. This paper presents the recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute view factors for radiation problems involving multiple surfaces. Furthermore, verification and validation of the code's radiation capabilities are demonstrated by comparing solutions to analytical results, to other codes, and to radiant test data.

  10. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures that the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describe four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. Reference: Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
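
    A miniature example of the kind of test such a QA suite automates is sketched below in Python: a 1D heat-conduction problem is solved numerically, compared against its closed-form solution, and judged against a pass/fail tolerance. The problem setup, grid, and tolerance are illustrative assumptions, not an actual PFLOTRAN benchmark.

      import math

      # 1D conduction into a semi-infinite medium: u(x,0) = 0, u(0,t) = 1.
      # Diffusivity, domain length, end time, and grid are illustrative choices.
      D, L, T = 1.0e-6, 1.0, 2.0e4           # m^2/s, m, s
      nx = 201
      dx = L / (nx - 1)
      dt = dx * dx / (6.0 * D)               # r = 1/6: stable and low truncation error
      x = [i * dx for i in range(nx)]

      u = [0.0] * nx
      u[0] = 1.0                             # boundary condition; far end held at 0
      t = 0.0
      while t < T:
          un = u[:]
          for i in range(1, nx - 1):
              u[i] = un[i] + D * dt / dx**2 * (un[i + 1] - 2.0 * un[i] + un[i - 1])
          t += dt

      # Closed-form solution for the semi-infinite medium: u = erfc(x / (2 sqrt(D t))).
      exact = [math.erfc(xi / (2.0 * math.sqrt(D * t))) for xi in x]

      num = math.sqrt(sum((ui - ei) ** 2 for ui, ei in zip(u, exact)))
      den = math.sqrt(sum(ei ** 2 for ei in exact))
      rel_l2 = num / den
      print(f"relative L2 error = {rel_l2:.3e} ->", "PASS" if rel_l2 < 1.0e-2 else "FAIL")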

  11. Verification of an Analytical Method for Measuring Crystal Nucleation Rates in Glasses from DTA Data

    NASA Technical Reports Server (NTRS)

    Ranasinghe, K. S.; Wei, P. F.; Kelton, K. F.; Ray, C. S.; Day, D. E.

    2004-01-01

    A recently proposed analytical (DTA) method for estimating the nucleation rates in glasses has been evaluated by comparing experimental data with numerically computed nucleation rates for a model lithium disilicate glass. The time and temperature dependent nucleation rates were predicted using the model and compared with those values from an analysis of numerically calculated DTA curves. The validity of the numerical approach was demonstrated earlier by a comparison with experimental data. The excellent agreement between the nucleation rates from the model calculations and from the computer generated DTA data demonstrates the validity of the proposed analytical DTA method.

  12. Closed Loop Requirements and Analysis Management

    NASA Technical Reports Server (NTRS)

    Lamoreaux, Michael; Verhoef, Brett

    2015-01-01

    Effective systems engineering involves the use of analysis in the derivation of requirements and verification of designs against those requirements. The initial development of requirements often depends on analysis for the technical definition of specific aspects of a product. Following the allocation of system-level requirements to a product's components, the closure of those requirements often involves analytical approaches to verify that the requirement criteria have been satisfied. Meanwhile, changes that occur in between these two processes need to be managed in order to achieve a closed-loop requirement derivation/verification process. Herein are presented concepts for employing emerging Teamcenter capabilities to jointly manage requirements and analysis data such that analytical techniques are utilized to effectively derive and allocate requirements, analyses are consulted and updated during the change evaluation processes, and analyses are leveraged during the design verification process. Recommendations on concept validation case studies are also discussed.

  13. Analytical Formulation for Sizing and Estimating the Dimensions and Weight of Wind Turbine Hub and Drivetrain Components

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Guo, Y.; Parsons, T.; King, R.

    This report summarizes the theory, verification, and validation of a new sizing tool for wind turbine drivetrain components, the Drivetrain Systems Engineering (DriveSE) tool. DriveSE calculates the dimensions and mass properties of the hub, main shaft, main bearing(s), gearbox, bedplate, transformer if up-tower, and yaw system. The level of fidelity for each component varies depending on whether semiempirical parametric or physics-based models are used. The physics-based models have internal iteration schemes based on system constraints and design criteria. Every model is validated against available industry data or finite-element analysis. The verification and validation results show that the models reasonably capture primary drivers for the sizing and design of major drivetrain components.

  14. Classical workflow nets and workflow nets with reset arcs: using Lyapunov stability for soundness verification

    NASA Astrophysics Data System (ADS)

    Clempner, Julio B.

    2017-01-01

    This paper presents a novel analytical method for soundness verification of workflow nets and reset workflow nets, using the well-known stability results of Lyapunov for Petri nets. We also prove that the soundness property is decidable for workflow nets and reset workflow nets. In addition, we provide evidence of several outcomes related to properties such as boundedness, liveness, reversibility and blocking using stability. Our approach is validated theoretically and by a numerical example related to traffic signal-control synchronisation.

  15. TOUGH Simulations of the Updegraff's Set of Fluid and Heat Flow Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Moridis, G.J.; Pruess

    1992-11-01

    The TOUGH code [Pruess, 1987] for two-phase flow of water, air, and heat in permeable media has been exercised on a suite of test problems originally selected and simulated by C. D. Updegraff [1989]. These include five 'verification' problems for which analytical or numerical solutions are available, and three 'validation' problems that model laboratory fluid and heat flow experiments. All problems could be run without any code modifications. Good and efficient numerical performance, as well as accurate results, were obtained throughout. Additional code verification and validation problems from the literature are briefly summarized, and suggestions are given for proper applications of TOUGH and related codes.

  16. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development and lower operating costs. However, as those systems close control loops and arbitrate resources on board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques and concrete experiments at NASA.

  17. Verification and Validation of Autonomy Software at NASA

    NASA Technical Reports Server (NTRS)

    Pecheur, Charles

    2000-01-01

    Autonomous software holds the promise of new operation possibilities, easier design and development, and lower operating costs. However, as those systems close control loops and arbitrate resources on-board with specialized reasoning, the range of possible situations becomes very large and uncontrollable from the outside, making conventional scenario-based testing very inefficient. Analytic verification and validation (V&V) techniques, and model checking in particular, can provide significant help for designing autonomous systems in a more efficient and reliable manner, by providing better coverage and allowing early error detection. This article discusses the general issue of V&V of autonomy software, with an emphasis towards model-based autonomy, model-checking techniques, and concrete experiments at NASA.

  18. Accurate Biomass Estimation via Bayesian Adaptive Sampling

    NASA Technical Reports Server (NTRS)

    Wheeler, Kevin R.; Knuth, Kevin H.; Castle, Joseph P.; Lvov, Nikolay

    2005-01-01

    The following concepts were introduced: (a) Bayesian adaptive sampling for solving biomass estimation; (b) characterization of MISR Rahman model parameters conditioned upon MODIS land cover; (c) a rigorous non-parametric Bayesian approach to analytic mixture model determination; and (d) a unique U.S. asset for science product validation and verification.

  19. Verification of spatial and temporal pressure distributions in segmented solid rocket motors

    NASA Technical Reports Server (NTRS)

    Salita, Mark

    1989-01-01

    A wide variety of analytical tools are in use today to predict the history and spatial distributions of pressure in the combustion chambers of solid rocket motors (SRMs). Experimental and analytical methods are presented here that allow the verification of many of these predictions. These methods are applied to the redesigned Space Shuttle solid rocket motor (RSRM). Girth strain-gage data are compared to the predictions of various one-dimensional quasisteady analyses in order to verify the axial drop in motor static pressure during ignition transients as well as quasisteady motor operation. The results of previous modeling of radial flows in the bore, slots, and around grain overhangs are supported by approximate analytical and empirical techniques presented here. The predictions of circumferential flows induced by inhibitor asymmetries, nozzle vectoring, and propellant slump are compared to each other and to subscale cold air and water tunnel measurements to ascertain their validity.

  20. Verification and Validation of EnergyPlus Phase Change Material Model for Opaque Wall Assemblies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tabares-Velasco, P. C.; Christensen, C.; Bianchi, M.

    2012-08-01

    Phase change materials (PCMs) represent a technology that may reduce peak loads and HVAC energy consumption in buildings. A few building energy simulation programs have the capability to simulate PCMs, but their accuracy has not been completely tested. This study shows the procedure used to verify and validate the PCM model in EnergyPlus using a similar approach as dictated by ASHRAE Standard 140, which consists of analytical verification, comparative testing, and empirical validation. This process was valuable, as two bugs were identified and fixed in the PCM model, and version 7.1 of EnergyPlus will have a validated PCM model. Preliminary results using whole-building energy analysis show that careful analysis should be done when designing PCMs in homes, as their thermal performance depends on several variables such as PCM properties and location in the building envelope.

  1. Fluorescence In Situ Hybridization Probe Validation for Clinical Use.

    PubMed

    Gu, Jun; Smith, Janice L; Dowling, Patricia K

    2017-01-01

    In this chapter, we provide a systematic overview of the published guidelines and validation procedures for fluorescence in situ hybridization (FISH) probes for clinical diagnostic use. FISH probes, which are classified as molecular probes or analyte-specific reagents (ASRs), have been extensively used in vitro for both clinical diagnosis and research. Most commercially available FISH probes in the United States are strictly regulated by the U.S. Food and Drug Administration (FDA), the Centers for Disease Control and Prevention (CDC), the Centers for Medicare & Medicaid Services (CMS), the Clinical Laboratory Improvement Amendments (CLIA), and the College of American Pathologists (CAP). Although home-brewed FISH probes, defined as probes made in-house or acquired from a source that does not supply them to other laboratories, are not regulated by these agencies, they too must undergo the same individual validation process prior to clinical use as their commercial counterparts. Validation of a FISH probe involves initial validation and ongoing verification of the test system. Initial validation includes assessment of a probe's technical specifications, establishment of its standard operating procedure (SOP), determination of its clinical sensitivity and specificity, development of its cutoff, baseline, and normal reference ranges, gathering of analytics, confirmation of its applicability to a specific research or clinical setting, testing of samples with or without the abnormalities that the probe is meant to detect, staff training, and report building. Ongoing verification of the test system involves testing additional normal and abnormal samples using the same method employed during the initial validation of the probe.
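
    To make two of the quantitative steps concrete, the Python sketch below derives a normal cutoff from known-normal specimens and then computes clinical sensitivity and specificity from specimens of known status. The scoring data, the mean-plus-three-standard-deviations cutoff, and the tiny sample sizes are illustrative assumptions; actual laboratories follow their own SOP (often an inverse-beta/binomial cutoff and far larger sample numbers).

      from statistics import mean, stdev

      # Percent abnormal-signal nuclei scored in known-normal specimens (illustrative).
      normal_background = [1.0, 0.5, 2.0, 1.5, 0.0, 1.0, 0.5, 2.5, 1.0, 0.5]
      cutoff = mean(normal_background) + 3.0 * stdev(normal_background)
      print(f"normal cutoff: {cutoff:.2f} % abnormal nuclei")

      # (known abnormal?, percent abnormal nuclei) for specimens of known status (illustrative).
      validation_set = [(True, 42.0), (True, 18.5), (True, 9.0), (False, 1.0),
                        (False, 3.0), (True, 55.0), (False, 0.5), (False, 2.0)]

      tp = sum(1 for abn, pct in validation_set if abn and pct >= cutoff)
      fn = sum(1 for abn, pct in validation_set if abn and pct < cutoff)
      tn = sum(1 for abn, pct in validation_set if not abn and pct < cutoff)
      fp = sum(1 for abn, pct in validation_set if not abn and pct >= cutoff)

      print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")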

  2. CSTI Earth-to-orbit propulsion research and technology program overview

    NASA Technical Reports Server (NTRS)

    Gentz, Steven J.

    1993-01-01

    NASA supports a vigorous Earth-to-orbit (ETO) research and technology program as part of its Civil Space Technology Initiative. The purpose of this program is to provide an up-to-date technology base to support future space transportation needs for a new generation of lower cost, operationally efficient, long-lived and highly reliable ETO propulsion systems by enhancing the knowledge, understanding and design methodology applicable to advanced oxygen/hydrogen and oxygen/hydrocarbon ETO propulsion systems. Program areas of interest include analytical models, advanced component technology, instrumentation, and validation/verification testing. Organizationally, the program is divided into two areas: (1) technology acquisition and (2) technology verification.

  3. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity, discovered in the course of the test, necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test-verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  4. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A.

    This document proposes to provide a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers, and books reviewed is given in Appendix 1. Available validation documents and guides are listed in the appendix; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of validation at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all documents were published in English.

  5. Validation and Verification of Composite Pressure Vessel Design

    NASA Technical Reports Server (NTRS)

    Kreger, Stephen T.; Ortyl, Nicholas; Grant, Joseph; Taylor, F. Tad

    2006-01-01

    Ten composite pressure vessels were instrumented with fiber Bragg grating sensors and pressure tested through burst. This paper and presentation will discuss the testing methodology and the test results, compare the test results to the analytical model, and compare the fiber Bragg grating sensor data with that obtained from foil strain gages.

  6. Extremely accurate sequential verification of RELAP5-3D

    DOE PAGES

    Mesina, George L.; Aumiller, David L.; Buschman, Francis X.

    2015-11-19

    Large computer programs like RELAP5-3D solve complex systems of governing, closure and special process equations to model the underlying physics of nuclear power plants. Further, these programs incorporate many other features for physics, input, output, data management, user-interaction, and post-processing. For software quality assurance, the code must be verified and validated before being released to users. For RELAP5-3D, verification and validation are restricted to nuclear power plant applications. Verification means ensuring that the program is built right by checking that it meets its design specifications, comparing coding to algorithms and equations and comparing calculations against analytical solutions and the method of manufactured solutions. Sequential verification performs these comparisons initially, but thereafter only compares code calculations between consecutive code versions to demonstrate that no unintended changes have been introduced. Recently, an automated, highly accurate sequential verification method has been developed for RELAP5-3D. The method also tests that no unintended consequences result from code development in the following code capabilities: repeating a timestep advancement, continuing a run from a restart file, multiple cases in a single code execution, and modes of coupled/uncoupled operation. In conclusion, mathematical analyses of the adequacy of the checks used in the comparisons are provided.
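
    The essence of sequential verification is a tight comparison of computed quantities between consecutive code versions. The Python sketch below shows that comparison in generic form; the variable names, values, and tolerance are illustrative assumptions and are not drawn from RELAP5-3D itself.

      # Plot variables extracted from two consecutive code versions (illustrative values).
      baseline = {"p-140010000": 7.118234567e6, "tempf-140010000": 557.123456}   # version N
      candidate = {"p-140010000": 7.118234567e6, "tempf-140010000": 557.123449}  # version N+1

      def sequential_diff(baseline, candidate, rel_tol=1.0e-12):
          """Return variables whose relative change between versions exceeds rel_tol."""
          drift = {}
          for key, ref in baseline.items():
              rel = abs(candidate[key] - ref) / max(abs(ref), 1.0)
              if rel > rel_tol:
                  drift[key] = rel
          return drift

      for key, rel in sequential_diff(baseline, candidate).items():
          print(f"unintended change in {key}: relative difference {rel:.3e}")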

  7. Verification and Validation of a Coordinate Transformation Method in Axisymmetric Transient Magnetics.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ashcraft, C. Chace; Niederhaus, John Henry; Robinson, Allen C.

    We present a verification and validation analysis of a coordinate-transformation-based numerical solution method for the two-dimensional axisymmetric magnetic diffusion equation, implemented in the finite-element simulation code ALEGRA. The transformation, suggested by Melissen and Simkin, yields an equation set perfectly suited for linear finite elements and for problems with large jumps in material conductivity near the axis. The verification analysis examines transient magnetic diffusion in a rod or wire in a very low conductivity background by first deriving an approximate analytic solution using perturbation theory. This approach for generating a reference solution is shown to be not fully satisfactory. A specialized approach for manufacturing an exact solution is then used to demonstrate second-order convergence under spatial refinement and temporal refinement. For this new implementation, a significant improvement relative to previously available formulations is observed. Benefits in accuracy for computed current density and Joule heating are also demonstrated. The validation analysis examines the circuit-driven explosion of a copper wire using resistive magnetohydrodynamics modeling, in comparison to experimental tests. The new implementation matches the accuracy of the existing formulation, with both formulations capturing the experimental burst time and action to within approximately 2%.
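
    Demonstrating second-order convergence under refinement usually amounts to computing an observed order from error norms on successively refined meshes. The Python sketch below shows that standard calculation; the error values and refinement ratio are illustrative, not ALEGRA results.

      import math

      h = [0.04, 0.02, 0.01]            # mesh spacings (refinement ratio 2), illustrative
      err = [3.2e-3, 8.1e-4, 2.0e-4]    # error norms vs. the manufactured solution, illustrative

      for i in range(len(h) - 1):
          p = math.log(err[i] / err[i + 1]) / math.log(h[i] / h[i + 1])
          print(f"observed order between h={h[i]} and h={h[i + 1]}: p = {p:.2f}")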

  8. Development of Advanced Verification and Validation Procedures and Tools for the Certification of Learning Systems in Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen; Schumann, Johann; Gupta, Pramod; Richard, Michael; Guenther, Kurt; Soares, Fola

    2005-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable automatic flight control and vehicle recovery, autonomous flight, and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments. In order for adaptive control systems to be used in safety-critical aerospace applications, they must be proven to be highly safe and reliable. Rigorous methods for adaptive software verification and validation must be developed to ensure that control system software failures will not occur. Of central importance in this regard is the need to establish reliable methods that guarantee convergent learning, rapid convergence (learning) rate, and algorithm stability. This paper presents the major problems of adaptive control systems that use learning to improve performance. The paper then presents the major procedures and tools presently developed or currently being developed to enable the verification, validation, and ultimate certification of these adaptive control systems. These technologies include the application of automated program analysis methods, techniques to improve the learning process, analytical methods to verify stability, methods to automatically synthesize code, simulation and test methods, and tools to provide on-line software assurance.

  9. Modeling interfacial fracture in Sierra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang

    2013-09-01

    This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using experimental results on asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.

  10. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  11. Verification of the Icarus Material Response Tool

    NASA Technical Reports Server (NTRS)

    Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre

    2017-01-01

    Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve the understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool that is intended to be used for design while maintaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing is critical, it is of the utmost importance that the design tools be extensively verified and validated before their use. Verification tests aim at ensuring that the numerical schemes and equations are implemented correctly by comparison to analytical solutions and grid convergence tests.

  12. Fracture mechanics life analytical methods verification testing

    NASA Technical Reports Server (NTRS)

    Favenesi, J. A.; Clemmons, T. G.; Lambert, T. J.

    1994-01-01

    Verification and validation of the basic information capabilities in NASCRAC have been completed. The basic information includes computation of K versus a, J versus a, and crack opening area versus a. These quantities represent building blocks which NASCRAC uses in its other computations, such as fatigue crack life and tearing instability. Several methods were used to verify and validate the basic information capabilities. The simple configurations, such as the compact tension specimen and a crack in a finite plate, were verified and validated versus handbook solutions for simple loads. For general loads using weight functions, offline integration using standard FORTRAN routines was performed. For more complicated configurations such as corner cracks and semielliptical cracks, NASCRAC solutions were verified and validated versus published results and finite element analyses. A few minor problems were identified in the basic information capabilities of the simple configurations. In the more complicated configurations, significant differences between NASCRAC and reference solutions were observed because NASCRAC calculates its solutions as averaged values across the entire crack front, whereas the reference solutions were computed for a single point.
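
    As an example of the kind of "basic information" being checked, the Python sketch below evaluates a handbook stress-intensity-factor solution, K versus crack length a, for a center-cracked plate with the common secant finite-width correction. The geometry, load, and the choice of this particular configuration are illustrative assumptions; NASCRAC's own solutions are not reproduced here.

      import math

      def k_center_crack(sigma, a, width):
          """Mode-I stress intensity factor for a center crack of half-length a in a
          plate of width `width`, using the common secant finite-width correction."""
          correction = math.sqrt(1.0 / math.cos(math.pi * a / width))
          return sigma * math.sqrt(math.pi * a) * correction

      sigma = 100.0                       # remote tensile stress, MPa (illustrative)
      width = 0.10                        # plate width, m (illustrative)
      for a in (0.005, 0.010, 0.020):     # half crack lengths, m
          k = k_center_crack(sigma, a, width)
          print(f"a = {1000 * a:4.0f} mm   K = {k:5.1f} MPa*sqrt(m)")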

  13. Prototype test article verification of the Space Station Freedom active thermal control system microgravity performance

    NASA Technical Reports Server (NTRS)

    Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.

    1993-01-01

    To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.

  14. TOPEX Microwave Radiometer - Thermal design verification test and analytical model validation

    NASA Technical Reports Server (NTRS)

    Lin, Edward I.

    1992-01-01

    The testing of the TOPEX Microwave Radiometer (TMR) is described in terms of hardware development based on the modeling and thermal vacuum testing conducted. The TMR and the vacuum-test facility are described, and the thermal verification test includes a hot steady-state segment, a cold steady-state segment, and a cold survival mode segment totalling 65 hours. A graphic description is given of the test history in terms of temperature tracking, and two multinode TMR test-chamber models are compared to the test results. Large discrepancies between the test data and the model predictions are attributed to contact conductance, effective emittance from the multilayer insulation, and heat leaks related to deviations from the flight configuration. The TMR thermal testing/modeling effort is shown to provide technical corrections for the procedure outlined, and the need for validating predictive models is underscored.

  15. Measurement of Plastic Stress and Strain for Analytical Method Verification (MSFC Center Director's Discretionary Fund Project No. 93-08)

    NASA Technical Reports Server (NTRS)

    Price, J. M.; Steeve, B. E.; Swanson, G. R.

    1999-01-01

    The analytical prediction of stress, strain, and fatigue life at locations experiencing local plasticity is full of uncertainties. Much of this uncertainty arises from the material models and their use in the numerical techniques used to solve plasticity problems. Experimental measurements of actual plastic strains would allow the validity of these models and solutions to be tested. This memorandum describes how experimental plastic residual strain measurements were used to verify the results of a thermally induced plastic fatigue failure analysis of a space shuttle main engine fuel pump component.

  16. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, the requirement to verify the system design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSNs' requirements, structure and behaviour. Then, it translates the SysML elements to an analytic model, specifically, to a Deterministic Stochastic Petri Net. The proposed approach makes it possible to design WSNs and to study their behaviour and energy performance.

  17. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 45 Public Welfare 1 2013-10-01 2013-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  18. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 1 2014-10-01 2014-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  19. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  20. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 45 Public Welfare 1 2012-10-01 2012-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort may...

  1. Final Report - Regulatory Considerations for Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Wilkinson, Chris; Lynch, Jonathan; Bharadwaj, Raj

    2013-01-01

    This report documents the findings of a preliminary research study into new approaches to the software design assurance of adaptive systems. We suggest a methodology to overcome the software validation and verification difficulties posed by the underlying assumption of non-adaptive software in the requirements-based testing verification methods in RTCA/DO-178B and C. An analysis of the relevant RTCA/DO-178B and C objectives is presented showing the reasons for the difficulties that arise in showing satisfaction of the objectives and suggested additional means by which they could be satisfied. We suggest that the software design assurance problem for adaptive systems is principally one of developing correct and complete high level requirements and system level constraints that define the necessary system functional and safety properties to assure the safe use of adaptive systems. We show how analytical techniques such as model based design, mathematical modeling and formal or formal-like methods can be used to both validate the high level functional and safety requirements, establish necessary constraints and provide the verification evidence for the satisfaction of requirements and constraints that supplements conventional testing. Finally the report identifies the follow-on research topics needed to implement this methodology.

  2. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748

  3. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
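
    As a simplified illustration of the kind of computation such a framework formalizes, the Python sketch below evaluates the textbook normal-approximation sample size for comparing two group means at a given effect size, significance level, and power. It is an assumption-laden stand-in for illustration, not the workshop's actual statistical framework.

      import math
      from statistics import NormalDist

      def n_per_group(delta, alpha=0.05, power=0.90):
          """Specimens per group for a two-sided, two-sample comparison of means;
          delta is the standardized effect size (difference in SD units)."""
          z = NormalDist().inv_cdf
          return math.ceil(2.0 * (z(1.0 - alpha / 2.0) + z(power)) ** 2 / delta ** 2)

      for delta in (0.25, 0.5, 1.0):
          print(f"standardized effect {delta:>4}: {n_per_group(delta)} specimens per group")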

  4. Benchmarking on Tsunami Currents with ComMIT

    NASA Astrophysics Data System (ADS)

    Sharghi vand, N.; Kanoglu, U.

    2015-12-01

    There were no standards for the validation and verification of tsunami numerical models before the 2004 Indian Ocean tsunami. Even so, a number of numerical models had been used for inundation mapping efforts, evaluation of critical structures, etc., without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which will be used for the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental, and field benchmark problems aimed at estimating maximum runup, which are widely accepted by the community. Recently, benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents, held February 9-10, 2015 in Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on the validation and verification of tsunami numerical models with respect to tsunami currents. Three of the benchmark problems were: current measurements of the Japan 2011 tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), a user-friendly interface developed by NCTR to the validated and verified Method of Splitting Tsunami (MOST) model (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316). The modeling results are compared with the required benchmark data, showing good agreement, and the results are discussed. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe).

  5. Verification and Validation (V&V) Methodologies for Multiphase Turbulent and Explosive Flows. V&V Case Studies of Computer Simulations from Los Alamos National Laboratory GMFIX codes

    NASA Astrophysics Data System (ADS)

    Dartevelle, S.

    2006-12-01

    Large-scale volcanic eruptions are inherently hazardous events that cannot be characterized by detailed and accurate in situ measurements; volcanic explosive phenomenology is therefore inadequately constrained in terms of initial and inflow conditions. Consequently, little to no real-time data exist to verify and validate computer codes developed to model these geophysical events as a whole. However, code verification and validation remains a necessary step, particularly as volcanologists increasingly use numerical data for the mitigation of volcanic hazards. The Verification and Validation (V&V) process formally assesses the level of 'credibility' of numerical results produced within a range of specific applications. The first step, verification, is 'the process of determining that a model implementation accurately represents the conceptual description of the model', which requires either exact analytical solutions or highly accurate simplified experimental data. The second step, validation, is 'the process of determining the degree to which a model is an accurate representation of the real world', which requires complex experimental data of the 'real world' physics. The verification step is relatively simple to achieve formally, while, in the context of 'real world' explosive volcanism, the validation step is nearly impossible. Hence, instead of validating the code against the whole large-scale, unconstrained volcanic phenomenology, we focus on the key physics that control these volcanic clouds, viz., momentum-driven supersonic jets and multiphase turbulence. We compare numerical results against a set of simple but well-constrained analog experiments that uniquely and unambiguously represent these two key phenomenologies separately. Herewith, we use GMFIX (Geophysical Multiphase Flow with Interphase eXchange, v1.62), a set of multiphase-CFD FORTRAN codes which have recently been redeveloped to meet the strict Quality Assurance, verification, and validation requirements of the Office of Civilian Radioactive Waste Management of the US Dept of Energy. GMFIX solves Navier-Stokes and energy partial differential equations for each phase with appropriate turbulence and interfacial coupling between phases. For momentum-driven single- to multi-phase underexpanded jets, the position of the first Mach disk is known empirically as a function of both the pressure ratio, K, and the particle mass fraction, Phi, at the nozzle: the higher K, the further downstream the Mach disk, and the higher Phi, the further upstream the first Mach disk. We show that GMFIX captures these two essential features. In addition, GMFIX displays all the properties found in these jets, such as expansion fans, incident and reflected shocks, and subsequent downstream Mach discs, which make this code ideal for further investigations of equivalent volcanological phenomena. One of the other most challenging aspects of volcanic phenomenology is the multiphase nature of turbulence. We also validated GMFIX by comparing velocity profiles and turbulence quantities against well-constrained analog experiments: the computed velocity profiles agree with the analog ones, as do the profiles of turbulence production. Overall, the verification and validation experiments, although inherently challenging, suggest that GMFIX captures the most essential dynamical properties of multiphase supersonic flows and jets.

  6. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F.; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described, incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of the 2D Euler equations, the 3D Navier-Stokes equations, and turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
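
    The Method of Manufactured Solutions mentioned above can be illustrated on a much simpler problem than the Euler or Navier-Stokes equations. The sketch below, which is not the LAVA implementation, manufactures a solution to a 1D Poisson problem, derives the implied source term symbolically, and checks that a second-order finite-difference solver recovers the expected order of accuracy under grid refinement.

```python
# Minimal MMS sketch: manufacture u(x) = sin(pi x), derive the source term for
# -u'' = f symbolically, solve with a second-order finite-difference scheme,
# and confirm the observed order of accuracy approaches 2 under refinement.
import numpy as np
import sympy as sp

x = sp.symbols("x")
u_exact = sp.sin(sp.pi * x)
f_expr = -sp.diff(u_exact, x, 2)          # source term implied by the manufactured solution
f = sp.lambdify(x, f_expr, "numpy")
u = sp.lambdify(x, u_exact, "numpy")

def solve_poisson(n):
    """Second-order FD solve of -u'' = f on [0,1] with exact Dirichlet BCs."""
    xs = np.linspace(0.0, 1.0, n + 1)
    h = xs[1] - xs[0]
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    rhs = f(xs[1:-1])
    rhs[0] += u(xs[0]) / h**2             # boundary contributions (zero here)
    rhs[-1] += u(xs[-1]) / h**2
    uh = np.linalg.solve(A, rhs)
    return np.max(np.abs(uh - u(xs[1:-1])))   # discrete max-norm error

errors = {n: solve_poisson(n) for n in (16, 32, 64, 128)}
ns = sorted(errors)
for coarse, fine in zip(ns, ns[1:]):
    p = np.log(errors[coarse] / errors[fine]) / np.log(2.0)
    print(f"n={fine:4d}  error={errors[fine]:.3e}  observed order ≈ {p:.2f}")
```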

  7. VALIDATION OF ANALYTICAL METHODS AND INSTRUMENTATION FOR BERYLLIUM MEASUREMENT: REVIEW AND SUMMARY OF AVAILABLE GUIDES, PROCEDURES, AND PROTOCOLS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ekechukwu, A

    Method validation is the process of evaluating whether an analytical method is acceptable for its intended purpose. For pharmaceutical methods, guidelines from the United States Pharmacopeia (USP), International Conference on Harmonisation (ICH), and the United States Food and Drug Administration (USFDA) provide a framework for performing such validations. In general, methods for regulatory compliance must include studies on specificity, linearity, accuracy, precision, range, detection limit, quantitation limit, and robustness. Elements of these guidelines are readily adapted to the issue of validation for beryllium sampling and analysis. This document provides a listing of available sources which can be used to validate analytical methods and/or instrumentation for beryllium determination. A literature review was conducted of available standard methods and publications used for method validation and/or quality control. A comprehensive listing of the articles, papers and books reviewed is given in the Appendix. Available validation documents and guides are listed therein; each has a brief description of application and use. In the referenced sources, there are varying approaches to validation and varying descriptions of the validation process at different stages in method development. This discussion focuses on validation and verification of fully developed methods and instrumentation that have been offered up for use or approval by other laboratories or official consensus bodies such as ASTM International, the International Standards Organization (ISO) and the Association of Official Analytical Chemists (AOAC). This review was conducted as part of a collaborative effort to investigate and improve the state of validation for measuring beryllium in the workplace and the environment. Documents and publications from the United States and Europe are included. Unless otherwise specified, all referenced documents were published in English.

  8. Standards and Guidelines for Numerical Models for Tsunami Hazard Mitigation

    NASA Astrophysics Data System (ADS)

    Titov, V.; Gonzalez, F.; Kanoglu, U.; Yalciner, A.; Synolakis, C. E.

    2006-12-01

    An increasing number of nations around the world need to develop tsunami mitigation plans, which invariably involve inundation maps for warning guidance and evacuation planning. There is the risk that inundation maps may be produced with older or untested methodology, as there are currently no standards for modeling tools. In the aftermath of the 2004 megatsunami, some models were used to model inundation for Cascadia events with results much larger than sediment records and existing state-of-the-art studies suggest, leading to confusion among emergency managers. Incorrectly assessing tsunami impact is hazardous, as recent events in 2006 in Tonga, Kythira, Greece and Central Java have suggested (Synolakis and Bernard, 2006). To calculate tsunami currents, forces and runup on coastal structures, and inundation of coastlines, one must calculate numerically the evolution of the tsunami wave from the deep ocean to its target site. No matter what the numerical model, validation (the process of ensuring that the model solves the parent equations of motion accurately) and verification (the process of ensuring that the model used represents geophysical reality appropriately) are both essential. Validation ensures that the model performs well in a wide range of circumstances and is accomplished through comparison with analytical solutions. Verification ensures that the computational code performs well over a range of geophysical problems. A few analytic solutions have themselves been validated with laboratory data. Even fewer existing numerical models have been both validated with the analytical solutions and verified with both laboratory measurements and field measurements, thus establishing a gold standard for numerical codes for inundation mapping. While there is in principle no absolute certainty that a numerical code that has performed well in all the benchmark tests will also produce correct inundation predictions with any given source motions, validated codes reduce the level of uncertainty in their results to the uncertainty in the geophysical initial conditions. Further, when coupled with real-time free-field tsunami measurements from tsunameters, validated codes are the only choice for realistic forecasting of inundation; the consequences of failure are too ghastly to take chances with numerical procedures that have not been validated. We discuss a ten-step process of benchmark tests for models used for inundation mapping. The associated methodology and algorithms have to first be validated with analytical solutions, then verified with laboratory measurements and field data. The models need to be published in the scientific literature in peer-reviewed journals indexed by ISI. While this process may appear onerous, it reflects our state of knowledge, and is the only defensible methodology when human lives are at stake. Synolakis, C.E., and Bernard, E.N., Tsunami science before and beyond Boxing Day 2004, Phil. Trans. R. Soc. A 364(1845), 2231-2263, 2005.
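
    One analytical benchmark widely used in this kind of tsunami-code benchmarking is the non-breaking solitary-wave runup law of Synolakis (1987). The sketch below evaluates that law for a canonical plane-beach case; the formula and parameter values are taken from the general literature, not from the abstract above, and are included only as an illustration of what an analytical benchmark target looks like.

```python
# Sketch of an analytical benchmark often used in tsunami model benchmarking:
# the non-breaking solitary-wave runup law of Synolakis (1987),
#   R/d = 2.831 * sqrt(cot(beta)) * (H/d)**(5/4),
# where d is the offshore depth, H the solitary-wave height, and beta the beach
# slope. A numerical inundation code run on the same case should reproduce R.
import math

def solitary_wave_runup(H_over_d, beach_slope_deg):
    """Maximum runup R/d for a non-breaking solitary wave on a plane beach."""
    cot_beta = 1.0 / math.tan(math.radians(beach_slope_deg))
    return 2.831 * math.sqrt(cot_beta) * H_over_d ** 1.25

# Example: H/d = 0.019 on a 1:19.85 beach (a canonical laboratory configuration)
slope_deg = math.degrees(math.atan(1.0 / 19.85))
print(f"R/d ≈ {solitary_wave_runup(0.019, slope_deg):.4f}")
```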

  9. A carrier-based analytical theory for negative capacitance symmetric double-gate field effect transistors and its simulation verification

    NASA Astrophysics Data System (ADS)

    Jiang, Chunsheng; Liang, Renrong; Wang, Jing; Xu, Jun

    2015-09-01

    A carrier-based analytical drain current model for negative capacitance symmetric double-gate field effect transistors (NC-SDG FETs) is proposed by solving the differential equation of the carrier, the Pao-Sah current formulation, and the Landau-Khalatnikov equation. The carrier equation is derived from Poisson’s equation and the Boltzmann distribution law. According to the model, an amplified semiconductor surface potential and a steeper subthreshold slope could be obtained with suitable thicknesses of the ferroelectric film and insulator layer at room temperature. Results predicted by the analytical model agree well with those of the numerical simulation from a 2D simulator without any fitting parameters. The analytical model is valid for all operation regions and captures the transitions between them without any auxiliary variables or functions. This model can be used to explore the operating mechanisms of NC-SDG FETs and to optimize device performance.
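
    The negative-capacitance behaviour described above stems from the steady-state Landau-Khalatnikov relation. The sketch below uses a generic form of that relation with hypothetical Landau coefficients, chosen only to illustrate the shape of the response; it is not the paper's parameterization or its coupled drain-current model.

```python
# Generic steady-state Landau-Khalatnikov sketch: the voltage dropped across a
# ferroelectric film of thickness t_fe as a function of its polarization P,
#   V_fe = t_fe * (2*a*P + 4*b*P**3 + 6*c*P**5),
# which has dV/dP < 0 around P = 0 when a < 0 -- the "negative capacitance"
# region that amplifies the semiconductor surface potential.
import numpy as np

def v_ferro(P, t_fe, a, b, c=0.0):
    """Ferroelectric voltage drop from the steady-state LK relation."""
    return t_fe * (2.0 * a * P + 4.0 * b * P**3 + 6.0 * c * P**5)

# Hypothetical Landau coefficients, chosen only to illustrate the qualitative shape
a, b, t_fe = -1.0e9, 5.0e10, 10e-9            # illustrative magnitudes only
P = np.linspace(-0.15, 0.15, 7)                # polarization charge density, C/m^2
dVdP = np.gradient(v_ferro(P, t_fe, a, b), P)  # local differential "capacitance" sign
for p, s in zip(P, dVdP):
    print(f"P={p:+.3f} C/m^2   dV/dP={s:+.3e}  ({'NC region' if s < 0 else 'normal'})")
```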

  10. Shielded-Twisted-Pair Cable Model for Chafe Fault Detection via Time-Domain Reflectometry

    NASA Technical Reports Server (NTRS)

    Schuet, Stefan R.; Timucin, Dogan A.; Wheeler, Kevin R.

    2012-01-01

    This report details the development, verification, and validation of an innovative physics-based model of electrical signal propagation through shielded-twisted-pair cable, which is commonly found on aircraft and offers an ideal proving ground for detection of small holes in a shield well before catastrophic damage occurs. The accuracy of this model is verified through numerical electromagnetic simulations using a commercially available software tool. The model is shown to be representative of more realistic (analytically intractable) cable configurations as well. A probabilistic framework is developed for validating the model accuracy with reflectometry data obtained from real aircraft-grade cables chafed in the laboratory.
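
    The basic physics that makes chafe detection by reflectometry possible can be summarized with elementary transmission-line relations. The sketch below is not the report's physics-based cable model; it only shows, under assumed impedance and propagation-speed values, how a local impedance perturbation produces a small reflection whose round-trip delay locates the fault.

```python
# Minimal TDR sketch: a chafe that locally perturbs the cable impedance produces
# a reflection with coefficient Gamma = (Z_fault - Z0) / (Z_fault + Z0); the
# round-trip delay of that echo locates the fault at d = v * t / 2.
Z0 = 100.0          # nominal differential impedance of the twisted pair, ohms (assumed)
Z_fault = 110.0     # locally increased impedance at the chafe (hypothetical)
v = 0.7 * 3.0e8     # signal propagation speed, ~0.7 c (typical value, assumed)
d_fault = 12.0      # true fault location in metres (hypothetical)

gamma = (Z_fault - Z0) / (Z_fault + Z0)      # reflection coefficient of the chafe
t_echo = 2.0 * d_fault / v                   # round-trip delay of the echo
print(f"reflection coefficient = {gamma:+.4f}")
print(f"echo arrives after {t_echo*1e9:.1f} ns -> estimated location {v*t_echo/2:.2f} m")
```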

  11. The NASA Hyper-X Program

    NASA Technical Reports Server (NTRS)

    Freeman, Delman C., Jr.; Reubush, David E.; McClinton, Charles R.; Rausch, Vincent L.; Crawford, J. Larry

    1997-01-01

    This paper provides an overview of NASA's Hyper-X Program, a focused hypersonic technology effort designed to move hypersonic, airbreathing vehicle technology from the laboratory environment to the flight environment. This paper presents an overview of the flight test program, research objectives, approach, schedule and status. A substantial experimental database has been completed and concept validation performed. The program is currently concentrating on development, verification and validation of the first, Mach 7, vehicle in preparation for wind-tunnel testing in 1998 and flight testing in 1999. In parallel with this effort, the Mach 5 and 10 vehicle designs are being finalized. Detailed analytical and experimental evaluation of the Mach 7 vehicle at the flight conditions is nearing completion, and will provide a database for validation of design methods once flight test data are available.

  12. Case Study: Test Results of a Tool and Method for In-Flight, Adaptive Control System Verification on a NASA F-15 Flight Research Aircraft

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.; Schumann, Johann; Guenther, Kurt; Bosworth, John

    2006-01-01

    Adaptive control technologies that incorporate learning algorithms have been proposed to enable autonomous flight control and to maintain vehicle performance in the face of unknown, changing, or poorly defined operating environments [1-2]. At the present time, however, it is unknown how adaptive algorithms can be routinely verified, validated, and certified for use in safety-critical applications. Rigorous methods for adaptive software verification and validation must be developed to ensure that the control software functions as required and is highly safe and reliable. A large gap appears to exist between the point at which control system designers feel the verification process is complete, and when FAA certification officials agree it is complete. Certification of adaptive flight control software verification is complicated by the use of learning algorithms (e.g., neural networks) and degrees of system non-determinism. Of course, analytical efforts must be made in the verification process to place guarantees on learning algorithm stability, rate of convergence, and convergence accuracy. However, to satisfy FAA certification requirements, it must be demonstrated that the adaptive flight control system is also able to fail and still allow the aircraft to be flown safely or to land, while at the same time providing a means of crew notification of the (impending) failure. It was for this purpose that the NASA Ames Confidence Tool was developed [3]. This paper presents the Confidence Tool as a means of providing in-flight software assurance monitoring of an adaptive flight control system. The paper will present the data obtained from flight testing the tool on a specially modified F-15 aircraft designed to simulate loss of flight control surfaces.

  13. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study

    PubMed Central

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Background Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules’ performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Methods Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. Results A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2–4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2–3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2–3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Conclusion Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved. PMID:26730980
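
    The diagnostic odds ratio that this meta-analysis pools is computed from a 2x2 table of rule prediction versus observed outcome. The sketch below shows that calculation with a Wald-type confidence interval on the log scale; the counts are hypothetical and only illustrate the arithmetic.

```python
# Sketch of the basic quantity pooled in the meta-analysis: the diagnostic
# odds ratio (DOR) of a clinical prediction rule from a 2x2 validation table,
# with a Wald-type 95% CI on the log scale. Counts below are hypothetical.
import math

def diagnostic_odds_ratio(tp, fp, fn, tn):
    """Return (DOR, 95% CI) from a 2x2 table of rule prediction vs outcome."""
    dor = (tp * tn) / (fp * fn)
    se_log = math.sqrt(1/tp + 1/fp + 1/fn + 1/tn)
    lo = math.exp(math.log(dor) - 1.96 * se_log)
    hi = math.exp(math.log(dor) + 1.96 * se_log)
    return dor, (lo, hi)

dor, ci = diagnostic_odds_ratio(tp=90, fp=30, fn=10, tn=170)
print(f"DOR = {dor:.1f}  (95% CI {ci[0]:.1f}-{ci[1]:.1f})")
```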

  14. Design Characteristics Influence Performance of Clinical Prediction Rules in Validation: A Meta-Epidemiological Study.

    PubMed

    Ban, Jong-Wook; Emparanza, José Ignacio; Urreta, Iratxe; Burls, Amanda

    2016-01-01

    Many new clinical prediction rules are derived and validated. But the design and reporting quality of clinical prediction research has been less than optimal. We aimed to assess whether design characteristics of validation studies were associated with the overestimation of clinical prediction rules' performance. We also aimed to evaluate whether validation studies clearly reported important methodological characteristics. Electronic databases were searched for systematic reviews of clinical prediction rule studies published between 2006 and 2010. Data were extracted from the eligible validation studies included in the systematic reviews. A meta-analytic meta-epidemiological approach was used to assess the influence of design characteristics on predictive performance. From each validation study, it was assessed whether 7 design and 7 reporting characteristics were properly described. A total of 287 validation studies of clinical prediction rules were collected from 15 systematic reviews (31 meta-analyses). Validation studies using a case-control design produced a summary diagnostic odds ratio (DOR) 2.2 times (95% CI: 1.2-4.3) larger than validation studies using cohort or unclear designs. When differential verification was used, the summary DOR was overestimated by twofold (95% CI: 1.2-3.1) compared to complete, partial and unclear verification. The summary relative DOR (RDOR) of validation studies with inadequate sample size was 1.9 (95% CI: 1.2-3.1) compared to studies with adequate sample size. Study site, reliability, and the clinical prediction rule were adequately described in 10.1%, 9.4%, and 7.0% of validation studies, respectively. Validation studies with design shortcomings may overestimate the performance of clinical prediction rules. The quality of reporting among studies validating clinical prediction rules needs to be improved.

  15. Design, analysis, and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Mardesich, N.; Minning, C.

    1982-01-01

    Design sensitivities are established for the development of photovoltaic module criteria and the definition of needed research tasks. The program consists of three phases. In Phase I, analytical models were developed to perform optical, thermal, electrical, and structural analyses on candidate encapsulation systems. From these analyses several candidate systems will be selected for qualification testing during Phase II. Additionally, during Phase II, test specimens of various types will be constructed and tested to determine the validity of the analysis methodology developed in Phase I. In Phase III, a finalized optimum design based on knowledge gained in Phases I and II will be developed. All verification testing was completed during this period. Preliminary results and observations are discussed. Descriptions of the thermal, thermal structural, and structural deflection test setups are included.

  16. Implementation of a Blowing Boundary Condition in the LAURA Code

    NASA Technical Reports Server (NTRS)

    Thompson, Richard A.; Gnoffo, Peter A.

    2008-01-01

    Preliminary steps toward modeling a coupled ablation problem using a finite-volume Navier-Stokes code (LAURA) are presented in this paper. Implementation of a surface boundary condition with mass transfer (blowing) is described followed by verification and validation through comparisons with analytic results and experimental data. Application of the code to a carbon-nosetip ablation problem is demonstrated and the results are compared with previously published data. It is concluded that the code and coupled procedure are suitable to support further ablation analyses and studies.

  17. Development Approaches Coupled with Verification and Validation Methodologies for Agent-Based Mission-Level Analytical Combat Simulations

    DTIC Science & Technology

    2004-03-01

    When applying experience to new situations, the process is very similar. Faced with a new situation, a human generally looks for ways in which...find the best course of action, the human would compare current goals to those it faced in the previous experiences and choose the path that...154. Saperstein, Alvin (1995) “War and Chaos”. American Scientist, vol. 84. November-December 1995. pp. 548-557. 155. Sargent, Robert G . (1991

  18. Prediction of turning stability using receptance coupling

    NASA Astrophysics Data System (ADS)

    Jasiewicz, Marcin; Powałka, Bartosz

    2018-01-01

    This paper addresses machining stability prediction for the dynamic "lathe - workpiece" system using the receptance coupling method. Dynamic properties of the lathe components (the spindle and the tailstock) are assumed to be constant and can be determined experimentally from the results of an impact test. Hence, the variable element of the "machine tool - holder - workpiece" system is the machined part, which can be easily modelled analytically. The receptance coupling method enables a synthesis of the experimental (spindle, tailstock) and analytical (machined part) models, so impact testing of the entire system becomes unnecessary. The paper presents the methodology for synthesising the analytical and experimental models, the evaluation of the stability lobes, and an experimental validation procedure involving both the determination of the dynamic properties of the system and cutting tests. Finally, the experimental verification results are presented and discussed.
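
    The core idea of receptance coupling can be shown with a scalar sketch. Below, two substructure receptances (one standing in for the experimentally measured spindle side, one for the analytically modelled workpiece) are rigidly coupled at a single coordinate; the parameter values are illustrative only and are not taken from the paper's spindle/tailstock models.

```python
# Minimal receptance-coupling sketch (scalar, rigid point coupling): the
# assembled drive-point receptance at the joint of substructures A and B is
#   G = (H_A^-1 + H_B^-1)^-1,
# i.e. the dynamic stiffnesses of the two substructures add at the interface.
import numpy as np

def sdof_receptance(omega, m, k, c):
    """Receptance H(w) = x/F of a single-degree-of-freedom system."""
    return 1.0 / (k - m * omega**2 + 1j * c * omega)

omega = 2 * np.pi * np.linspace(10, 2000, 2000)           # rad/s
H_A = sdof_receptance(omega, m=20.0, k=4.0e8, c=800.0)    # "measured" spindle-side FRF (illustrative)
H_B = sdof_receptance(omega, m=2.0,  k=6.0e7, c=120.0)    # analytical workpiece FRF (illustrative)

G = 1.0 / (1.0 / H_A + 1.0 / H_B)                         # rigidly coupled assembly receptance

i_peak = np.argmax(np.abs(G))
print(f"dominant coupled resonance ≈ {omega[i_peak] / (2*np.pi):.0f} Hz, "
      f"|G| = {np.abs(G[i_peak]):.3e} m/N")
```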

  19. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  20. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  1. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  2. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... to be used in the verification and validation process, consistent with appendix C to this part. The...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the...

  3. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study, which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data for validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  4. Validation of a finite element method framework for cardiac mechanics applications

    NASA Astrophysics Data System (ADS)

    Danan, David; Le Rolle, Virginie; Hubert, Arnaud; Galli, Elena; Bernard, Anne; Donal, Erwan; Hernández, Alfredo I.

    2017-11-01

    Modeling cardiac mechanics is a particularly challenging task, mainly because of the poor understanding of the underlying physiology, the lack of observability and the complexity of the mechanical properties of myocardial tissues. The choice of cardiac mechanics solvers, in particular, presents several difficulties, notably due to the potential instability arising from the nonlinearities inherent to the large-deformation framework. Furthermore, the verification of the obtained simulations is a difficult task because there are no analytic solutions for these kinds of problems. Hence, the objective of this work is to provide a quantitative verification of a cardiac mechanics implementation based on two published benchmark problems. The first problem consists of deforming a bar, whereas the second problem concerns the inflation of a truncated ellipsoid-shaped ventricle, both in the steady-state case. Simulations were obtained by using the finite element software GETFEM++. Results were compared to the consensus solution published by 11 groups, and the proposed solutions were indistinguishable. The validation of the proposed mechanical model implementation is an important step toward the proposition of a global model of cardiac electro-mechanical activity.

  5. Advanced Software V&V for Civil Aviation and Autonomy

    NASA Technical Reports Server (NTRS)

    Brat, Guillaume P.

    2017-01-01

    With the advances in high-performance computing platforms (e.g., advanced graphics processing units or multi-core processors), computationally-intensive software techniques such as the ones used in artificial intelligence or formal methods have provided us with an opportunity to further increase safety in the aviation industry. Some of these techniques have facilitated building safety at design time, like in aircraft engines or software verification and validation, and others can introduce safety benefits during operations as long as we adapt our processes. In this talk, I will present how NASA is taking advantage of these new software techniques to build in safety at design time through advanced software verification and validation, which can be applied earlier and earlier in the design life cycle and thus help also reduce the cost of aviation assurance. I will then show how run-time techniques (such as runtime assurance or data analytics) offer us a chance to catch even more complex problems, even in the face of changing and unpredictable environments. These new techniques will be extremely useful as our aviation systems become more complex and more autonomous.

  6. Assessment of a Hybrid Continuous/Discontinuous Galerkin Finite Element Code for Geothermal Reservoir Simulations

    DOE PAGES

    Xia, Yidong; Podgorney, Robert; Huang, Hai

    2016-03-17

    FALCON (“Fracturing And Liquid CONvection”) is a hybrid continuous / discontinuous Galerkin finite element geothermal reservoir simulation code based on the MOOSE (“Multiphysics Object-Oriented Simulation Environment”) framework being developed and used for multiphysics applications. In the present work, a suite of verification and validation (“V&V”) test problems for FALCON was defined to meet the design requirements, and solved in the interest of enhanced geothermal system (“EGS”) design. Furthermore, the intent of this test problem suite is to provide baseline comparison data that demonstrates the performance of the FALCON solution methods. The simulation problems vary in complexity from single mechanical or thermal processes to coupled thermo-hydro-mechanical processes in geological porous media. Numerical results obtained by FALCON agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these capabilities in FALCON. Some form of solution verification has been attempted to identify sensitivities in the solution methods, where possible, and suggest best practices when using the FALCON code.
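
    Solution verification of the kind mentioned above commonly relies on a grid convergence study. The sketch below, which is generic rather than FALCON-specific, computes the observed order of accuracy and Roache's grid convergence index (GCI) from a scalar output obtained on three systematically refined grids; the sample values are hypothetical.

```python
# Sketch of a standard solution-verification ingredient: observed order of
# accuracy and Roache's grid convergence index (GCI) from three systematically
# refined grids with a constant refinement ratio r.
import math

def observed_order_and_gci(f_coarse, f_medium, f_fine, r, safety_factor=1.25):
    """f_* are the same scalar output computed on coarse/medium/fine grids."""
    p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)
    rel_err_fine = abs((f_medium - f_fine) / f_fine)
    gci_fine = safety_factor * rel_err_fine / (r**p - 1.0)
    return p, gci_fine

# Hypothetical values of, say, a peak temperature from three grids (r = 2)
p, gci = observed_order_and_gci(f_coarse=412.0, f_medium=405.5, f_fine=403.9, r=2.0)
print(f"observed order ≈ {p:.2f}, fine-grid GCI ≈ {100*gci:.2f}%")
```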

  7. Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows

    DOE PAGES

    Xia, Yidong; Wang, Chuanjin; Luo, Hong; ...

    2015-12-15

    Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent for this test problem suite is to provide baseline comparison data that demonstrates the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models were used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these turbulence models in Hydra-TH. Where possible, we have attempted some form of solution verification to identify sensitivities in the solution methods, and to suggest best practices when using the Hydra-TH code.

  8. Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong; Wang, Chuanjin; Luo, Hong

    Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent for this test problem suite is to provide baseline comparison data that demonstrates the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models were used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these turbulence models in Hydra-TH. Where possible, we have attempted some form of solution verification to identify sensitivities in the solution methods, and to suggest best practices when using the Hydra-TH code.

  9. Hyper-X Engine Design and Ground Test Program

    NASA Technical Reports Server (NTRS)

    Voland, R. T.; Rock, K. E.; Huebner, L. D.; Witte, D. W.; Fischer, K. E.; McClinton, C. R.

    1998-01-01

    The Hyper-X Program, NASA's focused hypersonic technology program jointly run by NASA Langley and Dryden, is designed to move hypersonic, air-breathing vehicle technology from the laboratory environment to the flight environment, the last stage preceding prototype development. The Hyper-X research vehicle will provide the first-ever opportunity to obtain data on an airframe-integrated supersonic combustion ramjet propulsion system in flight, providing the first flight validation of wind tunnel, numerical and analytical methods used for design of these vehicles. A substantial portion of the integrated vehicle/engine flowpath development, engine systems verification and validation, and flight test risk reduction efforts are experimentally based, including vehicle aeropropulsive force and moment database generation for flight control law development, and integrated vehicle/engine performance validation. The Mach 7 engine flowpath development tests have been completed, and effort is now shifting to engine controls, systems and performance verification and validation tests, as well as additional flight test risk reduction tests. The engine wind tunnel tests required for these efforts range from tests of partial-width engines in both small and large scramjet test facilities, to tests of the full flight engine on a vehicle simulator and tests of a complete flight vehicle in the Langley 8-Ft. High Temperature Tunnel. These tests will begin in the summer of 1998 and continue through 1999. The first flight test is planned for early 2000.

  10. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state-explosion problem and to make them usable enough for small-satellite software engineers to apply regularly to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or that are produced after launch through the effects of ionizing radiation.
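
    The run-time monitoring idea discussed above can be sketched very simply: an invariant is evaluated on each telemetry sample and a fault-tolerant fallback path is invoked on violation. The sketch below is generic and does not correspond to any specific small-satellite framework; the telemetry fields and limits are hypothetical.

```python
# A minimal run-time monitoring sketch: check an invariant on each telemetry
# sample and hand control to a fault-tolerant fallback when it is violated.
from dataclasses import dataclass

@dataclass
class Telemetry:
    wheel_speed_rpm: float   # hypothetical reaction-wheel speed signal
    bus_voltage: float       # hypothetical power-bus voltage

def invariant_ok(t: Telemetry) -> bool:
    """Safety envelope the verified design assumes; values outside it imply a fault."""
    return abs(t.wheel_speed_rpm) < 6000.0 and 22.0 < t.bus_voltage < 36.0

def safe_mode(t: Telemetry) -> None:
    print(f"invariant violated ({t}); switching to safe mode / redundant path")

stream = [Telemetry(1200.0, 28.1), Telemetry(7400.0, 27.9), Telemetry(900.0, 28.4)]
for sample in stream:
    if not invariant_ok(sample):
        safe_mode(sample)       # run-time assurance takes over
    # otherwise the nominal control path simply continues
```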

  11. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...

  12. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...

  13. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...

  14. 49 CFR Appendix F to Part 236 - Minimum Requirements of FRA Directed Independent Third-Party Assessment of PTC System Safety...

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Third-Party Assessment of PTC System Safety Verification and Validation F Appendix F to Part 236... Safety Verification and Validation (a) This appendix provides minimum requirements for mandatory independent third-party assessment of PTC system safety verification and validation pursuant to subpart H or I...

  15. Targeted proteomic assays for quantitation of proteins identified by proteogenomic analysis of ovarian cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Song, Ehwang; Gao, Yuqian; Wu, Chaochao

    Here, mass spectrometry (MS) based targeted proteomic methods such as selected reaction monitoring (SRM) are becoming the method of choice for preclinical verification of candidate protein biomarkers. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute has investigated the standardization and analytical validation of the SRM assays and demonstrated robust analytical performance on different instruments across different laboratories. An Assay Portal has also been established by CPTAC to provide the research community a resource consisting of a large set of targeted MS-based assays, and a depository to share assays publicly, provided that assays meet the guidelines proposed by CPTAC. Herein, we report 98 SRM assays covering 70 candidate protein biomarkers previously reported as associated with ovarian cancer that have been thoroughly characterized according to the CPTAC Assay Characterization Guidance Document. The experiments, methods and results for characterizing these SRM assays for their MS response, repeatability, selectivity, stability, and reproducible detection of endogenous analytes are described in detail.

  16. Delamination Assessment Tool for Spacecraft Composite Structures

    NASA Astrophysics Data System (ADS)

    Portela, Pedro; Preller, Fabian; Wittke, Henrik; Sinnema, Gerben; Camanho, Pedro; Turon, Albert

    2012-07-01

    Fortunately, only a few cases are known where failure of spacecraft structures due to undetected damage has resulted in the loss of a spacecraft and launcher mission. However, several problems related to damage tolerance, and in particular delamination of composite materials, have been encountered during structure development of various ESA projects and qualification testing. To avoid such costly failures during the development, launch or service of spacecraft, launcher and reusable launch vehicle structures, a comprehensive damage tolerance verification approach is needed. In 2009, the European Space Agency (ESA) initiated an activity called “Delamination Assessment Tool”, which is led by the Portuguese company HPS Lda and includes academic and industrial partners. The goal of this study is the development of a comprehensive damage tolerance verification approach for launcher and reusable launch vehicle (RLV) structures, addressing analytical and numerical methodologies, material-, subcomponent- and component testing, as well as non-destructive inspection. The study includes a comprehensive review of current industrial damage tolerance practice resulting from ECSS and NASA standards, the development of new Best Practice Guidelines for analysis, test and inspection methods, and the validation of these with a real industrial case study. The paper describes the main findings of this activity so far and presents a first iteration of a Damage Tolerance Verification Approach, which includes the introduction of novel analytical and numerical tools at an industrial level. This new approach is being put to the test using real industrial case studies provided by the industrial partners, MT Aerospace, RUAG Space and INVENT GmbH.

  17. Structural damage detection based on stochastic subspace identification and statistical pattern recognition: II. Experimental validation under varying temperature

    NASA Astrophysics Data System (ADS)

    Lin, Y. Q.; Ren, W. X.; Fang, S. E.

    2011-11-01

    Although most vibration-based damage detection methods can achieve satisfactory verification on analytical or numerical structures, most of them may encounter problems when applied to real-world structures under varying environments. Damage detection methods that directly extract damage features from periodically sampled dynamic time-history response measurements are desirable, but relevant research and field application verification are still lacking. In this second part of a two-part paper, the robustness and performance of the statistics-based damage index using the forward innovation model by stochastic subspace identification of a vibrating structure, proposed in the first part, have been investigated against two prestressed reinforced concrete (RC) beams tested in the laboratory and a full-scale RC arch bridge tested in the field under varying environments. Experimental verification is focused on temperature effects. It is demonstrated that the proposed statistics-based damage index is insensitive to temperature variations but sensitive to structural deterioration or state alteration. This makes it possible to detect structural damage for real-scale structures experiencing ambient excitations and varying environmental conditions.

  18. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  19. Specification, Validation and Verification of Mobile Application Behavior

    DTIC Science & Technology

    2013-03-01

    VALIDATION AND VERIFICATION OF MOBILE APPLICATION BEHAVIOR by Christopher B. Bonine March 2013 Thesis Advisor: Man-Tak Shing Thesis Co...NUMBERS 6. AUTHOR(S) Christopher B. Bonine 7. PERFORMING ORGANIZATION NAME(S) AND ADDRESS(ES) Naval Postgraduate School Monterey, CA 93943–5000 8...VALIDATION AND VERIFICATION OF MOBILE APPLICATION BEHAVIOR Christopher B. Bonine Lieutenant, United States Navy B.S. Southern Polytechnic State

  20. Proceedings of the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering - M and C 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    2013-07-01

    The Mathematics and Computation Division of the American Nuclear Society (ANS) and the Idaho Section of the ANS hosted the 2013 International Conference on Mathematics and Computational Methods Applied to Nuclear Science and Engineering (M and C 2013). This proceedings contains over 250 full papers with topics ranging from reactor physics; radiation transport; materials science; nuclear fuels; core performance and optimization; reactor systems and safety; fluid dynamics; medical applications; analytical and numerical methods; algorithms for advanced architectures; and validation, verification, and uncertainty quantification.

  1. Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin

    2016-05-13

    This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.

  2. Clinical validation of the 50 gene AmpliSeq Cancer Panel V2 for use on a next generation sequencing platform using formalin fixed, paraffin embedded and fine needle aspiration tumour specimens.

    PubMed

    Rathi, Vivek; Wright, Gavin; Constantin, Diana; Chang, Siok; Pham, Huong; Jones, Kerryn; Palios, Atha; Mclachlan, Sue-Anne; Conron, Matthew; McKelvie, Penny; Williams, Richard

    2017-01-01

    The advent of massively parallel sequencing has caused a paradigm shift in the ways cancer is treated, as personalised therapy becomes a reality. More and more laboratories are looking to introduce next generation sequencing (NGS) as a tool for mutational analysis, as this technology has many advantages compared to conventional platforms like Sanger sequencing. In Australia, all massively parallel sequencing platforms are still considered in-house in vitro diagnostic tools by the National Association of Testing Authorities (NATA), and a comprehensive analytical validation of all assays, not just mere verification, is a strict requirement before accreditation can be granted for clinical testing on these platforms. Analytical validation of assays on NGS platforms can prove to be extremely challenging for pathology laboratories. Although there are many affordable and easily accessible NGS instruments available, there are as yet no standardised guidelines for clinical validation of NGS assays. We present an accreditation development procedure that was both comprehensive and applicable in a hospital laboratory setting for NGS services. This approach may also be applied to other NGS applications in service laboratories. Copyright © 2016 Royal College of Pathologists of Australasia. Published by Elsevier B.V. All rights reserved.

  3. A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bravenec, Ronald

    My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to only continue the verification efforts.

  4. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...

  5. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB

    2017-11-01

    Increased demand for Internet of Things (IoT) based applications has inadvertently forced a move towards higher-complexity integrated circuits supporting SoC. Such a rapid increase in complexity demands correspondingly sophisticated validation strategies, and it has driven researchers to devise a variety of methodologies to overcome this problem, in essence bringing about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs in the infancy of the verification process of an SoC in order to reduce time consumption and achieve fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. On top of that, the Open Verification Methodology (OVM) offers an easier route to RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, to avoid the bottleneck in the validation platform.

  6. Formal verification and testing: An integrated approach to validating Ada programs

    NASA Technical Reports Server (NTRS)

    Cohen, Norman H.

    1986-01-01

    An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.

  7. Verification and Validation of Monte Carlo n-Particle Code 6 (MCNP6) with Neutron Protection Factor Measurements of an Iron Box

    DTIC Science & Technology

    2014-03-27

    VERIFICATION AND VALIDATION OF MONTE CARLO N- PARTICLE CODE 6 (MCNP6) WITH NEUTRON PROTECTION FACTOR... PARTICLE CODE 6 (MCNP6) WITH NEUTRON PROTECTION FACTOR MEASUREMENTS OF AN IRON BOX THESIS Presented to the Faculty Department of Engineering...STATEMENT A. APPROVED FOR PUBLIC RELEASE; DISTRIBUTION UNLIMITED iv AFIT-ENP-14-M-05 VERIFICATION AND VALIDATION OF MONTE CARLO N- PARTICLE CODE 6

  8. Verification and validation of RADMODL Version 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kimball, K.D.

    1993-03-01

    RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.

  9. Neoclassical toroidal viscosity calculations in tokamaks using a δf Monte Carlo simulation and their verifications.

    PubMed

    Satake, S; Park, J-K; Sugama, H; Kanno, R

    2011-07-29

    Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified with a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV since the complexities in guiding-center orbits of particles and their collisions cannot be fully investigated by any means of analytic theories alone. Results yielded the details of the complex NTV dependency on particle precessions and collisions, which were predicted roughly in a combined analytic theory. Both numerical and analytic methods can be utilized and extended based on these successful verifications.

  10. The validation & verification of an LC/MS method for the determination of total docosahexaenoic acid concentrations in canine blood serum.

    PubMed

    Dillon, Gerald Patrick; Keegan, Jason D; Wallace, Geoff; Yiannikouris, Alexandros; Moran, Colm Anthony

    2018-06-01

    Docosahexaenoic acid (DHA) is an omega-3 fatty acid (n-3 FA) that has been shown to play a role in canine growth and physiological integrity and in improvements in skin and coat condition. However, potential adverse effects of n-3 FAs, specifically impaired cellular immunity, have been observed in dogs fed diets with elevated levels of n-3 FAs. As such, a safe upper limit (SUL) for total n-3 FAs (DHA and EPA) in dogs has been established. Considering this SUL, sensitive methods for detecting DHA in blood serum as a biomarker are required when conducting n-3 FA supplementation trials involving dogs. In this study, an LC-ESI-MS/MS method of DHA detection in dog serum was validated and verified. Recovery of DHA was optimized, and parallelism tests were conducted with spiked samples demonstrating that the serum matrix did not interfere with quantitation. The stability of DHA in serum was also investigated, with -80 °C considered suitable when storing samples for up to six months. The method was linear over a calibration range of 1-500 μg/mL, and precision and accuracy were found to meet the requirements for validation. This method was verified in an alternative laboratory using a different analytical system and operator, with the results meeting the criteria for verification. Copyright © 2018. Published by Elsevier Inc.
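
    Two of the validation figures of merit mentioned above, calibration linearity and back-calculated accuracy, can be evaluated with a few lines of code. The sketch below uses hypothetical calibration data spanning the stated 1-500 μg/mL range; it is not the study's data or its exact acceptance criteria.

```python
# Sketch of calibration linearity (least-squares fit and R^2) and
# back-calculated accuracy over a 1-500 ug/mL range, with hypothetical data.
import numpy as np

nominal = np.array([1, 5, 25, 100, 250, 500], dtype=float)          # ug/mL
response = np.array([0.021, 0.103, 0.510, 2.05, 5.07, 10.2])        # peak-area ratio (illustrative)

slope, intercept = np.polyfit(nominal, response, 1)
predicted = slope * nominal + intercept
r_squared = 1.0 - np.sum((response - predicted)**2) / np.sum((response - response.mean())**2)

back_calc = (response - intercept) / slope                           # concentrations from the fit
accuracy_pct = 100.0 * back_calc / nominal
print(f"slope={slope:.5f}, intercept={intercept:.4f}, R^2={r_squared:.5f}")
for n, a in zip(nominal, accuracy_pct):
    print(f"  {n:6.1f} ug/mL  back-calculated accuracy = {a:6.1f}%")
```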

  11. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on Al for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., " Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence., 12, 2000, pp3 3 1-3 4 0 . [30] Gaschnig

  12. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy; Kim, Youngkwang; Conway, Claire; Conway, Darrel J.

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  13. Benchmarking the Collocation Stand-Alone Library and Toolkit (CSALT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven; Knittel, Jeremy; Shoan, Wendy (Compiler); Kim, Youngkwang; Conway, Claire (Compiler); Conway, Darrel

    2017-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the Collocation Stand Alone Library and Toolkit (CSALT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort employs classical problems with known analytic solutions, solutions from other available software tools, and comparisons to benchmarking data available in the public literature. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results for a broad range of problems, and detailed comparisons for selected problems.

  14. A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems

    DTIC Science & Technology

    2016-03-01

    insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation verification and validation of...licensure as a test, evaluation, verification , and validation (TEVV) framework that can address these challenges. IDA found that traditional...language requirements to testable (preferably machine testable) specifications • Design of architectures that treat development and verification of

  15. Development of a point-kinetic verification scheme for nuclear reactor applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Demazière, C., E-mail: demaz@chalmers.se; Dykin, V.; Jareteg, K.

    In this paper, a new method that can be used for checking the proper implementation of time- or frequency-dependent neutron transport models and for verifying their ability to recover some basic reactor physics properties is proposed. This method makes use of the application of a stationary perturbation to the system at a given frequency and extraction of the point-kinetic component of the system response. Even for strongly heterogeneous systems for which an analytical solution does not exist, the point-kinetic component follows, as a function of frequency, a simple analytical form. The comparison between the extracted point-kinetic component and its expected analytical form provides an opportunity to verify and validate neutron transport solvers. The proposed method is tested on two diffusion-based codes, one working in the time domain and the other working in the frequency domain. As long as the applied perturbation has a non-zero reactivity effect, it is demonstrated that the method can be successfully applied to verify and validate time- or frequency-dependent neutron transport solvers. Although the method is demonstrated in the present paper in a diffusion theory framework, higher order neutron transport methods could be verified based on the same principles.
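
    The simple analytical form referred to above is, in point kinetics, the zero-power reactor transfer function. The sketch below evaluates that standard expression with a typical six-group delayed-neutron data set; the kinetic parameters are generic thermal-reactor values used only as an illustration, not the data of the paper.

```python
# Sketch of the analytical reference for the point-kinetic component: the
# zero-power reactor transfer function
#   G0(w) = 1 / ( i*w * ( Lambda + sum_k beta_k / (i*w + lambda_k) ) ),
# evaluated with a typical six-group delayed-neutron data set (illustrative).
import numpy as np

beta = np.array([0.000215, 0.001424, 0.001274, 0.002568, 0.000748, 0.000273])  # group fractions
lam  = np.array([0.0124,   0.0305,   0.111,    0.301,    1.14,     3.01])      # decay constants, 1/s
Lambda = 2.0e-5                                                                 # prompt generation time, s

def zero_power_transfer(freq_hz):
    w = 2.0 * np.pi * freq_hz
    return 1.0 / (1j * w * (Lambda + np.sum(beta / (1j * w + lam))))

for f in (0.01, 0.1, 1.0, 10.0, 100.0):
    g = zero_power_transfer(f)
    print(f"f = {f:7.2f} Hz   |G0| = {abs(g):10.3e}   phase = {np.degrees(np.angle(g)):7.2f} deg")
```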

  16. Validation (not just verification) of Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Duren, Riley M.

    2006-01-01

    Verification & Validation (V&V) is a widely recognized and critical systems engineering function. However, the often used definition 'Verification proves the design is right; validation proves it is the right design' is rather vague. And while Verification is a reasonably well standardized systems engineering process, Validation is a far more abstract concept and the rigor and scope applied to it varies widely between organizations and individuals. This is reflected in the findings in recent Mishap Reports for several NASA missions, in which shortfalls in Validation (not just Verification) were cited as root or contributing factors in catastrophic mission loss. Furthermore, although there is strong agreement in the community that Test is the preferred method for V&V, many people equate 'V&V' with 'Test', such that Analysis and Modeling aren't given comparable attention. Another strong motivator is a realization that the rapid growth in complexity of deep-space missions (particularly Planetary Landers and Space Observatories given their inherent unknowns) is placing greater demands on systems engineers to 'get it right' with Validation.

  17. Targeted proteomic assays for quantitation of proteins identified by proteogenomic analysis of ovarian cancer

    DOE PAGES

    Song, Ehwang; Gao, Yuqian; Wu, Chaochao; ...

    2017-07-19

    Here, mass spectrometry (MS) based targeted proteomic methods such as selected reaction monitoring (SRM) are becoming the method of choice for preclinical verification of candidate protein biomarkers. The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute has investigated the standardization and analytical validation of the SRM assays and demonstrated robust analytical performance on different instruments across different laboratories. An Assay Portal has also been established by CPTAC to provide the research community a resource consisting of a large set of targeted MS-based assays, and a depository to share assays publicly, provided that assays meet the guidelines proposed by CPTAC. Herein, we report 98 SRM assays covering 70 candidate protein biomarkers previously reported as associated with ovarian cancer that have been thoroughly characterized according to the CPTAC Assay Characterization Guidance Document. The experiments, methods and results for characterizing these SRM assays for their MS response, repeatability, selectivity, stability, and reproducible detection of endogenous analytes are described in detail.

  18. RELAP5-3D Resolution of Known Restart/Backup Issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesina, George L.; Anderson, Nolan A.

    2014-12-01

    The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a “verification file” that records double precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
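
    A minimal Python sketch of the "verification file" idea described above: record double-precision sums of key variables for each test case, then flag any change between consecutive code versions. The file format, helper names, and zero default tolerance are illustrative assumptions, not the RELAP5-3D implementation:

      import json

      def write_verification_file(path, case_name, variables):
          """Record double-precision sums of key solution arrays for one test case."""
          sums = {name: float(sum(values)) for name, values in variables.items()}
          with open(path, "w") as f:
              json.dump({"case": case_name, "sums": sums}, f, indent=2)

      def compare_verification_files(old_path, new_path, rtol=0.0):
          """Sequential verification: compare sums between consecutive code versions.
          rtol=0.0 demands identical sums; any looser tolerance needs justification."""
          with open(old_path) as f_old, open(new_path) as f_new:
              old, new = json.load(f_old), json.load(f_new)
          failures = []
          for name, old_sum in old["sums"].items():
              new_sum = new["sums"].get(name)
              if new_sum is None or abs(new_sum - old_sum) > rtol * abs(old_sum):
                  failures.append((name, old_sum, new_sum))
          return failures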

  19. Verification of Gyrokinetic codes: Theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia; Bottino, Alberto; Görler, Tobias; Sonnendrücker, Eric; Told, Daniel; Villard, Laurent

    2017-05-01

    In fusion plasmas, the strong magnetic field allows the fast gyro-motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the subsequent transport. Naturally, these codes require thorough verification and validation. Here, we present a new and generic theoretical framework and specific numerical applications to test the faithfulness of the implemented models to theory and to verify the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which has rarely been done and therefore makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The verification of the numerical scheme is proposed via the benchmark effort. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC) and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations implemented in the ORB5 and GENE codes using the Lagrangian variational formulation. At the computational level, detailed verifications of global electromagnetic test cases developed from the CYCLONE Base Case are considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  20. Analytical torque calculation and experimental verification of synchronous permanent magnet couplings with Halbach arrays

    NASA Astrophysics Data System (ADS)

    Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young

    2018-05-01

    This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. Firstly, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and a Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.
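
    The torque step described above can be illustrated with the standard 2D Maxwell stress tensor expression, T = (L r^2 / mu0) times the integral of B_r * B_theta over one revolution at a radius r in the air gap. A short Python sketch with placeholder values (the pole count, field amplitudes, and dimensions are assumptions, not the paper's design data):

      import numpy as np

      MU0 = 4e-7 * np.pi  # vacuum permeability, H/m

      def airgap_torque(b_r, b_theta, r_gap, stack_length):
          """Torque from the 2D Maxwell stress tensor, integrating B_r*B_theta
          over one full revolution at radius r_gap."""
          dtheta = 2 * np.pi / b_r.size
          return stack_length * r_gap**2 / MU0 * np.sum(b_r * b_theta) * dtheta

      theta = np.linspace(0, 2 * np.pi, 360, endpoint=False)
      p = 4                                   # pole pairs (assumed)
      b_r = 0.8 * np.cos(p * theta)           # radial flux density in the gap, T
      b_t = 0.1 * np.cos(p * theta)           # tangential component in phase with b_r, T
      print("torque = %.2f N*m" % airgap_torque(b_r, b_t, r_gap=0.05, stack_length=0.04))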

  1. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 49 Transportation 4 2014-10-01 2014-10-01 false Independent third party Verification and Validation. 236.1017 Section 236.1017 Transportation Other Regulations Relating to Transportation (Continued... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the Associate...

  2. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 49 Transportation 4 2012-10-01 2012-10-01 false Independent third party Verification and Validation. 236.1017 Section 236.1017 Transportation Other Regulations Relating to Transportation (Continued... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the Associate...

  3. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 49 Transportation 4 2013-10-01 2013-10-01 false Independent third party Verification and Validation. 236.1017 Section 236.1017 Transportation Other Regulations Relating to Transportation (Continued... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the Associate...

  4. 49 CFR 236.1017 - Independent third party Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 49 Transportation 4 2011-10-01 2011-10-01 false Independent third party Verification and Validation. 236.1017 Section 236.1017 Transportation Other Regulations Relating to Transportation (Continued... Validation. (a) The PTCSP must be supported by an independent third-party assessment when the Associate...

  5. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

    The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large in size so they will address the same basic problems associated with the design, fabrication, assembly, and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterizations and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications that include mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  6. Sierra/Aria 4.48 Verification Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sierra Thermal Fluid Development Team

    Presented in this document is a portion of the tests that exist in the Sierra Thermal/Fluids verification test suite. Each of these tests is run nightly with the Sierra/TF code suite, and the results of the test are checked under mesh refinement against the correct analytic result. For each of the tests presented in this document the test setup, derivation of the analytic solution, and comparison of the code results to the analytic solution is provided. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems.

  7. Computer Modeling and Simulation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pronskikh, V. S.

    2014-05-09

    Verification and validation of computer codes and models used in simulation are two aspects of the scientific practice of high importance and have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model’s relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model’s general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them) needs to be made. Holding on to that distinction, I propose to relate verification (based on theoretical strategies such as inferences) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviate the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.

  8. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose

    2018-03-01

    Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.

  9. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel Jose

    Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.

  10. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  11. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... validation. The RSPP must require the identification of verification and validation methods for the... to be used in the verification and validation process, consistent with appendix C to this part. The... information. (3) If no action is taken on the petition within 180 days, the petition remains pending for...

  12. Cognitive Bias in the Verification and Validation of Space Flight Systems

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Cognitive bias is generally recognized as playing a significant role in virtually all domains of human decision making. Insight into this role is informally built into many of the system engineering practices employed in the aerospace industry. The review process, for example, typically has features that help to counteract the effect of bias. This paper presents a discussion of how commonly recognized biases may affect the verification and validation process. Verifying and validating a system is arguably more challenging than development, both technically and cognitively. Whereas there may be a relatively limited number of options available for the design of a particular aspect of a system, there is a virtually unlimited number of potential verification scenarios that may be explored. The probability of any particular scenario occurring in operations is typically very difficult to estimate, which increases reliance on judgment that may be affected by bias. Implementing a verification activity often presents technical challenges that, if they can be overcome at all, often result in a departure from actual flight conditions (e.g., 1-g testing, simulation, time compression, artificial fault injection) that may raise additional questions about the meaningfulness of the results, and create opportunities for the introduction of additional biases. In addition to mitigating the biases it can introduce directly, the verification and validation process must also overcome the cumulative effect of biases introduced during all previous stages of development. A variety of cognitive biases will be described, with research results for illustration. A handful of case studies will be presented that show how cognitive bias may have affected the verification and validation process on recent JPL flight projects, identify areas of strength and weakness, and identify potential changes or additions to commonly used techniques that could provide a more robust verification and validation of future systems.

  13. Verification of Gyrokinetic codes: theoretical background and applications

    NASA Astrophysics Data System (ADS)

    Tronko, Natalia

    2016-10-01

    In fusion plasmas the strong magnetic field allows the fast gyro motion to be systematically removed from the description of the dynamics, resulting in a considerable model simplification and gain of computational time. Nowadays, the gyrokinetic (GK) codes play a major role in the understanding of the development and the saturation of turbulence and in the prediction of the consequent transport. We present a new and generic theoretical framework and specific numerical applications to test the validity and the domain of applicability of existing GK codes. For a sound verification process, the underlying theoretical GK model and the numerical scheme must be considered at the same time, which makes this approach pioneering. At the analytical level, the main novelty consists in using advanced mathematical tools such as variational formulation of dynamics for systematization of basic GK code's equations to access the limits of their applicability. The indirect verification of the numerical scheme is proposed via the benchmark process. In this work, specific examples of code verification are presented for two GK codes: the multi-species electromagnetic ORB5 (PIC), and the radially global version of GENE (Eulerian). The proposed methodology can be applied to any existing GK code. We establish a hierarchy of reduced GK Vlasov-Maxwell equations using the generic variational formulation. Then, we derive and include the models implemented in ORB5 and GENE inside this hierarchy. At the computational level, detailed verification of global electromagnetic test cases based on the CYCLONE Base Case is considered, including a parametric β-scan covering the transition from ITG to KBM and the spectral properties at the nominal β value.

  14. Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities

    NASA Technical Reports Server (NTRS)

    Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.

    2009-01-01

    Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA s Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.

  15. Cleaning verification: A five parameter study of a Total Organic Carbon method development and validation for the cleaning assessment of residual detergents in manufacturing equipment.

    PubMed

    Li, Xue; Ahmad, Imad A Haidar; Tam, James; Wang, Yan; Dao, Gina; Blasko, Andrei

    2018-02-05

    A Total Organic Carbon (TOC) based analytical method to quantitate trace residues of clean-in-place (CIP) detergents CIP100® and CIP200® on the surfaces of pharmaceutical manufacturing equipment was developed and validated. Five factors affecting the development and validation of the method were identified: diluent composition, diluent volume, extraction method, location for TOC sample preparation, and oxidant flow rate. Key experimental parameters were optimized to minimize contamination and to improve the sensitivity, recovery, and reliability of the method. The optimized concentration of the phosphoric acid in the swabbing solution was 0.05 M, and the optimal volume of the sample solution was 30 mL. The swab extraction method was 1 min sonication. The use of a clean room, as compared to an isolated lab environment, was not required for method validation. The method was demonstrated to be linear with a correlation coefficient (R) of 0.9999. The average recoveries from stainless steel surfaces at multiple spike levels were >90%. The repeatability and intermediate precision results were ≤5% across the 2.2-6.6 ppm range (50-150% of the target maximum carryover, MACO, limit). The method was also shown to be sensitive with a detection limit (DL) of 38 ppb and a quantitation limit (QL) of 114 ppb. The method validation demonstrated that the developed method is suitable for its intended use. The methodology developed in this study is generally applicable to the cleaning verification of any organic detergents used for the cleaning of pharmaceutical manufacturing equipment made of electropolished stainless steel material. Copyright © 2017 Elsevier B.V. All rights reserved.
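
    The validation figures quoted above (linearity, recovery, repeatability) follow from standard calculations; a brief Python sketch with made-up data, not the study's measurements:

      import numpy as np

      def linearity(conc, response):
          """Least-squares calibration line and correlation coefficient R."""
          slope, intercept = np.polyfit(conc, response, 1)
          r = np.corrcoef(conc, response)[0, 1]
          return slope, intercept, r

      def percent_recovery(measured, spiked):
          return 100.0 * np.asarray(measured, dtype=float) / np.asarray(spiked, dtype=float)

      def percent_rsd(replicates):
          x = np.asarray(replicates, dtype=float)
          return 100.0 * x.std(ddof=1) / x.mean()

      # Illustrative TOC calibration and replicate data (ppm); not from the study
      conc = np.array([1.1, 2.2, 4.4, 6.6, 8.8])
      resp = np.array([0.9, 2.1, 4.3, 6.5, 8.9])
      print("correlation coefficient R = %.4f" % linearity(conc, resp)[2])
      print("recovery percent:", np.round(percent_recovery([2.0, 4.1], [2.2, 4.4]), 1))
      print("repeatability RSD percent = %.2f" % percent_rsd([4.30, 4.35, 4.28, 4.33]))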

  16. 78 FR 1162 - Cardiovascular Devices; Reclassification of External Cardiac Compressor

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-08

    ... safety and electromagnetic compatibility; For devices containing software, software verification... electromagnetic compatibility; For devices containing software, software verification, validation, and hazard... electrical components, appropriate analysis and testing must validate electrical safety and electromagnetic...

  17. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian F.; Robertson, Amy N.

    2016-07-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  18. Use of Social Media to Target Information-Driven Arms Control and Nonproliferation Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kreyling, Sean J.; Williams, Laura S.; Gastelum, Zoe N.

    There has been considerable discussion within the national security community, including a recent workshop sponsored by the U.S. State Department, about the use of social media for extracting patterns of collective behavior and influencing public perception in areas relevant to arms control and nonproliferation. This paper seeks to explore if, and how, social media can be used to supplement nonproliferation and arms control inspection and monitoring activities on states and sites of greatest proliferation relevance. In this paper, we set the stage for how social media can be applied in this problem space and describe some of the foreseen challenges, including data validation, sources and attributes, verification, and security. Using information analytics and data visualization capabilities available at Pacific Northwest National Laboratory (PNNL), we provide graphical examples of some social media "signatures" of potential relevance for nonproliferation and arms control purposes. We conclude by describing a proposed case study and offering recommendations both for further research and next steps by the policy community.

  19. Fracture mechanics life analytical methods verification testing

    NASA Technical Reports Server (NTRS)

    Favenesi, J. A.; Clemons, T. G.; Riddell, W. T.; Ingraffea, A. R.; Wawrzynek, P. A.

    1994-01-01

    The objective was to evaluate NASCRAC (trademark) version 2.0, a second generation fracture analysis code, for verification and validity. NASCRAC was evaluated using a combination of comparisons to the literature, closed-form solutions, numerical analyses, and tests. Several limitations and minor errors were detected. Additionally, a number of major flaws were discovered. These major flaws were generally due to application of a specific method or theory, not due to programming logic. Results are presented for the following program capabilities: K versus a, J versus a, crack opening area, life calculation due to fatigue crack growth, tolerable crack size, proof test logic, tearing instability, creep crack growth, crack transitioning, crack retardation due to overloads, and elastic-plastic stress redistribution. It is concluded that the code is an acceptable fracture tool for K solutions of simplified geometries, for a limited number of J and crack opening area solutions, and for fatigue crack propagation with the Paris equation and constant amplitude loads when the Paris equation is applicable.

  20. Verification, Validation and Sensitivity Studies in Computational Biomechanics

    PubMed Central

    Anderson, Andrew E.; Ellis, Benjamin J.; Weiss, Jeffrey A.

    2012-01-01

    Computational techniques and software for the analysis of problems in mechanics have naturally moved from their origins in the traditional engineering disciplines to the study of cell, tissue and organ biomechanics. Increasingly complex models have been developed to describe and predict the mechanical behavior of such biological systems. While the availability of advanced computational tools has led to exciting research advances in the field, the utility of these models is often the subject of criticism due to inadequate model verification and validation. The objective of this review is to present the concepts of verification, validation and sensitivity studies with regard to the construction, analysis and interpretation of models in computational biomechanics. Specific examples from the field are discussed. It is hoped that this review will serve as a guide to the use of verification and validation principles in the field of computational biomechanics, thereby improving the peer acceptance of studies that use computational modeling techniques. PMID:17558646

  1. CASL Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mousseau, Vincent Andrew; Dinh, Nam

    2016-06-30

    This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This will be a living document that tracks CASL's progress on verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.

  2. CFD and ventilation research.

    PubMed

    Li, Y; Nielsen, P V

    2011-12-01

    There has been a rapid growth of scientific literature on the application of computational fluid dynamics (CFD) in the research of ventilation and indoor air science. With a 1000-10,000 times increase in computer hardware capability in the past 20 years, CFD has become an integral part of scientific research and engineering development of complex air distribution and ventilation systems in buildings. This review discusses the major and specific challenges of CFD in terms of turbulence modelling, numerical approximation, and boundary conditions relevant to building ventilation. We emphasize the growing need for CFD verification and validation, suggest ongoing needs for analytical and experimental methods to support the numerical solutions, and discuss the growing capacity of CFD in opening up new research areas. We suggest that CFD has not become a replacement for experiment and theoretical analysis in ventilation research, rather it has become an increasingly important partner. We believe that an effective scientific approach for ventilation studies is still to combine experiments, theory, and CFD. We argue that CFD verification and validation are becoming more crucial than ever as more complex ventilation problems are solved. It is anticipated that ventilation problems at the city scale will be tackled by CFD in the next 10 years. © 2011 John Wiley & Sons A/S.

  3. Assessment of a hybrid finite element and finite volume code for turbulent incompressible flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xia, Yidong, E-mail: yidong.xia@inl.gov; Wang, Chuanjin; Luo, Hong

    Hydra-TH is a hybrid finite-element/finite-volume incompressible/low-Mach flow simulation code based on the Hydra multiphysics toolkit being developed and used for thermal-hydraulics applications. In the present work, a suite of verification and validation (V&V) test problems for Hydra-TH was defined to meet the design requirements of the Consortium for Advanced Simulation of Light Water Reactors (CASL). The intent for this test problem suite is to provide baseline comparison data that demonstrates the performance of the Hydra-TH solution methods. The simulation problems vary in complexity from laminar to turbulent flows. A set of RANS and LES turbulence models were used in the simulation of four classical test problems. Numerical results obtained by Hydra-TH agreed well with either the available analytical solution or experimental data, indicating the verified and validated implementation of these turbulence models in Hydra-TH. Where possible, some form of solution verification has been attempted to identify sensitivities in the solution methods, and suggest best practices when using the Hydra-TH code. -- Highlights: •We performed a comprehensive study to verify and validate the turbulence models in Hydra-TH. •Hydra-TH delivers 2nd-order grid convergence for the incompressible Navier–Stokes equations. •Hydra-TH can accurately simulate the laminar boundary layers. •Hydra-TH can accurately simulate the turbulent boundary layers with RANS turbulence models. •Hydra-TH delivers high-fidelity LES capability for simulating turbulent flows in confined space.

  4. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate Partial Differential Equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
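
    The Richardson-extrapolation portion of the workflow above reduces to a few lines: estimate the observed order of accuracy from three systematically refined grids, then form the Grid Convergence Index. A minimal Python sketch (the sample values are illustrative, not VAVUQ output):

      import numpy as np

      def observed_order(f_fine, f_med, f_coarse, r):
          """Observed order of accuracy from solutions on three grids with
          constant refinement ratio r (f_fine is the finest-grid value)."""
          return np.log((f_coarse - f_med) / (f_med - f_fine)) / np.log(r)

      def gci_fine(f_fine, f_med, r, p, fs=1.25):
          """Grid Convergence Index on the fine grid (safety factor fs = 1.25)."""
          rel_err = abs((f_med - f_fine) / f_fine)
          return fs * rel_err / (r**p - 1.0)

      f1, f2, f3, r = 0.9713, 0.9704, 0.9670, 2.0   # fine, medium, coarse values; ratio
      p = observed_order(f1, f2, f3, r)
      print("observed order p = %.2f" % p)
      print("GCI on the fine grid = %.3e" % gci_fine(f1, f2, r, p))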

  5. Topologically Nontrivial Magnon Bands in Artificial Square Spin Ices with Dzyaloshinskii-Moriya Interaction [Topologically Non-Trivial Magnon Bands in Artificial Square Spin Ices Subject to Dzyaloshinskii-Moriya Interaction]

    DOE PAGES

    Iacocca, Ezio; Heinonen, Olle

    2017-09-20

    Systems that exhibit topologically protected edge states are interesting both from a fundamental point of view as well as for potential applications, the latter because of the absence of backscattering and robustness to perturbations. It is desirable to be able to control and manipulate such edge states. Here, we demonstrate using a semi-analytical model that artificial square ices can incorporate both features: an interfacial Dzyaloshinskii-Moriya interaction gives rise to topologically non-trivial magnon bands, and the equilibrium state of the spin ice is reconfigurable with different states having different magnon dispersions and topology. Micromagnetic simulations are used to determine the magnetization equilibrium states and to validate the semi-analytical model. Lastly, our results are amenable to experimental verification via, e.g., lithographic patterning and micro-focused Brillouin light scattering.

  6. Gap Analysis for Chinese Drug Control Institutes to Achieve the Standards of World Health Organization Medicine Prequalification.

    PubMed

    Mao, Xin; Yang, Yue

    2017-02-01

    The study aims to explore the challenges and the gaps faced by Chinese Drug Control Institutes in achieving the standards of World Health Organization (WHO) Medicine Prequalification. The study was undertaken with 6 Provincial Drug Control Institutes in China from November 2012 to November 2013. The study assessed key elements required to comply with WHO Good Practices for Pharmaceutical Quality Control Laboratories (GPPQCL). For GPPQCL, the study found gaps in quality management system, control of documentation, data-processing equipment, premises and equipment, contracts, reagents (water), reference substances and reference materials, calibration, verification of performance and qualification of equipment, instruments and other devices, analytical worksheet, evaluation of test results, personnel, and validation of analytical procedures. The study indicates that gaps are attributed to differences between the standards of Chinese Accreditation Standards and WHO-GPPQCL. Copyright © 2017 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  7. Topologically Nontrivial Magnon Bands in Artificial Square Spin Ices with Dzyaloshinskii-Moriya Interaction [Topologically Non-Trivial Magnon Bands in Artificial Square Spin Ices Subject to Dzyaloshinskii-Moriya Interaction]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Iacocca, Ezio; Heinonen, Olle

    Systems that exhibit topologically protected edge states are interesting both from a fundamental point of view as well as for potential applications, the latter because of the absence of backscattering and robustness to perturbations. It is desirable to be able to control and manipulate such edge states. Here, we demonstrate using a semi-analytical model that artificial square ices can incorporate both features: an interfacial Dzyaloshinskii-Moriya interaction gives rise to topologically non-trivial magnon bands, and the equilibrium state of the spin ice is reconfigurable with different states having different magnon dispersions and topology. Micromagnetic simulations are used to determine the magnetization equilibrium states and to validate the semi-analytical model. Lastly, our results are amenable to experimental verification via, e.g., lithographic patterning and micro-focused Brillouin light scattering.

  8. Control research in the NASA high-alpha technology program

    NASA Technical Reports Server (NTRS)

    Gilbert, William P.; Nguyen, Luat T.; Gera, Joseph

    1990-01-01

    NASA is conducting a focused technology program, known as the High-Angle-of-Attack Technology Program, to accelerate the development of flight-validated technology applicable to the design of fighters with superior stall and post-stall characteristics and agility. A carefully integrated effort is underway combining wind tunnel testing, analytical predictions, piloted simulation, and full-scale flight research. A modified F-18 aircraft has been extensively instrumented for use as the NASA High-Angle-of-Attack Research Vehicle used for flight verification of new methods and concepts. This program stresses the importance of providing improved aircraft control capabilities both by powered control (such as thrust-vectoring) and by innovative aerodynamic control concepts. The program is accomplishing extensive coordinated ground and flight testing to assess and improve available experimental and analytical methods and to develop new concepts for enhanced aerodynamics and for effective control, guidance, and cockpit displays essential for effective pilot utilization of the increased agility provided.

  9. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  10. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate ease in the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem. This indicates that a closer look at the output files is needed to determine the cause of the error.

  11. Verification and Validation for Flight-Critical Systems (VVFCS)

    NASA Technical Reports Server (NTRS)

    Graves, Sharon S.; Jacobsen, Robert A.

    2010-01-01

    On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V & V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academic (26%), small & large industry (47%) and government agency (27%).

  12. Towards Verification and Validation for Increased Autonomy

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra

    2017-01-01

    This presentation goes over the work we have performed over the last few years on verification and validation of the next generation onboard collision avoidance system, ACAS X, for commercial aircraft. It describes our work on probabilistic verification and synthesis of the model that ACAS X is based on, and goes on to the validation of that model with respect to actual simulation and flight data. The presentation then moves on to identify the characteristics of ACAS X that are related to autonomy and to discuss the challenges that autonomy poses for V&V. All work presented has already been published.

  13. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  14. Compromises produced by the dialectic between self-verification and self-enhancement.

    PubMed

    Morling, B; Epstein, S

    1997-12-01

    Three studies of people's reactions to evaluative feedback demonstrated that the dialectic between self-enhancement and self-verification results in compromises between these 2 motives, as hypothesized in cognitive-experiential self-theory. The demonstration was facilitated by 2 procedural improvements: Enhancement and verification were established by calibrating evaluative feedback against self appraisals, and degree of enhancement and of verification were varied along a continuum, rather than categorically. There was also support for the hypotheses that processing in an intuitive-experiential mode favors enhancement and processing in an analytical-rational mode favors verification in the kinds of situations investigated.

  15. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  16. 40 CFR 1066.130 - Measurement instrument calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Measurement instrument calibrations... (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications § 1066.130 Measurement instrument calibrations and verifications. The...

  17. Verification of conventional equations of state for tantalum under quasi-isentropic compression

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Binqiang, Luo; Guiji, Wang; Jianjun, Mo

    2014-11-21

    Shock Hugoniot data have been widely used to calibrate analytic equations of state (EOSs) of condensed matter at high pressures. However, the suitability of particular analytic EOSs under off-Hugoniot states has not been sufficiently verified using experimental data. We have conducted quasi-isentropic compression experiments (ICEs) of tantalum using the compact pulsed power generator CQ-4, and explored the relation of longitudinal stress versus volume of tantalum under quasi-isentropic compression using backward integration and characteristic inverse methods. By subtracting the deviatoric stress and additional pressure caused by irreversible plastic dissipation, the isentropic pressure can be extracted from the longitudinal stress. Several theoretical isentropes are deduced from analytic EOSs and compared with ICE results to validate the suitability of these analytic EOSs in isentropic compression states. The comparisons show that the Gruneisen EOS with Gruneisen Gamma proportional to volume is accurate, regardless of whether the Hugoniot or isentrope is used as the reference line. The Vinet EOS yields better accuracy in isentropic compression states. Theoretical isentropes derived from Tillotson, PUFF, and Birch-Murnaghan EOSs agree well with the experimental isentrope in the range of 0–100 GPa, but deviate gradually as pressure increases further.
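
    Of the analytic forms compared above, the Vinet EOS has a particularly compact closed-form compression curve, P = 3*K0*(1 - x)/x^2 * exp(1.5*(K0' - 1)*(1 - x)) with x = (V/V0)^(1/3). A short Python sketch with rough tantalum-like parameters (illustrative values, not the calibration used in the paper):

      import numpy as np

      def vinet_pressure(v_over_v0, k0, k0_prime):
          """Vinet cold-curve pressure as a function of compression V/V0."""
          x = np.asarray(v_over_v0, dtype=float) ** (1.0 / 3.0)
          return 3.0 * k0 * (1.0 - x) / x**2 * np.exp(1.5 * (k0_prime - 1.0) * (1.0 - x))

      K0, K0P = 194.0, 3.5                      # bulk modulus (GPa) and pressure derivative (assumed)
      compression = np.linspace(1.0, 0.75, 6)   # V/V0 from ambient down to 25 percent compression
      for v, p in zip(compression, vinet_pressure(compression, K0, K0P)):
          print("V/V0 = %.2f  ->  P = %6.1f GPa" % (v, p))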

  18. How NASA's Independent Verification and Validation (IV&V) Program Builds Reliability into a Space Mission Software System (SMSS)

    NASA Technical Reports Server (NTRS)

    Fisher, Marcus S.; Northey, Jeffrey; Stanton, William

    2014-01-01

    The purpose of this presentation is to outline how the NASA Independent Verification and Validation (IV&V) Program helps to build reliability into the Space Mission Software Systems (SMSSs) that its customers develop.

  19. Validation and verification of a virtual environment for training naval submarine officers

    NASA Astrophysics Data System (ADS)

    Zeltzer, David L.; Pioch, Nicholas J.

    1996-04-01

    A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation, and the derivation of application requirements are briefly described. This is followed by a discussion of the implementation of the prototype. This implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular, and with respect to VE applications in general.

  20. Relativistic electron kinetic effects on laser diagnostics in burning plasmas

    NASA Astrophysics Data System (ADS)

    Mirnov, V. V.; Den Hartog, D. J.

    2018-02-01

    Toroidal interferometry/polarimetry (TIP), poloidal polarimetry (PoPola), and Thomson scattering systems (TS) are major optical diagnostics being designed and developed for ITER. Each of them relies upon a sophisticated quantitative understanding of the electron response to laser light propagating through a burning plasma. Review of the theoretical results for two different applications is presented: interferometry/polarimetry (I/P) and polarization of Thomson scattered light, unified by the importance of relativistic (quadratic in vTe/c) electron kinetic effects. For I/P applications, rigorous analytical results are obtained perturbatively by expansion in powers of the small parameter τ = Te/(me c²), where Te is the electron temperature and me is the electron rest mass. Experimental validation of the analytical models has been made by analyzing data of more than 1200 pulses collected from high-Te JET discharges. Based on this validation, the relativistic analytical expressions are included in the error analysis and design projects of the ITER TIP and PoPola systems. The polarization properties of incoherent Thomson scattered light are being examined as a method of Te measurement relevant to ITER operational regimes. The theory is based on Stokes vector transformation and the Mueller matrices formalism. The general approach is subdivided into frequency-integrated and frequency-resolved cases. For each of them, the exact analytical relativistic solutions are presented in the form of Mueller matrix elements averaged over the relativistic Maxwellian distribution function. New results related to the detailed verification of the frequency-resolved solutions are reported. The precise analytic expressions provide output much more rapidly than relativistic kinetic numerical codes, allowing for direct real-time feedback control of ITER device operation.
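
    The Stokes/Mueller formalism mentioned above can be illustrated with its simplest non-trivial element, the Mueller matrix for a rotation of the polarization plane (as in Faraday rotation) acting on a Stokes vector. The rotation angle below is an arbitrary illustrative value, and the sketch omits the relativistic corrections that are the subject of the record:

      import numpy as np

      def rotation_mueller(psi):
          """Mueller matrix rotating the polarization plane by angle psi (radians)."""
          c, s = np.cos(2 * psi), np.sin(2 * psi)
          return np.array([[1, 0,  0, 0],
                           [0, c, -s, 0],
                           [0, s,  c, 0],
                           [0, 0,  0, 1]], dtype=float)

      s_in = np.array([1.0, 1.0, 0.0, 0.0])      # linearly polarized beam: (I, Q, U, V)
      psi = np.deg2rad(5.0)                      # illustrative rotation angle
      s_out = rotation_mueller(psi) @ s_in
      print("output Stokes vector:", np.round(s_out, 4))
      print("recovered angle (deg): %.2f" % np.rad2deg(0.5 * np.arctan2(s_out[2], s_out[1])))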

  1. International Council for Standardization in Haematology (ICSH) Recommendations for Laboratory Measurement of Direct Oral Anticoagulants.

    PubMed

    Gosselin, Robert C; Adcock, Dorothy M; Bates, Shannon M; Douxfils, Jonathan; Favaloro, Emmanuel J; Gouin-Thibault, Isabelle; Guillermo, Cecilia; Kawai, Yohko; Lindhoff-Last, Edelgard; Kitchen, Steve

    2018-03-01

    This guidance document was prepared on behalf of the International Council for Standardization in Haematology (ICSH) for providing haemostasis-related guidance documents for clinical laboratories. This inaugural coagulation ICSH document was developed by an ad hoc committee, comprised of international clinical and laboratory direct acting oral anticoagulant (DOAC) experts. The committee developed consensus recommendations for laboratory measurement of DOACs (dabigatran, rivaroxaban, apixaban and edoxaban), which would be germane for laboratories assessing DOAC anticoagulation. This guidance document addresses all phases of laboratory DOAC measurements, including pre-analytical (e.g. preferred time sample collection, preferred sample type, sample stability), analytical (gold standard method, screening and quantifying methods) and post-analytical (e.g. reporting units, quality assurance). The committee addressed the use and limitations of screening tests such as prothrombin time, activated partial thromboplastin time as well as viscoelastic measurements of clotting blood and point of care methods. Additionally, the committee provided recommendations for the proper validation or verification of performance of laboratory assays prior to implementation for clinical use, and external quality assurance to provide continuous assessment of testing and reporting method. Schattauer GmbH Stuttgart.

  2. Multi-center evaluation of analytical performance of the Beckman Coulter AU5822 chemistry analyzer.

    PubMed

    Zimmerman, M K; Friesen, L R; Nice, A; Vollmer, P A; Dockery, E A; Rankin, J D; Zmuda, K; Wong, S H

    2015-09-01

    Our three academic institutions, Indiana University, Northwestern Memorial Hospital, and Wake Forest, were among the first in the United States to implement the Beckman Coulter AU5822 series chemistry analyzers. We undertook this post-hoc multi-center study by merging our data to determine performance characteristics and the impact of methodology changes on analyte measurement. We independently completed performance validation studies including precision, linearity/analytical measurement range, method comparison, and reference range verification. Complete data sets were available from at least one institution for 66 analytes with the following groups: 51 from all three institutions, and 15 from 1 or 2 institutions, for a total sample size of 12,064. Precision was similar among institutions. Coefficients of variation (CV) were <10% for 97% of analytes. Analytes with CVs >10% included direct bilirubin and digoxin. All analytes exhibited linearity over the analytical measurement range. Method comparison data showed slopes between 0.900 and 1.100 for 87.9% of the analytes. Slopes for amylase, tobramycin and urine amylase were <0.8; the slope for lipase was >1.5, due to known methodology or standardization differences. Consequently, reference ranges of amylase, urine amylase and lipase required only minor or no modification. The four AU5822 analyzers independently evaluated at three sites showed consistent precision, linearity, and correlation results. Since installation, the test results have been well received by clinicians from all three institutions. Copyright © 2015. Published by Elsevier Inc.

  3. Extension and validation of an analytical model for in vivo PET verification of proton therapy—a phantom and clinical study

    NASA Astrophysics Data System (ADS)

    Attanasi, F.; Knopf, A.; Parodi, K.; Paganetti, H.; Bortfeld, T.; Rosso, V.; Del Guerra, A.

    2011-08-01

    The interest in positron emission tomography (PET) as a tool for treatment verification in proton therapy has become widespread in recent years, and several research groups worldwide are currently investigating the clinical implementation. After the first off-line investigation with a PET/CT scanner at MGH (Boston, USA), attention is now focused on an in-room PET application immediately after treatment in order to also detect shorter-lived isotopes, such as O15 and N13, minimizing isotope washout and avoiding patient repositioning errors. Clinical trials are being conducted by means of commercially available PET systems, and other tests are planned using application-dedicated tomographs. Parallel to the experimental investigation and new hardware development, great interest has been shown in the development of fast procedures to provide feedback regarding the delivered dose from reconstructed PET images. Since the thresholds of inelastic nuclear reactions leading to tissue β+-activation fall within the energy range of 15-20 MeV, the distal activity fall-off is correlated, but not directly matched, to the distal fall-off of the dose distribution. Moreover, the physical interactions leading to β+-activation and energy deposition are of a different nature. All these facts make it essential to further develop accurate and fast methodologies capable of predicting, on the basis of the planned dose distribution, expected PET images to be compared with actual PET measurements, thus providing clinical feedback on the correctness of the dose delivery and of the irradiation field position. The aim of this study has been to validate an analytical model and to implement and evaluate it in a fast and flexible framework able to locally predict such activity distributions directly taking the reference planning CT and planned dose as inputs. The results achieved in this study for phantoms and clinical cases highlighted the potential of the implemented method to predict expected activity distributions with great accuracy. Thus, the analytical model can be used as a powerful substitute method to the sensitive and time-consuming Monte Carlo approach.

  4. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications.

    PubMed

    Meier, D C; Benkstein, K D; Hurst, W S; Chu, P M

    2017-05-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, -5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals.
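
    The referee method above quantifies analytes from reference spectra. A generic classical-least-squares (CLS) sketch of that idea is given below, assuming Beer-Lambert additivity of two synthetic reference spectra; it is illustrative only and is not the documented NIST procedure.

    ```python
    # Minimal classical-least-squares (CLS) sketch: estimate analyte concentrations
    # from a measured FT-IR absorbance spectrum using unit-concentration reference
    # spectra. All spectra and concentrations are synthetic.
    import numpy as np

    wavenumbers = np.linspace(800, 1200, 400)                 # cm^-1 grid (synthetic)
    ref_a = np.exp(-0.5 * ((wavenumbers - 950) / 15) ** 2)    # reference spectrum, 1 ppm
    ref_b = np.exp(-0.5 * ((wavenumbers - 1100) / 20) ** 2)   # reference spectrum, 1 ppm
    R = np.column_stack([ref_a, ref_b])

    true_conc = np.array([3.2, 1.5])                          # ppm (synthetic "truth")
    rng = np.random.default_rng(1)
    measured = R @ true_conc + rng.normal(0, 0.01, wavenumbers.size)

    est, residuals, *_ = np.linalg.lstsq(R, measured, rcond=None)
    print("estimated concentrations (ppm):", np.round(est, 2))
    ```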

  5. Fourier Transform Infrared Absorption Spectroscopy for Quantitative Analysis of Gas Mixtures at Low Temperatures for Homeland Security Applications

    PubMed Central

    Meier, D.C.; Benkstein, K.D.; Hurst, W.S.; Chu, P.M.

    2016-01-01

    Performance standard specifications for point chemical vapor detectors are established in ASTM E 2885-13 and ASTM E 2933-13. The performance evaluation of the detectors requires the accurate delivery of known concentrations of the chemical target to the system under test. Referee methods enable the analyte test concentration and associated uncertainties in the analyte test concentration to be validated by independent analysis, which is especially important for reactive analytes. This work extends the capability of a previously demonstrated method for using Fourier transform infrared (FT-IR) absorption spectroscopy for quantitatively evaluating the composition of vapor streams containing hazardous materials at Acute Exposure Guideline Levels (AEGL) to include test conditions colder than laboratory ambient temperatures. The described method covers the use of primary reference spectra to establish analyte concentrations, the generation of secondary reference spectra suitable for measuring analyte concentrations under specified testing environments, and the use of additional reference spectra and spectral profile strategies to mitigate the uncertainties due to impurities and water condensation within the low-temperature (7 °C, −5 °C) test cell. Important benefits of this approach include verification of the test analyte concentration with characterized uncertainties by in situ measurements co-located with the detector under test, near-real-time feedback, and broad applicability to toxic industrial chemicals. PMID:28090126

  6. 49 CFR Appendix D to Part 236 - Independent Review of Verification and Validation

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... standards. (f) The reviewer shall analyze all Fault Tree Analyses (FTA), Failure Mode and Effects... for each product vulnerability cited by the reviewer; (4) Identification of any documentation or... not properly followed; (6) Identification of the software verification and validation procedures, as...

  7. Independent verification and validation report of Washington state ferries' wireless high speed data project

    DOT National Transportation Integrated Search

    2008-06-30

    The following Independent Verification and Validation (IV&V) report documents and presents the results of a study of the Washington State Ferries Prototype Wireless High Speed Data Network. The purpose of the study was to evaluate and determine if re...

  8. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  9. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  10. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... analysis. Any establishment that does not have a HACCP plan because a hazard analysis has revealed no food.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  11. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... not have a HACCP plan because a hazard analysis has revealed no food safety hazards that are.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  12. 9 CFR 417.4 - Validation, Verification, Reassessment.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... analysis. Any establishment that does not have a HACCP plan because a hazard analysis has revealed no food.... 417.4 Section 417.4 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF... ACT HAZARD ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.4 Validation, Verification...

  13. Approaching the investigation of plasma turbulence through a rigorous verification and validation procedure: A practical example

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.

    In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)]. The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
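
    The record above relies on the method of manufactured solutions (MMS) to check that a code reproduces its formal order of accuracy. The sketch below illustrates that workflow on a deliberately tiny problem (a second-order solver for -u'' = f with a manufactured solution u = sin(pi x)); it demonstrates the technique only and has nothing to do with the GBS code itself.

    ```python
    # Minimal MMS sketch: manufacture u(x) = sin(pi x), derive the source term
    # f(x) = pi^2 sin(pi x), solve on two grids, and compute the observed order.
    import numpy as np

    def solve(n):
        """Solve -u'' = f on (0,1), u(0)=u(1)=0, with central differences."""
        h = 1.0 / (n + 1)
        x = np.linspace(h, 1.0 - h, n)                 # interior nodes
        f = np.pi ** 2 * np.sin(np.pi * x)             # manufactured source term
        A = (np.diag(2.0 * np.ones(n))
             - np.diag(np.ones(n - 1), 1)
             - np.diag(np.ones(n - 1), -1)) / h ** 2
        u = np.linalg.solve(A, f)
        err = np.sqrt(h * np.sum((u - np.sin(np.pi * x)) ** 2))  # discrete L2 error
        return h, err

    h1, e1 = solve(40)
    h2, e2 = solve(80)
    p = np.log(e1 / e2) / np.log(h1 / h2)              # observed order of accuracy
    print(f"observed order ≈ {p:.2f} (formal order of the scheme is 2)")
    ```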

  14. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    NASA Astrophysics Data System (ADS)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; Price, Stephen; Hoffman, Matthew; Lipscomb, William H.; Fyke, Jeremy; Vargo, Lauren; Boghozian, Adrianna; Norman, Matthew; Worley, Patrick H.

    2017-06-01

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Ultimately, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  15. A verification and validation effort for high explosives at Los Alamos National Lab (u)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scovel, Christina A; Menikoff, Ralph S

    2009-01-01

    We have started a project to verify and validate ASC codes used to simulate detonation waves in high explosives. Since there are no non-trivial analytic solutions, we are going to compare simulated results with experimental data that cover a wide range of explosive phenomena. The intent is to compare both different codes and different high explosives (HE) models. The first step is to test the products equation of state used for the HE models. For this purpose, the cylinder test, flyer plate and plate-push experiments are being used. These experiments sample different regimes in thermodynamic phase space: the CJ isentrope for the cylinder tests, the isentrope behind an overdriven detonation wave for the flyer plate experiment, and expansion following a reflected CJ detonation for the plate-push experiment, which is sensitive to the Gruneisen coefficient. The results of our findings for PBX 9501 are presented here.

  16. A highly reliable, high performance open avionics architecture for real time Nap-of-the-Earth operations

    NASA Technical Reports Server (NTRS)

    Harper, Richard E.; Elks, Carl

    1995-01-01

    An Army Fault Tolerant Architecture (AFTA) has been developed to meet real-time fault tolerant processing requirements of future Army applications. AFTA is the enabling technology that will allow the Army to configure existing processors and other hardware to provide high throughput and ultrahigh reliability necessary for TF/TA/NOE flight control and other advanced Army applications. A comprehensive conceptual study of AFTA has been completed that addresses a wide range of issues including requirements, architecture, hardware, software, testability, producibility, analytical models, validation and verification, common mode faults, VHDL, and a fault tolerant data bus. A Brassboard AFTA for demonstration and validation has been fabricated, and two operating systems and a flight-critical Army application have been ported to it. Detailed performance measurements have been made of fault tolerance and operating system overheads while AFTA was executing the flight application in the presence of faults.

  17. Low level vapor verification of monomethyl hydrazine

    NASA Technical Reports Server (NTRS)

    Mehta, Narinder

    1990-01-01

    The vapor scrubbing system and the coulometric test procedure for the low level vapor verification of monomethyl hydrazine (MMH) are evaluated. Experimental data on precision, efficiency of the scrubbing liquid, instrument response, detection and reliable quantitation limits, stability of the vapor scrubbed solution, and interference were obtained to assess the applicability of the method for the low ppb level detection of the analyte vapor in air. The results indicated that the analyte vapor scrubbing system and the coulometric test procedure can be utilized for the quantitative detection of low ppb level vapor of MMH in air.

  18. Academic Self-Esteem and Perceived Validity of Grades: A Test of Self-Verification Theory.

    ERIC Educational Resources Information Center

    Okun, Morris A.; Fournet, Lee M.

    1993-01-01

    The hypothesis derived from self-verification theory that semester grade point average would be positively related to perceived validity of grade scores among high self-esteem undergraduates and inversely related for low self-esteem students was not supported in a study with 281 undergraduates. (SLD)

  19. Nonclinical dose formulation analysis method validation and sample analysis.

    PubMed

    Whitmire, Monica Lee; Bryan, Peter; Henry, Teresa R; Holbrook, John; Lehmann, Paul; Mollitor, Thomas; Ohorodnik, Susan; Reed, David; Wietgrefe, Holly D

    2010-12-01

    Nonclinical dose formulation analysis methods are used to confirm test article concentration and homogeneity in formulations and determine formulation stability in support of regulated nonclinical studies. There is currently no regulatory guidance for nonclinical dose formulation analysis method validation or sample analysis. Regulatory guidance for the validation of analytical procedures has been developed for drug product/formulation testing; however, verification of the formulation concentrations falls under the framework of GLP regulations (not GMP). The only current related regulatory guidance is the bioanalytical guidance for method validation. The fundamental parameters for bioanalysis and formulation analysis validations that overlap include: recovery, accuracy, precision, specificity, selectivity, carryover, sensitivity, and stability. Divergence between bioanalytical and drug product validations typically centers on the acceptance criteria used. As the dose formulation samples are not true "unknowns", the concept of quality control samples covering the entire range of the standard curve, serving as the indication of confidence in the data generated from the "unknown" study samples, may not always be necessary. Also, the standard bioanalytical acceptance criteria may not be directly applicable, especially when the determined concentration does not match the target concentration. This paper attempts to reconcile the different practices being performed in the community and to provide recommendations of best practices and proposed acceptance criteria for nonclinical dose formulation method validation and sample analysis.
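
    Two of the overlapping validation parameters named above, accuracy (expressed as recovery against the nominal concentration) and precision (%CV), reduce to simple arithmetic. A hedged sketch with hypothetical replicate measurements follows; the nominal value and data are illustrative assumptions.

    ```python
    # Sketch: accuracy (% recovery vs. nominal) and precision (%CV) from
    # hypothetical replicate measurements of a dose formulation sample.
    import statistics

    nominal = 5.0                                      # mg/mL target (hypothetical)
    measured = [4.85, 4.92, 5.07, 4.98, 5.03, 4.88]    # replicate results (hypothetical)

    mean = statistics.mean(measured)
    recovery_pct = 100.0 * mean / nominal              # accuracy as % recovery
    cv_pct = 100.0 * statistics.stdev(measured) / mean # precision as %CV

    print(f"mean = {mean:.3f} mg/mL, recovery = {recovery_pct:.1f}%, CV = {cv_pct:.1f}%")
    ```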

  20. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  1. Results from an Independent View on The Validation of Safety-Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.

    2013-08-01

    Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the tester's work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained and the advantages/disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.

  2. Numerical implementation, verification and validation of two-phase flow four-equation drift flux model with Jacobian-free Newton–Krylov method

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2016-08-24

    This study presents a numerical investigation on using the Jacobian-free Newton–Krylov (JFNK) method to solve the two-phase flow four-equation drift flux model with realistic constitutive correlations (‘closure models'). The drift flux model is based on Ishii and his collaborators' work. Additional constitutive correlations for vertical channel flow, such as two-phase flow pressure drop, flow regime map, wall boiling and interfacial heat transfer models, were taken from the RELAP5-3D Code Manual and included to complete the model. The staggered grid finite volume method and fully implicit backward Euler method were used for the spatial discretization and time integration schemes, respectively. The Jacobian-free Newton–Krylov method shows no difficulty in solving the two-phase flow drift flux model with a discrete flow regime map. In addition to the Jacobian-free approach, the preconditioning matrix is obtained by using the default finite differencing method provided in the PETSc package, and consequently the labor-intensive implementation of a complex analytical Jacobian matrix is avoided. Extensive and successful numerical verification and validation have been performed to prove the correct implementation of the models and methods. Code-to-code comparison with RELAP5-3D has further demonstrated the successful implementation of the drift flux model.
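
    The central numerical ingredient named above, a Jacobian-free Newton–Krylov solve of a discretized nonlinear system, can be illustrated with SciPy's newton_krylov on a small stand-in problem. The sketch below is only a demonstration of the JFNK strategy on a toy two-point boundary-value problem; it is not the drift flux model or its RELAP5-style closures.

    ```python
    # Illustrative JFNK sketch with scipy.optimize.newton_krylov: solve the
    # discretized nonlinear BVP u'' = u**3, u(0)=0, u(1)=1 (toy problem).
    import numpy as np
    from scipy.optimize import newton_krylov

    N = 50
    h = 1.0 / (N + 1)
    x = np.linspace(h, 1.0 - h, N)          # interior nodes

    def residual(u):
        # Pad with the Dirichlet boundary values u(0)=0, u(1)=1
        up = np.concatenate(([0.0], u, [1.0]))
        d2u = (up[:-2] - 2.0 * up[1:-1] + up[2:]) / h ** 2
        return d2u - u ** 3                  # nonlinear residual F(u) = 0

    u0 = x.copy()                            # linear initial guess
    u = newton_krylov(residual, u0, f_tol=1e-10)
    print("max |residual| =", np.abs(residual(u)).max())
    ```

    The point of the Jacobian-free approach is that only residual evaluations are supplied; the Jacobian-vector products needed by the Krylov solver are approximated by finite differences internally.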

  3. Applying Independent Verification and Validation to Automatic Test Equipment

    NASA Technical Reports Server (NTRS)

    Calhoun, Cynthia C.

    1997-01-01

    This paper describes a general overview of applying Independent Verification and Validation (IV&V) to Automatic Test Equipment (ATE). The overview is not inclusive of all IV&V activities that can occur or of all development and maintenance items that can be validated and verified during the IV&V process. A sampling of possible IV&V activities that can occur within each phase of the ATE life cycle is described.

  4. The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.

  5. Evaluating the performance of the two-phase flow solver interFoam

    NASA Astrophysics Data System (ADS)

    Deshpande, Suraj S.; Anumolu, Lakshman; Trujillo, Mario F.

    2012-01-01

    The performance of the open source multiphase flow solver, interFoam, is evaluated in this work. The solver is based on a modified volume of fluid (VoF) approach, which incorporates an interfacial compression flux term to mitigate the effects of numerical smearing of the interface. It forms a part of the C++ libraries and utilities of OpenFOAM and is gaining popularity in the multiphase flow research community. However, to the best of our knowledge, the evaluation of this solver is confined to the validation tests of specific interest to the users of the code and the extent of its applicability to a wide range of multiphase flow situations remains to be explored. In this work, we have performed a thorough investigation of the solver performance using a variety of verification and validation test cases, which include (i) verification tests for pure advection (kinematics), (ii) dynamics in the high Weber number limit and (iii) dynamics of surface tension-dominated flows. With respect to (i), the kinematics tests show that the performance of interFoam is generally comparable with the recent algebraic VoF algorithms; however, it is noticeably worse than the geometric reconstruction schemes. For (ii), the simulations of inertia-dominated flows with large density ratios ($\sim\mathcal{O}(10^3)$) yielded excellent agreement with analytical and experimental results. In regime (iii), where surface tension is important, consistency of pressure-surface tension formulation and accuracy of curvature are important, as established by Francois et al (2006 J. Comput. Phys. 213 141-73). Several verification tests were performed along these lines and the main findings are: (a) the algorithm of interFoam ensures a consistent formulation of pressure and surface tension; (b) the curvatures computed by the solver converge to a value slightly (10%) different from the analytical value and a scope for improvement exists in this respect. To reduce the disruptive effects of spurious currents, we followed the analysis of Galusinski and Vigneaux (2008 J. Comput. Phys. 227 6140-64) and arrived at the following criterion for stable capillary simulations for interFoam: $\Delta t \leqslant \max(10\tau_\mu,\, 0.1\tau_\rho)$, where $\tau_\mu = \mu \Delta x / \sigma$ and $\tau_\rho = \sqrt{\rho \Delta x^3 / \sigma}$. Finally, some capillary flows relevant to atomization were simulated, resulting in good agreement with the results from the literature.
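
    The capillary time-step criterion quoted in the abstract is straightforward to evaluate; a small sketch follows. The fluid properties and mesh size used are illustrative (roughly water/air at room temperature), not values from the paper.

    ```python
    # Sketch of the capillary time-step criterion quoted above:
    # dt <= max(10*tau_mu, 0.1*tau_rho), tau_mu = mu*dx/sigma,
    # tau_rho = sqrt(rho*dx**3/sigma). Property values are illustrative.
    import math

    def capillary_dt_limit(mu, rho, sigma, dx):
        """Return the maximum stable time step for surface-tension-driven flow."""
        tau_mu = mu * dx / sigma                     # viscous capillary time scale
        tau_rho = math.sqrt(rho * dx ** 3 / sigma)   # inertial capillary time scale
        return max(10.0 * tau_mu, 0.1 * tau_rho)

    dt_max = capillary_dt_limit(mu=1.0e-3, rho=1000.0, sigma=0.072, dx=50e-6)
    print(f"maximum capillary time step ≈ {dt_max:.3e} s")
    ```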

  6. Circuitbot

    DTIC Science & Technology

    2016-03-01

    ...constraints problem. Game rules described valid moves allowing the player to generate a memory graph, performing improved C program verification. Subject terms: Formal Verification, Static Analysis, Abstract Interpretation, Pointer Analysis, Fixpoint Iteration.

  7. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2013-04-01 2013-04-01 false Verification and validation. 120.11 Section 120.11...

  8. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification and validation. 120.11 Section 120.11...

  9. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification and validation. 120.11 Section 120.11...

  10. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Verification and validation. 120.11 Section 120.11...

  11. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... analysis. Whenever a juice processor has no HACCP plan because a hazard analysis has revealed no food... analysis whenever there are any changes in the process that could reasonably affect whether a food hazard... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Verification and validation. 120.11 Section 120.11...

  12. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures the Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  13. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    The state of the art analytical techniques for the development and verification of digital flight control software are studied and a practical designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error preventing and detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically, the cost of implementing and applying the techniques as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgements are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  14. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  15. LIVVkit: An extensible, python-based, land ice verification and validation toolkit for ice sheet models

    DOE PAGES

    Kennedy, Joseph H.; Bennett, Andrew R.; Evans, Katherine J.; ...

    2017-03-23

    To address the pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice sheet models is underway. Concurrent to the development of the Community Ice Sheet Model (CISM), the corresponding verification and validation (V&V) process is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVVkit). Incorporated into the typical ice sheet model development cycle, it provides robust and automated numerical verification, software verification, performance validation, and physical validation analyses on a variety of platforms, from personal laptops to the largest supercomputers. LIVVkit operates on sets of regression test and reference data sets, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of model variables to indicate where differences occur. LIVVkit also provides an easily extensible framework to incorporate and analyze results of new intercomparison projects, new observation data, and new computing platforms. LIVVkit is designed for quick adaptation to additional ice sheet models via abstraction of model specific code, functions, and configurations into an ice sheet model description bundle outside the main LIVVkit structure. Furthermore, through shareable and accessible analysis output, LIVVkit is intended to help developers build confidence in their models and enhance the credibility of ice sheet models overall.

  16. Description of a Website Resource for Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.; Smith, Brian R.; Huang, George P.

    2010-01-01

    The activities of the Turbulence Model Benchmarking Working Group - which is a subcommittee of the American Institute of Aeronautics and Astronautics (AIAA) Fluid Dynamics Technical Committee - are described. The group's main purpose is to establish a web-based repository for Reynolds-averaged Navier-Stokes turbulence model documentation, including verification and validation cases. This turbulence modeling resource has been established based on feedback from a survey on what is needed to achieve consistency and repeatability in turbulence model implementation and usage, and to document and disseminate information on new turbulence models or improvements to existing models. The various components of the website are described in detail: description of turbulence models, turbulence model readiness rating system, verification cases, validation cases, validation databases, and turbulence manufactured solutions. An outline of future plans of the working group is also provided.

  17. Evaluation of HCFC AK 225 Alternatives for Precision Cleaning and Verification

    NASA Technical Reports Server (NTRS)

    Melton, D. M.

    1998-01-01

    Maintaining qualified cleaning and verification processes is essential in a production environment. Environmental regulations have impacted, and continue to impact, cleaning and verification processing for components and large structures, both at the Michoud Assembly Facility and at component suppliers. The goal of the effort was to assure that cleaning and verification proceed unimpeded and that qualified, environmentally compliant material and process replacements are implemented and perform to specifications. The approach consisted of (1) selection of a Supersonic Gas-Liquid Cleaning System; (2) selection and evaluation of three cleaning and verification solvents as candidate alternatives to HCFC 225 (Vertrel 423 (HCFC), Vertrel MCA (HFC/1,2-Dichloroethylene), and HFE 7100DE (HFE/1,2-Dichloroethylene)); and (3) evaluation of an analytical instrumental post-cleaning verification technique. This document is presented in viewgraph format.

  18. Validation of mesoscale models

    NASA Technical Reports Server (NTRS)

    Kuo, Bill; Warner, Tom; Benjamin, Stan; Koch, Steve; Staniforth, Andrew

    1993-01-01

    The topics discussed include the following: verification of cloud prediction from the PSU/NCAR mesoscale model; results from MAPS/NGM verification comparisons and MAPS observation sensitivity tests to ACARS and profiler data; systematic errors and mesoscale verification for a mesoscale model; and the COMPARE Project and the CME.

  19. Finite element code FENIA verification and application for 3D modelling of thermal state of radioactive waste deep geological repository

    NASA Astrophysics Data System (ADS)

    Butov, R. A.; Drobyshevsky, N. I.; Moiseenko, E. V.; Tokarev, U. N.

    2017-11-01

    The verification of the FENIA finite element code on some problems and an example of its application are presented in the paper. The code is being developed for 3D modelling of thermal, mechanical and hydrodynamical (THM) problems related to the functioning of deep geological repositories. Verification of the code for two analytical problems has been performed. The first is a point heat source with exponential heat decrease; the second is a linear heat source with similar behavior. Analytical solutions have been obtained by the authors. The problems were chosen because they reflect the processes influencing the thermal state of a deep geological repository of radioactive waste. Verification was performed for several meshes with different resolution. Good convergence between analytical and numerical solutions was achieved. The application of the FENIA code is illustrated by 3D modelling of the thermal state of a prototypic deep geological repository of radioactive waste. The repository is designed for disposal of radioactive waste in a rock at a depth of several hundred meters with no intention of later retrieval. Vitrified radioactive waste is placed in containers, which are placed in vertical boreholes. The residual decay heat of the radioactive waste leads to heating of the containers, engineered safety barriers and host rock. Maximum temperatures and corresponding times of their establishment have been determined.
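
    The first verification problem mentioned above (a point heat source with exponentially decaying power) has a classical closed-form building block: the instantaneous point-source Green's function for an infinite homogeneous medium, superposed in time. The sketch below illustrates that construction; the source law Q(t) = Q0 exp(-lam t), the material properties, and the evaluation point are illustrative assumptions, not the FENIA test data.

    ```python
    # Hedged sketch: temperature rise from a point heat source with exponentially
    # decaying power, by time-superposition of the instantaneous point-source
    # Green's function in an infinite homogeneous medium. All values illustrative.
    import numpy as np
    from scipy.integrate import quad

    k, rho, cp = 2.5, 2600.0, 900.0        # rock-like conductivity, density, heat capacity
    alpha = k / (rho * cp)                 # thermal diffusivity [m^2/s]
    Q0, lam = 1000.0, 1e-9                 # initial power [W], decay constant [1/s]

    def dT(r, t):
        """Temperature rise [K] at radius r [m] and time t [s]."""
        def kernel(tp):
            tau = t - tp
            return (Q0 * np.exp(-lam * tp)
                    / (rho * cp * (4.0 * np.pi * alpha * tau) ** 1.5)
                    * np.exp(-r ** 2 / (4.0 * alpha * tau)))
        val, _ = quad(kernel, 0.0, t, limit=200)
        return val

    print(f"dT(r=1 m, t=10 yr) ≈ {dT(1.0, 10 * 3.155e7):.2f} K")
    ```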

  20. Analytical Approach Validation for the Spin-Stabilized Satellite Attitude

    NASA Technical Reports Server (NTRS)

    Zanardi, Maria Cecilia F. P. S.; Garcia, Roberta Veloso; Kuga, Helio Koiti

    2007-01-01

    An analytical approach for spin-stabilized spacecraft attitude prediction is presented, accounting for the influence of residual magnetic torques on a satellite in an elliptical orbit. Assuming a quadrupole model for the Earth's magnetic field, an analytical averaging method is applied to obtain the mean residual torque over every orbital period. The orbit mean anomaly is used to compute the average components of the residual torque in the spacecraft body frame reference system. The theory is developed for time variations in the orbital elements, giving rise to many curvature integrals. It is observed that the residual magnetic torque does not have a component along the spin axis. The inclusion of this torque in the rotational motion differential equations of a spin-stabilized spacecraft yields conditions to derive an analytical solution. The solution shows that the residual torque does not affect the spin velocity magnitude, contributing only to the precession and the drift of the spin axis of the spacecraft. The theory developed has been applied to the Brazilian spin-stabilized satellites, which are quite appropriate for verification and comparison of the theory with the data generated and processed by the Satellite Control Center of the Brazil National Research Institute. The results show the period over which the analytical solution can be used for attitude propagation, within the dispersion range of the attitude determination system performance of the Satellite Control Center of the Brazil National Research Institute.

  1. Keeping the Momentum and Nuclear Forensics at Los Alamos National Laboratory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Steiner, Robert Ernest; Dion, Heather M.; Dry, Donald E.

    LANL has 70 years of experience in nuclear forensics and supports the community through a wide variety of efforts and leveraged capabilities: Expanding the understanding of nuclear forensics, providing training on nuclear forensics methods, and developing bilateral relationships to expand our understanding of nuclear forensic science. LANL remains highly supportive of several key organizations tasked with carrying forth the Nuclear Security Summit messages: IAEA, GICNT, and INTERPOL. Analytical chemistry measurements on plutonium and uranium matrices are critical to numerous programs including safeguards accountancy verification measurements. Los Alamos National Laboratory operates capable actinide analytical chemistry and material science laboratories suitable for nuclear material and environmental forensic characterization. Los Alamos National Laboratory uses numerous means to validate and independently verify that measurement data quality objectives are met. Numerous LANL nuclear facilities support the nuclear material handling, preparation, and analysis capabilities necessary to evaluate samples containing nearly any mass of an actinide (attogram to kilogram levels).

  2. Modal analysis of graphene-based structures for large deformations, contact and material nonlinearities

    NASA Astrophysics Data System (ADS)

    Ghaffari, Reza; Sauer, Roger A.

    2018-06-01

    The nonlinear frequencies of pre-stressed graphene-based structures, such as flat graphene sheets and carbon nanotubes, are calculated. These structures are modeled with a nonlinear hyperelastic shell model. The model is calibrated with quantum mechanics data and is valid for high strains. Analytical solutions of the natural frequencies of various plates are obtained for the Canham bending model by assuming infinitesimal strains. These solutions are used for the verification of the numerical results. The performance of the model is illustrated by means of several examples. Modal analysis is performed for square plates under pure dilatation or uniaxial stretch, circular plates under pure dilatation or under the effects of an adhesive substrate, and carbon nanotubes under uniaxial compression or stretch. The adhesive substrate is modeled with van der Waals interaction (based on the Lennard-Jones potential) and a coarse grained contact model. It is shown that the analytical natural frequencies underestimate the real ones, and this should be considered in the design of devices based on graphene structures.
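
    The abstract above verifies numerical results against analytical plate frequencies. As a stand-in illustration of that kind of analytical check, the sketch below evaluates the classical Kirchhoff result for a simply supported rectangular plate, f_mn = (pi/2)((m/a)^2 + (n/b)^2) sqrt(D/rho_s); it is not the paper's hyperelastic shell or Canham bending model, and the graphene-like bending stiffness, areal mass, and sheet size are assumed values.

    ```python
    # Hedged sketch: natural frequencies of a simply supported Kirchhoff plate,
    # used here only as an example of an analytical modal benchmark.
    import math

    def plate_frequency(m, n, a, b, D, rho_s):
        """Natural frequency [Hz] of mode (m, n); D = bending stiffness [J],
        rho_s = mass per unit area [kg/m^2], a, b = plate dimensions [m]."""
        return (math.pi / 2.0) * ((m / a) ** 2 + (n / b) ** 2) * math.sqrt(D / rho_s)

    # Illustrative graphene-like numbers (assumed): ~1.5 eV bending stiffness,
    # single-layer areal mass, 10 nm x 10 nm sheet.
    D = 1.5 * 1.602e-19          # J
    rho_s = 7.6e-7               # kg/m^2
    a = b = 10e-9                # m
    for mode in [(1, 1), (1, 2), (2, 2)]:
        print(mode, f"{plate_frequency(*mode, a, b, D, rho_s) / 1e9:.2f} GHz")
    ```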

  3. Computational Fluid Dynamics Uncertainty Analysis Applied to Heat Transfer over a Flat Plate

    NASA Technical Reports Server (NTRS)

    Groves, Curtis Edward; Ilie, Marcel; Schallhorn, Paul A.

    2013-01-01

    There have been few discussions on using Computational Fluid Dynamics (CFD) without experimental validation. Pairing experimental data, uncertainty analysis, and analytical predictions provides a comprehensive approach to verification and is the current state of the art. With constrained budgets, collecting experimental data is rare or non-existent. This paper investigates and proposes a method to perform CFD uncertainty analysis only from computational data. The method uses current CFD uncertainty techniques coupled with the Student-t distribution to predict the heat transfer coefficient over a flat plate. The inputs to the CFD model are varied by a specified tolerance or bias error, and the differences in the results are used to estimate the uncertainty. The variation in each input is ranked from least to greatest to determine the order of importance. The results are compared to heat transfer correlations and conclusions drawn about the feasibility of using CFD without experimental data. The results provide a tactic to analytically estimate the uncertainty in a CFD model when experimental data are unavailable.
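
    The Student-t element of the approach described above amounts to building a coverage interval from a small sample of perturbed-input CFD results. A minimal sketch follows; the sample of heat transfer coefficients is hypothetical, and this is only the generic t-interval calculation, not the paper's full procedure.

    ```python
    # Sketch: Student-t expanded uncertainty from a small set of CFD results
    # obtained by perturbing model inputs (hypothetical values).
    import numpy as np
    from scipy import stats

    h_samples = np.array([112.4, 109.8, 115.1, 111.0, 113.6, 110.2])  # W/(m^2 K)
    n = h_samples.size
    mean = h_samples.mean()
    s = h_samples.std(ddof=1)
    t95 = stats.t.ppf(0.975, df=n - 1)       # two-sided 95% coverage factor
    u95 = t95 * s / np.sqrt(n)               # expanded uncertainty of the mean
    print(f"h = {mean:.1f} ± {u95:.1f} W/(m^2 K) (95%, Student-t, n={n})")
    ```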

  4. Modeling of Compressible Flow with Friction and Heat Transfer Using the Generalized Fluid System Simulation Program (GFSSP)

    NASA Technical Reports Server (NTRS)

    Bandyopadhyay, Alak; Majumdar, Alok

    2007-01-01

    The present paper describes the verification and validation of a quasi-one-dimensional pressure-based finite volume algorithm, implemented in the Generalized Fluid System Simulation Program (GFSSP), for predicting compressible flow with friction, heat transfer and area change. The numerical predictions were compared with two classical solutions of compressible flow, i.e. Fanno and Rayleigh flow. Fanno flow provides an analytical solution of compressible flow in a long slender pipe where the incoming subsonic flow can be choked due to friction. On the other hand, Rayleigh flow provides an analytical solution of frictionless compressible flow with heat transfer where the incoming subsonic flow can be choked at the outlet boundary with heat addition to the control volume. Nonuniform grid distribution improves the accuracy of the numerical prediction. A benchmark numerical solution of compressible flow in a converging-diverging nozzle with friction and heat transfer has been developed to verify GFSSP's numerical predictions. The numerical predictions compare favorably in all cases.
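
    The Fanno-flow benchmark referred to above has a closed-form relation that is convenient for this kind of code verification: the friction parameter 4fL*/D at which a subsonic flow of inlet Mach number M chokes. The sketch below evaluates that classical relation for a perfect gas; the Mach numbers and gamma are illustrative, and this is a generic textbook check rather than the GFSSP test setup itself.

    ```python
    # Sketch: classical Fanno-flow choking relation,
    # 4fL*/D = (1 - M^2)/(gamma*M^2)
    #          + (gamma+1)/(2*gamma) * ln[(gamma+1)*M^2 / (2 + (gamma-1)*M^2)]
    import math

    def fanno_4fL_over_D(M, gamma=1.4):
        """Friction parameter 4 f L*/D for Fanno flow at Mach number M."""
        term1 = (1.0 - M ** 2) / (gamma * M ** 2)
        term2 = (gamma + 1.0) / (2.0 * gamma) * math.log(
            (gamma + 1.0) * M ** 2 / (2.0 + (gamma - 1.0) * M ** 2))
        return term1 + term2

    for M in (0.2, 0.5, 0.8):
        print(f"M = {M:.1f}:  4fL*/D = {fanno_4fL_over_D(M):.3f}")
    ```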

  5. 7 CFR 272.11 - Systematic Alien Verification for Entitlements (SAVE) Program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 7 Agriculture 4 2013-01-01 2013-01-01 false Systematic Alien Verification for Entitlements (SAVE... FOR PARTICIPATING STATE AGENCIES § 272.11 Systematic Alien Verification for Entitlements (SAVE... and Naturalization Service (INS), in order to verify the validity of documents provided by aliens...

  6. 7 CFR 272.11 - Systematic Alien Verification for Entitlements (SAVE) Program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 7 Agriculture 4 2012-01-01 2012-01-01 false Systematic Alien Verification for Entitlements (SAVE... FOR PARTICIPATING STATE AGENCIES § 272.11 Systematic Alien Verification for Entitlements (SAVE... and Naturalization Service (INS), in order to verify the validity of documents provided by aliens...

  7. 7 CFR 272.11 - Systematic Alien Verification for Entitlements (SAVE) Program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 7 Agriculture 4 2010-01-01 2010-01-01 false Systematic Alien Verification for Entitlements (SAVE... FOR PARTICIPATING STATE AGENCIES § 272.11 Systematic Alien Verification for Entitlements (SAVE... and Naturalization Service (INS), in order to verify the validity of documents provided by aliens...

  8. 7 CFR 272.11 - Systematic Alien Verification for Entitlements (SAVE) Program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 7 Agriculture 4 2011-01-01 2011-01-01 false Systematic Alien Verification for Entitlements (SAVE... FOR PARTICIPATING STATE AGENCIES § 272.11 Systematic Alien Verification for Entitlements (SAVE... and Naturalization Service (INS), in order to verify the validity of documents provided by aliens...

  9. 7 CFR 272.11 - Systematic Alien Verification for Entitlements (SAVE) Program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 7 Agriculture 4 2014-01-01 2014-01-01 false Systematic Alien Verification for Entitlements (SAVE... FOR PARTICIPATING STATE AGENCIES § 272.11 Systematic Alien Verification for Entitlements (SAVE... and Naturalization Service (INS), in order to verify the validity of documents provided by aliens...

  10. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (IIT-A-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures for the initial and periodic verification and validation of computer programs. The programs are used during the Arizona NHEXAS project and Border study at the Illinois Institute of Technology (IIT) site. Keywords: computers; s...

  11. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7.

  12. Using Academia-Industry Partnerships to Enhance Software Verification & Validation Education via Active Learning Tools

    ERIC Educational Resources Information Center

    Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter

    2017-01-01

    Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…

  13. Software Independent Verification and Validation (SIV&V) Simplified

    DTIC Science & Technology

    2006-12-01

    Acronym list fragment: Configuration Item; I/O (Input/Output); I2V2 (Independent Integrated Verification and Validation); IBM (International Business Machines); ICD (Interface…); IPT (Integrated Product Team); IRS (Interface Requirements Specification); ISD (Integrated System Diagram); ITD (Integrated Test Description); ITP (…). The text also references early programming languages such as COBOL (Common Business Oriented Language) (Codasyl committee 1960) and FORTRAN (FORmula TRANslator) (IBM 1952).

  14. NHEXAS PHASE I ARIZONA STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the "Border" study. Keywords: Computers; Software; QA/QC.

    The National Human Exposure Assessment Sur...

  15. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers the verification and protocol validation of distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications which pass conformance testing are then checked to see whether they can operate together. This is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling for inter-layer representation for compatibility between conformance and interoperability testing; (2) computational improvement to current testing methods by using the proposed model, inclusive of the formulation of new qualitative and quantitative measures and time-dependent behavior; (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  16. An Overview of NASA's IM&S Verification and Validation Process Plan and Specification for Space Exploration

    NASA Technical Reports Server (NTRS)

    Gravitz, Robert M.; Hale, Joseph

    2006-01-01

    NASA's Exploration Systems Mission Directorate (ESMD) is implementing a management approach for modeling and simulation (M&S) that will provide decision-makers with information on a model's fidelity, credibility, and quality. This information will allow the decision-maker to understand the risks involved in using a model's results in the decision-making process. This presentation will describe NASA's approach for verification and validation (V&V) of its models and simulations supporting space exploration, including NASA's V&V process and the associated M&S V&V activities required to support the decision-making process. The M&S V&V Plan and V&V Report templates for ESMD will also be illustrated.

  17. 7 CFR 1980.353 - Filing and processing applications.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... subject to the availability of funds. (15) A copy of a valid verification of income for each adult member... method of verifying information. Verifications must pass directly from the source of information to the Lender and shall not pass through the hands of a third party or applicant. (1) Income verification...

  18. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  19. Reverse Engineering Validation using a Benchmark Synthetic Gene Circuit in Human Cells

    PubMed Central

    Kang, Taek; White, Jacob T.; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas

    2013-01-01

    Multi-component biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network. PMID:23654266

  20. Reverse engineering validation using a benchmark synthetic gene circuit in human cells.

    PubMed

    Kang, Taek; White, Jacob T; Xie, Zhen; Benenson, Yaakov; Sontag, Eduardo; Bleris, Leonidas

    2013-05-17

    Multicomponent biological networks are often understood incompletely, in large part due to the lack of reliable and robust methodologies for network reverse engineering and characterization. As a consequence, developing automated and rigorously validated methodologies for unraveling the complexity of biomolecular networks in human cells remains a central challenge to life scientists and engineers. Today, when it comes to experimental and analytical requirements, there exists a great deal of diversity in reverse engineering methods, which renders the independent validation and comparison of their predictive capabilities difficult. In this work we introduce an experimental platform customized for the development and verification of reverse engineering and pathway characterization algorithms in mammalian cells. Specifically, we stably integrate a synthetic gene network in human kidney cells and use it as a benchmark for validating reverse engineering methodologies. The network, which is orthogonal to endogenous cellular signaling, contains a small set of regulatory interactions that can be used to quantify the reconstruction performance. By performing successive perturbations to each modular component of the network and comparing protein and RNA measurements, we study the conditions under which we can reliably reconstruct the causal relationships of the integrated synthetic network.

  1. Commissioning and validation of COMPASS system for VMAT patient specific quality assurance

    NASA Astrophysics Data System (ADS)

    Pimthong, J.; Kakanaporn, C.; Tuntipumiamorn, L.; Laojunun, P.; Iampongpaiboon, P.

    2016-03-01

    Pre-treatment patient-specific quality assurance (QA) of advanced treatment techniques such as volumetric modulated arc therapy (VMAT) is an important QA task in radiotherapy, and a fast, reliable dosimetric device is required. The objective of this study was to commission and validate the performance of the COMPASS system for dose verification of the VMAT technique. The COMPASS system is composed of an array of ionization detectors (MatriXX) mounted to the gantry using a custom holder, together with software for the analysis and visualization of QA results. We validated the COMPASS software for basic and advanced clinical applications. For the basic clinical study, simple open fields of various sizes were validated in a homogeneous phantom. For the advanced clinical application, fifteen prostate and fifteen nasopharyngeal cancer VMAT plans were studied. The treatment plans were measured with the MatriXX. The doses and dose-volume histograms (DVHs) reconstructed from the fluence measurements were compared to the TPS-calculated plans. In addition, the doses and DVHs computed using the collapsed cone convolution (CCC) algorithm were compared with Eclipse TPS plans calculated using the Analytical Anisotropic Algorithm (AAA), according to the dose specification in ICRU 83 for the PTV.
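
    The comparison of reconstructed and TPS doses described above amounts to computing DVH metrics for each structure from both dose arrays and reporting their differences. The sketch below illustrates that idea only; it uses hypothetical voxel doses and is not the COMPASS or Eclipse implementation.

```python
import numpy as np

def dvh_metrics(dose, volume_cc_per_voxel):
    """Return mean dose and the near-maximum dose D2cc from a structure's voxel doses."""
    sorted_dose = np.sort(dose)[::-1]                      # descending dose
    cum_volume = np.arange(1, dose.size + 1) * volume_cc_per_voxel
    idx = min(np.searchsorted(cum_volume, 2.0), dose.size - 1)
    d2cc = sorted_dose[idx]                                 # dose to the hottest 2 cm^3
    return dose.mean(), d2cc

def percent_difference(measured, planned):
    return 100.0 * (measured - planned) / planned

# Hypothetical voxel doses (Gy) for one structure, from reconstruction and from the TPS.
rng = np.random.default_rng(0)
tps_dose = rng.normal(50.0, 2.0, size=5000)
reconstructed_dose = tps_dose * 0.99                        # e.g., a 1% systematic difference

mean_m, d2cc_m = dvh_metrics(reconstructed_dose, volume_cc_per_voxel=0.01)
mean_p, d2cc_p = dvh_metrics(tps_dose, volume_cc_per_voxel=0.01)
print(f"Mean dose diff: {percent_difference(mean_m, mean_p):+.1f} %")
print(f"D2cc diff:      {percent_difference(d2cc_m, d2cc_p):+.1f} %")
```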

  2. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.

  3. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
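
    To make the planning structure described above concrete, the sketch below captures the same hierarchy (requirement-level verification activities grouped into executable verification events) as plain data types. It is an illustrative representation only, not the LSST SysML model or the Enterprise Architect schema; all names and fields are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum
from typing import List

class Method(Enum):          # typical verification methods; names are illustrative
    INSPECTION = "inspection"
    ANALYSIS = "analysis"
    DEMONSTRATION = "demonstration"
    TEST = "test"

@dataclass
class VerificationActivity:  # one verification method applied to one requirement
    requirement_id: str
    method: Method
    success_criteria: str
    level: str               # e.g., subsystem or system level
    owner: str

@dataclass
class VerificationEvent:     # activities that can be executed together
    name: str
    activities: List[VerificationActivity] = field(default_factory=list)
    depends_on: List[str] = field(default_factory=list)   # sequencing between events

# Example: two activities for one (hypothetical) requirement grouped into a single event.
req = "LSST-REQ-0001"
event = VerificationEvent(
    name="Telescope pointing campaign",
    activities=[
        VerificationActivity(req, Method.TEST, "RMS pointing error within budget", "System", "T&V team"),
        VerificationActivity(req, Method.ANALYSIS, "Error budget closes with margin", "System", "SE team"),
    ],
)
print(len(event.activities), "activities planned for", event.name)
```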

  4. Generalized model of electromigration with 1:1 (analyte:selector) complexation stoichiometry: part II. Application to dual systems and experimental verification.

    PubMed

    Müllerová, Ludmila; Dubský, Pavel; Gaš, Bohuslav

    2015-03-06

    Interactions among analyte forms that undergo simultaneous dissociation/protonation and complexation with multiple selectors take the shape of a highly interconnected multi-equilibrium scheme. This makes it difficult to express the effective mobility of the analyte in these systems, which are often encountered in electrophoretical separations, unless a generalized model is introduced. In the first part of this series, we presented the theory of electromigration of a multivalent weakly acidic/basic/amphoteric analyte undergoing complexation with a mixture of an arbitrary number of selectors. In this work we demonstrate the validity of this concept experimentally. The theory leads to three useful perspectives, each of which is closely related to the one originally formulated for simpler systems. If pH, IS and the selector mixture composition are all kept constant, the system is treated as if only a single analyte form interacted with a single selector. If the pH changes at constant IS and mixture composition, the already well-established models of a weakly acidic/basic analyte interacting with a single selector can be employed. Varying the mixture composition at constant IS and pH leads to a situation where virtually a single analyte form interacts with a mixture of selectors. We show how to switch between the three perspectives in practice and confirm that they can be employed interchangeably according to the specific needs by measurements performed in single- and dual-selector systems at a pH where the analyte is fully dissociated, partly dissociated or fully protonated. Weak monoprotic analyte (R-flurbiprofen) and two selectors (native β-cyclodextrin and monovalent positively charged 6-monodeoxy-6-monoamino-β-cyclodextrin) serve as a model system. Copyright © 2015 Elsevier B.V. All rights reserved.
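
    For orientation, the single-analyte-form, single-selector limiting case that the generalized model reduces to is the familiar 1:1 complexation expression shown below (a standard result in affinity capillary electrophoresis, quoted here for context rather than taken from the paper):

```latex
% Effective mobility of a single analyte form A interacting 1:1 with one selector S:
% K is the complexation constant, [S] the free selector concentration, and
% \mu_A and \mu_{AS} the mobilities of the free and complexed analyte.
\mu_{\mathrm{eff}} \;=\; \frac{\mu_{A} + \mu_{AS}\,K[S]}{1 + K[S]}
```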

  5. Verification, Validation, and Accreditation (VV&A) of Federations (Verification, validation et accreditation (VV&A) des federations)

    DTIC Science & Technology

    2008-04-01

    ... the lack of a universal model for the Verification, Validation and Accreditation of federations, owing to differing national perspectives and needs ... federation application will depend on a number of factors, including the quality of the requirements information and the resources allocated to the VV&A ... required) allocating the required functionality to federates, and developing a detailed plan for federation development and implementation. Step 4

  6. The Design and Evaluation of Class Exercises as Active Learning Tools in Software Verification and Validation

    ERIC Educational Resources Information Center

    Wu, Peter Y.; Manohar, Priyadarshan A.; Acharya, Sushil

    2016-01-01

    It is well known that interesting questions can stimulate thinking and invite participation. Class exercises are designed to make use of questions to engage students in active learning. In a project toward building a community skilled in software verification and validation (SV&V), we critically review and further develop course materials in…

  7. U.S.-MEXICO BORDER PROGRAM ARIZONA BORDER STUDY--STANDARD OPERATING PROCEDURE FOR PERFORMANCE OF COMPUTER SOFTWARE: VERIFICATION AND VALIDATION (UA-D-2.0)

    EPA Science Inventory

    The purpose of this SOP is to define the procedures used for the initial and periodic verification and validation of computer programs used during the Arizona NHEXAS project and the Border study. Keywords: Computers; Software; QA/QC.

    The U.S.-Mexico Border Program is sponsored ...

  8. A Visual Analytics Approach to Structured Data Analysis to Enhance Nonproliferation and Arms Control Verification Activities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gillen, David S.

    Analysis activities for Nonproliferation and Arms Control verification require the use of many types of data. Tabular structured data, such as Excel spreadsheets and relational databases, have traditionally been used for data mining activities, where specific queries are issued against data to look for matching results. The application of visual analytics tools to structured data enables further exploration of datasets to promote discovery of previously unknown results. This paper discusses the application of a specific visual analytics tool to datasets related to the field of Arms Control and Nonproliferation to promote the use of visual analytics more broadly in this domain. Visual analytics focuses on analytical reasoning facilitated by interactive visual interfaces (Wong and Thomas 2004). It promotes exploratory analysis of data and complements data mining technologies, in which known patterns are mined for; with a human analyst in the loop, it also brings domain knowledge and subject-matter expertise to bear. Visual analytics has not widely been applied to this domain. In this paper, we will focus on one type of data, structured data, and show the results of applying a specific visual analytics tool to answer questions in the Arms Control and Nonproliferation domain. We chose to use the T.Rex tool, a visual analytics tool developed at PNNL, which uses a variety of visual exploration patterns to discover relationships in structured datasets, including a facet view, graph view, matrix view, and timeline view. The facet view enables discovery of relationships between categorical information, such as countries and locations. The graph tool visualizes node-link relationship patterns, such as the flow of materials being shipped between parties. The matrix visualization shows highly correlated categories of information. The timeline view shows temporal patterns in data. In this paper, we will use T.Rex with two different datasets to demonstrate how interactive exploration of the data can aid an analyst with arms control and nonproliferation verification activities. Using a dataset from PIERS (PIERS 2014), we will show how container shipment imports and exports can aid an analyst in understanding the shipping patterns between two countries. We will also use T.Rex to examine a collection of research publications from the IAEA International Nuclear Information System (IAEA 2014) to discover collaborations of concern. We hope this paper will encourage the use of visual analytics for structured data analysis in the field of nonproliferation and arms control verification. Our paper outlines some of the challenges that exist before broad adoption of these kinds of tools can occur and offers next steps to overcome these challenges.

  9. Computer aided system engineering and analysis (CASE/A) modeling package for ECLS systems - An overview

    NASA Technical Reports Server (NTRS)

    Dalee, Robert C.; Bacskay, Allen S.; Knox, James C.

    1990-01-01

    An overview of the CASE/A-ECLSS series modeling package is presented. CASE/A is an analytical tool that has improved engineering productivity during ECLSS design activities. A component verification program was performed to assure the validity of the component models based on test data from the Phase II comparative test program completed at the Marshall Space Flight Center. An integrated plotting feature has been added to the program which allows the operator to analyze on-screen data trends or obtain hard-copy plots from within the CASE/A operating environment. New command features in the areas of schematic, output, and model management, and component data editing have been incorporated to enhance the engineer's productivity during a modeling program.

  10. Vector wind profile gust model

    NASA Technical Reports Server (NTRS)

    Adelfang, S. I.

    1981-01-01

    To enable development of a vector wind gust model suitable for orbital flight test operations and trade studies, hypotheses concerning the distributions of gust component variables were verified. Methods for verifying the hypotheses that observed gust variables, including gust component magnitude, gust length, u range, and L range, are gamma distributed are presented. The observed gust modulus is drawn from a bivariate gamma distribution that can be approximated with a Weibull distribution. Zonal and meridional gust components are bivariate gamma distributed. An analytical method for testing for bivariate gamma distributed variables is presented. Two distributions for gust modulus are described and the results of extensive hypothesis testing of one of the distributions are presented. The validity of the gamma distribution for representation of gust component variables is established.
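
    As a minimal illustration of the kind of hypothesis testing described above, the sketch below fits a gamma distribution to hypothetical gust magnitudes and applies a Kolmogorov-Smirnov test; it is not the report's own procedure, and the data are synthetic.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
# Hypothetical gust-component magnitudes (m/s); real data would come from measured wind profiles.
gust_magnitude = rng.gamma(shape=2.0, scale=3.0, size=500)

# Fit a two-parameter gamma distribution (location fixed at zero, as appropriate for a magnitude).
shape, loc, scale = stats.gamma.fit(gust_magnitude, floc=0.0)

# Kolmogorov-Smirnov test of the gamma hypothesis against the fitted distribution.
# Note: the p-value is only approximate when the parameters are estimated from the same data.
ks_stat, p_value = stats.kstest(gust_magnitude, "gamma", args=(shape, loc, scale))
print(f"fitted shape={shape:.2f}, scale={scale:.2f}, KS p-value={p_value:.3f}")
```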

  11. Towards the development of a rapid, portable, surface enhanced Raman spectroscopy based cleaning verification system for the drug nelarabine.

    PubMed

    Corrigan, Damion K; Salton, Neale A; Preston, Chris; Piletsky, Sergey

    2010-09-01

    Cleaning verification is a scientific and economic problem for the pharmaceutical industry. A large amount of potential manufacturing time is lost to the process of cleaning verification. This involves the analysis of residues on soiled manufacturing equipment, with high-performance liquid chromatography (HPLC) being the predominantly employed analytical technique. The aim of this study was to develop a portable cleaning verification system for nelarabine using surface enhanced Raman spectroscopy (SERS). SERS was conducted using a portable Raman spectrometer and a commercially available SERS substrate to develop a rapid and portable cleaning verification system for nelarabine. Samples of standard solutions and swab extracts were deposited onto the SERS active surfaces, allowed to dry and then subjected to spectroscopic analysis. Nelarabine was amenable to analysis by SERS and the necessary levels of sensitivity were achievable. It is possible to use this technology for a semi-quantitative limits test. Replicate precision, however, was poor due to the heterogeneous drying pattern of nelarabine on the SERS active surface. Understanding and improving the drying process in order to produce a consistent SERS signal for quantitative analysis is desirable. This work shows the potential application of SERS for cleaning verification analysis. SERS may not replace HPLC as the definitive analytical technique, but it could be used in conjunction with HPLC so that swabbing is only carried out once the portable SERS equipment has demonstrated that the manufacturing equipment is below the threshold contamination level.

  12. ENVIRONMENTAL TECHNOLOGICAL VERIFICATION REPORT - L2000 PCB/CHLORIDE ANALYZER - DEXSIL CORPORATION

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - ENVIROGARD PCB TEST KIT - STRATEGIC DIAGNOSTICS INC

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of Polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCBs in soil...

  14. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  15. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.

  16. A process improvement model for software verification and validation

    NASA Technical Reports Server (NTRS)

    Callahan, John; Sabolish, George

    1994-01-01

    We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and Space Station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.

  17. Stopping power and dose calculations with analytical and Monte Carlo methods for protons and prompt gamma range verification

    NASA Astrophysics Data System (ADS)

    Usta, Metin; Tufan, Mustafa Çağatay; Aydın, Güral; Bozkurt, Ahmet

    2018-07-01

    In this study, we have performed stopping power, depth-dose, and range verification calculations for proton beams using dielectric and Bethe-Bloch theories and the FLUKA, Geant4 and MCNPX Monte Carlo codes. In the analytical framework, the Drude model was applied within dielectric theory, and the effective charge approach with Roothaan-Hartree-Fock charge densities was used in Bethe theory. In the simulations, different setup parameters were selected to evaluate the performance of the three distinct Monte Carlo codes. Lung and breast tissues were investigated, as they are associated with some of the most common types of cancer worldwide. The results were compared with each other and with the available data in the literature. In addition, the obtained results were verified against prompt gamma range data. For both stopping power values and depth-dose distributions, the Monte Carlo values were found to give better results than the analytical ones; in terms of stopping power, the results that agree best with ICRU data are those of the effective charge approach among the analytical methods and of the FLUKA code among the MC packages. In the depth-dose distributions of the examined tissues, although the Bragg curves for the Monte Carlo codes almost overlap, the analytical ones show significant deviations that become more pronounced with increasing energy. Verification against the prompt gamma results was attempted for 100-200 MeV protons, which are regarded as important for proton therapy. The analytical results are within 2%-5% and the Monte Carlo values within 0%-2% of those of the prompt gammas.
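
    For reference, the textbook Bethe stopping-power expression underlying the analytical calculations mentioned above is shown below (quoted in its standard relativistic form in Gaussian units; any shell, Barkas or density-effect corrections used in the paper are not reproduced here):

```latex
% Relativistic Bethe stopping power for a projectile of charge ze and speed v = \beta c
% in a medium with electron density n_e and mean excitation energy I.
-\frac{dE}{dx} \;=\; \frac{4\pi n_e z^{2} e^{4}}{m_e v^{2}}
\left[ \ln\!\frac{2 m_e v^{2}}{I} \;-\; \ln\!\left(1-\beta^{2}\right) \;-\; \beta^{2} \right]
```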

  18. Thermal System Verification and Model Validation for NASA's Cryogenic Passively Cooled James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cleveland, Paul E.; Parrish, Keith A.

    2005-01-01

    A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed aperture optical telescope passively cooled to below 50 Kelvin along with a suite of several instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale or size of the observatory that allows for large radiator sizes that are compatible with the expected power dissipation of the instruments and large format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, these same large scale observatory features, which make passive cooling viable, also prevent the typical flight configuration fully-deployed thermal balance test that is the keystone to most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal balance tested in the facilities that currently exist. This reality, when combined with a mission thermal concept with little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan that has been developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross check utilizing different analysis teams and software packages, and finally, a comprehensive set of thermal tests that occur at different levels of JWST assembly. After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.

  19. Airside HVAC BESTEST: HVAC Air-Distribution System Model Test Cases for ASHRAE Standard 140

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Judkoff, Ronald; Neymark, Joel; Kennedy, Mike D.

    This paper summarizes recent work to develop new airside HVAC equipment model analytical verification test cases for ANSI/ASHRAE Standard 140, Standard Method of Test for the Evaluation of Building Energy Analysis Computer Programs. The analytical verification test method allows comparison of simulation results from a wide variety of building energy simulation programs with quasi-analytical solutions, further described below. Standard 140 is widely cited for evaluating software for use with performance-path energy efficiency analysis, in conjunction with well-known energy-efficiency standards including ASHRAE Standard 90.1, the International Energy Conservation Code, and other international standards. Airside HVAC Equipment is a common area of modelling not previously explicitly tested by Standard 140. Integration of the completed test suite into Standard 140 is in progress.

  20. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IMMUNOASSAY KIT, ENVIROLOGIX, INC., PCB IN SOIL TUBE ASSAY

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The purpose of this demonstration was to evaluate field analytical technologies capable of detecting and quantifying PCB's in soi...

  2. Large-Scale Interlaboratory Study to Develop, Analytically Validate and Apply Highly Multiplexed, Quantitative Peptide Assays to Measure Cancer-Relevant Proteins in Plasma*

    PubMed Central

    Abbatiello, Susan E.; Schilling, Birgit; Mani, D. R.; Zimmerman, Lisa J.; Hall, Steven C.; MacLean, Brendan; Albertolle, Matthew; Allen, Simon; Burgess, Michael; Cusack, Michael P.; Gosh, Mousumi; Hedrick, Victoria; Held, Jason M.; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kinsinger, Christopher R.; Lyssand, John; Makowski, Lee; Mesri, Mehdi; Rodriguez, Henry; Rudnick, Paul; Sadowski, Pawel; Sedransk, Nell; Shaddox, Kent; Skates, Stephen J.; Kuhn, Eric; Smith, Derek; Whiteaker, Jeffery R.; Whitwell, Corbin; Zhang, Shucha; Borchers, Christoph H.; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel C.; MacCoss, Michael J.; Neubert, Thomas A.; Paulovich, Amanda G.; Regnier, Fred E.; Tempst, Paul; Carr, Steven A.

    2015-01-01

    There is an increasing need in biology and clinical medicine to robustly and reliably measure tens to hundreds of peptides and proteins in clinical and biological samples with high sensitivity, specificity, reproducibility, and repeatability. Previously, we demonstrated that LC-MRM-MS with isotope dilution has suitable performance for quantitative measurements of small numbers of relatively abundant proteins in human plasma and that the resulting assays can be transferred across laboratories while maintaining high reproducibility and quantitative precision. Here, we significantly extend that earlier work, demonstrating that 11 laboratories using 14 LC-MS systems can develop, determine analytical figures of merit, and apply highly multiplexed MRM-MS assays targeting 125 peptides derived from 27 cancer-relevant proteins and seven control proteins to precisely and reproducibly measure the analytes in human plasma. To ensure consistent generation of high quality data, we incorporated a system suitability protocol (SSP) into our experimental design. The SSP enabled real-time monitoring of LC-MRM-MS performance during assay development and implementation, facilitating early detection and correction of chromatographic and instrumental problems. Low to subnanogram/ml sensitivity for proteins in plasma was achieved by one-step immunoaffinity depletion of 14 abundant plasma proteins prior to analysis. Median intra- and interlaboratory reproducibility was <20%, sufficient for most biological studies and candidate protein biomarker verification. Digestion recovery of peptides was assessed and quantitative accuracy improved using heavy-isotope-labeled versions of the proteins as internal standards. Using the highly multiplexed assay, participating laboratories were able to precisely and reproducibly determine the levels of a series of analytes in blinded samples used to simulate an interlaboratory clinical study of patient samples. Our study further establishes that LC-MRM-MS using stable isotope dilution, with appropriate attention to analytical validation and appropriate quality control measures, enables sensitive, specific, reproducible, and quantitative measurements of proteins and peptides in complex biological matrices such as plasma. PMID:25693799

  3. Experimental verification of a gain reduction model for the space charge effect in a wire chamber

    NASA Astrophysics Data System (ADS)

    Nagakura, Naoki; Fujii, Kazuki; Harayama, Isao; Kato, Yu; Sekiba, Daiichiro; Watahiki, Yumi; Yamashita, Satoru

    2018-01-01

    A wire chamber often suffers significant saturation of the multiplication factor when the electric field around its wires is strong. An analytical model of this effect has previously been proposed [Y. Arimoto et al., Nucl. Instrum. Meth. Phys. Res. A 799, 187 (2015)], in which the saturation was described by the multiplication factor, energy deposit density per wire length, and one constant parameter. In order to confirm the validity of this model, a multi-wire drift chamber was developed and irradiated by a MeV-range proton beam at the University of Tsukuba. The saturation effect was compared for energy deposits ranging from 70 keV/cm to 180 keV/cm and multiplication factors from 3×10^3 to 3×10^4. The chamber was rotated with respect to the proton beam in order to vary the space charge density around the wires. The energy deposit distribution corrected for the effect was consistent with the result of a Monte Carlo simulation, thus validating the proposed model.

  4. Method of analysis at the U.S. Geological Survey California Water Science Center, Sacramento Laboratory - determination of haloacetic acid formation potential, method validation, and quality-control practices

    USGS Publications Warehouse

    Zazzi, Barbara C.; Crepeau, Kathryn L.; Fram, Miranda S.; Bergamaschi, Brian A.

    2005-01-01

    An analytical method for the determination of haloacetic acid formation potential of water samples has been developed by the U.S. Geological Survey California Water Science Center Sacramento Laboratory. The haloacetic acid formation potential is measured by dosing water samples with chlorine under specified conditions of pH, temperature, incubation time, darkness, and residual-free chlorine. The haloacetic acids formed are bromochloroacetic acid, bromodichloroacetic acid, dibromochloroacetic acid, dibromoacetic acid, dichloroacetic acid, monobromoacetic acid, monochloroacetic acid, tribromoacetic acid, and trichloroacetic acid. They are extracted, methylated, and then analyzed using a gas chromatograph equipped with an electron capture detector. Method validation experiments were performed to determine the method accuracy, precision, and detection limit for each of the compounds. Method detection limits for these nine haloacetic acids ranged from 0.11 to 0.45 microgram per liter. Quality-control practices include the use of blanks, quality-control samples, calibration verification standards, surrogate recovery, internal standard, matrix spikes, and duplicates.

  5. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  6. Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications

    NASA Technical Reports Server (NTRS)

    Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.

    1990-01-01

    The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.

  7. Comparison of measurement methods with a mixed effects procedure accounting for replicated evaluations (COM3PARE): method comparison algorithm implementation for head and neck IGRT positional verification.

    PubMed

    Roy, Anuradha; Fuller, Clifton D; Rosenthal, David I; Thomas, Charles R

    2015-08-28

    Comparison of imaging measurement devices in the absence of a gold-standard comparator remains a vexing problem, especially in scenarios where multiple, non-paired, replicated measurements occur, as in image-guided radiotherapy (IGRT). As the growing number of commercially available IGRT systems makes it challenging to determine whether different IGRT methods may be used interchangeably, there is an unmet need for a conceptually parsimonious and statistically robust method to evaluate the agreement between two methods with replicated observations. Consequently, we sought to determine, using a previously reported head and neck positional verification dataset, the feasibility and utility of a Comparison of Measurement Methods with the Mixed Effects Procedure Accounting for Replicated Evaluations (COM3PARE), a unified conceptual schema and analytic algorithm based upon Roy's linear mixed effects (LME) model with a Kronecker product covariance structure in a doubly multivariate set-up, for IGRT method comparison. An anonymized dataset consisting of 100 paired coordinate (X/Y/Z) measurements from a sequential series of head and neck cancer patients imaged near-simultaneously with cone beam CT (CBCT) and kilovoltage X-ray (KVX) imaging was used for model implementation. Software-suggested CBCT and KVX shifts for the lateral (X), vertical (Y) and longitudinal (Z) dimensions were evaluated for bias, inter-method (between-subject) variation, intra-method (within-subject) variation, and overall agreement using a script implementing COM3PARE with the MIXED procedure of the statistical software package SAS (SAS Institute, Cary, NC, USA). COM3PARE showed that a statistically significant bias and a difference in inter-method agreement between CBCT and KVX were present in the Z-axis (both p-values < 0.01). Intra-method and overall agreement differences were statistically significant for both the X- and Z-axes (all p-values < 0.01). Using pre-specified criteria based on intra-method agreement, CBCT was deemed preferable for X-axis positional verification, with KVX preferred for superoinferior alignment. The COM3PARE methodology was validated as feasible and useful in this pilot head and neck cancer positional verification dataset. COM3PARE represents a flexible and robust standardized analytic methodology for IGRT comparison. The implemented SAS script is included to encourage other groups to apply COM3PARE to other anatomic sites or IGRT platforms.
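
    As a rough illustration of the underlying idea (the full COM3PARE analysis uses Roy's doubly multivariate LME model with a Kronecker product covariance and was implemented in SAS PROC MIXED), the hedged sketch below fits a simple random-intercept mixed model to hypothetical replicated Z-axis shifts from two methods; all data and variable names are invented.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(2)
patients = np.repeat(np.arange(20), 2 * 5)             # 20 patients, 2 methods, 5 replicates each
method = np.tile(np.repeat(["CBCT", "KVX"], 5), 20)
true_shift = np.repeat(rng.normal(0.0, 2.0, 20), 10)   # patient-specific true Z shift (mm)
bias = np.where(method == "KVX", 0.5, 0.0)             # hypothetical systematic inter-method bias
z_shift = true_shift + bias + rng.normal(0.0, 1.0, patients.size)

df = pd.DataFrame({"patient": patients, "method": method, "z_shift": z_shift})

# Random intercept per patient; the fixed effect for 'method' estimates the inter-method bias.
model = smf.mixedlm("z_shift ~ method", data=df, groups=df["patient"])
result = model.fit()
print(result.summary())
```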

  8. Annual verifications--a tick-box exercise?

    PubMed

    Walker, Gwen; Williams, David

    2014-09-01

    With the onus on healthcare providers and their staff to protect patients against all elements of 'avoidable harm' perhaps never greater, Gwen Walker, a highly experienced infection prevention control nurse specialist, and David Williams, MD of Approved Air, who has 30 years' experience in validation and verification of ventilation and ultraclean ventilation systems, examine changing requirements for, and trends in, operating theatre ventilation. Validation and verification reporting on such vital HVAC equipment should not, they argue, merely be viewed as a 'tick-box exercise'; it should instead 'comprehensively inform key stakeholders, and ultimately form part of clinical governance, thus protecting those ultimately named responsible for organisation-wide safety at Trust board level'.

  9. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions on the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification Tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that, for the security analysis of network and device S&D Patterns, relevant complementary approaches exist and can be used.

  10. Validation of Mission Plans Through Simulation

    NASA Astrophysics Data System (ADS)

    St-Pierre, J.; Melanson, P.; Brunet, C.; Crabtree, D.

    2002-01-01

    The purpose of a spacecraft mission planning system is to automatically generate safe and optimized mission plans for a single spacecraft, or several functioning in unison. The system verifies user input syntax, conformance to commanding constraints, absence of duty cycle violations, timing conflicts, state conflicts, etc. Present day constraint-based systems with state-based predictive models use verification rules derived from expert knowledge. A familiar solution found in Mission Operations Centers is to complement the planning system with a high fidelity spacecraft simulator. Often a dedicated workstation, the simulator is frequently used for operator training and procedure validation, and may be interfaced to actual control stations with command and telemetry links. While there are distinct advantages to having a planning system offer realistic operator training using the actual flight control console, physical verification of data transfer across layers and procedure validation, experience has revealed some drawbacks and inefficiencies in ground segment operations: With these considerations, two simulation-based mission plan validation projects are under way at the Canadian Space Agency (CSA): RVMP and ViSION. The tools proposed in these projects will automatically run scenarios and provide execution reports to operations planning personnel, prior to actual command upload. This can provide an important safeguard for system or human errors that can only be detected with high fidelity, interdependent spacecraft models running concurrently. The core element common to these projects is a spacecraft simulator, built with off-the-shelf components such as CAE's Real-Time Object-Based Simulation Environment (ROSE) technology, MathWorks' MATLAB/Simulink, and Analytical Graphics' Satellite Tool Kit (STK). To complement these tools, additional components were developed, such as an emulated Spacecraft Test and Operations Language (STOL) interpreter and CCSDS TM/TC encoders and decoders. This paper discusses the use of simulation in the context of space mission planning, describes the projects under way and proposes additional venues of investigation and development.

  11. A Multiplexed Serum Biomarker Immunoassay Panel Discriminates Clinical Lung Cancer Patients from High-Risk Individuals Found to be Cancer-Free by CT Screening

    PubMed Central

    Bigbee, William L.; Gopalakrishnan, Vanathi; Weissfeld, Joel L.; Wilson, David O.; Dacic, Sanja; Lokshin, Anna E.; Siegfried, Jill M.

    2012-01-01

    Introduction: Clinical decision-making in the setting of CT screening could benefit from accessible biomarkers that help predict the level of lung cancer risk in high-risk individuals with indeterminate pulmonary nodules. Methods: To identify candidate serum biomarkers, we measured 70 cancer-related proteins by Luminex xMAP® multiplexed immunoassays in a training set of sera from 56 patients with biopsy-proven primary non-small cell lung cancer and 56 age-, sex- and smoking-matched CT-screened controls. Results: We identified a panel of 10 serum biomarkers – prolactin, transthyretin, thrombospondin-1, E-selectin, C-C motif chemokine 5, macrophage migration inhibitory factor, plasminogen activator inhibitor, receptor tyrosine-protein kinase, Cyfra 21.1, and serum amyloid A – that distinguished lung cancer from controls with an estimated balanced accuracy (average of sensitivity and specificity) of 76.0%±3.8% from 20-fold internal cross-validation. We then iteratively evaluated this model in independent test and verification case/control studies, confirming the initial classification performance of the panel. The classification performance of the 10-biomarker panel was also analytically validated using ELISAs in a second independent case/control population, further validating the robustness of the panel. Conclusions: The performance of this 10-biomarker panel-based model was 77.1% sensitivity/76.2% specificity in cross-validation in the expanded training set, and 73.3% sensitivity/93.3% specificity (balanced accuracy 83.3%) in the blinded verification set, with the best discriminative performance in Stage I/II cases: 85% sensitivity (balanced accuracy 89.2%). Importantly, the rate of misclassification of CT-screened controls was not different in most control subgroups with or without airflow obstruction or emphysema or pulmonary nodules. These biomarkers have potential to aid in the early detection of lung cancer and more accurate interpretation of indeterminate pulmonary nodules detected by screening CT. PMID:22425918
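
    The balanced accuracy figures quoted above follow directly from the stated sensitivity and specificity; a minimal check of that arithmetic is sketched below.

```python
def balanced_accuracy(sensitivity, specificity):
    """Balanced accuracy as used above: the average of sensitivity and specificity."""
    return (sensitivity + specificity) / 2.0

# Blinded verification set figures quoted in the abstract.
print(balanced_accuracy(0.733, 0.933))   # -> 0.833, i.e. 83.3%
```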

  12. Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation & Uncertainty Quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Tsao, Jeffrey Y.; Trucano, Timothy G.; Kleban, Stephen D.

    This report contains the written footprint of a Sandia-hosted workshop held in Albuquerque, New Mexico, June 22-23, 2016 on “Complex Systems Models and Their Applications: Towards a New Science of Verification, Validation and Uncertainty Quantification,” as well as of pre-work that fed into the workshop. The workshop’s intent was to explore and begin articulating research opportunities at the intersection between two important Sandia communities: the complex systems (CS) modeling community, and the verification, validation and uncertainty quantification (VVUQ) community. The overarching research opportunity (and challenge) that we ultimately hope to address is: how can we quantify the credibility of knowledge gained from complex systems models, knowledge that is often incomplete and interim, but will nonetheless be used, sometimes in real-time, by decision makers?

  13. Fuzzy Logic Controller Stability Analysis Using a Satisfiability Modulo Theories Approach

    NASA Technical Reports Server (NTRS)

    Arnett, Timothy; Cook, Brandon; Clark, Matthew A.; Rattan, Kuldip

    2017-01-01

    While many widely accepted methods and techniques exist for validation and verification of traditional controllers, at this time no solutions have been accepted for Fuzzy Logic Controllers (FLCs). Due to the highly nonlinear nature of such systems, and the fact that developing a valid FLC does not require a mathematical model of the system, it is quite difficult to use conventional techniques to prove controller stability. Since safety-critical systems must be tested and verified to work as expected under all possible circumstances, the fact that FLCs cannot be tested to achieve such requirements poses limitations on the applications of this technology. Therefore, alternative methods for verification and validation of FLCs need to be explored. In this study, a novel approach using formal verification methods to ensure the stability of a FLC is proposed. Main research challenges include specification of requirements for a complex system, conversion of a traditional FLC to a piecewise polynomial representation, and using a formal verification tool in a nonlinear solution space. Using the proposed architecture, the Fuzzy Logic Controller was found to always generate negative feedback, but the analysis was inconclusive for Lyapunov stability.
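
    The general flavor of such an approach (encode a piecewise representation of the controller, then ask an SMT solver whether the negation of a property is satisfiable) can be illustrated with the sketch below. It uses the z3 Python bindings on a toy piecewise-linear surrogate controller; it is not the authors' tool chain, requirement set, or FLC.

```python
from z3 import Real, If, Solver, And, unsat

e = Real("e")                     # tracking error fed to the controller
# A toy piecewise-linear surrogate for a defuzzified FLC output: saturating and odd-symmetric.
u = If(e > 1, -1.0, If(e < -1, 1.0, -e))

s = Solver()
s.add(And(e >= -5, e <= 5))       # operating range of interest
s.add(e * u > 0)                  # negation of the "negative feedback" property e*u <= 0

result = s.check()
if result == unsat:
    print("Property holds: the controller output always opposes the error.")
else:
    print("Property violated or undecided:", result)
```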

  14. A methodology for the rigorous verification of plasma simulation codes

    NASA Astrophysics Data System (ADS)

    Riva, Fabio

    2016-10-01

    The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue targeted at assessing that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted at assessing that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
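
    A minimal sketch of the solution-verification step mentioned above (Richardson extrapolation on three systematically refined grids) is given below; the functional values are invented and the snippet is illustrative rather than the GBS implementation.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, refinement_ratio):
    """Observed order of accuracy from solutions on three systematically refined grids."""
    return math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(refinement_ratio)

def richardson_extrapolate(f_medium, f_fine, refinement_ratio, order):
    """Estimate of the grid-converged value and of the discretization error on the fine grid."""
    f_exact = f_fine + (f_fine - f_medium) / (refinement_ratio**order - 1.0)
    return f_exact, abs(f_exact - f_fine)

# Hypothetical functional values from grids with spacing h, h/2, h/4 (nominally second-order scheme).
f_h, f_h2, f_h4 = 1.040, 1.010, 1.0025
p = observed_order(f_h, f_h2, f_h4, refinement_ratio=2.0)
f_star, err = richardson_extrapolate(f_h2, f_h4, 2.0, p)
print(f"observed order p = {p:.2f}, extrapolated value = {f_star:.4f}, fine-grid error = {err:.2e}")
```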

  15. VERTPAK1. Code Verification Analytic Solution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Golis, M.J.

    1983-04-01

    VERTPAK1 is a package of analytical solutions used in verification of numerical codes that simulate fluid flow, rock deformation, and solute transport in fractured and unfractured porous media. VERTPAK1 contains the following: BAREN, an analytical solution developed by Barenblatt, Zheltov and Kochina (1960) for describing transient flow to a well penetrating a (double porosity) confined aquifer; GIBMAC, an analytical solution developed by McNamee and Gibson (1960) for describing consolidation of a semi-infinite soil medium subject to a strip (plane strain) or cylindrical (axisymmetric) loading; GRINRH, an analytical solution developed by Gringarten (1971) for describing transient flow to a partially penetrating well in a confined aquifer containing a single horizontal fracture; GRINRV, an analytical solution developed by Gringarten, Ramey, and Raghavan (1974) for describing transient flow to a fully penetrating well in a confined aquifer containing a single vertical fracture; HART, an analytical solution given by Nowacki (1962) and implemented by HART (1981) for describing the elastic behavior of an infinite solid subject to a line heat source; LESTER, an analytical solution presented by Lester, Jansen, and Burkholder (1975) for describing one-dimensional transport of radionuclide chains through an adsorbing medium; STRELT, an analytical solution presented by Streltsova-Adams (1978) for describing transient flow to a fully penetrating well in a (double porosity) confined aquifer; and TANG, an analytical solution developed by Tang, Frind, and Sudicky (1981) for describing solute transport in a porous medium containing a single fracture.

  16. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  17. Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.

    ERIC Educational Resources Information Center

    Kaya, Azmi

    1982-01-01

    Discusses analytical design and experimental verification of a PID controller for a temperature-controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…
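
    As context for readers unfamiliar with the control law involved, a minimal discrete PID loop is sketched below against a toy first-order thermal plant; the gains, plant constants, and setpoint are invented and are not taken from the referenced work.

```python
# Minimal discrete PID loop driving a first-order thermal plant toward a temperature setpoint.
kp, ki, kd = 2.0, 0.5, 0.1
dt, setpoint = 0.1, 50.0          # time step (s) and setpoint (degrees C)
temperature, integral, prev_error = 20.0, 0.0, setpoint - 20.0

for step in range(200):
    error = setpoint - temperature
    integral += error * dt
    derivative = (error - prev_error) / dt
    heater_power = kp * error + ki * integral + kd * derivative
    heater_power = max(0.0, min(100.0, heater_power))      # actuator saturation
    # Toy first-order plant: heating proportional to power, loss proportional to excess temperature.
    temperature += dt * (0.5 * heater_power - 0.2 * (temperature - 20.0))
    prev_error = error

print(f"temperature after {200 * dt:.0f} s: {temperature:.1f} C")
```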

  18. Commissioning of a grid-based Boltzmann solver for cervical cancer brachytherapy treatment planning with shielded colpostats.

    PubMed

    Mikell, Justin K; Klopp, Ann H; Price, Michael; Mourtada, Firas

    2013-01-01

    We sought to commission a gynecologic shielded colpostat analytic model provided in a treatment planning system (TPS) library, and we report retrospectively the dosimetric impact of this applicator model in a cohort of patients. A commercial TPS with a grid-based Boltzmann solver (GBBS) was commissioned for (192)Ir high-dose-rate (HDR) brachytherapy for cervical cancer with stainless steel-shielded colpostats. The colpostat analytic model was verified using a radiograph and vendor schematics. MCNPX v2.6 Monte Carlo simulations were performed to compare dose distributions around the applicator in water with the TPS GBBS dose predictions. The dosimetric impact was assessed retrospectively over 24 cervical cancer patients' HDR plans. The applicator (TPS ID #AL13122005) shield dimensions were within 0.4 mm of the independent verification of the shield dimensions. GBBS profiles in planes bisecting the cap around the applicator agreed with Monte Carlo simulations within 2% at most locations; differing screw representations resulted in differences of up to 9%. For the retrospective study, the GBBS doses differed from TG-43 as follows (mean value ± standard deviation [min, max]): International Commission on Radiation Units [ICRU] rectum point (-8.4 ± 2.5% [-14.1, -4.1%]), ICRU bladder point (-7.2 ± 3.6% [-15.7, -2.1%]), D2cc-rectum (-6.2 ± 2.6% [-11.9, -0.8%]), D2cc-sigmoid (-5.6 ± 2.6% [-9.3, -2.0%]), and D2cc-bladder (-3.4 ± 1.9% [-7.2, -1.1%]). As brachytherapy TPSs implement advanced model-based dose calculations, the analytic applicator models stored in TPSs should be independently validated before clinical use. For this cohort, clinically meaningful differences (>5%) from TG-43 were observed. Accurate dosimetric modeling of shielded applicators may help to refine organ toxicity studies. Copyright © 2013 American Brachytherapy Society. Published by Elsevier Inc. All rights reserved.

  19. Comparison of detection limits in environmental analysis--is it possible? An approach on quality assurance in the lower working range by verification.

    PubMed

    Geiss, S; Einax, J W

    2001-07-01

    Detection limit, reporting limit and limit of quantitation are analytical parameters which describe the power of analytical methods. These parameters are used for internal quality assurance and externally for comparison, especially in the case of trace analysis in environmental compartments. The wide variety of possibilities for computing or obtaining these measures in the literature and in legislative rules makes any comparison difficult. Additionally, a host of terms have been used within the analytical community to describe detection and quantitation capabilities. Without trying to create an order for the variety of terms, this paper aims to provide a practical proposal for answering the analysts' main questions concerning the quality measures above. These main questions and related parameters are explained and graphically demonstrated. Estimation and verification of these parameters are the two steps needed to obtain realistic measures. A rule for practical verification is given in a table, where the analyst can read out what to measure, what to estimate and which criteria have to be fulfilled. In this manner, the verified detection limit, reporting limit and limit of quantitation become comparable, and the analyst is responsible for the unambiguity and reliability of these measures.
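
    One common way to carry out the estimation-then-verification sequence described above (hedged: the specific rules proposed in the paper are not reproduced here) is to estimate the limits from a calibration line and then verify them with replicate measurements near the quantitation limit, for example:

        import numpy as np

        # Calibration standards (concentration in ug/L vs. instrument response); illustrative values.
        conc = np.array([0.0, 0.5, 1.0, 2.0, 5.0, 10.0])
        resp = np.array([0.02, 0.55, 1.04, 2.10, 5.05, 10.20])

        slope, intercept = np.polyfit(conc, resp, 1)
        resid_sd = np.sqrt(np.sum((resp - (slope * conc + intercept)) ** 2) / (len(conc) - 2))

        # One widely used convention (ICH-style): LOD = 3.3*s/slope, LOQ = 10*s/slope.
        lod = 3.3 * resid_sd / slope
        loq = 10.0 * resid_sd / slope
        print(f"estimated LOD = {lod:.2f} ug/L, LOQ = {loq:.2f} ug/L")

        # Verification step: replicates of a standard spiked near the LOQ should be
        # recovered with a relative standard deviation below a preset acceptance limit.
        replicates = loq * np.array([0.92, 1.05, 0.98, 1.10, 0.95])
        rsd = replicates.std(ddof=1) / replicates.mean() * 100
        print(f"verification RSD at the LOQ: {rsd:.1f}% (accept if below, e.g., 20%)")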

  20. MRM for the verification of cancer biomarker proteins: recent applications to human plasma and serum.

    PubMed

    Chambers, Andrew G; Percy, Andrew J; Simon, Romain; Borchers, Christoph H

    2014-04-01

    Accurate cancer biomarkers are needed for early detection, disease classification, prediction of therapeutic response and monitoring treatment. While there appears to be no shortage of candidate biomarker proteins, a major bottleneck in the biomarker pipeline continues to be their verification by enzyme linked immunosorbent assays. Multiple reaction monitoring (MRM), also known as selected reaction monitoring, is a targeted mass spectrometry approach to protein quantitation and is emerging to bridge the gap between biomarker discovery and clinical validation. Highly multiplexed MRM assays are readily configured and enable simultaneous verification of large numbers of candidates facilitating the development of biomarker panels which can increase specificity. This review focuses on recent applications of MRM to the analysis of plasma and serum from cancer patients for biomarker verification. The current status of this approach is discussed along with future directions for targeted mass spectrometry in clinical biomarker validation.

  1. Expert system verification and validation study. Phase 2: Requirements Identification. Delivery 2: Current requirements applicability

    NASA Technical Reports Server (NTRS)

    1991-01-01

    The second phase of a task is described which has the ultimate purpose of ensuring that adequate Expert Systems (ESs) Verification and Validation (V and V) tools and techniques are available for Space Station Freedom Program Knowledge Based Systems development. The purpose of this phase is to recommend modifications to current software V and V requirements which will extend the applicability of the requirements to NASA ESs.

  2. The 2014 Sandia Verification and Validation Challenge: Problem statement

    DOE PAGES

    Hu, Kenneth; Orient, George

    2016-01-18

    This paper presents a case study in utilizing information from experiments, models, and verification and validation (V&V) to support a decision. It consists of a simple system with data and models provided, plus a safety requirement to assess. The goal is to pose a problem that is flexible enough to allow challengers to demonstrate a variety of approaches, but constrained enough to focus attention on a theme. This was accomplished by providing a good deal of background information in addition to the data, models, and code, but directing the participants' activities with specific deliverables. In this challenge, the theme is how to gather and present evidence about the quality of model predictions, in order to support a decision. This case study formed the basis of the 2014 Sandia V&V Challenge Workshop and this resulting special edition of the ASME Journal of Verification, Validation, and Uncertainty Quantification.

  3. Verification and Validation Methodology of Real-Time Adaptive Neural Networks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola

    2004-01-01

    Recent research has shown that adaptive neural-network-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in recent adaptive flight control systems, to evaluate the performance of the online-trained neural networks. The tools will help in certification by the FAA and will help in the successful deployment of neural-network-based adaptive controllers in safety-critical applications. The process to perform verification and validation is evaluated against a typical neural adaptive controller and the results are discussed.

  4. Space Weather Models and Their Validation and Verification at the CCMC

    NASA Technical Reports Server (NTRS)

    Hesse, Michael

    2010-01-01

    The Community Coordinated Modeling Center (CCMC) is a US multi-agency activity with a dual mission. With equal emphasis, CCMC strives to provide science support to the international space research community through the execution of advanced space plasma simulations, and it endeavors to support the space weather needs of the US and partners. Space weather support involves a broad spectrum, from designing robust forecasting systems and transitioning them to forecasters, to providing space weather updates and forecasts to NASA's robotic mission operators. All of these activities have to rely on validation and verification of models and their products, so users and forecasters have the means to assign confidence levels to the space weather information. In this presentation, we provide an overview of space weather models resident at CCMC, as well as of validation and verification activities undertaken at CCMC or through the use of CCMC services.

  5. Draft Plan for Characterizing Commercial Data Products in Support of Earth Science Research

    NASA Technical Reports Server (NTRS)

    Ryan, Robert E.; Terrie, Greg; Berglund, Judith

    2006-01-01

    This presentation introduces a draft plan for characterizing commercial data products for Earth science research. The general approach to commercial product verification and validation includes focused selection of readily available commercial remote sensing products that support Earth science research. Ongoing product verification and characterization will question whether the product meets specifications and will examine its fundamental properties, potential and limitations. Validation will encourage product evaluation for specific science research and applications. Specific commercial products included in the characterization plan include high-spatial-resolution multispectral (HSMS) imagery and LIDAR data products. Future efforts in this process will include briefing NASA Headquarters and modifying plans based on feedback, increased engagement with the science community and refinement of details, coordination with commercial vendors and The Joint Agency Commercial Imagery Evaluation (JACIE) for HSMS satellite acquisitions, acquiring waveform LIDAR data and performing verification and validation.

  6. Verification Assessment of Flow Boundary Conditions for CFD Analysis of Supersonic Inlet Flows

    NASA Technical Reports Server (NTRS)

    Slater, John W.

    2002-01-01

    Boundary conditions for subsonic inflow, bleed, and subsonic outflow as implemented into the WIND CFD code are assessed with respect to verification for steady and unsteady flows associated with supersonic inlets. Verification procedures include grid convergence studies and comparisons to analytical data. The objective is to examine errors, limitations, capabilities, and behavior of the boundary conditions. Computational studies were performed on configurations derived from a "parameterized" supersonic inlet. These include steady supersonic flows with normal and oblique shocks, steady subsonic flow in a diffuser, and unsteady flow with the propagation and reflection of an acoustic disturbance.
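
    For the grid convergence part of such an assessment, a standard verification calculation is the observed order of accuracy and Richardson-extrapolated value from three systematically refined grids. The sketch below shows that arithmetic with placeholder numbers; it is a generic verification recipe, not output from the WIND studies.

        import math

        # A scalar result on coarse, medium, and fine grids with constant refinement ratio r.
        f_coarse, f_medium, f_fine = 1.0450, 1.0212, 1.0153
        r = 2.0

        # Observed order of accuracy from the ratio of successive solution changes.
        p = math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

        # Richardson-extrapolated estimate of the grid-converged value.
        f_extrapolated = f_fine + (f_fine - f_medium) / (r**p - 1.0)
        print(f"observed order ~ {p:.2f}, extrapolated value ~ {f_extrapolated:.4f}")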

  7. Laboratory study of low-β forces in arched, line-tied magnetic flux ropes

    NASA Astrophysics Data System (ADS)

    Myers, C. E.; Yamada, M.; Ji, H.; Yoo, J.; Jara-Almonte, J.; Fox, W.

    2016-11-01

    The loss-of-equilibrium is a solar eruption mechanism whereby a sudden breakdown of the magnetohydrodynamic force balance in the Sun's corona ejects a massive burst of particles and energy into the heliosphere. Predicting a loss-of-equilibrium, which has more recently been formulated as the torus instability, relies on a detailed understanding of the various forces that hold the pre-eruption magnetic flux rope in equilibrium. Traditionally, idealized analytical force expressions are used to derive simplified eruption criteria that can be compared to solar observations and modeling. What is missing, however, is a validation that these idealized analytical force expressions can be applied to the line-tied, low-aspect-ratio conditions of the corona. In this paper, we address this shortcoming by using a laboratory experiment to study the forces that act on long-lived, arched, line-tied magnetic flux ropes. Three key force terms are evaluated over a wide range of experimental conditions: (1) the upward hoop force; (2) the downward strapping force; and (3) the downward toroidal field tension force. First, the laboratory force measurements show that, on average, the three aforementioned force terms cancel to produce a balanced line-tied equilibrium. This finding validates the laboratory force measurement techniques developed here, which were recently used to identify a dynamic toroidal field tension force that can prevent flux rope eruptions [Myers et al., Nature 528, 526 (2015)]. The verification of magnetic force balance also confirms the low-β assumption that the plasma thermal pressure is negligible in these experiments. Next, the measured force terms are directly compared to corresponding analytical expressions. While the measured and analytical forces are found to be well correlated, the low-aspect-ratio, line-tied conditions in the experiment are found to both reduce the measured hoop force and increase the measured tension force with respect to analytical expectations. These two co-directed effects combine to generate laboratory flux rope equilibria at lower altitudes than are predicted analytically. Such considerations are expected to modify the loss-of-equilibrium eruption criteria for analogous flux ropes in the solar corona.
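
    For readers unfamiliar with the idealized expressions being tested, the hoop force on a circular current ring is often written in the textbook (Shafranov-type) form F = mu0*I^2/(4*pi*R)*[ln(8R/a) - 1 + li/2]. The sketch below simply evaluates that standard expression with laboratory-scale placeholder values; it is not the exact formulation or parameters used in the paper.

        import math

        MU0 = 4.0 * math.pi * 1e-7  # vacuum permeability, H/m

        def hoop_force(current, major_radius, minor_radius, internal_inductance=0.5):
            """Textbook hoop force of a circular current ring (Shafranov-type expression)."""
            return (MU0 * current**2 / (4.0 * math.pi * major_radius)
                    * (math.log(8.0 * major_radius / minor_radius) - 1.0 + internal_inductance / 2.0))

        # Placeholder laboratory-scale values: 20 kA rope current, R = 0.3 m, a = 0.05 m.
        print(f"hoop force ~ {hoop_force(2.0e4, 0.3, 0.05):.0f} N")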

  8. Laboratory study of low- β forces in arched, line-tied magnetic flux ropes

    DOE PAGES

    Myers, C. E.; Yamada, M.; Ji, H.; ...

    2016-11-04

    Here, the loss-of-equilibrium is a solar eruption mechanism whereby a sudden breakdown of the magnetohydrodynamic force balance in the Sun's corona ejects a massive burst of particles and energy into the heliosphere. Predicting a loss-of-equilibrium, which has more recently been formulated as the torus instability, relies on a detailed understanding of the various forces that hold the pre-eruption magnetic flux rope in equilibrium. Traditionally, idealized analytical force expressions are used to derive simplified eruption criteria that can be compared to solar observations and modeling. What is missing, however, is a validation that these idealized analytical force expressions can be applied to the line-tied, low-aspect-ratio conditions of the corona. In this paper, we address this shortcoming by using a laboratory experiment to study the forces that act on long-lived, arched, line-tied magnetic flux ropes. Three key force terms are evaluated over a wide range of experimental conditions: (1) the upward hoop force; (2) the downward strapping force; and (3) the downward toroidal field tension force. First, the laboratory force measurements show that, on average, the three aforementioned force terms cancel to produce a balanced line-tied equilibrium. This finding validates the laboratory force measurement techniques developed here, which were recently used to identify a dynamic toroidal field tension force that can prevent flux rope eruption. The verification of magnetic force balance also confirms the low-beta assumption that the plasma thermal pressure is negligible in these experiments. Next, the measured force terms are directly compared to corresponding analytical expressions. While the measured and analytical forces are found to be well correlated, the low-aspect-ratio, line-tied conditions in the experiment are found to both reduce the measured hoop force and increase the measured tension force with respect to analytical expectations. These two co-directed effects combine to generate laboratory flux rope equilibria at lower altitudes than are predicted analytically. Such considerations are expected to modify the loss-of-equilibrium eruption criteria for analogous flux ropes in the solar corona.

  9. Laboratory study of low- β forces in arched, line-tied magnetic flux ropes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Myers, C. E.; Yamada, M.; Ji, H.

    Here, the loss-of-equilibrium is a solar eruption mechanism whereby a sudden breakdown of the magnetohydrodynamic force balance in the Sun's corona ejects a massive burst of particles and energy into the heliosphere. Predicting a loss-of-equilibrium, which has more recently been formulated as the torus instability, relies on a detailed understanding of the various forces that hold the pre-eruption magnetic flux rope in equilibrium. Traditionally, idealized analytical force expressions are used to derive simplified eruption criteria that can be compared to solar observations and modeling. What is missing, however, is a validation that these idealized analytical force expressions can be applied to the line-tied, low-aspect-ratio conditions of the corona. In this paper, we address this shortcoming by using a laboratory experiment to study the forces that act on long-lived, arched, line-tied magnetic flux ropes. Three key force terms are evaluated over a wide range of experimental conditions: (1) the upward hoop force; (2) the downward strapping force; and (3) the downward toroidal field tension force. First, the laboratory force measurements show that, on average, the three aforementioned force terms cancel to produce a balanced line-tied equilibrium. This finding validates the laboratory force measurement techniques developed here, which were recently used to identify a dynamic toroidal field tension force that can prevent flux rope eruption. The verification of magnetic force balance also confirms the low-beta assumption that the plasma thermal pressure is negligible in these experiments. Next, the measured force terms are directly compared to corresponding analytical expressions. While the measured and analytical forces are found to be well correlated, the low-aspect-ratio, line-tied conditions in the experiment are found to both reduce the measured hoop force and increase the measured tension force with respect to analytical expectations. These two co-directed effects combine to generate laboratory flux rope equilibria at lower altitudes than are predicted analytically. Such considerations are expected to modify the loss-of-equilibrium eruption criteria for analogous flux ropes in the solar corona.

  10. Using formal methods for content validation of medical procedure documents.

    PubMed

    Cota, Érika; Ribeiro, Leila; Bezerra, Jonas Santos; Costa, Andrei; da Silva, Rosiana Estefane; Cota, Gláucia

    2017-08-01

    We propose the use of a formal approach to support content validation of a standard operating procedure (SOP) for a therapeutic intervention. Such an approach provides a useful tool to identify ambiguities, omissions and inconsistencies, and improves the applicability and efficacy of documents in health settings. We apply and evaluate a methodology originally proposed for the verification of software specification documents to a specific SOP. The verification methodology uses the graph formalism to model the document. Semi-automatic analysis identifies possible problems in the model and in the original document. The verification is an iterative process that identifies possible faults in the original text that should be revised by its authors and/or specialists. The proposed method was able to identify 23 possible issues in the original document (ambiguities, omissions, redundant information, and inaccuracies, among others). The formal verification process helped the specialists consider a wider range of usage scenarios and identify which instructions form the kernel of the proposed SOP and which ones represent additional or required knowledge that is mandatory for the correct application of the medical document. By using the proposed verification process, a simpler and yet more complete SOP could be produced. As a consequence, during the validation process the experts received a more mature document and could focus on the technical aspects of the procedure itself. Copyright © 2017 Elsevier B.V. All rights reserved.
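
    To make the general idea concrete (this is a toy illustration, not the authors' formalism or tool), a procedure document can be modeled as a directed graph of steps, after which unreachable steps suggest omissions and cycles suggest ambiguous flow:

        import networkx as nx

        # Steps of a hypothetical procedure; the names are invented for illustration.
        g = nx.DiGraph()
        g.add_nodes_from(["start", "prepare dose", "check allergy history",
                          "administer", "observe", "end"])
        g.add_edges_from([
            ("start", "prepare dose"),
            ("prepare dose", "administer"),   # note: "check allergy history" is never linked
            ("administer", "observe"),
            ("observe", "end"),
        ])

        reachable = nx.descendants(g, "start") | {"start"}
        print("steps never reached from start:", set(g.nodes) - reachable)  # possible omission
        print("cycles (possible ambiguity):", list(nx.simple_cycles(g)))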

  11. Development of an integrated laboratory system for the monitoring of cyanotoxins in surface and drinking waters.

    PubMed

    Triantis, Theodoros; Tsimeli, Katerina; Kaloudis, Triantafyllos; Thanassoulias, Nicholas; Lytras, Efthymios; Hiskia, Anastasia

    2010-05-01

    A system of analytical processes has been developed in order to serve as a cost-effective scheme for the monitoring of cyanobacterial toxins on a quantitative basis, in surface and drinking waters. Five cyclic peptide hepatotoxins, microcystin-LR, -RR, -YR, -LA and nodularin were chosen as the target compounds. Two different enzyme-linked immunosorbent assays (ELISA) were validated in order to serve as primary quantitative screening tools. Validation results showed that the ELISA methods are sufficiently specific and sensitive with limits of detection (LODs) around 0.1 microg/L, however, matrix effects should be considered, especially with surface water samples or bacterial mass methanolic extracts. A colorimetric protein phosphatase inhibition assay (PPIA) utilizing protein phosphatase 2A and p-nitrophenyl phosphate as substrate, was applied in microplate format in order to serve as a quantitative screening method for the detection of the toxic activity associated with cyclic peptide hepatotoxins, at concentration levels >0.2 microg/L of MC-LR equivalents. A fast HPLC/PDA method has been developed for the determination of microcystins, by using a short, 50mm C18 column, with 1.8 microm particle size. Using this method a 10-fold reduction of sample run time was achieved and sufficient separation of microcystins was accomplished in less than 3 min. Finally, the analytical system includes an LC/MS/MS method that was developed for the determination of the 5 target compounds after SPE extraction. The method achieves extremely low limits of detection (<0.02 microg/L), in both surface and drinking waters and it is used for identification and verification purposes as well as for determinations at the ppt level. An analytical protocol that includes the above methods has been designed and validated through the analysis of a number of real samples. Copyright 2009 Elsevier Ltd. All rights reserved.

  12. Using Colored Stochastic Petri Net (CS-PN) software for protocol specification, validation, and evaluation

    NASA Technical Reports Server (NTRS)

    Zenie, Alexandre; Luguern, Jean-Pierre

    1987-01-01

    The specification, verification, validation, and evaluation, which make up the different steps of the CS-PN software, are outlined. The colored stochastic Petri net software is applied to a Wound/Wait protocol decomposable into two principal modules: a request or couple (transaction, granule) treatment module and a wound treatment module. Each module is specified, verified, validated, and then evaluated separately, to deduce a verification, validation and evaluation of the complete protocol. The colored stochastic Petri net tool is shown to be a natural extension of the stochastic tool, adapted to distributed systems and protocols, because the color conveniently takes into account the numerous sites, transactions, granules and messages.

  13. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  14. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  15. Intersubject variability and intrasubject reproducibility of 12-lead ECG metrics: Implications for human verification.

    PubMed

    Jekova, Irena; Krasteva, Vessela; Leber, Remo; Schmid, Ramun; Twerenbold, Raphael; Müller, Christian; Reichlin, Tobias; Abächerli, Roger

    Electrocardiogram (ECG) biometrics is an advanced technology, not yet covered by guidelines on criteria, features and leads for maximal authentication accuracy. This study aims to define the minimal set of morphological metrics in 12-lead ECG by optimization towards high reliability and security, and validation in a person verification model across a large population. A standard 12-lead resting ECG database from 574 non-cardiac patients with two remote recordings (>1year apart) was used. A commercial ECG analysis module (Schiller AG) measured 202 morphological features, including lead-specific amplitudes, durations, ST-metrics, and axes. Coefficient of variation (CV, intersubject variability) and percent-mean-absolute-difference (PMAD, intrasubject reproducibility) defined the optimization (PMAD/CV→min) and restriction (CV<30%) criteria for selection of the most stable and distinctive features. Linear discriminant analysis (LDA) validated the non-redundant feature set for person verification. Maximal LDA verification sensitivity (85.3%) and specificity (86.4%) were validated for 11 optimal features: R-amplitude (I,II,V1,V2,V3,V5), S-amplitude (V1,V2), Tnegative-amplitude (aVR), and R-duration (aVF,V1). Copyright © 2016 Elsevier Inc. All rights reserved.
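
    A minimal sketch of the selection and verification logic described above is given below with synthetic data: compute the intersubject CV and the intrasubject PMAD between the two recordings, restrict to CV < 30%, rank by PMAD/CV, and then run an LDA-based verification. The genuine/impostor formulation and all numbers are illustrative assumptions, not the paper's exact protocol.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

        rng = np.random.default_rng(0)
        n_subjects, n_features = 200, 20

        # Synthetic stand-in for two remote recordings per subject.
        base = rng.normal(1.0, 0.25, size=(n_subjects, n_features))
        visit1 = base + rng.normal(0.0, 0.05, size=base.shape)
        visit2 = base + rng.normal(0.0, 0.05, size=base.shape)

        cv = visit1.std(axis=0) / np.abs(visit1.mean(axis=0)) * 100                       # intersubject variability, %
        pmad = np.abs(visit1 - visit2).mean(axis=0) / np.abs(visit1.mean(axis=0)) * 100   # intrasubject reproducibility, %

        # Restriction CV < 30% and optimization PMAD/CV -> min; keep the 11 best features.
        score = np.where(cv < 30.0, pmad / cv, np.inf)
        selected = np.argsort(score)[:11]

        # One way to cast person verification as a binary decision: classify absolute
        # feature differences of genuine pairs (same subject) vs. impostor pairs.
        genuine = np.abs(visit1[:, selected] - visit2[:, selected])
        impostor = np.abs(visit1[:, selected] - np.roll(visit2[:, selected], 1, axis=0))
        X = np.vstack([genuine, impostor])
        y = np.r_[np.ones(n_subjects), np.zeros(n_subjects)]
        lda = LinearDiscriminantAnalysis().fit(X, y)
        print("genuine/impostor LDA training accuracy:", lda.score(X, y))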

  16. Using a Modular Open Systems Approach in Defense Acquisitions: Implications for the Contracting Process

    DTIC Science & Technology

    2006-01-30

    He has taught contract management courses for the UCLA Government Contracts Certificate program and is also a senior faculty member for the Keller...standards for its key interfaces, and has been subjected to successful validation and verification tests to ensure the openness of its key interfaces...widely supported and consensus based standards for its key interfaces, and is subject to validation and verification tests to ensure the openness of its

  17. Verification and Validation of NASA-Supported Enhancements to the Near Real Time Harmful Algal Blooms Observing System (HABSOS)

    NASA Technical Reports Server (NTRS)

    Spruce, Joseph P.; Hall, Calllie; McPherson, Terry; Spiering, Bruce; Brown, Richard; Estep, Lee; Lunde, Bruce; Guest, DeNeice; Navard, Andy; Pagnutti, Mary

    2006-01-01

    This report discusses verification and validation (V&V) assessment of Moderate Resolution Imaging Spectroradiometer (MODIS) ocean data products contributed by the Naval Research Laboratory (NRL) and Applied Coherent Technologies (ACT) Corporation to the National Oceanic and Atmospheric Administration's (NOAA) Near Real Time (NRT) Harmful Algal Blooms Observing System (HABSOS). HABSOS is a maturing decision support tool (DST) used by NOAA and its partners involved with coastal and public health management.

  18. Independent Validation and Verification of automated information systems in the Department of Energy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hunteman, W.J.; Caldwell, R.

    1994-07-01

    The Department of Energy (DOE) has established an Independent Validation and Verification (IV&V) program for all classified automated information systems (AIS) operating in compartmented or multi-level modes. The IV&V program was established in DOE Order 5639.6A and described in the manual associated with the Order. This paper describes the DOE IV&V program, the IV&V process and activities, the expected benefits from an IV&V, and the criteria and methodologies used during an IV&V. The first IV&V under this program was conducted on the Integrated Computing Network (ICN) at Los Alamos National Laboratory and several lessons learned are presented. The DOE IV&V program is based on the following definitions. An IV&V is defined as the use of expertise from outside an AIS organization to conduct validation and verification studies on a classified AIS. Validation is defined as the process of applying the specialized security test and evaluation procedures, tools, and equipment needed to establish acceptance for joint usage of an AIS by one or more departments or agencies and their contractors. Verification is the process of comparing two levels of an AIS specification for proper correspondence (e.g., security policy model with top-level specifications, top-level specifications with source code, or source code with object code).

  19. Performance Evaluation of a Data Validation System

    NASA Technical Reports Server (NTRS)

    Wong, Edmond (Technical Monitor); Sowers, T. Shane; Santi, L. Michael; Bickford, Randall L.

    2005-01-01

    Online data validation is a performance-enhancing component of modern control and health management systems. It is essential that performance of the data validation system be verified prior to its use in a control and health management system. A new Data Qualification and Validation (DQV) Test-bed application was developed to provide a systematic test environment for this performance verification. The DQV Test-bed was used to evaluate a model-based data validation package known as the Data Quality Validation Studio (DQVS). DQVS was employed as the primary data validation component of a rocket engine health management (EHM) system developed under NASA's NGLT (Next Generation Launch Technology) program. In this paper, the DQVS and DQV Test-bed software applications are described, and the DQV Test-bed verification procedure for this EHM system application is presented. Test-bed results are summarized and implications for EHM system performance improvements are discussed.

  20. The New NASA Orbital Debris Engineering Model ORDEM2000

    NASA Technical Reports Server (NTRS)

    Liou, Jer-Chyi; Matney, Mark J.; Anz-Meador, Phillip D.; Kessler, Donald; Jansen, Mark; Theall, Jeffery R.

    2002-01-01

    The NASA Orbital Debris Program Office at Johnson Space Center has developed a new computer-based orbital debris engineering model, ORDEM2000, which describes the orbital debris environment in the low Earth orbit region between 200 and 2000 km altitude. The model is appropriate for those engineering solutions requiring knowledge and estimates of the orbital debris environment (debris spatial density, flux, etc.). ORDEM2000 can also be used as a benchmark for ground-based debris measurements and observations. We incorporated a large set of observational data, covering the object size range from 10 μm to 10 m, into the ORDEM2000 debris database, utilizing a maximum likelihood estimator to convert observations into debris population probability distribution functions. These functions then form the basis of debris populations. We developed a finite element model to process the debris populations to form the debris environment. A more capable input and output structure and a user-friendly graphical user interface are also implemented in the model. ORDEM2000 has been subjected to a significant verification and validation effort. This document describes ORDEM2000, which supersedes the previous model, ORDEM96. The availability of new sensor and in situ data, as well as new analytical techniques, has enabled the construction of this new model. Section 1 describes the general requirements and scope of an engineering model. Data analyses and the theoretical formulation of the model are described in Sections 2 and 3. Section 4 describes the verification and validation effort and the sensitivity and uncertainty analyses. Finally, Section 5 describes the graphical user interface, software installation, and test cases for the user.

  1. TRIAD: The Translational Research Informatics and Data Management Grid

    PubMed Central

    Payne, P.; Ervin, D.; Dhaval, R.; Borlawsky, T.; Lai, A.

    2011-01-01

    Objective: Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. Methods: A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis “pipelines.” Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile “working interoperability” between data, information, and knowledge resources. Results: Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile “working interoperability” between distributed data and knowledge sources. Conclusion: Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling “working interoperability” in heterogeneous biomedical environments. PMID:23616879

  2. TRIAD: The Translational Research Informatics and Data Management Grid.

    PubMed

    Payne, P; Ervin, D; Dhaval, R; Borlawsky, T; Lai, A

    2011-01-01

    Multi-disciplinary and multi-site biomedical research programs frequently require infrastructures capable of enabling the collection, management, analysis, and dissemination of heterogeneous, multi-dimensional, and distributed data and knowledge collections spanning organizational boundaries. We report on the design and initial deployment of an extensible biomedical informatics platform that is intended to address such requirements. A common approach to distributed data, information, and knowledge management needs in the healthcare and life science settings is the deployment and use of a service-oriented architecture (SOA). Such SOA technologies provide for strongly-typed, semantically annotated, and stateful data and analytical services that can be combined into data and knowledge integration and analysis "pipelines." Using this overall design pattern, we have implemented and evaluated an extensible SOA platform for clinical and translational science applications known as the Translational Research Informatics and Data-management grid (TRIAD). TRIAD is a derivative and extension of the caGrid middleware and has an emphasis on supporting agile "working interoperability" between data, information, and knowledge resources. Based upon initial verification and validation studies conducted in the context of a collection of driving clinical and translational research problems, we have been able to demonstrate that TRIAD achieves agile "working interoperability" between distributed data and knowledge sources. Informed by our initial verification and validation studies, we believe TRIAD provides an example instance of a lightweight and readily adoptable approach to the use of SOA technologies in the clinical and translational research setting. Furthermore, our initial use cases illustrate the importance and efficacy of enabling "working interoperability" in heterogeneous biomedical environments.

  3. Thermal/structural design verification strategies for large space structures

    NASA Technical Reports Server (NTRS)

    Benton, David

    1988-01-01

    Requirements for space structures of increasing size, complexity, and precision have engendered a search for thermal design verification methods that do not impose unreasonable costs, that fit within the capabilities of existing facilities, and that still adequately reduce technical risk. This requires a combination of analytical and testing methods, pursued through two approaches. The first is to limit thermal testing to sub-elements of the total system only in a compact configuration (i.e., not fully deployed). The second approach is to use a simplified environment to correlate analytical models with test results. These models can then be used to predict flight performance. In practice, a combination of these approaches is needed to verify the thermal/structural design of future very large space systems.

  4. Development tests for the 2.5 megawatt Mod-2 wind turbine generator

    NASA Technical Reports Server (NTRS)

    Andrews, J. S.; Baskin, J. M.

    1982-01-01

    The 2.5 megawatt MOD-2 wind turbine generator test program is discussed. The development of the 2.5 megawatt MOD-2 wind turbine generator included an extensive program of testing which encompassed verification of analytical procedures, component development, and integrated system verification. The test program was intended to assure achievement of the thirty-year design operational life of the wind turbine system as well as to minimize costly design modifications which would otherwise have been required during on-site system testing. Computer codes were modified, the fatigue life of structural and dynamic components was verified, mechanical and electrical components and subsystems were functionally checked and modified where necessary to meet system specifications, and measured dynamic responses of coupled systems confirmed analytical predictions.

  5. Sentence Verification, Sentence Recognition, and the Semantic-Episodic Distinction

    ERIC Educational Resources Information Center

    Shoben, Edward J.; And Others

    1978-01-01

    In an attempt to assess the validity of the distinction between episodic and semantic memory, this research examined the influence of two variables on sentence verification (presumably a semantic memory task) and sentence recognition (presumably an episodic memory task). (Editor)

  6. Load Model Verification, Validation and Calibration Framework by Statistical Analysis on Field Data

    NASA Astrophysics Data System (ADS)

    Jiao, Xiangqing; Liao, Yuan; Nguyen, Thai

    2017-11-01

    Accurate load models are critical for power system analysis and operation. A large amount of research work has been done on load modeling. Most of the existing research focuses on developing load models, while little has been done on developing formal load model verification and validation (V&V) methodologies or procedures. Most of the existing load model validation is based on qualitative rather than quantitative analysis. In addition, not all aspects of the model V&V problem have been addressed by the existing approaches. To complement the existing methods, this paper proposes a novel load model verification and validation framework that can systematically and more comprehensively examine a load model's effectiveness and accuracy. Statistical analysis, instead of a visual check, quantifies the load model's accuracy and provides a confidence level of the developed load model for model users. The analysis results can also be used to calibrate load models. The proposed framework can be used as guidance for utility engineers and researchers to systematically examine load models. The proposed method is demonstrated through analysis of field measurements collected from a utility system.
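
    The abstract does not list the specific statistics used, so the sketch below only illustrates the general idea of replacing a visual check with quantitative measures: an RMSE between measured and simulated responses and a confidence interval on the mean error. The data and thresholds are placeholders.

        import numpy as np
        from scipy import stats

        # Placeholder measured vs. model-predicted load response during a disturbance (MW).
        measured = np.array([100.0, 96.5, 92.0, 90.5, 93.0, 97.5, 99.0])
        simulated = np.array([100.2, 97.8, 93.5, 91.0, 92.2, 96.0, 98.5])

        error = simulated - measured
        rmse = np.sqrt(np.mean(error**2))
        ci = stats.t.interval(0.95, df=len(error) - 1, loc=error.mean(), scale=stats.sem(error))

        print(f"RMSE = {rmse:.2f} MW, mean error = {error.mean():.2f} MW, 95% CI = ({ci[0]:.2f}, {ci[1]:.2f})")
        print("no significant bias" if ci[0] <= 0.0 <= ci[1] else "bias detected; consider recalibrating the model")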

  7. Formal Verification for a Next-Generation Space Shuttle

    NASA Technical Reports Server (NTRS)

    Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)

    2002-01-01

    This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.

  8. 3-D Inhomogeneous Radiative Transfer Model using a Planar-stratified Forward RT Model and Horizontal Perturbation Series

    NASA Astrophysics Data System (ADS)

    Zhang, K.; Gasiewski, A. J.

    2017-12-01

    A horizontally inhomogeneous unified microwave radiative transfer (HI-UMRT) model based upon a nonspherical hydrometeor scattering model is being developed at the University of Colorado at Boulder to facilitate forward radiative simulations for 3-dimensionally inhomogeneous clouds in severe weather. The HI-UMRT 3-D analytical solution is based on incorporating a planar-stratified 1-D UMRT algorithm within a horizontally inhomogeneous iterative perturbation scheme. Single-scattering parameters are computed using the Discrete Dipole Scattering (DDSCAT v7.3) program for hundreds of carefully selected nonspherical complex frozen hydrometeors from the NASA/GSFC DDSCAT database. The required analytic factorization symmetry of transition matrix in a normalized RT equation was analytically proved and validated numerically using the DDSCAT-based full Stokes matrix of randomly oriented hydrometeors. The HI-UMRT model thus inherits the properties of unconditional numerical stability, efficiency, and accuracy from the UMRT algorithm and provides a practical 3-D two-Stokes parameter radiance solution with Jacobian to be used within microwave retrievals and data assimilation schemes. In addition, a fast forward radar reflectivity operator with Jacobian based on DDSCAT backscatter efficiency computed for large hydrometeors is incorporated into the HI-UMRT model to provide applicability to active radar sensors. The HI-UMRT will be validated strategically at two levels: 1) intercomparison of brightness temperature (Tb) results with those of several 1-D and 3-D RT models, including UMRT, CRTM and Monte Carlo models, 2) intercomparison of Tb with observed data from combined passive and active spaceborne sensors (e.g. GPM GMI and DPR). The precise expression for determining the required number of 3-D iterations to achieve an error bound on the perturbation solution will be developed to facilitate the numerical verification of the HI-UMRT code complexity and computation performance.

  9. A closed-form analytical model for predicting 3D boundary layer displacement thickness for the validation of viscous flow solvers

    NASA Astrophysics Data System (ADS)

    Kumar, V. R. Sanal; Sankar, Vigneshwaran; Chandrasekaran, Nichith; Saravanan, Vignesh; Natarajan, Vishnu; Padmanabhan, Sathyan; Sukumaran, Ajith; Mani, Sivabalan; Rameshkumar, Tharikaa; Nagaraju Doddi, Hema Sai; Vysaprasad, Krithika; Sharan, Sharad; Murugesh, Pavithra; Shankar, S. Ganesh; Nejaamtheen, Mohammed Niyasdeen; Baskaran, Roshan Vignesh; Rahman Mohamed Rafic, Sulthan Ariff; Harisrinivasan, Ukeshkumar; Srinivasan, Vivek

    2018-02-01

    A closed-form analytical model is developed for estimating the 3D boundary-layer-displacement thickness of an internal flow system at the Sanal flow choking condition for adiabatic flows obeying the physics of compressible viscous fluids. At this unique condition the boundary-layer blockage induced fluid-throat choking and the adiabatic wall-friction persuaded flow choking occur at a single sonic-fluid-throat location. The beauty and novelty of this model is that without missing the flow physics we could predict the exact boundary-layer blockage of both 2D and 3D cases at the sonic-fluid-throat from the known values of the inlet Mach number, the adiabatic index of the gas and the inlet port diameter of the internal flow system. We found that the 3D blockage factor is 47.33 % lower than the 2D blockage factor with air as the working fluid. We concluded that the exact prediction of the boundary-layer-displacement thickness at the sonic-fluid-throat provides a means to correctly pinpoint the causes of errors of the viscous flow solvers. The methodology presented herein with state-of-the-art will play pivotal roles in future physical and biological sciences for a credible verification, calibration and validation of various viscous flow solvers for high-fidelity 2D/3D numerical simulations of real-world flows. Furthermore, our closed-form analytical model will be useful for the solid and hybrid rocket designers for the grain-port-geometry optimization of new generation single-stage-to-orbit dual-thrust-motors with the highest promising propellant loading density within the given envelope without manifestation of the Sanal flow choking leading to possible shock waves causing catastrophic failures.

  10. Release Fixed Heel Point (FHP) Accommodation Model Verification and Validation (V and V) Plan - Rev A

    DTIC Science & Technology

    2017-01-23

    ...occupant work space, central 90% of the Soldier population, encumbrance, posture and position, verification and validation, computer aided design...factors engineers could benefit by working with vehicle designers to perform virtual assessments in CAD when there is not enough time and/or funding to

  11. Risk-Based Tailoring of the Verification, Validation, and Accreditation/Acceptance Processes (Adaptation fondee sur le risque, des processus de verification, de validation, et d’accreditation/d’acceptation)

    DTIC Science & Technology

    2012-04-01

    Systems Concepts and Integration SET Sensors and Electronics Technology SISO Simulation Interoperability Standards Organization SIW Simulation...conjunction with 2006 Fall SIW 2006 September SISO Standards Activity Committee approved beginning IEEE balloting 2006 October IEEE Project...019 published 2008 June Edinborough, UK Held in conjunction with 2008 Euro- SIW 2008 September Laurel, MD, US Work on Composite Model 2008 December

  12. Shift Verification and Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.
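
    One conventional way to express agreement with a measured criticality benchmark (hedged: the report's own comparison measures are not reproduced here) is the calculated-to-expected (C/E) eigenvalue ratio with a combined uncertainty, for example:

        import math

        def c_over_e(k_calc, sigma_calc, k_exp, sigma_exp):
            """C/E eigenvalue ratio and its combined one-sigma uncertainty."""
            ratio = k_calc / k_exp
            rel_sigma = math.sqrt((sigma_calc / k_calc) ** 2 + (sigma_exp / k_exp) ** 2)
            return ratio, ratio * rel_sigma

        # Placeholder benchmark: measured k-eff 1.0000 +/- 0.0010, calculated 0.9992 +/- 0.0003.
        ratio, sigma = c_over_e(0.9992, 0.0003, 1.0000, 0.0010)
        verdict = "consistent with unity" if abs(ratio - 1.0) < 2.0 * sigma else "outside two sigma"
        print(f"C/E = {ratio:.4f} +/- {sigma:.4f} ({verdict})")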

  13. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF FOUR TEST KITS FOR THE ANALYSIS OF ATRAZINE IN WATER: ABRAXIS LLC ATRAZINE ELISA KIT, BEACON ANALYTICAL SYSTEMS, INC. ATRAZINE TUBE KIT, SILVER LAKE RESEARCH CORP. WATERSAFE PESTICIDE TEST AND STRATEGIC DIAGNOSTICS, INC. RAPID ASSAY KIT

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV ...

  14. Transfer function verification and block diagram simplification of a very high-order distributed pole closed-loop servo by means of non-linear time-response simulation

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1975-01-01

    Linear frequency domain methods are inadequate in analyzing the 1975 Viking Orbiter (VO75) digital tape recorder servo due to dominant nonlinear effects such as servo signal limiting, unidirectional servo control, and static/dynamic Coulomb friction. The frequency loop (speed control) servo of the VO75 tape recorder is used to illustrate the analytical tools and methodology of system redundancy elimination and high order transfer function verification. The paper compares time-domain performance parameters derived from a series of nonlinear time responses with the available experimental data in order to select the best possible analytical transfer function representation of the tape transport (mechanical segment of the tape recorder) from several possible candidates. The study also shows how an analytical time-response simulation taking into account most system nonlinearities can pinpoint system redundancy and overdesign stemming from a strictly empirical design approach. System order reduction is achieved through truncation of individual transfer functions and elimination of redundant blocks.

  15. An analytical model for solute transport through a GCL-based two-layered liner considering biodegradation.

    PubMed

    Guan, C; Xie, H J; Wang, Y Z; Chen, Y M; Jiang, Y S; Tang, X W

    2014-01-01

    An analytical model for solute advection and dispersion in a two-layered liner consisting of a geosynthetic clay liner (GCL) and a soil liner (SL) considering the effect of biodegradation was proposed. The analytical solution was derived by Laplace transformation and was validated over a range of parameters using the finite-layer method based software Pollute v7.0. Results show that if the half-life of the solute in GCL is larger than 1 year, the degradation in GCL can be neglected for solute transport in GCL/SL. When the half-life of the solute in GCL is less than 1 year, neglecting the effect of degradation in GCL on solute migration will result in a large difference of relative base concentration of GCL/SL (e.g., 32% for the case with half-life of 0.01 year). The 100-year solute base concentration can be reduced by a factor of 2.2 when the hydraulic conductivity of the SL was reduced by an order of magnitude. The 100-year base concentration was reduced by a factor of 155 when the half-life of the contaminant in the SL was reduced by an order of magnitude. The effect of degradation is more important in improving the groundwater protection level than the hydraulic conductivity. The analytical solution can be used for experimental data fitting, verification of complicated numerical models and preliminary design of landfill liner systems. © 2013.
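
    The degradation terms in such models are parameterized by half-lives; a first-order half-life t1/2 corresponds to a rate constant lambda = ln(2)/t1/2 that enters the transport equation as a sink. The sketch below only converts half-lives and shows how strongly they attenuate a concentration over 100 years of pure decay; it does not reproduce the paper's full advection-dispersion solution.

        import math

        def decay_constant(half_life_years):
            """First-order degradation rate constant (1/yr) from a half-life (yr)."""
            return math.log(2.0) / half_life_years

        # Pure first-order decay over 100 years (ignores advection and dispersion).
        for t_half in (0.01, 1.0, 10.0):
            lam = decay_constant(t_half)
            remaining = math.exp(-lam * 100.0)
            print(f"half-life {t_half:>5} yr -> lambda = {lam:.3g} 1/yr, fraction left after 100 yr = {remaining:.3g}")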

  16. Reducing software security risk through an integrated approach research initiative model based verification of the Secure Socket Layer (SSL) Protocol

    NASA Technical Reports Server (NTRS)

    Powell, John D.

    2003-01-01

    This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.

  17. 76 FR 81991 - National Spectrum Sharing Research Experimentation, Validation, Verification, Demonstration and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... non-federal community, including the academic, commercial, and public safety sectors, to implement a..., Verification, Demonstration and Trials: Technical Workshop II on Coordinating Federal Government/Private Sector Spectrum Innovation Testing Needs AGENCY: The National Coordination Office (NCO) for Networking and...

  18. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. An umbrella risk mitigation strategy for safety and mission success assurance of NASA's software, software assurance covers a wide area and is better structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  19. Development of a verification program for deployable truss advanced technology

    NASA Technical Reports Server (NTRS)

    Dyer, Jack E.

    1988-01-01

    Use of large deployable space structures to satisfy the growth demands of space systems is contingent upon reducing the associated risks that pervade many related technical disciplines. The overall objective of this program was to develop a detailed plan to verify deployable truss advanced technology applicable to future large space structures and to develop a preliminary design of a deployable truss reflector/beam structure for use as a technology demonstration test article. The planning is based on a Shuttle flight experiment program using deployable 5 and 15 meter aperture tetrahedral truss reflectors and a 20 m long deployable truss beam structure. The plan addresses validation of analytical methods, the degree to which ground testing adequately simulates flight, and in-space testing requirements for large precision antenna designs. Based on an assessment of future NASA and DOD space system requirements, the program was developed to verify four critical technology areas: deployment, shape accuracy and control, pointing and alignment, and articulation and maneuvers. The flight experiment technology verification objectives can be met using two shuttle flights with the total experiment integrated on a single Shuttle Test Experiment Platform (STEP) and a Mission Peculiar Experiment Support Structure (MPESS). First flight of the experiment can be achieved 60 months after go-ahead with a total program duration of 90 months.

  20. Improved hybrid isolator with maglev actuator integrated in air spring for active-passive isolation of ship machinery vibration

    NASA Astrophysics Data System (ADS)

    Li, Yan; He, Lin; Shuai, Chang-geng; Wang, Chun-yu

    2017-10-01

    A hybrid isolator consisting of maglev actuator and air spring is proposed and developed for application in active-passive vibration isolation system of ship machinery. The dynamic characteristics of this hybrid isolator are analyzed and tested. The stability and adaptability of this hybrid isolator to shock and swing in the marine environment are improved by a compliant gap protection technique and a disengageable suspended structure. The functions of these new engineering designs are proved by analytical verification and experimental validation of the designed stiffness of such a hybrid isolator, and also by shock adaptability testing of the hybrid isolator. Finally, such hybrid isolators are installed in an engineering mounting loaded with a 200-kW ship diesel generator, and the broadband and low-frequency sinusoidal isolation performance is tested.

  1. RADSOURCE. Volume 1, Part 1, A scaling factor prediction computer program technical manual and code validation: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vance, J.N.; Holderness, J.H.; James, D.W.

    1992-12-01

    Waste stream scaling factors based on sampling programs are vulnerable to one or more of the following factors: sample representativeness, analytic accuracy, and measurement sensitivity. As an alternative to sample analyses or as a verification of the sampling results, this project proposes the use of the RADSOURCE code, which accounts for the release of fuel-source radionuclides. Once the release rates of these nuclides from fuel are known, the code develops scaling factors for waste streams based on easily measured Cobalt-60 (Co-60) and Cesium-137 (Cs-137). The project team developed mathematical models to account for the appearance rate of 10CFR61 radionuclides in reactor coolant. They based these models on the chemistry and nuclear physics of the radionuclides involved. Next, they incorporated the models into a computer code that calculates plant waste stream scaling factors based on reactor coolant gamma-isotopic data. Finally, the team performed special sampling at 17 reactors to validate the models in the RADSOURCE code.
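    As a rough illustration of the scaling-factor concept described above (the nuclides and activities below are invented examples, not RADSOURCE output or plant data), a scaling factor is simply the ratio of a difficult-to-measure nuclide to an easily measured key nuclide such as Co-60 or Cs-137, applied to other samples from the same waste stream:

    ```python
    # Hypothetical illustration of a waste stream scaling factor; all values are invented.
    def scaling_factor(dtm_activity, key_activity):
        """Ratio of a difficult-to-measure nuclide to the key nuclide in the same sample."""
        return dtm_activity / key_activity

    # Example: Ni-63 scaled to Co-60 (activities in Bq/g, hypothetical).
    sf_ni63 = scaling_factor(dtm_activity=4.0e2, key_activity=1.0e3)

    # Estimate Ni-63 in another sample of the same stream from its measured Co-60 activity.
    ni63_estimate = sf_ni63 * 2.5e3
    ```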

  2. Clinical proteomic biomarkers: relevant issues on study design & technical considerations in biomarker development

    PubMed Central

    2014-01-01

    Biomarker research is continuously expanding in the field of clinical proteomics. A combination of different proteomic-based methodologies can be applied depending on the specific clinical context of use. Moreover, current advancements in proteomic analytical platforms are expanding the range of biomarker candidates that can be identified. Specifically, mass spectrometric techniques could provide highly valuable tools for biomarker research. Ideally, these advances could yield biomarkers that are clinically applicable for disease diagnosis and/or prognosis. Unfortunately, biomarker candidates generally fail to be implemented in clinical decision making. To improve on this situation, a well-defined study design driven by a clear clinical need has to be established, and several checkpoints between the different phases of discovery, verification and validation have to be passed in order to increase the probability of establishing valid biomarkers. In this review, we summarize the technical proteomic platforms that are available along the different stages of the biomarker discovery pipeline, exemplified by clinical applications in the field of bladder cancer biomarker research. PMID:24679154

  3. Dynamic variational asymptotic procedure for laminated composite shells

    NASA Astrophysics Data System (ADS)

    Lee, Chang-Yong

    Unlike published shell theories, the main two parts of this thesis are devoted to the asymptotic construction of a refined theory for composite laminated shells valid over a wide range of frequencies and wavelengths. The resulting theory is applicable to shells in which each layer is made of materials with monoclinic symmetry. It enables one to analyze shell dynamic responses within long-wavelength, low- and high-frequency vibration regimes. It also leads to energy functionals that are both positive definite and sufficiently simple for all wavelengths. This whole procedure was first performed analytically. From the insight gained from the procedure, a finite element version of the analysis was then developed, and a corresponding computer program, DVAPAS, was written. DVAPAS can obtain the generalized 2-D constitutive law and accurately recover the 3-D results for stress and strain in composite shells. Independent work will be needed to develop the corresponding 2-D surface analysis associated with the present theory and to continue towards full verification and validation of the present process by comparison with available published works.

  4. New analytical solutions to the two-phase water faucet problem

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2016-06-17

    Here, the one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as a massless gas phase and no wall or interfacial friction, analytical solutions had previously been obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for code assessment, benchmarking, and numerical verification. In our previous study, Ransom's solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at the steady state, the anticipated second-order spatial accuracy could not be achieved when compared to the existing Ransom's analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at the steady state, considering the effect of the gas phase density on the pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom's transient solutions for the gas phase velocity and pressure are derived, with the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom's solutions are also presented.
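    For reference, the classical Ransom transient solution mentioned above can be written down in a few lines. The sketch below uses the commonly quoted parameter values (inlet velocity 10 m/s, inlet void fraction 0.2, 12 m pipe), which are assumptions for illustration rather than values taken from this record:

    ```python
    import numpy as np

    # Classical Ransom water faucet solution (massless gas, no wall/interfacial friction).
    G = 9.8          # gravitational acceleration, m/s^2
    V0 = 10.0        # liquid inlet velocity, m/s
    ALPHA0 = 0.2     # inlet void fraction
    LENGTH = 12.0    # pipe length, m

    def ransom_solution(z, t):
        """Liquid velocity and void fraction at distance z below the inlet, at time t."""
        z = np.asarray(z, dtype=float)
        front = V0 * t + 0.5 * G * t ** 2                 # position of the acceleration front
        vl_accel = np.sqrt(V0 ** 2 + 2.0 * G * z)         # accelerated column above the front
        vl = np.where(z <= front, vl_accel, V0 + G * t)
        alpha = np.where(z <= front, 1.0 - (1.0 - ALPHA0) * V0 / vl_accel, ALPHA0)
        return vl, alpha

    # Example: profiles at t = 0.5 s on a uniform grid, e.g. for a mesh-convergence study.
    z = np.linspace(0.0, LENGTH, 25)
    vl, alpha = ransom_solution(z, 0.5)
    ```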

  5. Verification and Validation Process for Progressive Damage and Failure Analysis Methods in the NASA Advanced Composites Consortium

    NASA Technical Reports Server (NTRS)

    Wanthal, Steven; Schaefer, Joseph; Justusson, Brian; Hyder, Imran; Engelstad, Stephen; Rose, Cheryl

    2017-01-01

    The Advanced Composites Consortium is a US Government/Industry partnership supporting technologies to enable timeline and cost reduction in the development of certified composite aerospace structures. A key component of the consortium's approach is the development and validation of improved progressive damage and failure analysis methods for composite structures. These methods will enable increased use of simulations in design trade studies and detailed design development, and thereby enable more targeted physical test programs to validate designs. To accomplish this goal with confidence, a rigorous verification and validation process was developed. The process was used to evaluate analysis methods and associated implementation requirements to ensure calculation accuracy and to gage predictability for composite failure modes of interest. This paper introduces the verification and validation process developed by the consortium during the Phase I effort of the Advanced Composites Project. Specific structural failure modes of interest are first identified, and a subset of standard composite test articles are proposed to interrogate a progressive damage analysis method's ability to predict each failure mode of interest. Test articles are designed to capture the underlying composite material constitutive response as well as the interaction of failure modes representing typical failure patterns observed in aerospace structures.

  6. 49 CFR 236.903 - Definitions.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... the site-specific application programs, run timers, read inputs, drive outputs, perform self... validation process is to determine “whether the correct product was built.” Verification means the process of... established at the start of that phase. The goal of the verification process is to determine “whether the...

  7. Statement Verification: A Stochastic Model of Judgment and Response.

    ERIC Educational Resources Information Center

    Wallsten, Thomas S.; Gonzalez-Vallejo, Claudia

    1994-01-01

    A stochastic judgment model (SJM) is presented as a framework for addressing issues in statement verification and probability judgment. Results of 5 experiments with 264 undergraduates support the validity of the model and provide new information that is interpreted in terms of the SJM. (SLD)

  8. Convergence between Measures of Work-to-Family and Family-to-Work Conflict: A Meta-Analytic Examination

    ERIC Educational Resources Information Center

    Mesmer-Magnus, Jessica R.; Viswesvaran, Chockalingam

    2005-01-01

    The overlap between measures of work-to-family (WFC) and family-to-work conflict (FWC) was meta-analytically investigated. Researchers have assumed WFC and FWC to be distinct; however, this assumption requires empirical verification. Across 25 independent samples (total N=9079), the sample size weighted mean observed correlation was .38 and the…

  9. INNOVATIVE TECHNOLOGY VERIFICATION REPORT "FIELD MEASUREMENT TECHNOLOGIES FOR TOTAL PETROLEUM HYDROCARBONS IN SOIL" SITELAB CORPORATION SITELAB ANALYTICAL TEST KIT UVF-3100A

    EPA Science Inventory



    The siteLAB Analytical Test Kit UVF-3100A (UVF-3100A), developed by siteLAB Corporation (siteLAB), was demonstrated under the U.S. Environmental Protection Agency Superfund Innovative Technology Evaluation Program in June 2000 at the Navy Base Ventura County site in ...

  10. Design, analysis and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Garcia, A., III; Kallis, J. M.; Trucker, D. C.

    1983-01-01

    Analytical models were developed to perform optical, thermal, electrical and structural analyses on candidate encapsulation systems. From these analyses several candidate encapsulation systems were selected for qualification testing.

  11. Ground-water models: Validate or invalidate

    USGS Publications Warehouse

    Bredehoeft, J.D.; Konikow, Leonard F.

    1993-01-01

    The word validation has a clear meaning to both the scientific community and the general public. Within the scientific community the validation of scientific theory has been the subject of philosophical debate. The philosopher of science, Karl Popper, argued that scientific theory cannot be validated, only invalidated. Popper’s view is not the only opinion in this debate; however, many scientists today agree with Popper (including the authors). To the general public, proclaiming that a ground-water model is validated carries with it an aura of correctness that we do not believe many of us who model would claim. We can place all the caveats we wish, but the public has its own understanding of what the word implies. Using the word valid with respect to models misleads the public; verification carries with it similar connotations as far as the public is concerned. Our point is this: using the terms validation and verification is misleading, at best. These terms should be abandoned by the ground-water community.

  12. Observations on CFD Verification and Validation from the AIAA Drag Prediction Workshops

    NASA Technical Reports Server (NTRS)

    Morrison, Joseph H.; Kleb, Bil; Vassberg, John C.

    2014-01-01

    The authors provide observations from the AIAA Drag Prediction Workshops that have spanned over a decade and from a recent validation experiment at NASA Langley. These workshops provide an assessment of the predictive capability of forces and moments, focused on drag, for transonic transports. It is very difficult to manage the consistency of results in a workshop setting to perform verification and validation at the scientific level, but it may be sufficient to assess it at the level of practice. Observations thus far: 1) due to simplifications in the workshop test cases, wind tunnel data are not necessarily the “correct” results that CFD should match, 2) an average of core CFD data are not necessarily a better estimate of the true solution as it is merely an average of other solutions and has many coupled sources of variation, 3) outlier solutions should be investigated and understood, and 4) the DPW series does not have the systematic build up and definition on both the computational and experimental side that is required for detailed verification and validation. Several observations regarding the importance of the grid, effects of physical modeling, benefits of open forums, and guidance for validation experiments are discussed. The increased variation in results when predicting regions of flow separation and increased variation due to interaction effects, e.g., fuselage and horizontal tail, point out the need for validation data sets for these important flow phenomena. Experiences with a recent validation experiment at NASA Langley are included to provide guidance on validation experiments.

  13. Design, analysis and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Garcia, A.; Minning, C.

    1982-01-01

    Analytical models were developed to perform optical, thermal, electrical and structural analyses on candidate encapsulation systems. Qualification testing, specimens of various types, and a finalized optimum design are projected.

  14. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include the inertial-pointing attitudes, the reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include the Earth horizon sensors, the plane-field Sun sensors, the coarse and fine two-axis digital Sun sensors, the three-axis magnetometers, the fixed-head star trackers, and the inertial reference gyros.

  15. Application of Architectural Patterns and Lightweight Formal Method for the Validation and Verification of Safety Critical Systems

    DTIC Science & Technology

    2013-09-01

    to a XML file, a code that Bonine in [21] developed for a similar purpose. Using the StateRover XML log file import tool, we are able to generate a...C. Bonine , M. Shing, T.W. Otani, “Computer-aided process and tools for mobile software acquisition,” NPS, Monterey, CA, Tech. Rep. NPS-SE-13...C10P07R05– 075, 2013. [21] C. Bonine , “Specification, validation and verification of mobile application behavior,” M.S. thesis, Dept. Comp. Science, NPS

  16. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis, with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
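    A minimal sketch of the decision-tree view of V&V value described above, with invented probabilities and payoffs: the value of the V&V analysis is the difference in expected payoff between deciding with and without its results.

    ```python
    # Toy value-of-information calculation; all numbers are hypothetical.
    def expected_value(outcomes):
        """outcomes: list of (probability, payoff) pairs."""
        return sum(p * v for p, v in outcomes)

    # Without V&V: deploy the model blind; it is adequate with probability 0.7.
    ev_without = expected_value([(0.7, 100.0), (0.3, -200.0)])

    # With (idealized, perfect) V&V information: deploy only when the model is adequate.
    ev_with = expected_value([(0.7, 100.0), (0.3, 0.0)])

    value_of_vv = ev_with - ev_without   # upper bound on what a decision maker should pay
    ```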

  17. Analysis of SSME HPOTP rotordynamics subsynchronous whirl

    NASA Technical Reports Server (NTRS)

    1984-01-01

    The causes and remedies of vibration and subsynchronous whirl problems encountered in the Space Shuttle Main Engine (SSME) turbomachinery are analyzed. Because the nonlinear and linearized models of the turbopumps play such an important role in the analysis process, the main emphasis is placed on the verification and improvement of these tools. The goal of this work has been to validate the equations of motion used in the models, including the assumptions upon which they are based. Verification of the SSME rotordynamics simulation and of the developed enhancements is emphasized.

  18. An Integrated Approach to Conversion, Verification, Validation and Integrity of AFRL Generic Engine Model and Simulation (Postprint)

    DTIC Science & Technology

    2007-02-01

    Topics covered include the motivation for the modeling and simulation work, the Augmented Generic Engine Model (AGEM), model verification and validation (V&V), and an assessment of AGEM V&V.

  19. Software for imaging phase-shift interference microscope

    NASA Astrophysics Data System (ADS)

    Malinovski, I.; França, R. S.; Couceiro, I. B.

    2018-03-01

    In recent years an absolute interference microscope was created at the National Metrology Institute of Brazil (INMETRO). The instrument is, by principle of operation, an imaging phase-shifting interferometer (PSI) equipped with two stabilized lasers of different colour as traceable reference wavelength sources. We report here some progress in the development of the software for this instrument. The status of the ongoing internal validation and verification of the software is also reported. In contrast with the standard PSI method, a different methodology of phase evaluation is applied. Therefore, instrument-specific procedures for software validation and verification are adapted and discussed.

  20. Knowledge based system verification and validation as related to automation of space station subsystems: Rationale for a knowledge based system lifecycle

    NASA Technical Reports Server (NTRS)

    Richardson, Keith; Wong, Carla

    1988-01-01

    The role of verification and validation (V and V) in software has been to support and strengthen the software lifecycle and to ensure that the resultant code meets the standards of the requirements documents. Knowledge Based System (KBS) V and V should serve the same role, but the KBS lifecycle is ill-defined. The rationale of a simple form of the KBS lifecycle is presented, including accommodation to certain critical KBS differences from software development.

  1. ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes

    NASA Astrophysics Data System (ADS)

    Yuan, Gary; Gygi, Francois

    2011-03-01

    ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT, and the Exciting Code, with support planned for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST, related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of ESTEST V&V servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.

  2. A formal approach to validation and verification for knowledge-based control systems

    NASA Technical Reports Server (NTRS)

    Castore, Glen

    1987-01-01

    As control systems become more complex in response to desires for greater system flexibility, performance and reliability, the promise is held out that artificial intelligence might provide the means for building such systems. An obstacle to the use of symbolic processing constructs in this domain is the need for verification and validation (V and V) of the systems. Techniques currently in use do not seem appropriate for knowledge-based software. An outline of a formal approach to V and V for knowledge-based control systems is presented.

  3. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the service platform for Orion spacecraft processing. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  4. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the top level of the service platform for Orion spacecraft processing. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  5. Code Verification Capabilities and Assessments in Support of ASC V&V Level 2 Milestone #6035

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William; Budzien, Joanne Louise; Ferguson, Jim Michael

    This document provides a summary of the code verification activities supporting the FY17 Level 2 V&V milestone entitled “Deliver a Capability for V&V Assessments of Code Implementations of Physics Models and Numerical Algorithms in Support of Future Predictive Capability Framework Pegposts.” The physics validation activities supporting this milestone are documented separately. The objectives of this portion of the milestone are: 1) develop software tools to support code verification analysis; 2) document standard definitions of code verification test problems; and 3) perform code verification assessments (focusing on error behavior of algorithms). This report and a set of additional standalone documents serve as the compilation of results demonstrating accomplishment of these objectives.
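    One common error-behavior assessment of the kind referenced above is an observed order-of-accuracy check against an exact solution on successively refined meshes. A minimal sketch (with hypothetical error norms, not results from this milestone) is:

    ```python
    import math

    def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
        """Observed convergence order p = log(e_coarse / e_fine) / log(r)."""
        return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

    # Hypothetical L2-norm errors from three meshes, each refined by a factor of 2.
    errors = [4.0e-3, 1.1e-3, 2.9e-4]
    for e_c, e_f in zip(errors, errors[1:]):
        print(f"observed order ~ {observed_order(e_c, e_f):.2f}")
    ```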

  6. Doppler-broadened NICE-OHMS beyond the cavity-limited weak absorption condition - II: Experimental verification

    NASA Astrophysics Data System (ADS)

    Hausmaninger, Thomas; Silander, Isak; Ma, Weiguang; Axner, Ove

    2016-01-01

    Doppler-broadened (Db) noise-immune cavity-enhanced optical heterodyne molecular spectrometry (NICE-OHMS) is normally described by an expression, here termed the conventional (CONV) description, that is restricted to the conventional cavity-limited weak absorption condition (CCLWA), i.e. when the single-pass absorbance is significantly smaller than the empty cavity losses, i.e. when α0L << π/F. To describe NICE-OHMS signals beyond this limit, two simplified extended descriptions (termed the extended locking and extended transmission description, ELET, and the extended locking and full transmission description, ELFT), which are assumed to be valid under the relaxed cavity-limited weak absorption condition (RCLWA), i.e. when α0L < π/F, and a full description (denoted FULL), presumed to be valid also when the α0L < π/F condition does not hold, have recently been derived in an accompanying work (Ma W, et al. Doppler-broadened NICE-OHMS beyond the cavity-limited weak absorption condition - I. Theoretical Description. J Quant Spectrosc Radiat Transfer, 2015, http://dx.doi.org/10.1016/j.jqsrt.2015.09.007). The present work constitutes an experimental verification and assessment of the validity of these descriptions, performed in the Doppler limit for a set of Fα0L/π values (up to 3.5); it is shown under which conditions the various descriptions are valid. It is concluded that for samples with Fα0L/π up to 0.01, all descriptions replicate the data well. It is shown that the CONV description is adequate and provides accurate assessments of the signal strength (and thereby the analyte concentration) up to Fα0L/π of around 0.1, while the ELET is accurate for Fα0L/π up to around 0.3. The ELFT description mimics the Db NICE-OHMS signal well for Fα0L/π up to around unity, while the FULL description is adequate for all Fα0L/π values investigated. Access to these descriptions both considerably increases the dynamic range of the technique and facilitates calibration using certified reference gases, which thereby significantly broadens the applicability of the Db NICE-OHMS technique.

  7. Sigma Metrics Across the Total Testing Process.

    PubMed

    Charuruks, Navapun

    2017-03-01

    Laboratory quality control has been developed over several decades to ensure patients' safety, moving from a statistical quality control focus on the analytical phase to total laboratory processes. The sigma concept provides a convenient way to quantify the number of errors in the extra-analytical and analytical phases through the defects-per-million count and the sigma metric equation. Participation in a sigma verification program can be a convenient way to monitor analytical performance for continuous quality improvement. Improvement of sigma-scale performance has been demonstrated with our data. New tools and techniques for integration are needed. Copyright © 2016 Elsevier Inc. All rights reserved.
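    A minimal sketch of the sigma-metric calculation referred to above (formula only; the allowable total error, bias, and CV values are illustrative, not data from the paper):

    ```python
    def sigma_metric(tea_pct, bias_pct, cv_pct):
        """Sigma = (TEa - |bias|) / CV, with all quantities expressed in percent."""
        return (tea_pct - abs(bias_pct)) / cv_pct

    # Example: allowable total error 10%, observed bias 1.5%, imprecision (CV) 2%.
    print(sigma_metric(10.0, 1.5, 2.0))   # 4.25 sigma
    ```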

  8. HYDRA-II: A hydrothermal analysis computer code: Volume 3, Verification/validation assessments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    McCann, R.A.; Lowery, P.S.

    1987-10-01

    HYDRA-II is a hydrothermal computer code capable of three-dimensional analysis of coupled conduction, convection, and thermal radiation problems. This code is especially appropriate for simulating the steady-state performance of spent fuel storage systems. The code has been evaluated for this application for the US Department of Energy's Commercial Spent Fuel Management Program. HYDRA-II provides a finite difference solution in cartesian coordinates to the equations governing the conservation of mass, momentum, and energy. A cylindrical coordinate system may also be used to enclose the cartesian coordinate system. This exterior coordinate system is useful for modeling cylindrical cask bodies. The difference equations for conservation of momentum are enhanced by the incorporation of directional porosities and permeabilities that aid in modeling solid structures whose dimensions may be smaller than the computational mesh. The equation for conservation of energy permits modeling of orthotropic physical properties and film resistances. Several automated procedures are available to model radiation transfer within enclosures and from fuel rod to fuel rod. The documentation of HYDRA-II is presented in three separate volumes. Volume I - Equations and Numerics describes the basic differential equations, illustrates how the difference equations are formulated, and gives the solution procedures employed. Volume II - User's Manual contains code flow charts, discusses the code structure, provides detailed instructions for preparing an input file, and illustrates the operation of the code by means of a model problem. This volume, Volume III - Verification/Validation Assessments, provides a comparison between the analytical solution and the numerical simulation for problems with a known solution. This volume also documents comparisons between the results of simulations of single- and multiassembly storage systems and actual experimental data. 11 refs., 55 figs., 13 tabs.

  9. Performance Comparison of NAMI DANCE and FLOW-3D® Models in Tsunami Propagation, Inundation and Currents using NTHMP Benchmark Problems

    NASA Astrophysics Data System (ADS)

    Velioglu Sogut, Deniz; Yalciner, Ahmet Cevdet

    2018-06-01

    Field observations provide valuable data regarding nearshore tsunami impact, yet only in inundation areas that tsunami waves have already flooded. Therefore, tsunami modeling is essential to understand tsunami behavior and prepare for tsunami inundation. It is necessary that all numerical models used in tsunami emergency planning be subject to benchmark tests for validation and verification. This study focuses on two numerical codes, NAMI DANCE and FLOW-3D®, for validation and performance comparison. NAMI DANCE is an in-house tsunami numerical model developed by the Ocean Engineering Research Center of Middle East Technical University, Turkey and the Laboratory of Special Research Bureau for Automation of Marine Research, Russia. FLOW-3D® is a general purpose computational fluid dynamics software, which was developed by scientists who pioneered the design of the Volume-of-Fluid technique. The codes are validated and their performances are compared via analytical, experimental and field benchmark problems, which are documented in the "Proceedings and Results of the 2011 National Tsunami Hazard Mitigation Program (NTHMP) Model Benchmarking Workshop" and the "Proceedings and Results of the NTHMP 2015 Tsunami Current Modeling Workshop". The variations between the numerical solutions of these two models are evaluated through statistical error analysis.
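    One statistic typically used in such benchmark comparisons is a normalized root-mean-square error between the computed and reference series; the sketch below is a generic illustration with placeholder numbers, not data or code from the cited study.

    ```python
    import numpy as np

    def nrmse_pct(computed, reference):
        """RMSE normalized by the peak magnitude of the reference signal, in percent."""
        computed = np.asarray(computed, dtype=float)
        reference = np.asarray(reference, dtype=float)
        rmse = np.sqrt(np.mean((computed - reference) ** 2))
        return 100.0 * rmse / np.max(np.abs(reference))

    print(nrmse_pct([0.9, 1.8, 2.6], [1.0, 2.0, 2.5]))  # placeholder time series
    ```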

  10. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation

    NASA Astrophysics Data System (ADS)

    Magro, G.; Molinelli, S.; Mairani, A.; Mirandola, A.; Panizza, D.; Russo, S.; Ferrari, A.; Valvo, F.; Fossati, P.; Ciocca, M.

    2015-09-01

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo® TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus® chamber. An EBT3® film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  11. Dosimetric accuracy of a treatment planning system for actively scanned proton beams and small target volumes: Monte Carlo and experimental validation.

    PubMed

    Magro, G; Molinelli, S; Mairani, A; Mirandola, A; Panizza, D; Russo, S; Ferrari, A; Valvo, F; Fossati, P; Ciocca, M

    2015-09-07

    This study was performed to evaluate the accuracy of a commercial treatment planning system (TPS) in optimising proton pencil beam dose distributions for small targets of different sizes (5-30 mm side) located at increasing depths in water. The TPS analytical algorithm was benchmarked against experimental data and the FLUKA Monte Carlo (MC) code, previously validated for the selected beam-line. We tested the Siemens syngo(®) TPS plan optimisation module for water cubes, fixing the configurable parameters at clinical standards, with homogeneous target coverage to a 2 Gy (RBE) dose prescription as the unique goal. Plans were delivered and the dose at each volume centre was measured in water with a calibrated PTW Advanced Markus(®) chamber. An EBT3(®) film was also positioned at the phantom entrance window for the acquisition of 2D dose maps. Discrepancies between TPS-calculated and MC-simulated values were mainly due to the different lateral spread modeling and were found to be related to the field-to-spot size ratio. The accuracy of the TPS proved to be clinically acceptable in all cases but very small and shallow volumes. In this context, the use of MC to validate TPS results proved to be a reliable procedure for pre-treatment plan verification.

  12. Voltammetric determination of copper in selected pharmaceutical preparations--validation of the method.

    PubMed

    Lutka, Anna; Maruszewska, Małgorzata

    2011-01-01

    The conditions for the voltammetric determination of copper in pharmaceutical preparations were established and validated. The three selected preparations, Zincuprim (A), Wapń, cynk, miedź z wit. C (B), and Vigor complete (V), contained different salts and different quantities of copper(II) and an increasing number of accompanying ingredients. To transfer copper into solution, samples of powdered tablets of the first and second preparations underwent extraction, and those of the third underwent mineralization. The concentration of copper in solution was determined by differential pulse voltammetry (DP) using comparison with a standard. In the validation process, the selectivity, accuracy, precision and linearity of the DP determination of copper in the three preparations were estimated. Copper was determined within the concentration range of 1-9 ppm (1-9 microg/mL): the mean recoveries approached 102% (A), 100% (B), 102% (V); the relative standard deviations of the determinations (RSD) were 0.79-1.59% (A), 0.62-0.85% (B) and 1.68-2.28% (V), respectively. The mean recoveries and the RSDs satisfied the requirements for analyte concentrations at the 1-10 ppm level. The statistical verification confirmed that the tested voltammetric method is suitable for the determination of copper in pharmaceutical preparations.
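    The recovery and RSD figures quoted above follow from standard formulas; a small sketch (with invented replicate results, not the paper's data) is:

    ```python
    import statistics

    def mean_recovery_pct(found, spiked):
        """Mean recovery in percent for replicate determinations of a spiked sample."""
        return 100.0 * statistics.mean(found) / spiked

    def rsd_pct(values):
        """Relative standard deviation (coefficient of variation) in percent."""
        return 100.0 * statistics.stdev(values) / statistics.mean(values)

    found = [5.08, 5.12, 4.97, 5.05, 5.10]       # hypothetical Cu results, ppm
    print(mean_recovery_pct(found, spiked=5.0))  # ~101 %
    print(rsd_pct(found))                        # ~1.2 %
    ```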

  13. Information Management Platform for Data Analytics and Aggregation (IMPALA) System Design Document

    NASA Technical Reports Server (NTRS)

    Carnell, Andrew; Akinyelu, Akinyele

    2016-01-01

    The System Design Document (SDD) tracks the design activities that are performed to guide the integration, installation, verification, and acceptance testing of the IMPALA Platform. The inputs to the design document are derived from the activities recorded in Tasks 1 through 6 of the Statement of Work (SOW), with the proposed technical solution being the completion of Phase 1-A. With the documentation of the architecture of the IMPALA Platform and the installation steps taken, the SDD will be a living document, capturing the details about capability enhancements and system improvements to the IMPALA Platform that support users in the development of accurate and precise analytical models. The IMPALA Platform infrastructure team, data architecture team, system integration team, security management team, project manager, NASA data scientists, and users are the intended audience of this document. The IMPALA Platform is an assembly of commercial-off-the-shelf (COTS) products installed on an Apache Hadoop platform. User interface details for the COTS products will be sourced from the COTS tool vendors' documentation. The SDD is a focused explanation of the inputs, design steps, and projected outcomes of every design activity for the IMPALA Platform through installation and validation.

  14. Preparation of candidate reference materials for the determination of phosphorus containing flame retardants in styrene-based polymers.

    PubMed

    Roth, Thomas; Urpi Bertran, Raquel; Latza, Andreas; Andörfer-Lang, Katrin; Hügelschäffer, Claudia; Pöhlein, Manfred; Puchta, Ralph; Placht, Christian; Maid, Harald; Bauer, Walter; van Eldik, Rudi

    2015-04-01

    Candidate reference materials (RMs) for the analysis of phosphorus-based flame retardants in styrene-based polymers were prepared using a self-made mini-extruder. Due to the legal requirements of the current restriction on the use of certain hazardous substances in electrical and electronic equipment, focus is now placed on phosphorus-based flame retardants instead of brominated ones. Newly developed analytical methods for the former substances also require RMs similar to industrial samples for validation and verification purposes. Hence, the prepared candidate RMs contained resorcinol-bis-(diphenyl phosphate), bisphenol A bis(diphenyl phosphate), triphenyl phosphate and triphenyl phosphine oxide as phosphorus-based flame retardants. Blends of polycarbonate and acrylonitrile-co-butadiene-co-styrene as well as blends of high-impact polystyrene and polyphenylene oxide were chosen as carrier polymers. Homogeneity and thermal stability of the candidate RMs were investigated. Results showed that the candidate RMs were comparable to the available industrial materials. Measurements by ICP/OES, FTIR and NMR confirmed the expected concentrations of the flame retardants and proved that analyte loss and degradation, respectively, were below the measurement uncertainty during the extrusion process. Thus, the candidate RMs were found to be suitable for laboratory use.

  15. Consistent Structural Integrity and Efficient Certification with Analysis. Volume 3: Appendices of Verification and Validation Examples, Correlation Factors, and Failure Criteria

    DTIC Science & Technology

    2005-05-01

    Table of contents excerpt: verification of a tank wall; verification of bonded joints (homogeneous isotropic and orthotropic, six examples from the Delale and Erdogan publication); adhesive stress comparisons between BondJo, an Ansys solid-model FEA, and the Delale and Erdogan plate theory.

  16. The state of advanced measurement and verification technology and industry application

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Granderson, Jessica; Fernandes, Samuel

    2017-09-28

    With the expansion of advanced metering and increased use of energy analytics tools, the energy efficiency community has begun to explore the application of advanced measurement and verification (or "M&V 2.0") technologies. Current literature recognizes their promise, but does not offer an in-depth assessment of their technical underpinnings. This paper assesses the state of the technology and its application. Sixteen commercially available technologies were characterized, and this characterization was combined with a national review of their use.
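    At the core of many "M&V 2.0" tools is an automated baseline model fitted to interval-meter data and used to estimate avoided energy use. The following sketch (ordinary least squares against outdoor temperature, with invented data) is illustrative and is not drawn from the cited report.

    ```python
    import numpy as np

    def fit_baseline(temps, energy):
        """Ordinary least squares for energy ~ a + b * outdoor temperature."""
        design = np.column_stack([np.ones_like(temps), temps])
        coeffs, *_ = np.linalg.lstsq(design, energy, rcond=None)
        return coeffs                                  # [intercept, slope]

    def avoided_energy(coeffs, temps_post, energy_post):
        """Baseline-predicted consumption minus metered post-retrofit consumption."""
        predicted = coeffs[0] + coeffs[1] * temps_post
        return float(np.sum(predicted - energy_post))

    # Hypothetical daily data: a baseline period, then a post-retrofit period.
    t_base = np.array([5.0, 10.0, 15.0, 20.0, 25.0])
    e_base = np.array([120.0, 100.0, 85.0, 70.0, 55.0])
    t_post = np.array([8.0, 12.0, 18.0, 22.0])
    e_post = np.array([95.0, 82.0, 62.0, 50.0])
    savings = avoided_energy(fit_baseline(t_base, e_base), t_post, e_post)
    ```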

  17. Validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Gilstrap, Lewey

    1991-01-01

    Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.

  18. Considerations regarding the validation of chromatographic mass spectrometric methods for the quantification of endogenous substances in forensics.

    PubMed

    Hess, Cornelius; Sydow, Konrad; Kueting, Theresa; Kraemer, Michael; Maas, Alexandra

    2018-02-01

    The prerequisite for the correct evaluation of forensic toxicological results in daily routine work and scientific studies is reliable analytical data based on validated methods. Validation of a method gives the analyst tools to estimate the efficacy and reliability of the analytical method. Without validation, data might be contested in court and lead to unjustified legal consequences for a defendant. Therefore, new analytical methods to be used in forensic toxicology require careful method development and validation of the final method. Until now, there have been no publications on the validation of chromatographic mass spectrometric methods for the detection of endogenous substances, although endogenous analytes can be important in forensic toxicology (alcohol consumption markers, congener alcohols, gamma-hydroxybutyric acid, human insulin and C-peptide, creatinine, post-mortem clinical parameters). For these analytes, conventional validation instructions cannot be followed completely. In this paper, important practical considerations in analytical method validation for endogenous substances are discussed, which may serve as guidance for scientists wishing to develop and validate analytical methods for analytes produced naturally in the human body. In particular, the validation parameters calibration model, analytical limits, accuracy (bias and precision), and matrix effects and recovery have to be approached differently. The highest attention should be paid to selectivity experiments. Copyright © 2017 Elsevier B.V. All rights reserved.
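    For orientation, the conventional post-extraction addition scheme for matrix effect and recovery (which, as the authors note, must be adapted for endogenous analytes, e.g. by using surrogate matrices or standard addition) reduces to two ratios; the peak areas below are invented:

    ```python
    # Conventional matrix-effect and recovery ratios (post-extraction addition scheme).
    # A: analyte in neat solvent, B: spiked into extracted matrix, C: spiked before extraction.
    def matrix_effect_pct(area_b, area_a):
        """ME% = 100 * B / A; values below 100 suggest ion suppression."""
        return 100.0 * area_b / area_a

    def extraction_recovery_pct(area_c, area_b):
        """RE% = 100 * C / B, the recovery of the extraction step."""
        return 100.0 * area_c / area_b

    print(matrix_effect_pct(area_b=8.2e5, area_a=9.0e5))        # ~91 %
    print(extraction_recovery_pct(area_c=7.0e5, area_b=8.2e5))  # ~85 %
    ```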

  19. Assessing Requirements Quality through Requirements Coverage

    NASA Technical Reports Server (NTRS)

    Rajan, Ajitha; Heimdahl, Mats; Woodham, Kurt

    2008-01-01

    In model-based development, the development effort is centered around a formal description of the proposed software system: the model. This model is derived from some high-level requirements describing the expected behavior of the software. For validation and verification purposes, this model can then be subjected to various types of analysis, for example, completeness and consistency analysis [6], model checking [3], theorem proving [1], and test-case generation [4, 7]. This development paradigm is making rapid inroads in certain industries, e.g., automotive, avionics, space applications, and medical technology. This shift towards model-based development naturally leads to changes in the verification and validation (V&V) process. The model validation problem, determining that the model accurately captures the customer's high-level requirements, has received little attention, and the sufficiency of the validation activities has been largely determined through ad-hoc methods. Since the model serves as the central artifact, its correctness with respect to the user's needs is absolutely crucial. In our investigation, we attempt to answer the following two questions with respect to validation: (1) Are the requirements sufficiently defined for the system? and (2) How well does the model implement the behaviors specified by the requirements? The second question can be addressed using formal verification. Nevertheless, the size and complexity of many industrial systems make formal verification infeasible even if we have a formal model and formalized requirements. Thus, presently, there is no objective way of answering these two questions. To this end, we propose an approach based on testing that, when given a set of formal requirements, explores the relationship between requirements-based structural test-adequacy coverage and model-based structural test-adequacy coverage. The proposed technique uses requirements coverage metrics defined in [9] on formal high-level software requirements and existing model coverage metrics such as the Modified Condition and Decision Coverage (MC/DC) used when testing highly critical software in the avionics industry [8]. Our work is related to Chockler et al. [2], but we base our work on traditional testing techniques as opposed to verification techniques.
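    As a small illustration of the MC/DC criterion mentioned above (an arbitrary example decision, unrelated to the paper's case studies), the sketch below enumerates, for each condition, pairs of test vectors that differ only in that condition and flip the decision outcome, which is the independence requirement at the heart of MC/DC:

    ```python
    from itertools import product

    def decision(a, b, c):
        """Example decision with three conditions."""
        return a and (b or c)

    def independence_pairs(cond_index):
        """Test-vector pairs differing only in one condition that change the decision."""
        pairs = []
        for vec in product([False, True], repeat=3):
            flipped = list(vec)
            flipped[cond_index] = not flipped[cond_index]
            if decision(*vec) != decision(*flipped):
                pairs.append((vec, tuple(flipped)))
        return pairs

    for i, name in enumerate("abc"):
        print(name, independence_pairs(i))
    ```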

  20. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    Here, the one-dimensional water faucet problem is one of the classical benchmark problems originally proposed by Ransom to study the two-fluid two-phase flow model. With certain simplifications, such as a massless gas phase and no wall or interfacial friction, analytical solutions had previously been obtained for the transient liquid velocity and void fraction distribution. The water faucet problem and its analytical solutions have been widely used for code assessment, benchmarking, and numerical verification. In our previous study, Ransom's solutions were used for the mesh convergence study of a high-resolution spatial discretization scheme. It was found that, at the steady state, the anticipated second-order spatial accuracy could not be achieved when compared to the existing Ransom's analytical solutions. A further investigation showed that the existing analytical solutions do not actually satisfy the commonly used two-fluid single-pressure two-phase flow equations. In this work, we present a new set of analytical solutions of the water faucet problem at the steady state, considering the effect of the gas phase density on the pressure distribution. This new set of analytical solutions is used for mesh convergence studies, from which the anticipated second order of accuracy is achieved for the second-order spatial discretization scheme. In addition, extended Ransom's transient solutions for the gas phase velocity and pressure are derived, with the assumption of decoupled liquid and gas pressures. Numerical verifications of the extended Ransom's solutions are also presented.

  1. Comments on the Synergism Between the Analytic Planetary Boundary-Layer Model and Remote Sensing Data

    NASA Astrophysics Data System (ADS)

    Brown, R. A.

    2005-08-01

    This paper is adapted from a presentation at the session of the European Geophysical Society meeting in 2002 honouring Joost Businger. It documents the interaction of the non-linear planetary boundary-layer (PBL) model (UW-PBL) and satellite remote sensing of marine surface winds from verification and calibration studies for the sensor model function to the current state of verification of the model by satellite data. It is also a personal history where Joost Businger had seminal input to this research at several critical junctures. The first scatterometer in space was on SeaSat in 1978, while currently in orbit there are the QuikSCAT and ERS-2 scatterometers and the WindSat radiometer. The volume and detail of data from the scatterometers during the past decade are unprecedented, though the value of these data depends on a careful interpretation of the PBL dynamics. The model functions (algorithms) that relate surface wind to sensor signal have evolved from straight empirical correlation with simple surface-layer 10-m winds to satellite sensor model functions for surface pressure fields. A surface stress model function is also available. The validation data for the satellite model functions depended crucially on the PBL solution. The non-linear solution for the flow of fluid in the boundary layer of a rotating coordinate system was completed in 1969. The implications for traditional ways of measuring and modelling the PBL were huge and continue to this day. Unfortunately, this solution replaced an elegant one by Ekman with a stability/finite perturbation equilibrium solution. Consequently, there has been great reluctance to accept this solution. The verification of model predictions has been obtained from the satellite data.

  2. 76 FR 14038 - TWIC/MTSA Policy Advisory Council; Voluntary Use of TWIC Readers

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-15

    ... and would like to know that they reached the Facility, please enclose a stamped, self-addressed... regulatory requirements for effective (1) identity verification, (2) card validity, and (3) card... access is granted. 33 CFR 101.514. At each entry, the TWIC must be checked for (1) identity verification...

  3. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance proving tools were reviewed. The technical issues related to proof methodologies are examined, and the technical issues discussed are summarized.

  4. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. In cases where access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion or source terms, as well as non-constant coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy the continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. We then use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests to check individual portions of the code. The tests start from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
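    In the MES spirit described above, any closed-form solution of the governing equation can serve as a benchmark field. A generic example (the textbook one-dimensional advection-diffusion-decay Gaussian for an instantaneous point release, not one of the new solutions proposed in this record) might look like:

    ```python
    import numpy as np

    def adr_exact(x, t, u=1.0, D=0.01, k=0.1, m0=1.0, x0=0.0):
        """Concentration from an instantaneous release of mass m0 at x0, advected at
        velocity u, diffusing with coefficient D, and decaying at first-order rate k."""
        return (m0 / np.sqrt(4.0 * np.pi * D * t)
                * np.exp(-((x - x0 - u * t) ** 2) / (4.0 * D * t))
                * np.exp(-k * t))

    # Evaluate the benchmark on a uniform grid for comparison with a numerical ADR solver.
    x = np.linspace(0.0, 2.0, 201)
    c_exact = adr_exact(x, t=1.0)
    ```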

  5. From model conception to verification and validation, a global approach to multiphase Navier-Stoke models with an emphasis on volcanic explosive phenomenology

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dartevelle, Sebastian

    2007-10-01

    Large-scale volcanic eruptions are hazardous events that cannot be described by detailed and accurate in situ measurements: hence, little to no real-time data exists to rigorously validate current computer models of these events. In addition, such phenomenology involves highly complex, nonlinear, and unsteady physical behaviors upon many spatial and time scales. As a result, volcanic explosive phenomenology is poorly understood in terms of its physics, and inadequately constrained in terms of initial, boundary, and inflow conditions. Nevertheless, code verification and validation become even more critical because more and more volcanologists use numerical data for assessment and mitigation of volcanic hazards. In this report, we evaluate the process of model and code development in the context of geophysical multiphase flows. We describe: (1) the conception of a theoretical, multiphase, Navier-Stokes model, (2) its implementation into a numerical code, (3) the verification of the code, and (4) the validation of such a model within the context of turbulent and underexpanded jet physics. Within the validation framework, we suggest focusing on the key physics that control volcanic clouds, namely the momentum-driven supersonic jet and the buoyancy-driven turbulent plume. For instance, we propose to compare numerical results against a set of simple and well-constrained analog experiments, each of which uniquely and unambiguously represents one of the key phenomena.

  6. International Energy Agency Ocean Energy Systems Task 10 Wave Energy Converter Modeling Verification and Validation: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim

    This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001 and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee EXCO in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5 conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.

  7. SPR Hydrostatic Column Model Verification and Validation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen during testing of the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
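    The essential calculation behind wellhead-pressure prediction in such a model is a hydrostatic column sum; the sketch below is a generic illustration with invented densities and heights, not the SPR HCM implementation.

    ```python
    G = 9.80665  # standard gravity, m/s^2

    def wellhead_pressure(p_cavern_pa, columns):
        """Cavern pressure minus the weight of the fluid columns standing in the well.
        columns: list of (density_kg_per_m3, height_m) from the cavern up to the wellhead."""
        hydrostatic = sum(rho * G * h for rho, h in columns)
        return p_cavern_pa - hydrostatic

    # Hypothetical example: 600 m of brine below 300 m of pressurized nitrogen.
    p_wellhead = wellhead_pressure(
        p_cavern_pa=12.0e6,
        columns=[(1200.0, 600.0), (200.0, 300.0)],
    )
    ```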

  8. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science. For the coming decade, the survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component in systems engineering and is vital to the success of any space mission. V&V is a process that is used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.

  9. International Space Station Requirement Verification for Commercial Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from thousands to hundreds) focused on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.

  10. Software safety - A user's practical perspective

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Corliss, Lloyd D.

    1990-01-01

    Software safety assurance philosophy and practices at NASA Ames are discussed. It is shown that, to be safe, software must be error-free. Software developments on two digital flight control systems and two ground facility systems are examined, including the overall system and software organization and function, the software-safety issues, and their resolution. The effectiveness of safety assurance methods is discussed, including conventional life-cycle practices, verification and validation testing, software safety analysis, and formal design methods. It is concluded (1) that a practical software safety technology does not yet exist, (2) that it is unlikely that a set of general-purpose analytical techniques can be developed for proving that software is safe, and (3) that successful software safety-assurance practices will have to take into account the detailed design processes employed and show that the software will execute correctly under all possible conditions.

  11. Visual communication in optometric scales.

    PubMed

    Dantas, Rosane Arruda; Pagliuca, Lorita Marlena Freitag

    2006-01-01

    Communication through vision involves visual learning, which demands ocular integrity; hence the importance of evaluating visual acuity. The image chart, formed by optotypes, is a method for verifying visual acuity in kindergarten children. To identify the optotype, the child needs to know the image under analysis. Given the importance of visual communication in the construction of image charts, a bibliographic, analytical study is presented with the aim of reflecting on the principles for the construction of those charts. The drawing used as an optotype is considered a non-verbal symbolic expression of the body and/or of the environment, constructed from the experiences captured by the individual. The indiscriminate use of images is contested, on the understanding that prior knowledge of the image is required. Despite the subjectivity of the optotypes, the charts remain valid if the images are adapted to the universe of the children to be examined.

  12. A Science and Risk-Based Pragmatic Methodology for Blend and Content Uniformity Assessment.

    PubMed

    Sayeed-Desta, Naheed; Pazhayattil, Ajay Babu; Collins, Jordan; Doshi, Chetan

    2018-04-01

    This paper describes a pragmatic approach that can be applied in assessing powder blend and unit dosage uniformity of solid dose products at the Process Design, Process Performance Qualification, and Continued/Ongoing Process Verification stages of the Process Validation lifecycle. The statistically based sampling, testing, and assessment plan was developed following the withdrawal of the FDA draft guidance for industry "Powder Blends and Finished Dosage Units-Stratified In-Process Dosage Unit Sampling and Assessment." This paper compares the proposed Grouped Area Variance Estimate (GAVE) method with an alternate approach, outlining the practicality and statistical rationale of using traditional sampling and analytical methods. The approach is designed to fit solid dose processes, assuring high statistical confidence in both powder blend uniformity and dosage unit uniformity during all three stages of the lifecycle, in compliance with ASTM standards as recommended by the US FDA.
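
    As a generic illustration of the kind of statistic such assessments rest on (this is not the published GAVE procedure), a one-way variance decomposition separates between-location and within-location variability of stratified blend samples:

```python
import numpy as np

# Generic between/within-location variance decomposition for stratified blend
# samples (hypothetical data; not the GAVE method itself).
# Rows = blender locations, columns = replicate samples, values in % label claim.
data = np.array([[99.1, 100.4, 99.8],
                 [101.2, 100.9, 101.5],
                 [98.7,  99.3,  98.9],
                 [100.2, 100.0, 100.6]])

n_loc, n_rep = data.shape
loc_means = data.mean(axis=1)
grand_mean = data.mean()

ms_between = n_rep * np.sum((loc_means - grand_mean) ** 2) / (n_loc - 1)
ms_within = np.sum((data - loc_means[:, None]) ** 2) / (n_loc * (n_rep - 1))
var_between = max((ms_between - ms_within) / n_rep, 0.0)  # one-way ANOVA estimator

print(f"within-location variance:  {ms_within:.3f}")
print(f"between-location variance: {var_between:.3f}")
print(f"overall RSD: {100 * data.std(ddof=1) / grand_mean:.2f}%")
```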

  13. Material Model Evaluation of a Composite Honeycomb Energy Absorber

    NASA Technical Reports Server (NTRS)

    Jackson, Karen E.; Annett, Martin S.; Fasanella, Edwin L.; Polanco, Michael A.

    2012-01-01

    A study was conducted to evaluate four different material models in predicting the dynamic crushing response of solid-element-based models of a composite honeycomb energy absorber, designated the Deployable Energy Absorber (DEA). Dynamic crush tests of three DEA components were simulated using the nonlinear, explicit transient dynamic code LS-DYNA. In addition, a full-scale crash test of an MD-500 helicopter, retrofitted with DEA blocks, was simulated. The four material models used to represent the DEA were: *MAT_CRUSHABLE_FOAM (Mat 63), *MAT_HONEYCOMB (Mat 26), *MAT_SIMPLIFIED_RUBBER/FOAM (Mat 181), and *MAT_TRANSVERSELY_ANISOTROPIC_CRUSHABLE_FOAM (Mat 142). Test-analysis calibration metrics included simple percentage-error comparisons of initial peak acceleration, sustained crush stress, and peak compaction acceleration of the DEA components. In addition, the Roadside Safety Verification and Validation Program (RSVVP) was used to assess similarities and differences between the experimental and analytical curves for the full-scale crash test.
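
    The simple percentage-error comparisons mentioned above amount to a few lines of arithmetic; the pulse signals below are synthetic stand-ins, and RSVVP itself applies richer curve-shape metrics:

```python
import numpy as np

# Percentage-error comparison of crush-pulse metrics between test and
# simulation (synthetic signals; RSVVP uses additional shape metrics).
t = np.linspace(0.0, 0.05, 501)                                       # s
test = 40 * np.exp(-((t - 0.004) / 0.002) ** 2) + 25.0 * (t > 0.03)   # g
sim = 36 * np.exp(-((t - 0.0045) / 0.002) ** 2) + 27.0 * (t > 0.03)   # g

def pct_err(sim_val, test_val):
    return 100.0 * (sim_val - test_val) / test_val

print(f"initial peak acceleration error: {pct_err(sim.max(), test.max()):+.1f}%")
print(f"sustained (plateau) level error: {pct_err(sim[-1], test[-1]):+.1f}%")
```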

  14. EMC MODEL FORECAST VERIFICATION STATS

    Science.gov Websites

    Web-page navigation residue from an operational model forecast verification site: loops of 500-mb height and surface wind vector BIAS and RMSE statistics for 48-, 54-, 60-, 72-, and 84-hour forecasts, valid at 00Z and 12Z, over CONUS and coastal sub-regions (GMC, Gulf of Mexico Coast; SEC, Southeast Coast; NEC, Northeast Coast).
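
    For reference, the BIAS and RMSE statistics such verification pages report are computed from paired forecast/analysis values as follows (the numbers here are made up):

```python
import numpy as np

# Bias and RMSE as used in routine forecast verification (made-up values).
forecast = np.array([5520.0, 5545.0, 5580.0, 5610.0])  # e.g., 500-mb heights, m
analysis = np.array([5512.0, 5550.0, 5571.0, 5603.0])

error = forecast - analysis
bias = error.mean()
rmse = np.sqrt(np.mean(error ** 2))
print(f"BIAS = {bias:+.1f} m, RMSE = {rmse:.1f} m")
```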

  15. Experimental verification of radial magnetic levitation force on the cylindrical magnets in ferrofluid dampers

    NASA Astrophysics Data System (ADS)

    Yang, Wenming; Wang, Pengkai; Hao, Ruican; Ma, Buchuan

    2017-03-01

    Analytical and numerical methods for calculating the radial magnetic levitation force on cylindrical magnets in cylindrical vessels filled with ferrofluid were reviewed. An experimental apparatus to measure this force was designed and built; it measures forces in the range 0-2.0 N with an accuracy of 0.001 N. After calibration, the apparatus was used to study the radial magnetic levitation force experimentally. The results showed that the numerical method overestimates this force, while the analytical methods underestimate it. The maximum deviation between the numerical and experimental results was 18.5%, while that between the experimental and analytical results reached 68.5%; the latter deviation narrowed as the magnets lengthened. With the aid of the experimental verification of the radial magnetic levitation force, the effect of the eccentric distance of the magnets on the viscous energy dissipation in ferrofluid dampers could be assessed. It was shown that ignoring the eccentricity of the magnets in the estimate can overestimate the viscous dissipation in ferrofluid dampers.

  16. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions are examined.

  17. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear; even simplifications of the equations describing nature usually end up as nonlinear partial differential equations. The advection-diffusion-reaction (ADR) transport equation is a pivotal equation in atmospheric sciences and water quality. This nonlinear equation must be solved numerically for practical purposes, so academics and engineers rely heavily on numerical codes, and such codes require verification before they are used for applications in science and engineering. Model verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is solved as described in the design document. CFD verification is not a straightforward, well-defined process; only a complete test suite can uncover all the limitations and bugs, and results need to be assessed to distinguish between bug-induced defects and the innate limitations of a numerical scheme. As Roache (2009) noted, numerical verification is a state-of-the-art procedure, and sometimes novel tricks work out. This study conveys a synopsis of the experience gained during a comprehensive verification process for a transport solver. A test suite was designed including unit tests and algorithmic tests, layered in complexity along several dimensions from simple to complex. Acceptance criteria were defined for the desirable capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, a mesh convergence study, the core craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered and a new solution was derived; in the more general cases, the lack of an analytical solution was overcome through Richardson extrapolation and manufactured solutions. Two bugs that had been concealed during the mesh convergence study were then uncovered by the method of false injection and by visualization of the results. Symmetry played a dual role: one bug was hidden by the symmetric nature of a test (it was detected afterward using artificial false injection), while self-symmetry was used to design a new test for a case in which the analytical solution of the ADR equation was unknown. Assisting subroutines were designed to check and post-process mass conservation and oscillatory behavior. Finally, the solver's capability to handle a stiff reaction source term was also checked. The test suite was not only an effective error-detection tool but also provided thorough feedback on the ADR solver's limitations. Such information is the crux of rigorous numerical modeling for a modeler who deals with surface/subsurface pollution transport.
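
    A minimal sketch of the mesh convergence bookkeeping described above: the observed order of accuracy follows from error norms on successively refined grids, and Richardson extrapolation gives a reference value when no analytical solution exists. All numbers below are hypothetical:

```python
import numpy as np

# Observed order of accuracy from a grid-refinement (mesh convergence) study.
# Errors are hypothetical L2 norms against an analytical (or manufactured)
# solution on grids refined by a constant factor r.
h = np.array([0.08, 0.04, 0.02, 0.01])            # grid spacings
err = np.array([3.1e-3, 7.9e-4, 2.0e-4, 5.1e-5])  # corresponding error norms

r = h[:-1] / h[1:]                                # refinement ratios
p_obs = np.log(err[:-1] / err[1:]) / np.log(r)    # observed order between levels
print("observed order per refinement:", np.round(p_obs, 2))

# Richardson extrapolation of the exact value when no analytical solution exists:
# f_exact ~= f_fine + (f_fine - f_coarse) / (r**p - 1)
f_coarse, f_fine, p = 1.0421, 1.0406, 2.0         # hypothetical functionals
f_extrap = f_fine + (f_fine - f_coarse) / (r[-1] ** p - 1.0)
print(f"Richardson-extrapolated value: {f_extrap:.5f}")
```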

  18. 49 CFR 236.1017 - Independent third party verification and validation.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... validation. 236.1017 Section 236.1017 Transportation Other Regulations Relating to Transportation (Continued... validation. (a) The PTCSP must be supported by an independent third-party assessment when the Associate... request should include supporting information identified in paragraph (c) of this section. FRA may request...

  19. Analysis of QA procedures at the Oregon Department of Transportation.

    DOT National Transportation Integrated Search

    2010-06-01

    This research explored the Oregon Department of Transportation (ODOT) practice of Independent Assurance (IA), for validation of the contractor's test methods, and Verification, for validation of the contractor's Quality Control (QC) data. The...

  20. vvtools v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Drake, Richard R.

    Vvtools is a suite of testing tools, with a focus on reproducible verification and validation. They are written in pure Python, and contain a test harness and an automated process management tool. Users of vvtools can develop suites of verification and validation tests and run them on small to large high performance computing resources in an automated and reproducible way. The test harness enables complex processes to be performed in each test and even supports a one-level parent/child dependency between tests. It includes a built-in capability to manage workloads requiring multiple processors and platforms that use batch queueing systems.
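
    As a sketch of the idea only (this is not the vvtools API, and run_case.py is a hypothetical driver script), a reproducible test harness boils down to running each case, recording outcomes and timings, and writing a machine-readable summary:

```python
import json
import pathlib
import subprocess
import time

# Minimal, hypothetical test-harness sketch for a reproducible V&V suite.
# NOT the vvtools API; "run_case.py" is an assumed per-case driver script.
TESTS = [
    {"name": "advection_sine_wave", "cmd": ["python", "run_case.py", "sine"]},
    {"name": "diffusion_gaussian",  "cmd": ["python", "run_case.py", "gauss"]},
]

def run_suite(outdir="vv_results"):
    out = pathlib.Path(outdir)
    out.mkdir(exist_ok=True)
    results = []
    for test in TESTS:
        start = time.time()
        proc = subprocess.run(test["cmd"], capture_output=True, text=True)
        results.append({"name": test["name"],
                        "returncode": proc.returncode,
                        "seconds": round(time.time() - start, 2),
                        "stdout_tail": proc.stdout[-200:],
                        "passed": proc.returncode == 0})
    (out / "summary.json").write_text(json.dumps(results, indent=2))
    return all(r["passed"] for r in results)

if __name__ == "__main__":
    print("suite passed:", run_suite())
```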

  1. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1989-01-01

    The results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems are given. A translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis were developed. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  2. Expert system verification and validation survey, delivery 4

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  3. Performing Verification and Validation in Reuse-Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1999-01-01

    The implementation of reuse-based software engineering not only introduces new activities to the software development process, such as domain analysis and domain modeling, it also impacts other aspects of software engineering. Other areas of software engineering that are affected include Configuration Management, Testing, Quality Control, and Verification and Validation (V&V). Activities in each of these areas must be adapted to address the entire domain or product line rather than a specific application system. This paper discusses changes and enhancements to the V&V process, in order to adapt V&V to reuse-based software engineering.

  4. Expert system verification and validation survey. Delivery 2: Survey results

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and industry applications. This is the first task of the series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  5. Expert system verification and validation survey. Delivery 5: Revised

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  6. Expert system verification and validation survey. Delivery 3: Recommendations

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to determine the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of ESs.

  7. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the service platform for Orion spacecraft processing. To the left are several pneumatic panels. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  8. OpenMP 4.5 Validation and Verification Suite

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pophale, Swaroop S; Bernholdt, David E; Hernandez, Oscar R

    2017-12-15

    OpenMP, a directive-based programming API, introduces directives for accelerator devices that programmers are starting to use more frequently in production codes. To make sure OpenMP directives work correctly across architectures, it is critical to have a mechanism that tests an implementation's conformance to the OpenMP standard. This testing process can also uncover ambiguities in the OpenMP specification, which helps compiler developers and users make better use of the standard. We fill this gap with a validation and verification test suite that focuses on the offload directives available in OpenMP 4.5.

  9. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact on aviation safety risk of compositional verification research conducted by the National Aeronautics and Space Administration's System-Wide Safety and Assurance Technologies project was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing at the prototype stage, where error discovery can be quite costly, and the inability to test for all potential interactions, which leaves some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" approach to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  10. Empirical testing of an analytical model predicting electrical isolation of photovoltaic modules

    NASA Astrophysics Data System (ADS)

    Garcia, A., III; Minning, C. P.; Cuddihy, E. F.

    A major design requirement for photovoltaic modules is that the encapsulation system be capable of withstanding large DC potentials without electrical breakdown. Presented is a simple analytical model which can be used to estimate material thickness to meet this requirement for a candidate encapsulation system or to predict the breakdown voltage of an existing module design. A series of electrical tests to verify the model are described in detail. The results of these verification tests confirmed the utility of the analytical model for preliminary design of photovoltaic modules.
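
    A preliminary-design calculation of the kind such an analytical model supports can be as simple as relating encapsulant thickness, dielectric strength, and a safety factor; the numbers below are placeholders, not the model or data from the paper:

```python
# Rough preliminary-design estimate (placeholder numbers, not the paper's model):
# required encapsulant thickness for a target DC standoff voltage, given an
# assumed dielectric strength and a safety factor.
dielectric_strength = 20.0e6   # V/m, assumed encapsulant property
system_voltage = 1500.0        # V DC, target standoff
safety_factor = 3.0

t_required = safety_factor * system_voltage / dielectric_strength
breakdown_voltage = dielectric_strength * t_required
print(f"required thickness: {t_required * 1e3:.2f} mm "
      f"(nominal breakdown {breakdown_voltage:.0f} V)")
```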

  11. Control Chart on Semi Analytical Weighting

    NASA Astrophysics Data System (ADS)

    Miranda, G. S.; Oliveira, C. C.; Silva, T. B. S. C.; Stellato, T. B.; Monteiro, L. R.; Marques, J. R.; Faustino, M. G.; Soares, S. M. V.; Ulrich, J. C.; Pires, M. A. F.; Cotrim, M. E. B.

    2018-03-01

    Semi-analytical balance verification assesses balance performance using graphs that illustrate measurement dispersion through time and demonstrate that measurements were performed in a reliable manner. This study presents the internal quality control of a semi-analytical balance (GEHAKA BG400) using control charts. From 2013 to 2016, two weight standards were monitored before any balance operation. This work evaluated whether any significant difference or bias appeared in the weighing procedure over time, in order to check the reliability of the generated data. It also illustrates how control intervals are established.
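
    A minimal sketch of a Shewhart-style individuals chart for a balance check weight, assuming control limits at the mean plus or minus three standard deviations; the readings are hypothetical, and the cited work establishes its own control intervals from its historical data:

```python
import numpy as np

# Shewhart-style control limits for daily checks of a 100 g standard weight
# on an analytical balance (hypothetical readings).
readings = np.array([100.0003, 100.0001, 99.9998, 100.0002, 100.0004,
                     99.9999, 100.0000, 100.0003, 99.9997, 100.0002])  # g

mean = readings.mean()
sd = readings.std(ddof=1)
ucl, lcl = mean + 3 * sd, mean - 3 * sd
print(f"center line {mean:.5f} g, UCL {ucl:.5f} g, LCL {lcl:.5f} g")

new_reading = 100.0009
status = "in control" if lcl <= new_reading <= ucl else "OUT OF CONTROL"
print(f"new reading {new_reading:.4f} g is {status}")
```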

  12. Verification of MCNP6.2 for Nuclear Criticality Safety Applications

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Forrest B.; Rising, Michael Evan; Alwin, Jennifer Louise

    2017-05-10

    Several suites of verification/validation benchmark problems were run in early 2017 to verify that the new production release of MCNP6.2 performs correctly for nuclear criticality safety (NCS) applications. MCNP6.2 results for several NCS validation suites were compared to the results from MCNP6.1 [1] and MCNP6.1.1 [2]. MCNP6.1 is the production version of MCNP® released in 2013, and MCNP6.1.1 is the update released in 2014. MCNP6.2 includes all of the standard features for NCS calculations that have been available for the past 15 years, along with new features for sensitivity-uncertainty based methods for NCS validation [3]. Results from the benchmark suites were compared with results from previous verification testing [4-8]. Criticality safety analysts should consider testing MCNP6.2 on their particular problems and validation suites. No further development of MCNP5 is planned. MCNP6.1 is now 4 years old, and MCNP6.1.1 is now 3 years old. In general, released versions of MCNP are supported only for about 5 years, due to resource limitations. All future MCNP improvements, bug fixes, user support, and new capabilities are targeted only to MCNP6.2 and beyond.

  13. Remote Sensing Product Verification and Validation at the NASA Stennis Space Center

    NASA Technical Reports Server (NTRS)

    Stanley, Thomas M.

    2005-01-01

    Remote sensing data product verification and validation (V&V) is critical to successful science research and applications development. People who use remote sensing products to make policy, economic, or scientific decisions require confidence in and an understanding of the products' characteristics to make informed decisions about the products' use. NASA data products of coarse to moderate spatial resolution are validated by NASA science teams. NASA's Stennis Space Center (SSC) serves as the science validation team lead for validating commercial data products of moderate to high spatial resolution. At SSC, the Applications Research Toolbox simulates sensors and targets, and the Instrument Validation Laboratory validates critical sensors. The SSC V&V Site consists of radiometric tarps, a network of ground control points, a water surface temperature sensor, an atmospheric measurement system, painted concrete radial target and edge targets, and other instrumentation. NASA's Applied Sciences Directorate participates in the Joint Agency Commercial Imagery Evaluation (JACIE) team formed by NASA, the U.S. Geological Survey, and the National Geospatial-Intelligence Agency to characterize commercial systems and imagery.

  14. IEEE/NASA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation

    NASA Technical Reports Server (NTRS)

    Margaria, Tiziana (Editor); Steffen, Bernhard (Editor); Hichey, Michael G.

    2005-01-01

    This volume contains the Preliminary Proceedings of the 2005 IEEE ISoLA Workshop on Leveraging Applications of Formal Methods, Verification, and Validation, with a special track on the theme of Formal Methods in Human and Robotic Space Exploration. The workshop was held on 23-24 September 2005 at the Loyola College Graduate Center, Columbia, MD, USA. The idea behind the Workshop arose from the experience and feedback of ISoLA 2004, the 1st International Symposium on Leveraging Applications of Formal Methods held in Paphos (Cyprus) last October-November. ISoLA 2004 served the need of providing a forum for developers, users, and researchers to discuss issues related to the adoption and use of rigorous tools and methods for the specification, analysis, verification, certification, construction, test, and maintenance of systems from the point of view of their different application domains.

  15. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  16. A new technique for measuring listening and reading literacy in developing countries

    NASA Astrophysics Data System (ADS)

    Greene, Barbara A.; Royer, James M.; Anzalone, Stephen

    1990-03-01

    One problem in evaluating educational interventions in developing countries is the absence of tests that adequately reflect the culture and curriculum. The Sentence Verification Technique is a new procedure for measuring reading and listening comprehension that allows for the development of tests based on materials indigenous to a given culture. The validity of using the Sentence Verification Technique to measure reading comprehension in Grenada was evaluated in the present study. The study involved 786 students at standards 3, 4 and 5. The tests for each standard consisted of passages that varied in difficulty. The students identified as high ability students in all three standards performed better than those identified as low ability. All students performed better with easier passages. Additionally, students in higher standards performed better than students in lower standards on a given passage. These results supported the claim that the Sentence Verification Technique is a valid measure of reading comprehension in Grenada.

  17. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. The tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus on increasing the number of software test tools and on a cost-effectiveness assessment.

  18. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will never be able to replace traditional arms control verification measures, it does supply unique signatures that can augment existing analysis.

  19. Impact of radiation attenuation by a carbon fiber couch on patient dose verification

    NASA Astrophysics Data System (ADS)

    Yu, Chun-Yen; Chou, Wen-Tsae; Liao, Yi-Jen; Lee, Jeng-Hung; Liang, Ji-An; Hsu, Shih-Ming

    2017-02-01

    The aim of this study was to understand the difference between the measured and calculated irradiation attenuations obtained using two algorithms and to identify the influence of couch attenuation on patient dose verification. We performed eight tests of couch attenuation with two photon energies, two longitudinal couch positions, and two rail positions. The couch attenuation was determined using a radiation treatment planning system. The measured and calculated attenuations were compared. We also performed 12 verifications of head-and-neck and rectum cases by using a Delta phantom. The dose deviation (DD), distance to agreement (DTA), and gamma index of pencil-beam convolution (PBC) verifications were nearly the same. The agreement was least consistent for the anisotropic analytical algorithm (AAA) without the couch for the head-and-neck case, in which the DD, DTA, and gamma index were 74.4%, 99.3%, and 89%, respectively; for the rectum case, the corresponding values were 56.2%, 95.1%, and 92.4%. We suggest that dose verification should be performed using the following three metrics simultaneously: DD, DTA, and the gamma index.
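
    For orientation, a simplified one-dimensional gamma-index evaluation combining dose-difference (DD) and distance-to-agreement (DTA) criteria is sketched below; clinical systems such as the Delta phantom software perform the same comparison on full 3-D dose distributions, and the profiles here are synthetic:

```python
import numpy as np

# Simplified 1-D gamma-index evaluation combining dose difference (DD) and
# distance-to-agreement (DTA) criteria (synthetic profiles, 3%/3 mm criteria).
x = np.linspace(0.0, 100.0, 201)                      # position, mm
reference = 2.0 * np.exp(-((x - 50.0) / 20.0) ** 2)   # Gy, "measured"
evaluated = 2.04 * np.exp(-((x - 51.0) / 20.0) ** 2)  # Gy, "calculated"

dd_crit = 0.03 * reference.max()   # 3% of maximum dose
dta_crit = 3.0                     # mm

gamma = np.empty_like(x)
for i, (xr, dr) in enumerate(zip(x, reference)):
    g2 = ((x - xr) / dta_crit) ** 2 + ((evaluated - dr) / dd_crit) ** 2
    gamma[i] = np.sqrt(g2.min())

pass_rate = 100.0 * np.mean(gamma <= 1.0)
print(f"gamma pass rate (3%/3 mm): {pass_rate:.1f}%")
```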

  20. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is concerned with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  1. Verification of the Multi-Axial, Temperature and Time Dependent (MATT) Failure Criterion

    NASA Technical Reports Server (NTRS)

    Richardson, David E.; Macon, David J.

    2005-01-01

    An extensive test and analytical effort has been completed by the Space Shuttle's Reusable Solid Rocket Motor (RSRM) nozzle program to characterize the failure behavior of two epoxy adhesives (TIGA 321 and EA946). As part of this effort, a general failure model, the "Multi-Axial, Temperature, and Time Dependent" or MATT failure criterion, was developed. In the initial development of this failure criterion, tests were conducted to provide validation of the theory under a wide range of test conditions. The purpose of this paper is to present additional verification of the MATT failure criterion under new loading conditions for the adhesives TIGA 321 and EA946. In many cases, the loading conditions involve an extrapolation from the conditions under which the material models were originally developed. Testing was conducted using three loading conditions: multi-axial tension, torsional shear, and non-uniform tension in a bondline configuration. Tests were conducted at constant and cyclic loading rates ranging over four orders of magnitude, under environmental conditions of primary interest to the RSRM program. The temperature range was not extreme, but the loading rates were, varying by four orders of magnitude. It should be noted that the testing of the TIGA 321 adhesive was conducted at temperatures below its glass transition temperature, whereas the EA946 testing was conducted at temperatures that bracketed the glass transition temperature.

  2. Numerical verification of bounce-harmonic resonances in neoclassical toroidal viscosity for tokamaks.

    PubMed

    Kim, Kimin; Park, Jong-Kyu; Boozer, Allen H

    2013-05-03

    This Letter presents the first numerical verification of the bounce-harmonic (BH) resonance phenomena of neoclassical transport in a tokamak perturbed by nonaxisymmetric magnetic fields. The BH resonances were predicted by analytic theories of neoclassical toroidal viscosity (NTV): the parallel and perpendicular drift motions can be resonant, resulting in a great enhancement of the radial momentum transport. A new drift-kinetic δf guiding-center particle code, POCA, clearly verified that the perpendicular drift motions can reduce the transport by phase mixing, but in the BH resonances the motions can form closed orbits and particles drift out radially fast. The POCA calculations of the resulting NTV torque are largely consistent with analytic calculations, and show that the BH resonances can easily dominate the NTV torque when a plasma rotates in the perturbed tokamak; this is therefore critical physics for predicting the rotation and stability in the International Thermonuclear Experimental Reactor.

  3. A new sensitive method of dissociation constants determination based on the isohydric solutions principle.

    PubMed

    Michałowski, Tadeusz; Pilarski, Bogusław; Asuero, Agustin G; Dobkowska, Agnieszka

    2010-10-15

    The paper provides a new formulation and analytical proposals based on the isohydric solutions concept. In particular, it is shown that a mixture formed, according to the titrimetric mode, from a weak acid (HX, C0 mol/L) and a strong acid (HB, C mol/L) solution assumes a constant pH, independently of the volumes of the solutions mixed, provided that the relation C0 = C + C^2·10^(pK1) holds, where pK1 = -log K1 and K1 is the dissociation constant of HX. The generalized formulation, referred to the isohydric solutions thus obtained, was extended also to more complex acid-base systems. In particular, in the (HX, HB) system the titration occurs at a constant ionic strength (I), not resulting from the presence of a basal electrolyte. This very advantageous conjunction of properties provides, among others, a new, very sensitive method for verification of the pK1 value, which is particularly useful for weak acids HX characterized by low pK1 values. The method was tested experimentally on four acid-base systems (HX, HB), in aqueous and mixed-solvent media, and compared with literature data. Some useful (linear and hyperbolic) correlations were stated and applied for validation of pK1 values. Finally, some practical applications of analytical interest of the isohydricity (pH constancy) principle as formulated in this paper were enumerated, proving the usefulness of a property which has its remote roots in the Arrhenius concept. Copyright © 2010 Elsevier B.V. All rights reserved.
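
    The stated relation can be inverted directly: given C0 and pK1, the isohydric strong-acid concentration C follows from a quadratic, and the weak-acid [H+] computed from its dissociation equilibrium reproduces the same pH (water autoprotolysis neglected, as in the relation itself; the values below are illustrative):

```python
import math

# Isohydricity relation C0 = C + C**2 * 10**pK1: solve the quadratic for the
# strong-acid concentration C and back-check that HX at C0 gives the same pH.
def isohydric_strong_acid_conc(c0, pk1):
    k_inv = 10.0 ** pk1                       # 1/K1
    return (-1.0 + math.sqrt(1.0 + 4.0 * k_inv * c0)) / (2.0 * k_inv)

c0, pk1 = 0.05, 2.85                          # illustrative weak acid
c = isohydric_strong_acid_conc(c0, pk1)

# [H+] of the weak acid from its dissociation equilibrium:
k1 = 10.0 ** (-pk1)
h_weak = (-k1 + math.sqrt(k1 ** 2 + 4.0 * k1 * c0)) / 2.0
print(f"C = {c:.5f} mol/L, pH(HB) = {-math.log10(c):.3f}, "
      f"pH(HX) = {-math.log10(h_weak):.3f}")
```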

  4. Experimental verification of a new laminar airfoil: A project for the graduate program in aeronautics

    NASA Technical Reports Server (NTRS)

    Nicks, Oran W.; Korkan, Kenneth D.

    1991-01-01

    Two reports on student activities to determine the properties of a new laminar airfoil, delivered at a conference on soaring technology, are presented. The papers discuss a wind tunnel investigation and analysis of the SM701 airfoil and verification of the SM701 airfoil aerodynamic characteristics using theoretical techniques. The papers are based on a combination of analytical design, hands-on model fabrication, wind tunnel calibration and testing, data acquisition and analysis, and comparison of test results and theory.

  5. Comparative study of the swabbing properties of seven commercially available swab materials for cleaning verification.

    PubMed

    Corrigan, Damion K; Piletsky, Sergey; McCrossen, Sean

    2009-01-01

    This article compares the technical performances of several different commercially available swabbing materials for the purpose of cleaning verification. A steel surface was soiled with solutions of acetaminophen, nicotinic acid, diclofenac, and benzamidine and wiped with each swabbing material. The compounds were extracted with water or ethanol (depending on polarity of analyte) and their concentration in extract was quantified spectrophotometrically. The study also investigated swab debris on the wiped surface. The swab performances were compared and the best swab material was identified.

  6. Carbon-carbon primary structure for SSTO vehicles

    NASA Astrophysics Data System (ADS)

    Croop, Harold C.; Lowndes, Holland B.

    1997-01-01

    A hot structures development program is nearing completion to validate use of carbon-carbon composite structure for primary load carrying members in a single-stage-to-orbit, or SSTO, vehicle. A four phase program was pursued which involved design development and fabrication of a full-scale wing torque box demonstration component. The design development included vehicle and component selection, design criteria and approach, design data development, demonstration component design and analysis, test fixture design and analysis, demonstration component test planning, and high temperature test instrumentation development. The fabrication effort encompassed fabrication of structural elements for mechanical property verification as well as fabrication of the demonstration component itself and associated test fixturing. The demonstration component features 3D woven graphite preforms, integral spars, oxidation inhibited matrix, chemical vapor deposited (CVD) SiC oxidation protection coating, and ceramic matrix composite fasteners. The demonstration component has been delivered to the United States Air Force (USAF) for testing in the Wright Laboratory Structural Test Facility, WPAFB, OH. Multiple thermal-mechanical load cycles will be applied simulating two atmospheric cruise missions and one orbital mission. This paper discusses the overall approach to validation testing of the wing box component and presents some preliminary analytical test predictions.

  7. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on NASA's F-15B aircraft, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. For future flight data from this platform to be valid, more information must be gathered concerning the quality of the airflow underneath the body of the F-15B at various flight conditions, especially supersonic conditions. The flow angularity and Mach number must be known at multiple locations on any test article interface plane for measurement data at these locations to be valid. To determine this prerequisite information, flight data will be gathered in the Rake Airflow Gauge Experiment using a custom-designed flowfield rake to probe the airflow underneath the F-15B at the desired flight conditions. This paper addresses the design considerations of the rake and probe assembly, including the loads and stress analysis using analytical methods, computational fluid dynamics, and finite element analysis. It also details the flow calibration procedure, including the completed wind-tunnel test and posttest data reduction, calibration verification, and preparation for flight-testing.

  8. Multiple imputation as one tool to provide longitudinal databases for modelling human height and weight development.

    PubMed

    Aßmann, C

    2016-06-01

    Besides large efforts regarding field work, provision of valid databases requires statistical and informational infrastructure to enable long-term access to longitudinal data sets on height, weight and related issues. To foster use of longitudinal data sets within the scientific community, provision of valid databases has to address data-protection regulations. It is, therefore, of major importance to hinder identifiability of individuals from publicly available databases. To reach this goal, one possible strategy is to provide a synthetic database to the public allowing for pretesting strategies for data analysis. The synthetic databases can be established using multiple imputation tools. Given the approval of the strategy, verification is based on the original data. Multiple imputation by chained equations is illustrated to facilitate provision of synthetic databases as it allows for capturing a wide range of statistical interdependencies. Also missing values, typically occurring within longitudinal databases for reasons of item non-response, can be addressed via multiple imputation when providing databases. The provision of synthetic databases using multiple imputation techniques is one possible strategy to ensure data protection, increase visibility of longitudinal databases and enhance the analytical potential.
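
    A miniature example of multiple imputation by chained equations, using scikit-learn's IterativeImputer as a stand-in for the much richer infrastructure described above (the data are synthetic):

```python
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

# Multiple imputation by chained equations in miniature. Columns:
# height (cm), weight (kg), age (years); NaN marks item non-response.
data = np.array([[172.0, 70.0, 30.0],
                 [165.0, np.nan, 45.0],
                 [np.nan, 85.0, 52.0],
                 [180.0, 92.0, np.nan],
                 [158.0, 55.0, 23.0],
                 [175.0, np.nan, 38.0]])

# Draw several completed data sets by varying the imputer's random state,
# then combine the analysis across them (here: just the mean weight).
mean_weights = []
for m in range(5):
    imp = IterativeImputer(sample_posterior=True, random_state=m, max_iter=20)
    completed = imp.fit_transform(data)
    mean_weights.append(completed[:, 1].mean())

print("pooled mean weight:", round(float(np.mean(mean_weights)), 1), "kg")
print("between-imputation SD:", round(float(np.std(mean_weights, ddof=1)), 2))
```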

  9. LIVVkit 2: An extensible land ice verification and validation toolkit for comparing observations and models

    NASA Astrophysics Data System (ADS)

    Kennedy, J. H.; Bennett, A. R.; Evans, K. J.; Fyke, J. G.; Vargo, L.; Price, S. F.; Hoffman, M. J.

    2016-12-01

    Accurate representation of ice sheets and glaciers is essential for robust predictions of Arctic climate within Earth System models. Verification and Validation (V&V) is a set of techniques used to quantify the correctness and accuracy of a model, which builds developer/modeler confidence and can be used to enhance the credibility of the model. Fundamentally, V&V is a continuous process because each model change requires a new round of V&V testing. The Community Ice Sheet Model (CISM) development community is actively developing LIVVkit, the Land Ice Verification and Validation toolkit, which is designed to integrate easily into an ice-sheet model's development workflow (on both personal and high-performance computers) to provide continuous V&V testing. LIVVkit is a robust and extensible Python package for V&V, which has components for both software V&V (construction and use) and model V&V (mathematics and physics). The model verification component is used, for example, to verify model results against community intercomparisons such as ISMIP-HOM. The model validation component is used, for example, to generate a series of diagnostic plots showing the differences between model results and observations for variables such as thickness, surface elevation, basal topography, surface velocity, and surface mass balance. Because many different ice-sheet models are under active development, new validation datasets are becoming available, and new methods of analysing these models are actively being researched, LIVVkit includes a framework that lets ice-sheet modelers easily extend the model V&V analyses. This allows modelers and developers to develop evaluations of parameters, implement changes, and quickly see how those changes affect the ice-sheet model and the Earth system model (when coupled). Furthermore, LIVVkit outputs a portable hierarchical website, allowing evaluations to be easily shared, published, and analysed throughout the Arctic and Earth system communities.
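
    The flavor of a model-versus-observation check that such a validation component automates can be shown in a few lines (this is not LIVVkit's API; the fields are synthetic):

```python
import numpy as np

# Model-vs-observation validation statistics on a gridded ice-thickness field
# (synthetic data standing in for model output and an observational dataset).
ny, nx = 40, 60
rng = np.random.default_rng(1)
observed_thickness = 1500.0 + 300.0 * rng.random((ny, nx))                 # m
modeled_thickness = observed_thickness + rng.normal(0.0, 50.0, (ny, nx)) - 20.0

diff = modeled_thickness - observed_thickness
print(f"mean bias:  {diff.mean():+.1f} m")
print(f"RMSE:       {np.sqrt((diff ** 2).mean()):.1f} m")
print(f"max |diff|: {np.abs(diff).max():.1f} m")
```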

  10. Quantitative determination and sampling of azathioprine residues for cleaning validation in production area.

    PubMed

    Fazio, Tatiana Tatit; Singh, Anil Kumar; Kedor-Hackmann, Erika Rosa Maria; Santoro, Maria Inês Rocha Miritello

    2007-03-12

    Cleaning validation is an integral part of current good manufacturing practices in any pharmaceutical industry. Nowadays, azathioprine and several other pharmacologically potent pharmaceuticals are manufactured in the same production area. A carefully designed cleaning validation and its evaluation can ensure that residues of azathioprine will not carry over and cross-contaminate the subsequent product. The aim of this study was to validate a simple analytical method for verification of residual azathioprine on equipment used in the production area and to confirm the efficiency of the cleaning procedure. The HPLC method was validated on an LC system using a Nova-Pak C18 column (3.9 mm x 150 mm, 4 microm) and methanol-water-acetic acid (20:80:1, v/v/v) as mobile phase at a flow rate of 1.0 mL min(-1). UV detection was made at 280 nm. The calibration curve was linear over a concentration range from 2.0 to 22.0 microg mL(-1) with a correlation coefficient of 0.9998. The detection limit (DL) and quantitation limit (QL) were 0.09 and 0.29 microg mL(-1), respectively. The intra-day and inter-day precision expressed as relative standard deviation (R.S.D.) were below 2.0%. The mean recovery of the method was 99.19%. The mean extraction recovery from manufacturing equipment was 83.5%. The developed UV spectrophotometric method could only be used as a limit method to qualify or reject the cleaning procedure in the production area. Nevertheless, the simplicity of the spectrophotometric method makes it useful for routine analysis of azathioprine residues on cleaned surfaces and as an alternative to the proposed HPLC method.
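
    As a side note on how detection and quantitation limits of this kind are commonly estimated (the ICH-style formulas DL = 3.3·sigma/S and QL = 10·sigma/S, with sigma the residual standard deviation of the calibration and S its slope), here is a sketch with hypothetical calibration points, not the paper's raw data:

```python
import numpy as np

# ICH-style detection and quantitation limits from a linear calibration:
# DL = 3.3*sigma/S and QL = 10*sigma/S. Calibration points are hypothetical
# (the paper's reported range was 2.0-22.0 ug/mL).
conc = np.array([2.0, 6.0, 10.0, 14.0, 18.0, 22.0])            # ug/mL
response = np.array([0.112, 0.334, 0.561, 0.779, 1.003, 1.228])  # detector response

slope, intercept = np.polyfit(conc, response, 1)
residuals = response - (slope * conc + intercept)
sigma = np.sqrt(np.sum(residuals ** 2) / (len(conc) - 2))

dl = 3.3 * sigma / slope
ql = 10.0 * sigma / slope
print(f"slope {slope:.4f} per (ug/mL), DL {dl:.2f} ug/mL, QL {ql:.2f} ug/mL")
```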

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - 4100 VAPOR DETECTOR - ELECTRONIC SENSOR TECHNOLOGY

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency conducted a demonstration of polychlorinated biphenyl (PCB) FIELD ANALYTICAL TECHNIQUES. The demonstration design was subjected to extensive review and comment by EPA's National Exposure Research Laboratory (NERL) Environmen...

  12. 2007 Beyond SBIR Phase II: Bringing Technology Edge to the Warfighter

    DTIC Science & Technology

    2007-08-23

    Topics include: systems trade-off analysis and optimization; verification and validation; on-board diagnostics and self-healing (model-based monitoring, autonomic computing, network intrusion detection and prevention); safety and reliability analysis of flight- and mission-critical systems; and security, anti-tampering, and trust.

  13. Applying Formal Verification Techniques to Ambient Assisted Living Systems

    NASA Astrophysics Data System (ADS)

    Benghazi, Kawtar; Visitación Hurtado, María; Rodríguez, María Luisa; Noguera, Manuel

    This paper presents a verification approach based on timed traces semantics and MEDISTAM-RT [1] to check the fulfillment of non-functional requirements, such as timeliness and safety, and to assure the correct functioning of Ambient Assisted Living (AAL) systems. We validate this approach by applying it to an Emergency Assistance System for monitoring people suffering from cardiac alterations with syncope.

  14. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    This user guide for the verification and validation (V&V) tools of the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. The current descriptions of the AED V&V tools are also included, providing information that augments NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  15. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

    This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  16. Expert system verification and validation study. Phase 2: Requirements identification. Delivery 1: Updated survey report

    NASA Technical Reports Server (NTRS)

    1990-01-01

    The purpose is to report the state-of-the-practice in Verification and Validation (V and V) of Expert Systems (ESs) on current NASA and Industry applications. This is the first task of a series which has the ultimate purpose of ensuring that adequate ES V and V tools and techniques are available for Space Station Knowledge Based Systems development. The strategy for determining the state-of-the-practice is to check how well each of the known ES V and V issues are being addressed and to what extent they have impacted the development of Expert Systems.

  17. JacketSE: An Offshore Wind Turbine Jacket Sizing Tool; Theory Manual and Sample Usage with Preliminary Validation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Damiani, Rick

    This manual summarizes the theory and preliminary verifications of the JacketSE module, an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparisons against industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.

  18. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity. The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED, together with biomechanics models, for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the analyses of the model in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  19. Managing Complexity in the MSL/Curiosity Entry, Descent, and Landing Flight Software and Avionics Verification and Validation Campaign

    NASA Technical Reports Server (NTRS)

    Stehura, Aaron; Rozek, Matthew

    2013-01-01

    The complexity of the Mars Science Laboratory (MSL) mission presented the Entry, Descent, and Landing systems engineering team with many challenges in its Verification and Validation (V&V) campaign. This paper describes some of the logistical hurdles related to managing a complex set of requirements, test venues, test objectives, and analysis products in the implementation of a specific portion of the overall V&V program to test the interaction of flight software with the MSL avionics suite. Application-specific solutions to these problems are presented herein, which can be generalized to other space missions and to similar formidable systems engineering problems.

  20. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameters sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
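
    As a rough illustration of the kind of automated input-variation harness described above, the Python sketch below varies one numerical or physical parameter at a time about its nominal value and records the model response. The function name run_model, the parameter names, and the ranges are hypothetical placeholders, not the actual software used in the RIP study.

        def one_at_a_time_sweep(run_model, nominal, ranges):
            """Vary one input at a time about its nominal setting and record the response.

            run_model: callable taking keyword inputs and returning a scalar quantity of interest.
            nominal:   dict of baseline input values.
            ranges:    dict mapping a parameter name to the list of values to try for it.
            """
            results = {}
            for name, values in ranges.items():
                for value in values:
                    inputs = dict(nominal)
                    inputs[name] = value          # perturb a single parameter
                    results[(name, value)] = run_model(**inputs)
            return results

        # Example with hypothetical parameters: sweep mesh size and a contact conductance
        # results = one_at_a_time_sweep(run_model,
        #                               nominal={"mesh_mm": 2.0, "h_contact": 500.0},
        #                               ranges={"mesh_mm": [4.0, 2.0, 1.0],
        #                                       "h_contact": [250.0, 500.0, 1000.0]})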

  1. Strategies for Validation Testing of Ground Systems

    NASA Technical Reports Server (NTRS)

    Annis, Tammy; Sowards, Stephanie

    2009-01-01

    In order to accomplish the full Vision for Space Exploration announced by former President George W. Bush in 2004, NASA will have to develop a new space transportation system and supporting infrastructure. The main portion of this supporting infrastructure will reside at the Kennedy Space Center (KSC) in Florida and will either be newly developed or be a modification of existing vehicle processing and launch facilities, including Ground Support Equipment (GSE). This type of large-scale launch site development is unprecedented since the time of the Apollo Program. In order to accomplish it successfully within the limited budget and schedule constraints, a combination of traditional and innovative strategies for Verification and Validation (V&V) has been developed. The core of these strategies consists of a building-block approach to V&V, starting with component V&V and ending with a comprehensive end-to-end validation test of the complete launch site, called a Ground Element Integration Test (GEIT). This paper outlines these strategies and provides the high-level planning for meeting the challenges of implementing V&V on a large-scale development program. KEY WORDS: Systems, Elements, Subsystem, Integration Test, Ground Systems, Ground Support Equipment, Component, End Item, Test and Verification Requirements (TVR), Verification Requirements (VR)

  2. Validation of an in-vivo proton beam range check method in an anthropomorphic pelvic phantom using dose measurements

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bentefour, El H., E-mail: hassan.bentefour@iba-group.com; Prieels, Damien; Tang, Shikui

    Purpose: In-vivo dosimetry and beam range verification in proton therapy could play a significant role in proton treatment validation and improvements. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which could be the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. This paper reports a validation study of an in-vivo range verification method which can reduce the range uncertainty to submillimeter levels and potentially allow for in-vivo dosimetry. Methods: An anthropomorphic pelvic phantom is used to validate the clinical potential of the time-resolved dose method for range verification in the case of prostate treatment using range-modulated anterior proton beams. The method uses a 3 × 4 matrix of 1 mm diodes mounted in a water balloon, read by an ADC system at 100 kHz. The method is first validated against beam range measurements obtained by dose extinction. The validation is first completed in a water phantom and then in the pelvic phantom for both open field and treatment field configurations. The beam range results are then compared with the water equivalent path length (WEPL) values computed from the treatment planning system XIO. Results: Beam range measurements from the time-resolved dose method and the dose extinction method agree with submillimeter precision in the water phantom. For the pelvic phantom, when discarding two of the diodes that show signs of significant range mixing, the two methods agree to within ±1 mm. A dose of only 7 mGy is sufficient to achieve this result. The comparison to the WEPL computed by the treatment planning system (XIO) shows that XIO underestimates the proton beam range. Quantifying the exact XIO range underestimation depends on the strategy used to evaluate the WEPL results; to our best evaluation, XIO underestimates the treatment beam range by between 1.7% and 4.1%. Conclusions: The time-resolved dose measurement method satisfies the two basic requirements, WEPL accuracy and minimum dose, necessary for clinical use, demonstrating its potential for in-vivo proton range verification. Further development is needed, namely devising a workflow that takes into account the limits imposed by proton range mixing and the susceptibility of the comparison of measured and expected WEPLs to errors in the detector positions. The method may also be used for in-vivo dosimetry and could benefit various proton therapy treatments.

  3. Validity of a self-administered food frequency questionnaire in the estimation of heterocyclic aromatic amines.

    PubMed

    Iwasaki, Motoki; Mukai, Tomomi; Takachi, Ribeka; Ishihara, Junko; Totsuka, Yukari; Tsugane, Shoichiro

    2014-08-01

    Clarification of the putative etiologic role of heterocyclic aromatic amines (HAAs) in the development of cancer requires a validated assessment tool for dietary HAAs. This study primarily aimed to evaluate the validity of a food frequency questionnaire (FFQ) in estimating HAA intake, using 2-amino-1-methyl-6-phenylimidazo[4,5-b]pyridine (PhIP) level in human hair as the reference method. We first updated analytical methods for PhIP using liquid chromatography-electrospray ionization/tandem mass spectrometry (LC-ESI/MS/MS) and measured 44 fur samples from nine rats from a feeding study as part-verification of the quantitative performance of LC-ESI/MS/MS. We next measured PhIP levels in human hair samples from a validation study of the FFQ (n = 65). HAA intake from the FFQ was estimated using information on intake of six fish items and seven meat items and data on HAA content in each food item. Correlation coefficients between PhIP level in human hair and HAA intake from the FFQ were calculated. The animal feeding study of PhIP found a significant dose-response relationship between dosage and PhIP in rat fur. The mean level was 53.8 pg/g hair among subjects with values over the limit of detection (LOD) (n = 57). We found significant positive correlations between PhIP in human hair and HAA intake from the FFQ, with Spearman rank correlation coefficients of 0.35 for all subjects, 0.21 for subjects with values over the LOD, and 0.34 for subjects with values over the limit of quantification. Findings from the validation study suggest that the FFQ is reasonably valid for the assessment of HAA intake.
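
    The validity metric reported here is a rank correlation between the hair biomarker and the questionnaire estimate. A minimal Python sketch of that calculation is given below; the array names are hypothetical, and scipy's spearmanr stands in for whatever statistical package the authors actually used.

        from scipy.stats import spearmanr

        def ffq_validity(hair_phip_pg_per_g, ffq_haa_intake):
            """Spearman rank correlation between PhIP in hair and FFQ-estimated HAA intake."""
            rho, p_value = spearmanr(hair_phip_pg_per_g, ffq_haa_intake)
            return rho, p_value

        # e.g. restrict to subjects above the limit of detection before calling:
        # mask = hair_phip_pg_per_g > lod
        # rho, p = ffq_validity(hair_phip_pg_per_g[mask], ffq_haa_intake[mask])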

  4. Solution of the advection-dispersion equation: Continuous load of finite duration

    USGS Publications Warehouse

    Runkel, R.L.

    1996-01-01

    Field studies of solute fate and transport in streams and rivers often involve an experimental release of solutes at an upstream boundary for a finite period of time. A review of several standard references on surface-water-quality modeling indicates that the analytical solution to the constant-parameter advection-dispersion equation for this type of boundary condition has been generally overlooked. Here an exact analytical solution that considers a continuous load of finite duration is compared to an approximate analytical solution presented elsewhere. Results indicate that the exact analytical solution should be used for verification of numerical solutions and other solute-transport problems wherein a high level of accuracy is required. © ASCE.
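
    For readers who want the shape of such a solution, a common closed form for the constant-parameter, semi-infinite case is the step-input (Ogata-Banks type) solution superposed with a negative step delayed by the load duration. This is a textbook illustration under those assumptions, not necessarily the exact expression derived in the paper:

        C_{step}(x,t) = \frac{C_0}{2}\left[\operatorname{erfc}\!\left(\frac{x-ut}{2\sqrt{Dt}}\right) + e^{ux/D}\operatorname{erfc}\!\left(\frac{x+ut}{2\sqrt{Dt}}\right)\right]

        C(x,t) = \begin{cases} C_{step}(x,t), & 0 < t \le \tau \\ C_{step}(x,t) - C_{step}(x,\,t-\tau), & t > \tau \end{cases}

    where u is the velocity, D the dispersion coefficient, C_0 the injected concentration, and \tau the duration of the load.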

  5. Verification of a magnetic island in gyro-kinetics by comparison with analytic theory

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zarzoso, D., E-mail: david.zarzoso-fernandez@polytechnique.org; Casson, F. J.; Poli, E.

    A rotating magnetic island is imposed in the gyrokinetic code GKW, when finite differences are used for the radial direction, in order to develop the predictions of analytic tearing mode theory and understand its limitations. The implementation is verified against analytics in sheared slab geometry with three numerical tests that are suggested as benchmark cases for every code that imposes a magnetic island. The convergence requirements to properly resolve physics around the island separatrix are investigated. In the slab geometry, at low magnetic shear, binormal flows inside the island can drive Kelvin-Helmholtz instabilities which prevent the formation of the steady state for which the analytic theory is formulated.

  6. Overview of open resources to support automated structure verification and elucidation

    EPA Science Inventory

    Cheminformatics methods form an essential basis for providing analytical scientists with access to data, algorithms and workflows. There are an increasing number of free online databases (compound databases, spectral libraries, data repositories) and a rich collection of software...

  7. EOS-AM precision pointing verification

    NASA Technical Reports Server (NTRS)

    Throckmorton, A.; Braknis, E.; Bolek, J.

    1993-01-01

    The Earth Observing System (EOS) AM mission requires tight pointing knowledge to meet scientific objectives, in a spacecraft with low-frequency flexible appendage modes. As the spacecraft controller reacts to various disturbance sources and as the inherent appendage modes are excited by this control action, verification of precision pointing knowledge becomes particularly challenging for the EOS-AM mission. As presently conceived, this verification includes a complementary set of multi-disciplinary analyses, hardware tests, and real-time computer-in-the-loop simulations, followed by collection and analysis of hardware test and flight data, supported by a comprehensive database repository of validated program values.

  8. Verification of Decision-Analytic Models for Health Economic Evaluations: An Overview.

    PubMed

    Dasbach, Erik J; Elbasha, Elamin H

    2017-07-01

    Decision-analytic models for cost-effectiveness analysis are developed in a variety of software packages where the accuracy of the computer code is seldom verified. Although modeling guidelines recommend using state-of-the-art quality assurance and control methods for software engineering to verify models, the fields of pharmacoeconomics and health technology assessment (HTA) have yet to establish and adopt guidance on how to verify health and economic models. The objective of this paper is to introduce to our field the variety of methods the software engineering field uses to verify that software performs as expected. We identify how many of these methods can be incorporated in the development process of decision-analytic models in order to reduce errors and increase transparency. Given the breadth of methods used in software engineering, we recommend a more in-depth initiative to be undertaken (e.g., by an ISPOR-SMDM Task Force) to define the best practices for model verification in our field and to accelerate adoption. Establishing a general guidance for verifying models will benefit the pharmacoeconomics and HTA communities by increasing accuracy of computer programming, transparency, accessibility, sharing, understandability, and trust of models.
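
    One concrete way to bring the software-engineering verification methods discussed here into a decision-analytic model is to codify model invariants as automated tests. The Python sketch below checks two basic properties of a cohort (Markov) model: each row of the transition matrix sums to one, and the cohort total is conserved over cycles. It is an illustrative example, not a practice recommended by the paper.

        import numpy as np

        def check_transition_matrix(P, tol=1e-9):
            """Transition probabilities must be non-negative and each row must sum to 1."""
            P = np.asarray(P, dtype=float)
            assert (P >= -tol).all(), "negative transition probability"
            assert np.allclose(P.sum(axis=1), 1.0, atol=tol), "rows must sum to 1"

        def check_cohort_conservation(P, cohort0, n_cycles=100, tol=1e-9):
            """The total cohort size should be unchanged after every cycle."""
            P = np.asarray(P, dtype=float)
            cohort = np.asarray(cohort0, dtype=float)
            total0 = cohort.sum()
            for _ in range(n_cycles):
                cohort = cohort @ P
                assert abs(cohort.sum() - total0) < tol, "cohort not conserved"

        # Example: three-state model (well, sick, dead), 1000 starting patients
        # P = [[0.9, 0.08, 0.02], [0.0, 0.85, 0.15], [0.0, 0.0, 1.0]]
        # check_transition_matrix(P); check_cohort_conservation(P, [1000, 0, 0])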

  9. Empirical insights and considerations for the OBT inter-laboratory comparison of environmental samples.

    PubMed

    Kim, Sang-Bog; Roche, Jennifer

    2013-08-01

    Organically bound tritium (OBT) is an important tritium species that can be measured in most environmental samples, but has only recently been recognized as a species of tritium in these samples. Currently, OBT is not routinely measured by environmental monitoring laboratories around the world. There are no certified reference materials (CRMs) for environmental samples. Thus, quality assurance (QA), or verification of the accuracy of the OBT measurement, is not possible. Alternatively, quality control (QC), or verification of the precision of the OBT measurement, can be achieved. In the past, there have been differences in OBT analysis results between environmental laboratories. A possible reason for the discrepancies may be differences in analytical methods. Therefore, inter-laboratory OBT comparisons among the environmental laboratories are important and would provide a good opportunity for adopting a reference OBT analytical procedure. Due to the analytical issues, only limited information is available on OBT measurement. Previously conducted OBT inter-laboratory practices are reviewed and the findings are described. Based on our experiences, a few considerations were suggested for the international OBT inter-laboratory comparison exercise to be completed in the near future. Crown Copyright © 2013. Published by Elsevier Ltd. All rights reserved.

  10. The Hyper-X Flight Systems Validation Program

    NASA Technical Reports Server (NTRS)

    Redifer, Matthew; Lin, Yohan; Bessent, Courtney Amos; Barklow, Carole

    2007-01-01

    For the Hyper-X/X-43A program, the development of a comprehensive validation test plan played an integral part in the success of the mission. The goal was to demonstrate hypersonic propulsion technologies by flight testing an airframe-integrated scramjet engine. Preparation for flight involved both verification and validation testing. By definition, verification is the process of assuring that the product meets design requirements; whereas validation is the process of assuring that the design meets mission requirements for the intended environment. This report presents an overview of the program with emphasis on the validation efforts. It includes topics such as hardware-in-the-loop, failure modes and effects, aircraft-in-the-loop, plugs-out, power characterization, antenna pattern, integration, combined systems, captive carry, and flight testing. Where applicable, test results are also discussed. The report provides a brief description of the flight systems onboard the X-43A research vehicle and an introduction to the ground support equipment required to execute the validation plan. The intent is to provide validation concepts that are applicable to current, follow-on, and next generation vehicles that share the hybrid spacecraft and aircraft characteristics of the Hyper-X vehicle.

  11. Exploring phlebotomy technique as a pre-analytical factor in proteomic analyses by mass spectrometry.

    PubMed

    Penn, Andrew M; Lu, Linghong; Chambers, Andrew G; Balshaw, Robert F; Morrison, Jaclyn L; Votova, Kristine; Wood, Eileen; Smith, Derek S; Lesperance, Maria; del Zoppo, Gregory J; Borchers, Christoph H

    2015-12-01

    Multiple reaction monitoring mass spectrometry (MRM-MS) is an emerging technology for blood biomarker verification and validation; however, the results may be influenced by pre-analytical factors. This exploratory study was designed to determine if differences in phlebotomy techniques would significantly affect the abundance of plasma proteins in an upcoming biomarker development study. Blood was drawn from 10 healthy participants using four techniques: (1) a 20-gauge IV with vacutainer, (2) a 21-gauge direct vacutainer, (3) an 18-gauge butterfly with vacutainer, and (4) an 18-gauge butterfly with syringe draw. The abundances of a panel of 122 proteins (117 proteins, plus 5 matrix metalloproteinase (MMP) proteins) were targeted by LC/MRM-MS. In addition, complete blood count (CBC) data were also compared across the four techniques. Phlebotomy technique significantly affected 2 of the 11 CBC parameters (red blood cell count, p = 0.010; hemoglobin concentration, p = 0.035) and only 12 of the targeted 117 proteins (p < 0.05). Of the five MMP proteins, only MMP7 was detectable and its concentration was not significantly affected by different techniques. Overall, most proteins in this exploratory study were not significantly influenced by phlebotomy technique; however, a larger study with additional patients will be required for confirmation.

  12. Mathematical Capture of Human Crowd Behavioral Data for Computational Model Building, Verification, and Validation

    DTIC Science & Technology

    2011-03-21

    throughout the experimental runs. Reliable and validated measures of anxiety (Spielberger, 1983), as well as custom-constructed questionnaires about... Crowd modeling and simulation technologies. Transactions on Modeling and Computer Simulation, 20(4). Spielberger, C. D. (1983

  13. High-Temperature Fluid-Wall Reactor Technology Research, Test and Evaluation Performed at Naval Construction Battalion Center, Gulfport, MS, for the USAF Installation/Restoration Program

    DTIC Science & Technology

    1988-01-01

    under field conditions. Sampling and analytical laboratory activities were performed by Ecology and Environment, Inc., and California Analytical... the proposed AER3 test conditions. All test samples would be obtained onsite by Ecology and Environment, Inc., of Buffalo, New York, and sent to... ensuring its safe operation. Ecology and Environment performed onsite verification sampling. This activity was coordinated with the Huber project team

  14. An analytical model of SAGD process considering the effect of threshold pressure gradient

    NASA Astrophysics Data System (ADS)

    Morozov, P.; Abdullin, A.; Khairullin, M.

    2018-05-01

    An analytical model is proposed for the development of super-viscous oil deposits by the method of steam-assisted gravity drainage, taking into account the nonlinear filtration law with the limiting gradient. The influence of non-Newtonian properties of oil on the productivity of a horizontal well and the cumulative steam-oil ratio are studied. Verification of the proposed model based on the results of physical modeling of the SAGD process was carried out.

  15. The specification-based validation of reliable multicast protocol: Problem Report. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1995-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP multicasting. In this report, we develop formal models for RMP using existing automated verification systems and perform validation on the formal RMP specifications. The validation analysis helps identify some minor specification and design problems. We also use the formal models of RMP to generate a test suite for conformance testing of the implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress of the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.

  16. Analytical procedure validation and the quality by design paradigm.

    PubMed

    Rozet, Eric; Lebrun, Pierre; Michiels, Jean-François; Sondag, Perceval; Scherder, Tara; Boulanger, Bruno

    2015-01-01

    Since the adoption of the ICH Q8 document concerning the development of pharmaceutical processes following a quality by design (QbD) approach, there have been many discussions on the opportunity for analytical procedure development to follow a similar approach. While development and optimization of analytical procedures following QbD principles have been largely discussed and described, the place of analytical procedure validation in this framework has not been clarified. This article aims at showing that analytical procedure validation is fully integrated into the QbD paradigm and is an essential step in developing analytical procedures that are effectively fit for purpose. Adequate statistical methodologies also have their role to play, such as design of experiments, statistical modeling, and probabilistic statements. The outcome of analytical procedure validation is also an analytical procedure design space, and from it a control strategy can be set.
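
    One way to phrase the "probabilistic statements" mentioned above is as a fitness-for-purpose criterion on total measurement error: the procedure is considered valid over the region of operating conditions where the probability that a future result falls within the acceptance limits exceeds a chosen level. A generic form, written here as an illustration rather than the authors' exact notation, is

        \Pr\left(\,\left|X_{measured} - \mu_{true}\right| \le \lambda \,\right) \ge \pi_{min}

    with \lambda the acceptance limit and \pi_{min} the minimum acceptable probability; the analytical design space is then the set of conditions for which this statement holds.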

  17. A Design Rationale Capture Tool to Support Design Verification and Re-use

    NASA Technical Reports Server (NTRS)

    Hooey, Becky Lee; Da Silva, Jonny C.; Foyle, David C.

    2012-01-01

    A design rationale tool (DR tool) was developed to capture design knowledge to support design verification and design knowledge re-use. The design rationale tool captures design drivers and requirements, and documents the design solution including: intent (why it is included in the overall design); features (why it is designed the way it is); information about how the design components support design drivers and requirements; and, design alternatives considered but rejected. For design verification purposes, the tool identifies how specific design requirements were met and instantiated within the final design, and which requirements have not been met. To support design re-use, the tool identifies which design decisions are affected when design drivers and requirements are modified. To validate the design tool, the design knowledge from the Taxiway Navigation and Situation Awareness (T-NASA; Foyle et al., 1996) system was captured and the DR tool was exercised to demonstrate its utility for validation and re-use.

  18. Constrained structural dynamic model verification using free vehicle suspension testing methods

    NASA Technical Reports Server (NTRS)

    Blair, Mark A.; Vadlamudi, Nagarjuna

    1988-01-01

    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads, such as the 12-ton Hubble Space Telescope. The development of equations in the paper will show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced that will have minima that are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.
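
    The key identity behind this approach can be stated compactly. For an undamped payload suspended free-free and excited at an interface degree of freedom j, the drive-point receptance is a modal sum, and its zeros (the minima of the measured transfer function) coincide with the natural frequencies of the payload constrained at that interface. Written as a sketch, with mass-normalized free-free mode shapes \phi_{ij} and frequencies \omega_i:

        H_{jj}(\omega) = \sum_i \frac{\phi_{ij}^{\,2}}{\omega_i^{2} - \omega^{2}}, \qquad H_{jj}(\omega^{*}) = 0 \;\Rightarrow\; \omega^{*} \in \{\text{interface-constrained natural frequencies}\}

    This single-interface-DOF form is a simplification; the paper develops the corresponding relations for an actual multi-DOF payload-booster interface.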

  19. Verification and Validation of KBS with Neural Network Components

    NASA Technical Reports Server (NTRS)

    Wen, Wu; Callahan, John

    1996-01-01

    Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then type rules. In fact, ANNs demonstrate their superiority when such if-then rules are hard for a human expert to generate. Verification of a traditional knowledge based system is based on proof of consistency and completeness of the rule knowledge base and correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.

  20. Vacuum decay container closure integrity leak test method development and validation for a lyophilized product-package system.

    PubMed

    Patel, Jayshree; Mulhall, Brian; Wolf, Heinz; Klohr, Steven; Guazzo, Dana Morton

    2011-01-01

    A leak test performed according to ASTM F2338-09 Standard Test Method for Nondestructive Detection of Leaks in Packages by Vacuum Decay Method was developed and validated for container-closure integrity verification of a lyophilized product in a parenteral vial package system. This nondestructive leak test method is intended for use in manufacturing as an in-process package integrity check, and for testing product stored on stability in lieu of sterility tests. Method development and optimization challenge studies incorporated artificially defective packages representing a range of glass vial wall and sealing surface defects, as well as various elastomeric stopper defects. Method validation required 3 days of random-order replicate testing of a test sample population of negative-control, no-defect packages and positive-control, with-defect packages. Positive-control packages were prepared using vials each with a single hole laser-drilled through the glass vial wall; hole creation and hole size certification were performed by Lenox Laser. Validation study results successfully demonstrated the vacuum decay leak test method's ability to accurately and reliably detect packages with laser-drilled holes greater than or equal to approximately 5 μm in nominal diameter. Total test time is less than 1 min per package. All development and validation studies were performed at Whitehouse Analytical Laboratories in Whitehouse, NJ, under the direction of consultant Dana Guazzo of RxPax, LLC, using a VeriPac 455 Micro Leak Test System by Packaging Technologies & Inspection (Tuckahoe, NY). Bristol Myers Squibb (New Brunswick, NJ) fully subsidized all work.

  1. Energy Information Systems

    Science.gov Websites

    Energy Analytics Campaign > 2014-2018 Assessment of Automated M&V Methods. Assessment of automated measurement and verification methods, Granderson, J. et al., Lawrence Berkeley National Laboratory. Performance Metrics and Objective Testing Methods for Energy Baseline Modeling Software.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - RAPID ASSAY SYSTEM FOR PCB ANALYSIS - STRATEGIC DIAGNOSTICS INC.

    EPA Science Inventory

    In July 1997, the U.S. Environmental Protection Agency (EPA) conducted a demonstration of polychlorinated biphenyl (PCB) field analytical techniques. The demonstration design was subjected to extensive review and comment by EPA's National Exposure Research Laboratory (NERL) Envi...

  3. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

    Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders: the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signature and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) with the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval and dispute protocols, analyzing their requirements, advantages and disadvantages in relation to security requirements.
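
    To make the TCPA signing step concrete, the following Python sketch builds a minimal authentication record (biometric hash, document hash, validity timestamp) and signs and verifies it with an Ed25519 key pair from the cryptography package. This only illustrates the sign/verify step under assumed field names; the paper's actual scheme additionally embeds the signed record as a hologram watermark and uses its own cryptographic primitives.

        import json
        import time
        from cryptography.exceptions import InvalidSignature
        from cryptography.hazmat.primitives.asymmetric import ed25519

        def make_signed_ar(private_key, biometric_hash: bytes, document_hash: bytes,
                           validity_days: int = 365):
            """Assemble an authentication record (AR) and sign it with the TCPA private key."""
            ar = {
                "biometric_hash": biometric_hash.hex(),
                "document_hash": document_hash.hex(),
                "expires": int(time.time()) + validity_days * 86400,
            }
            payload = json.dumps(ar, sort_keys=True).encode()
            return ar, private_key.sign(payload)

        def verify_ar(public_key, ar: dict, signature: bytes) -> bool:
            """Check the signature and the validity timestamp at a Verification Station."""
            payload = json.dumps(ar, sort_keys=True).encode()
            try:
                public_key.verify(signature, payload)
            except InvalidSignature:
                return False
            return ar["expires"] > time.time()

        # tcpa_key = ed25519.Ed25519PrivateKey.generate()
        # ar, sig = make_signed_ar(tcpa_key, biometric_hash=b"...", document_hash=b"...")
        # assert verify_ar(tcpa_key.public_key(), ar, sig)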

  4. A preliminary study on the use of FX-Glycine gel and an in-house optical cone beam CT readout for IMRT and RapidArc verification

    NASA Astrophysics Data System (ADS)

    Ravindran, Paul B.; Ebenezer, Suman Babu S.; Winfred, Michael Raj; Amalan, S.

    2017-05-01

    The radiochromic FX gel with optical CT readout has been investigated by several authors and has shown promising results for 3D dosimetry. One application of gel dosimeters is 3D dose verification for IMRT and RapidArc quality assurance. Although polymer gel has been used successfully for clinical dose verification, the use of FX gel with optical cone beam CT for clinical dose verification needs further validation. In this work, we have used FX gel and an in-house optical readout system for gamma analysis between the measured dose distribution and the treatment planning system (TPS) calculated dose distribution for a few test cases.
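
    For context, a minimal global gamma-index calculation of the kind used to compare a measured gel dose matrix against the TPS matrix is sketched below in Python. It brute-forces the search over reference points and assumes flattened dose arrays with matching coordinate arrays in millimetres; the 3%/3 mm criteria are purely an example and are not taken from the paper.

        import numpy as np

        def gamma_pass_rate(ref_dose, ref_xyz, eval_dose, eval_xyz,
                            dose_crit=0.03, dist_crit_mm=3.0):
            """Global gamma analysis: dose_crit is a fraction of the maximum reference dose."""
            d_norm = dose_crit * ref_dose.max()
            gammas = np.empty(eval_dose.size)
            for i in range(eval_dose.size):
                dist2 = ((ref_xyz - eval_xyz[i]) ** 2).sum(axis=1) / dist_crit_mm ** 2
                dose2 = ((ref_dose - eval_dose[i]) / d_norm) ** 2
                gammas[i] = np.sqrt((dist2 + dose2).min())
            return (gammas <= 1.0).mean()   # fraction of evaluated points passing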

  5. An elementary tutorial on formal specification and verification using PVS

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.

    1993-01-01

    A tutorial on the development of a formal specification and its verification using the Prototype Verification System (PVS) is presented. The tutorial presents the formal specification and verification techniques by way of specific example - an airline reservation system. The airline reservation system is modeled as a simple state machine with two basic operations. These operations are shown to preserve a state invariant using the theorem proving capabilities of PVS. The technique of validating a specification via 'putative theorem proving' is also discussed and illustrated in detail. This paper is intended for the novice and assumes only some of the basic concepts of logic. A complete description of user inputs and the PVS output is provided and thus it can be effectively used while one is sitting at a computer terminal.

  6. Information verification and encryption based on phase retrieval with sparsity constraints and optical inference

    NASA Astrophysics Data System (ADS)

    Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang

    2017-01-01

    A novel optical information verification and encryption method is proposed based on the inference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both POMs need to be authenticated before being applied for decryption. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved with the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system, and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new proposed method.

  7. Construct Validation of Analytic Rating Scales in a Speaking Assessment: Reporting a Score Profile and a Composite

    ERIC Educational Resources Information Center

    Sawaki, Yasuyo

    2007-01-01

    This is a construct validation study of a second language speaking assessment that reported a language profile based on analytic rating scales and a composite score. The study addressed three key issues: score dependability, convergent/discriminant validity of analytic rating scales and the weighting of analytic ratings in the composite score.…

  8. Ionospheric Modeling: Development, Verification and Validation

    DTIC Science & Technology

    2005-09-01

    facilitate the automated processing of a large network of GPS receiver data. Calibration and validation of ionospheric sensors: we have been... NOFS Workshop, Estes Park, CO, January 2005. W. Rideout, A. Coster, P. Doherty, MIT Haystack, Automated Processing of GPS Data to Produce Worldwide TEC

  9. Model-based verification and validation of the SMAP uplink processes

    NASA Astrophysics Data System (ADS)

    Khan, M. O.; Dubos, G. F.; Tirona, J.; Standley, S.

    Model-Based Systems Engineering (MBSE) is being used increasingly within the spacecraft design community because of its benefits when compared to document-based approaches. As the complexity of projects expands dramatically with continually increasing computational power and technology infusion, the time and effort needed for verification and validation (V&V) increases geometrically. Using simulation to perform design validation with system-level models earlier in the life cycle stands to bridge the gap between design of the system (based on system-level requirements) and verifying those requirements/validating the system as a whole. This case study stands as an example of how a project can validate a system-level design earlier in the project life cycle than traditional V&V processes by using simulation on a system model. Specifically, this paper describes how simulation was added to a system model of the Soil Moisture Active-Passive (SMAP) mission's uplink process. Also discussed are the advantages and disadvantages of the methods employed and the lessons learned, which are intended to benefit future model-based and simulation-based development efforts.

  10. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Subramaniam, D. Rajan; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2014-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  11. Generic Methodology for Verification and Validation (GM-VV) to Support Acceptance of Models, Simulations and Data (Methodologie generale de verification et de validation (GM-VV) visant a soutenir l acceptation des modeles, simulations et donnees)

    DTIC Science & Technology

    2015-01-01

    RTO or AGARD publications must bear the designation "STO", "RTO" or "AGARD", as appropriate, followed by the serial number. Similar information... to receive STO reports as they are published, you can consult our website (http://www.sto.nato.int/) and subscribe to this service... as appropriate, followed by the serial number (for example AGARD-AG-315). Similar information, such as the title and publication date, is

  12. Verification and Validation of a Three-Dimensional Generalized Composite Material Model

    NASA Technical Reports Server (NTRS)

    Hoffarth, Canio; Harrington, Joseph; Rajan, Subramaniam D.; Goldberg, Robert K.; Carney, Kelly S.; DuBois, Paul; Blankenhorn, Gunther

    2015-01-01

    A general purpose orthotropic elasto-plastic computational constitutive material model has been developed to improve predictions of the response of composites subjected to high velocity impact. The three-dimensional orthotropic elasto-plastic composite material model is being implemented initially for solid elements in LS-DYNA as MAT213. In order to accurately represent the response of a composite, experimental stress-strain curves are utilized as input, allowing for a more general material model that can be used on a variety of composite applications. The theoretical details are discussed in a companion paper. This paper documents the implementation, verification and qualitative validation of the material model using the T800-F3900 fiber/resin composite material.

  13. Aircraft electromagnetic compatibility

    NASA Technical Reports Server (NTRS)

    Clarke, Clifton A.; Larsen, William E.

    1987-01-01

    Illustrated are aircraft architecture, electromagnetic interference environments, electromagnetic compatibility protection techniques, program specifications, tasks, and verification and validation procedures. The environment of 400 Hz power, electrical transients, and radio frequency fields are portrayed and related to thresholds of avionics electronics. Five layers of protection for avionics are defined. Recognition is given to some present day electromagnetic compatibility weaknesses and issues which serve to reemphasize the importance of EMC verification of equipment and parts, and their ultimate EMC validation on the aircraft. Proven standards of grounding, bonding, shielding, wiring, and packaging are laid out to help provide a foundation for a comprehensive approach to successful future aircraft design and an understanding of cost effective EMC in an aircraft setting.

  14. RELAP-7 Software Verification and Validation Plan - Requirements Traceability Matrix (RTM) Part 2: Code Assessment Strategy, Procedure, and RTM Update

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yoo, Jun Soo; Choi, Yong Joon; Smith, Curtis Lee

    2016-09-01

    This document addresses two subjects involved with the RELAP-7 Software Verification and Validation Plan (SVVP): (i) the principles and plan to assure the independence of RELAP-7 assessment through the code development process, and (ii) the work performed to establish the RELAP-7 assessment plan, i.e., the assessment strategy, literature review, and identification of RELAP-7 requirements. The Requirements Traceability Matrices (RTMs) proposed in a previous document (INL-EXT-15-36684) are then updated. These RTMs provide an efficient way to evaluate the RELAP-7 development status as well as the maturity of RELAP-7 assessment through the development process.

  15. Verification and Validation of Adaptive and Intelligent Systems with Flight Test Results

    NASA Technical Reports Server (NTRS)

    Burken, John J.; Larson, Richard R.

    2009-01-01

    F-15 IFCS project goals are: a) demonstrate control approaches that can efficiently optimize aircraft performance in both normal and failure conditions ([A] and [B] failures); b) advance neural network-based flight control technology for new aerospace system designs with a pilot in the loop. Gen II objectives include: a) implement and fly a direct adaptive neural network-based flight controller; b) demonstrate the ability of the system to adapt to simulated system failures: 1) suppress transients associated with failure, 2) re-establish sufficient control and handling of the vehicle for safe recovery; c) provide flight experience for development of verification and validation processes for flight-critical neural network software.

  16. Validating Analytical Methods

    ERIC Educational Resources Information Center

    Ember, Lois R.

    1977-01-01

    The procedures utilized by the Association of Official Analytical Chemists (AOAC) to develop, evaluate, and validate analytical methods for the analysis of chemical pollutants are detailed. Methods validated by AOAC are used by the EPA and FDA in their enforcement programs and are granted preferential treatment by the courts. (BT)

  17. Investigation of a Verification and Validation Tool with a Turbofan Aircraft Engine Application

    NASA Technical Reports Server (NTRS)

    Uth, Peter; Narang-Siddarth, Anshu; Wong, Edmond

    2018-01-01

    The development of more advanced control architectures for turbofan aircraft engines can yield gains in performance and efficiency over the lifetime of an engine. However, the implementation of these increasingly complex controllers is contingent on their ability to provide safe, reliable engine operation. Therefore, having the means to verify the safety of new control algorithms is crucial. As a step towards this goal, CoCoSim, a publicly available verification tool for Simulink, is used to analyze C-MAPSS40k, a 40,000 lbf class turbo-fan engine model developed at NASA for testing new control algorithms. Due to current limitations of the verification software, several modifications are made to C-MAPSS40k to achieve compatibility with CoCoSim. Some of these modifications sacrifice fidelity to the original model. Several safety and performance requirements typical for turbofan engines are identified and constructed into a verification framework. Preliminary results using an industry standard baseline controller for these requirements are presented. While verification capabilities are demonstrated, a truly comprehensive analysis will require further development of the verification tool.

  18. Fourth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  19. Quality Assurance of Chemical Measurements.

    ERIC Educational Resources Information Center

    Taylor, John K.

    1981-01-01

    Reviews aspects of quality control (methods to control errors) and quality assessment (verification that systems are operating within acceptable limits) including an analytical measurement system, quality control by inspection, control charts, systematic errors, and use of SRMs, materials for which properties are certified by the National Bureau…

  20. Verification Tests for Sierra/SM's Reproducing Kernel Particle Method

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Giffin, Brian D.

    2015-09-01

    This report seeks to verify the proper implementation of RKPM within Sierra by comparing the results from several basic example problems executed with RKPM against the analytical and FEM solutions for these same problems. This report was compiled as a summer student intern project.

  1. Stennis Space Center Verification and Validation Capabilities

    NASA Technical Reports Server (NTRS)

    O'Neal, Duane; Daehler, Erik

    2006-01-01

    Topics covered include: Spatial Response; Reflectance Radiometry; Positional Accuracy; Stationary Atmospheric Monitoring; Laboratory Calibration; Thermal Radiometry; Hyperspectral Radiometry; and Portable Atmospheric Monitoring.

  2. Application of Dynamic Analysis in Semi-Analytical Finite Element Method.

    PubMed

    Liu, Pengfei; Xing, Qinyan; Wang, Dawei; Oeser, Markus

    2017-08-30

    Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and only requires a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm used to apply the dynamic analysis in SAFEM is introduced in detail. Asphalt pavement models under moving loads were built in SAFEM and in the commercial finite element software ABAQUS to verify the accuracy and efficiency of SAFEM. The verification shows that the computational accuracy of SAFEM is sufficiently high and its computational time is much shorter than that of ABAQUS. Moreover, experimental verification was carried out and the prediction derived from SAFEM is consistent with the measurement. Therefore, SAFEM can reliably predict the dynamic response of asphalt pavement under moving loads, which is beneficial to road administrations in assessing the pavement's state.
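
    The essence of the semi-analytical step is that the field is expanded in a Fourier series along the pavement length, so that each harmonic reduces to an independent two-dimensional finite-element problem. In schematic form (the exact expansion and boundary treatment in SAFEM may differ):

        u(x, y, z) = \sum_{n=1}^{N} u_n(x, y)\, \sin\!\left(\frac{n \pi z}{L}\right)

    where L is the analysed length in the third dimension and each coefficient field u_n(x, y) is obtained from a 2D FE solve, which is what makes the method far cheaper than a full 3D discretization.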

  3. Analytic theory of photoacoustic wave generation from a spheroidal droplet.

    PubMed

    Li, Yong; Fang, Hui; Min, Changjun; Yuan, Xiaocong

    2014-08-25

    In this paper, we develop an analytic theory describing photoacoustic wave generation from a spheroidal droplet and derive the first complete analytic solution. Our derivation is based on solving the photoacoustic Helmholtz equation in spheroidal coordinates with the separation-of-variables method. As verification, besides carrying out asymptotic analyses that recover the standard solutions for a sphere, an infinite cylinder and an infinite layer, we also confirm that the partial transmission and reflection model previously demonstrated for these three geometries still stands. We expect that this analytic solution will find broad practical use in interpreting experimental results, considering that its building blocks, the spheroidal wave functions (SWFs), can be numerically calculated by existing computer programs.
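
    For orientation, the frequency-domain photoacoustic wave equation that such a derivation typically starts from, and the separable form of its solution in spheroidal coordinates, can be sketched as follows (the notation is generic and not necessarily that of the paper):

        \left(\nabla^{2} + k^{2}\right) \tilde{p}(\mathbf{r}, \omega) = -\frac{i \omega \beta}{C_p}\, \tilde{H}(\mathbf{r}, \omega),
        \qquad
        \tilde{p} = \sum_{m,n} A_{mn}\, S_{mn}(c, \eta)\, R_{mn}(c, \xi)\, e^{i m \phi}

    where H is the absorbed optical energy density, \beta the thermal expansion coefficient, C_p the specific heat, and S_{mn}, R_{mn} the angular and radial spheroidal wave functions.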

  4. RF model of the distribution system as a communication channel, phase 2. Volume 1: Summary Report

    NASA Technical Reports Server (NTRS)

    Rustay, R. C.; Gajjar, J. T.; Rankin, R. W.; Wentz, R. C.; Wooding, R.

    1982-01-01

    The design, implementation, and verification of a computerized model for predicting the steady-state sinusoidal response of radial (tree-configured) distribution feeders were undertaken. That work demonstrated the feasibility and validity of the approach, based on verification measurements made on a limited-size portion of an actual live feeder. On that basis, a follow-on effort was conducted concerned with (1) extending the verification to a greater variety of situations and network sizes, (2) extending the model capabilities to reverse-direction propagation, (3) investigating parameter sensitivities, (4) improving transformer models, and (5) investigating procedures/fixes for ameliorating propagation trouble spots. Results are summarized.

  5. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dinh, Nam; Athe, Paridhi; Jones, Christopher

    The Virtual Environment for Reactor Applications (VERA) code suite is assessed in terms of capability and credibility against the Consortium for Advanced Simulation of Light Water Reactors (CASL) Verification and Validation Plan (presented herein) in the context of three selected challenge problems: CRUD-Induced Power Shift (CIPS), Departure from Nucleate Boiling (DNB), and Pellet-Clad Interaction (PCI). Capability refers to evidence of required functionality for capturing phenomena of interest, while credibility refers to the evidence that provides confidence in the calculated results. For this assessment, each challenge problem defines a set of phenomenological requirements against which the VERA software is assessed. This approach, in turn, enables the focused assessment of only those capabilities relevant to the challenge problem. The evaluation of VERA against the challenge problem requirements represents a capability assessment. The mechanism for assessment is the Sandia-developed Predictive Capability Maturity Model (PCMM) that, for this assessment, evaluates VERA on 8 major criteria: (1) Representation and Geometric Fidelity, (2) Physics and Material Model Fidelity, (3) Software Quality Assurance and Engineering, (4) Code Verification, (5) Solution Verification, (6) Separate Effects Model Validation, (7) Integral Effects Model Validation, and (8) Uncertainty Quantification. For each attribute, a maturity score from zero to three is assigned in the context of each challenge problem. The evaluation of these eight elements constitutes the credibility assessment for VERA.

  6. A Framework for Performing Verification and Validation in Reuse Based Software Engineering

    NASA Technical Reports Server (NTRS)

    Addy, Edward A.

    1997-01-01

    Verification and Validation (V&V) is currently performed during application development for many systems, especially safety-critical and mission- critical systems. The V&V process is intended to discover errors, especially errors related to critical processing, as early as possible during the development process. The system application provides the context under which the software artifacts are validated. This paper describes a framework that extends V&V from an individual application system to a product line of systems that are developed within an architecture-based software engineering environment. This framework includes the activities of traditional application-level V&V, and extends these activities into domain engineering and into the transition between domain engineering and application engineering. The framework includes descriptions of the types of activities to be performed during each of the life-cycle phases, and provides motivation for the activities.

  7. Setup, Validation and Quality Control of a Centralized WGS laboratory - Lessons Learned.

    PubMed

    Arnold, Cath; Edwards, Kirstin; Desai, Meeta; Platt, Steve; Green, Jonathan; Conway, David

    2018-04-25

    Routine use of whole genome analysis for infectious diseases can inform various scenarios pertaining to Public Health, including identification of microbial pathogens; relating individual cases to an outbreak of infectious disease; establishing an association between an outbreak of food poisoning and a specific food vehicle; inferring drug susceptibility; source tracing of contaminants; and studying how variations in the genome affect pathogenicity/virulence. We describe the setup, validation and ongoing verification of a centralised WGS laboratory that carries out the sequencing for these public health functions for the National Infection Services, Public Health England, in the UK. The performance characteristics and quality control metrics measured during validation and verification of the entire end-to-end process (accuracy, precision, reproducibility and repeatability) are described, including information regarding the automated pass and release of data to service users without intervention. © Crown copyright 2018.

  8. WEC3: Wave Energy Converter Code Comparison Project: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Combourieu, Adrien; Lawson, Michael; Babarit, Aurelien

    This paper describes the recently launched Wave Energy Converter Code Comparison (WEC3) project and presents preliminary results from this effort. The objectives of WEC3 are to verify and validate numerical modelling tools that have been developed specifically to simulate wave energy conversion devices and to inform the upcoming IEA OES Annex VI Ocean Energy Modelling Verification and Validation project. WEC3 is divided into two phases: Phase I consists of code-to-code verification and Phase II entails code-to-experiment validation. WEC3 focuses on mid-fidelity codes that simulate WECs using time-domain multibody dynamics methods to model device motions and hydrodynamic coefficients to model hydrodynamic forces. Consequently, high-fidelity numerical modelling tools, such as Navier-Stokes computational fluid dynamics simulation, and simple frequency-domain modelling tools were not included in the WEC3 project.

  9. Code Verification of the HIGRAD Computational Fluid Dynamics Solver

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Van Buren, Kendra L.; Canfield, Jesse M.; Hemez, Francois M.

    2012-05-04

    The purpose of this report is to outline code and solution verification activities applied to HIGRAD, a Computational Fluid Dynamics (CFD) solver of the compressible Navier-Stokes equations developed at the Los Alamos National Laboratory and used to simulate various phenomena such as the propagation of wildfires and atmospheric hydrodynamics. Code verification efforts, as described in this report, are an important first step to establish the credibility of numerical simulations. They provide evidence that the mathematical formulation is properly implemented without significant mistakes that would adversely impact the application of interest. Highly accurate analytical solutions are derived for four code verification test problems that exercise different aspects of the code. These test problems are referred to as: (i) the quiet start, (ii) the passive advection, (iii) the passive diffusion, and (iv) the piston-like problem. These problems are simulated using HIGRAD with different levels of mesh discretization and the numerical solutions are compared to their analytical counterparts. In addition, the rates of convergence are estimated to verify the numerical performance of the solver. The first three test problems produce numerical approximations as expected. The fourth test problem (piston-like) indicates the extent to which the code is able to simulate a 'mild' discontinuity, which is a condition that would typically be better handled by a Lagrangian formulation. The current investigation concludes that the numerical implementation of the solver performs as expected. The quality of solutions is sufficient to provide credible simulations of fluid flows around wind turbines. The main caveat associated with these findings is the low coverage provided by these four problems and the somewhat limited verification activities. A more comprehensive evaluation of HIGRAD may be beneficial for future studies.
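
    The convergence-rate estimate mentioned above is typically computed from the errors against the analytical solution on successively refined meshes. A minimal Python sketch, assuming a constant refinement ratio, is:

        import numpy as np

        def observed_order(err_coarse, err_fine, refinement_ratio):
            """Observed order of accuracy p, assuming error ~ C * h**p on each mesh."""
            return np.log(err_coarse / err_fine) / np.log(refinement_ratio)

        # Example: halving the mesh spacing (ratio 2) while the L2 error drops from
        # 4.0e-3 to 1.0e-3 gives an observed order of 2, consistent with a second-order scheme.
        # p = observed_order(4.0e-3, 1.0e-3, 2.0)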

  10. Qualification and initial characterization of a high-purity 233U spike for use in uranium analyses

    DOE PAGES

    Mathew, K. J.; Canaan, R. D.; Hexel, C.; ...

    2015-08-20

    Several high-purity 233U items potentially useful as isotope dilution mass spectrometry standards for safeguards, non-proliferation, and nuclear forensics measurements are identified and rescued from downblending. By preserving the supply of 233U materials of different pedigree for use as source materials for certified reference materials (CRMs), it is ensured that the safeguards community has high quality uranium isotopic standards required for calibration of the analytical instruments. One of the items identified as a source material for a high-purity CRM is characterized for the uranium isotope-amount ratios using thermal ionization mass spectrometry (TIMS). Additional verification measurements on this material using quadrupole inductively coupled plasma mass spectrometry (ICPMS) are also performed. As a result, the comparison of the ICPMS uranium isotope-amount ratios with the TIMS data, with much smaller uncertainties, validated the ICPMS measurement practices. ICPMS is proposed for the initial screening of the purity of items in the rescue campaign.

  11. Equilibrium shapes of a heterogeneous bubble in an electric field: a variational formulation and numerical verifications

    NASA Astrophysics Data System (ADS)

    Wang, Hanxiong; Liu, Liping; Liu, Dong

    2017-03-01

    The equilibrium shape of a bubble/droplet in an electric field is important for electrowetting over dielectrics (EWOD), electrohydrodynamic (EHD) enhancement for heat transfer, and electro-deformation of a single biological cell, among others. In this work, we develop a general variational formulation that accounts for electro-mechanical couplings. In the context of EHD, we identify the free energy functional and the associated energy minimization problem that determines the equilibrium shape of a bubble in an electric field. Based on this variational formulation, we implement a fixed-mesh level-set gradient method for computing the equilibrium shapes. This numerical scheme is efficient and is validated by comparison with analytical solutions in the absence of an electric field and with experimental results in the presence of an electric field. We also present simulation results for zero gravity, which will be useful for space applications. The variational formulation and numerical scheme are anticipated to have broad applications in areas of EWOD, EHD and electro-deformation in biomechanics.

  12. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques that are based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100 - 500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days) due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady-state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated by simulating two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e. transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental grain structures.

  13. Verification and validation of a rapid heat transfer calculation methodology for transient melt pool solidification conditions in powder bed metal additive manufacturing

    DOE PAGES

    Plotkowski, A.; Kirka, M. M.; Babu, S. S.

    2017-10-16

    A fundamental understanding of spatial and temporal thermal distributions is crucial for predicting solidification and solid-state microstructural development in parts made by additive manufacturing. While sophisticated numerical techniques that are based on finite element or finite volume methods are useful for gaining insight into these phenomena at the length scale of the melt pool (100 - 500 µm), they are ill-suited for predicting engineering trends over full part cross-sections (> 10 x 10 cm) or many layers over long process times (> many days) due to the necessity of fully resolving the heat source characteristics. On the other hand, it is extremely difficult to resolve the highly dynamic nature of the process using purely in-situ characterization techniques. This article proposes a pragmatic alternative based on a semi-analytical approach to predicting the transient heat conduction during the powder bed metal additive manufacturing process. The model calculations were theoretically verified for selective laser melting of AlSi10Mg and electron beam melting of IN718 powders for simple cross-sectional geometries, and the transient results are compared to steady-state predictions from the Rosenthal equation. It is shown that the transient effects of the scan strategy create significant variations in the melt pool geometry and solid-liquid interface velocity, especially as the thermal diffusivity of the material decreases and the pre-heat of the process increases. With positive verification of the strategy, the model was then experimentally validated by simulating two point-melt scan strategies during electron beam melting of IN718, one intended to produce a columnar and one an equiaxed grain structure. Lastly, through comparison of the solidification conditions (i.e. transient and spatial variations of thermal gradient and liquid-solid interface velocity) predicted by the model to phenomenological CET theory, the model accurately predicted the experimental grain structures.
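
    The steady-state Rosenthal solution used above as a reference can be evaluated directly. The sketch below implements the commonly quoted moving point-source form for a semi-infinite solid with constant properties; it is a generic illustration, and the material and process parameters are placeholders rather than values from the paper.

        import math

        def rosenthal_temperature(w, y, z, *, power, speed, k, alpha, t0):
            """Steady-state Rosenthal moving point-source temperature for a
            semi-infinite solid with constant properties; w = x - v*t is the
            coordinate along the travel direction in the moving frame."""
            r = math.sqrt(w * w + y * y + z * z)
            return t0 + power / (2.0 * math.pi * k * r) * math.exp(
                -speed * (w + r) / (2.0 * alpha))

        # Placeholder parameters loosely in the range of a nickel superalloy:
        # k in W/(m K), alpha in m^2/s, power in W, speed in m/s, t0 in K.
        # Evaluated 2 mm behind the source along the scan track.
        print(rosenthal_temperature(-2e-3, 0.0, 0.0, power=600.0, speed=0.5,
                                    k=25.0, alpha=5e-6, t0=1200.0))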

  14. A Hybrid On-line Verification Method of Relay Setting

    NASA Astrophysics Data System (ADS)

    Gao, Wangyuan; Chen, Qing; Si, Ji; Huang, Xin

    2017-05-01

    With the rapid development of the power industry, grid structures are becoming more sophisticated. The validity and rationality of protective relaying are vital to the security of power systems. To increase the security of power systems, it is essential to verify the setting values of relays online. Traditional verification methods mainly include the comparison of protection range and the comparison of calculated setting value. For on-line verification, verification speed is the key. The result of comparing the protection range is accurate, but the computational burden is heavy and the verification is slow. Comparing the calculated setting value is much faster, but the result is conservative and inaccurate. Taking overcurrent protection as an example, this paper analyses the advantages and disadvantages of the two traditional methods and proposes a hybrid on-line verification method that synthesizes their advantages. This hybrid method can meet the requirements of accurate on-line verification.
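
    The hybrid idea described above can be made concrete with a small decision routine: screen a relay first with the fast but conservative setting-value comparison and fall back to the slower, accurate protection-range check only for marginal cases. This is an illustrative sketch rather than the authors' algorithm; the thresholds and the range-check callable are hypothetical.

        def verify_relay_setting(actual_setting, calculated_setting,
                                 range_check, margin=0.05):
            """Hybrid on-line verification sketch: fast setting-value comparison
            first, detailed protection-range comparison only when the fast
            check is inconclusive."""
            deviation = abs(actual_setting - calculated_setting) / calculated_setting
            if deviation <= margin:
                return True             # clearly acceptable, fast path
            if deviation > 3 * margin:
                return False            # clearly unacceptable, fast path
            return range_check()        # marginal case: run the range comparison

        # Hypothetical usage: range_check would wrap the full protection-range study.
        print(verify_relay_setting(1.25, 1.20, range_check=lambda: True))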

  15. [Uniqueness seeking behavior as a self-verification: an alternative approach to the study of uniqueness].

    PubMed

    Yamaoka, S

    1995-06-01

    Uniqueness theory explains that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness-seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness-seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness-seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) x 2 (with or without outgroup information) x 2 (high or low need for uniqueness) between-subject factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach and were discussed in terms of effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.

  16. Ion mobility spectrometer, spectrometer analyte detection and identification verification system, and method

    DOEpatents

    Atkinson, David A.

    2002-01-01

    Methods and apparatus for ion mobility spectrometry and an analyte detection and identification verification system are disclosed. The apparatus is configured to be used in an ion mobility spectrometer and includes a plurality of reactant reservoirs configured to contain a plurality of reactants which can be reacted with the sample to form adducts having varying ion mobilities. A carrier fluid, such as air or nitrogen, is used to carry the sample into the spectrometer. The plurality of reactants are configured to be selectively added to the carrier stream by use of inlet and outlet manifolds in communication with the reagent reservoirs, the reservoirs being selectively isolatable by valves. The invention further includes a spectrometer having the reagent system described. In the method, a first reactant is used with the sample. Following a positive result, a second reactant is used to determine whether a predicted response occurs. The occurrence of the second predicted response tends to verify the existence of a component of interest within the sample. A third reactant can also be used to provide further verification of the existence of a component of interest. A library can be established of known responses of compounds of interest with various reactants, and the results of a specific multi-reactant survey of a sample can be compared against the library to determine whether a component detected in the sample is likely to be a specific component of interest.
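
    The final library-lookup step can be illustrated with a few lines of code: the observed responses from a multi-reactant survey are matched against a library of known reactant responses for compounds of interest. The compound names and response signatures below are hypothetical, not taken from the patent.

        # Hypothetical library of expected reactant responses (True = adduct
        # response expected with that reactant) for compounds of interest.
        LIBRARY = {
            "compound_A": {"reactant_1": True, "reactant_2": True, "reactant_3": False},
            "compound_B": {"reactant_1": True, "reactant_2": False, "reactant_3": True},
        }

        def match_survey(survey):
            """Return the compounds whose library signature matches the observed
            multi-reactant survey of the sample."""
            return [name for name, signature in LIBRARY.items()
                    if all(survey.get(r) == expected for r, expected in signature.items())]

        print(match_survey({"reactant_1": True, "reactant_2": True, "reactant_3": False}))
        # -> ['compound_A']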

  17. Establishing the Validity of Recovery from Stuttering without Formal Treatment.

    ERIC Educational Resources Information Center

    Finn, Patrick

    1996-01-01

    This study examined a validation procedure combining self-reports with independent verification to identify cases of recovery from stuttering without formal treatment. A Speech Behavior Checklist was administered to 42 individuals familiar with recovered subjects' past speech. Analysis of subjects' descriptions of their past stuttering was…

  18. Verification and benchmark testing of the NUFT computer code

    NASA Astrophysics Data System (ADS)

    Lee, K. H.; Nitao, J. J.; Kulshrestha, A.

    1993-10-01

    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasianalytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.
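
    The style of comparison reported above can be illustrated with a generic one-dimensional example (not one of the NUFT test problems): a numerical profile is checked against the classical analytical solution of transient diffusion in a semi-infinite domain with a constant boundary value, and the agreement is summarized with a root-mean-square difference. The numerical values are placeholders.

        import math

        def analytical_profile(x, t, diffusivity, boundary_value):
            """Analytical solution of 1D transient diffusion in a semi-infinite
            domain with a constant boundary value at x = 0 and zero initial state."""
            return boundary_value * math.erfc(x / (2.0 * math.sqrt(diffusivity * t)))

        def rms_difference(numerical, analytical):
            """Root-mean-square difference used to judge agreement."""
            return math.sqrt(sum((n - a) ** 2 for n, a in zip(numerical, analytical))
                             / len(numerical))

        xs = [0.0, 0.05, 0.10, 0.20]                       # positions in m
        exact = [analytical_profile(x, t=3600.0, diffusivity=1e-6, boundary_value=1.0)
                 for x in xs]
        numeric = [1.000, 0.552, 0.241, 0.020]             # placeholder code output
        print(rms_difference(numeric, exact))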

  19. Quantifying residues from postharvest fumigation of almonds and walnuts with propylene oxide

    USDA-ARS?s Scientific Manuscript database

    A novel analytical approach, involving solvent extraction with methyl tert-butyl ether (MTBE) followed by gas chromatography (GC), was developed to quantify residues that result from the postharvest fumigation of almonds and walnuts with propylene oxide (PPO). Verification and quantification of PPO,...

  20. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important steps in the process. Verification insures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needed for improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. 
Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided. The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
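
    The method of manufactured solutions mentioned above can be illustrated in a few lines: choose a smooth manufactured solution, substitute it symbolically into the governing operator to obtain a source term, add that source to the code, and confirm that the computed solution converges to the manufactured one at the expected order. The sketch below does this for a steady 1D advection-diffusion operator, which is a deliberately simple stand-in for the RANS equations.

        import sympy as sp

        x, nu, c = sp.symbols("x nu c", positive=True)

        # Manufactured solution: a smooth function chosen by the analyst, not a
        # physical solution of any real flow.
        u_manufactured = sp.sin(sp.pi * x) + sp.Rational(1, 2)

        # Governing operator for steady 1D advection-diffusion: c*u' - nu*u''.
        # Substituting the manufactured solution yields the source term that the
        # code must include so that u_manufactured becomes its exact solution.
        source = sp.simplify(c * sp.diff(u_manufactured, x)
                             - nu * sp.diff(u_manufactured, x, 2))
        print(source)   # c*pi*cos(pi*x) + pi**2*nu*sin(pi*x)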

  1. Nuclear Energy Knowledge and Validation Center (NEKVaC) Needs Workshop Summary Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gougar, Hans

    2015-02-01

    The Department of Energy (DOE) has made significant progress in developing simulation tools that predict the behavior of nuclear systems with greater accuracy and in increasing our capability to predict the behavior of these systems outside of the standard range of applications. These analytical tools require a more complex array of validation tests to accurately simulate the physics and the multiple length and time scales. Results from modern simulations will allow experiment designers to narrow the range of conditions needed to bound system behavior and to optimize the deployment of instrumentation to limit the breadth and cost of the campaign. Modern validation, verification and uncertainty quantification (VVUQ) techniques enable analysts to extract information from experiments in a systematic manner and provide the users with a quantified uncertainty estimate. Unfortunately, the capability to perform experiments that would enable taking full advantage of the formalisms of these modern codes has progressed relatively little (with some notable exceptions in fuels and thermal-hydraulics); the majority of the experimental data available today is the "historic" data accumulated over the last decades of nuclear systems R&D. A validated code-model is a tool for users; an unvalidated code-model is useful for code developers to gain understanding, publish research results, attract funding, etc. As nuclear analysis codes have become more sophisticated, so have the measurement and validation methods and the challenges that confront them. A successful yet cost-effective validation effort requires expertise possessed only by a few, resources possessed only by the well-capitalized (or a willing collective), and a clear, well-defined objective (validating a code that is developed to satisfy the need(s) of an actual user). To that end, the Idaho National Laboratory established the Nuclear Energy Knowledge and Validation Center to address the challenges of modern code validation and to manage the knowledge from past, current, and future experimental campaigns. By pulling together the best minds involved in code development, experiment design, and validation to establish and disseminate best practices and new techniques, the Nuclear Energy Knowledge and Validation Center (NEKVaC or the ‘Center’) will be a resource for industry, DOE programs, and academic validation efforts.

  2. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of the temporal logic and model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can be finally implemented.

  3. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of the temporal logic and model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can be finally implemented.
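
    A minimal sketch of the state-space exploration that underlies such model checking is shown below (this is not the authors' tooling): reachable markings of a small, hypothetical Petri net are enumerated by breadth-first search and a simple safety property is tested on them.

        from collections import deque

        # Hypothetical Petri net: each transition is a (consume, produce) pair of
        # token counts per place.
        TRANSITIONS = [
            ({"p0": 1}, {"p1": 1}),     # t0: move a token from p0 to p1
            ({"p1": 1}, {"p2": 1}),     # t1: move a token from p1 to p2
        ]

        def reachable_markings(initial):
            """Breadth-first enumeration of reachable markings (bounded nets only)."""
            seen = {tuple(sorted(initial.items()))}
            queue = deque([initial])
            while queue:
                marking = queue.popleft()
                for consume, produce in TRANSITIONS:
                    if all(marking.get(p, 0) >= n for p, n in consume.items()):
                        new = dict(marking)
                        for p, n in consume.items():
                            new[p] -= n
                        for p, n in produce.items():
                            new[p] = new.get(p, 0) + n
                        key = tuple(sorted(new.items()))
                        if key not in seen:
                            seen.add(key)
                            queue.append(new)
            return seen

        markings = reachable_markings({"p0": 1, "p1": 0, "p2": 0})
        # Safety property: no place ever holds more than one token (1-boundedness).
        print(all(tokens <= 1 for marking in markings for _, tokens in marking))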

  4. Survey of statistical techniques used in validation studies of air pollution prediction models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bornstein, R D; Anderson, S F

    1979-03-01

    Statistical techniques used by meteorologists to validate predictions made by air pollution models are surveyed. Techniques are divided into the following three groups: graphical, tabular, and summary statistics. Some of the practical problems associated with verification are also discussed. Characteristics desired in any validation program are listed and a suggested combination of techniques that possesses many of these characteristics is presented.
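
    The summary statistics category mentioned above typically includes quantities such as mean bias, root-mean-square error, and the correlation between predicted and observed concentrations. The sketch below computes these for paired values; the concentrations are placeholders.

        import math

        def summary_statistics(predicted, observed):
            """Mean bias, root-mean-square error, and Pearson correlation between
            model predictions and observations."""
            n = len(predicted)
            bias = sum(p - o for p, o in zip(predicted, observed)) / n
            rmse = math.sqrt(sum((p - o) ** 2 for p, o in zip(predicted, observed)) / n)
            mean_p = sum(predicted) / n
            mean_o = sum(observed) / n
            cov = sum((p - mean_p) * (o - mean_o) for p, o in zip(predicted, observed))
            sd_p = math.sqrt(sum((p - mean_p) ** 2 for p in predicted))
            sd_o = math.sqrt(sum((o - mean_o) ** 2 for o in observed))
            return bias, rmse, cov / (sd_p * sd_o)

        # Placeholder pollutant concentrations (e.g. in ug/m3) at paired stations.
        print(summary_statistics([12.0, 30.0, 55.0, 80.0], [10.0, 35.0, 50.0, 90.0]))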

  5. SU-F-T-440: The Feasibility Research of Checking Cervical Cancer IMRT Pre- Treatment Dose Verification by Automated Treatment Planning Verification System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, X; Yin, Y; Lin, X

    Purpose: To assess the preliminary feasibility of an automated treatment planning verification system in cervical cancer IMRT pre-treatment dose verification. Methods: The study randomly selected clinical IMRT treatment planning data for twenty patients with cervical cancer; all IMRT plans were divided into 7 fields to meet the dosimetric goals using a commercial treatment planning system (Pinnacle Version 9.2 and Eclipse Version 13.5). The plans were exported to the Mobius 3D (M3D) server, and percentage differences in the volume of regions of interest (ROI) and in the calculated dose to the target and organs at risk were evaluated in order to validate the accuracy of the automated treatment planning verification system. Results: The difference in volume between Pinnacle and M3D was smaller than that between Eclipse and M3D for the ROIs; the biggest differences were 0.22±0.69% and 3.5±1.89% for Pinnacle and Eclipse, respectively. M3D showed slightly better agreement in the dose to the target and organs at risk compared with the TPS. After recalculating the plans with M3D, the dose difference for Pinnacle was on average smaller than for Eclipse, and results were within 3%. Conclusion: The method of using the automated treatment planning system to validate the accuracy of plans is convenient, but the scope of the differences still needs more clinical patient cases to determine. At present, it should be used as a secondary check tool to improve safety in clinical treatment planning.

  6. Empirical evaluation of decision support systems: Needs, definitions, potential methods, and an example pertaining to waterfowl management

    USGS Publications Warehouse

    Sojda, R.S.

    2007-01-01

    Decision support systems are often not empirically evaluated, especially the underlying modelling components. This can be attributed to such systems necessarily being designed to handle complex and poorly structured problems and decision making. Nonetheless, evaluation is critical and should be focused on empirical testing whenever possible. Verification and validation, in combination, comprise such evaluation. Verification is ensuring that the system is internally complete, coherent, and logical from a modelling and programming perspective. Validation is examining whether the system is realistic and useful to the user or decision maker, and should answer the question: “Was the system successful at addressing its intended purpose?” A rich literature exists on verification and validation of expert systems and other artificial intelligence methods; however, no single evaluation methodology has emerged as preeminent. At least five approaches to validation are feasible. First, under some conditions, decision support system performance can be tested against a preselected gold standard. Second, real-time and historic data sets can be used for comparison with simulated output. Third, panels of experts can be judiciously used, but often are not an option in some ecological domains. Fourth, sensitivity analysis of system outputs in relation to inputs can be informative. Fifth, when validation of a complete system is impossible, examining major components can be substituted, recognizing the potential pitfalls. I provide an example of evaluation of a decision support system for trumpeter swan (Cygnus buccinator) management that I developed using interacting intelligent agents, expert systems, and a queuing system. Predicted swan distributions over a 13-year period were assessed against observed numbers. Population survey numbers and banding (ringing) studies may provide long term data useful in empirical evaluation of decision support.

  7. Blood collection tubes as medical devices: The potential to affect assays and proposed verification and validation processes for the clinical laboratory.

    PubMed

    Bowen, Raffick A R; Adcock, Dorothy M

    2016-12-01

    Blood collection tubes (BCTs) are an often under-recognized variable in the preanalytical phase of clinical laboratory testing. Unfortunately, even the best-designed and manufactured BCTs may not work well in all clinical settings. Clinical laboratories, in collaboration with healthcare providers, should carefully evaluate BCTs prior to putting them into clinical use to determine their limitations and ensure that patients are not placed at risk because of inaccuracies due to poor tube performance. Selection of the best BCTs can be achieved through comparing advertising materials, reviewing the literature, observing the device at a scientific meeting, receiving a demonstration, evaluating the device under simulated conditions, or testing the device with patient samples. Although many publications have discussed method validations, few detail how to perform experiments for tube verification and validation. This article highlights the most common and impactful variables related to BCTs and discusses the validation studies that a typical clinical laboratory should perform when selecting BCTs. We also present a brief review of how in vitro diagnostic devices, particularly BCTs, are regulated in the United States, the European Union, and Canada. The verification and validation of BCTs will help to avoid the economic and human costs associated with incorrect test results, including poor patient care, unnecessary testing, and delays in test results. We urge laboratorians, tube manufacturers, diagnostic companies, and other researchers to take all the necessary steps to protect against the adverse effects of BCT components and their additives on clinical assays. Copyright © 2016 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  8. Numerical Simulation of the Fluid-Structure Interaction of a Surface Effect Ship Bow Seal

    NASA Astrophysics Data System (ADS)

    Bloxom, Andrew L.

    Numerical simulations of fluid-structure interaction (FSI) problems were performed in an effort to verify and validate a commercially available FSI tool. This tool uses an iterative partitioned coupling scheme between CD-adapco's STAR-CCM+ finite volume fluid solver and Simulia's Abaqus finite element structural solver to simulate the FSI response of a system. Preliminary verification and validation work (V&V) was carried out to understand the numerical behavior of the codes individually and together as a FSI tool. Verification and Validation work that was completed included code order verification of the respective fluid and structural solvers with Couette-Poiseuille flow and Euler-Bernoulli beam theory. These results confirmed the 2 nd order accuracy of the spatial discretizations used. Following that, a mixture of solution verifications and model calibrations was performed with the inclusion of the physics models implemented in the solution of the FSI problems. Solution verifications were completed for fluid and structural stand-alone models as well as for the coupled FSI solutions. These results re-confirmed the spatial order of accuracy but for more complex flows and physics models as well as the order of accuracy of the temporal discretizations. In lieu of a good material definition, model calibration is performed to reproduce the experimental results. This work used model calibration for both instances of hyperelastic materials which were presented in the literature as validation cases because these materials were defined as linear elastic. Calibrated, three dimensional models of the bow seal on the University of Michigan bow seal test platform showed the ability to reproduce the experimental results qualitatively through averaging of the forces and seal displacements. These simulations represent the only current 3D results for this case. One significant result of this study is the ability to visualize the flow around the seal and to directly measure the seal resistances at varying cushion pressures, seal immersions, forward speeds, and different seal materials. SES design analysis could greatly benefit from the inclusion of flexible seals in simulations, and this work is a positive step in that direction. In future work, the inclusion of more complex seal geometries and contact will further enhance the capability of this tool.

  9. RAZORBACK - A Research Reactor Transient Analysis Code Version 1.0 - Volume 3: Verification and Validation Report.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talley, Darren G.

    2017-04-01

    This report describes the work and results of the verification and validation (V&V) of the version 1.0 release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, the equation of motion for fuel element thermal expansion, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This V&V effort was intended to confirm that the code shows good agreement between simulation and actual ACRR operations.

  10. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  11. Validating Analytical Protocols to Determine Selected Pesticides and PCBs Using Routine Samples.

    PubMed

    Pindado Jiménez, Oscar; García Alonso, Susana; Pérez Pastor, Rosa María

    2017-01-01

    This study aims at providing recommendations concerning the validation of analytical protocols by using routine samples. It is intended to provide a case study on how to validate analytical methods in different environmental matrices. In order to analyze the selected compounds (pesticides and polychlorinated biphenyls) in two different environmental matrices, the current work developed and validated two analytical procedures based on GC-MS. A description is given of the validation of the two protocols through the analysis of more than 30 samples of water and sediments collected over nine months. The present work also estimates the uncertainty associated with both analytical protocols. In detail, the uncertainty for the water samples was estimated through a conventional approach. For the sediment matrices, however, the estimation of proportional/constant bias is also included because of their inhomogeneity. Results for the sediment matrix are reliable, showing a range of 25-35% of analytical variability associated with intermediate conditions. The analytical methodology for the water matrix determines the selected compounds with acceptable recoveries, and the combined uncertainty ranges between 20 and 30%. Analysis of routine samples is rarely used to assess the trueness of novel analytical methods, and until now this methodology had not been applied to organochlorine compounds in environmental matrices.
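
    A combined relative uncertainty in the 20-30% range quoted above can be assembled from independent components added in quadrature, which is a common simplification when the contributions are uncorrelated. The component values below are illustrative only, not those reported in the study.

        import math

        def combined_relative_uncertainty(components):
            """Combine independent relative uncertainty components in quadrature;
            correlated contributions would need the full propagation formula."""
            return math.sqrt(sum(u ** 2 for u in components))

        # Illustrative components (as fractions): intermediate precision,
        # recovery/bias, and calibration.
        u_c = combined_relative_uncertainty([0.15, 0.10, 0.08])
        print(f"combined relative uncertainty: {u_c:.2f} ({u_c * 100:.0f}%)")
        print(f"expanded uncertainty (k = 2): {2 * u_c * 100:.0f}%")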

  12. A Protocol-Analytic Study of Metacognition in Mathematical Problem Solving.

    ERIC Educational Resources Information Center

    Cai, Jinfa

    1994-01-01

    Metacognitive behaviors of subjects having high (n=2) and low (n=2) levels of mathematical experience were compared across four cognitive processes in mathematical problem solving: orientation, organization, execution, and verification. High-experience subjects engaged in self-regulation and spent more time on orientation and organization. (36…

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - FIELD PORTABLE GAS CHROMATOGRAPH/MASS SPECTROMETER - BRUKER-FRANZEN ANALYTICAL SYSTEMS, INC

    EPA Science Inventory

    The performance evaluation of innovative and alternative environmental technologies is an integral part of the U.S. Environmental Protection Agency's (EPA) mission. In the spirit of the technology policy, the Agency began to direct a portion of its resources toward the promotion...

  14. Perspectives of human verification via binary QRS template matching of single-lead and 12-lead electrocardiogram.

    PubMed

    Krasteva, Vessela; Jekova, Irena; Schmid, Ramun

    2018-01-01

    This study aims to validate the 12-lead electrocardiogram (ECG) as a biometric modality based on two straightforward binary QRS template matching characteristics. Different perspectives of the human verification problem are considered, regarding the optimal lead selection and stability over sample size, gender, age, and heart rate (HR). A clinical 12-lead resting ECG database, including a population of 460 subjects with two-session recordings (>1 year apart), is used. Cost-effective strategies for extraction of personalized QRS patterns (100 ms) and binary template matching estimate similarity in the time scale (matching time) and dissimilarity in the amplitude scale (mismatch area). The two-class person verification task, taking the decision to validate or to reject the subject identity, is managed by linear discriminant analysis (LDA). Non-redundant LDA models for different lead configurations (I, II, III, aVR, aVL, aVF, V1-V6) are trained on the first half of 230 subjects by stepwise feature selection until maximization of the area under the receiver operating characteristic curve (ROC AUC). The operating point on the training ROC at equal error rate (EER) is tested on the independent dataset (second half of 230 subjects) to report unbiased validation of test-ROC AUC and true verification rate (TVR = 100-EER). The test results are further evaluated in groups by sample size, gender, age, and HR. The optimal QRS pattern projection for the single-lead ECG biometric modality is found in the frontal plane sector (60°-0°), with the best (Test-AUC/TVR) for lead II (0.941/86.8%) and a slight accuracy drop for -aVR (-0.017/-1.4%) and I (-0.01/-1.5%). Chest ECG leads have degrading accuracy from V1 (0.885/80.6%) to V6 (0.799/71.8%). The multi-lead ECG improves verification: 6 chest leads (0.97/90.9%), 6 limb leads (0.986/94.3%), 12 leads (0.995/97.5%). The QRS pattern matching model shows stable performance for verification of 10 to 230 individuals, with insignificant degradation of TVR in women (1.2-3.6%), adults ≥70 years (3.7%), younger subjects <40 years (1.9%), HR<60 bpm (1.2%), HR>90 bpm (3.9%), and no degradation for HR change (0 to >20 bpm).
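
    The verification pipeline described above can be sketched schematically with scikit-learn: train a linear discriminant on genuine versus impostor template-matching features, score held-out comparisons, and estimate the equal error rate from the resulting ROC curve. The two features below are synthetic stand-ins for matching time and mismatch area, not data from the cited study.

        import numpy as np
        from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
        from sklearn.metrics import roc_curve

        rng = np.random.default_rng(0)

        # Synthetic stand-ins for (matching time, mismatch area): genuine
        # comparisons (label 1) versus impostor comparisons (label 0).
        genuine = rng.normal([0.9, 0.1], 0.05, size=(300, 2))
        impostor = rng.normal([0.6, 0.4], 0.10, size=(300, 2))
        X = np.vstack([genuine, impostor])
        y = np.hstack([np.ones(300), np.zeros(300)])

        lda = LinearDiscriminantAnalysis().fit(X[::2], y[::2])   # train on half
        scores = lda.decision_function(X[1::2])                  # score the rest
        fpr, tpr, _ = roc_curve(y[1::2], scores)

        # Equal error rate: the point where false accept and false reject rates meet.
        fnr = 1.0 - tpr
        eer = fpr[np.argmin(np.abs(fpr - fnr))]
        print(f"EER ~ {eer:.3f}, true verification rate ~ {100 * (1 - eer):.1f}%")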

  15. Generalized Spencer-Lewis equation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Filippone, W.L.

    The Spencer-Lewis equation, which describes electron transport in homogeneous media when continuous slowing down theory is valid, is derived from the Boltzmann equation. Also derived is a time-dependent generalized Spencer-Lewis equation valid for inhomogeneous media. An independent verification of this last equation is obtained for the one-dimensional case using particle balance considerations.

  16. EUCLID/NISP GRISM qualification model AIT/AIV campaign: optical, mechanical, thermal and vibration tests

    NASA Astrophysics Data System (ADS)

    Caillat, A.; Costille, A.; Pascal, S.; Rossin, C.; Vives, S.; Foulon, B.; Sanchez, P.

    2017-09-01

    Dark matter and dark energy mysteries will be explored by the Euclid ESA M-class space mission which will be launched in 2020. Millions of galaxies will be surveyed through visible imagery and NIR imagery and spectroscopy in order to map in three dimensions the Universe at different evolution stages over the past 10 billion years. The massive NIR spectroscopic survey will be done efficiently by the NISP instrument thanks to the use of grisms (for "Grating pRISMs") developed under the responsibility of the LAM. In this paper, we present the verification philosophy applied to test and validate each grism before the delivery to the project. The test sequence covers a large set of verifications: optical tests to validate efficiency and WFE of the component, mechanical tests to validate the robustness to vibration, thermal tests to validate its behavior in cryogenic environment and a complete metrology of the assembled component. We show the test results obtained on the first grism Engineering and Qualification Model (EQM) which will be delivered to the NISP project in fall 2016.

  17. Verification and Validation: High Charge and Energy (HZE) Transport Codes and Future Development

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Mertens, Christopher J.; Blattnig, Steve R.; Clowdsley, Martha S.; Cucinotta, Francis A.; Tweed, John; Heinbockel, John H.; Walker, Steven A.; Nealy, John E.

    2005-01-01

    In the present paper, we give the formalism for further developing a fully three-dimensional HZETRN code using marching procedures, and the development of a new Green's function code is also discussed. The final Green's function code is capable of validation not only in the space environment but also in ground-based laboratories with directed beams of ions of specific energy, characterized with detailed diagnostic particle spectrometer devices. Special emphasis is given to verification of the computational procedures and validation of the resultant computational model using laboratory and spaceflight measurements. Due to historical requirements, two parallel development paths for computational model implementation using marching procedures and Green's function techniques are followed. A new version of the HZETRN code capable of simulating HZE ions with either laboratory or space boundary conditions is under development. Validation of computational models at this time is particularly important for President Bush's Initiative to develop infrastructure for human exploration, with the first target demonstration of the Crew Exploration Vehicle (CEV) in low Earth orbit in 2008.

  18. Sensor Based Framework for Secure Multimedia Communication in VANET

    PubMed Central

    Rahim, Aneel; Khan, Zeeshan Shafi; Bin Muhaya, Fahad T.; Sher, Muhammad; Kim, Tai-Hoon

    2010-01-01

    Secure multimedia communication enhances the safety of passengers by providing visual pictures of accidents and danger situations. In this paper we propose a framework for secure multimedia communication in Vehicular Ad-Hoc Networks (VANETs). Our proposed framework is mainly divided into four components: redundant information, priority assignment, malicious data verification and malicious node verification. The proposed scheme has been validated with the help of the NS-2 network simulator and the Evalvid tool. PMID:22163462

  19. A Tool for Automatic Verification of Real-Time Expert Systems

    NASA Technical Reports Server (NTRS)

    Traylor, B.; Schwuttke, U.; Quan, A.

    1994-01-01

    The creation of an automated, user-driven tool for expert system development, validation, and verification is currently ongoing at NASA's Jet Propulsion Laboratory. In the new age of faster, better, cheaper missions, there is an increased willingness to utilize embedded expert systems for encapsulating and preserving mission expertise in systems which combine conventional algorithmic processing and artificial intelligence. The once-questioned role of automation in spacecraft monitoring is now becoming one of increasing importance.

  20. Some remarks relating to Short Notice Random Inspection (SNRI) and verification of flow strata

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Murphey, W.; Emeigh, C.; Lessler, L.

    1991-01-01

    Short Notice Random Inspection (SNRI) is a concept which is to enable the International Atomic Energy Agency (Agency) to make technically valid statements of verification of shipment or receipt strata when the Agency cannot have a resident inspector. Gordon and Sanborn addressed this problem for a centrifuge enrichment plant. In this paper other operating conditions of interest are examined and modifications of the necessary conditions for application of SNRI discussed.

  1. Consolidated Site (CS) 022 Verification Survey at Former McClellan AFB, Sacramento, California

    DTIC Science & Technology

    2015-03-31

    Consultative Letter (report date 31 March 2015) covering July 2014 – December 2014: Consolidated Site (CS) 022 Verification Survey at Former McClellan AFB, prepared by the U.S. Air Force Radioisotope Committee Secretariat (RICS) and the U.S. Air Force School of Aerospace Medicine, Consultative Services Division.

  2. Consolidated Site (CS) 024 Verification Survey at Former McClellan AFB, Sacramento, California

    DTIC Science & Technology

    2015-03-31

    Consultative Letter (report date 31 March 2015) covering July 2014 – December 2014: Consolidated Site (CS) 024 Verification Survey at Former McClellan AFB, prepared by the U.S. Air Force Radioisotope Committee Secretariat (RICS) and the U.S. Air Force School of Aerospace Medicine, Consultative Services Division.

  3. Multicentre validation of IMRT pre-treatment verification: comparison of in-house and external audit.

    PubMed

    Jornet, Núria; Carrasco, Pablo; Beltrán, Mercè; Calvo, Juan Francisco; Escudé, Lluís; Hernández, Victor; Quera, Jaume; Sáez, Jordi

    2014-09-01

    We performed a multicentre intercomparison of IMRT optimisation and dose planning and of IMRT pre-treatment verification methods and results. The aims were to check consistency between dose plans and to validate whether in-house pre-treatment verification results agreed with those of an external audit. Participating centres used two mock cases (prostate and head and neck) for the intercomparison and audit. Compliance with dosimetric goals and the total number of MU per plan were collected. A simple quality index to compare the different plans was proposed. We compared gamma index pass rates obtained using each centre's equipment and methodology to those of an external audit. While for the prostate case all centres fulfilled the dosimetric goals and plan quality was homogeneous, that was not the case for the head and neck case. The number of MU did not correlate with the plan quality index. Pre-treatment verification results of the external audit did not agree with those of the in-house measurements for two centres: results were within tolerance for the in-house measurements and unacceptable for the audit, or the other way round. Although all plans fulfilled dosimetric constraints, plan quality is highly dependent on planner expertise. External audits are an excellent tool to detect errors in IMRT implementation and cannot be replaced by intercomparisons based on results obtained by the centres themselves. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
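
    The gamma index pass rates compared above combine a dose-difference tolerance with a distance-to-agreement criterion. The sketch below is a deliberately simplified one-dimensional global gamma computation (real QA software interpolates the evaluated distribution finely and works in 2D or 3D); the dose profiles are placeholders.

        import math

        def gamma_index_1d(ref_positions, ref_doses, eval_positions, eval_doses,
                           dose_tol=0.03, dist_tol=3.0):
            """Simplified 1D global gamma index: dose tolerance as a fraction of
            the reference maximum, distance-to-agreement in mm. A reference point
            passes if its gamma value is <= 1."""
            d_max = max(ref_doses)
            gammas = []
            for xr, dr in zip(ref_positions, ref_doses):
                g = min(math.sqrt(((xe - xr) / dist_tol) ** 2
                                  + ((de - dr) / (dose_tol * d_max)) ** 2)
                        for xe, de in zip(eval_positions, eval_doses))
                gammas.append(g)
            return gammas

        # Placeholder profiles (positions in mm, doses in Gy).
        ref_x = [0.0, 2.0, 4.0, 6.0]
        ref_d = [1.00, 1.95, 1.00, 0.20]
        gammas = gamma_index_1d(ref_x, ref_d, ref_x, [1.02, 1.90, 1.03, 0.21])
        print(f"pass rate: {100 * sum(g <= 1 for g in gammas) / len(gammas):.0f}%")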

  4. Self-verification and social anxiety: preference for negative social feedback and low social self-esteem.

    PubMed

    Valentiner, David P; Skowronski, John J; McGrath, Patrick B; Smith, Sarah A; Renner, Kerry A

    2011-10-01

    A self-verification model of social anxiety views negative social self-esteem as a core feature of social anxiety. This core feature is proposed to be maintained through self-verification processes, such as by leading individuals with negative social self-esteem to prefer negative social feedback. This model is tested in two studies. In Study 1, questionnaires were administered to a college sample (N = 317). In Study 2, questionnaires were administered to anxiety disordered patients (N = 62) before and after treatment. Study 1 developed measures of preference for negative social feedback and social self-esteem, and provided evidence of their incremental validity in a college sample. Study 2 found that these two variables are not strongly related to fears of evaluation, are relatively unaffected by a treatment that targets such fears, and predict residual social anxiety following treatment. Overall, these studies provide preliminary evidence for a self-verification model of social anxiety.

  5. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  6. Exploring system interconnection architectures with VIPACES: from direct connections to NOCs

    NASA Astrophysics Data System (ADS)

    Sánchez-Peña, Armando; Carballo, Pedro P.; Núñez, Antonio

    2007-05-01

    This paper presents a simple environment for the verification of AMBA 3 AXI systems in Verification IP (VIP) production called VIPACES (Verification Interface Primitives for the development of AXI Compliant Elements and Systems). These primitives are presented as an uncompiled library written in SystemC in which interfaces are the core of the library. The definition of interfaces instead of generic modules lets the user construct custom modules, making better use of the resources spent during the verification phase and easily adapting user modules to the AMBA 3 AXI protocol. This is the main focus of the VIPACES library. The paper compares and contrasts the main interconnection schemes for AMBA 3 AXI as modelled by VIPACES. To assess these results we propose a validation scenario with a particular architecture belonging to the domain of MPEG4 video decoding, which is composed of an AXI bus connecting an IDCT and other processing resources.

  7. Progress Towards a Cartesian Cut-Cell Method for Viscous Compressible Flow

    NASA Technical Reports Server (NTRS)

    Berger, Marsha; Aftosmis, Michael J.

    2012-01-01

    We present preliminary development of an approach for simulating high Reynolds number steady compressible flow in two space dimensions using a Cartesian cut-cell finite volume method. We consider both laminar and turbulent flow with both low and high cell Reynolds numbers near the wall. The approach solves the full Navier-Stokes equations in all cells, and uses a wall model to address the resolution requirements near boundaries and to mitigate mesh irregularities in cut cells. We present a quadratic wall model for low cell Reynolds numbers. At high cell Reynolds numbers, the quadratic is replaced with a newly developed analytic wall model stemming from solution of a limiting form of the Spalart-Allmaras turbulence model which features a forward evaluation for flow velocity and exactly matches characteristics of the SA turbulence model in the field. We develop multigrid operators which attain convergence rates similar to inviscid multigrid. Investigations focus on preliminary verification and validation of the method. Flows over flat plates and compressible airfoils show good agreement with both theoretical results and experimental data. Mesh convergence studies on sub- and transonic airfoil flows show convergence of surface pressures with wall spacings as large as approx.0.1% chord. With the current analytic wall model, one or two additional refinements near the wall are required to obtain mesh converged values of skin friction.

  8. Skeletal age and age verification in youth sport.

    PubMed

    Malina, Robert M

    2011-11-01

    Problems with accurate chronological age (CA) reporting occur on a more or less regular basis in youth sports. As a result, there is increasing discussion of age verification. Use of 'bone age' or skeletal age (SA) for the purpose of estimating or verifying CA has been used in medicolegal contexts for many years and also in youth sport competitions. This article reviews the concept of SA, and the three most commonly used methods of assessment. Variation in SA within CA groups among male soccer players and female artistic gymnasts is evaluated relative to the use of SA as a tool for verification of CA. Corresponding data for athletes in several other sports are also summarized. Among adolescent males, a significant number of athletes will be identified as older than a CA cutoff because of advanced skeletal maturation when they in fact have a valid CA. SA assessments of soccer players are comparable to MRI assessments of epiphyseal-diaphyseal union of the distal radius in under-17 soccer players. Both protocols indicate a relatively large number of false negatives among youth players aged 15-17 years. Among adolescent females, a significant number of age-eligible artistic gymnasts will be identified as younger than the CA cutoff because of later skeletal maturation when in fact they have a valid CA. There is also the possibility of false positives-identifying gymnasts as younger than the CA cutoff because of late skeletal maturation when they have a valid CA. The risk of false negatives and false positives implies that SA is not a valid indicator of CA.

  9. Multi-analyte validation in heterogeneous solution by ELISA.

    PubMed

    Lakshmipriya, Thangavel; Gopinath, Subash C B; Hashim, Uda; Murugaiyah, Vikneswaran

    2017-12-01

    Enzyme Linked Immunosorbent Assay (ELISA) is a standard assay that has been used widely to validate the presence of an analyte in solution. With advances in ELISA, different strategies have been shown to be suitable immunoassays for a wide range of analytes. Herein, we attempted to provide additional evidence with ELISA to show its suitability for multi-analyte detection. To demonstrate this, three clinically relevant targets were chosen: the 16 kDa protein from Mycobacterium tuberculosis, human blood clotting Factor IXa, and the tumour marker Squamous Cell Carcinoma antigen. We adapted the routine steps of conventional ELISA to validate the occurrence of the analytes in both homogeneous and heterogeneous solutions. With the homogeneous and heterogeneous solutions, we could attain sensitivities of 2, 8 and 1 nM for the 16 kDa protein, FIXa and SCC antigen, respectively. Further, the specific multi-analyte validations were evidenced by similar sensitivities in the presence of human serum. The ELISA assay in this study has proven its applicability for genuine multi-target validation in heterogeneous solution and can be followed for other target validations. Copyright © 2017 Elsevier B.V. All rights reserved.
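
    ELISA sensitivities such as those quoted above are usually read off a calibration curve; a four-parameter logistic model is a common generic choice (this is an illustration, not the calibration used in the cited study). The curve parameters and measured absorbance below are hypothetical placeholders.

        def four_pl(conc, bottom, top, ec50, hill):
            """Four-parameter logistic model commonly used for ELISA calibration."""
            return bottom + (top - bottom) / (1.0 + (conc / ec50) ** hill)

        def four_pl_inverse(signal, bottom, top, ec50, hill):
            """Back-calculate concentration from a measured signal on the curve."""
            return ec50 * ((top - bottom) / (signal - bottom) - 1.0) ** (1.0 / hill)

        # Hypothetical calibration parameters (a negative Hill slope gives the
        # increasing, sandwich-type curve shape; ec50 in nM, signals in absorbance).
        params = dict(bottom=0.05, top=1.8, ec50=5.0, hill=-1.2)

        measured_absorbance = 0.60
        print(four_pl_inverse(measured_absorbance, **params))   # concentration in nM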

  10. 21 CFR 812.35 - Supplemental applications.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...

  11. 21 CFR 812.35 - Supplemental applications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...

  12. 21 CFR 812.35 - Supplemental applications.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...

  13. 21 CFR 812.35 - Supplemental applications.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... control procedures of § 820.30, preclinical/animal testing, peer reviewed published literature, or other... the verification and validation testing, as appropriate, demonstrated that the design outputs met the...

  14. Simulation verification techniques study: Simulation performance validation techniques document. [for the space shuttle system

    NASA Technical Reports Server (NTRS)

    Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.

    1975-01-01

    Techniques and support software for the efficient performance of simulation validation are discussed. Overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real time acquisition and formatting of data from an all up operational simulator, and methods and criteria for comparison and evaluation of simulation data are included. Vehicle subsystems modules, module integration, special test requirements, and reference data formats are also described.

  15. Teaching Analytical Method Transfer through Developing and Validating Then Transferring Dissolution Testing Methods for Pharmaceuticals

    ERIC Educational Resources Information Center

    Kimaru, Irene; Koether, Marina; Chichester, Kimberly; Eaton, Lafayette

    2017-01-01

    Analytical method transfer (AMT) and dissolution testing are important topics required in industry that should be taught in analytical chemistry courses. Undergraduate students in senior level analytical chemistry laboratory courses at Kennesaw State University (KSU) and St. John Fisher College (SJFC) participated in development, validation, and…

  16. System verification and validation: a fundamental systems engineering task

    NASA Astrophysics Data System (ADS)

    Ansorge, Wolfgang R.

    2004-09-01

    Systems Engineering (SE) is the discipline in a project management team which transfers the user's operational needs and justifications for an Extremely Large Telescope (ELT) (or any other telescope) into a set of validated required system performance characteristics. These validated required system performance characteristics are subsequently transferred into a validated system configuration, and eventually into the assembled, integrated telescope system with verified performance characteristics, providing "objective evidence that the particular requirements for the specified intended use are fulfilled". The latter is the ISO Standard 8402 definition of "Validation". This presentation describes the verification and validation processes of an ELT project and outlines the key role Systems Engineering plays in these processes throughout all project phases. If these processes are implemented correctly into the project execution and are started at the proper time, namely at the very beginning of the project, and if all capabilities of experienced system engineers are used, the project costs and the life-cycle costs of the telescope system can be reduced by between 25 and 50%. The intention of this article is to motivate and encourage project managers of astronomical telescopes and scientific instruments to involve the entire spectrum of Systems Engineering capabilities, performed by trained and experienced system engineers, for the benefit of the project, by explaining the importance of Systems Engineering in the AIV and validation processes.

  17. Validation of 15 kGy as a radiation sterilisation dose for bone allografts manufactured at the Queensland Bone Bank: application of the VDmax 15 method.

    PubMed

    Nguyen, Huynh; Morgan, David A F; Sly, Lindsay I; Benkovich, Morris; Cull, Sharon; Forwood, Mark R

    2008-06-01

    ISO 11137-2006 (ISO 11137-2a 2006) provides a VDmax 15 method for substantiation of 15 kGy as a radiation sterilisation dose (RSD) for health care products with a relatively low sample requirement. The method is also valid for products in which the bioburden level is less than or equal to 1.5. In the literature, the bioburden level of processed bone allografts is extremely low. Similarly, the Queensland Bone Bank (QBB) usually recovers no viable organisms from processed bone allografts. Because bone allografts are treated as a type of health care product, the aim of this research was to substantiate 15 kGy as an RSD for frozen bone allografts at the QBB using method VDmax 15-ISO 11137-2: 2006 (ISO 11137-2e, Procedure for method VDmax 15 for multiple production batches. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006; ISO 11137-2f, Procedure for method VDmax 15 for a single production batch. Sterilisation of health care products - radiation - part 2: establishing the sterilisation dose, 2006). Thirty femoral heads, 40 milled bone allografts and 40 structural bone allografts manufactured according to QBB standard operating procedures were used. Estimated bioburdens for each bone allograft group were used to calculate the verification doses. Next, 10 samples per group were irradiated at the verification dose, sterility was tested and the number of positive tests of sterility recorded. If no more than one of the 10 samples tested in each group was positive, the verification was accepted and 15 kGy was substantiated as the RSD for those bone allografts. The bioburdens in all three groups were 0, and therefore the verification doses were 0 kGy. Sterility tests of femoral heads and milled bones were all negative (no contamination), and there was one positive test of sterility in the structural bone allograft group. Accordingly, the verification was accepted. Using the ISO-validated protocol VDmax 15, 15 kGy was substantiated as the RSD for frozen bone allografts manufactured at the QBB.
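
    The accept/reject rule used in the study (no more than one positive among the 10 verification-dose sterility tests per group) can be captured in a few lines. The Python sketch below simply encodes that decision logic with the group results reported above; it does not reproduce the ISO 11137-2 dose tables, and the function name and structure are assumptions for illustration.

    ```python
    def vdmax_verification_accepted(positive_tests, samples_tested=10, max_positives=1):
        """Accept the VDmax 15 verification for a product group when no more than
        `max_positives` of the irradiated samples test positive for growth."""
        return samples_tested == 10 and positive_tests <= max_positives

    # Group results reported in the study: femoral heads 0, milled 0, structural 1.
    for group, positives in {"femoral heads": 0, "milled": 0, "structural": 1}.items():
        print(group, "accepted" if vdmax_verification_accepted(positives) else "rejected")
    ```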

  18. Static and Dynamic Verification of Critical Software for Space Applications

    NASA Astrophysics Data System (ADS)

    Moreira, F.; Maia, R.; Costa, D.; Duro, N.; Rodríguez-Dapena, P.; Hjortnaes, K.

    Space technology is no longer used only for highly specialised research activities or for sophisticated manned space missions. Modern society relies more and more on space technology and applications for everyday activities. Worldwide telecommunications, Earth observation, navigation and remote sensing are only a few examples of space applications on which we rely daily. The European-driven global navigation system Galileo and its associated applications, e.g. air traffic management and vessel and car navigation, will significantly expand the already stringent safety requirements for space-based applications. Apart from their usefulness and practical applications, every single piece of onboard software deployed into space represents an enormous investment. Given its long operational lifetime and the fact that it is extremely difficult to maintain and upgrade, at least compared with "mainstream" software development, the importance of ensuring its correctness before deployment is immense. Verification & Validation techniques and technologies have a key role in ensuring that the onboard software is correct and error free, or at least free from errors that can potentially lead to catastrophic failures. Many RAMS techniques, including both static criticality analysis and dynamic verification techniques, have been used as a means to verify and validate critical software and to ensure its correctness. Traditionally, however, these have been applied in isolation. One of the main reasons is the immaturity of this field with respect to its application to the growing number of software products within space systems. This paper presents an innovative way of combining static and dynamic techniques, exploiting their synergy and complementarity for software fault removal. The proposed methodology is based on the combination of software FMEA and FTA with fault-injection techniques. The case study described herein is implemented with support from two tools: the SoftCare tool for the SFMEA and SFTA, and the Xception tool for fault injection. Keywords: Verification & Validation, RAMS, onboard software, SFMEA, SFTA, fault injection. This work is being performed under the project STADY (Applied Static And Dynamic Verification Of Critical Software), ESA/ESTEC Contract Nr. 15751/02/NL/LvH.
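
    To make the dynamic half of the approach concrete, the sketch below shows a toy fault-injection campaign in Python: single bit-flips are injected into an input of a defensively coded routine, and the fraction of undetected erroneous outputs is counted. It is only an illustration of the fault-injection idea; it does not represent the SoftCare or Xception tools, and the routine, the fault model and the numbers are assumptions.

    ```python
    import random

    def inject_bit_flip(value, bit=None):
        """Flip one bit of a 32-bit integer to emulate a transient hardware fault."""
        bit = random.randrange(32) if bit is None else bit
        return (value ^ (1 << bit)) & 0xFFFFFFFF

    def guarded_division(numerator, denominator):
        """Example 'onboard' routine with the kind of defensive range check an
        SFMEA would identify as a critical detection mechanism."""
        if denominator <= 0 or denominator > 16:
            return None  # fault detected and handled
        return numerator // denominator

    # Campaign: corrupt the denominator and count undetected wrong results.
    undetected, trials = 0, 1000
    for _ in range(trials):
        corrupted = inject_bit_flip(7)
        result = guarded_division(1000, corrupted)
        if result is not None and result != 1000 // 7:
            undetected += 1
    print(f"undetected erroneous outputs: {undetected}/{trials}")
    ```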

  19. Expert system verification and validation study: ES V/V Workshop

    NASA Technical Reports Server (NTRS)

    French, Scott; Hamilton, David

    1992-01-01

    The primary purpose of this document is to build a foundation for applying principles of verification and validation (V&V) of expert systems. To achieve this, some background on V&V as applied to conventionally implemented software is required. Part one discusses the background of V&V from the perspective of (1) what V&V of software is and (2) V&V's role in developing software. Part one also gives an overview of some common analysis techniques that are applied when performing V&V of software. All of these materials are presented on the assumption that the reader has little or no background in V&V or in developing procedural software. The primary purpose of part two is to explain the major techniques that have been developed for V&V of expert systems.

  20. Verification and Validation Plan for Flight Performance Requirements on the CEV Parachute Assembly System

    NASA Technical Reports Server (NTRS)

    Morris, Aaron L.; Olson, Leah M.

    2011-01-01

    The Crew Exploration Vehicle Parachute Assembly System (CPAS) is engaged in a multi-year design and test campaign aimed at qualifying a parachute recovery system for human use on the Orion Spacecraft. Orion has parachute flight performance requirements that will ultimately be verified through the use of Monte Carlo multi-degree of freedom flight simulations. These simulations will be anchored by real world flight test data and iteratively improved to provide a closer approximation to the real physics observed in the inherently chaotic inflation and steady state flight of the CPAS parachutes. This paper will examine the processes necessary to verify the flight performance requirements of the human rated spacecraft. The focus will be on the requirements verification and model validation planned on CPAS.
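
    As a schematic illustration of verifying a flight performance requirement by Monte Carlo dispersion analysis, the sketch below samples two dispersed parameters and estimates the probability of meeting a descent-rate limit. All values (drag area, mass, the 8.2 m/s limit) are hypothetical and bear no relation to actual CPAS or Orion numbers; the real verification uses anchored multi-degree-of-freedom simulations.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_runs = 10_000

    # Hypothetical dispersions: total parachute drag area (m^2) and suspended mass (kg).
    cd_area = rng.normal(loc=2400.0, scale=120.0, size=n_runs)
    mass = rng.normal(loc=9500.0, scale=200.0, size=n_runs)
    rho, g = 1.225, 9.81

    # Steady-state descent rate for each dispersed case.
    v_descent = np.sqrt(2.0 * mass * g / (rho * cd_area))

    requirement = 8.2  # m/s, illustrative limit only
    compliance = np.mean(v_descent <= requirement)
    print(f"estimated probability of meeting the requirement: {compliance:.3f}")
    ```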

  1. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talley, Darren G.

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.
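
    The coupled point-kinetics solution mentioned above can be illustrated with a much-reduced model: one delayed-neutron group, a lumped adiabatic fuel temperature and a linear reactivity feedback, integrated with SciPy. Every parameter value below is an assumption chosen only to show the structure of such a calculation; none is an ACRR or Razorback value.

    ```python
    import numpy as np
    from scipy.integrate import solve_ivp

    beta, lam, Lambda = 0.0073, 0.08, 1.0e-5  # delayed fraction, decay const (1/s), generation time (s)
    alpha_T = -1.0e-5                         # temperature reactivity feedback (dk/k per K)
    heat_cap = 5.0e4                          # lumped fuel heat capacity (J/K)
    rho_insert = 0.010                        # step reactivity insertion (dk/k), pulse-like

    def rhs(t, y):
        power, precursors, fuel_temp = y
        rho = rho_insert + alpha_T * (fuel_temp - 300.0)
        d_power = (rho - beta) / Lambda * power + lam * precursors
        d_prec = beta / Lambda * power - lam * precursors
        d_temp = power / heat_cap             # adiabatic fuel heat-up
        return [d_power, d_prec, d_temp]

    # Start at 1 W with equilibrium precursors and 300 K fuel.
    y0 = [1.0, beta / (lam * Lambda), 300.0]
    sol = solve_ivp(rhs, (0.0, 0.5), y0, method="LSODA", max_step=1e-3)
    i_peak = np.argmax(sol.y[0])
    print(f"peak power ~ {sol.y[0][i_peak]:.3e} W at t = {sol.t[i_peak]:.4f} s")
    ```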

  2. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
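
    The statistical transfer from coupon data to a full-scale part mentioned above is usually handled with a weakest-link (Weibull) volume scaling. The sketch below shows the generic two-parameter form under a uniform uniaxial stress assumption; the modulus, characteristic strength and volumes are made-up numbers, and the actual guideline may use a different (e.g. effective-volume or multiaxial) formulation.

    ```python
    import numpy as np

    def ceramic_failure_probability(stress, m, sigma0, volume, ref_volume):
        """Two-parameter Weibull weakest-link failure probability with simple
        volume scaling from coupon data (uniform uniaxial stress assumed)."""
        return 1.0 - np.exp(-(volume / ref_volume) * (stress / sigma0) ** m)

    def allowable_stress(target_pf, m, sigma0, volume, ref_volume):
        """Stress giving the target failure probability for the scaled volume."""
        return sigma0 * (-np.log(1.0 - target_pf) * ref_volume / volume) ** (1.0 / m)

    # Illustrative coupon data: Weibull modulus 10, characteristic strength 300 MPa
    # referred to 1 cm^3; the structure stresses 1000 cm^3 of material.
    sigma_allow = allowable_stress(1e-4, m=10, sigma0=300.0, volume=1000.0, ref_volume=1.0)
    print(f"allowable stress for Pf = 1e-4: {sigma_allow:.1f} MPa")
    ```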

  3. Horizontal lifelines - review of regulations and simple design method considering anchorage rigidity.

    PubMed

    Galy, Bertrand; Lan, André

    2018-03-01

    Among the many occupational risks construction workers encounter every day, falling from a height is the most dangerous. The objective of this article is to propose a simple analytical design method for horizontal lifelines (HLLs) that considers anchorage flexibility. The article presents a short review of the standards and regulations/acts/codes concerning HLLs in Canada, the USA and Europe. A static analytical approach is proposed that considers anchorage flexibility. The analytical results are compared with a series of 42 dynamic fall tests and a SAP2000 numerical model. The experimental results show that the analytical method is slightly conservative and overestimates the line tension in most cases, by a maximum of 17%. The static SAP2000 results show a maximum 2.1% difference from the analytical method. The analytical method is accurate enough to safely design HLLs, and quick design abaci are provided to allow the engineer to make rapid on-site verifications if needed.
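
    The article's own design method is not reproduced in the abstract, but the kind of static calculation involved can be sketched. Below is a generic small-sag statics estimate for a single arrest load at midspan, with cable stretch and a linear anchorage stiffness included through a simple fixed-point iteration; every input value, and the formulation itself, is an assumption for illustration rather than the published method.

    ```python
    import math

    def hll_midspan_tension(span, slack, arrest_load, axial_stiffness, k_anchor,
                            tol=1e-6, max_iter=200):
        """Static estimate of horizontal-lifeline tension for a single arrest load
        at midspan. Iterates on the sag, accounting for cable elongation (EA) and
        inward movement of flexible anchorages (k_anchor). Small-sag sketch only."""
        tension = arrest_load  # starting guess
        for _ in range(max_iter):
            delta = tension / k_anchor                      # anchor movement, each side
            eff_span = span - 2.0 * delta
            deployed = span + slack + tension * span / axial_stiffness
            half_chord = deployed / 2.0
            sag = math.sqrt(max(half_chord**2 - (eff_span / 2.0) ** 2, 1e-9))
            new_tension = arrest_load * half_chord / (2.0 * sag)
            if abs(new_tension - tension) < tol:
                break
            tension = new_tension
        return tension, sag

    # Illustrative inputs: 12 m span, 50 mm slack, 6 kN arrest load,
    # EA = 1.2 MN for a wire rope, anchorage stiffness 500 kN/m per anchor.
    T, s = hll_midspan_tension(12.0, 0.05, 6000.0, 1.2e6, 5.0e5)
    print(f"line tension ~ {T / 1000:.1f} kN, sag ~ {s:.2f} m")
    ```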

  4. ERDEC Contribution to the 1993 International Treaty Verification Round Robin Exercise 4

    DTIC Science & Technology

    1994-07-01

    COLUMN: Detector: MS (Finnigan 5100); Phase: AT-5; Manufacturer: Alltech. GC CONDITIONS: Length: 25 m; Carrier gas: Helium; Inner diameter: 0.25 mm; Carrier... ionized in the ion source. The resulting CH5+ then chemically reacts with the analyte. The advantage of this technique is that, because less energy is...

  5. URANS simulations of the tip-leakage cavitating flow with verification and validation procedures

    NASA Astrophysics Data System (ADS)

    Cheng, Huai-yu; Long, Xin-ping; Liang, Yun-zhi; Long, Yun; Ji, Bin

    2018-04-01

    In the present paper, the Vortex Identified Zwart-Gerber-Belamri (VIZGB) cavitation model coupled with the SST-CC turbulence model is used to investigate the unsteady tip-leakage cavitating flow induced by a NACA0009 hydrofoil. A qualitative comparison between the numerical and experimental results is made. In order to quantitatively evaluate the reliability of the numerical data, the verification and validation (V&V) procedures are used in the present paper. Errors of numerical results are estimated with seven error estimators based on the Richardson extrapolation method. It is shown that though a strict validation cannot be achieved, a reasonable prediction of the gross characteristics of the tip-leakage cavitating flow can be obtained. Based on the numerical results, the influence of the cavitation on the tip-leakage vortex (TLV) is discussed, which indicates that the cavitation accelerates the fusion of the TLV and the tip-separation vortex (TSV). Moreover, the trajectory of the TLV, when the cavitation occurs, is close to the side wall.
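
    The error estimation step referred to above follows the familiar Richardson-extrapolation machinery. As a minimal illustration, the sketch below computes the observed order of accuracy, the extrapolated value and the fine-grid Grid Convergence Index from three systematically refined solutions; the sample values and the refinement ratio are assumptions, not data from the paper (which uses seven different error estimators).

    ```python
    import math

    def richardson_gci(f_fine, f_medium, f_coarse, r=2.0, safety=1.25):
        """Observed order of accuracy, Richardson-extrapolated value and fine-grid
        Grid Convergence Index from three solutions at constant refinement ratio r."""
        eps21 = f_medium - f_fine
        eps32 = f_coarse - f_medium
        p = math.log(abs(eps32 / eps21)) / math.log(r)         # observed order
        f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)  # extrapolated value
        gci_fine = safety * abs((f_fine - f_medium) / f_fine) / (r**p - 1.0)
        return p, f_exact, gci_fine

    # Illustrative integral quantity (e.g. a cavity volume) on three grids.
    p, f_rich, gci = richardson_gci(0.245, 0.262, 0.300)
    print(f"observed order p = {p:.2f}, extrapolated = {f_rich:.3f}, GCI = {100 * gci:.2f}%")
    ```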

  6. Verification and validation of a Work Domain Analysis with turing machine task analysis.

    PubMed

    Rechard, J; Bignon, A; Berruet, P; Morineau, T

    2015-03-01

    While the use of Work Domain Analysis as a methodological framework in cognitive engineering is increasing rapidly, verification and validation of work domain models produced by this method are becoming a significant issue. In this article, we propose the use of a method based on Turing machine formalism named "Turing Machine Task Analysis" to verify and validate work domain models. The application of this method on two work domain analyses, one of car driving which is an "intentional" domain, and the other of a ship water system which is a "causal domain" showed the possibility of highlighting improvements needed by these models. More precisely, the step by step analysis of a degraded task scenario in each work domain model pointed out unsatisfactory aspects in the first modelling, like overspecification, underspecification, omission of work domain affordances, or unsuitable inclusion of objects in the work domain model. Copyright © 2014 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  7. Experimental Validation of L1 Adaptive Control: Rohrs' Counterexample in Flight

    NASA Technical Reports Server (NTRS)

    Xargay, Enric; Hovakimyan, Naira; Dobrokhodov, Vladimir; Kaminer, Issac; Kitsios, Ioannis; Cao, Chengyu; Gregory, Irene M.; Valavani, Lena

    2010-01-01

    The paper presents new results on the verification and in-flight validation of an L1 adaptive flight control system, and proposes a general methodology for verification and validation of adaptive flight control algorithms. The proposed framework is based on Rohrs counterexample, a benchmark problem presented in the early 80s to show the limitations of adaptive controllers developed at that time. In this paper, the framework is used to evaluate the performance and robustness characteristics of an L1 adaptive control augmentation loop implemented onboard a small unmanned aerial vehicle. Hardware-in-the-loop simulations and flight test results confirm the ability of the L1 adaptive controller to maintain stability and predictable performance of the closed loop adaptive system in the presence of general (artificially injected) unmodeled dynamics. The results demonstrate the advantages of L1 adaptive control as a verifiable robust adaptive control architecture with the potential of reducing flight control design costs and facilitating the transition of adaptive control into advanced flight control systems.

  8. Bayesian truthing as experimental verification of C4ISR sensors

    NASA Astrophysics Data System (ADS)

    Jannson, Tomasz; Forrester, Thomas; Romanov, Volodymyr; Wang, Wenjian; Nielsen, Thomas; Kostrzewski, Andrew

    2015-05-01

    In this paper, the general methodology for experimental verification/validation of C4ISR and other sensors' performance is presented, based on Bayesian inference in general and binary sensors in particular. This methodology, called Bayesian Truthing, defines performance metrics for binary sensors in physics, optics, electronics, medicine, law enforcement, C3ISR, QC, ATR (Automatic Target Recognition), terrorism-related events, and many other areas. For Bayesian Truthing, the sensing medium itself is not what is truly important; it is how the decision process is affected.
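
    For a binary sensor, the Bayesian reasoning alluded to above reduces to Bayes' rule over the detection outcome. The short sketch below computes the posterior probability that a target is present given an alarm, from an assumed sensitivity, specificity and prevalence; the numbers are illustrative, not taken from the paper.

    ```python
    def posterior_detection_probability(sensitivity, specificity, prevalence):
        """Bayes' rule for a binary sensor: P(target present | detection declared)."""
        p_alarm = sensitivity * prevalence + (1.0 - specificity) * (1.0 - prevalence)
        return sensitivity * prevalence / p_alarm

    # Illustrative sensor: 95% sensitivity, 99% specificity, 1% target prevalence.
    print(f"P(target | alarm) = {posterior_detection_probability(0.95, 0.99, 0.01):.3f}")
    ```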

  9. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Krantz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.
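
    A minimal version of the likelihood calculation described above, for a two-parameter Weibull fit with type I (right) censoring, is sketched below using SciPy's general-purpose optimizer. The fatigue lives and suspension pattern are invented for illustration, and the NASA software itself is not reproduced here.

    ```python
    import numpy as np
    from scipy.optimize import minimize

    def neg_log_likelihood(params, cycles, failed):
        """Negative log-likelihood of a two-parameter Weibull with type I censoring:
        suspensions contribute only the survival (reliability) term."""
        shape, scale = params
        if shape <= 0.0 or scale <= 0.0:
            return np.inf
        z = cycles / scale
        log_pdf = np.log(shape / scale) + (shape - 1.0) * np.log(z) - z**shape
        log_surv = -(z**shape)
        return -np.sum(np.where(failed, log_pdf, log_surv))

    # Invented gear-fatigue lives (load cycles); False marks a suspended test.
    cycles = np.array([1.2e6, 2.0e6, 2.9e6, 3.5e6, 4.1e6, 5.0e6, 5.0e6, 5.0e6])
    failed = np.array([1, 1, 1, 1, 1, 0, 0, 0], dtype=bool)

    fit = minimize(neg_log_likelihood, x0=[1.5, 3.0e6], args=(cycles, failed),
                   method="Nelder-Mead")
    shape_hat, scale_hat = fit.x
    print(f"Weibull shape ~ {shape_hat:.2f}, characteristic life ~ {scale_hat:.3e} cycles")
    ```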

  10. Software for Statistical Analysis of Weibull Distributions with Application to Gear Fatigue Data: User Manual with Verification

    NASA Technical Reports Server (NTRS)

    Kranz, Timothy L.

    2002-01-01

    The Weibull distribution has been widely adopted for the statistical description and inference of fatigue data. This document provides user instructions, examples, and verification for software to analyze gear fatigue test data. The software was developed presuming the data are adequately modeled using a two-parameter Weibull distribution. The calculations are based on likelihood methods, and the approach taken is valid for data that include type I censoring. The software was verified by reproducing results published by others.

  11. TORMES-BEXUS 17 and 19: Precursor of the 6U CubeSat 3CAT-2

    NASA Astrophysics Data System (ADS)

    Carreno-Luengo, H.; Amezaga, A.; Bolet, A.; Vidal, D.; Jane, J.; Munoz, J. F.; Olive, R.; Camps, A.; Carola, J.; Catarino, N.; Hagenfeldt, M.; Palomo, P.; Cornara, S.

    2015-09-01

    3Cat-2 Assembly, Integration and Verification (AIV) activities on the Engineering Model (EM) and the Flight Model (FM) are currently being carried out. The Attitude Determination and Control System (ADCS) and Flight Software (FSW) validation campaigns will be performed at Universitat Politècnica de Catalunya (UPC) during the coming months. An analysis and verification of the 3Cat-2 key mission requirements has been performed. The main results are summarized in this work.

  12. Application of Dynamic Analysis in Semi-Analytical Finite Element Method

    PubMed Central

    Oeser, Markus

    2017-01-01

    Analyses of dynamic responses are significantly important for the design, maintenance and rehabilitation of asphalt pavement. In order to evaluate the dynamic responses of asphalt pavement under moving loads, a specific computational program, SAFEM, was developed based on a semi-analytical finite element method. This method is three-dimensional and only requires a two-dimensional FE discretization by incorporating Fourier series in the third dimension. In this paper, the algorithm to apply the dynamic analysis to SAFEM was introduced in detail. Asphalt pavement models under moving loads were built in the SAFEM and commercial finite element software ABAQUS to verify the accuracy and efficiency of the SAFEM. The verification shows that the computational accuracy of SAFEM is high enough and its computational time is much shorter than ABAQUS. Moreover, experimental verification was carried out and the prediction derived from SAFEM is consistent with the measurement. Therefore, the SAFEM is feasible to reliably predict the dynamic response of asphalt pavement under moving loads, thus proving beneficial to road administration in assessing the pavement’s state. PMID:28867813
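
    The core idea of the semi-analytical approach, expanding the third (length) dimension in a Fourier series so that each harmonic reduces to an independent two-dimensional problem, can be illustrated by expanding a wheel load into a sine series. The sketch below computes the series coefficients analytically and reconstructs the load; the geometry, pressure and number of terms are assumptions, and the example does not reproduce SAFEM itself.

    ```python
    import numpy as np

    def sine_series_coefficients(load_center, load_length, pressure, domain_length, n_terms):
        """Coefficients P_m of p(z) = sum_m P_m sin(m*pi*z/L) for a uniform strip
        load of width `load_length` centred at `load_center` on a domain of length L.
        Each harmonic can then be handled by a separate 2-D solution."""
        m = np.arange(1, n_terms + 1)
        z1 = load_center - load_length / 2.0
        z2 = load_center + load_length / 2.0
        return m, (2.0 * pressure / (m * np.pi)) * (
            np.cos(m * np.pi * z1 / domain_length) - np.cos(m * np.pi * z2 / domain_length))

    # Illustrative 0.2 m wide, 0.7 MPa tyre load centred at 2 m on a 4 m long model.
    m, P_m = sine_series_coefficients(2.0, 0.2, 0.7e6, 4.0, n_terms=50)
    z = np.linspace(0.0, 4.0, 801)
    p_rebuilt = (P_m[:, None] * np.sin(np.outer(m, np.pi * z / 4.0))).sum(axis=0)
    print(f"reconstructed peak pressure ~ {p_rebuilt.max():.3e} Pa")
    ```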

  13. Validation protocol of analytical procedures for quantification of drugs in polymeric systems for parenteral administration: dexamethasone phosphate disodium microparticles.

    PubMed

    Martín-Sabroso, Cristina; Tavares-Fernandes, Daniel Filipe; Espada-García, Juan Ignacio; Torres-Suárez, Ana Isabel

    2013-12-15

    In this work, a protocol is developed to validate analytical procedures for the quantification of drug substances formulated in polymeric systems, covering both the drug entrapped in the polymeric matrix (assay:content test) and the drug released from the systems (assay:dissolution test). This protocol is applied to the validation of two isocratic HPLC analytical procedures for the analysis of dexamethasone phosphate disodium microparticles for parenteral administration. Preparation of authentic samples and artificially "spiked" and "unspiked" samples is described. Specificity (the ability to quantify dexamethasone phosphate disodium in the presence of constituents of the dissolution medium and other microparticle constituents), linearity, accuracy and precision are evaluated, in the range from 10 to 50 μg mL(-1) for the assay:content test procedure and from 0.25 to 10 μg mL(-1) for the assay:dissolution test procedure. The robustness of the analytical method to extract drug from the microparticles is also assessed. The validation protocol developed allows us to conclude that both analytical methods are suitable for their intended purpose, but the lack of proportionality of the assay:dissolution analytical method should be taken into account. The validation protocol designed in this work could be applied to the validation of any analytical procedure for the quantification of drugs formulated in controlled-release polymeric microparticles. Copyright © 2013 Elsevier B.V. All rights reserved.
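
    The linearity, accuracy and precision figures evaluated in such a validation come down to a handful of standard calculations. The sketch below illustrates them on made-up calibration and recovery data in the 10-50 μg/mL range; the numbers are not from the paper and the acceptance criteria are left to the analyst.

    ```python
    import numpy as np

    # Made-up calibration data: nominal concentrations (ug/mL) and peak areas.
    nominal = np.array([10.0, 20.0, 30.0, 40.0, 50.0])
    areas = np.array([152.1, 305.4, 451.8, 603.2, 749.9])

    # Linearity: least-squares fit and coefficient of determination.
    slope, intercept = np.polyfit(nominal, areas, 1)
    predicted = slope * nominal + intercept
    r2 = 1.0 - np.sum((areas - predicted) ** 2) / np.sum((areas - areas.mean()) ** 2)

    # Accuracy as percent recovery of spiked samples, precision as %RSD of replicates.
    spiked_nominal = np.array([30.0, 30.0, 30.0])
    spiked_found = np.array([29.6, 30.3, 30.1])
    recovery = 100.0 * spiked_found / spiked_nominal
    rsd = 100.0 * spiked_found.std(ddof=1) / spiked_found.mean()

    print(f"slope = {slope:.3f}, intercept = {intercept:.2f}, R^2 = {r2:.5f}")
    print(f"mean recovery = {recovery.mean():.1f}%, precision (RSD) = {rsd:.2f}%")
    ```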

  14. MICROORGANISMS IN BIOSOLIDS: ANALYTICAL METHODS DEVELOPMENT, STANDARDIZATION, AND VALIDATION

    EPA Science Inventory

    The objective of this presentation is to discuss pathogens of concern in biosolids, the analytical techniques used to evaluate microorganisms in biosolids, and to discuss standardization and validation of analytical protocols for microbes within such a complex matrix. Implicatio...

  15. Validation of COG10 and ENDFB6R7 on the Auk Workstation for General Application to Plutonium Systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Percher, Catherine G

    2011-08-08

    The COG 10 code package [1] on the Auk workstation is now validated with the ENDFB6R7 neutron cross-section library for general application to plutonium (Pu) systems by comparison of the calculated k-effective to the expected k-effective of several relevant experimental benchmarks. This validation is supplemental to the installation and verification of COG 10 on the Auk workstation [2].
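
    Validation by comparison of calculated to expected k-effective values typically ends in a bias and bias-uncertainty estimate over the benchmark set. The sketch below shows that last step on invented numbers; the k-effective values are not from the COG 10 report, and real validations also fold in benchmark and statistical uncertainties.

    ```python
    import numpy as np

    # Invented calculated and benchmark (expected) k-effective values.
    k_calc = np.array([0.9982, 1.0011, 0.9995, 1.0023, 0.9968, 1.0004])
    k_exp = np.array([1.0000, 1.0000, 1.0000, 1.0000, 1.0000, 1.0000])

    diffs = k_calc - k_exp
    bias = diffs.mean()                                         # average (calculated - expected)
    bias_uncertainty = diffs.std(ddof=1) / np.sqrt(diffs.size)  # standard error of the bias
    print(f"bias = {bias:+.5f} +/- {bias_uncertainty:.5f} (1 sigma)")
    ```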

  16. Expert system verification and validation guidelines/workshop task. Deliverable no. 1: ES V/V guidelines

    NASA Technical Reports Server (NTRS)

    French, Scott W.

    1991-01-01

    The goals are to show that verifying and validating a software system is a required part of software development and has a direct impact on the software's design and structure. Workshop tasks are given in the areas of statistics, integration/system test, unit and architectural testing, and a traffic controller problem.

  17. 34 CFR 668.59 - Consequences of a change in an applicant's FAFSA information.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 34 Education 3 2012-07-01 2012-07-01 false Consequences of a change in an applicant's FAFSA... Verification and Updating of Student Aid Application Information § 668.59 Consequences of a change in an... on the corrected valid SAR or valid ISIR; and (2)(i) Disburse any additional funds under that award...

  18. 34 CFR 668.59 - Consequences of a change in an applicant's FAFSA information.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 34 Education 3 2014-07-01 2014-07-01 false Consequences of a change in an applicant's FAFSA... Verification and Updating of Student Aid Application Information § 668.59 Consequences of a change in an... on the corrected valid SAR or valid ISIR; and (2)(i) Disburse any additional funds under that award...

  19. 34 CFR 668.59 - Consequences of a change in an applicant's FAFSA information.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 34 Education 3 2013-07-01 2013-07-01 false Consequences of a change in an applicant's FAFSA... Verification and Updating of Student Aid Application Information § 668.59 Consequences of a change in an... on the corrected valid SAR or valid ISIR; and (2)(i) Disburse any additional funds under that award...

  20. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria have been proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and are often achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature, and no previous research has tried to summarize how it has been employed. Therefore, in this review the knowledge on the verification phase is updated, and suggestions on how it can be performed (e.g. intensity, duration, recovery) are provided according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
