Sample records for interface verification test

  1. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  2. Using a Modular Open Systems Approach in Defense Acquisitions: Implications for the Contracting Process

    DTIC Science & Technology

    2006-01-30

    He has taught contract management courses for the UCLA Government Contracts Certificate program and is also a senior faculty member for the Keller...standards for its key interfaces, and has been subjected to successful validation and verification tests to ensure the openness of its key interfaces...widely supported and consensus based standards for its key interfaces, and is subject to validation and verification tests to ensure the openness of its

  3. Independent Verification and Validation of Complex User Interfaces: A Human Factors Approach

    NASA Technical Reports Server (NTRS)

    Whitmore, Mihriban; Berman, Andrea; Chmielewski, Cynthia

    1996-01-01

    The Usability Testing and Analysis Facility (UTAF) at the NASA Johnson Space Center has identified and evaluated a potential automated software interface inspection tool capable of assessing the degree to which space-related critical and high-risk software system user interfaces meet objective human factors standards across each NASA program and project. Testing consisted of two distinct phases. Phase 1 compared analysis times and similarity of results for the automated tool and for human-computer interface (HCI) experts. In Phase 2, HCI experts critiqued the prototype tool's user interface. Based on this evaluation, it appears that a more fully developed version of the tool will be a promising complement to a human factors-oriented independent verification and validation (IV&V) process.

  4. Software Independent Verification and Validation (SIV&V) Simplified

    DTIC Science & Technology

    2006-12-01

    Configuration Item I/O Input/Output I2V2 Independent Integrated Verification and Validation IBM International Business Machines ICD Interface...IPT Integrated Product Team IRS Interface Requirements Specification ISD Integrated System Diagram ITD Integrated Test Description ITP ...programming languages such as COBOL (Common Business Oriented Language) (Codasyl committee 1960), and FORTRAN (FORmula TRANslator) ( IBM 1952) (Robat 11

  5. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  6. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  7. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    The ultrasonic flaw detection equipment with a remote-control interface is investigated and an automatic verification system is developed. Using extensible markup language, the system software builds a protocol instruction set and a database of data analysis methods, which makes the design controllable and accommodates the diversity of undisclosed device interfaces and protocols. A dynamic error compensation method is proposed in which a signal generator and a fixed attenuator are cascaded; it performs the role that the fixed attenuator plays in traditional verification and improves the accuracy of the verification results. Operating results from the automatic verification system confirm the feasibility of the hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome operations of the traditional verification process and reducing the labor intensity for test personnel.
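
    As a rough illustration of the cascaded-attenuator idea described above, the sketch below works through the decibel bookkeeping involved; the attenuation values and function names are invented for this example and are not taken from the paper.

    ```python
    # Minimal sketch (hypothetical values and helper names): compensating a stimulus
    # level for a fixed attenuator cascaded after a signal generator, in the spirit
    # of the dynamic error compensation described above.

    NOMINAL_ATTENUATION_DB = 40.0      # marked value of the fixed attenuator
    CALIBRATED_ATTENUATION_DB = 40.35  # assumed value measured during calibration

    def generator_setting_dbm(target_at_probe_dbm: float) -> float:
        """Generator level needed so the cascaded output hits the target level."""
        return target_at_probe_dbm + CALIBRATED_ATTENUATION_DB

    def compensation_error_db(target_at_probe_dbm: float) -> float:
        """Residual error if only the nominal attenuation were used."""
        uncompensated = target_at_probe_dbm + NOMINAL_ATTENUATION_DB
        return (uncompensated - CALIBRATED_ATTENUATION_DB) - target_at_probe_dbm

    if __name__ == "__main__":
        target = -60.0  # dBm requested at the flaw detector input
        print(f"generator setting: {generator_setting_dbm(target):.2f} dBm")
        print(f"error without compensation: {compensation_error_db(target):+.2f} dB")
    ```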

  8. Interfacing and Verifying ALHAT Safe Precision Landing Systems with the Morpheus Vehicle

    NASA Technical Reports Server (NTRS)

    Carson, John M., III; Hirsh, Robert L.; Roback, Vincent E.; Villalpando, Carlos; Busa, Joseph L.; Pierrottet, Diego F.; Trawny, Nikolas; Martin, Keith E.; Hines, Glenn D.

    2015-01-01

    The NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project developed a suite of prototype sensors to enable autonomous and safe precision landing of robotic or crewed vehicles under any terrain lighting conditions. Development of the ALHAT sensor suite was a cross-NASA effort, culminating in integration and testing on-board a variety of terrestrial vehicles toward infusion into future spaceflight applications. Terrestrial tests were conducted on specialized test gantries, moving trucks, helicopter flights, and a flight test onboard the NASA Morpheus free-flying, rocket-propulsive flight-test vehicle. To accomplish these tests, a tedious integration process was developed and followed, which included both command and telemetry interfacing, as well as sensor alignment and calibration verification to ensure valid test data to analyze ALHAT and Guidance, Navigation and Control (GNC) performance. This was especially true for the flight test campaign of ALHAT onboard Morpheus. For interfacing of ALHAT sensors to the Morpheus flight system, an adaptable command and telemetry architecture was developed to allow for the evolution of per-sensor Interface Control Design/Documents (ICDs). Additionally, individual-sensor and on-vehicle verification testing was developed to ensure functional operation of the ALHAT sensors onboard the vehicle, as well as precision-measurement validity for each ALHAT sensor when integrated within the Morpheus GNC system. This paper provides some insight into the interface development and the integrated-systems verification that were a part of the build-up toward success of the ALHAT and Morpheus flight test campaigns in 2014. These campaigns provided valuable performance data that is refining the path toward spaceflight infusion of the ALHAT sensor suite.

  9. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  10. A thermal scale modeling study for Apollo and Apollo applications, volume 1

    NASA Technical Reports Server (NTRS)

    Shannon, R. L.

    1972-01-01

    The program is reported for developing and demonstrating the capabilities of thermal scale modeling as a thermal design and verification tool for Apollo and Apollo Applications Projects. The work performed for thermal scale modeling of STB; cabin atmosphere/spacecraft cabin wall thermal interface; closed loop heat rejection radiator; and docked module/spacecraft thermal interface are discussed along with the test facility requirements for thermal scale model testing of AAP spacecraft. It is concluded that thermal scale modeling can be used as an effective thermal design and verification tool to provide data early in a spacecraft development program.

  11. Quality Assurance Project Plan for Verification of Sediment Ecotoxicity Assessment Ring(SEA Ring)

    EPA Science Inventory

    The objective of the verification is to test the efficacy and ability of the Sediment Ecotoxicity Assessment Ring (SEA Ring) to evaluate the toxicity of contaminants in the sediment, at the sediment-water interface, and in the water column (WC) to organisms that live in those respective environments.

  12. Constrained structural dynamic model verification using free vehicle suspension testing methods

    NASA Technical Reports Server (NTRS)

    Blair, Mark A.; Vadlamudi, Nagarjuna

    1988-01-01

    Verification of the validity of a spacecraft's structural dynamic math model used in computing ascent (or in the case of the STS, ascent and landing) loads is mandatory. This verification process requires that tests be carried out on both the payload and the math model such that the ensuing correlation may validate the flight loads calculations. To properly achieve this goal, the tests should be performed with the payload in the launch constraint (i.e., held fixed at only the payload-booster interface DOFs). The practical achievement of this set of boundary conditions is quite difficult, especially with larger payloads, such as the 12-ton Hubble Space Telescope. The development of equations in the paper will show that by exciting the payload at its booster interface while it is suspended in the 'free-free' state, a set of transfer functions can be produced that will have minima that are directly related to the fundamental modes of the payload when it is constrained in its launch configuration.
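
    The closing statement above can be illustrated numerically. The sketch below uses a toy 3-DOF spring-mass chain (not the paper's payload model, and assuming an undamped receptance formulation): the antiresonances of the free-free drive-point transfer function at the interface DOF coincide with the natural frequencies of the same model constrained at that DOF.

    ```python
    # Toy illustration: free-free drive-point antiresonances at the interface DOF
    # equal the natural frequencies of the model constrained at that DOF.
    import numpy as np

    # Free-free chain: interface DOF 0, then two internal masses.
    M = np.diag([2.0, 1.0, 1.5])
    k1, k2 = 4.0e3, 6.0e3
    K = np.array([[ k1,  -k1,      0.0],
                  [-k1,  k1 + k2, -k2],
                  [ 0.0, -k2,      k2]])

    # Constrained ("launch") frequencies: fix DOF 0 and solve the reduced problem.
    Kc, Mc = K[1:, 1:], M[1:, 1:]
    w_constrained = np.sort(np.sqrt(np.linalg.eigvals(np.linalg.solve(Mc, Kc)).real))

    # Undamped drive-point receptance at DOF 0 of the free-free model.
    def h00(w):
        return np.linalg.inv(K - (w ** 2) * M)[0, 0]

    w = np.linspace(1.0, 150.0, 20000)
    h = np.array([h00(wi) for wi in w])

    # Zero crossings with small magnitude are antiresonances; crossings with large
    # neighboring magnitude are the poles (resonances) and are rejected.
    cross = np.where(np.sign(h[:-1]) != np.sign(h[1:]))[0]
    antires = w[[i for i in cross if min(abs(h[i]), abs(h[i + 1])) < 1e-4]]

    print("constrained natural frequencies (rad/s):", np.round(w_constrained, 1))
    print("free-free drive-point antiresonances   :", np.round(antires, 1))
    # Both print approximately [ 35.4 112.9 ], illustrating the equivalence.
    ```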

  13. Z-2 Architecture Description and Requirements Verification Results

    NASA Technical Reports Server (NTRS)

    Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard

    2016-01-01

    The Z-2 Prototype Planetary Extravehicular Space Suit Assembly is a continuation of NASA's Z series of spacesuits. The Z-2 is another step in NASA's technology development roadmap leading to human exploration of the Martian surface. The suit was designed for maximum mobility at 8.3 psid, reduced mass, and to have high fidelity life support interfaces. As Z-2 will be man-tested at full vacuum in NASA JSC's Chamber B, it was manufactured as Class II, making it the most flight-like planetary walking suit produced to date. The Z-2 suit architecture is an evolution of previous EVA suits, namely the ISS EMU, Mark III, Rear Entry I-Suit and Z-1 spacesuits. The suit is a hybrid hard and soft multi-bearing, rear entry spacesuit. The hard upper torso (HUT) is an all-composite structure and includes a 2-bearing rolling convolute shoulder with Vernier sizing mechanism, removable suit port interface plate (SIP), elliptical hemispherical helmet and self-don/doff shoulder harness. The hatch is a hybrid aluminum and composite construction with Apollo style gas connectors, custom water pass-thru, removable hatch cage and interfaces to primary and auxiliary life support feed water bags. The suit includes Z-1 style lower arms with cam brackets for Vernier sizing and government furnished equipment (GFE) Phase VI gloves. The lower torso includes a telescopic waist sizing system, waist bearing, rolling convolute waist joint, hard brief, 2 bearing soft hip thigh, Z-1 style legs with ISS EMU style cam brackets for sizing, and conformal walking boots with ankle bearings. The Z-2 Requirements Verification Plan includes the verification of more than 200 individual requirements. The verification methods include test, analysis, inspection, demonstration or a combination of methods. Examples of unmanned requirements include suit leakage, proof pressure testing, operational life, mass, isometric man-loads, sizing adjustment ranges, internal and external interfaces such as in-suit drink bag, partial pressure relief valve, purge valve, donning stand and ISS Body Restraint Tether (BRT). Examples of manned requirements include verification of anthropometric range, suit self-don/doff, secondary suit exit method, donning stand self-ingress/egress and manned mobility covering eight functional tasks. The eight functional tasks include kneeling with object pick-up, standing toe touch, cross-body reach, walking, reach to the SIP and helmet visor. This paper will provide an overview of the Z-2 design. Z-2 requirements verification testing was performed with NASA at the ILC Houston test facility. This paper will also discuss pre-delivery manned and unmanned test results as well as analysis performed in support of requirements verification.

  14. UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces

    NASA Technical Reports Server (NTRS)

    Shiffman, Smadar; Degani, Asaf; Heymann, Michael

    2004-01-01

    In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.
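
    As a hedged aside, the sketch below checks one simplified notion of interface adequacy on an invented machine/display pair: if two machine states share a displayed mode, a user action must not lead them to states with different displayed modes. It only illustrates the kind of check such a tool performs, not necessarily UIVerify's exact criterion.

    ```python
    # Hedged toy sketch of a simplified adequacy check for a machine/interface pair.
    from collections import defaultdict

    # Hypothetical machine model: (state, user_action) -> next state.
    MACHINE = {
        ("armed",   "engage"): "capture",
        ("armed",   "cancel"): "manual",
        ("capture", "cancel"): "capture",   # cancel is ignored while capturing
        ("manual",  "engage"): "armed",
    }
    # Hypothetical interface abstraction: machine state -> displayed mode.
    DISPLAY = {"armed": "AUTO", "capture": "AUTO", "manual": "MAN"}

    def adequate(machine, display):
        by_mode = defaultdict(set)
        for state, mode in display.items():
            by_mode[mode].add(state)
        for mode, states in by_mode.items():
            for action in {a for (_, a) in machine}:
                targets = {display.get(machine.get((s, action))) for s in states
                           if (s, action) in machine}
                if len(targets) > 1:
                    print(f"ambiguous: mode {mode!r}, action {action!r} -> {targets}")
                    return False
        return True

    # The single AUTO annunciation hides whether 'cancel' drops to manual -> False.
    print(adequate(MACHINE, DISPLAY))
    ```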

  15. Towards the formal verification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially-developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings,' including the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.

  16. Neutron Source Facility Training Simulator Based on EPICS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Young Soo; Wei, Thomas Y.; Vilim, Richard B.

    A plant operator training simulator is developed for training the plant operators as well as for design verification of the plant control system (PCS) and plant protection system (PPS) for the Kharkov Institute of Physics and Technology Neutron Source Facility. The simulator provides the operator interface for the whole plant, including the sub-critical assembly coolant loop, target coolant loop, secondary coolant loop, and other facility systems. The operator interface is implemented based on the Experimental Physics and Industrial Control System (EPICS), a comprehensive software development platform for distributed control systems. Since its development at Argonne National Laboratory, EPICS has been widely adopted in the experimental physics community, e.g. for control of accelerator facilities; this work is the first implementation for a nuclear facility. The main parts of the operator interface are the plant control panel and plant protection panel. The development involved implementation of the process variable database, sequence logic, and graphical user interface (GUI) for the PCS and PPS utilizing EPICS and related software tools, e.g. the sequencer for sequence logic and Control System Studio (CSS-BOY) for the graphical user interface. For functional verification of the PCS and PPS, a plant model is interfaced, which is a physics-based model of the facility coolant loops implemented as a numerical computer code. The training simulator was tested and demonstrated its effectiveness in various plant operation sequences, e.g. start-up, shut-down, maintenance, and refueling. It was also tested for verification of the plant protection system under various trip conditions.
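
    For readers unfamiliar with EPICS, the sketch below shows what channel-access-level interaction looks like from Python via the pyepics bindings; the process-variable names and the trip threshold are hypothetical placeholders, not values from the facility, and the report itself builds its operator screens in CSS-BOY rather than in Python.

    ```python
    # Hedged sketch of EPICS Channel Access from Python using the pyepics bindings.
    # The PV names below are hypothetical placeholders, not the facility's actual
    # process variables.
    import time
    from epics import PV

    outlet_temp = PV("SCA:LOOP1:OUTLET_TEMP")      # read-only measurement (deg C)
    trip_flag = PV("PPS:LOOP1:HIGH_TEMP_TRIP")     # protection-system output (0/1)
    pump_setpoint = PV("PCS:LOOP1:PUMP_SPEED_SP")  # control-system setpoint

    TRIP_LIMIT_C = 90.0  # assumed trip threshold for this illustration

    def on_temp_change(pvname=None, value=None, **kw):
        """Callback: compare the live value against the expected trip behaviour."""
        expected_trip = 1 if value is not None and value > TRIP_LIMIT_C else 0
        actual_trip = trip_flag.get()
        status = "OK" if actual_trip == expected_trip else "MISMATCH"
        print(f"{pvname}={value}  trip expected={expected_trip} actual={actual_trip}  [{status}]")

    if __name__ == "__main__":
        outlet_temp.add_callback(on_temp_change)
        pump_setpoint.put(75.0)   # drive the plant model, e.g. during a start-up sequence
        time.sleep(10.0)          # let monitor callbacks arrive from the IOC
    ```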

  17. SPR Hydrostatic Column Model Verification and Validation.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bettin, Giorgia; Lord, David; Rudeen, David Keith

    2015-10-01

    A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that have been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
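
    The hydrostatic balance such a model rests on can be sketched in a few lines; the densities, depths, and pressures below are illustrative values only, not SPR or BH101 data.

    ```python
    # Hedged sketch (hypothetical densities and depths) of the basic hydrostatics
    # behind a wellhead-pressure / interface-depth model: pressure at depth equals
    # the wellhead pressure plus the summed weight of the fluid columns above it.
    G = 9.80665  # m/s^2

    def pressure_at_depth(p_wellhead_pa, columns):
        """columns: list of (density_kg_m3, height_m) from the wellhead downward."""
        return p_wellhead_pa + sum(rho * G * h for rho, h in columns)

    if __name__ == "__main__":
        # Nitrogen column above a crude-oil column above brine (illustrative values).
        nitrogen = (90.0, 450.0)    # compressed N2: rough average density over the column
        crude_oil = (850.0, 300.0)
        p_wellhead = 8.0e6          # Pa, assumed nitrogen pressure at the wellhead
        p_interface = pressure_at_depth(p_wellhead, [nitrogen, crude_oil])
        print(f"pressure at oil/brine interface: {p_interface / 1e6:.2f} MPa")
        # A leak shows up as a wellhead pressure (or interface depth) drifting away
        # from the value this balance predicts for a tight well.
    ```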

  18. Towards the formal verification of the requirements and design of a processor interface unit: HOL listings

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section 2 of this report contains general-purpose HOL theories and definitions that support the PIU verification. These include arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section 3 contains the HOL listings for the completed PIU design verification. Section 4 contains the HOL listings for the partial requirements verification of the P-Port.

  19. Real-Time Extended Interface Automata for Software Testing Cases Generation

    PubMed Central

    Yang, Shunkun; Xu, Jiaqi; Man, Tianlong; Liu, Bin

    2014-01-01

    Testing and verification of the interfaces between software components are particularly important because of the large number of complex interactions involved, which requires traditional modeling languages to overcome their shortcomings in describing temporal information and in controlling software test inputs. This paper presents the real-time extended interface automata (RTEIA), which add a clearer and more detailed description of temporal information through the use of timed words. We also establish an input interface automaton for every input in order to handle input control and interface coverage flexibly when the approach is applied to software testing. Detailed definitions of the RTEIA and of the test-case generation algorithm are provided. The feasibility and efficiency of this method have been verified in the testing of a real aircraft braking system. PMID:24892080
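
    To give a flavor of timed-word acceptance against interval-guarded interface transitions, a minimal sketch follows; the states, actions, and timing bounds are invented for illustration and do not reproduce the paper's RTEIA semantics or its braking-system model.

    ```python
    # Hedged, generic sketch of checking a timed word against interface transitions
    # guarded by time intervals.
    from dataclasses import dataclass

    @dataclass(frozen=True)
    class Transition:
        src: str
        action: str
        dst: str
        t_min: float  # earliest time since the previous action (seconds)
        t_max: float  # latest time since the previous action (seconds)

    TRANSITIONS = [
        Transition("idle",    "brake_cmd", "braking", 0.0, 5.0),
        Transition("braking", "ack",       "applied", 0.0, 0.2),
        Transition("applied", "release",   "idle",    0.1, 10.0),
    ]

    def accepts(timed_word, start="idle"):
        """timed_word: list of (action, delay_since_previous_action) pairs."""
        state = start
        for action, delay in timed_word:
            nxt = next((t for t in TRANSITIONS
                        if t.src == state and t.action == action
                        and t.t_min <= delay <= t.t_max), None)
            if nxt is None:
                return False  # no transition enabled: illegal action or timing violation
            state = nxt.dst
        return True

    print(accepts([("brake_cmd", 1.0), ("ack", 0.05), ("release", 2.0)]))  # True
    print(accepts([("brake_cmd", 1.0), ("ack", 0.50)]))                    # False: ack too late
    ```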

  20. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    NASA Astrophysics Data System (ADS)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields as high energy physics (HEP), astrophysics, nuclear medicine or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility to have a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs and risks, this paper describes an innovative design/verification flow applied as example case study to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the set up of figure of merits to drive the design space exploration; the use of ISE for early analysis of the achievable performances when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on a FPGA-based emulator for functional verification and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results will be shown to underline the usefulness of this design and verification approach that can be applied to any new digital protocol development for smart detectors in physics experiments.

  1. bcROCsurface: an R package for correcting verification bias in estimation of the ROC surface and its volume for continuous diagnostic tests.

    PubMed

    To Duc, Khanh

    2017-11-18

    Receiver operating characteristic (ROC) surface analysis is usually employed to assess the accuracy of a medical diagnostic test when there are three ordered disease statuses (e.g. non-diseased, intermediate, diseased). In practice, verification bias can occur due to missingness of the true disease status and can lead to a distorted conclusion on diagnostic accuracy. In such situations, bias-corrected inference tools are required. This paper introduces an R package, named bcROCsurface, which provides utility functions for verification bias-corrected ROC surface analysis. A Shiny web application for the verification bias correction in ROC surface estimation has also been developed. bcROCsurface may become an important tool for the statistical evaluation of three-class diagnostic markers in the presence of verification bias. The R package, readme and example data are available on CRAN. The web interface enables users less familiar with R to evaluate the accuracy of diagnostic tests, and can be found at http://khanhtoduc.shinyapps.io/bcROCsurface_shiny/ .
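
    For orientation, the quantity at the heart of ROC surface analysis is the volume under the surface (VUS), which for three ordered classes equals the probability that markers drawn from the three classes fall in the correct order. The sketch below estimates it empirically on simulated, fully verified data; it deliberately omits the verification-bias correction that bcROCsurface provides.

    ```python
    # Minimal sketch of the VUS statistic for three ordered classes:
    # VUS = P(X_1 < X_2 < X_3). Empirical estimator, ties ignored, no bias correction.
    import itertools
    import numpy as np

    def empirical_vus(x1, x2, x3):
        """Fraction of (non-diseased, intermediate, diseased) triples in correct order."""
        triples = itertools.product(x1, x2, x3)
        ordered = sum(1 for a, b, c in triples if a < b < c)
        return ordered / (len(x1) * len(x2) * len(x3))

    if __name__ == "__main__":
        rng = np.random.default_rng(0)
        x1 = rng.normal(0.0, 1.0, 80)   # non-diseased
        x2 = rng.normal(1.0, 1.0, 60)   # intermediate
        x3 = rng.normal(2.0, 1.0, 50)   # diseased
        print(f"empirical VUS = {empirical_vus(x1, x2, x3):.3f}  (1/6 is chance level)")
    ```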

  2. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) the preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  3. First International Conference on Ada (R) Programming Language Applications for the NASA Space Station, volume 1

    NASA Technical Reports Server (NTRS)

    Bown, Rodney L. (Editor)

    1986-01-01

    Topics discussed include: test and verification; environment issues; distributed Ada issues; life cycle issues; Ada in Europe; management/training issues; common Ada interface set; and run time issues.

  4. GRIZZLY/FAVOR Interface Project Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Dickson, Terry L; Williams, Paul T; Yin, Shengjun

    As part of the Light Water Reactor Sustainability (LWRS) Program, the objective of the GRIZZLY/FAVOR Interface project is to create the capability to apply GRIZZLY 3-D finite element (thermal and stress) analysis results as input to FAVOR probabilistic fracture mechanics (PFM) analyses. The key benefit FAVOR brings to GRIZZLY is its probabilistic capability. This document describes the implementation of the GRIZZLY/FAVOR Interface, the preliminary verification and test results, and a user guide that provides detailed step-by-step instructions to run the program.

  5. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  6. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian F.; Robertson, Amy N.

    2016-07-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  7. Customer Avionics Interface Development and Analysis (CAIDA): Software Developer for Avionics Systems

    NASA Technical Reports Server (NTRS)

    Mitchell, Sherry L.

    2018-01-01

    The Customer Avionics Interface Development and Analysis (CAIDA) supports the testing of the Launch Control System (LCS), NASA's command and control system for the Space Launch System (SLS), Orion Multi-Purpose Crew Vehicle (MPCV), and ground support equipment. The objective of the semester-long internship was to support day-to-day operations of CAIDA and help prepare for verification and validation of CAIDA software.

  8. Proceedings of the 3rd Annual Conference on Aerospace Computational Control, volume 1

    NASA Technical Reports Server (NTRS)

    Bernard, Douglas E. (Editor); Man, Guy K. (Editor)

    1989-01-01

    Conference topics included definition of tool requirements, advanced multibody component representation descriptions, model reduction, parallel computation, real time simulation, control design and analysis software, user interface issues, testing and verification, and applications to spacecraft, robotics, and aircraft.

  9. Verification and Validation of Multisegmented Mooring Capabilities in FAST v8: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Andersen, Morten T.; Wendt, Fabian; Robertson, Amy

    2016-08-01

    The quasi-static and dynamic mooring modules of the open-source aero-hydro-servo-elastic wind turbine simulation software, FAST v8, have previously been verified and validated, but only for mooring arrangements consisting of single lines connecting each fairlead and anchor. This paper extends the previous verification and validation efforts to focus on the multisegmented mooring capability of the FAST v8 modules: MAP++, MoorDyn, and the OrcaFlex interface. The OC3-Hywind spar buoy system tested by the DeepCwind consortium at the MARIN ocean basin, which includes a multisegmented bridle layout of the mooring system, was used for the verification and validation activities.

  10. NDEC: A NEA platform for nuclear data testing, verification and benchmarking

    NASA Astrophysics Data System (ADS)

    Díez, C. J.; Michel-Sendis, F.; Cabellos, O.; Bossant, M.; Soppera, N.

    2017-09-01

    The selection, testing, verification and benchmarking of evaluated nuclear data consists, in practice, of putting an evaluated file through a number of checking steps in which different computational codes verify that the file and the data it contains comply with different requirements. These requirements range from format compliance to good performance in application cases, while at the same time physical constraints and the agreement with experimental data are verified. At NEA, the NDEC (Nuclear Data Evaluation Cycle) platform aims at providing, in a user-friendly interface, a thorough diagnosis of the quality of a submitted evaluated nuclear data file. This diagnosis is based on the results of the different computational codes and routines that carry out the aforementioned verifications, tests and checks. NDEC also seeks synergies with other existing NEA tools and databases, such as JANIS, DICE or NDaST, incorporating them into its working scheme. Hence, this paper presents NDEC, its current development status and its usage in the JEFF nuclear data project.

  11. Development of CO2 laser Doppler instrumentation for detection of clear air turbulence, volume 1

    NASA Technical Reports Server (NTRS)

    Harris, C. E.; Jelalian, A. V.

    1979-01-01

    Modification, construction, test and operation of an advanced airborne carbon dioxide laser Doppler system for detecting clear air turbulence are described. The second generation CAT program and those auxiliary activities required to support and verify such a first-of-a-kind system are detailed: aircraft interface; ground and flight verification tests; data analysis; and laboratory examinations.

  12. WIS Implementation Study Report. Volume 2. Resumes.

    DTIC Science & Technology

    1983-10-01

    WIS modernization that major attention be paid to interface definition and design, system integration and test, and configuration management of the...Estimates -- Computer Corporation of America -- 155 Test Processing Systems -- Newburyport Computer Associates, Inc. -- 183 Cluster II Papers -- Standards...enhancements of the SPL/I compiler system, development of test systems for the verification of SDEX/M and the timing and architecture of the AN/UYK-20 and

  13. Exploring system interconnection architectures with VIPACES: from direct connections to NOCs

    NASA Astrophysics Data System (ADS)

    Sánchez-Peña, Armando; Carballo, Pedro P.; Núñez, Antonio

    2007-05-01

    This paper presents a simple environment for the verification of AMBA 3 AXI systems in Verification IP (VIP) production called VIPACES (Verification Interface Primitives for the development of AXI Compliant Elements and Systems). The primitives are provided as an uncompiled library written in SystemC in which interfaces are the core of the library. Defining interfaces instead of generic modules lets the user construct custom modules, making better use of the resources spent during the verification phase and easily adapting the modules to the AMBA 3 AXI protocol; this is the central idea of the VIPACES library. The paper focuses on comparing and contrasting the main interconnection schemes for AMBA 3 AXI as modeled by VIPACES. To assess these results, we propose a validation scenario with a particular architecture belonging to the domain of MPEG4 video decoding, composed of an AXI bus connecting an IDCT and other processing resources.

  14. Interface Generation and Compositional Verification in JavaPathfinder

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina

    2009-01-01

    We present a novel algorithm for interface generation of software components. Given a component, our algorithm uses learning techniques to compute a permissive interface representing legal usage of the component. Unlike our previous work, this algorithm does not require knowledge about the component's environment. Furthermore, in contrast to other related approaches, our algorithm computes permissive interfaces even in the presence of non-determinism in the component. Our algorithm is implemented in the JavaPathfinder model checking framework for UML statechart components. We have also added support for automated assume-guarantee style compositional verification in JavaPathfinder, using component interfaces. We report on the application of the presented approach to the generation of interfaces for flight software components.
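
    For context, assume-guarantee compositional verification of this kind typically rests on the standard asymmetric rule below, in which the generated component interface plays the role of the assumption A; the notation is the usual assume-guarantee triple and is not necessarily the paper's exact formulation.

    ```latex
    % Asymmetric assume-guarantee rule (standard formulation):
    % if M1 satisfies P under assumption A, and M2 satisfies A unconditionally,
    % then the composition M1 || M2 satisfies P.
    \frac{\langle A \rangle\, M_1\, \langle P \rangle
          \qquad
          \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
         {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
    ```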

  15. Mission Control Center (MCC) System Specification for the Shuttle Orbital Flight Test (OFT) Timeframe

    NASA Technical Reports Server (NTRS)

    1976-01-01

    System specifications to be used by the mission control center (MCC) for the shuttle orbital flight test (OFT) time frame are described. The three support systems discussed are the communication interface system (CIS), the data computation complex (DCC), and the display and control system (DCS), all of which may interface with, and share processing facilities with, other applications processing supporting current MCC programs. The MCC shall provide centralized control of the space shuttle OFT from launch through orbital flight, entry, and landing until the Orbiter comes to a stop on the runway. This control shall include the functions of vehicle management in the areas of hardware configuration (verification), flight planning, communication and instrumentation configuration management, trajectory, software and consumables, payloads management, flight safety, and verification of test conditions/environment.

  16. Towards the formal specification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    Work to formally specify the requirements and design of a Processor Interface Unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system, is described. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The approaches that were developed for modeling the PIU requirements and for composition of the PIU subcomponents at high levels of abstraction are described. These approaches were used to specify and verify a nontrivial subset of the PIU behavior. The PIU specification in Higher Order Logic (HOL) is documented in a companion NASA contractor report entitled 'Towards the Formal Specification of the Requirements and Design of a Processor Interface Unit - HOL Listings.' The subsequent verification approach and HOL listings are documented in the NASA contractor reports entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit' and 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings.'

  17. Cassini's RTGs undergo mechanical and electrical verification testing in the PHSF

    NASA Technical Reports Server (NTRS)

    1997-01-01

    Jet Propulsion Laboratory (JPL) engineers examine the interface surface on the Cassini spacecraft prior to installation of the third radioisotope thermoelectric generator (RTG). The other two RTGs, at left, already are installed on Cassini. The three RTGs will be used to power Cassini on its mission to the Saturnian system. They are undergoing mechanical and electrical verification testing in the Payload Hazardous Servicing Facility. RTGs use heat from the natural decay of plutonium to generate electric power. The generators enable spacecraft to operate far from the Sun where solar power systems are not feasible. The Cassini mission is scheduled for an Oct. 6 launch aboard a Titan IVB/Centaur expendable launch vehicle. Cassini is built and managed for NASA by JPL.

  18. Concept Verification Test - Evaluation of Spacelab/Payload operation concepts

    NASA Technical Reports Server (NTRS)

    Mcbrayer, R. O.; Watters, H. H.

    1977-01-01

    The Concept Verification Test (CVT) procedure is used to study Spacelab operational concepts by conducting mission simulations in a General Purpose Laboratory (GPL) which represents a possible design of Spacelab. In conjunction with the laboratory, a Mission Development Simulator, a Data Management System Simulator, a Spacelab Simulator, and a Shuttle Interface Simulator have been designed. (The Spacelab Simulator is more functionally and physically representative of the Spacelab than the GPL.) Four simulations of Spacelab mission experimentation were performed, two involving several scientific disciplines, one involving life sciences, and the last involving material sciences. The purpose of the CVT project is to support the pre-design and development of payload carriers and payloads, and to coordinate hardware, software, and operational concepts of different developers and users.

  19. Small Projects Rapid Integration and Test Environment (SPRITE): Application for Increasing Robustness

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Heater, Daniel; Lee, Ashley

    2013-01-01

    Marshall Space Flight Center's (MSFC) Small Projects Rapid Integration and Test Environment (SPRITE) is a Hardware-In-The-Loop (HWIL) facility that provides rapid development, integration, and testing capabilities for small projects (CubeSats, payloads, spacecraft, and launch vehicles). This facility environment focuses on efficient processes and modular design to support rapid prototyping, integration, testing and verification of small projects at an affordable cost, especially compared to larger type HWIL facilities. SPRITE (Figure 1) consists of a "core" capability or "plant" simulation platform utilizing a graphical programming environment capable of being rapidly re-configured for any potential test article's space environments, as well as a standard set of interfaces (i.e. Mil-Std 1553, Serial, Analog, Digital, etc.). SPRITE also allows this level of interface testing of components and subsystems very early in a program, thereby reducing program risk.

  20. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program

    NASA Technical Reports Server (NTRS)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby

    2017-01-01

    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way from developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the managing of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers using our unique perspectives across multiple organizations of a large NASA program.

  1. Ground Systems Development Environment (GSDE) interface requirements analysis: Operations scenarios

    NASA Technical Reports Server (NTRS)

    Church, Victor E.; Phillips, John

    1991-01-01

    This report is a preliminary assessment of the functional and data interface requirements for the link between the GSDE GS/SPF (Amdahl) and the Space Station Control Center (SSCC) and Space Station Training Facility (SSTF) Integration, Verification, and Test Environments (IVTE's). These interfaces will be involved in ground software development of both the control center and the simulation and training systems. Our understanding of the configuration management (CM) interface and the expected functional characteristics of the Amdahl-IVTE interface is described. A set of assumptions and questions that need to be considered and resolved in order to complete the interface functional and data requirements definitions is presented. A listing of information items defined to describe software configuration items in the GSDE CM system is included, as are listings of standard reports of CM information and of CM-related tools in the GSDE.

  2. Design and performance of a large vocabulary discrete word recognition system. Volume 1: Technical report. [real time computer technique for voice data processing

    NASA Technical Reports Server (NTRS)

    1973-01-01

    The development, construction, and test of a 100-word vocabulary near real time word recognition system are reported. Included are reasonable replacement of any one or all 100 words in the vocabulary, rapid learning of a new speaker, storage and retrieval of training sets, verbal or manual single word deletion, continuous adaptation with verbal or manual error correction, on-line verification of vocabulary as spoken, system modes selectable via verification display keyboard, relationship of classified word to neighboring word, and a versatile input/output interface to accommodate a variety of applications.

  3. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two mating parts that are extremely small, high-density parts and require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.
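
    A minimal sketch of the resulting pass/fail logic follows; the channel names and resistance thresholds are invented for illustration and are not taken from the patent.

    ```python
    # Hedged sketch (invented thresholds) of active alignment/contact verification:
    # one conductive pair closes only when the parts are mechanically aligned, the
    # other only when electrical contact is made through the elastomeric interface.
    ALIGN_MAX_OHMS = 10.0     # assumed: below this, the alignment pair is closed
    CONTACT_MAX_OHMS = 1.0    # assumed: below this, the contact pair is mated

    def verify(align_resistance_ohms: float, contact_resistance_ohms: float) -> dict:
        aligned = align_resistance_ohms <= ALIGN_MAX_OHMS
        contacted = contact_resistance_ohms <= CONTACT_MAX_OHMS
        return {"aligned": aligned, "contacted": contacted, "pass": aligned and contacted}

    print(verify(align_resistance_ohms=2.3, contact_resistance_ohms=0.4))   # both pairs closed
    print(verify(align_resistance_ohms=2.3, contact_resistance_ohms=950.0)) # aligned, no contact
    ```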

  4. Integrated verification and testing system (IVTS) for HAL/S programs

    NASA Technical Reports Server (NTRS)

    Senn, E. H.; Ames, K. R.; Smith, K. A.

    1983-01-01

    The IVTS is a large software system designed to support user-controlled verification analysis and testing activities for programs written in the HAL/S language. The system is composed of a user interface and user command language, analysis tools and an organized data base of host system files. The analysis tools are of four major types: (1) static analysis, (2) symbolic execution, (3) dynamic analysis (testing), and (4) documentation enhancement. The IVTS requires a split HAL/S compiler, divided at the natural separation point between the parser/lexical analyzer phase and the target machine code generator phase. The IVTS uses the internal program form (HALMAT) between these two phases as primary input for the analysis tools. The dynamic analysis component requires some way to 'execute' the object HAL/S program. The execution medium may be an interpretive simulation or an actual host or target machine.

  5. Interface COMSOL-PHREEQC (iCP), an efficient numerical framework for the solution of coupled multiphysics and geochemistry

    NASA Astrophysics Data System (ADS)

    Nardi, Albert; Idiart, Andrés; Trinchero, Paolo; de Vries, Luis Manuel; Molinero, Jorge

    2014-08-01

    This paper presents the development, verification and application of an efficient interface, denoted as iCP, which couples two standalone simulation programs: the general purpose Finite Element framework COMSOL Multiphysics® and the geochemical simulator PHREEQC. The main goal of the interface is to maximize the synergies between the aforementioned codes, providing a numerical platform that can efficiently simulate a wide number of multiphysics problems coupled with geochemistry. iCP is written in Java and uses the IPhreeqc C++ dynamic library and the COMSOL Java-API. Given the large computational requirements of the aforementioned coupled models, special emphasis has been placed on numerical robustness and efficiency. To this end, the geochemical reactions are solved in parallel by balancing the computational load over multiple threads. First, a benchmark exercise is used to test the reliability of iCP regarding flow and reactive transport. Then, a large scale thermo-hydro-chemical (THC) problem is solved to show the code capabilities. The results of the verification exercise are successfully compared with those obtained using PHREEQC and the application case demonstrates the scalability of a large scale model, at least up to 32 threads.
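
    The coupling strategy described above can be summarized as a sequential operator-splitting loop; the sketch below is schematic, with transport_step and react_cell as hypothetical stand-ins for the COMSOL and IPhreeqc calls, and a thread pool balancing the per-cell chemistry as the abstract describes.

    ```python
    # Hedged, schematic sketch of a transport/chemistry operator-splitting loop with
    # the geochemistry step parallelized over cells. The two step functions are
    # placeholders, not the actual COMSOL or PHREEQC interfaces.
    from concurrent.futures import ThreadPoolExecutor

    def transport_step(concentrations, dt):
        """Placeholder for the multiphysics (transport/heat) solve over one step."""
        return concentrations  # no-op in this sketch

    def react_cell(cell_concentrations, dt):
        """Placeholder for an equilibrium/kinetics calculation in one cell."""
        return cell_concentrations  # no-op in this sketch

    def run(concentrations, dt, n_steps, n_threads=4):
        with ThreadPoolExecutor(max_workers=n_threads) as pool:
            for _ in range(n_steps):
                concentrations = transport_step(concentrations, dt)   # physics
                concentrations = list(pool.map(                       # chemistry,
                    lambda c: react_cell(c, dt), concentrations))     # per cell
        return concentrations

    # Example: 8 cells, each with two aqueous components (mol/L), 10 coupling steps.
    cells = [[1.0e-3, 2.0e-4] for _ in range(8)]
    print(run(cells, dt=3600.0, n_steps=10))
    ```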

  6. Galileo attitude and articulation control subsystem closed loop testing

    NASA Technical Reports Server (NTRS)

    Lembeck, M. F.; Pignatano, N. D.

    1983-01-01

    In order to ensure the reliable operation of the Attitude and Articulation Control Subsystem (AACS) which will guide the Galileo spacecraft on its two and one-half year journey to Jupiter, the AACS is being rigorously tested. The primary objectives of the test program are the verification of the AACS's form, fit, and function, especially with regard to subsystem external interfaces and the functional operation of the flight software. Attention is presently given to the Galileo Closed Loop Test System, which simulates the dynamic and 'visual' flight environment for AACS components in the laboratory.

  7. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger pause during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  8. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Julie Payette closes a container, part of the equipment to be carried on the SPACEHAB and mission STS-96. She and the other crew members, Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Valery Tokarev of Russia, are at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  9. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    Posing on the platform next to the SPACEHAB Logistics Double Module in the SPACEHAB Facility are the STS-96 crew (from left) Mission Specialists Dan Barry, Tamara Jernigan, Valery Tokarev of Russia, and Julie Payette; Pilot Rick Husband; Mission Specialist Ellen Ochoa; and Commander Kent Rominger. The crew is at KSC for a payload Interface Verification Test for their upcoming mission to the International Space Station. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  10. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    At the SPACEHAB Facility, STS-96 Mission Specialist Ellen Ochoa and Commander Kent Rominger smile for the camera during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  11. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) for the upcoming mission to the International Space Station, Chris Jaskolka of Boeing points out a piece of equipment in the SPACEHAB module to STS-96 Commander Kent Rominger, Mission Specialist Ellen Ochoa and Pilot Rick Husband. Other crew members visiting KSC for the IVT are Mission Specialists Tamara Jernigan, Dan Barry, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  12. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialists Dan Barry and Tamara Jernigan discuss procedures during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  13. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, James Behling, with Boeing, talks about equipment for mission STS-96 during a payload Interface Verification Test (IVT). Watching are (from left) Mission Specialists Ellen Ochoa, Julie Payette and Dan Barry, and Pilot Rick Husband. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  14. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station, STS-96 Mission Specialists Julie Payette, Dan Barry, and Valery Tokarev of Russia, look at a Sequential Shunt Unit in the SPACEHAB Facility. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  15. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (left to right) Mission Specialists Valery Tokarev, Julie Payette (holding a lithium hydroxide canister) and Dan Barry. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband and Mission Specialists Ellen Ochoa and Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  16. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, the STS-96 crew looks over equipment during a payload Interface Verification Test for the upcoming mission to the International Space Station. From left are Commander Kent Rominger, Mission Specialists Tamara Jernigan and Valery Tokarev of Russia, Pilot Rick Husband, and Mission Specialists Ellen Ochoa and Julie Payette (backs to the camera). They are listening to Chris Jaskolka of Boeing talk about the equipment. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m. EDT.

  17. Rule groupings: An approach towards verification of expert systems

    NASA Technical Reports Server (NTRS)

    Mehrotra, Mala

    1991-01-01

    Knowledge-based expert systems are playing an increasingly important role in NASA space and aircraft systems. However, many of NASA's software applications are life- or mission-critical and knowledge-based systems do not lend themselves to the traditional verification and validation techniques for highly reliable software. Rule-based systems lack the control abstractions found in procedural languages. Hence, it is difficult to verify or maintain such systems. Our goal is to automatically structure a rule-based system into a set of rule-groups having a well-defined interface to other rule-groups. Once a rule base is decomposed into such 'firewalled' units, studying the interactions between rules would become more tractable. Verification-aid tools can then be developed to test the behavior of each such rule-group. Furthermore, the interactions between rule-groups can be studied in a manner similar to integration testing. Such efforts will go a long way towards increasing our confidence in the expert-system software. Our research efforts address the feasibility of automating the identification of rule groups, in order to decompose the rule base into a number of meaningful units.
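
    As a rough illustration of the rule-grouping idea, the sketch below partitions a toy rule base into "firewalled" groups by linking rules that reference a common working-memory atom and taking connected components. The rule encoding, the grouping criterion and the example rules are hypothetical simplifications for illustration, not the paper's algorithm.

      # Illustrative sketch only: group rules that share working-memory atoms using
      # connected components (union-find). The rule format and grouping criterion are
      # simplified stand-ins for the decomposition studied in the paper.
      from collections import defaultdict

      def rule_groups(rules):
          """rules: dict mapping rule name -> (set of condition atoms, set of action atoms)."""
          parent = {name: name for name in rules}

          def find(x):
              while parent[x] != x:
                  parent[x] = parent[parent[x]]
                  x = parent[x]
              return x

          def union(a, b):
              ra, rb = find(a), find(b)
              if ra != rb:
                  parent[ra] = rb

          atom_to_rules = defaultdict(list)
          for name, (conds, acts) in rules.items():
              for atom in conds | acts:
                  atom_to_rules[atom].append(name)
          for names in atom_to_rules.values():      # link rules that mention the same atom
              for other in names[1:]:
                  union(names[0], other)

          groups = defaultdict(set)
          for name in rules:
              groups[find(name)].add(name)
          return list(groups.values())

      # Tiny hypothetical rule base: r1 and r2 share 'valve_open'; r3 stands alone.
      example = {
          "r1": ({"pressure_high"}, {"valve_open"}),
          "r2": ({"valve_open"}, {"alarm"}),
          "r3": ({"temp_low"}, {"heater_on"}),
      }
      print(rule_groups(example))   # e.g. [{'r1', 'r2'}, {'r3'}]

    Interactions inside each returned group can then be studied in isolation, and interactions between groups treated in a manner similar to integration testing, as the abstract describes.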

  18. 78 FR 48170 - Privacy Act of 1974; CMS Computer Match No. 2013-12; HHS Computer Match No. 1307; SSA Computer...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-07

    ... Wesolowski, Director, Verifications Policy & Operations Branch, Division of Eligibility and Enrollment Policy..., electronic interfaces and an on-line system for the verification of eligibility. PURPOSE(S) OF THE MATCHING... Security number (SSN) verifications, (2) a death indicator, (3) an indicator of a finding of disability by...

  19. Extending the granularity of representation and control for the MIL-STD CAIS 1.0 node model

    NASA Technical Reports Server (NTRS)

    Rogers, Kathy L.

    1986-01-01

    The Common APSE (Ada Programming Support Environment) Interface Set (CAIS) (DoD85) node model provides an excellent baseline for interfaces in a single-host development environment. To encompass the entire spectrum of computing, however, the CAIS model should be extended in four areas. It should provide the interface between the engineering workstation and the host system throughout the entire lifecycle of the system. It should provide a basis for the communication and integration functions needed by distributed host environments. It should provide common interfaces for communications mechanisms to and among target processors. It should provide facilities for integration, validation, and verification of test beds extending to distributed systems on geographically separate processors with heterogeneous instruction set architectures (ISAs). Additions to the PROCESS NODE model to extend the CAIS into these four areas are proposed.

  20. KSC-05PD-0527

    NASA Technical Reports Server (NTRS)

    2005-01-01

    KENNEDY SPACE CENTER, FLA. In the Vehicle Assembly Building at NASA's Kennedy Space Center, workers mate the External Tank, at left, to the underside of Space Shuttle Discovery, at right. Each of two aft external tank umbilical plates mates with a corresponding plate on the orbiter. The plates help maintain alignment among the umbilicals. The attach fitting is aft of the nose gear wheel well. Workers next will perform an electrical and mechanical verification of the mated interfaces to verify all critical vehicle connections. A Shuttle interface test is performed using the launch processing system to verify Space Shuttle vehicle interfaces and Space Shuttle vehicle-to-ground interfaces. In approximately one week, Space Shuttle Discovery will be ready for rollout to Launch Pad 39B for Return to Flight mission STS-114. The launch window for STS-114 is May 15 to June 3.

  1. Power System Test and Verification at Satellite Level

    NASA Astrophysics Data System (ADS)

    Simonelli, Giulio; Mourra, Olivier; Tonicello, Ferdinando

    2008-09-01

    Most articles on power systems deal with the architecture and technical solutions related to the functionality of the power system and its performance. Few articles, if any, address the integration and verification aspects of the power system at satellite level and the related issues with the Power EGSE (Electrical Ground Support Equipment), which also has to support the AIT/AIV (Assembly, Integration, Test and Verification) program of the satellite and, eventually, the launch campaign. In recent years a more complex development and testing concept based on MDVE (Model Based Development and Verification Environment) has been introduced. In the MDVE approach, simulation software is used to simulate the satellite environment and, in the early stages, the satellite's units. This approach has changed the Power EGSE requirements significantly. Power EGSEs or, better, Power SCOEs (Special Check Out Equipment) are now required to provide the instantaneous power generated by the solar array throughout the orbit. To achieve that, the Power SCOE interfaces to the RTS (Real Time Simulator) of the MDVE. The RTS provides the instantaneous settings corresponding to that point along the orbit, and the Power SCOE generates the instantaneous {I,V} curve of the SA (Solar Array). That amounts to a real-time test of the power system, which is even more valuable for EO (Earth Observation) satellites, where the solar array aspect angle to the sun is rarely fixed and the power load profile can be particularly complex (for example, in radar applications). In this article the major issues related to the integration and testing of power systems are discussed, taking into account different power system topologies (i.e. regulated bus, unregulated bus, battery bus, based on MPPT or S3R…). Aspects of the power system AIT interfaces (I/Fs), the umbilical I/Fs with the launcher, and the Power SCOE I/Fs are also addressed. Last but not least, the protection strategy for the power system during the AIT/AIV program is also discussed. The objective of this discussion is also to provide the power system engineer with a checklist of key aspects, linked to the satellite AIT/AIV program, that have to be considered in the early phases of a new power system development.
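
    As a simple illustration of the real-time {I,V} curve generation described above, the sketch below evaluates a crude solar-array I-V curve for one orbit-dependent operating point (irradiance and temperature, as an RTS might supply them). The model form, reference data and temperature coefficients are hypothetical values chosen for illustration, not the actual Power SCOE algorithm.

      # Illustrative sketch only: a crude solar-array I-V curve generator of the kind a
      # Power SCOE might evaluate from orbit-dependent settings (irradiance, temperature)
      # supplied by a real-time simulator. Model form and all parameters are hypothetical.
      import math

      def iv_curve(irradiance_w_m2, cell_temp_c, n_points=50):
          """Return a list of (V, I) points for one instantaneous operating condition."""
          # Reference data at 1367 W/m^2, 28 degC (assumed values for illustration).
          isc_ref, voc_ref = 32.0, 110.0   # short-circuit current [A], open-circuit voltage [V]
          k_isc, k_voc = 0.03, -0.35       # rough temperature coefficients [A/K], [V/K]

          isc = isc_ref * irradiance_w_m2 / 1367.0 + k_isc * (cell_temp_c - 28.0)
          voc = voc_ref + k_voc * (cell_temp_c - 28.0)

          curve = []
          for i in range(n_points + 1):
              v = voc * i / n_points
              # Simple exponential knee: near-constant current, dropping sharply near Voc.
              current = isc * (1.0 - math.exp((v - voc) / (0.05 * voc)))
              curve.append((v, max(current, 0.0)))
          return curve

      # One point along the orbit, as the RTS might command it:
      points = iv_curve(irradiance_w_m2=1100.0, cell_temp_c=60.0)
      print(points[0], points[-1])   # (0.0, ~Isc) ... (Voc, 0.0)

    In a real Power SCOE the curve would be re-evaluated continuously as the RTS advances along the orbit and delivered to the power system as a source with the corresponding electrical characteristic.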

  2. Guidelines for mission integration, a summary report

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Guidelines are presented for instrument/experiment developers concerning hardware design, flight verification, and operations and mission implementation requirements. Interface requirements between the STS and instruments/experiments are defined. Interface constraints and design guidelines are presented along with integrated payload requirements for Spacelab Missions 1, 2, and 3. Interim data are suggested for use during hardware development until more detailed information is developed when a complete mission and an integrated payload system are defined. Safety requirements, flight verification requirements, and operations procedures are defined.

  3. KSC-01pp0249

    NASA Image and Video Library

    2001-02-03

    An overhead crane lowers the Multi-Purpose Logistics Module Donatello onto a workstand. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  4. Gluing interface qualification test results and gluing process development of the EUCLID near-infrared spectro-photometer optical assembly

    NASA Astrophysics Data System (ADS)

    Mottaghibonab, A.; Thiele, H.; Gubbini, E.; Dubowy, M.; Gal, C.; Mecsaci, A.; Gawlik, K.; Vongehr, M.; Grupp, F.; Penka, D.; Wimmer, C.; Bender, R.

    2016-07-01

    The Near Infrared Spectro-Photometer Optical assembly (NIOA) of EUCLID satellite requires high precision large lens holders with different lens materials, shapes and diameters. The aspherical lenses are glued into their separate CTE matched lens holder. The gluing of the lenses in their holder with 2K epoxy is selected as bonding process to minimize the stress in the lenses to achieve the required surface form error (SFE) performance (32nm) and lens position stability (+/-10μm) due to glue shrinkage. Adhesive shrinkage stress occurs during the glue curing at room temperature and operation in cryogenic temperatures, which might overstress the lens, cause performance loss, lens breakage or failure of the gluing interface. The selection of the suitable glue and required bonding parameters, design and qualification of the gluing interface, development and verification of the gluing process was a great challenge because of the low TRL and heritage of the bonding technology. The different material combinations (CaF2 to SS316L, LF5G15 and S-FTM16 to Titanium, SUPRASIL3001 to Invar M93), large diameter (168mm) and thin edge of the lenses, cryogenic nonoperational temperature (100K) and high performance accuracy of the lenses were the main design driver of the development. The different coefficients of thermal expansion (CTE) between lens and lens holder produce large local mechanical stress. As hygroscopic crystal calcium fluoride (CaF2) is very sensitive to moisture therefore an additional surface treatment of the gluing area is necessary. Extensive tests e.g glue handling and single lap shear tests are performed to select the suitable adhesive. Interface connection tests are performed to verify the feasibility of selected design (double pad design), injection channel, the roughness and treatment of the metal and lens interfaces, glue thickness, glue pad diameter and the gluing process. CTE and dynamic measurements of the glue, thermal cycling, damp- heat, connection shear and tension tests with all material combinations at RT and 100K are carried out to qualify the gluing interface. The gluing interface of the glued lenses in their mounts is also qualified with thermal cycling, 3D coordinate measurements before and after environmental tests, Polarimetry and vibration test of the lens assemblies. A multi-function double pad gluing tool and lens mounting tool is designed, manufactured and verified to meet the lens positioning and alignment performance of the lens in the holder which provides the possibility to glue lenses, filters, mirrors with different diameters, shapes and thickness with +/-10μm accuracy in plane, out of plane and +/-10 arcsec in tip/tilt with respect to the lens holder interface. The paper presents the glue interface qualification results, the qualification/verification methods, the developed ground support equipment and the gluing process of the EUCLID high precision large cryogenic lens mounts. Test results achieved in the test campaign demonstrate the suitability of the selected adhesive, glue pad design, interface parameters and the processes for the precise gluing of the lenses in lens holders for all lenses. The qualification models of the NIOA are successfully glued and qualified. The developed process can also be used for other glass materials e.g. MaF2 and optical black coated metallic surfaces.

  5. Free-free and fixed base modal survey tests of the Space Station Common Module Prototype

    NASA Technical Reports Server (NTRS)

    Driskill, T. C.; Anderson, J. B.; Coleman, A. D.

    1992-01-01

    This paper describes the testing aspects and the problems encountered during the free-free and fixed base modal surveys completed on the original Space Station Common Module Prototype (CMP). The CMP is a 40-ft long by 14.5-ft diameter 'waffle-grid' cylinder built by the Boeing Company and housed at the Marshall Space Flight Center (MSFC) near Huntsville, AL. The CMP modal survey tests were conducted at MSFC by the Dynamics Test Branch. The free-free modal survey tests (June '90 to Sept. '90) included interface verification tests (IFVT), often referred to as impedance measurements, mass-additive testing and linearity studies. The fixed base modal survey tests (Feb. '91 to April '91), including linearity studies, were conducted in a fixture designed to constrain the CMP in 7 total degrees-of-freedom at five trunnion interfaces (two primary, two secondary, and the keel). The fixture also incorporated an airbag off-load system designed to alleviate the non-linear effects of friction in the primary and secondary trunnion interfaces. Numerous test configurations were performed with the objective of providing a modal data base for evaluating the various testing methodologies to verify dynamic finite element models used for input to coupled load analysis.

  6. RELAP-7 Software Verification and Validation Plan

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis L.; Choi, Yong-Joon; Zou, Ling

    This INL plan comprehensively describes the software for RELAP-7 and documents the software, interface, and software design requirements for the application. The plan also describes the testing-based software verification and validation (SV&V) process—a set of specially designed software models used to test RELAP-7. The RELAP-7 (Reactor Excursion and Leak Analysis Program) code is a nuclear reactor system safety analysis code being developed at Idaho National Laboratory (INL). The code is based on the INL’s modern scientific software development framework – MOOSE (Multi-Physics Object-Oriented Simulation Environment). The overall design goal of RELAP-7 is to take advantage of the previous thirty years of advancements in computer architecture, software design, numerical integration methods, and physical models. The end result will be a reactor systems analysis capability that retains and improves upon RELAP5’s capability and extends the analysis capability for all reactor system simulation scenarios.

  7. A Hardware-in-the-Loop Simulation Platform for the Verification and Validation of Safety Control Systems

    NASA Astrophysics Data System (ADS)

    Rankin, Drew J.; Jiang, Jin

    2011-04-01

    Verification and validation (V&V) of safety control system quality and performance is required prior to installing control system hardware within nuclear power plants (NPPs). Thus, the objective of the hardware-in-the-loop (HIL) platform introduced in this paper is to verify the functionality of these safety control systems. The developed platform provides a flexible simulated testing environment which enables synchronized coupling between the real and simulated world. Within the platform, National Instruments (NI) data acquisition (DAQ) hardware provides an interface between a programmable electronic system under test (SUT) and a simulation computer. Further, NI LabVIEW resides on this remote DAQ workstation for signal conversion and routing between Ethernet and standard industrial signals as well as for user interface. The platform is applied to the testing of a simplified implementation of Canadian Deuterium Uranium (CANDU) shutdown system no. 1 (SDS1) which monitors only the steam generator level of the simulated NPP. CANDU NPP simulation is performed on a Darlington NPP desktop training simulator provided by Ontario Power Generation (OPG). Simplified SDS1 logic is implemented on an Invensys Tricon v9 programmable logic controller (PLC) to test the performance of both the safety controller and the implemented logic. Prior to HIL simulation, platform availability of over 95% is achieved for the configuration used during the V&V of the PLC. Comparison of HIL simulation results to benchmark simulations shows good operational performance of the PLC following a postulated initiating event (PIE).
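
    The sketch below shows, at a toy level, the exchange pattern such an HIL platform implements each cycle: the simulated plant state is passed to the controller under test and the controller's outputs are fed back into the simulation. The steam-generator model, the trip setpoint and all numbers are hypothetical stand-ins; the real platform couples a plant simulator to a PLC through DAQ hardware and LabVIEW signal routing rather than direct function calls.

      # Illustrative sketch only: the basic hardware-in-the-loop exchange pattern, with a
      # toy steam-generator level model standing in for the plant simulator and a simple
      # trip rule standing in for the controller under test. All names, dynamics and the
      # trip setpoint are hypothetical.
      def plant_step(level, feedwater_on, dt):
          """Toy steam-generator level dynamics [%]: boil-off vs. feedwater makeup."""
          boil_off, makeup = 0.8, 1.5            # %/s, assumed
          return level + dt * ((makeup if feedwater_on else 0.0) - boil_off)

      def controller_under_test(level):
          """Stand-in for the safety controller: trip when level falls below a setpoint."""
          trip_setpoint = 40.0                   # %, assumed
          return {"trip": level < trip_setpoint, "feedwater_on": level < 60.0}

      def run_hil(initial_level=70.0, dt=0.1, steps=200):
          level, log = initial_level, []
          for k in range(steps):
              outputs = controller_under_test(level)              # read "field" signals
              if outputs["trip"]:
                  log.append((k * dt, level, "TRIP"))
                  break
              level = plant_step(level, outputs["feedwater_on"], dt)   # advance simulated plant
              log.append((k * dt, level, "RUN"))
          return log

      if __name__ == "__main__":
          for t, lvl, state in run_hil()[-5:]:
              print(f"t={t:4.1f}s level={lvl:5.1f}% {state}")

    The same loop structure is what must stay synchronized when the controller is a physical PLC: each cycle, signals are converted and routed across the DAQ interface instead of being passed as Python arguments.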

  8. A Multifunctional Interface Method for Coupling Finite Element and Finite Difference Methods: Two-Dimensional Scalar-Field Problems

    NASA Technical Reports Server (NTRS)

    Ransom, Jonathan B.

    2002-01-01

    A multifunctional interface method with capabilities for variable-fidelity modeling and multiple method analysis is presented. The methodology provides an effective capability by which domains with diverse idealizations can be modeled independently to exploit the advantages of one approach over another. The multifunctional method is used to couple independently discretized subdomains, and it is used to couple the finite element and the finite difference methods. The method is based on a weighted residual variational method and is presented for two-dimensional scalar-field problems. A verification test problem and a benchmark application are presented, and the computational implications are discussed.
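
    For background, a generic weighted-residual interface condition of the kind such coupling methods build on can be written as follows; this is an illustrative form, not necessarily the exact functional used in the paper. For a finite element subdomain and a finite difference subdomain sharing an interface Γ, continuity of the field u across Γ is enforced weakly through an independently interpolated Lagrange multiplier λ:

    \[
    \int_\Gamma \delta\lambda \, (u_1 - u_2)\, d\Gamma
    \;+\; \int_\Gamma \lambda \, (\delta u_1 - \delta u_2)\, d\Gamma \;=\; 0 ,
    \]

    where u_1 and u_2 are the field approximations on the two sides of the interface. Each subdomain can then be discretized and refined independently, with the multiplier field carrying the interface information between the two idealizations.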

  9. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, the STS-96 crew looks at equipment as part of a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Mission Specialist Ellen Ochoa (behind the opened storage cover), Commander Kent Rominger, Pilot Rick Husband (holding a lithium hydroxide canister) and Mission Specialists Dan Barry, Valery Tokarev of Russia and Julie Payette. In the background is TTI interpreter Valentina Maydell. The other crew member at KSC for the IVT is Mission Specialist Tamara Jernigan. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  10. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 crew members look over equipment during a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. From left are Khristal Parker, with Boeing; Mission Specialist Dan Barry, Pilot Rick Husband, Mission Specialist Tamara Jernigan, and at the far right, Mission Specialist Julie Payette. An unidentified worker is in the background. Also at KSC for the IVT are Commander Kent Rominger and Mission Specialists Ellen Ochoa and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  11. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, (left to right) STS-96 Pilot Rick Husband and Mission Specialists Julie Payette and Ellen Ochoa work the straps on the Sequential Shunt Unit (SSU) in front of them. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for its upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  12. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (left) and Commander Kent Rominger (second from right) listen to Lynn Ashby (far right), with JSC, talking about the SPACEHAB equipment in front of them during a payload Interface Verification Test (IVT). In the background behind Tokarev is TTI interpreter Valentina Maydell. Other STS-96 crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Dan Barry, Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  13. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev of Russia (second from left) and Commander Kent Rominger learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. At the far left looking on is TTI interpreter Valentina Maydell. Other crew members at KSC for the IVT are Pilot Rick Husband and Mission Specialists Ellen Ochoa, Tamara Jernigan, Dan Barry and Julie Payette. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  14. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, STS-96 Mission Specialist Valery Tokarev (in foreground) of the Russian Space Agency closes a container, part of the equipment that will be in the SPACEHAB module on mission STS-96. Behind Tokarev are Pilot Rick Husband (left) and Mission Specialist Dan Barry (right). Other crew members at KSC for a payload Interface Verification Test for the upcoming mission to the International Space Station are Commander Kent Rominger and Mission Specialists Ellen Ochoa, Tamara Jernigan and Julie Payette. Mission STS-96 carries the SPACEHAB Logistics Double Module, which has equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  15. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Mission Specialist Tamara Jernigan checks over instructions while Mission Specialist Dan Barry looks up from the Sequential Shunt Unit (SSU) in front of him to other equipment Lynn Ashby (right), with Johnson Space Center, is pointing at. Other crew members at KSC for the IVT are Commander Kent Rominger, Pilot Rick Husband, and Mission Specialists Ellen Ochoa, Julie Payette and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  16. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    During a payload Interface Verification Test (IVT) in the SPACEHAB Facility, STS-96 Pilot Rick Husband and Mission Specialist Ellen Ochoa (on the left) and Mission Specialist Julie Payette (on the far right) listen to Khristal Parker (second from right), with Boeing, explain the equipment in front of them. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The Sequential Shunt Unit (SSU) is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  17. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station are (kneeling) STS-96 Mission Specialists Julie Payette and Ellen Ochoa, Pilot Rick Husband, and (standing at right) Mission Specialist Dan Barry. At the left is James Behling, with Boeing, explaining some of the equipment that will be on board STS-96. Other STS-96 crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan and Valery Tokarev of Russia. Mission STS-96 carries the SPACEHAB Logistics Double Module, which will have equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. It carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  18. Discrete Abstractions of Hybrid Systems: Verification of Safety and Application to User-Interface Design

    NASA Technical Reports Server (NTRS)

    Oishi, Meeko; Tomlin, Claire; Degani, Asaf

    2003-01-01

    Human interaction with complex hybrid systems involves the user, the automation's discrete mode logic, and the underlying continuous dynamics of the physical system. Often the user-interface of such systems displays a reduced set of information about the entire system. In safety-critical systems, how can we identify user-interface designs which do not have adequate information, or which may confuse the user? Here we describe a methodology, based on hybrid system analysis, to verify that a user-interface contains information necessary to safely complete a desired procedure or task. Verification within a hybrid framework allows us to account for the continuous dynamics underlying the simple, discrete representations displayed to the user. We provide two examples: a car traveling through a yellow light at an intersection and an aircraft autopilot in a landing/go-around maneuver. The examples demonstrate the general nature of this methodology, which is applicable to hybrid systems (not fully automated) which have operational constraints we can pose in terms of safety. This methodology differs from existing work in hybrid system verification in that we directly account for the user's interactions with the system.
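
    The yellow-light example can be made concrete with a small point check of the continuous reasoning that underlies the discrete display: given speed and distance to the stop line, is stopping feasible, is clearing the intersection feasible, or neither? All parameters below are hypothetical, and the paper's method is a formal hybrid-system analysis rather than this single-point test; the sketch only illustrates why a display that reports "yellow" alone may not carry enough information for a safe decision.

      # Illustrative sketch only: the continuous reasoning behind the yellow-light example.
      # Given speed and distance to the stop line, check whether the driver can stop in time
      # or clear the intersection before the light turns red. All parameters are hypothetical.
      def yellow_light_options(speed, dist_to_line, yellow_time=3.0,
                               intersection_len=20.0, max_brake=6.0):
          """speed [m/s], distances [m], times [s], deceleration [m/s^2]."""
          stopping_dist = speed ** 2 / (2.0 * max_brake)
          can_stop = stopping_dist <= dist_to_line
          # Conservative "go" check: constant speed, must clear the far side before red.
          can_clear = speed * yellow_time >= dist_to_line + intersection_len
          return {"can_stop": can_stop, "can_clear": can_clear,
                  "safe": can_stop or can_clear}

      # Two situations the bare "yellow" indication cannot distinguish:
      print(yellow_light_options(speed=15.0, dist_to_line=10.0))   # cannot stop, can clear
      print(yellow_light_options(speed=15.0, dist_to_line=40.0))   # can stop, cannot clear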

  19. Enhancing the Human Factors Engineering Role in an Austere Fiscal Environment

    NASA Technical Reports Server (NTRS)

    Stokes, Jack W.

    2003-01-01

    An austere fiscal environment in the aerospace community creates pressure to reduce program costs, often minimizing or even deleting the human interface requirements from the design process. On the assumption that the flight crew can recover in real time from a poorly human-factored space vehicle design, the classical crew interface requirements have either not been included in the design or not been properly funded, though carried as requirements. Cost cuts have also affected the quality of retained human factors engineering personnel. In response to this concern, planning is ongoing to correct these issues. Herein are techniques for ensuring that human interface requirements are integrated into a flight design, from proposal through verification and launch activation. These include human factors requirements refinement and consolidation across flight programs; keyword phrases in the proposals; closer ties with systems engineering and other classical disciplines; early planning for crew-interface verification; and an Agency-integrated human factors verification program, under the One NASA theme. Importance is given to communication within the aerospace human factors discipline, and to utilizing the strengths of all government, industry, and academic human factors organizations in a unified research and engineering approach. A list of recommendations and concerns is provided in closing.

  20. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
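
    As background on the Horn-clause representation, a toy loop that sets x to 0 and increments it while x < n, with the assertion x ≥ 0 at exit, gives rise to constrained Horn clauses of roughly the following shape (an illustrative encoding, not SeaHorn's literal output):

    \[
    \begin{aligned}
    \mathit{Inv}(x) &\leftarrow x = 0,\\
    \mathit{Inv}(x') &\leftarrow \mathit{Inv}(x) \wedge x < n \wedge x' = x + 1,\\
    x \ge 0 &\leftarrow \mathit{Inv}(x) \wedge x \ge n .
    \end{aligned}
    \]

    A Horn-clause solver establishes safety by finding an interpretation of Inv that satisfies all three clauses, for example Inv(x) ≡ x ≥ 0; this separation of program encoding from solving is what allows multiple back-end verification tools to be plugged in.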

  1. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smartt, Heidi A.; Romero, Juan A.; Custer, Joyce Olsen

    Containment/Surveillance (C/S) measures are critical to any verification regime in order to maintain Continuity of Knowledge (CoK). The Ceramic Seal project is research into the next generation technologies to advance C/S, in particular improving security and efficiency. The Ceramic Seal is a small form factor loop seal with improved tamper-indication including a frangible seal body, tamper planes, external coatings, and electronic monitoring of the seal body integrity. It improves efficiency through a self-securing wire and in-situ verification with a handheld reader. Sandia National Laboratories (SNL) and Savannah River National Laboratory (SRNL), under sponsorship from the U.S. National Nuclear Security Administration (NNSA) Office of Defense Nuclear Nonproliferation Research and Development (DNN R&D), have previously designed and have now fabricated and tested Ceramic Seals. Tests have occurred at both SNL and SRNL, with different types of tests occurring at each facility. This interim report will describe the Ceramic Seal prototype, the design and development of a handheld standalone reader and an interface to a data acquisition system, fabrication of the seals, and results of initial testing.

  2. Cryo Testing of the James Webb Space Telescope's Integrated Science Instrument Module

    NASA Technical Reports Server (NTRS)

    VanCampen, Julie

    2004-01-01

    The Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope will be integrated and tested at the Environmental Test Facilities at Goddard Space Flight Center (GSFC). The cryogenic thermal vacuum testing of the ISIM will be the most difficult and problematic portion of the GSFC integration and test flow. The test is to validate the coupled interface of the science instruments and the ISIM structure and to sufficiently stress that interface while validating the image quality of the science instruments. The instruments and the structure are not made from the same materials and have different CTEs. Test objectives and verification rationale are currently being evaluated in Phase B of the project plan. The test program will encounter engineering challenges and limitations driven by cost and technology, many of which can be mitigated by facility upgrades, creative GSE, and thorough forethought. The cryogenic testing of the ISIM will involve a number of risks, such as the implementation of unique metrology techniques and of mechanical, electrical and optical simulators housed within the cryogenic vacuum environment. These potential risks are investigated and possible solutions are proposed.

  3. Vehicle Performance Recorder (VPR)/ HMMWV (High Mobility Multi-Purpose Wheeled Vehicle) Interface Verification.

    DTIC Science & Technology

    1984-05-01

    hybrid transmission used in the VPR vehicle. From these comparisons made with HMMWV Developmental Test data, confidence can be placed in the validity of... [Figure 3-2, a drawbar pull chart for the VPR vehicle, is not legible in this extract.] Engine: GMC V-8 diesel, 6.2 L. Transmission: Model THM 475/400 (hybrid). Transfer: New Process 218, full-time 4-wheel drive. Differential: Gleasman.

  4. Development of a prototype two-phase thermal bus system for Space Station

    NASA Technical Reports Server (NTRS)

    Myron, D. L.; Parish, R. C.

    1987-01-01

    This paper describes the basic elements of a pumped two-phase ammonia thermal control system designed for microgravity environments, the development of the concept into a Space Station flight design, and design details of the prototype to be ground-tested in the Johnson Space Center (JSC) Thermal Test Bed. The basic system concept is one of forced-flow heat transport through interface heat exchangers with anhydrous ammonia being pumped by a device expressly designed for two-phase fluid management in reduced gravity. Control of saturation conditions, and thus system interface temperatures, is accomplished with a single central pressure regulating valve. Flow control and liquid inventory are controlled by passive, nonelectromechanical devices. Use of these simple control elements results in minimal computer controls and high system reliability. Building on the basic system concept, a brief overview of a potential Space Station flight design is given. Primary verification of the system concept will involve testing at JSC of a 25-kW ground test article currently in fabrication.
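
    The reason a single pressure-regulating valve can set the interface temperatures is that, for a saturated two-phase fluid, pressure and temperature are locked together through the Clausius-Clapeyron relation (stated here only as background to the control concept described above):

    \[
    \frac{dP_{\mathrm{sat}}}{dT} \;=\; \frac{h_{fg}}{T\,\left(v_g - v_l\right)} ,
    \]

    so fixing the loop saturation pressure fixes the saturation temperature of the ammonia, and with it the temperature presented at every interface heat exchanger.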

  5. System Testing of Ground Cooling System Components

    NASA Technical Reports Server (NTRS)

    Ensey, Tyler Steven

    2014-01-01

    This internship focused primarily upon software unit testing of Ground Cooling System (GCS) components, one of the three types of tests (unit, integrated, and COTS/regression) utilized in software verification. Unit tests are used to test the software of the necessary components before it is implemented in the hardware. A unit test exercises the control data, usage procedures, and operating procedures of a particular component to determine whether the program is fit for use. Three different files are used to build and run a unit test: the Model Test file (.mdl), the Simulink SystemTest (.test), and the autotest (.m). The Model Test file includes the component being tested with the appropriate Discrete Physical Interface (DPI) for testing. The Simulink SystemTest is used to test all of the requirements of the component. The autotest verifies that the component passes Model Advisor and system testing, and writes the results to the proper files. Once unit testing is completed on the GCS components, they can be implemented in the GCS schematic and the software of the GCS model as a whole can be tested using integrated testing. Unit testing is a critical part of software verification; it allows the more basic components to be tested before a model of higher fidelity is tested, making the testing process flow in an orderly manner.
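
    By way of analogy only, the same three-part pattern (a component under test, a set of requirement checks, and an automated wrapper that runs them and files the results) looks like the sketch below in Python. The actual GCS unit tests use the Simulink model/SystemTest/autotest files described above; the component, its requirement and the output file here are hypothetical.

      # Illustrative analogy only: component under test, requirement checks, and an
      # automated wrapper that runs the checks and records the results to a file.
      import json
      import unittest

      class ChillerValveController:
          """Hypothetical GCS-style component: opens the valve above a supply temperature."""
          def __init__(self, setpoint_c=7.0):
              self.setpoint_c = setpoint_c
          def command(self, supply_temp_c):
              return "OPEN" if supply_temp_c > self.setpoint_c else "CLOSED"

      class TestChillerValveController(unittest.TestCase):   # the "requirements" test
          def test_opens_above_setpoint(self):
              self.assertEqual(ChillerValveController().command(9.0), "OPEN")
          def test_closes_at_or_below_setpoint(self):
              self.assertEqual(ChillerValveController().command(7.0), "CLOSED")

      def autotest(result_path="unit_test_results.json"):   # the "autotest" wrapper
          suite = unittest.defaultTestLoader.loadTestsFromTestCase(TestChillerValveController)
          result = unittest.TextTestRunner(verbosity=0).run(suite)
          summary = {"run": result.testsRun,
                     "failures": len(result.failures),
                     "errors": len(result.errors)}
          with open(result_path, "w") as fh:
              json.dump(summary, fh, indent=2)   # file the results, as the autotest does
          return summary

      if __name__ == "__main__":
          print(autotest())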

  6. Communication Satellite Payload Special Check out Equipment (SCOE) for Satellite Testing

    NASA Astrophysics Data System (ADS)

    Subhani, Noman

    2016-07-01

    This paper presents the Payload Special Check-out Equipment (SCOE) for the test and measurement of a communication satellite payload at subsystem and system level. The main emphasis of the paper is on the principal test equipment and instruments and on the payload test matrix for automatic test control. Electrical Ground Support Equipment (EGSE)/Special Check-out Equipment (SCOE) requirements, functions and architecture for C-band and Ku-band payloads are presented in detail, along with their interfaces with the satellite during the different phases of satellite testing. The test setup is provided in a single rack cabinet that can easily be moved from the payload assembly and integration environment to the thermal vacuum chamber and all the way to the launch site (for pre-launch test and verification).

  7. International interface design for Space Station Freedom - Challenges and solutions

    NASA Technical Reports Server (NTRS)

    Mayo, Richard E.; Bolton, Gordon R.; Laurini, Daniele

    1988-01-01

    The definition of interfaces for the International Space Station is discussed, with a focus on negotiations between NASA and ESA. The program organization and division of responsibilities for the Space Station are outlined; the basic features of physical and functional interfaces are described; and particular attention is given to the interface management and documentation procedures, architectural control elements, interface implementation and verification, and examples of Columbus interface solutions (including mechanical, ECLSS, thermal-control, electrical, data-management, standardized user, and software interfaces). Diagrams, drawings, graphs, and tables listing interface types are provided.

  8. STS-96 crew takes part in payload Interface Verification Test

    NASA Technical Reports Server (NTRS)

    1999-01-01

    In the SPACEHAB Facility, (from left) STS-96 Mission Specialist Julie Payette, Pilot Rick Husband and Mission Specialist Ellen Ochoa learn about the Sequential Shunt Unit (SSU) in front of them from Lynn Ashby (far right), with Johnson Space Center. The STS-96 crew is at KSC for a payload Interface Verification Test (IVT) for their upcoming mission to the International Space Station. Other crew members at KSC for the IVT are Commander Kent Rominger and Mission Specialists Tamara Jernigan, Dan Barry and Valery Tokarev of Russia. The SSU is part of the cargo on Mission STS-96, which carries the SPACEHAB Logistics Double Module, with equipment to further outfit the International Space Station service module and equipment that can be off-loaded from the early U.S. assembly flights. The SPACEHAB carries internal logistics and resupply cargo for station outfitting, plus an external Russian cargo crane to be mounted to the exterior of the Russian station segment and used to perform space walking maintenance activities. The double module stowage provides capacity of up to 10,000 lbs. with the ability to accommodate powered payloads, four external rooftop stowage locations, four double-rack locations (two powered), up to 61 bulkhead-mounted middeck locker locations, and floor storage for large unique items and Soft Stowage. STS-96 is targeted to launch May 20 about 9:32 a.m.

  9. MPLM On-Orbit Interface Dynamic Flexibility Modal Test

    NASA Technical Reports Server (NTRS)

    Bookout, Paul S.; Rodriguez, Pedro I.; Tinson, Ian; Fleming, Paolo

    2001-01-01

    Now that the International Space Station (ISS) is being constructed, payload developers have to verify not only the Shuttle-to-payload interface but also the interfaces their payload will have with the ISS. The Multi-Purpose Logistics Module (MPLM), designed and built by Alenia Spazio in Torino, Italy, is one such payload. The MPLM is the primary carrier for ISS payload racks, resupply stowage racks, and resupply stowage platforms used to resupply the ISS with food, water, experiments, maintenance equipment, etc. During the development of the MPLM there was no requirement for verification of the on-orbit interfaces with the ISS. When this oversight was discovered, all the dynamic test stands had already been disassembled. A method was needed that would not require an extensive test stand and could be completed in a short amount of time. The residual flexibility testing technique was chosen. The residual flexibility modal testing method consists of measuring the free-free natural frequencies and mode shapes along with the interface frequency response functions (FRFs). Analytically, the residual flexibility method has been investigated in detail by MacNeal; Martinez, Carne, and Miller; and Rubin, but it has not been implemented extensively for model correlation due to difficulties in data acquisition. In recent years, improvements in data acquisition equipment have made implementation of the residual flexibility method possible, as in Admire, Tinker, and Ivey, and Klosterman and Lemon. The residual flexibility modal testing technique is applicable to a structure with distinct points (DOFs) of contact with its environment, such as the MPLM-to-Station interface through the Common Berthing Mechanism (CBM). The CBM is bolted to a flange on the forward cone of the MPLM. During the fixed-base test (to verify Shuttle interfaces) some data were gathered on the forward cone panels. Even though there were some data on the forward cone, an additional modal test was performed to better characterize its behavior. The CBM mounting flange is the only remaining structure of the MPLM for which no test data were available. This paper discusses the implementation of the residual flexibility modal testing technique on the CBM flange and the modal test of the forward cone panels.
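
    For background, residual flexibility testing is commonly described in terms of an expansion of the drive-point FRF at an interface degree of freedom of a free-free structure (an illustrative form, not necessarily the exact formulation used in the MPLM test):

    \[
    H_{jj}(\omega) \;=\; \sum_{r} \frac{\phi_{jr}^{2}}{-\omega^{2}}
    \;+\; \sum_{k=1}^{m} \frac{\phi_{jk}^{2}}{\omega_k^{2} - \omega^{2}}
    \;+\; G_{jj}^{\mathrm{res}} ,
    \]

    where the first sum is the rigid-body contribution, the second sum covers the m measured elastic modes, and the residual flexibility term, representing the truncated higher modes, is extracted from the low-frequency behavior of the measured interface FRFs.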

  10. Mars 2020 Model Based Systems Engineering Pilot

    NASA Technical Reports Server (NTRS)

    Dukes, Alexandra Marie

    2017-01-01

    The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements based on the design and verification artifacts from Mars Science Laboratory (MSL) and to demonstrate how MBSE could be used by LSP to gain further insight on the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. The method used to accomplish this pilot started through familiarization of SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen as a system of focus since it is an interface between the spacecraft and launch vehicle which can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen as a process of focus since the verifications for mass properties occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another 2 weeks, pros and cons of using MBSE for LSP IE have been identified. A pro of using MBSE includes an integrated view of the disciplines, requirements, and verifications leading up to launch. The model allows IE to understand the relationships between disciplines throughout test activities and verifications. Additionally, the relationships between disciplines and integration tasks are generally consistent. The model allows for the generic relationships and tasks to be captured and used throughout multiple mission models should LSP further pursue MBSE. A con of MBSE is the amount of time it takes upfront to understand MBSE and create a useful model. The upfront time it takes to create a useful model is heavily discussed in MBSE literature and is a consistent con throughout the known applications of MBSE. The need to understand SysML and the software chosen also poses the possibility of a "bottleneck" or one person being the sole MBSE user for the working group. The utility of MBSE will continue to be evaluated through the remainder of the study. In conclusion, the original objectives of the pilot study were to use artifacts from MSL to model key aspects of Mars 2020 and demonstrate how MBSE could be used by LSP to gain insight into the spacecraft and launch vehicle interfaces. Progress has been made in modeling and identifying the utility of MBSE to LSP IE and will continue to be made until the pilot study's conclusion in mid-August. The results of this study will produce initial models, modeling instructions and examples, and a summary of MBSE's utility for future use by LSP.

  11. Correlation of ground tests and analyses of a dynamically scaled Space Station model configuration

    NASA Technical Reports Server (NTRS)

    Javeed, Mehzad; Edighoffer, Harold H.; Mcgowan, Paul E.

    1993-01-01

    Verification of analytical models through correlation with ground test results of a complex space truss structure is demonstrated. A multi-component, dynamically scaled space station model configuration is the focus structure for this work. Previously established test/analysis correlation procedures are used to develop improved component analytical models. Integrated system analytical models, consisting of updated component analytical models, are compared with modal test results to establish the accuracy of system-level dynamic predictions. Design sensitivity model updating methods are shown to be effective for providing improved component analytical models. Also, the effects of component model accuracy and interface modeling fidelity on the accuracy of integrated model predictions is examined.
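
    A metric commonly used for the mode-shape comparison step in this kind of test/analysis correlation is the Modal Assurance Criterion (given here as general background, not necessarily the exact criterion used in this study):

    \[
    \mathrm{MAC}(i,j) \;=\;
    \frac{\left|\,\boldsymbol{\phi}_{t,i}^{T}\,\boldsymbol{\phi}_{a,j}\,\right|^{2}}
    {\left(\boldsymbol{\phi}_{t,i}^{T}\boldsymbol{\phi}_{t,i}\right)\left(\boldsymbol{\phi}_{a,j}^{T}\boldsymbol{\phi}_{a,j}\right)} ,
    \]

    with values near unity indicating a well-correlated pair of test and analysis mode shapes.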

  12. KSC-01pp0244

    NASA Image and Video Library

    2001-02-03

    The lid is off the shipping container with the Multi-Purpose Logistics Module Donatello inside. It sits on a transporter inside the Space Station Processing Facility. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  13. KSC-01pp0245

    NASA Image and Video Library

    2001-02-03

    Workers in the Space Station Processing Facility attach an overhead crane to the Multi-Purpose Logistics Module Donatello to lift it out of the shipping container. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  14. KSC-01pp0246

    NASA Image and Video Library

    2001-02-03

    In the Space Station Processing Facility, workers help guide the overhead crane as it lifts the Multi-Purpose Logistics Module Donatello out of the shipping container. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  15. KSC-01pp0247

    NASA Image and Video Library

    2001-02-03

    In the Space Station Processing Facility, workers help guide the Multi-Purpose Logistics Module Donatello as it moves the length of the SSPF toward a workstand. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  16. KSC-01pp0248

    NASA Image and Video Library

    2001-02-03

    In the Space Station Processing Facility, workers wait for the Multi-Purpose Logistics Module Donatello, suspended by an overhead crane, to move onto a workstand. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004

  17. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    NASA Astrophysics Data System (ADS)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    During the design of a time management algorithm for DVEs, researchers are often slowed by having to implement the trivial but fundamental details of simulation and verification. A platform that already provides these details is therefore desirable, but to our knowledge none has been reported in published work. In this paper, we are the first to design and implement a DVE time management simulation and verification platform that provides exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is built on a newly designed causality consistency middleware and can offer a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the platform incurs only a small overhead and that its performance effectively allows researchers to focus solely on improving their algorithm designs.
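
    As a generic illustration of the timestamp-order (TSO) delivery idea mentioned above (not the paper's middleware), the sketch below buffers events in a priority queue and releases only those whose timestamps do not exceed the caller-supplied lower bound on future timestamps; the class name and lower-bound interface are assumptions.

      import heapq

      class TSOQueue:
          """Minimal timestamp-order delivery buffer (illustrative only).

          Events are held until the caller guarantees, via `lbts`, that no event
          with a smaller timestamp can still arrive.
          """
          def __init__(self):
              self._heap = []
              self._seq = 0          # tie-breaker for equal timestamps

          def insert(self, timestamp, event):
              heapq.heappush(self._heap, (timestamp, self._seq, event))
              self._seq += 1

          def deliverable(self, lbts):
              """Pop and return all events with timestamp <= lbts, in time order."""
              out = []
              while self._heap and self._heap[0][0] <= lbts:
                  ts, _, ev = heapq.heappop(self._heap)
                  out.append((ts, ev))
              return out

      q = TSOQueue()
      q.insert(12.0, "fire")
      q.insert(10.5, "move")
      print(q.deliverable(11.0))   # [(10.5, 'move')] -- 'fire' stays buffered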

  18. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  19. Using Dynamic Interface Modeling and Simulation to Develop a Launch and Recovery Flight Simulation for a UH-60A Blackhawk

    NASA Technical Reports Server (NTRS)

    Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike

    2001-01-01

    Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, the simulation facility, and simulation technical experience. This paper highlights the benefits of the NASA/JSHIP collaboration and details the achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of cueing configurations, including visual, aural, and body-force cueing devices. The system's flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required development and integration of several major simulation systems at the VMS. A new UH-60A BlackHawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built. Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data for an LHA ship airwake and integrating a real-time ship motion model developed from a batch model from the Naval Surface Warfare Center. Engineering development and integration of a three degrees-of-freedom (DOF) dynamic seat, which simulates high-frequency rotor-dynamics-dependent motion cues for use in conjunction with the large motion system, was also accomplished. An LHA visual model at several different levels of resolution and an aural cueing system with three selectable fidelity levels were developed. VMS also integrated a PC-based E&S simFUSION system to investigate cost-effective IG alternatives. The DIMSS project consists of three phases that follow an approved Verification, Validation and Accreditation (VV&A) process. The first phase will support the accreditation of the individual subsystems and models. The second will follow the verification and validation of the integrated subsystems and models, and will address fidelity requirements of the integrated models and subsystems. The third and final phase will allow the verification and validation of the full system integration. This VV&A process will address the utility of the simulated WOD launch and recovery envelope. Simulations supporting the first two stages have been completed, and the data are currently being reviewed and analyzed.

  20. Space station dynamics, attitude control and momentum management

    NASA Technical Reports Server (NTRS)

    Sunkel, John W.; Singh, Ramen P.; Vengopal, Ravi

    1989-01-01

    The Space Station Attitude Control System software test-bed provides a rigorous environment for the design, development and functional verification of GN and C algorithms and software. The approach taken for the simulation of the vehicle dynamics and environmental models using a computationally efficient algorithm is discussed. The simulation includes capabilities for docking/berthing dynamics, prescribed motion dynamics associated with the Mobile Remote Manipulator System (MRMS) and microgravity disturbances. The vehicle dynamics module interfaces with the test-bed through the central Communicator facility which is in turn driven by the Station Control Simulator (SCS) Executive. The Communicator addresses issues such as the interface between the discrete flight software and the continuous vehicle dynamics, and multi-programming aspects such as the complex flow of control in real-time programs. Combined with the flight software and redundancy management modules, the facility provides a flexible, user-oriented simulation platform.
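
    The interface between discrete flight software and continuous vehicle dynamics that the Communicator manages can be pictured, very loosely, as a fixed-step co-simulation loop in which the controller runs at a slower rate than the integrator. Everything below (names, rates, the toy dynamics and gains) is a hypothetical sketch, not the SCS test-bed.

      def run_testbed(t_end=30.0, dt_dyn=0.01, dt_fsw=0.1):
          """Toy co-simulation: continuous 1-DOF attitude dynamics integrated at
          dt_dyn, discrete 'flight software' (a PD law) updating at dt_fsw with a
          zero-order hold between samples."""
          theta, omega = 0.2, 0.0          # attitude error [rad], rate [rad/s]
          inertia, torque = 100.0, 0.0
          steps_per_fsw = int(round(dt_fsw / dt_dyn))
          for i in range(int(round(t_end / dt_dyn))):
              if i % steps_per_fsw == 0:   # discrete controller sample
                  torque = -50.0 * theta - 40.0 * omega
              alpha = torque / inertia     # continuous dynamics, explicit Euler step
              omega += alpha * dt_dyn
              theta += omega * dt_dyn
          return theta, omega

      print(run_testbed())                 # small residual attitude error after 30 s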

  1. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  2. Astronaut tool development: An orbital replaceable unit-portable handhold

    NASA Technical Reports Server (NTRS)

    Redmon, John W., Jr.

    1989-01-01

    A tool to be used during astronaut Extra-Vehicular Activity (EVA) replacement of spent or defective electrical/electronic component boxes is described. The generation of requirements and design philosophies are detailed, as well as specifics relating to mechanical development, interface verifications, testing, and astronaut feedback. Findings are presented in the form of: (1) a design which is universally applicable to spacecraft component replacement, and (2) guidelines that the designer of orbital replacement units might incorporate to enhance spacecraft on-orbit maintainability and EVA mission safety.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Singleton, Jr., Robert; Israel, Daniel M.; Doebling, Scott William

    For code verification, one compares the code output against known exact solutions. There are many standard test problems used in this capacity, such as the Noh and Sedov problems. ExactPack is a utility that integrates many of these exact solution codes into a common API (application program interface), and can be used as a stand-alone code or as a python package. ExactPack consists of python driver scripts that access a library of exact solutions written in Fortran or Python. The spatial profiles of the relevant physical quantities, such as the density, fluid velocity, sound speed, or internal energy, are returned at a time specified by the user. The solution profiles can be viewed and examined by a command line interface or a graphical user interface, and a number of analysis tools and unit tests are also provided. We have documented the physics of each problem in the solution library, and provided complete documentation on how to extend the library to include additional exact solutions. ExactPack’s code architecture makes it easy to extend the solution-code library to include additional exact solutions in a robust, reliable, and maintainable manner.
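
    The common-API idea described above can be sketched generically: a solver object is configured with problem parameters and, when called with spatial points and a time, returns the exact profiles. The example below is a deliberately simple stand-in (exact solution of 1-D linear advection of a Gaussian density pulse) with invented class and field names; it is not the actual ExactPack interface.

      import numpy as np

      class AdvectedGaussian:
          """Exact solution of 1-D linear advection of a Gaussian density pulse
          (a toy stand-in for the solver classes a verification library exposes)."""
          def __init__(self, speed=1.0, center=0.25, width=0.05, rho0=1.0):
              self.speed, self.center, self.width, self.rho0 = speed, center, width, rho0

          def __call__(self, r, t):
              """Return the exact density profile at points r and time t."""
              x0 = self.center + self.speed * t
              rho = self.rho0 * np.exp(-((r - x0) / self.width) ** 2)
              return {"position": r, "density": rho}

      # verification-style usage: sample the exact profile at the code's output time
      solver = AdvectedGaussian(speed=2.0)
      profile = solver(np.linspace(0.0, 1.0, 101), t=0.2)
      print(profile["density"].max(), profile["position"][profile["density"].argmax()])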

  4. Space Station Furnace Facility. Volume 2: Appendix 1: Contract End Item specification (CEI), part 1

    NASA Technical Reports Server (NTRS)

    Seabrook, Craig

    1992-01-01

    This specification establishes the performance, design, development, and verification requirements for the Space Station Furnace Facility (SSFF) Core. The SSFF Core and its interfaces are defined, requirements for SSFF Core performance, design, and construction are specified, and the verification requirements are established.

  5. Gas cooled fuel cell systems technology development

    NASA Technical Reports Server (NTRS)

    Feret, J. M.

    1983-01-01

    The first phase of a planned multiphase program to develop a phosphoric acid fuel cell (PAFC) power plant is addressed. This report describes the efforts performed that culminated in: (1) Establishment of the preliminary design requirements and system conceptual design for the nominally rated 375 kW PAFC module and its interfacing power plant systems; (2) Establishment of PAFC component and stack performance, endurance, and design parameter data needed for design verification for power plant application; (3) Improvement of the existing PAFC materials data base and establishment of materials specifications and process procedures for the cell components; and (4) Completion of 122 subscale-cell atmospheric tests totaling 110,000 cumulative test hours, 12 subscale-cell pressurized tests totaling 15,000 cumulative test hours, and 12 pressurized stack tests totaling 10,000 cumulative test hours.

  6. International Space Station Requirement Verification for Commercial Visiting Vehicles

    NASA Technical Reports Server (NTRS)

    Garguilo, Dan

    2017-01-01

    The COTS program demonstrated that NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to the ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from 1000s to 100s) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.

  7. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, cranes help offload the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus '''Beluga''' air cargo plane. The third of three for the International Space Station, the module will be moved on a transporter to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  8. Ground based ISS payload microgravity disturbance assessments.

    PubMed

    McNelis, Anne M; Heese, John A; Samorezov, Sergey; Moss, Larry A; Just, Marcus L

    2005-01-01

    In order to verify that the International Space Station (ISS) payload facility racks do not disturb the microgravity environment of neighboring facility racks and that the facility science operations are not compromised, a testing and analytical verification process must be followed. Currently no facility racks have taken this process from start to finish. The authors are participants in implementing this process for the NASA Glenn Research Center (GRC) Fluids and Combustion Facility (FCF). To address the testing part of the verification process, the Microgravity Emissions Laboratory (MEL) was developed at GRC. The MEL is a 6-degree-of-freedom inertial measurement system capable of characterizing inertial response forces (emissions) of components, sub-rack payloads, or rack-level payloads down to 10^-7 g. The inertial force output data, generated from the steady-state or transient operations of the test articles, are utilized in analytical simulations to predict the on-orbit vibratory environment at specific science or rack interface locations. Once the facility payload rack and disturbers are properly modeled, an assessment can be made as to whether required microgravity levels are achieved. The modeling is used to develop microgravity predictions, which lead to the development of microgravity-sensitive ISS experiment operations once on-orbit. The on-orbit predictions will be verified using measurements from the NASA GRC Space Acceleration Measurement System (SAMS). The major topics addressed in this paper are: (1) Microgravity Requirements, (2) Microgravity Disturbers, (3) MEL Testing, (4) Disturbance Control, (5) Microgravity Control Process, and (6) On-Orbit Predictions and Verification. Published by Elsevier Ltd.

  9. Development of Standard Station Interface for Comprehensive Nuclear Test Ban Treaty Organisation Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Dricker, I. G.; Friberg, P.; Hellman, S.

    2001-12-01

    Under contract with the CTBTO, Instrumental Software Technologies, Inc. (ISTI) has designed and developed a Standard Station Interface (SSI) - a set of executable programs and application programming interface libraries for acquisition, authentication, archiving and telemetry of seismic and infrasound data for stations of the CTBTO nuclear monitoring network. SSI (written in C) is fully supported under both the Solaris and Linux operating systems and will be shipped with fully documented source code. SSI consists of several interconnected modules. The Digitizer Interface Module maintains a near-real-time data flow between multiple digitizers and the SSI. The Disk Buffer Module is responsible for local data archival. The Station Key Management Module is a low-level tool for data authentication and verification of incoming signatures. The Data Transmission Module supports packetized near-real-time data transmission from the primary CTBTO stations to the designated Data Center. The AutoDRM module allows transport of signed seismic and infrasound data via electronic mail (auxiliary station mode). The Command Interface Module is used to pass remote commands to the digitizers and other modules of SSI. A station operator has access to state-of-health information and waveforms via the Operator Interface Module. The modular design of SSI will allow painless extension of the software system within and outside the boundaries of CTBTO station requirements. An alpha version of SSI is currently undergoing extensive tests in the lab and on site.

  10. Inexpensive Eddy-Current Standard

    NASA Technical Reports Server (NTRS)

    Berry, Robert F., Jr.

    1985-01-01

    Radial crack replicas serve as evaluation standards. Technique entails intimately joining two pieces of appropriate aluminum alloy stock and centering drilled hole through and along interface. Bore surface of hole presents two vertical stock interface lines 180 degrees apart. These lines serve as radial crack defect replicas during eddy-current technique setup and verification.

  11. Ver-i-Fus: an integrated access control and information monitoring and management system

    NASA Astrophysics Data System (ADS)

    Thomopoulos, Stelios C.; Reisman, James G.; Papelis, Yiannis E.

    1997-01-01

    This paper describes the Ver-i-Fus Integrated Access Control and Information Monitoring and Management (IAC-I2M) system that INTELNET Inc. has developed. The Ver-i-Fus IAC-I2M system has been designed to meet the most stringent security and information monitoring requirements while allowing two-way communication between the user and the system. The system offers a flexible interface that permits the integration of practically any sensing device, or combination of sensing devices, including a live-scan fingerprint reader, thus providing biometric verification for enhanced security. Different configurations of the system provide solutions to different sets of access control problems. The re-configurable hardware interface, tied together with biometric verification and a flexible interface that allows Ver-i-Fus to be integrated with an MIS, provides an integrated solution to security, time and attendance, labor monitoring, production monitoring, and payroll applications.

  12. Advanced flight control system study

    NASA Technical Reports Server (NTRS)

    Hartmann, G. L.; Wall, J. E., Jr.; Rang, E. R.; Lee, H. P.; Schulte, R. W.; Ng, W. K.

    1982-01-01

    A fly by wire flight control system architecture designed for high reliability includes spare sensor and computer elements to permit safe dispatch with failed elements, thereby reducing unscheduled maintenance. A methodology capable of demonstrating that the architecture does achieve the predicted performance characteristics consists of a hierarchy of activities ranging from analytical calculations of system reliability and formal methods of software verification to iron bird testing followed by flight evaluation. Interfacing this architecture to the Lockheed S-3A aircraft for flight test is discussed. This testbed vehicle can be expanded to support flight experiments in advanced aerodynamics, electromechanical actuators, secondary power systems, flight management, new displays, and air traffic control concepts.

  13. Reliability design and verification for launch-vehicle propulsion systems - Report of an AIAA Workshop, Washington, DC, May 16, 17, 1989

    NASA Astrophysics Data System (ADS)

    Launch vehicle propulsion system reliability considerations during the design and verification processes are discussed. The tools available for predicting and minimizing anomalies or failure modes are described and objectives for validating advanced launch system propulsion reliability are listed. Methods for ensuring vehicle/propulsion system interface reliability are examined and improvements in the propulsion system development process are suggested to improve reliability in launch operations. Also, possible approaches to streamline the specification and procurement process are given. It is suggested that government and industry should define reliability program requirements and manage production and operations activities in a manner that provides control over reliability drivers. Also, it is recommended that sufficient funds should be invested in design, development, test, and evaluation processes to ensure that reliability is not inappropriately subordinated to other management considerations.

  14. Numerical modeling of HgCdTe solidification: Effects of phase diagram, double-diffusion convection and microgravity level

    NASA Technical Reports Server (NTRS)

    Bune, Andris V.; Gillies, Donald C.; Lehozky, Sandor L.

    1997-01-01

    A numerical model of HgCdTe solidification was implemented using the finite element code FIDAP. Model verification was done using both experimental data and numerical test problems. The model was used to evaluate the possible effects of double-diffusion convection in the molten material, and of the microgravity level, on the concentration distribution in the solidified HgCdTe. Particular attention was paid to incorporation of the HgCdTe phase diagram. It was found that, below a critical microgravity amplitude, the maximum convective velocity in the melt is virtually independent of the microgravity vector orientation. Good agreement between the predicted interface shape and an interface obtained experimentally by quenching was achieved. The results of the numerical modeling are presented in the form of a video film.

  15. Low-cost and high-speed optical mark reader based on an intelligent line camera

    NASA Astrophysics Data System (ADS)

    Hussmann, Stephan; Chan, Leona; Fung, Celine; Albrecht, Martin

    2003-08-01

    Optical Mark Recognition (OMR) is thoroughly reliable and highly efficient provided that high standards are maintained at both the planning and implementation stages. It is necessary to ensure that OMR forms are designed with due attention to data integrity checks, that the best use is made of features built into the OMR reader, that data integrity is checked before the data are processed, and that the data are validated before they are processed. This paper describes the design and implementation of an OMR prototype system for marking multiple-choice tests automatically. Parameter testing was carried out before the platform and the multiple-choice answer sheet were designed. Position recognition and position verification methods have been developed and implemented in an intelligent line scan camera. The position recognition process is implemented in a Field Programmable Gate Array (FPGA), whereas the verification process is implemented in a micro-controller. The verified results are then sent to the Graphical User Interface (GUI) for answer checking and statistical analysis. At the end of the paper the proposed OMR system is compared with commercially available systems on the market.
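
    A hedged illustration of the basic mark-detection step such a system performs (not the authors' FPGA implementation): sum the dark pixels inside each answer bubble's region of interest and compare against a fill threshold. The threshold values and sheet layout below are invented.

      import numpy as np

      def detect_marks(image, rois, dark_threshold=128, fill_fraction=0.4):
          """Return True/False per region of interest (ROI) on a grayscale sheet image.

          image: 2-D uint8 array (0 = black ink, 255 = white paper).
          rois:  list of (row0, row1, col0, col1) rectangles, one per answer bubble.
          A bubble counts as marked when the fraction of 'dark' pixels exceeds
          fill_fraction -- a simple stand-in for the recognition/verification logic.
          """
          marks = []
          for r0, r1, c0, c1 in rois:
              cell = image[r0:r1, c0:c1]
              dark = np.count_nonzero(cell < dark_threshold)
              marks.append(dark / cell.size >= fill_fraction)
          return marks

      # toy sheet: one filled bubble, one empty bubble
      sheet = np.full((20, 40), 255, dtype=np.uint8)
      sheet[5:15, 5:15] = 0                      # a penciled-in bubble
      print(detect_marks(sheet, [(5, 15, 5, 15), (5, 15, 25, 35)]))  # [True, False]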

  16. Cooperative GN&C development in a rapid prototyping environment. [flight software design for space vehicles

    NASA Technical Reports Server (NTRS)

    Bordano, Aldo; Uhde-Lacovara, JO; Devall, Ray; Partin, Charles; Sugano, Jeff; Doane, Kent; Compton, Jim

    1993-01-01

    The Navigation, Control and Aeronautics Division (NCAD) at NASA-JSC is exploring ways of producing Guidance, Navigation and Control (GN&C) flight software faster, better, and cheaper. To achieve these goals, NCAD established two hardware/software facilities that take an avionics design project from initial inception through high-fidelity real-time hardware-in-the-loop testing. Commercially available software products are used to develop the GN&C algorithms in block-diagram form and then automatically generate source code from these diagrams. A high-fidelity real-time hardware-in-the-loop laboratory provides users with the capability to analyze mass memory usage within the targeted flight computer, verify hardware interfaces, and conduct system-level verification, performance, and acceptance testing, as well as mission verification using reconfigurable and mission-unique data. To evaluate these concepts and tools, NCAD embarked on a project to build a real-time 6-DOF simulation of the Soyuz Assured Crew Return Vehicle flight software. To date, a productivity increase of 185 percent has been seen over traditional NASA methods for developing flight software.

  17. Feasibility and systems definition study for Microwave Multi-Application Payload (MMAP)

    NASA Technical Reports Server (NTRS)

    Horton, J. B.; Allen, C. C.; Massaro, M. J.; Zemany, J. L.; Murrell, J. W.; Stanhouse, R. W.; Condon, G. P.; Stone, R. F.; Swana, J.; Afifi, M.

    1977-01-01

    Work completed on three Shuttle/Spacelab experiments is examined: the Adaptive Multibeam Phased Array Antenna (AMPA) Experiment, the Electromagnetic Environment Experiment (EEE), and the Millimeter Wave Communications Experiment (MWCE). Results included the definition of operating modes, sequence of operation, radii of operation about several ground stations, signal format, footprints of typical orbits, and preliminary definition of ground and user terminals. Conceptual hardware designs, Spacelab interfaces, data handling methods, and experiment testing and verification studies were included. The MWCE-MOD I was defined conceptually for a steerable high-gain antenna.

  18. Development of a Software Safety Process and a Case Study of Its Use

    NASA Technical Reports Server (NTRS)

    Knight, J. C.

    1996-01-01

    Research in the year covered by this reporting period has been primarily directed toward: continued development of mock-ups of computer screens for operator of a digital reactor control system; development of a reactor simulation to permit testing of various elements of the control system; formal specification of user interfaces; fault-tree analysis including software; evaluation of formal verification techniques; and continued development of a software documentation system. Technical results relating to this grant and the remainder of the principal investigator's research program are contained in various reports and papers.

  19. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  20. Implementation of an Adaptive Controller System from Concept to Flight Test

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.; Burken, John J.; Butler, Bradley S.; Yokum, Steve

    2009-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) is conducting ongoing flight research using adaptive controller algorithms. A highly modified McDonnell-Douglas NF-15B airplane called the F-15 Intelligent Flight Control System (IFCS) is used to test and develop these algorithms. Modifications to this airplane include adding canards and changing the flight control systems to interface a single-string research controller processor for neural network algorithms. Research goals include demonstration of revolutionary control approaches that can efficiently optimize aircraft performance in both normal and failure conditions and advancement of neural-network-based flight control technology for new aerospace system designs. This report presents an overview of the processes utilized to develop adaptive controller algorithms during a flight-test program, including a description of initial adaptive controller concepts and a discussion of modeling formulation and performance testing. Design finalization led to integration with the system interfaces, verification of the software, validation of the hardware to the requirements, design of failure detection, development of safety limiters to minimize the effect of erroneous neural network commands, and creation of flight test control room displays to maximize human situational awareness; these are also discussed.

  1. Fuselage Versus Subcomponent Panel Response Correlation Based on ABAQUS Explicit Progressive Damage Analysis Tools

    NASA Technical Reports Server (NTRS)

    Gould, Kevin E.; Satyanarayana, Arunkumar; Bogert, Philip B.

    2016-01-01

    Analysis performed in this study substantiates the need for high fidelity vehicle level progressive damage analyses (PDA) structural models for use in the verification and validation of proposed sub-scale structural models and to support required full-scale vehicle level testing. PDA results are presented that capture and correlate the responses of sub-scale 3-stringer and 7-stringer panel models and an idealized 8-ft diameter fuselage model, which provides a vehicle level environment for the 7-stringer sub-scale panel model. Two unique skin-stringer attachment assumptions are considered and correlated in the models analyzed: the TIE constraint interface versus the cohesive element (COH3D8) interface. Evaluating different interfaces allows for assessing a range of predicted damage modes, including delamination and crack propagation responses. Damage models considered in this study are the ABAQUS built-in Hashin procedure and the COmplete STress Reduction (COSTR) damage procedure implemented through a VUMAT user subroutine using the ABAQUS/Explicit code.

  2. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus '''Beluga''' air cargo plane, The Super Transporter, lands at KSC's Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  3. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus '''Beluga''' air cargo plane, The Super Transporter, arrives at KSC's Shuttle Landing Facility from the factory of Alenia Aerospazio in Turin, Italy. Its cargo is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  4. A mass and momentum conserving unsplit semi-Lagrangian framework for simulating multiphase flows

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Owkes, Mark, E-mail: mark.owkes@montana.edu; Desjardins, Olivier

    In this work, we present a computational methodology for convection and advection that handles discontinuities with second order accuracy and maintains conservation to machine precision. This method can transport a variety of discontinuous quantities and is used in the context of an incompressible gas–liquid flow to transport the phase interface, momentum, and scalars. The proposed method provides a modification to the three-dimensional, unsplit, second-order semi-Lagrangian flux method of Owkes & Desjardins (JCP, 2014). The modification adds a refined grid that provides consistent fluxes of mass and momentum defined on a staggered grid and discrete conservation of mass and momentum, even for flows with large density ratios. Additionally, the refined grid doubles the resolution of the interface without significantly increasing the computational cost over previous non-conservative schemes. This is possible due to a novel partitioning of the semi-Lagrangian fluxes into a small number of simplices. The proposed scheme is tested using canonical verification tests, rising bubbles, and an atomizing liquid jet.
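
    A minimal one-dimensional sketch of the flux-form semi-Lagrangian idea behind such schemes is shown below; it uses only piecewise-constant reconstruction and a constant positive velocity, so it illustrates the machine-precision conservation property rather than the authors' second-order, unsplit, three-dimensional method. All names and parameters are illustrative.

      import numpy as np

      def ffsl_advect(q, u, dx, dt, nsteps):
          """Flux-form semi-Lagrangian advection of cell averages q with constant
          velocity u > 0 on a periodic 1-D grid (piecewise-constant reconstruction).

          The flux through each face is the integral of q over the region swept
          through that face in one step, so the update conserves total mass to
          machine precision even when the CFL number exceeds one.
          """
          n = q.size
          c = u * dt / dx                      # CFL number (may exceed 1)
          nwhole = int(np.floor(c))            # whole cells swept through each face
          frac = c - nwhole                    # fractional remainder
          for _ in range(nsteps):
              flux = np.zeros(n)
              for k in range(1, nwhole + 1):
                  flux += np.roll(q, k - 1)    # whole upstream cells of face i+1/2
              flux += frac * np.roll(q, nwhole)
              flux *= dx                       # convert cell averages to swept mass
              q = q - (flux - np.roll(flux, 1)) / dx
          return q

      x = np.linspace(0.0, 1.0, 64, endpoint=False)
      q0 = np.exp(-100.0 * (x - 0.3) ** 2)
      q1 = ffsl_advect(q0, u=1.0, dx=x[1] - x[0], dt=0.03, nsteps=10)   # CFL ~ 1.9
      print(abs(q1.sum() - q0.sum()))          # conserved to round-off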

  5. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    PubMed

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  6. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
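
    One simple way depth-activity profiles like those above are often compared, sketched here purely for illustration (not the study's analysis code), is to locate the depth at which the distal edge of each profile falls to 50% of its maximum and report the shift between measurement and simulation. The toy profiles and the 1 mm shift below are invented.

      import numpy as np

      def distal_50_depth(depth, activity):
          """Depth (same units as `depth`) where the distal edge of a 1-D activity
          profile first falls below 50% of its maximum, by linear interpolation."""
          half = 0.5 * activity.max()
          i_max = int(np.argmax(activity))
          for i in range(i_max, len(activity) - 1):
              if activity[i] >= half > activity[i + 1]:
                  frac = (activity[i] - half) / (activity[i] - activity[i + 1])
                  return depth[i] + frac * (depth[i + 1] - depth[i])
          return depth[-1]

      # toy measured vs. simulated profiles differing by a ~1 mm range shift
      z = np.linspace(0.0, 100.0, 501)                      # depth [mm]
      sim  = 1.0 / (1.0 + np.exp((z - 70.0) / 2.0))         # smooth distal fall-off
      meas = 1.0 / (1.0 + np.exp((z - 71.0) / 2.0))
      print(round(distal_50_depth(z, meas) - distal_50_depth(z, sim), 2))  # ~1.0 mm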

  7. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    In addition, because the Verification and Test Programme represents a significant part of the spacecraft development life cycle in terms of cost and time, this discussion very often has the objective of optimizing the verification campaign through the possible deletion or limitation of some testing activities. The increased market pressure to reduce the project's schedule and cost gives rise to a dialogue inside the project teams, involving program management and the design authorities, aimed at optimizing the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its particular mission constraints. The Model Philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (Model philosophy, Verification and Test Programme definition, from ECSS-E-10). The cases considered are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. For thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, the cases are indicated in which a proper Thermal Balance Test is mandatory and those, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of the Modal Survey, Static and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted. The verification of the project requirements is planned through a combination of suitable verification methods (in particular Analysis and Test) at the different verification levels (from System down to Equipment), in the proper verification stages (e.g. in Qualification and Acceptance).

  8. Verification Testing: Meet User Needs Figure of Merit

    NASA Technical Reports Server (NTRS)

    Kelly, Bryan W.; Welch, Bryan W.

    2017-01-01

    Verification is the process through which Modeling and Simulation (M&S) software goes to ensure that it has been rigorously tested and debugged for its intended use. Validation confirms that said software accurately models and represents the real-world system. Credibility gives an assessment of the development and testing effort that the software has gone through, as well as how accurate and reliable the test results are. Together, these three components form Verification, Validation, and Credibility (VV&C), the process by which all NASA modeling software is to be tested to ensure that it is ready for implementation. NASA created this process following the CAIB (Columbia Accident Investigation Board) report, which sought to understand the reasons the Columbia space shuttle failed during reentry. The report's conclusion was that the accident was fully avoidable; however, among other issues, the data necessary to make an informed decision were not available, and the result was the complete loss of the shuttle and crew. In an effort to mitigate this problem, NASA put out its Standard for Models and Simulations, currently in version NASA-STD-7009A, in which it detailed its recommendations, requirements, and rationale for the different components of VV&C. The intention was that this would allow people receiving M&S software to clearly understand the past development effort and have data from it. This in turn would allow people who had not worked with the M&S software before to move forward with greater confidence and efficiency in their work. This particular project looks to perform verification on several MATLAB (Registered Trademark) (The MathWorks, Inc.) scripts that will later be implemented in a website interface. It seeks to note and define the limits of operation, the units and significance, and the expected datatype and format of the inputs and outputs of each of the scripts. This is intended to prevent the code from attempting to make incorrect or impossible calculations. Additionally, this project will look at the coding generally and note inconsistencies, redundancies, and other aspects that may become problematic or slow down the code's run time. Certain scripts lacking documentation will also be commented and cataloged.
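
    The kind of input screening described above (limits of operation, expected datatype and format) can be sketched generically in a few lines; the parameter names and bounds below are hypothetical, and the actual project scripts are in MATLAB rather than Python.

      # hypothetical limits table: name -> (expected type, lower bound, upper bound)
      LIMITS = {
          "frequency_ghz": (float, 1.0, 40.0),
          "elevation_deg": (float, 0.0, 90.0),
          "num_samples":   (int,   1,   1_000_000),
      }

      def verify_inputs(**kwargs):
          """Raise a descriptive error if any input is missing, mistyped, or
          outside its documented limits of operation (a generic sketch)."""
          errors = []
          for name, (typ, lo, hi) in LIMITS.items():
              if name not in kwargs:
                  errors.append(f"{name}: missing")
                  continue
              value = kwargs[name]
              if not isinstance(value, typ):
                  errors.append(f"{name}: expected {typ.__name__}, got {type(value).__name__}")
              elif not (lo <= value <= hi):
                  errors.append(f"{name}: {value} outside [{lo}, {hi}]")
          if errors:
              raise ValueError("input verification failed: " + "; ".join(errors))

      verify_inputs(frequency_ghz=26.5, elevation_deg=10.0, num_samples=512)   # passes
      # verify_inputs(frequency_ghz=99.0, elevation_deg=10.0, num_samples=512) # would raise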

  9. OMV: A simplified mathematical model of the orbital maneuvering vehicle

    NASA Technical Reports Server (NTRS)

    Teoh, W.

    1984-01-01

    A model of the orbital maneuvering vehicle (OMV) is presented which contains several simplifications. A set of hand controller signals may be used to control the motion of the OMV. Model verification is carried out using a sequence of tests. The dynamic variables generated by the model are compared, whenever possible, with the corresponding analytical variables. The results of the tests show conclusively that the present model behaves correctly. Further, this model interfaces properly with the state vector transformation module (SVX) developed previously. Correct command sentence sequences are generated by the OMV and SVX system, and these command sequences can be used to drive the flat floor simulation system at MSFC.

  10. Simulating flight boundary conditions for orbiter payload modal survey

    NASA Technical Reports Server (NTRS)

    Chung, Y. T.; Sernaker, M. L.; Peebles, J. H.

    1993-01-01

    An approach to simulate the characteristics of the payload/orbiter interfaces for the payload modal survey was developed. The flexure designed for this approach is required to provide adequate stiffness separation between the free and constrained interface degrees of freedom to closely resemble the flight boundary condition. If the flexure fixture is used for the payload modal survey, payloads will behave linearly and demonstrate modal effective mass distributions and load paths similar to those in flight. The potential non-linearities caused by trunnion slippage during a conventional fixed-base modal survey may be eliminated. Consequently, the effort to correlate the test and analysis models can be significantly reduced. An example is given to illustrate the selection and the sensitivity of the flexure stiffness. The advantages of using flexure fixtures for the modal survey and for analytical model verification are also demonstrated.
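
    The flexure-stiffness sensitivity mentioned above can be illustrated with a toy two-degree-of-freedom calculation (not the paper's example): a soft boundary flexure yields a low suspension mode well separated from the payload elastic mode, approximating the free-flight interface, while a stiff, trunnion-like boundary shifts the elastic mode. The masses and stiffnesses below are invented.

      import numpy as np

      def interface_modes(k_flexure):
          """Natural frequencies [Hz] of a toy 2-DOF payload (two 100 kg masses in
          series) mounted to ground through a flexure of stiffness k_flexure [N/m]."""
          m = np.diag([100.0, 100.0])                          # kg
          k_payload = 1.0e7                                     # payload internal stiffness [N/m]
          k = np.array([[k_flexure + k_payload, -k_payload],
                        [-k_payload,             k_payload]])
          evals = np.linalg.eigvals(np.linalg.solve(m, k))      # omega^2 values
          return np.sort(np.sqrt(np.abs(evals))) / (2.0 * np.pi)

      print(interface_modes(1.0e4))   # soft flexure:  ~[1.1, 71] Hz (near free interface)
      print(interface_modes(1.0e9))   # stiff boundary: ~[50, 506] Hz (elastic mode shifts)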

  11. Integration Process for Payloads in the Fluids and Combustion Facility

    NASA Technical Reports Server (NTRS)

    Free, James M.; Nall, Marsha M.

    2001-01-01

    The Fluids and Combustion Facility (FCF) is an ISS research facility located in the United States Laboratory (US Lab), Destiny. The FCF is a multi-discipline facility that performs microgravity research primarily in fluids physics science and combustion science. This facility remains on-orbit and provides accommodations to multi-user and Principal investigator (PI) unique hardware. The FCF is designed to accommodate 15 PI's per year. In order to allow for this number of payloads per year, the FCF has developed an end-to-end analytical and physical integration process. The process includes provision of integration tools, products and interface management throughout the life of the payload. The payload is provided with a single point of contact from the facility and works with that interface from PI selection through post flight processing. The process utilizes electronic tools for creation of interface documents/agreements, storage of payload data and rollup for facility submittals to ISS. Additionally, the process provides integration to and testing with flight-like simulators prior to payload delivery to KSC. These simulators allow the payload to test in the flight configuration and perform final facility interface and science verifications. The process also provides for support to the payload from the FCF through the Payload Safety Review Panel (PSRP). Finally, the process includes support in the development of operational products and the operation of the payload on-orbit.

  12. A dedicated software application for treatment verification with off-line PET/CT imaging at the Heidelberg Ion Beam Therapy Center

    NASA Astrophysics Data System (ADS)

    Chen, W.; Bauer, J.; Kurz, C.; Tessonnier, T.; Handrack, J.; Haberer, T.; Debus, J.; Parodi, K.

    2017-01-01

    We present the workflow of the offline PET-based range verification method used at the Heidelberg Ion Beam Therapy Center, detailing the functionalities of an in-house developed software application, SimInterface14, with which range analysis is performed. Moreover, we introduce the design of a decision support system that assesses uncertainties and supports physicians in decision making for plan adaptation.

  13. Simulation verification techniques study. Task report 4: Simulation module performance parameters and performance standards

    NASA Technical Reports Server (NTRS)

    1974-01-01

    Shuttle simulation software modules in the environment, crew station, vehicle configuration and vehicle dynamics categories are discussed. For each software module covered, a description of the module functions and operational modes, its interfaces with other modules, its stored data, inputs, performance parameters and critical performance parameters is given. Reference data sources which provide standards of performance are identified for each module. Performance verification methods are also discussed briefly.

  14. Test Telemetry And Command System (TTACS)

    NASA Technical Reports Server (NTRS)

    Fogel, Alvin J.

    1994-01-01

    The Jet Propulsion Laboratory has developed a multimission Test Telemetry and Command System (TTACS) which provides a multimission telemetry and command data system in a spacecraft test environment. TTACS reuses, in the spacecraft test environment, components of the same data system used for flight operations; no new software is developed for the spacecraft test environment. Additionally, the TTACS is transportable to any spacecraft test site, including the launch site. The TTACS is currently operational in the Galileo spacecraft testbed; it is also being provided to support the Cassini and Mars Surveyor Program projects. Minimal personnel data system training is required in the transition from pre-launch spacecraft test to post-launch flight operations, since test personnel are already familiar with the data system's operation. Additionally, data system components, e.g. data display, can be reused to support spacecraft software development, and the same data system components will again be reused during the spacecraft integration and system test phases. TTACS usage also results in early availability of spacecraft data to data system development and, as a result, early data system development feedback to spacecraft system developers. The TTACS consists of a multimission spacecraft support equipment interface and components of the multimission telemetry and command software adapted for a specific project. The TTACS interfaces to the spacecraft, e.g., Command Data System (CDS), support equipment. The TTACS telemetry interface to the CDS support equipment performs serial (RS-422)-to-Ethernet conversion at rates between 1 bps and 1 Mbps, telemetry data blocking and header generation, guaranteed data transmission to the telemetry data system, and graphical downlink routing summary and control. The TTACS command interface to the CDS support equipment is nominally a command file transferred in non-real-time via Ethernet. The CDS support equipment is responsible for metering the commands to the CDS; additionally, for Galileo, TTACS includes a real-time interface to the CDS support equipment. The TTACS provides the basic functionality of the multimission telemetry and command data system used during flight operations. TTACS telemetry capabilities include frame synchronization, Reed-Solomon decoding, packet extraction and channelization, and data storage/query. Multimission data display capabilities are also available. TTACS command capabilities include command generation, verification, and storage.
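
    As a generic illustration of the frame synchronization step mentioned above (not JPL's implementation), the sketch below scans a byte stream for a fixed attached sync marker and slices out fixed-length frames; the marker value and frame length are assumptions chosen for the example.

      SYNC_MARKER = bytes.fromhex("1ACFFC1D")   # CCSDS-style attached sync marker (assumed)
      FRAME_LEN = 1115                          # transfer frame length in bytes (assumed)

      def extract_frames(stream: bytes):
          """Yield fixed-length telemetry frames found after each sync marker."""
          i = 0
          while True:
              i = stream.find(SYNC_MARKER, i)
              if i < 0:
                  return
              start = i + len(SYNC_MARKER)
              frame = stream[start:start + FRAME_LEN]
              if len(frame) < FRAME_LEN:
                  return                        # incomplete trailing frame
              yield frame
              i = start + FRAME_LEN

      # toy stream: two frames with junk before and between them
      raw = b"\x00\x01" + SYNC_MARKER + bytes(FRAME_LEN) + b"noise" + SYNC_MARKER + bytes(FRAME_LEN)
      print(sum(1 for _ in extract_frames(raw)))   # -> 2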

  15. A Verification System for Distributed Objects with Asynchronous Method Calls

    NASA Astrophysics Data System (ADS)

    Ahrendt, Wolfgang; Dylla, Maximilian

    We present a verification system for Creol, an object-oriented modeling language for concurrent distributed applications. The system is an instance of KeY, a framework for object-oriented software verification, which has so far been applied foremost to sequential Java. Building on KeY characteristic concepts, like dynamic logic, sequent calculus, explicit substitutions, and the taclet rule language, the system presented in this paper addresses functional correctness of Creol models featuring local cooperative thread parallelism and global communication via asynchronous method calls. The calculus heavily operates on communication histories which describe the interfaces of Creol units. Two example scenarios demonstrate the usage of the system.

  16. SLS Flight Software Testing: Using a Modified Agile Software Testing Approach

    NASA Technical Reports Server (NTRS)

    Bolton, Albanie T.

    2016-01-01

    NASA's Space Launch System (SLS) is an advanced launch vehicle for a new era of exploration beyond Earth orbit (BEO). The world's most powerful rocket, SLS, will launch crews of up to four astronauts in the agency's Orion spacecraft on missions to explore multiple deep-space destinations. Boeing is developing the SLS core stage, including the avionics that will control the vehicle during flight. The core stage will be built at NASA's Michoud Assembly Facility (MAF) in New Orleans, LA, using state-of-the-art manufacturing equipment. At the same time, the rocket's avionics computer software is being developed at Marshall Space Flight Center in Huntsville, AL. At Marshall, the Flight and Ground Software division provides comprehensive engineering expertise for development of flight and ground software. Within that division, the Software Systems Engineering Branch's test and verification (T&V) team uses an agile test approach in testing and verification of software. The agile software test method opens the door for regular short sprint release cycles. The basic premise of agile software development and testing is that work is performed iteratively and delivered incrementally, with requirements and solutions evolving through collaboration between cross-functional teams. Because testing and development are done incrementally, each release adds features and value. This value can be seen throughout the T&V team processes that are documented in various work instructions within the branch. The T&V team produces procedural test results at a higher rate, resolves issues found in software with designers at an earlier stage rather than at a later release, and team members gain increased knowledge of the system architecture by interfacing with designers. SLS Flight Software teams want to continue uncovering better ways of developing software in an efficient and project-beneficial manner. Through agile testing, there has been increased value through individuals and interactions over processes and tools, improved customer collaboration, and improved responsiveness to changes through controlled planning. The presentation will describe the agile testing methodology as applied by the SLS FSW Test and Verification team at Marshall Space Flight Center.

  17. Command module/service module reaction control subsystem assessment

    NASA Technical Reports Server (NTRS)

    Weary, D. P.

    1971-01-01

    A detailed review of component failure histories, qualification adequacy, manufacturing flow, checkout requirements and flow, ground support equipment interfaces, subsystem interface verification, protective devices, and component design did not reveal major weaknesses in the command and service module (CSM) reaction control system (RCS). No changes to the CSM RCS were recommended. The assessment reaffirmed the adequacy of the CSM RCS for future Apollo missions.

  18. An Airbus arrives at KSC with third MPLM

    NASA Technical Reports Server (NTRS)

    2001-01-01

    An Airbus "Beluga" air cargo plane, The Super Transporter, taxis onto the parking apron at KSC's Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  19. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, workers in cherry pickers (right) help guide offloading of the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  20. KSC01pp0234

    NASA Image and Video Library

    2001-02-01

    An Airbus “Beluga” air cargo plane, The Super Transporter, taxis onto the parking apron at KSC’s Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency’s Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  1. Interface Management for a NASA Flight Project Using Model-Based Systems Engineering (MBSE)

    NASA Technical Reports Server (NTRS)

    Vipavetz, Kevin; Shull, Thomas A.; Infeld, Samatha; Price, Jim

    2016-01-01

    The goal of interface management is to identify, define, control, and verify interfaces; ensure compatibility; support efficient system development on time and within budget; and meet stakeholder requirements. This paper will present a successful seven-step approach to interface management used in several NASA flight projects. The seven-step approach, using Model-Based Systems Engineering, will be illustrated by interface examples from the Materials International Space Station Experiment-X (MISSE-X) project. MISSE-X was being developed as an International Space Station (ISS) external platform for space environmental studies, designed to advance the technology readiness of materials and devices critical for future space exploration. Emphasis will be given to best practices covering key areas such as interface definition, writing good interface requirements, utilizing interface working groups, developing and controlling interface documents, handling interface agreements, the use of shadow documents, the importance of interface requirement ownership, interface verification, and product transition.

  2. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  3. 30 cm Engineering Model thruster design and qualification tests

    NASA Technical Reports Server (NTRS)

    Schnelker, D. E.; Collett, C. R.

    1975-01-01

    Development of a 30-cm mercury electron bombardment Engineering Model ion thruster has successfully brought the thruster from the status of a laboratory experimental device to a point approaching flight readiness. This paper describes the development progress of the Engineering Model (EM) thruster in four areas: (1) design features and fabrication approaches, (2) performance verification and thruster to thruster variations, (3) structural integrity, and (4) interface definition. The design of major subassemblies, including the cathode-isolator-vaporizer (CIV), main isolator-vaporizer (MIV), neutralizer isolator-vaporizer (NIV), ion optical system, and discharge chamber/outer housing is discussed along with experimental results.

  4. Space Station personal hygiene study

    NASA Technical Reports Server (NTRS)

    Prejean, Stephen E.; Booher, Cletis R.

    1986-01-01

    A personal hygiene system is currently under development for Space Station application that will provide capabilities equivalent to those found on earth. This paper addresses the study approach for specifying both primary and contingency personal hygiene systems and provisions for specified growth. Topics covered are system definition and subsystem descriptions. Subsystem interfaces are explored to determine which concurrent NASA study efforts must be monitored during future design phases to stay up-to-date on critical Space Station parameters. A design concept for a three (3) compartment personal hygiene facility is included as a baseline for planned test and verification activities.

  5. Feasibility and systems definition study for microwave multi-application payload (MMAP)

    NASA Technical Reports Server (NTRS)

    Horton, J. B.; Allen, C. C.; Massaro, M. J.; Zemany, J. L.; Murrell, J. W.; Stanhouse, R. W.; Condon, G. P.; Stone, R. F.

    1977-01-01

    Three Shuttle/Spacelab experiments were studied: the adaptive multibeam phased array antenna (AMPA) experiment, the electromagnetic environment experiment (EEE), and the millimeter wave communications experiment (MWCE). Work on the AMPA experiment was completed. Results include the definition of operating modes, sequence of operation, radii of operation about several ground stations, signal format, footprints of typical orbits, and preliminary definition of ground and user terminals. Definition of the MOD I EEE included conceptual hardware designs, Spacelab interfaces, preliminary data handling methods, experiment tests and verification, and EMC studies. The MWCE was defined conceptually for a steerable high-gain antenna.

  6. Toward a formal verification of a floating-point coprocessor and its composition with a central processing unit

    NASA Technical Reports Server (NTRS)

    Pan, Jing; Levitt, Karl N.; Cohen, Gerald C.

    1991-01-01

    Discussed here is work to formally specify and verify a floating-point coprocessor based on the MC68881. The HOL verification system developed at Cambridge University was used. The coprocessor consists of two independent units: the bus interface unit, used to communicate with the CPU, and the arithmetic processing unit, used to perform the actual calculation. Reasoning about the interaction and synchronization among processes using higher-order logic is demonstrated.

  7. MESA: Message-Based System Analysis Using Runtime Verification

    NASA Technical Reports Server (NTRS)

    Shafiei, Nastaran; Tkachuk, Oksana; Mehlitz, Peter

    2017-01-01

    In this paper, we present a novel approach and framework for runtime verification of large, safety-critical messaging systems. This work was motivated by verifying the System Wide Information Management (SWIM) project of the Federal Aviation Administration (FAA). SWIM provides live air traffic, site and weather data streams for the whole National Airspace System (NAS), which can easily amount to several hundred messages per second. Such safety-critical systems cannot be instrumented; therefore, verification and monitoring have to happen using a nonintrusive approach, by connecting to a variety of network interfaces. Due to a large number of potential properties to check, the verification framework needs to support efficient formulation of properties with a suitable Domain Specific Language (DSL). Our approach is to utilize a distributed system that is geared towards connectivity and scalability and interface it at the message queue level to a powerful verification engine. We implemented our approach in the tool called MESA: Message-Based System Analysis, which leverages the open source projects RACE (Runtime for Airspace Concept Evaluation) and TraceContract. RACE is a platform for instantiating and running highly concurrent and distributed systems and enables connectivity to SWIM and scalability. TraceContract is a runtime verification tool that allows for checking traces against properties specified in a powerful DSL. We applied our approach to verify a SWIM service against several requirements. We found errors such as duplicate and out-of-order messages.
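
    As a rough illustration of the kind of property the paper reports checking (duplicate and out-of-order messages), the sketch below monitors a message stream in plain Python. The field names (msg_id, seq) and the example trace are invented for the illustration; it does not reproduce the TraceContract DSL or the SWIM message schema.

    ```python
    # Illustrative message-order monitor, assuming each message carries a
    # unique identifier and a monotonically increasing sequence number.

    class OrderMonitor:
        def __init__(self):
            self.seen_ids = set()
            self.last_seq = None
            self.violations = []

        def observe(self, msg_id, seq):
            if msg_id in self.seen_ids:
                self.violations.append(("duplicate", msg_id))
            else:
                self.seen_ids.add(msg_id)
            if self.last_seq is not None and seq < self.last_seq:
                self.violations.append(("out-of-order", msg_id))
            self.last_seq = seq

    # Example trace: message 'b' is duplicated and message 'd' arrives late.
    monitor = OrderMonitor()
    for msg_id, seq in [("a", 1), ("b", 2), ("b", 2), ("c", 4), ("d", 3)]:
        monitor.observe(msg_id, seq)
    print(monitor.violations)
    ```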

  8. Multiparticle imaging technique for two-phase fluid flows using pulsed laser speckle velocimetry. Final report, September 1988--November 1992

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, T.A.

    1992-12-01

    The practical use of Pulsed Laser Velocimetry (PLV) requires the use of fast, reliable computer-based methods for tracking numerous particles suspended in a fluid flow. Two methods for performing tracking are presented. One method tracks a particle through multiple sequential images (minimum of four required) by prediction and verification of particle displacement and direction. The other method, requiring only two sequential images, uses a dynamic, binary, spatial cross-correlation technique. The algorithms are tested on computer-generated synthetic data and experimental data obtained with traditional PLV methods. This allowed error analysis and testing of the algorithms on real engineering flows. A novel method is proposed which eliminates tedious, undesirable, manual operator assistance in removing erroneous vectors. This method uses an iterative process involving an interpolated field produced from the most reliable vectors. Methods are developed to allow fast analysis and presentation of sets of PLV image data. Experimental investigation of a two-phase, horizontal, stratified flow regime was performed to determine the interface drag force and, correspondingly, the drag coefficient. A horizontal, stratified flow test facility using water and air was constructed to allow interface shear measurements with PLV techniques. The experimentally obtained local drag measurements were compared with theoretical results given by conventional interfacial drag theory. Close agreement was shown when local conditions near the interface were similar to space-averaged conditions. However, theory based on macroscopic, space-averaged flow behavior was shown to give incorrect results if the local gas velocity near the interface was unstable, transient, and dissimilar from the average gas velocity through the test facility.

  9. Multiparticle imaging technique for two-phase fluid flows using pulsed laser speckle velocimetry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hassan, T.A.

    1992-12-01

    The practical use of Pulsed Laser Velocimetry (PLV) requires the use of fast, reliable computer-based methods for tracking numerous particles suspended in a fluid flow. Two methods for performing tracking are presented. One method tracks a particle through multiple sequential images (minimum of four required) by prediction and verification of particle displacement and direction. The other method, requiring only two sequential images, uses a dynamic, binary, spatial cross-correlation technique. The algorithms are tested on computer-generated synthetic data and experimental data obtained with traditional PLV methods. This allowed error analysis and testing of the algorithms on real engineering flows. A novel method is proposed which eliminates tedious, undesirable, manual operator assistance in removing erroneous vectors. This method uses an iterative process involving an interpolated field produced from the most reliable vectors. Methods are developed to allow fast analysis and presentation of sets of PLV image data. Experimental investigation of a two-phase, horizontal, stratified flow regime was performed to determine the interface drag force and, correspondingly, the drag coefficient. A horizontal, stratified flow test facility using water and air was constructed to allow interface shear measurements with PLV techniques. The experimentally obtained local drag measurements were compared with theoretical results given by conventional interfacial drag theory. Close agreement was shown when local conditions near the interface were similar to space-averaged conditions. However, theory based on macroscopic, space-averaged flow behavior was shown to give incorrect results if the local gas velocity near the interface was unstable, transient, and dissimilar from the average gas velocity through the test facility.
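
    The first tracking method described above (prediction and verification of particle displacement across sequential images) can be sketched roughly as follows. The constant-velocity prediction, the tolerance, and the nearest-neighbour pairing rule are illustrative assumptions, not the algorithms documented in the report.

    ```python
    # Prediction-and-verification particle tracking sketch over synthetic
    # 2-D particle positions; tolerance and matching rule are assumptions.
    import numpy as np

    def track(frames, tol=1.5):
        """frames: list of (N, 2) arrays of particle positions per image.
        Returns position sequences whose predictions survive verification."""
        tracks = []
        for p0 in frames[0]:
            # initial pairing: nearest neighbour in the second frame
            d = np.linalg.norm(frames[1] - p0, axis=1)
            path = [p0, frames[1][np.argmin(d)]]
            ok = True
            for frame in frames[2:]:
                predicted = path[-1] + (path[-1] - path[-2])  # constant-velocity guess
                d = np.linalg.norm(frame - predicted, axis=1)
                if d.min() > tol:                             # verification fails
                    ok = False
                    break
                path.append(frame[np.argmin(d)])
            if ok:
                tracks.append(np.array(path))
        return tracks
    ```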

  10. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  11. 40 CFR 1066.420 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Pre-test verification procedures and pre-test data collection. 1066.420 Section 1066.420 Protection of Environment ENVIRONMENTAL PROTECTION... Test § 1066.420 Pre-test verification procedures and pre-test data collection. (a) Follow the...

  12. 40 CFR 1066.420 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Pre-test verification procedures and pre-test data collection. 1066.420 Section 1066.420 Protection of Environment ENVIRONMENTAL PROTECTION... Test § 1066.420 Pre-test verification procedures and pre-test data collection. (a) Follow the...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION TEST PROTOCOL, GENERAL VENTILATION FILTERS

    EPA Science Inventory

    The Environmental Technology Verification Test Protocol, General Ventilation Filters provides guidance for verification tests.

    Reference is made in the protocol to the ASHRAE 52.2P "Method of Testing General Ventilation Air-cleaning Devices for Removal Efficiency by P...

  14. Environmental Technology Verification: Supplement to Test/QA Plan for Biological and Aerosol Testing of General Ventilation Air Cleaners; Bioaerosol Inactivation Efficiency by HVAC In-Duct Ultraviolet Light Air Cleaners

    EPA Science Inventory

    The Air Pollution Control Technology Verification Center has selected general ventilation air cleaners as a technology area. The Generic Verification Protocol for Biological and Aerosol Testing of General Ventilation Air Cleaners is on the Environmental Technology Verification we...

  15. A novel approach in formulation of special transition elements: Mesh interface elements

    NASA Technical Reports Server (NTRS)

    Sarigul, Nesrin

    1991-01-01

    The objective of this research program is the development of more accurate and efficient methods for the solution of singular problems encountered in various branches of mechanics. The research program can be categorized under three levels. The first two levels involve the formulation of a new class of elements called 'mesh interface elements' (MIE) to connect meshes of traditional elements either in three dimensions or in three and two dimensions. The finite element formulations are based on Boolean sum and blending operators. MIE are being formulated and tested in this research to account for the steep gradients encountered in aircraft and space structure applications. At present, the heat transfer and structural analysis problems are being formulated from an uncoupled theory point of view. The status report: (1) summarizes the formulation for heat transfer and structural analysis; (2) explains the formulation of MIE; (3) examines computational efficiency; and (4) shows verification examples.

  16. GeMS: an advanced software package for designing synthetic genes.

    PubMed

    Jayaraj, Sebastian; Reid, Ralph; Santi, Daniel V

    2005-01-01

    A user-friendly, advanced software package for gene design is described. The software comprises an integrated suite of programs (also provided as stand-alone tools) that automatically performs the following tasks in gene design: restriction site prediction, codon optimization for any expression host, restriction site inclusion and exclusion, separation of long sequences into synthesizable fragments, T(m) and stem-loop determinations, optimal oligonucleotide component design, and design verification/error-checking. The output is a complete design report and a list of optimized oligonucleotides to be prepared for subsequent gene synthesis. The user interface accommodates both inexperienced and experienced users. For inexperienced users, explanatory notes are provided such that detailed instructions are not necessary; for experienced users, a streamlined interface is provided without such notes. The software has been extensively tested in the design and successful synthesis of over 400 kb of genes, many of which exceeded 5 kb in length.
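
    As a toy illustration of one of the calculations such a package automates, the sketch below estimates the melting temperature of a short oligonucleotide with the Wallace rule (Tm ≈ 2(A+T) + 4(G+C), valid only for short oligos). This is a generic textbook formula, not GeMS code; GeMS applies its own, more sophisticated methods.

    ```python
    # Rough Tm estimate for a short oligonucleotide via the Wallace rule.
    def wallace_tm(seq: str) -> int:
        seq = seq.upper()
        at = seq.count("A") + seq.count("T")
        gc = seq.count("G") + seq.count("C")
        return 2 * at + 4 * gc

    print(wallace_tm("ATGCATGCATGCATGC"))  # 48 (degrees C) for this 16-mer
    ```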

  17. An Integrated Approach to Exploration Launch Office Requirements Development

    NASA Technical Reports Server (NTRS)

    Holladay, Jon B.; Langford, Gary

    2006-01-01

    The proposed paper will focus on the Project Management and Systems Engineering approach utilized to develop a set of both integrated and cohesive requirements for the Exploration Launch Office within the Constellation Program. A summary of the programmatic drivers which influenced the approach, along with details of the resulting implementation, will be discussed, as well as metrics evaluating the efficiency and accuracy of the various requirements development activities. Requirements development activities will focus on the procedures utilized to ensure that technical content was valid and mature in preparation for the Crew Launch Vehicle and Constellation Systems Requirements Reviews. This discussion will begin at initial requirements development during the Exploration Systems Architecture Study and progress through formal development of the program structure. Specific emphasis will be given to development and validation of the requirements. This discussion will focus on approaches to engage the appropriate requirement owners (or customers), the project infrastructure utilized to emphasize proper integration, and finally the procedure to technically mature, verify, and validate the requirements. Examples of requirements being implemented on the Launch Vehicle (systems, interfaces, test & verification) will be utilized to demonstrate the various processes and also provide a top-level understanding of the launch vehicle(s) performance goals. Details may also be provided on the approaches for verification, which range from typical aerospace hardware development (qualification/acceptance) through flight certification (flight test, etc.). The primary intent of this paper is to provide a demonstrated procedure for the development of a mature, effective, integrated set of requirements on a complex system, which also has the added intricacies of both heritage and new hardware development integration. Ancillary focus of the paper will include discussion of Test and Verification approaches along with top-level systems/elements performance capabilities.

  18. Design-order, non-conformal low-Mach fluid algorithms using a hybrid CVFEM/DG approach

    NASA Astrophysics Data System (ADS)

    Domino, Stefan P.

    2018-04-01

    A hybrid, design-order sliding mesh algorithm, which uses a control volume finite element method (CVFEM), in conjunction with a discontinuous Galerkin (DG) approach at non-conformal interfaces, is outlined in the context of a low-Mach fluid dynamics equation set. This novel hybrid DG approach is also demonstrated to be compatible with a classic edge-based vertex centered (EBVC) scheme. For the CVFEM, element polynomial, P, promotion is used to extend the low-order P = 1 CVFEM method to higher order, i.e., P = 2. An equal-order low-Mach pressure-stabilized methodology, with emphasis on the non-conformal interface boundary condition, is presented. A fully implicit matrix solver approach that accounts for the full stencil connectivity across the non-conformal interface is employed. A complete suite of formal verification studies using the method of manufactured solutions (MMS) is performed to verify the order of accuracy of the underlying methodology. The chosen suite of analytical verification cases ranges from a simple steady diffusion system to a traveling viscous vortex across mixed-order non-conformal interfaces. Results from all verification studies demonstrate either second- or third-order spatial accuracy and, for transient solutions, second-order temporal accuracy. Significant accuracy gains in manufactured solution error norms are noted even with modest promotion of the underlying polynomial order. The paper also demonstrates the CVFEM/DG methodology on two production-like simulation cases that include an inner block subjected to solid rotation, i.e., each simulation includes a sliding mesh, non-conformal interface. The first production case presented is a turbulent flow past a high-rate-of-rotation cube (Re, 4000; RPM, 3600) on like and mixed-order polynomial interfaces. The final simulation case is a full-scale Vestas V27 225 kW wind turbine (tower and nacelle omitted) in which a hybrid topology, low-order mesh is used. Both production simulations provide confidence in the underlying capability and demonstrate the viability of this hybrid method for deployment towards high-fidelity wind energy validation and analysis.
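
    The order-of-accuracy claim in such an MMS study is typically backed by computing an observed order from error norms on successively refined meshes; a minimal sketch of that calculation is below. The error and mesh-size values are made-up placeholders, not results from the paper.

    ```python
    # Observed order of accuracy from a grid-refinement pair:
    # p = ln(e_coarse / e_fine) / ln(h_coarse / h_fine)
    import math

    def observed_order(e_coarse, e_fine, h_coarse, h_fine):
        return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

    # Hypothetical L2 error norms with the mesh spacing halved:
    print(observed_order(e_coarse=4.0e-3, e_fine=1.0e-3, h_coarse=0.02, h_fine=0.01))
    # ~2.0, consistent with second-order spatial accuracy
    ```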

  19. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  20. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  1. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) For...

  2. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  3. 40 CFR 1065.520 - Pre-test verification procedures and pre-test data collection.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Pre-test verification procedures and pre-test data collection. 1065.520 Section 1065.520 Protection of Environment ENVIRONMENTAL PROTECTION... Specified Duty Cycles § 1065.520 Pre-test verification procedures and pre-test data collection. (a) If your...

  4. Process Document, Joint Verification Protocol, and Joint Test Plan for Verification of HACH-LANGE GmbH LUMIStox 300 Bench Top Luminometer and ECLOX Handheld Luminometer for Luminescent Bacteria Test for use in Wastewater

    EPA Science Inventory

    The Danish Environmental Technology Verification program (DANETV) Water Test Centre operated by DHI, is supported by the Danish Ministry for Science, Technology and Innovation. DANETV, the United States Environmental Protection Agency Environmental Technology Verification Progra...

  5. Advanced software integration: The case for ITV facilities

    NASA Technical Reports Server (NTRS)

    Garman, John R.

    1990-01-01

    The array of technologies and methodologies involved in the development and integration of avionics software has moved almost as rapidly as computer technology itself. Future avionics systems involve major advances and risks in the following areas: (1) Complexity; (2) Connectivity; (3) Security; (4) Duration; and (5) Software engineering. From an architectural standpoint, the systems will be much more distributed, involve session-based user interfaces, and have the layered architectures typified in the layers-of-abstraction concepts popular in networking. The NASA Space Station Freedom will typify the highly distributed nature of software development itself. Systems composed of independent components developed in parallel must be bound by rigid standards and interfaces, and by clean requirements and specifications. Avionics software provides a challenge in that it cannot be flight tested until the first time it literally flies. It is the binding of requirements for such an integration environment into the advances and risks of future avionics systems that forms the basis of the presented concept and the basic Integration, Test, and Verification concept within the development and integration life cycle of Space Station Mission and Avionics systems.

  6. Test/QA Plan for Verification of Leak Detection and Repair Technologies

    EPA Science Inventory

    The purpose of the leak detection and repair (LDAR) test and quality assurance plan is to specify procedures for a verification test applicable to commercial LDAR technologies. The purpose of the verification test is to evaluate the performance of participating technologies in b...

  7. Test/QA Plan (TQAP) for Verification of Semi-Continuous Ambient Air Monitoring Systems

    EPA Science Inventory

    The purpose of the semi-continuous ambient air monitoring technology (or MARGA) test and quality assurance plan is to specify procedures for a verification test applicable to commercial semi-continuous ambient air monitoring technologies. The purpose of the verification test is ...

  8. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  9. Verification Challenges of Dynamic Testing of Space Flight Hardware

    NASA Technical Reports Server (NTRS)

    Winnitoy, Susan

    2010-01-01

    The Six Degree-of-Freedom Dynamic Test System (SDTS) is a test facility at the National Aeronautics and Space Administration (NASA) Johnson Space Center in Houston, Texas, for performing dynamic verification of space structures and hardware. Some examples of past and current tests include the verification of on-orbit robotic inspection systems, space vehicle assembly procedures and docking/berthing systems. The facility is able to integrate a dynamic simulation of on-orbit spacecraft mating or demating using flight-like mechanical interface hardware. A force moment sensor is utilized for input to the simulation during the contact phase, thus simulating the contact dynamics. While the verification of flight hardware presents many unique challenges, one particular area of interest is the use of external measurement systems to ensure accurate feedback of dynamic contact. There are many commercial off-the-shelf (COTS) measurement systems available on the market, and the test facility measurement systems have evolved over time to include two separate COTS systems. The first system incorporates infra-red sensing cameras, while the second system employs a laser interferometer to determine position and orientation data. The specific technical challenges with the measurement systems in a large dynamic environment include changing thermal and humidity levels, operational area and measurement volume, dynamic tracking, and data synchronization. The facility is located in an expansive high-bay area that is occasionally exposed to outside temperature when large retractable doors at each end of the building are opened. The laser interferometer system, in particular, is vulnerable to the environmental changes in the building. The operational area of the test facility itself is sizeable, ranging from seven meters wide and five meters deep to as much as seven meters high. Both facility measurement systems have desirable measurement volumes and the accuracies vary within the respective volumes. In addition, because this is a dynamic facility with a moving test bed, direct line-of-sight may not be available at all times between the measurement sensors and the tracking targets. Finally, the feedback data from the active test bed along with the two external measurement systems must be synchronized to allow for data correlation. To ensure the desired accuracy and resolution of these systems, calibration of the systems must be performed regularly. New innovations in sensor technology itself are periodically incorporated into the facility's overall measurement scheme. In addressing the challenges of the measurement systems, the facility is able to provide essential position and orientation data to verify the dynamic performance of space flight hardware.

  10. Analysis, testing and verification of the behavior of composite pavements under Florida conditions using a heavy vehicle simulator

    NASA Astrophysics Data System (ADS)

    Tapia Gutierrez, Patricio Enrique

    Whitetopping (WT) is a rehabilitation method to resurface deteriorated asphalt pavements. While some of these composite pavements have performed very well carrying heavy loads, others have shown poor performance with early cracking. With the objective of analyzing the applicability of WT pavements under Florida conditions, a total of nine full-scale WT test sections were constructed and tested using a Heavy Vehicle Simulator (HVS) in the APT facility at the FDOT Material Research Park. The test sections were instrumented to monitor both strain and temperature. A 3-D finite element model was developed to analyze the WT test sections. The model was calibrated and verified using measured FWD deflections and HVS load-induced strains from the test sections. The model was then used to evaluate the potential performance of these test sections under critical temperature-load conditions in Florida. Six of the WT pavement test sections had a bonded concrete-asphalt interface achieved by milling, cleaning, and spraying the asphalt surface with water. This method produced excellent bonding at the interface, with shear strength of 195 to 220 psi. Three of the test sections were intended to have an unbonded concrete-asphalt interface by applying a debonding agent to the asphalt surface. However, shear strengths between 119 and 135 psi and a careful analysis of the strain and temperature data indicated a partial bond condition. The computer model was able to satisfactorily model the behavior of the composite pavement by mainly considering material properties from standard laboratory tests and calibrating the spring elements used to model the interface. Reasonable matches between the measured and the calculated strains were achieved when a temperature-dependent AC elastic modulus was included in the analytical model. The expected numbers of repetitions of the 24-kip single axle loads at the critical thermal condition were computed for the nine test sections based on maximum tensile stresses and fatigue theory. The results showed that 4" slabs can be used for heavy loads only for low-volume traffic. To withstand the critical load without fear of fatigue failure, 6" slabs and 8" slabs would be needed for joint spacings of 4' and 6', respectively.

  11. Environmental Technology Verification Report for Abraxis Ecologenia® 17β-Estradiol (E2) Microplate Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits

    EPA Science Inventory

    This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...

  12. Test/QA Plan for Verification of Cavity Ringdown Spectroscopy Systems for Ammonia Monitoring in Stack Gas

    EPA Science Inventory

    The purpose of the cavity ringdown spectroscopy (CRDS) technology test and quality assurance plan is to specify procedures for a verification test applicable to commercial cavity ringdown spectroscopy technologies. The purpose of the verification test is to evaluate the performa...

  13. 40 CFR 86.1849-01 - Right of entry.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... entity who conducts or causes to be conducted in-use verification or in-use confirmatory testing under... where any such certification or in-use verification or in-use confirmatory testing or any procedures or... test vehicle used for certification, in-use verification or in-use confirmatory testing which is being...

  14. User interface and operational issues with thermionic space power systems

    NASA Technical Reports Server (NTRS)

    Dahlberg, R. C.; Fisher, C. R.

    1987-01-01

    Thermionic space power systems have unique features which facilitate predeployment operations, provide operational flexibility and simplify the interface with the user. These were studied in some detail during the SP-100 program from 1983 to 1985. Three examples are reviewed in this paper: (1) system readiness verification in the prelaunch phase; (2) startup, shutdown, and dormancy in the operations phase; (3) part-load operation in the operations phase.

  15. Inertial Upper Stage (IUS) software analysis

    NASA Technical Reports Server (NTRS)

    Grayson, W. L.; Nickel, C. E.; Rose, P. L.; Singh, R. P.

    1979-01-01

    The Inertial Upper Stage (IUS) System, an extension of the Space Transportation System (STS) operating regime to include higher orbits, orbital plane changes, geosynchronous orbits, and interplanetary trajectories, is presented. The IUS software design, the IUS software interfaces with other systems, and the cost effectiveness in software verification are described. Tasks of the IUS discussed include: (1) design analysis; (2) validation requirements analysis; (3) interface analysis; and (4) requirements analysis.

  16. 76 FR 50164 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...

  17. Verification test report on a solar heating and hot water system

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information is provided on the development, qualification, and acceptance verification of commercial solar heating and hot water systems and components. The verification includes the performance, the efficiencies, and the various methods used, such as similarity, analysis, inspection, test, etc., that are applicable to satisfying the verification requirements.

  18. Implementation of an Adaptive Controller System from Concept to Flight Test

    NASA Technical Reports Server (NTRS)

    Larson, Richard R.; Burken, John J.; Butler, Bradley S.

    2009-01-01

    The National Aeronautics and Space Administration Dryden Flight Research Center (Edwards, California) is conducting ongoing flight research using adaptive controller algorithms. A highly modified McDonnell-Douglas NF-15B airplane called the F-15 Intelligent Flight Control System (IFCS) was used for these algorithms. This airplane has been modified by the addition of canards and by changing the flight control systems to interface a single-string research controller processor for neural network algorithms. Research goals included demonstration of revolutionary control approaches that can efficiently optimize aircraft performance for both normal and failure conditions, and advancement of neural-network-based flight control technology for new aerospace systems designs. Before the NF-15B IFCS airplane was certified for flight test, however, certain processes needed to be completed. This paper presents an overview of these processes, including a description of the initial adaptive controller concepts followed by a discussion of modeling formulation and performance testing. Upon design finalization, the next steps are: integration with the system interfaces, verification of the software, validation of the hardware to the requirements, design of failure detection, development of safety limiters to minimize the effect of erroneous neural network commands, and creation of flight test control room displays to maximize human situational awareness.

  19. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification (D4V) approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  20. Real-time high speed generator system emulation with hardware-in-the-loop application

    NASA Astrophysics Data System (ADS)

    Stroupe, Nicholas

    The emerging emphasis and benefits of distributed generation on smaller scale networks have prompted much attention and focus on research in this field. Much of the research that has grown in distributed generation has also stimulated the development of simulation software and techniques. Testing and verification of these distributed power networks is a complex task, and real hardware testing is often desired. This is where simulation methods such as hardware-in-the-loop become important, in which an actual hardware unit can be interfaced with a software-simulated environment to verify proper functionality. In this thesis, a simulation technique is taken one step further by utilizing a hardware-in-the-loop technique to emulate the output voltage of a generator system interfaced to a scaled hardware distributed power system for testing. The purpose of this thesis is to demonstrate a new method of testing a virtually simulated generation system supplying a scaled distributed power system in hardware. This task is performed by using the Non-Linear Loads Test Bed developed by the Energy Conversion and Integration Thrust at the Center for Advanced Power Systems. This test bed consists of a series of real, hardware-developed converters consistent with the Navy's proposed All-Electric-Ship power system, used to perform various tests on controls and stability under the expected non-linear load environment of Navy weaponry. This test bed can also explore other distributed power system research topics and serves as a flexible hardware unit for a variety of tests. In this thesis, the test bed will be utilized to perform and validate this newly developed method of generator system emulation. In this thesis, the dynamics of a high-speed permanent magnet generator directly coupled with a microturbine are virtually simulated on an FPGA in real time. The calculated output stator voltage will then serve as a reference for a controllable three-phase inverter at the input of the test bed that will emulate and reproduce these voltages on real hardware. The output of the inverter is then connected with the rest of the test bed and can consist of a variety of distributed system topologies for many testing scenarios. The idea is that the distributed power system under test in hardware can also integrate real generator system dynamics without physically involving an actual generator system. The benefits of successful generator system emulation are vast and lead to much more detailed system studies without the drawbacks of needing physical generator units. Some of these advantages are safety, reduced costs, and the ability to scale while still preserving the appropriate system dynamics. This thesis will introduce the ideas behind generator emulation and explain the process and necessary steps to achieve this objective. It will also demonstrate real results and verification of numerical values in real time. The final goal of this thesis is to introduce this new idea and show that it is in fact obtainable and can prove to be a highly useful tool in the simulation and verification of distributed power systems.

  1. Multimodal fusion of polynomial classifiers for automatic person recognition

    NASA Astrophysics Data System (ADS)

    Broun, Charles C.; Zhang, Xiaozheng

    2001-03-01

    With the prevalence of the information age, privacy and personalization are forefront in today's society. As such, biometrics are viewed as essential components of current evolving technological systems. Consumers demand unobtrusive and non-invasive approaches. In our previous work, we have demonstrated a speaker verification system that meets these criteria. However, there are additional constraints for fielded systems. The required recognition transactions are often performed in adverse environments and across diverse populations, necessitating robust solutions. There are two significant problem areas in current generation speaker verification systems. The first is the difficulty in acquiring clean audio signals in all environments without encumbering the user with a head-mounted close-talking microphone. Second, unimodal biometric systems do not work with a significant percentage of the population. To combat these issues, multimodal techniques are being investigated to improve system robustness to environmental conditions, as well as improve overall accuracy across the population. We propose a multimodal approach that builds on our current state-of-the-art speaker verification technology. In order to maintain the transparent nature of the speech interface, we focus on optical sensing technology to provide the additional modality, giving us an audio-visual person recognition system. For the audio domain, we use our existing speaker verification system. For the visual domain, we focus on lip motion. This is chosen, rather than static face or iris recognition, because it provides dynamic information about the individual. In addition, the lip dynamics can aid speech recognition to provide liveness testing. The visual processing method makes use of both color and edge information, combined within a Markov random field (MRF) framework, to localize the lips. Geometric features are extracted and input to a polynomial classifier for the person recognition process. A late integration approach, based on a probabilistic model, is employed to combine the two modalities. The system is tested on the XM2VTS database combined with AWGN in the audio domain over a range of signal-to-noise ratios.
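
    A hedged sketch of what a late-integration step of this kind can look like is given below: each modality is assumed to supply an independent score for the claimed identity, and the scores are fused by a weighted sum of log-likelihoods. The weights and threshold are hypothetical and do not reproduce the paper's probabilistic fusion model.

    ```python
    # Illustrative late fusion of audio and visual verification scores,
    # each interpreted as a probability that the identity claim is true.
    import math

    def fuse(audio_score, visual_score, w_audio=0.6, w_visual=0.4):
        """Weighted sum of per-modality log-likelihood scores."""
        return w_audio * math.log(audio_score) + w_visual * math.log(visual_score)

    def accept(audio_score, visual_score, threshold=-0.5):
        return fuse(audio_score, visual_score) >= threshold

    # Example: strong audio evidence, weaker visual evidence under noise.
    print(accept(0.92, 0.60))
    ```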

  2. Verification of performance specifications of a molecular test: cystic fibrosis carrier testing using the Luminex liquid bead array.

    PubMed

    Lacbawan, Felicitas L; Weck, Karen E; Kant, Jeffrey A; Feldman, Gerald L; Schrijver, Iris

    2012-01-01

    The number of clinical laboratories introducing various molecular tests to their existing test menu is continuously increasing. Before offering a US Food and Drug Administration-approved test, a clinical laboratory must verify the performance characteristics of the test, as claimed by the company, prior to implementing the assay for clinical use. The objective here is to provide an example of the verification of a specific qualitative in vitro diagnostic test: cystic fibrosis carrier testing using the Luminex liquid bead array (Luminex Molecular Diagnostics, Inc, Toronto, Ontario). The approach used by an individual laboratory for verification of a US Food and Drug Administration-approved assay is described. Specific verification data are provided to highlight the stepwise verification approach undertaken by a clinical diagnostic laboratory. Protocols for verification of in vitro diagnostic assays may vary between laboratories. However, all laboratories must verify several specific performance specifications prior to implementation of such assays for clinical use. We provide an example of an approach used for verifying performance of an assay for cystic fibrosis carrier screening.

  3. 40 CFR 1065.920 - PEMS Calibrations and verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... verification. The verification consists of operating an engine over a duty cycle in the laboratory and... by laboratory equipment as follows: (1) Mount an engine on a dynamometer for laboratory testing...

  4. Efficient Verification of Holograms Using Mobile Augmented Reality.

    PubMed

    Hartl, Andreas Daniel; Arth, Clemens; Grubert, Jens; Schmalstieg, Dieter

    2016-07-01

    Paper documents such as passports, visas and banknotes are frequently checked by inspection of security elements. In particular, optically variable devices such as holograms are important, but difficult to inspect. Augmented Reality can provide all relevant information on standard mobile devices. However, hologram verification on mobiles still takes a long time and provides lower accuracy than inspection by human individuals using appropriate reference information. We aim to address these drawbacks by automatic matching combined with a special parametrization of an efficient goal-oriented user interface which supports constrained navigation. We first evaluate a series of similarity measures for matching hologram patches to provide a sound basis for automatic decisions. Then a re-parametrized user interface is proposed based on observations of typical user behavior during document capture. These measures help to reduce capture time to approximately 15 s with better decisions regarding the evaluated samples than what can be achieved by untrained users.

  5. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  6. The design and implementation of a windowing interface pinch force measurement system

    NASA Astrophysics Data System (ADS)

    Ho, Tze-Yee; Chen, Yuanu-Joan; Chung, Chin-Teng; Hsiao, Ming-Heng

    2010-02-01

    This paper presents a novel windowing interface pinch force measurement system based on a USB (Universal Serial Bus) microcontroller, which mainly processes the sensing data from force sensing resistor (FSR) sensors mounted on five digits. It possesses several user-friendly functions, such as real-time display of the value and curve trace of the force applied by a hand-injured patient on a monitoring screen; consequently, not only can the physician easily evaluate the effect of hand injury rehabilitation, but the patients also become more engaged during hand physical therapy by interacting with the pinch force measurement screen. To facilitate the pinch force measurement system and make it user-friendly, the detailed hardware design and software programming flowchart are described in this paper. Through a series of careful and detailed experimental tests, the relationship between the applied force and the FSR sensor response is first measured and verified. The different types of pinch force measurements are then verified with an oscilloscope and compared with the corresponding values and waveform traces in the windowing interface display panel to confirm consistency. Finally, a windowing interface pinch force measurement system based on the USB microcontroller is implemented and demonstrated. The experimental results confirm the feasibility of the designed system.
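
    The core numeric step in such a system, turning an FSR reading into a force value, can be sketched as below. The supply voltage, divider resistor, ADC resolution, and calibration table are illustrative assumptions, not the values used in the paper's hardware.

    ```python
    # Sketch: ADC counts -> FSR resistance (voltage divider) -> force
    # via a piecewise-linear calibration table. All constants are assumed.

    VCC = 5.0           # supply voltage (V), assumed
    R_FIXED = 10_000.0  # fixed divider resistor (ohms), assumed
    ADC_MAX = 1023      # 10-bit ADC, assumed

    # Hypothetical calibration points: (resistance in ohms, force in newtons)
    CALIBRATION = [(30_000.0, 1.0), (10_000.0, 5.0), (3_000.0, 20.0)]

    def adc_to_resistance(adc_counts: int) -> float:
        v_out = VCC * adc_counts / ADC_MAX         # divider output voltage
        return R_FIXED * (VCC - v_out) / v_out     # solve the divider for R_fsr

    def resistance_to_force(r_fsr: float) -> float:
        pts = sorted(CALIBRATION)                  # ascending resistance
        for (r1, f1), (r2, f2) in zip(pts, pts[1:]):
            if r1 <= r_fsr <= r2:
                return f1 + (f2 - f1) * (r_fsr - r1) / (r2 - r1)
        return pts[0][1] if r_fsr < pts[0][0] else pts[-1][1]

    print(resistance_to_force(adc_to_resistance(512)))
    ```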

  7. A robust method using propensity score stratification for correcting verification bias for binary tests

    PubMed Central

    He, Hua; McDermott, Michael P.

    2012-01-01

    Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
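
    The following Python sketch illustrates the general idea of the propensity-score-stratified correction described above, under the MAR assumption: a verification propensity is modeled separately for test-positive and test-negative subjects, verified subjects are grouped into propensity strata, and stratum-level disease prevalences are reweighted to the full sample. The column names, covariates and logistic model are hypothetical; this is not the authors' exact estimator.

      import numpy as np
      import pandas as pd
      from sklearn.linear_model import LogisticRegression

      def corrected_accuracy(df: pd.DataFrame, n_strata: int = 5):
          """Propensity-score-stratified correction for verification bias.
          Expected (hypothetical) columns: 'test' (0/1), 'verified' (0/1),
          'disease' (0/1, NaN if unverified), covariates 'x1' and 'x2'."""
          covars = ["x1", "x2"]
          joint = {}   # estimates of P(D = 1, T = t)
          for t in (0, 1):
              sub = df[df["test"] == t].copy()
              # Propensity of verification given covariates, within this test stratum.
              ps = LogisticRegression().fit(sub[covars], sub["verified"])
              sub["ps"] = ps.predict_proba(sub[covars])[:, 1]
              sub["stratum"] = pd.qcut(sub["ps"], q=n_strata, labels=False, duplicates="drop")
              p_dt = 0.0
              for _, grp in sub.groupby("stratum"):
                  verified = grp[grp["verified"] == 1]
                  if len(verified) == 0:
                      continue   # a stratum with no verified subjects contributes nothing
                  # Prevalence among verified subjects, weighted by the stratum's
                  # share of the full sample (verified plus unverified).
                  p_dt += verified["disease"].mean() * len(grp) / len(df)
              joint[t] = p_dt
          p_t = {t: (df["test"] == t).mean() for t in (0, 1)}
          sensitivity = joint[1] / (joint[0] + joint[1])
          specificity = (p_t[0] - joint[0]) / ((p_t[0] - joint[0]) + (p_t[1] - joint[1]))
          return sensitivity, specificity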

  8. Modal Test/Analysis Correlation of Space Station Structures Using Nonlinear Sensitivity

    NASA Technical Reports Server (NTRS)

    Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan

    1992-01-01

    The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlation. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.

  9. Modal test/analysis correlation of Space Station structures using nonlinear sensitivity

    NASA Technical Reports Server (NTRS)

    Gupta, Viney K.; Newell, James F.; Berke, Laszlo; Armand, Sasan

    1992-01-01

    The modal correlation problem is formulated as a constrained optimization problem for validation of finite element models (FEM's). For large-scale structural applications, a pragmatic procedure for substructuring, model verification, and system integration is described to achieve effective modal correlations. The space station substructure FEM's are reduced using Lanczos vectors and integrated into a system FEM using Craig-Bampton component modal synthesis. The optimization code is interfaced with MSC/NASTRAN to solve the problem of modal test/analysis correlation; that is, the problem of validating FEM's for launch and on-orbit coupled loads analysis against experimentally observed frequencies and mode shapes. An iterative perturbation algorithm is derived and implemented to update nonlinear sensitivity (derivatives of eigenvalues and eigenvectors) during optimizer iterations, which reduced the number of finite element analyses.

  10. Space Radar Laboratory photos taken at Kennedy Space Center

    NASA Image and Video Library

    1994-03-18

    S94-30393 (23 Nov 1993) --- In the south level IV stand of the Operations and Checkout Building low bay, the Space Radar Laboratory -1 (SRL-1) antenna is being placed atop a pallet which holds the antenna electronics. SRL-1 is scheduled to fly on Space Shuttle mission STS-59 next year. It is comprised of two different imaging radars, the Spaceborne Imaging Radar-C (SIR-C) and the X-band Synthetic Aperture Radar (X-SAR). These radars are the most advanced of their kind to fly in space to date, and will allow scientists to make highly detailed studies of the Earth's surface on a global scale. An Interface Verification Test of the antenna and a Mission Sequence Test will be performed on the fully assembled SRL-1 later this month.

  11. Joint ETV/NOWATECH test plan for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    The joint test plan is the implementation of a test design developed for verification of the performance of an environmental technology following the NOWATECH ETV method. The verification is a joint verification with the US EPA ETV scheme and the Advanced Monitoring Systems Cent...

  12. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  13. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database, and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the same portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
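
    A minimal sketch of this hash-compare pattern, using SHA-256 as a stand-in for whatever hash function the patent actually specifies, might look as follows; the record contents are hypothetical.

      import hashlib

      def snapshot_hash(records: list[bytes]) -> str:
          """Hash of a portion of a dynamic database at one moment in time."""
          digest = hashlib.sha256()
          for record in records:
              digest.update(record)
          return digest.hexdigest()

      # Compare hashes taken at two moments to detect modification of the data.
      initial = snapshot_hash([b"row1", b"row2"])
      later = snapshot_hash([b"row1", b"row2-tampered"])
      print("data unchanged" if initial == later else "data modified")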

  14. Development of an Ion Thruster and Power Processor for New Millennium's Deep Space 1 Mission

    NASA Technical Reports Server (NTRS)

    Sovey, James S.; Hamley, John A.; Haag, Thomas W.; Patterson, Michael J.; Pencil, Eric J.; Peterson, Todd T.; Pinero, Luis R.; Power, John L.; Rawlin, Vincent K.; Sarmiento, Charles J.

    1997-01-01

    The NASA Solar Electric Propulsion Technology Applications Readiness Program (NSTAR) will provide a single-string primary propulsion system to NASA's New Millennium Deep Space 1 Mission which will perform comet and asteroid flybys in the years 1999 and 2000. The propulsion system includes a 30-cm diameter ion thruster, a xenon feed system, a power processing unit, and a digital control and interface unit. A total of four engineering model ion thrusters, three breadboard power processors, and a controller have been built, integrated, and tested. An extensive set of development tests has been completed along with thruster design verification tests of 2000 h and 1000 h. An 8000 h Life Demonstration Test is ongoing and has successfully demonstrated more than 6000 h of operation. In situ measurements of accelerator grid wear are consistent with grid lifetimes well in excess of the 12,000 h qualification test requirement. Flight hardware is now being assembled in preparation for integration, functional, and acceptance tests.

  15. Practical Formal Verification of MPI and Thread Programs

    NASA Astrophysics Data System (ADS)

    Gopalakrishnan, Ganesh; Kirby, Robert M.

    Large-scale simulation codes in science and engineering are written using the Message Passing Interface (MPI). Shared memory threads are widely used directly, or to implement higher level programming abstractions. Traditional debugging methods for MPI or thread programs are incapable of providing useful formal guarantees about coverage. They get bogged down in the sheer number of interleavings (schedules), often missing shallow bugs. In this tutorial we will introduce two practical formal verification tools: ISP (for MPI C programs) and Inspect (for Pthread C programs). Unlike other formal verification tools, ISP and Inspect run directly on user source codes (much like a debugger). They pursue only the relevant set of process interleavings, using our own customized Dynamic Partial Order Reduction algorithms. For a given test harness, DPOR allows these tools to guarantee the absence of deadlocks, instrumented MPI object leaks and communication races (using ISP), and shared memory races (using Inspect). ISP and Inspect have been used to verify large pieces of code: in excess of 10,000 lines of MPI/C for ISP in under 5 seconds, and about 5,000 lines of Pthread/C code in a few hours (and much faster with the use of a cluster or by exploiting special cases such as symmetry) for Inspect. We will also demonstrate the Microsoft Visual Studio and Eclipse Parallel Tools Platform integrations of ISP (these will be available on the LiveCD).
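
    The class of schedule-dependent bug these tools target can be illustrated with a short, hypothetical Python/mpi4py program (not taken from the tutorial): both ranks post a blocking send before the matching receive, so whether the program completes or deadlocks depends on MPI-internal buffering, i.e. on the interleaving.

      # Hypothetical illustration; run with e.g.: mpiexec -n 2 python send_send.py
      from mpi4py import MPI

      comm = MPI.COMM_WORLD
      rank = comm.Get_rank()
      peer = 1 - rank                     # assumes exactly two ranks

      payload = bytearray(1 << 20)        # a large message defeats eager buffering

      # Both ranks block in send() before either posts recv(): completion depends
      # on MPI buffering, the kind of shallow, schedule-dependent deadlock that
      # dynamic formal verification tools are designed to flag deterministically.
      comm.send(payload, dest=peer, tag=0)
      data = comm.recv(source=peer, tag=0)
      print(f"rank {rank} received {len(data)} bytes")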

  16. Enhanced Verification Test Suite for Physics Simulation Codes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kamm, J R; Brock, J S; Brandon, S T

    2008-10-10

    This document discusses problems with which to augment, in quantity and in quality, the existing tri-laboratory suite of verification problems used by Los Alamos National Laboratory (LANL), Lawrence Livermore National Laboratory (LLNL), and Sandia National Laboratories (SNL). The purpose of verification analysis is to demonstrate whether the numerical results of the discretization algorithms in physics and engineering simulation codes provide correct solutions of the corresponding continuum equations. The key points of this document are: (1) Verification deals with mathematical correctness of the numerical algorithms in a code, while validation deals with physical correctness of a simulation in a regime of interest. This document is about verification. (2) The current seven-problem Tri-Laboratory Verification Test Suite, which has been used for approximately five years at the DOE WP laboratories, is limited. (3) Both the methodology for and technology used in verification analysis have evolved and been improved since the original test suite was proposed. (4) The proposed test problems are in three basic areas: (a) Hydrodynamics; (b) Transport processes; and (c) Dynamic strength-of-materials. (5) For several of the proposed problems we provide a 'strong sense verification benchmark', consisting of (i) a clear mathematical statement of the problem with sufficient information to run a computer simulation, (ii) an explanation of how the code result and benchmark solution are to be evaluated, and (iii) a description of the acceptance criterion for simulation code results. (6) It is proposed that the set of verification test problems with which any particular code is evaluated include some of the problems described in this document. Analysis of the proposed verification test problems constitutes part of a necessary, but not sufficient, step that builds confidence in physics and engineering simulation codes. More complicated test cases, including physics models of greater sophistication or other physics regimes (e.g., energetic material response, magneto-hydrodynamics), would represent a scientifically desirable complement to the fundamental test cases discussed in this report. The authors believe that this document can be used to enhance the verification analyses undertaken at the DOE WP Laboratories and, thus, to improve the quality, credibility, and usefulness of the simulation codes that are analyzed with these problems.

  17. Hafnium transistor design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2008-01-01

    A design methodology is presented that uses the EKV model and the g(m)/I(D) biasing technique to design hafnium oxide field effect transistors that are suitable for neural recording circuitry. The DC gain of a common source amplifier is correlated to the structural properties of a Field Effect Transistor (FET) and a Metal Insulator Semiconductor (MIS) capacitor. This approach allows a transistor designer to use a design flow that starts with simple and intuitive 1-D equations for gain that can be verified in 1-D MIS capacitor TCAD simulations, before final TCAD process verification of transistor properties. The DC gain of a common source amplifier is optimized by using fast 1-D simulations and using slower, complex 2-D simulations only for verification. The 1-D equations are used to show that the increased dielectric constant of hafnium oxide allows a higher DC gain for a given oxide thickness. An additional benefit is that the MIS capacitor can be employed to test additional performance parameters important to an open gate transistor such as dielectric stability and ionic penetration.
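
    As a back-of-the-envelope illustration of why a high-k gate dielectric helps, the sketch below compares the oxide capacitance per unit area of hafnium oxide and silicon dioxide at the same physical thickness. The permittivity and thickness values are rough assumptions, and this is a 1-D estimate in the spirit of the paper's design flow, not its EKV/TCAD methodology.

      # Illustrative 1-D estimate: a higher-k gate dielectric raises oxide
      # capacitance, and hence transconductance and common-source DC gain,
      # at a fixed physical oxide thickness.
      EPS0 = 8.854e-12             # F/m
      K_SIO2, K_HFO2 = 3.9, 25.0   # relative permittivities (HfO2 value approximate)
      T_OX = 5e-9                  # 5 nm physical thickness, arbitrary example

      def cox(k_rel: float, t_ox: float) -> float:
          """Oxide capacitance per unit area, C_ox = eps0 * k / t_ox (F/m^2)."""
          return EPS0 * k_rel / t_ox

      ratio = cox(K_HFO2, T_OX) / cox(K_SIO2, T_OX)
      # In strong inversion gm ~ mu * C_ox * (W/L) * V_ov, so for the same geometry,
      # bias and load the small-signal gain A = -gm * R_out scales with this ratio.
      print(f"C_ox(HfO2)/C_ox(SiO2) = gain improvement factor ~ {ratio:.1f}x")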

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR TREATMENT OF WASTEWATER GENERATED DURING DECONTAMINATION ACTIVITIES - ULTRASTRIP SYSTEMS, INC., MOBILE EMERGENCY FILTRATION SYSTEM (MEFS) - 04/14/WQPC-HS

    EPA Science Inventory

    Performance verification testing of the UltraStrip Systems, Inc., Mobile Emergency Filtration System (MEFS) was conducted under EPA's Environmental Technology Verification (ETV) Program at the EPA Test and Evaluation (T&E) Facility in Cincinnati, Ohio, during November, 2003, thr...

  19. Model verification of large structural systems. [space shuttle model response

    NASA Technical Reports Server (NTRS)

    Lee, L. T.; Hasselman, T. K.

    1978-01-01

    A computer program for the application of parameter identification to the structural dynamic models of the space shuttle and other large models with hundreds of degrees of freedom is described. Finite element, dynamic, analytic, and modal models are used to represent the structural system. The interface with math models is such that output from any structural analysis program applied to any structural configuration can be used directly. Processed data from either sine-sweep tests or resonant dwell tests are directly usable. The program uses measured modal data to condition the prior analytic model so as to improve the frequency match between model and test. A Bayesian estimator generates an improved analytical model, and a linear estimator is used in an iterative fashion on highly nonlinear equations. Mass and stiffness scaling parameters are generated for an improved finite element model, and the optimum set of parameters is obtained in one step.
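
    The sketch below shows a generic linear Bayesian update of mass/stiffness scaling parameters from measured modal frequencies, the kind of estimator the abstract refers to. All matrices and numbers here are hypothetical, and the real program iterates such a step with recomputed sensitivities.

      import numpy as np

      def bayesian_update(p0, P0, S, f_test, f_model, R):
          """One linear Bayesian update of model parameters p from measured modal
          frequencies: p1 = p0 + K (f_test - f_model), with gain
          K = P0 S^T (S P0 S^T + R)^-1 and S the sensitivity matrix df/dp at p0."""
          K = P0 @ S.T @ np.linalg.inv(S @ P0 @ S.T + R)
          p1 = p0 + K @ (f_test - f_model)
          P1 = (np.eye(len(p0)) - K @ S) @ P0      # updated parameter covariance
          return p1, P1

      # Toy usage with hypothetical numbers: two frequencies, two scaling parameters.
      p0 = np.array([1.0, 1.0])                    # prior mass/stiffness scale factors
      P0 = np.diag([0.04, 0.04])                   # prior parameter covariance
      S = np.array([[5.0, -2.0], [1.5, 4.0]])      # frequency sensitivities df/dp
      f_model = np.array([12.0, 30.0])             # analytical frequencies (Hz)
      f_test = np.array([12.4, 29.5])              # measured frequencies (Hz)
      R = np.diag([0.01, 0.01])                    # measurement error covariance
      p1, P1 = bayesian_update(p0, P0, S, f_test, f_model, R)
      print(p1)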

  20. A Re-programmable Platform for Dynamic Burn-in Test of Xilinx Virtex-II 3000 FPGA for Military and Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Roosta, Ramin; Wang, Xinchen; Sadigursky, Michael; Tracton, Phil

    2004-01-01

    Field Programmable Gate Arrays (FPGAs) have played increasingly important roles in military and aerospace applications. Xilinx SRAM-based FPGAs have been used extensively in commercial applications, but less frequently in space flight applications due to their susceptibility to single-event upsets (SEUs). Reliability of these devices in space applications is a concern that has not been addressed. The objective of this project is to design a fully programmable hardware/software platform that allows (but is not limited to) comprehensive static/dynamic burn-in testing, at-speed testing, and SEU testing of Virtex-II 3000 FPGAs. Conventional methods test very few discrete AC parameters (primarily switching) of a given integrated circuit; this approach can test any possible configuration of the FPGA and any associated performance parameters. It allows complete or partial re-programming of the FPGA and verification of the program by readback followed by dynamic testing. Designers have full control over which functional elements of the FPGA to stress and can simulate all possible types of configurations and functions. Another benefit of this platform is that it allows collecting information on the elevation of junction temperature as a function of gate utilization, operating frequency, and functionality. A software tool has been implemented to demonstrate the various features of the system. The software consists of three major parts: the parallel interface driver, the main system procedure, and a graphical user interface (GUI).

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  2. Use of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments Across International Borders -Test/QA Plan

    EPA Science Inventory

    The Environmental Technology Verification (ETV) – Environmental and Sustainable Technology Evaluations (ESTE) Program conducts third-party verification testing of commercially available technologies that may accomplish environmental program management goals. In this verification...

  3. PERFORMANCE VERIFICATION TEST FOR FIELD-PORTABLE MEASUREMENTS OF LEAD IN DUST

    EPA Science Inventory

    The US Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program (www.epa.jzov/etv) conducts performance verification tests of technologies used for the characterization and monitoring of contaminated media. The program exists to provide high-quali...

  4. VERIFICATION TESTING OF HIGH-RATE MECHANICAL INDUCTION MIXERS FOR CHEMICAL DISINFECTANTS

    EPA Science Inventory

    This paper describes the results of verification testing of mechanical induction mixers for dispersion of chemical disinfectants in wet-weather flow (WWF) conducted under the U.S. Environmental Protection Agency's Environmental Technology Verification (ETV) WWF Pilot Program. Th...

  5. Gaia challenging performances verification: combination of spacecraft models and test results

    NASA Astrophysics Data System (ADS)

    Ecale, Eric; Faye, Frédéric; Chassat, François

    2016-08-01

    To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements were given to the spacecraft contractor (Airbus Defence and Space). For a set of those key performance requirements (e.g., end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on end-to-end performance verification. Where possible, performances are verified by end-to-end tests on ground (i.e., before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists that can reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used either to feed the models directly or to correlate them). Emphasis is placed on how to maximize the contribution of testing to performance verification while keeping the tests feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to the performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.

  6. Test and Verification Approach for the NASA Constellation Program

    NASA Technical Reports Server (NTRS)

    Strong, Edward

    2008-01-01

    This viewgraph presentation describes the test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy Low Earth Orbit Servicing Capability.

  7. Poster - Thurs Eve-43: Verification of dose calculation with tissue inhomogeneity using MapCHECK.

    PubMed

    Korol, R; Chen, J; Mosalaei, H; Karnas, S

    2008-07-01

    MapCHECK (Sun Nuclear, Melbourne, FL), with 445 diode detectors, has been used widely for routine IMRT quality assurance (QA). However, routine IMRT QA has not included verification of inhomogeneity effects. The objective of this study is to use MapCHECK and a phantom to verify dose calculation and IMRT delivery with tissue inhomogeneity. A phantom with tissue inhomogeneities was placed on top of MapCHECK to measure the planar dose for an anterior beam with photon energy of 6 MV or 18 MV. The phantom was composed of a 3.5 cm thick block of lung-equivalent material and solid water arranged side by side, with a 0.5 cm slab of solid water on top. The phantom setup, including MapCHECK, was CT scanned and imported into Pinnacle 8.0d for dose calculation. Absolute dose distributions were compared using gamma criteria of 3% dose difference and 3 mm distance-to-agreement. The measured and calculated planar doses are in good agreement, with an 88% pass rate based on the gamma analysis; the major dose difference occurred at the lung-water interface. Further investigation will be performed on a custom-designed inhomogeneity phantom with inserts of varying densities and effective depths to create various dose gradients at the interface for verification of dose calculation and delivery. In conclusion, a phantom with tissue inhomogeneities can be used with MapCHECK for verification of dose calculation and delivery with tissue inhomogeneity. © 2008 American Association of Physicists in Medicine.
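
    For reference, the gamma criterion quoted above (3% dose difference, 3 mm distance-to-agreement) can be sketched in one dimension as follows. The profiles are synthetic, and this is a simplified global-gamma illustration, not MapCHECK's implementation.

      import numpy as np

      def gamma_1d(x_eval, dose_eval, x_ref, dose_ref, dose_tol=0.03, dta_mm=3.0):
          """1-D gamma index (global, normalized to the maximum reference dose)."""
          d_norm = dose_tol * dose_ref.max()
          gammas = np.empty(len(x_eval))
          for i, (xm, dm) in enumerate(zip(x_eval, dose_eval)):
              dist2 = ((x_ref - xm) / dta_mm) ** 2        # distance term
              dose2 = ((dose_ref - dm) / d_norm) ** 2     # dose-difference term
              gammas[i] = np.sqrt(np.min(dist2 + dose2))
          return gammas

      # Toy usage: pass rate = fraction of points with gamma <= 1.
      x = np.linspace(0, 100, 201)                      # position (mm)
      ref = np.exp(-((x - 50) / 20) ** 2)               # calculated profile
      meas = 1.02 * np.exp(-((x - 50.5) / 20) ** 2)     # measured profile, small shift/scale
      g = gamma_1d(x, meas, x, ref)
      print(f"pass rate: {100 * np.mean(g <= 1):.1f}%")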

  8. Towards composition of verified hardware devices

    NASA Technical Reports Server (NTRS)

    Schubert, E. Thomas; Levitt, K.; Cohen, G. C.

    1991-01-01

    Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness. Hardware verification research, which establishes correctness through mathematical proof, has focused on device verification and has largely ignored verification of system composition. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.

  9. Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.

    2009-01-01

    Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and its associated ground controller, while the SAIL approach is designed to aid in the flight-readiness verification of the flight avionics system. The approaches are complementary in addressing both system development risks and mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development laboratories (EDL and other integration labs) and the verification laboratory (CAIL).

  10. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

    The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing together with extensive supporting analysis. The proof-of-concept hardware models are large enough to address the same basic problems of design, fabrication, assembly, and test as the full-scale systems, which were sized at 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests, and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management, and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization, and manufacturing imperfection assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. The concept has been considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance; in fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  11. Designing an autoverification system in Zagazig University Hospitals Laboratories: preliminary evaluation on thyroid function profile.

    PubMed

    Sediq, Amany Mohy-Eldin; Abdel-Azeez, Ahmad GabAllahm Hala

    2014-01-01

    The current practice in Zagazig University Hospitals Laboratories (ZUHL) is manual verification of all results prior to the release of reports. This process is time consuming and tedious, with large inter-individual variation that slows the turnaround time (TAT). Autoverification is the process of comparing patient results, generated from interfaced instruments, against laboratory-defined acceptance parameters. This study describes an autoverification engine designed and implemented at ZUHL, Egypt. A descriptive study was conducted at ZUHL from January 2012 to December 2013. A rule-based system was used to design the autoverification engine, which was preliminarily evaluated on a thyroid function panel. A total of 563 rules were written and tested on 563 simulated cases and 1673 archived cases. The engine's decisions were compared with those of 4 independent expert reviewers, and the impact of engine implementation on TAT was evaluated. Agreement was achieved among the 4 reviewers in 55.5% of cases, and with the engine in 51.5% of cases. The autoverification rate for archived cases was 63.8%. Reported laboratory TAT was reduced by 34.9%, and the TAT segment from completion of analysis to verification was reduced by 61.8%. The developed rule-based autoverification system has a verification rate comparable to that of commercially available software, while its in-house development saved the hospital the cost of a commercial product. Implementation of the system shortened the TAT and reduced the number of samples requiring staff review, enabling laboratory staff to devote more time and effort to handling problematic test results and improving the quality of patient care.
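
    A toy sketch of the rule-based pattern described above is shown below: each analyte of a thyroid panel carries a reportable range, an autoverification range and a delta-check threshold, and a result is either released or held for manual review. All limits here are illustrative placeholders, not the 563 rules developed at ZUHL.

      from typing import Optional

      # Hypothetical rule table; units and limits are illustrative only.
      RULES = {
          "TSH": {"reportable": (0.01, 100.0), "autoverify": (0.3, 6.0), "delta": 2.0},
          "FT4": {"reportable": (0.1, 8.0),    "autoverify": (0.7, 2.2), "delta": 0.8},
      }

      def autoverify(analyte: str, value: float, previous: Optional[float] = None) -> str:
          rule = RULES[analyte]
          lo, hi = rule["reportable"]
          if not lo <= value <= hi:
              return "hold: outside reportable range"      # needs dilution/repeat
          if previous is not None and abs(value - previous) > rule["delta"]:
              return "hold: delta check failed"            # route to manual review
          lo, hi = rule["autoverify"]
          if lo <= value <= hi:
              return "release"                             # released without review
          return "hold: manual verification required"

      print(autoverify("TSH", 4.1, previous=3.6))   # -> release
      print(autoverify("FT4", 3.0))                 # -> hold: manual verification required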

  12. GENERIC VERIFICATION PROTOCOL: DISTRIBUTED GENERATION AND COMBINED HEAT AND POWER FIELD TESTING PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol by which EPA’s Environmental Technology Verification program tests newly developed equipment for distributed generation of electric power, usually micro-turbine generators and internal combustion engine generators. The protocol will ...

  13. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  14. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  15. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Field Testing and Portable Emission Measurement Systems § 1065... that your new configuration meets this verification. The verification consists of operating an engine... with data simultaneously generated and recorded by laboratory equipment as follows: (1) Mount an engine...

  16. VERIFICATION TESTING OF HIGH-RATE MECHANICAL INDUCTION MIXERS FOR CHEMICAL DISINFECTANTS, Oregon

    EPA Science Inventory

    This paper describes the results of verification testing of mechanical induction mixers for dispersion of chemical disinfectants in wet-weather flow (WWF) conducted under the U.S. Environmental Protection Agency's Environmental Technology Verification (ETV) WWF Pilot Program. Th...

  17. PFLOTRAN Verification: Development of a Testing Suite to Ensure Software Quality

    NASA Astrophysics Data System (ADS)

    Hammond, G. E.; Frederick, J. M.

    2016-12-01

    In scientific computing, code verification ensures the reliability and numerical accuracy of a model simulation by comparing the simulation results to experimental data or known analytical solutions. The model is typically defined by a set of partial differential equations with initial and boundary conditions, and verification ensures whether the mathematical model is solved correctly by the software. Code verification is especially important if the software is used to model high-consequence systems which cannot be physically tested in a fully representative environment [Oberkampf and Trucano (2007)]. Justified confidence in a particular computational tool requires clarity in the exercised physics and transparency in its verification process with proper documentation. We present a quality assurance (QA) testing suite developed by Sandia National Laboratories that performs code verification for PFLOTRAN, an open source, massively-parallel subsurface simulator. PFLOTRAN solves systems of generally nonlinear partial differential equations describing multiphase, multicomponent and multiscale reactive flow and transport processes in porous media. PFLOTRAN's QA test suite compares the numerical solutions of benchmark problems in heat and mass transport against known, closed-form, analytical solutions, including documentation of the exercised physical process models implemented in each PFLOTRAN benchmark simulation. The QA test suite development strives to follow the recommendations given by Oberkampf and Trucano (2007), which describes four essential elements in high-quality verification benchmark construction: (1) conceptual description, (2) mathematical description, (3) accuracy assessment, and (4) additional documentation and user information. Several QA tests within the suite will be presented, including details of the benchmark problems and their closed-form analytical solutions, implementation of benchmark problems in PFLOTRAN simulations, and the criteria used to assess PFLOTRAN's performance in the code verification procedure. References Oberkampf, W. L., and T. G. Trucano (2007), Verification and Validation Benchmarks, SAND2007-0853, 67 pgs., Sandia National Laboratories, Albuquerque, NM.
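
    The essence of such a code-verification test, comparing a numerical solution against a closed-form analytical solution and applying an acceptance criterion, can be sketched for a simple 1-D diffusion benchmark as follows. The problem, grid, and tolerance are illustrative and unrelated to the actual PFLOTRAN QA suite.

      import numpy as np

      def analytic(x, t, D):
          """Closed-form solution of u_t = D u_xx for a spreading Gaussian pulse."""
          return np.exp(-x**2 / (4 * D * t)) / np.sqrt(4 * np.pi * D * t)

      # Benchmark setup (illustrative, not a PFLOTRAN input deck).
      D, L, nx = 1.0e-2, 1.0, 401
      x = np.linspace(-L, L, nx)
      dx = x[1] - x[0]
      t0, t1 = 1.0, 2.0
      dt = 0.4 * dx**2 / D          # satisfies the explicit stability limit dt <= dx^2/(2D)
      u = analytic(x, t0, D)        # start from the exact profile at t0

      t = t0
      while t < t1 - 1e-12:
          step = min(dt, t1 - t)
          u[1:-1] += step * D * (u[2:] - 2 * u[1:-1] + u[:-2]) / dx**2   # explicit FTCS update
          t += step

      error = np.sqrt(np.mean((u - analytic(x, t1, D))**2))   # discrete L2 error norm
      tolerance = 5e-3                                        # illustrative acceptance criterion
      print(f"L2 error = {error:.2e}; {'PASS' if error < tolerance else 'FAIL'}")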

  18. Bayesian Estimation of Combined Accuracy for Tests with Verification Bias

    PubMed Central

    Broemeling, Lyle D.

    2011-01-01

    This presentation will emphasize the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects are not evaluated by the gold standard. The approach is Bayesian, where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. The accuracy of two combined binary tests is estimated employing either a “believe the positive” or “believe the negative” rule, and the true and false positive fractions for each rule are computed for the two tests. In order to perform the analysis, the missing at random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers under extreme verification bias illustrates the estimation of combined test accuracy for ordinal tests. PMID:26859487

  19. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  20. Artwork Interactive Design System (AIDS) program description

    NASA Technical Reports Server (NTRS)

    Johnson, B. T.; Taylor, J. F.

    1976-01-01

    An artwork interactive design system is described which provides the microelectronic circuit designer/engineer a tool to perform circuit design, automatic layout modification, standard cell design, and artwork verification at a graphics computer terminal using a graphics tablet at the designer/computer interface.

  1. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  2. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  3. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  4. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  5. 47 CFR 15.101 - Equipment authorization of unintentional radiators.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    .... TV interface device Declaration of Conformity or Certification. Cable system terminal device Declaration of Conformity. Stand-alone cable input selector switch Verification. Class B personal computers... used with Class B personal computers Declaration of Conformity or Certification. 1 Class B personal...

  6. KSC-97PC1066

    NASA Image and Video Library

    1997-07-18

    Jet Propulsion Laboratory (JPL) engineers examine the interface surface on the Cassini spacecraft prior to installation of the third radioisotope thermoelectric generator (RTG). The other two RTGs, at left, already are installed on Cassini. The three RTGs will be used to power Cassini on its mission to the Saturnian system. They are undergoing mechanical and electrical verification testing in the Payload Hazardous Servicing Facility. RTGs use heat from the natural decay of plutonium to generate electric power. The generators enable spacecraft to operate far from the Sun where solar power systems are not feasible. The Cassini mission is scheduled for an Oct. 6 launch aboard a Titan IVB/Centaur expendable launch vehicle. Cassini is built and managed for NASA by JPL

  7. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops

    EPA Pesticide Factsheets

    This generic verification protocol provides a detailed method to conduct and report results from a verification test of pesticide application technologies that can be used to evaluate these technologies for their potential to reduce spray drift.

  8. Verification of performance specifications for a US Food and Drug Administration-approved molecular microbiology test: Clostridium difficile cytotoxin B using the Becton, Dickinson and Company GeneOhm Cdiff assay.

    PubMed

    Schlaberg, Robert; Mitchell, Michael J; Taggart, Edward W; She, Rosemary C

    2012-01-01

    US Food and Drug Administration (FDA)-approved diagnostic tests based on molecular genetic technologies are becoming available for an increasing number of microbial pathogens. Advances in technology and lower costs have moved molecular diagnostic tests formerly performed for research purposes only into much wider use in clinical microbiology laboratories. To provide an example of laboratory studies performed to verify the performance of an FDA-approved assay for the detection of Clostridium difficile cytotoxin B compared with the manufacturer's performance standards. We describe the process and protocols used by a laboratory for verification of an FDA-approved assay, assess data from the verification studies, and implement the assay after verification. Performance data from the verification studies conducted by the laboratory were consistent with the manufacturer's performance standards and the assay was implemented into the laboratory's test menu. Verification studies are required for FDA-approved diagnostic assays prior to use in patient care. Laboratories should develop a standardized approach to verification studies that can be adapted and applied to different types of assays. We describe the verification of an FDA-approved real-time polymerase chain reaction assay for the detection of a toxin gene in a bacterial pathogen.

  9. Diffuse-interface model for rapid phase transformations in nonequilibrium systems.

    PubMed

    Galenko, Peter; Jou, David

    2005-04-01

    A thermodynamic approach to rapid phase transformations within a diffuse interface in a binary system is developed. Assuming an extended set of independent thermodynamic variables formed by the union of the classic set of slow variables and the space of fast variables, we introduce finite speeds of heat and solute diffusive propagation for an interface advancing at finite speed. To describe transformations within the diffuse interface, we use the phase-field model, which allows us to follow steep but smooth changes of phase within the width of the diffuse interface. Governing equations of the phase-field model are derived for the hyperbolic model, a model with memory, and a model of nonlinear evolution of the transformation within the diffuse interface. The consistency of the model is proved by verifying the condition of positive entropy production and by the outcomes of the fluctuation-dissipation theorem. A comparison with existing sharp-interface and diffuse-interface versions of the model is given.

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Reduction of Nitrogen in Domestic Wastewater from Individual Residential Homes. BioConcepts, Inc. ReCip® RTS ~ 500 System

    EPA Science Inventory

    Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...

  11. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  12. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  13. VeriClick: an efficient tool for table format verification

    NASA Astrophysics Data System (ADS)

    Nagy, George; Tamhankar, Mangesh

    2012-01-01

    The essential layout attributes of a visual table can be defined by the location of four critical grid cells. Although these critical cells can often be located by automated analysis, some means of human interaction is necessary for correcting residual errors. VeriClick is a macro-enabled spreadsheet interface that provides ground-truthing, confirmation, correction, and verification functions for CSV tables. All user actions are logged. Experimental results of seven subjects on one hundred tables suggest that VeriClick can provide a ten- to twenty-fold speedup over performing the same functions with standard spreadsheet editing commands.

  14. Verification and validation of an advanced model of heat and mass transfer in the protective clothing

    NASA Astrophysics Data System (ADS)

    Łapka, Piotr; Furmański, Piotr

    2018-04-01

    The paper presents verification and validation of an advanced numerical model of heat and moisture transfer in multi-layer protective clothing and in components of the experimental stand subjected to either high surroundings temperature or high radiative heat flux emitted by hot objects. The developed model included conductive-radiative heat transfer in the hygroscopic porous fabrics and air gaps as well as conductive heat transfer in components of the stand. Additionally, water vapour diffusion in the pores and air spaces as well as phase transition of the bound water in the fabric fibres (sorption and desorption) were accounted for. All optical phenomena at internal and external walls were modelled, and the thermal radiation was treated in a rigorous way, i.e., the fabrics were assumed to be semi-transparent, absorbing, emitting and scattering media with non-grey properties, while the air was treated as transparent. Complex energy and mass balances as well as optical conditions at internal and external interfaces were formulated in order to find the values of temperature, vapour density and radiation intensity at these interfaces. The resulting highly non-linear coupled system of discrete equations was solved by a Finite Volume based in-house iterative algorithm. The developed model passed discretisation convergence tests and was successfully verified against results obtained by applying commercial software to simplified cases. Validation was then carried out using experimental measurements collected during exposure of the protective clothing to high radiative heat flux emitted by the IR lamp. Satisfactory agreement between the simulated and measured temporal variations of temperature at the external and internal surfaces of the multi-layer clothing was attained.

  15. The New NASA Orbital Debris Engineering Model ORDEM2000

    NASA Technical Reports Server (NTRS)

    Liou, Jer-Chyi; Matney, Mark J.; Anz-Meador, Phillip D.; Kessler, Donald; Jansen, Mark; Theall, Jeffery R.

    2002-01-01

    The NASA Orbital Debris Program Office at Johnson Space Center has developed a new computer-based orbital debris engineering model, ORDEM2000, which describes the orbital debris environment in the low Earth orbit region between 200 and 2000 km altitude. The model is appropriate for those engineering solutions requiring knowledge and estimates of the orbital debris environment (debris spatial density, flux, etc.). ORDEM2000 can also be used as a benchmark for ground-based debris measurements and observations. We incorporated a large set of observational data, covering the object size range from 10 mm to 10 m, into the ORDEM2000 debris database, utilizing a maximum likelihood estimator to convert observations into debris population probability distribution functions. These functions then form the basis of debris populations. We developed a finite element model to process the debris populations to form the debris environment. A more capable input and output structure and a user-friendly graphical user interface are also implemented in the model. ORDEM2000 has been subjected to a significant verification and validation effort. This document describes ORDEM2000, which supersedes the previous model, ORDEM96. The availability of new sensor and in situ data, as well as new analytical techniques, has enabled the construction of this new model. Section 1 describes the general requirements and scope of an engineering model. Data analyses and the theoretical formulation of the model are described in Sections 2 and 3. Section 4 describes the verification and validation effort and the sensitivity and uncertainty analyses. Finally, Section 5 describes the graphical user interface, software installation, and test cases for the user.

  16. Interface coupling and growth rate measurements in multilayer Rayleigh-Taylor instabilities

    NASA Astrophysics Data System (ADS)

    Adkins, Raymond; Shelton, Emily M.; Renoult, Marie-Charlotte; Carles, Pierre; Rosenblatt, Charles

    2017-06-01

    Magnetic levitation was used to measure the growth rate Σ vs. wave vector k of a Rayleigh-Taylor instability in a three-layer fluid system, a crucial step in the elucidation of interface coupling in finite-layer instabilities. For a three-layer (low-high-low density) system, the unstable mode growth rate decreases as both the height h of the middle layer and k are reduced, consistent with an interface coupling ∝ e^(-kh). The ratios of the three-layer to the established two-layer growth rates are in good agreement with those of classic linear stability theory, which has long resisted verification in that configuration.
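
    For orientation only, the sketch below evaluates the classical inviscid two-layer Rayleigh-Taylor growth rate Σ = sqrt(A g k) together with the e^(-kh) coupling factor quoted above for a few middle-layer heights. The densities and wavelength are made-up numbers, and this is not the three-layer dispersion relation used in the paper.

      import numpy as np

      g = 9.81                                  # m/s^2
      rho_heavy, rho_light = 1100.0, 1000.0     # hypothetical densities (kg/m^3)
      atwood = (rho_heavy - rho_light) / (rho_heavy + rho_light)

      def sigma_two_layer(k: float) -> float:
          """Classical inviscid two-layer RT growth rate, Sigma = sqrt(A * g * k)."""
          return np.sqrt(atwood * g * k)

      k = 2 * np.pi / 0.01                      # wave vector for a 1 cm wavelength (1/m)
      for h in (0.002, 0.005, 0.010):           # middle-layer heights (m)
          coupling = np.exp(-k * h)             # interface coupling ~ e^(-kh)
          print(f"h = {h*1e3:.0f} mm: Sigma_2layer = {sigma_two_layer(k):.2f} 1/s, "
                f"coupling factor = {coupling:.3f}")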

  17. Cockpit Interfaces, Displays, and Alerting Messages for the Interval Management Alternative Clearances (IMAC) Experiment

    NASA Technical Reports Server (NTRS)

    Baxley, Brian T.; Palmer, Michael T.; Swieringa, Kurt A.

    2015-01-01

    This document describes the IM cockpit interfaces, displays, and alerting capabilities that were developed for and used in the IMAC experiment, which was conducted at NASA Langley in the summer of 2015. Specifically, this document includes: (1) screen layouts for each page of the interface; (2) step-by-step instructions for data entry, data verification and input error correction; (3) algorithm state messages and error condition alerting messages; (4) aircraft speed guidance and deviation indications; and (5) graphical display of the spatial relationships between the Ownship aircraft and the Target aircraft. The controller displays for IM will be described in a separate document.

  18. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  19. Voltage verification unit

    DOEpatents

    Martin, Edward J [Virginia Beach, VA

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  20. Test/QA Plan for Verification of Radio Frequency Identification (RFID) for Tracking Hazardous Waste Shipments across International Borders

    EPA Science Inventory

    The verification test will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Technology Verification (ETV) Program. It will be performed by Battelle, which is managing the ETV Advanced Monitoring Systems (AMS) Center throu...

  1. Measurement of a True VO2max during a Ramp Incremental Test Is Not Confirmed by a Verification Phase.

    PubMed

    Murias, Juan M; Pogliaghi, Silvia; Paterson, Donald H

    2018-01-01

    The accuracy of an exhaustive ramp incremental (RI) test to determine maximal oxygen uptake (VO2max) was recently questioned and the utilization of a verification phase proposed as a gold standard. This study compared the oxygen uptake (VO2) during a RI test to that obtained during a verification phase aimed to confirm attainment of VO2max. Sixty-one healthy males [31 older (O), 65 ± 5 yrs; 30 younger (Y), 25 ± 4 yrs] performed a RI test (15-20 W/min for O and 25 W/min for Y). At the end of the RI test, a 5-min recovery period was followed by a verification phase of constant-load cycling to fatigue at either 85% (n = 16) or 105% (n = 45) of the peak power output obtained from the RI test. The highest VO2 after the RI test (39.8 ± 11.5 mL·kg⁻¹·min⁻¹) and the verification phase (40.1 ± 11.2 mL·kg⁻¹·min⁻¹) were not different (p = 0.33) and they were highly correlated (r = 0.99; p < 0.01). This response was not affected by age or intensity of the verification phase. The Bland-Altman analysis revealed a very small absolute bias (-0.25 mL·kg⁻¹·min⁻¹, not different from 0) and a precision of ±1.56 mL·kg⁻¹·min⁻¹ between measures. This study indicated that a verification phase does not highlight an under-estimation of VO2max derived from a RI test, in a large and heterogeneous group of healthy younger and older men naïve to laboratory testing procedures. Moreover, only minor within-individual differences were observed between the maximal VO2 elicited during the RI and the verification phase. Thus a verification phase does not add any validation of the determination of a VO2max. Therefore, the recommendation that a verification phase should become a gold standard procedure, although initially appealing, is not supported by the experimental data.
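
    The Bland-Altman comparison reported above can be reproduced in outline with the short sketch below; the paired values are hypothetical, not the study data.

      import numpy as np

      def bland_altman(ramp: np.ndarray, verify: np.ndarray):
          """Bias and 95% limits of agreement between two paired VO2max measurements."""
          diff = verify - ramp
          bias = diff.mean()
          loa = 1.96 * diff.std(ddof=1)
          return bias, (bias - loa, bias + loa)

      # Hypothetical paired values in mL/kg/min (not the study data).
      ramp   = np.array([39.1, 42.5, 55.3, 33.8, 47.0])
      verify = np.array([39.4, 42.1, 55.6, 33.5, 47.3])
      bias, (lo, hi) = bland_altman(ramp, verify)
      print(f"bias = {bias:.2f} mL/kg/min, limits of agreement = ({lo:.2f}, {hi:.2f})")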

  2. Flight test experience and controlled impact of a large, four-engine, remotely piloted airplane

    NASA Technical Reports Server (NTRS)

    Kempel, R. W.; Horton, T. W.

    1985-01-01

    A controlled impact demonstration (CID) program using a large, four engine, remotely piloted transport airplane was conducted. Closed loop primary flight control was performed from a ground based cockpit and digital computer in conjunction with an up/down telemetry link. Uplink commands were received aboard the airplane and transferred through uplink interface systems to a highly modified Bendix PB-20D autopilot. Both proportional and discrete commands were generated by the ground pilot. Prior to flight tests, extensive simulation was conducted during the development of ground based digital control laws. The control laws included primary control, secondary control, and racetrack and final approach guidance. Extensive ground checks were performed on all remotely piloted systems. However, manned flight tests were the primary method of verification and validation of control law concepts developed from simulation. The design, development, and flight testing of control laws and the systems required to accomplish the remotely piloted mission are discussed.

  3. Code Verification Results of an LLNL ASC Code on Some Tri-Lab Verification Test Suite Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Anderson, S R; Bihari, B L; Salari, K

    As scientific codes become more complex and involve larger numbers of developers and algorithms, chances for algorithmic implementation mistakes increase. In this environment, code verification becomes essential to building confidence in the code implementation. This paper will present first results of a new code verification effort within LLNL's B Division. In particular, we will show results of code verification of the LLNL ASC ARES code on the test problems: Su Olson non-equilibrium radiation diffusion, Sod shock tube, Sedov point blast modeled with shock hydrodynamics, and Noh implosion.

  4. Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators

    NASA Technical Reports Server (NTRS)

    Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)

    2002-01-01

    Ground based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of specific structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.

  5. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    Considerations regarding the Space Transportation System (STS) payload environmental verification are reviewed. It is noted that emphasis is placed on testing at the subassembly level and that the basic objective of structural dynamic payload verification is to ensure reliability in a cost-effective manner. Structural analyses consist of: (1) stress analysis for critical loading conditions, (2) modal analysis for launch and orbital configurations, (3) flight loads analysis, (4) test simulation analysis to verify models, (5) kinematic analysis of deployment/retraction sequences, and (6) structural-thermal-optical program analysis. In addition to these approaches, payload verification programs are being developed in the thermal-vacuum area. These include the exposure to extreme temperatures, temperature cycling, thermal-balance testing, and thermal-vacuum testing.

  6. Space telescope observatory management system preliminary test and verification plan

    NASA Technical Reports Server (NTRS)

    Fritz, J. S.; Kaldenbach, C. F.; Williams, W. B.

    1982-01-01

    The preliminary plan for the Space Telescope Observatory Management System Test and Verification (TAV) is provided. Methodology, test scenarios, test plans and procedure formats, schedules, and the TAV organization are included. Supporting information is provided.

  7. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis [Saint Paul, MN; Mahadevan, Karthikeyan [Mountain View, CA

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
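
    A minimal sketch of the hop-by-hop chaining described above, with a hypothetical topology and a stubbed check_signature() standing in for the time-based software-signature test:

    ```python
    # Sketch of the recursive neighbor-verification chain described in the patent
    # abstract. The topology, node names, and check_signature() stub are hypothetical.

    def check_signature(verifier, subject):
        """Placeholder for the time-based software-signature test (one hop apart)."""
        return subject != "n3"              # pretend node n3 fails verification

    def verify_network(start, neighbors):
        verified, failed, stack = {start}, set(), [start]   # main verifier is trusted
        while stack:
            verifier = stack.pop()
            for nbr in neighbors.get(verifier, []):
                if nbr in verified or nbr in failed:
                    continue                # avoid testing a node twice
                if check_signature(verifier, nbr):
                    verified.add(nbr)
                    stack.append(nbr)       # newly verified node checks its own neighbors
                else:
                    failed.add(nbr)         # downstream halted; another path may still reach them
        return verified, failed

    topology = {"v": ["n1", "n2"], "n1": ["n3", "n4"], "n2": ["n4"], "n4": ["n5"]}
    print(verify_network("v", topology))
    ```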

  8. 76 FR 75878 - Information Collection Being Submitted to the Office of Management and Budget for Review and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    ...: 3060-0329. Title: Section 2.955, Equipment Authorization-Verification (Retention of Records). Form No.... Section 2.955 describes for each equipment device subject to verification, the responsible party, as shown... performing the verification testing. The Commission may request additional information regarding the test...

  9. Orbit attitude processor. STS-1 bench program verification test plan

    NASA Technical Reports Server (NTRS)

    Mcclain, C. R.

    1980-01-01

    A plan for the static verification of the STS-1 ATT PROC ORBIT software requirements is presented. The orbit version of the SAPIENS bench program is used to generate the verification data. A brief discussion of the simulation software and flight software modules is presented along with a description of the test cases.

  10. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Nox and N2o... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  11. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Nox and N2o... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  12. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Nox and N2o... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  13. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications Nox and N2o... recommend that you extract engine exhaust to perform this verification. Use a CLD that meets the..., if one is used during testing, introduce the engine exhaust to the NDUV analyzer. (4) Allow time for...

  14. Generic Verification Protocol for Testing Pesticide Application Spray Drift Reduction Technologies for Row and Field Crops (Version 1.4)

    EPA Science Inventory

    This generic verification protocol provides a detailed method for conducting and reporting results from verification testing of pesticide application technologies. It can be used to evaluate technologies for their potential to reduce spray drift, hence the term “drift reduction t...

  15. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  16. Environmental Technology Verification Report -- Baghouse filtration products, GE Energy QG061 filtration media ( tested May 2007)

    EPA Science Inventory

    EPA has created the Environmental Technology Verification Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The Air Pollution Control Technology Verification Center, a cente...

  17. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270 and... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066...

  18. Design and Verification of Space Station EVA-Operated Truss Attachment System

    NASA Technical Reports Server (NTRS)

    Katell, Gabriel

    2001-01-01

    This paper describes the design and verification of a system used to attach two segments of the International Space Station (ISS). This system was first used in space to mate the P6 and Z1 trusses together in December 2000, through a combination of robotic and extravehicular tasks. Features that provided capture, coarse alignment, and fine alignment during the berthing process are described. Attachment of this high value hardware was critical to the ISS's sequential assembly, necessitating the inclusion of backup design and operational features. Astronauts checked for the proper performance of the alignment and bolting features during on-orbit operations. During berthing, the system accommodates truss-to-truss relative displacements that are caused by manufacturing tolerances and on-orbit thermal gradients. After bolt installation, the truss interface becomes statically determinate with respect to in-plane shear loads and isolates attach bolts from bending moments. The approach used to estimate relative displacements and the means of accommodating them is explained. Confidence in system performance was achieved through a cost-effective collection of tests and analyses, including thermal, structural, vibration, misalignment, contact dynamics, underwater simulation, and full-scale functional testing. Design considerations that have potential application to other mechanisms include accommodating variations of friction coefficients in the on-orbit joints, wrench torque tolerances, joint preload, moving element clearances at temperature extremes, and bolt-nut torque reaction.
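
    As a rough illustration of the relative-displacement estimates mentioned above, a first-order thermal-expansion calculation; the coefficient, span, and temperature difference below are hypothetical, not the ISS truss values:

    ```python
    # First-order estimate of truss-to-truss relative thermal displacement,
    # delta_L = alpha * L * delta_T. All numbers below are hypothetical.

    alpha_al = 23e-6      # 1/K, typical aluminum coefficient of thermal expansion
    span_m = 1.0          # interface span over which the gradient acts (m)
    delta_T = 60.0        # K, assumed temperature difference between mating structures

    delta_L = alpha_al * span_m * delta_T
    print(f"relative displacement ~ {delta_L * 1000:.2f} mm")   # ~1.38 mm
    ```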

  19. General Dynamic (GD) Launch Waveform On-Orbit Performance Report

    NASA Technical Reports Server (NTRS)

    Briones, Janette C.; Shalkhauser, Mary Jo

    2014-01-01

    The purpose of this report is to present the results from the GD SDR on-orbit performance testing using the launch waveform over TDRSS. The tests include the evaluation of well-tested waveform modes, the operation of RF links that are expected to have high margins, the verification of forward and return link operation (including full duplex), the verification of non-coherent operational modes, and the verification of the radio's at-launch operational frequencies. This report also outlines the launch waveform tests conducted and comparisons to the results obtained from ground testing.

  20. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the KSC Shuttle Landing Facility, the Italian Space Agency's Multi-Purpose Logistics Module Donatello begins rolling out of the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  1. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the KSC Shuttle Landing Facility, an Airbus "Beluga" air cargo plane opens to reveal its cargo, the Italian Space Agency's Multi-Purpose Logistics Module Donatello, from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  2. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the KSC Shuttle Landing Facility, the Italian Space Agency's Multi-Purpose Logistics Module Donatello rolls out of the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  3. Ground vibration tests of a high fidelity truss for verification of on orbit damage location techniques

    NASA Technical Reports Server (NTRS)

    Kashangaki, Thomas A. L.

    1992-01-01

    This paper describes a series of modal tests that were performed on a cantilevered truss structure. The goal of the tests was to assemble a large database of high quality modal test data for use in verification of proposed methods for on orbit model verification and damage detection in flexible truss structures. A description of the hardware is provided along with details of the experimental setup and procedures for 16 damage cases. Results from selected cases are presented and discussed. Differences between ground vibration testing and on orbit modal testing are also described.

  4. Revisiting Training and Verification Process Implementation for Risk Reduction on New Missions at NASA Jet Propulsion Laboratory

    NASA Technical Reports Server (NTRS)

    Bryant, Larry W.; Fragoso, Ruth S.

    2007-01-01

    In 2003 we proposed an effort to develop a core program of standardized training and verification practices and standards against which the implementation of these practices could be measured. The purpose was to provide another means of risk reduction for deep space missions to preclude the likelihood of a repeat of the tragedies of the 1998 Mars missions. We identified six areas where the application of standards and standardization would benefit the overall readiness process for flight projects at JPL. These are Individual Training, Team Training, Interface and Procedure Development, Personnel Certification, Interface and Procedure Verification, and Operations Readiness Testing. In this paper we will discuss the progress that has been made in the tasks of developing the proposed infrastructure in each of these areas. Specifically we will address the Position Training and Certification Standards that are now available for each operational position found on our Flight Operations Teams (FOT). We will also discuss the MGSS Baseline Flight Operations Team Training Plan, which can be tailored for each new flight project at JPL. As these tasks have been progressing, the climate and emphasis for Training and for V and V at JPL have changed, and we have learned about the expansion, growth, and limitations in the roles of traditional positions at JPL such as the Project's Training Engineer, V and V Engineer, and Operations Engineer. The need to keep a tight rein on budgets has led to a merging and/or reduction in these positions, which poses challenges to individual capacities and capabilities. We examine the evolution of these processes and the roles involved while taking a look at the impact or potential impact of our proposed training-related infrastructure tasks. As we conclude our examination of the changes taking place for new flight projects, we see that proceeding with our proposed tasks and adapting them to the changing climate remains an important element in reducing the risk in the challenging business of space exploration.

  5. Verification and Validation Studies for the LAVA CFD Solver

    NASA Technical Reports Server (NTRS)

    Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.

    2013-01-01

    The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described incorporating verification tests, validation benchmarks, continuous integration and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of 2D Euler equations, 3D Navier-Stokes equations as well as turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and ONERA M6 wing are validated against experimental and numerical data.
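
    The Method of Manufactured Solutions cited above verifies a discretization by choosing an analytic solution, deriving the source term it implies, and confirming the expected convergence rate under grid refinement. A minimal 1-D Poisson sketch of the general technique (not the LAVA solver or its equations):

    ```python
    import numpy as np

    # MMS sketch for -u'' = f on [0,1], u(0)=u(1)=0, second-order central differences.
    # Manufactured solution u(x) = sin(pi x)  =>  f(x) = pi^2 sin(pi x).

    def solve(n):
        x = np.linspace(0.0, 1.0, n + 1)
        h = 1.0 / n
        f = np.pi**2 * np.sin(np.pi * x[1:-1])
        # Tridiagonal system for the interior unknowns
        A = (np.diag(2.0 * np.ones(n - 1))
             - np.diag(np.ones(n - 2), 1)
             - np.diag(np.ones(n - 2), -1)) / h**2
        u = np.zeros(n + 1)
        u[1:-1] = np.linalg.solve(A, f)
        return np.max(np.abs(u - np.sin(np.pi * x)))   # discrete max-norm error

    e1, e2 = solve(32), solve(64)
    order = np.log2(e1 / e2)     # observed order; should approach 2.0
    print(f"errors: {e1:.3e}, {e2:.3e}, observed order ~ {order:.2f}")
    ```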

  6. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  7. The New Payload Handling System for the German On-Orbit Verification Satellite TET with the Sensor Bus as Example for Payloads

    NASA Astrophysics Data System (ADS)

    Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.

    2008-08-01

    The technology verification satellite TET (Technologie ErprobungsTräger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support the German space industry and research facilities in the on-orbit verification of satellite technologies. The TET satellite is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite consists of a bus based on the successfully operated satellite BIRD and a newly developed payload platform with the new payload handling system called NVS (Nutzlastversorgungssystem). The NVS comprises three major parts: the power supply, the processor boards, and the I/O interfaces. The NVS is realized as several PCBs in Europe format, which are connected to each other via an integrated backplane. The payloads are connected to the NVS by front connectors. This paper describes the concept, architecture, and hardware/software of the NVS. Phase B of this project was successfully finished last year.

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT; UV DISINFECTION FOR REUSE APPLICATION, AQUIONICS, INC. BERSONINLINE 4250 UV SYSTEM

    EPA Science Inventory

    Verification testing of the Aquionics, Inc. bersonInLine® 4250 UV System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills Wastewater Treatment Plant test site in Parsippany, New Jersey. Two full-scale reactors were mounted in series. T...

  9. A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems

    DTIC Science & Technology

    2016-03-01

    insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of...licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional...language requirements to testable (preferably machine testable) specifications • Design of architectures that treat development and verification of

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: UV DISINFECTION FOR REUSE APPLICATIONS, ONDEO DEGREMONT, INC., AQUARAY® 40 HO VLS DISINFECTION SYSTEM

    EPA Science Inventory

    Verification testing of the Ondeo Degremont, Inc. Aquaray® 40 HO VLS Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Three reactor modules were m...

  11. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  12. UAS Integration in the NAS Project: Part Task 6 V & V Simulation: Primary Results

    NASA Technical Reports Server (NTRS)

    Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor

    2016-01-01

    This is a presentation of the preliminary results on final V and V (Verification and Validation) activity of [RTCA (Radio Technical Commission for Aeronautics)] SC (Special Committee)-228 DAA (Detect and Avoid) HMI (Human-Machine Interface) requirements for display alerting and guidance.

  13. Digital conversion of INEL archeological data using ARC/INFO and Oracle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, R.D.; Brizzee, J.; White, L.

    1993-11-04

    This report documents the procedures used to convert archaeological data for the INEL to digital format, lists the equipment used, and explains the verification and validation steps taken to check data entry. It also details the production of an engineered interface between ARC/INFO and Oracle.

  14. Toward Ada Verification: A Collection of Relevant Topics

    DTIC Science & Technology

    1986-06-01

    presumably it is this: if there are no default values, a programming error which results in failure to initialize a variable is more likely to advertise ... disadvantages to using AVID. First, TDL is a more complicated interface than first-order logic (as used in the CSG). Second, AVID is unsupported and

  15. How Formal Dynamic Verification Tools Facilitate Novel Concurrency Visualizations

    NASA Astrophysics Data System (ADS)

    Aananthakrishnan, Sriram; Delisi, Michael; Vakkalanka, Sarvani; Vo, Anh; Gopalakrishnan, Ganesh; Kirby, Robert M.; Thakur, Rajeev

    With the exploding scale of concurrency, presenting valuable pieces of information collected by formal verification tools intuitively and graphically can greatly enhance concurrent system debugging. Traditional MPI program debuggers present trace views of MPI program executions. Such views are redundant, often containing equivalent traces that permute independent MPI calls. In our ISP formal dynamic verifier for MPI programs, we present a collection of alternate views made possible by the use of formal dynamic verification. Some of ISP's views help pinpoint errors, some facilitate discerning errors by eliminating redundancy, while others help understand the program better by displaying concurrent event orderings that must be respected by all MPI implementations, in the form of completes-before graphs. In this paper, we describe ISP's graphical user interface (GUI) capabilities in all these areas, which are currently supported by a portable Java-based GUI, a Microsoft Visual Studio GUI, and an Eclipse-based GUI whose development is in progress.

  16. Verification and transfer of thermal pollution model. Volume 3: Verification of 3-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.; Sinha, S. K.

    1982-01-01

    The six-volume report: describes the theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free-surface model also provides surface height variations with time.

  17. Verification and transfer of thermal pollution model. Volume 5: Verification of 2-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report: describes the theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to mean water depth; e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  18. Highly efficient simulation environment for HDTV video decoder in VLSI design

    NASA Astrophysics Data System (ADS)

    Mao, Xun; Wang, Wei; Gong, Huimin; He, Yan L.; Lou, Jian; Yu, Lu; Yao, Qingdong; Pirsch, Peter

    2002-01-01

    With the increasing complexity of VLSI designs, especially SoCs (Systems on Chip) such as an MPEG-2 video decoder with HDTV scalability, simulation and verification of the full design, even at the behavioral level in HDL, often prove to be very slow and costly, and it is difficult to perform full verification until late in the design process. They therefore become a bottleneck in the HDTV video decoder design procedure and strongly influence its time to market. In this paper, the architecture of the hardware/software interface of an HDTV video decoder is studied, and a Hardware-Software Mixed Simulation (HSMS) platform is proposed to check and correct errors in the early design stage, based on the MPEG-2 video decoding algorithm. The application of HSMS to the target system could be achieved by employing several of the approaches introduced. Those approaches speed up the simulation and verification task without decreasing performance.

  19. Is it Code Imperfection or 'garbage in Garbage Out'? Outline of Experiences from a Comprehensive Adr Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice to solve these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the context of computational PDEs, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even the test results need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. Therefore, it is well known that code verification is something of an art, in which innovative methods and case-based tricks are very common. This study presents full verification of a general transport code. To that end, a complete test suite is designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors which were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, starting from simple tests and building up to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests. These tests compare the results' observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of ADR equations for surface/subsurface pollution transport.
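
    When no exact benchmark exists, the mesh-convergence and Richardson-extrapolation checks mentioned above can estimate the observed order of accuracy from three grid levels. A generic sketch with hypothetical values (not the solver in this study):

    ```python
    import math

    # Observed order of accuracy from three grid levels with constant refinement
    # ratio r, using a monitored solution value q on each grid (hypothetical numbers).
    q_coarse, q_medium, q_fine = 1.0480, 1.0120, 1.0030
    r = 2.0

    p = math.log(abs(q_coarse - q_medium) / abs(q_medium - q_fine)) / math.log(r)
    q_exact_est = q_fine + (q_fine - q_medium) / (r**p - 1.0)   # Richardson estimate

    print(f"observed order p ~ {p:.2f}")                # ~2.0 here
    print(f"extrapolated 'exact' value ~ {q_exact_est:.4f}")
    ```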

  20. A service-oriented architecture for integrating the modeling and formal verification of genetic regulatory networks

    PubMed Central

    2009-01-01

    Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075

  1. Universal Linear Optics: An implementation of Boson Sampling on a Fully Reconfigurable Circuit

    NASA Astrophysics Data System (ADS)

    Harrold, Christopher; Carolan, Jacques; Sparrow, Chris; Russell, Nicholas J.; Silverstone, Joshua W.; Marshall, Graham D.; Thompson, Mark G.; Matthews, Jonathan C. F.; O'Brien, Jeremy L.; Laing, Anthony; Martín-López, Enrique; Shadbolt, Peter J.; Matsuda, Nobuyuki; Oguma, Manabu; Itoh, Mikitaka; Hashimoto, Toshikazu

    Linear optics has paved the way for fundamental tests in quantum mechanics and has gone on to enable a broad range of quantum information processing applications for quantum technologies. We demonstrate an integrated photonics processor that is universal for linear optics. The device is a silica-on-silicon planar waveguide circuit (PLC) comprising a cascade of 15 Mach Zehnder interferometers, with 30 directional couplers and 30 tunable thermo-optic phase shifters which are electrically interfaced for the arbitrary setting of a phase. We input ensembles of up to six photons, and monitor the output with a 12-single-photon detector system. The calibrated device is capable of implementing any linear optical protocol. This enables the implementation of new quantum information processing tasks in seconds, which would have previously taken months to realise. We demonstrate 100 instances of the boson sampling problem with verification tests, and six-dimensional complex Hadamards.

  2. Design and evaluation of a low-cost smartphone pulse oximeter.

    PubMed

    Petersen, Christian L; Chen, Tso P; Ansermino, J Mark; Dumont, Guy A

    2013-12-06

    Infectious diseases such as pneumonia take the lives of millions of children in low- and middle-income countries every year. Many of these deaths could be prevented with the availability of robust and low-cost diagnostic tools using integrated sensor technology. Pulse oximetry in particular, offers a unique non-invasive and specific test for an increase in the severity of many infectious diseases such as pneumonia. If pulse oximetry could be delivered on widely available mobile phones, it could become a compelling solution to global health challenges. Many lives could be saved if this technology was disseminated effectively in the affected regions of the world to rescue patients from the fatal consequences of these infectious diseases. We describe the implementation of such an oximeter that interfaces a conventional clinical oximeter finger sensor with a smartphone through the headset jack audio interface, and present a simulator-based systematic verification system to be used for automated validation of the sensor interface on different smartphones and media players. An excellent agreement was found between the simulator and the audio oximeter for both oxygen saturation and heart rate over a wide range of optical transmission levels on 4th and 5th generations of the iPod Touch™ and iPhone™ devices.
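
    For orientation, transmissive pulse oximeters commonly estimate SpO2 from the red/infrared "ratio of ratios" with an empirical calibration. A toy sketch on synthetic waveforms; the linear calibration constants are a generic assumption, not the calibration of the audio oximeter described here:

    ```python
    import numpy as np

    t = np.arange(0.0, 10.0, 0.01)               # 10 s of synthetic samples at 100 Hz
    heart_hz = 1.2                                # 72 beats per minute

    red = 1.00 + 0.02 * np.sin(2 * np.pi * heart_hz * t)   # synthetic red PPG channel
    ir  = 1.00 + 0.04 * np.sin(2 * np.pi * heart_hz * t)   # synthetic infrared PPG channel

    def ac_over_dc(sig):
        return (sig.max() - sig.min()) / sig.mean()   # crude peak-to-peak AC over DC

    R = ac_over_dc(red) / ac_over_dc(ir)          # "ratio of ratios"
    spo2 = 110.0 - 25.0 * R                       # generic linear calibration (assumed)
    print(f"R = {R:.2f}, estimated SpO2 ~ {spo2:.1f}%, HR = {heart_hz * 60:.0f} bpm")
    ```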

  3. Design and Evaluation of a Low-Cost Smartphone Pulse Oximeter

    PubMed Central

    Petersen, Christian L.; Chen, Tso P.; Ansermino, J. Mark; Dumont, Guy A.

    2013-01-01

    Infectious diseases such as pneumonia take the lives of millions of children in low- and middle-income countries every year. Many of these deaths could be prevented with the availability of robust and low-cost diagnostic tools using integrated sensor technology. Pulse oximetry in particular, offers a unique non-invasive and specific test for an increase in the severity of many infectious diseases such as pneumonia. If pulse oximetry could be delivered on widely available mobile phones, it could become a compelling solution to global health challenges. Many lives could be saved if this technology was disseminated effectively in the affected regions of the world to rescue patients from the fatal consequences of these infectious diseases. We describe the implementation of such an oximeter that interfaces a conventional clinical oximeter finger sensor with a smartphone through the headset jack audio interface, and present a simulator-based systematic verification system to be used for automated validation of the sensor interface on different smartphones and media players. An excellent agreement was found between the simulator and the audio oximeter for both oxygen saturation and heart rate over a wide range of optical transmission levels on 4th and 5th generations of the iPod Touch™ and iPhone™ devices. PMID:24322563

  4. Verification of component mode techniques for flexible multibody systems

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1990-01-01

    Investigations were conducted into the modeling aspects of flexible multibodies undergoing large angular displacements. Models were to be generated and analyzed through application of computer simulation packages employing the 'component mode synthesis' techniques. The Multibody Modeling, Verification and Control Laboratory (MMVC) plan was implemented, which includes running experimental tests on flexible multibody test articles. From these tests, data was to be collected for later correlation and verification of the theoretical results predicted by the modeling and simulation process.

  5. NASA Operational Simulator for Small Satellites: Tools for Software Based Validation and Verification of Small Satellites

    NASA Technical Reports Server (NTRS)

    Grubb, Matt

    2016-01-01

    The NASA Operational Simulator for Small Satellites (NOS3) is a suite of tools to aid in areas such as software development, integration test (IT), mission operations training, verification and validation (V&V), and software systems check-out. NOS3 provides a software development environment, a multi-target build system, an operator interface-ground station, dynamics and environment simulations, and software-based hardware models. NOS3 enables the development of flight software (FSW) early in the project life cycle, when access to hardware is typically not available. For small satellites there are extensive lead times on many of the commercial-off-the-shelf (COTS) components as well as limited funding for engineering test units (ETU). Considering the difficulty of providing a hardware test-bed to each developer and tester, hardware models are built based upon characteristic data or manufacturers' data sheets for each individual component. The fidelity of each hardware model is such that the FSW executes unaware that physical hardware is not present. This allows binaries to be compiled for both the simulation environment and the flight computer, without changing the FSW source code. For hardware models that provide data dependent on the environment, such as a GPS receiver or magnetometer, an open-source tool from NASA GSFC (42 Spacecraft Simulation) is used to provide the necessary data. The underlying infrastructure used to transfer messages between FSW and the hardware models can also be used to monitor, intercept, and inject messages, which has proven to be beneficial for V&V of larger missions such as James Webb Space Telescope (JWST). As hardware is procured, drivers can be added to the environment to enable hardware-in-the-loop (HWIL) testing. When strict time synchronization is not vital, any number of combinations of hardware components and software-based models can be tested. The open-source operator interface used in NOS3 is COSMOS from Ball Aerospace. For testing, plug-ins are implemented in COSMOS to control the NOS3 simulations, while the command and telemetry tools available in COSMOS are used to communicate with FSW. NOS3 is actively being used for FSW development and component testing of the Simulation-to-Flight 1 (STF-1) CubeSat. As NOS3 matures, hardware models have been added for common CubeSat components such as Novatel GPS receivers, ClydeSpace electrical power systems and batteries, ISISpace antenna systems, etc. In the future, NASA IV&V plans to distribute NOS3 to other CubeSat developers and release the suite to the open-source community.
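
    A toy sketch of the monitor/intercept/inject pattern described above: a software message bus between flight software and a hardware model, with a hook that can observe or rewrite traffic. All class names and message formats are hypothetical, not the actual NOS3 interfaces:

    ```python
    # Toy message bus between "flight software" and a software-based hardware model,
    # with an interceptor hook for monitoring or fault injection. Hypothetical names.

    class Bus:
        def __init__(self):
            self.handlers, self.interceptors = {}, []

        def register(self, addr, handler):
            self.handlers[addr] = handler

        def add_interceptor(self, fn):
            self.interceptors.append(fn)      # fn(addr, msg) -> msg (possibly modified)

        def send(self, addr, msg):
            for fn in self.interceptors:
                msg = fn(addr, msg)           # monitor / intercept / inject here
            return self.handlers[addr](msg)

    class GpsModel:
        """Software hardware model: answers like hardware so the FSW is unaware."""
        def handle(self, msg):
            if msg == "GET_POS":
                return {"lat": 28.57, "lon": -80.65, "alt_km": 410.0}
            return {"error": "unknown command"}

    def monitor(addr, msg):
        print(f"[monitor] {addr} <- {msg}")
        return msg                            # pass the message through unchanged

    bus = Bus()
    bus.register("gps", GpsModel().handle)
    bus.add_interceptor(monitor)
    print(bus.send("gps", "GET_POS"))
    ```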

  6. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  7. TEST/QA PLAN FOR THE VERIFICATION TESTING OF ALTERNATIVE OR REFORMULATED LIQUID FUELS, FUEL ADDITIVES, FUEL EMULSIONS, AND LUBRICANTS FOR HIGHWAY AND NONROAD USE HEAVY DUTY DIESEL ENGINES AND LIGHT DUTY GASOLINE ENGINES AND VEHICLES

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  8. Software design and code generation for the engineering graphical user interface of the ASTRI SST-2M prototype for the Cherenkov Telescope Array

    NASA Astrophysics Data System (ADS)

    Tanci, Claudio; Tosti, Gino; Antolini, Elisa; Gambini, Giorgio F.; Bruno, Pietro; Canestrari, Rodolfo; Conforti, Vito; Lombardi, Saverio; Russo, Federico; Sangiorgi, Pierluca; Scuderi, Salvatore

    2016-08-01

    ASTRI is an on-going project developed in the framework of the Cherenkov Telescope Array (CTA). An end-to-end prototype of a dual-mirror small-size telescope (SST-2M) has been installed at the INAF observing station on Mt. Etna, Italy. The next step is the development of the ASTRI mini-array composed of nine ASTRI SST-2M telescopes proposed to be installed at the CTA southern site. The ASTRI mini-array is a collaborative and international effort carried out by Italy, Brazil, and South Africa and led by the Italian National Institute of Astrophysics, INAF. To control the ASTRI telescopes, a specific ASTRI Mini-Array Software System (MASS) was designed using a scalable and distributed architecture to monitor all the hardware devices for the telescopes. Using code generation, we automatically built from the ASTRI Interface Control Documents a set of communication libraries and extensive graphical user interfaces that provide full access to the capabilities offered by the telescope hardware subsystems for testing and maintenance. Leveraging these generated libraries and components, we then implemented a human-designed, integrated engineering GUI for MASS to perform the verification of the whole prototype and test shared services such as the alarms, configurations, control systems, and scientific on-line outcomes. In our experience, the use of code generation dramatically reduced the amount of effort in development, integration and testing of the more basic software components and resulted in a fast software release life cycle. This approach could be valuable for the whole CTA project, characterized by a large diversity of hardware components.
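
    A toy illustration of generating interface code from a declarative description, in the spirit of the ICD-driven generation described above; the dictionary-based "ICD" and the emitted accessor class are hypothetical, not the ASTRI tool chain:

    ```python
    # Toy generator: emit a monitoring-point accessor class from a declarative
    # interface description. The dict-based "ICD" below is hypothetical.
    ICD = {
        "subsystem": "MountDrive",
        "points": [
            {"name": "azimuth_deg", "type": "float"},
            {"name": "elevation_deg", "type": "float"},
            {"name": "drive_enabled", "type": "bool"},
        ],
    }

    def generate(icd):
        lines = [f"class {icd['subsystem']}Interface:",
                 "    def __init__(self, link):",
                 "        self._link = link", ""]
        for p in icd["points"]:
            lines += [f"    def get_{p['name']}(self) -> {p['type']}:",
                      f"        return {p['type']}(self._link.read('{p['name']}'))", ""]
        return "\n".join(lines)

    print(generate(ICD))   # the generated source could be written to a .py file
    ```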

  9. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer, by setting the verification goals, starts with a structured plan. Those goals are targeted further by a developed testbench, which generates legal stimuli and sends them to a device under test (DUT). The progress is measured by coverage monitors added to the simulation environment. In this way, the non-exercised functionality can be identified. Moreover, the additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
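
    A language-agnostic sketch of the coverage-driven loop described above (constrained-random stimulus, a self-checking scoreboard, and coverage bins); this is a conceptual analogue only, not UVM itself, which is a SystemVerilog methodology:

    ```python
    import random

    # Conceptual coverage-driven verification loop: random stimulus, a reference
    # model plus scoreboard check, and coverage bins tracking exercised cases.

    def dut_add(a, b):
        return (a + b) & 0xFF              # toy 8-bit adder standing in for the DUT

    coverage = {"small": 0, "large": 0, "overflow": 0}
    errors = 0
    random.seed(0)

    for _ in range(1000):
        a, b = random.randrange(256), random.randrange(256)   # constrained-random stimulus
        expected = (a + b) % 256                               # reference model
        if dut_add(a, b) != expected:                          # scoreboard check
            errors += 1
        if a + b < 32:                                         # functional coverage bins
            coverage["small"] += 1
        elif a + b > 255:
            coverage["overflow"] += 1
        else:
            coverage["large"] += 1

    print("errors:", errors, "coverage:", coverage)
    ```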

  10. RELAP5-3D Resolution of Known Restart/Backup Issues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mesina, George L.; Anderson, Nolan A.

    2014-12-01

    The state-of-the-art nuclear reactor system safety analysis computer program developed at the Idaho National Laboratory (INL), RELAP5-3D, continues to adapt to changes in computer hardware and software and to develop to meet the ever-expanding needs of the nuclear industry. To continue at the forefront, code testing must evolve with both code and industry developments, and it must work correctly. To best ensure this, the processes of Software Verification and Validation (V&V) are applied. Verification compares coding against its documented algorithms and equations and compares its calculations against analytical solutions and the method of manufactured solutions. A form of this, sequential verification, checks code specifications against coding only when originally written, then applies regression testing, which compares code calculations between consecutive updates or versions on a set of test cases to check that the performance does not change. A sequential verification testing system was specially constructed for RELAP5-3D to both detect errors with extreme accuracy and cover all nuclear-plant-relevant code features. Detection is provided through a “verification file” that records double precision sums of key variables. Coverage is provided by a test suite of input decks that exercise code features and capabilities necessary to model a nuclear power plant. A matrix of test features and short-running cases that exercise them is presented. This testing system is used to test base cases (called null testing) as well as restart and backup cases. It can test RELAP5-3D performance in both standalone and coupled (through PVM to other codes) runs. Application of verification testing revealed numerous restart and backup issues in both standalone and coupled modes. This document reports the resolution of these issues.
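
    A minimal sketch of the regression comparison described above: recorded sums of key variables from a baseline run are checked against a new run within a tolerance. The variable names, values, and tolerance are hypothetical, not RELAP5-3D's verification-file format:

    ```python
    # Sketch of a regression comparator in the spirit of the "verification file"
    # approach: double-precision sums of key variables from a baseline run are
    # compared against a new run. All names and numbers are hypothetical.

    TOL = 1.0e-12   # relative tolerance; sequential verification expects near-identity

    baseline = {"sum_pressure": 1.0234567890123e7, "sum_voidf": 3.210987654321e2}
    new_run  = {"sum_pressure": 1.0234567890123e7, "sum_voidf": 3.210987654399e2}

    def compare(base, new, tol=TOL):
        failures = []
        for key, ref in base.items():
            rel = abs(new[key] - ref) / max(abs(ref), 1.0)   # relative drift
            if rel > tol:
                failures.append((key, rel))
        return failures

    diffs = compare(baseline, new_run)
    print("PASS" if not diffs else f"FAIL: {diffs}")
    ```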

  11. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and under construction now, is comprised of a 3.2 Giga-pixel imager and a three element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will be taking place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.

  12. Implementation and verification of global optimization benchmark problems

    NASA Astrophysics Data System (ADS)

    Posypkin, Mikhail; Usov, Alexander

    2017-12-01

    The paper considers the implementation and verification of a test suite containing 150 benchmarks for global deterministic box-constrained optimization. A C++ library for describing standard mathematical expressions was developed for this purpose. The library automates the process of generating the value of a function and its gradient at a given point, and the interval estimates of a function and its gradient on a given box, using a single description. Based on this functionality, we have developed a collection of tests for an automatic verification of the proposed benchmarks. The verification has shown that the literature sources contain mistakes in the benchmark descriptions. The library and the test suite are available for download and can be used freely.
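
    A toy version of the single-description idea above: one expression object yields the point value, the derivative, and an interval enclosure. This is a simplified 1-D stand-in, not the C++ library described in the paper:

    ```python
    # Toy single-description expression node giving value, derivative, and an
    # interval enclosure (1-D only); a simplified stand-in for the C++ library.

    class Expr:
        def __init__(self, val, der, lo, hi):
            self.val, self.der, self.lo, self.hi = val, der, lo, hi

        def __add__(self, o):
            return Expr(self.val + o.val, self.der + o.der, self.lo + o.lo, self.hi + o.hi)

        def __mul__(self, o):
            prods = [self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi]
            return Expr(self.val * o.val,
                        self.der * o.val + self.val * o.der,   # product rule
                        min(prods), max(prods))                # interval product

    def var(x, lo, hi):
        return Expr(x, 1.0, lo, hi)

    def const(c):
        return Expr(c, 0.0, c, c)

    # f(x) = x*x + 3: value/derivative at x = 2, enclosure over the box [-1, 2]
    x = var(2.0, -1.0, 2.0)
    f = x * x + const(3.0)
    print(f.val, f.der, (f.lo, f.hi))   # 7.0 4.0 (1.0, 7.0); the enclosure may overestimate
    ```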

  13. 40 CFR 1065.372 - NDUV analyzer HC and H2O interference verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... compensation algorithms that utilize measurements of other gases to meet this interference verification, simultaneously conduct such measurements to test the algorithms during the analyzer interference verification. (c...

  14. Idaho out-of-service verification field operational test

    DOT National Transportation Integrated Search

    2000-02-01

    The Out-of-Service Verification Field Operational Test Project was initiated in 1994. The purpose of the project was to test the feasibility of using sensors and a computerized tracking system to augment the ability of inspectors to monitor and contr...

  15. Angulated Dental Implants in Posterior Maxilla FEA and Experimental Verification.

    PubMed

    Hamed, Hamed A; Marzook, Hamdy A; Ghoneem, Nahed E; El-Anwar, Mohamed I

    2018-02-15

    This study aimed to evaluate the effect of different implant angulations in the posterior maxilla on stress distribution by finite element analysis and to verify its results experimentally. Two simplified models were prepared for an implant placed vertically and one tilted 25°, piercing the maxillary sinus. The geometric models' components were prepared in Autodesk Inventor and then assembled in ANSYS for finite element analysis. The results of the finite element analysis were verified against experimental trial results, which were statistically analysed using Student's t-test (level of significance p < 0.05). The implant-abutment complex absorbed the load energy better in the case of the vertical implant than in the case of the angulated one. This was reflected in the cortical bone stress, while both cases showed stress levels within the physiological limits. Comparing results between the FEA and the experimental trials showed full agreement. It was found that an implant tilted by 25° can be utilised in the posterior maxilla for replacing the maxillary first molar while avoiding sinus penetration. The implant-bone interface and the peri-implant bone received the highest von Mises stress. The implant-bone interface with the angulated implant received about 66% more stress than with the straight one.
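
    A minimal sketch of the kind of comparison reported above: a two-sample Student's t-test between FEA-predicted and experimentally measured values at a 0.05 significance level. The values below are invented for illustration, not the study's data:

    ```python
    from scipy import stats

    # Hypothetical microstrain readings; not the study's data.
    fea_microstrain = [412.0, 398.5, 405.2, 410.1, 401.7]
    exp_microstrain = [408.3, 400.1, 407.9, 404.6, 399.2]

    t_stat, p_value = stats.ttest_ind(fea_microstrain, exp_microstrain)
    print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
    print("No significant difference (agreement)" if p_value >= 0.05
          else "Significant difference")
    ```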

  16. Software and Human-Machine Interface Development for Environmental Controls Subsystem Support

    NASA Technical Reports Server (NTRS)

    Dobson, Matthew

    2018-01-01

    The Space Launch System (SLS) is the next premier launch vehicle for NASA. It is the next stage of manned space exploration from American soil, and will be the platform in which we push further beyond Earth orbit. In preparation of the SLS maiden voyage on Exploration Mission 1 (EM-1), the existing ground support architecture at Kennedy Space Center required significant overhaul and updating. A comprehensive upgrade of controls systems was necessary, including programmable logic controller software, as well as Launch Control Center (LCC) firing room and local launch pad displays for technician use. Environmental control acts as an integral component in these systems, being the foremost system for conditioning the pad and extremely sensitive launch vehicle until T-0. The Environmental Controls Subsystem (ECS) required testing and modification to meet the requirements of the designed system, as well as the human factors requirements of NASA software for Validation and Verification (V&V). This term saw significant strides in the progress and functionality of the human-machine interfaces used at the launch pad, and improved integration with the controller code.

  17. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact on aviation safety risk of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and not all potential interactions can be tested, leaving some errors undetected until the system is used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution for addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification; they were grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  18. Fabrication and verification testing of ETM 30 cm diameter ion thrusters

    NASA Technical Reports Server (NTRS)

    Collett, C.

    1977-01-01

    Engineering model designs and acceptance tests are described for the 800 and 900 series 30 cm electron bombardment thrusters. Modifications to the test console for a 1000 hr verification test were made. The 10,000 hr endurance test of the S/N 701 thruster is described, and post-test analysis results are included.

  19. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    EPA Science Inventory

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  20. CAPTIONALS: A computer aided testing environment for the verification and validation of communication protocols

    NASA Technical Reports Server (NTRS)

    Feng, C.; Sun, X.; Shen, Y. N.; Lombardi, Fabrizio

    1992-01-01

    This paper covers verification and protocol validation for distributed computer and communication systems using a computer-aided testing approach. Validation and verification make up the so-called process of conformance testing. Protocol applications that pass conformance testing are then checked to see whether they can operate together; this is referred to as interoperability testing. A new comprehensive approach to protocol testing is presented which addresses: (1) modeling of inter-layer representations for compatibility between conformance and interoperability testing; (2) computational improvement over current testing methods by using the proposed model, including the formulation of new qualitative and quantitative measures and time-dependent behavior; and (3) analysis and evaluation of protocol behavior for interactive testing without extensive simulation.

  1. Environmental Testing Campaign and Verification of Satellite Deimos-2 at INTA

    NASA Astrophysics Data System (ADS)

    Hernandez, Daniel; Vazquez, Mercedes; Anon, Manuel; Olivo, Esperanza; Gallego, Pablo; Morillo, Pablo; Parra, Javier; Capraro; Luengo, Mar; Garcia, Beatriz; Villacorta, Pablo

    2014-06-01

    In this paper the environmental test campaign and verification of the DEIMOS-2 (DM2) satellite are presented and described. DM2 will be ready for launch in 2014. Firstly, a short description of the satellite is presented, including its physical characteristics and intended optical performance. DEIMOS-2 is a LEO satellite for Earth observation that will provide high-resolution imaging services for agriculture, civil protection, environmental issues, disaster monitoring, climate change, urban planning, cartography, security and intelligence. Then, the verification and test campaign carried out on the SM and FM models at INTA is described, including mechanical tests for the SM and climatic, mechanical and electromagnetic compatibility tests for the FM. In addition, this paper includes centre-of-gravity and moment-of-inertia measurements for both models, and other verification activities carried out to ensure the satellite's health during launch and its in-orbit performance.

  2. Results from an Independent View on The Validation of Safety-Critical Space Systems

    NASA Astrophysics Data System (ADS)

    Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.

    2013-08-01

    Independent verification and validation (IV&V) has been a key process for decades and is covered by several international standards. One of the activities described in the “ESA ISVV Guide” is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the testers' work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained and the advantages and disadvantages of these activities for industry. The study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, MIRATECH CORPORATION GECO 3001 AIR/FUEL RATIO CONTROLLER

    EPA Science Inventory

    Details on the verification test design, measurement test procedures, and Quality assurance/Quality Control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...

  4. Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round

    EPA Science Inventory

    Test/QA Plan for Verification of Semi-Continuous Ambient Air Monitoring Systems - Second Round. Changes reflect performance of second round of testing at new location and with various changes to personnel. Additional changes reflect general improvements to the Version 1 test/QA...

  5. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  6. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  7. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  8. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  9. 9 CFR 310.25 - Contamination with microorganisms; process control verification criteria and testing; pathogen...

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... CERTIFICATION POST-MORTEM INSPECTION § 310.25 Contamination with microorganisms; process control verification... testing. (1) Each official establishment that slaughters livestock must test for Escherichia coli Biotype... poultry, shall test the type of livestock or poultry slaughtered in the greatest number. The establishment...

  10. Built-in-Test Verification Techniques

    DTIC Science & Technology

    1987-02-01

    This report documents the results of the effort for the Rome Air Development Center Contract F30602-84-C-0021, BIT Verification Techniques. The work was...Richard Spillman of Spillman Research Associates. The principal investigators were Mike Partridge and subsequently Jeffrey Albert. The contract was...two-year effort to develop techniques for Built-In Test (BIT) verification. The objective of the contract was to develop specifications and technical

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL RESIDENTIAL HOMES, WATERLOO BIOFILTER® MODEL 4-BEDROOM (NSF 02/03/WQPC-SWP)

    EPA Science Inventory

    Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE UV DISINFECTION OF SECONDARY EFFLUENTS, SUNTEC, INC. MODEL LPX200 DISINFECTION SYSTEM - 03/09/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SUNTEC LPX200 UV Disinfection System to develop the UV delivered dose flow relationship was conducted at the Parsippany-Troy Hills wastewater treatment plant test site in Parsippany, New Jersey. Two lamp modules were mounted parallel in a 6.5-meter lon...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, SEPTITECH, INC. MODEL 400 SYSTEM - 02/04/WQPC-SWP

    EPA Science Inventory

    Verification testing of the SeptiTech Model 400 System was conducted over a twelve-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing was u...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, BIO-MICROBICS, INC., MODEL RETROFAST® 0.375

    EPA Science Inventory

    Verification testing of the Bio-Microbics RetroFAST® 0.375 System to determine the reduction of nitrogen in residential wastewater was conducted over a twelve-month period at the Mamquam Wastewater Technology Test Facility, located at the Mamquam Wastewater Treatment Plant. The R...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, AQUAPOINT, INC. BIOCLERE MODEL 16/12 - 02/02/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Aquapoint, Inc. (AQP) BioclereTM Model 16/12 was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...

  16. A finite-element model for moving contact line problems in immiscible two-phase flow

    NASA Astrophysics Data System (ADS)

    Kucala, Alec

    2017-11-01

    Accurate modeling of moving contact line (MCL) problems is imperative in predicting capillary pressure vs. saturation curves, permeability, and preferential flow paths for a variety of applications, including geological carbon storage (GCS) and enhanced oil recovery (EOR). The macroscale movement of the contact line is dependent on the molecular interactions occurring at the three-phase interface; however, most MCL problems require resolution at the meso- and macro-scale. A phenomenological model must be developed to account for the microscale interactions, as resolving both the macro- and micro-scale would render most problems computationally intractable. Here, a model for the moving contact line is presented as a weak forcing term in the Navier-Stokes equation and applied directly at the location of the three-phase interface point. The moving interface is tracked with the level set method and discretized using the conformal decomposition finite element method (CDFEM), allowing for the surface tension and the wetting model to be computed at the exact interface location. A variety of verification test cases for simple two- and three-dimensional geometries are presented to validate the current MCL model, which can exhibit grid independence when a proper scaling for the slip length is chosen. Sandia National Laboratories is a multi-mission laboratory managed and operated by National Technology and Engineering Solutions of Sandia, LLC., a wholly owned subsidiary of Honeywell International, Inc., for the U.S. Department of Energy's National Nuclear Security Administration under contract DE-NA-0003525.

  17. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  18. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  19. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  20. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  1. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during a flight. Verification must include flight testing. ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification...

  2. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
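
    The abstract's Bayesian meta-analytic model is not reproduced here; as a simpler illustration of what a verification-bias adjustment does, the sketch below implements the classic single-study Begg-Greenes correction (verification assumed to depend only on the index test result), with invented counts.

    ```python
    # Hedged illustration: the classic Begg-Greenes correction for partial
    # verification bias in a single study. This is NOT the Bayesian meta-analytic
    # method of the abstract; it only shows what "adjusting for verification"
    # means. All counts are invented placeholders.
    def begg_greenes(n_pos, n_neg, v_pos_dpos, v_pos_dneg, v_neg_dpos, v_neg_dneg):
        """n_pos/n_neg: all test-positive/-negative subjects.
        v_*: verified subjects cross-classified by test result and true disease."""
        # disease prevalence among verified subjects, conditional on the test result
        p_d_given_tpos = v_pos_dpos / (v_pos_dpos + v_pos_dneg)
        p_d_given_tneg = v_neg_dpos / (v_neg_dpos + v_neg_dneg)
        p_tpos = n_pos / (n_pos + n_neg)
        p_tneg = 1.0 - p_tpos
        sens = (p_d_given_tpos * p_tpos) / (
            p_d_given_tpos * p_tpos + p_d_given_tneg * p_tneg)
        spec = ((1 - p_d_given_tneg) * p_tneg) / (
            (1 - p_d_given_tneg) * p_tneg + (1 - p_d_given_tpos) * p_tpos)
        return sens, spec

    print(begg_greenes(n_pos=300, n_neg=700, v_pos_dpos=180, v_pos_dneg=90,
                       v_neg_dpos=10, v_neg_dneg=120))
    ```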

  3. Test/QA Plan for Verification of Ozone Indicator Cards

    EPA Science Inventory

    This verification test will address ozone indicator cards (OICs) that provide short-term semi-quantitative measures of ozone concentration in ambient air. Testing will be conducted under the auspices of the U.S. Environmental Protection Agency (EPA) through the Environmental Tec...

  4. Critical Software for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Preden, Antonio; Kaschner, Jens; Rettig, Felix; Rodriggs, Michael

    2017-01-01

    The NASA Orion vehicle that will fly to the Moon in the coming years is propelled along its mission by the European Service Module (ESM), developed by ESA and its prime contractor Airbus Defense and Space. This paper describes the development of the Propulsion Drive Electronics (PDE) software that provides the interface between the propulsion hardware of the European Service Module and the Orion flight computers, and highlights the challenges faced during the development. In particular, the aspects specific to human spaceflight in an international cooperation are presented, such as compliance with both European and US standards and the software criticality classification at the highest category, A. An innovative aspect of the PDE software is its Time-Triggered Ethernet interface with the Orion flight computers, which has never before flown on any European spacecraft. Finally, the verification aspects are presented, applying the most stringent quality requirements defined in the European Cooperation for Space Standardization (ECSS) standards, such as structural coverage analysis of the object code and recourse to an independent software verification and validation activity carried out in parallel by a different team.

  5. Verification Testing of Air Pollution Control Technology Quality Management Plan Revision 2.3

    EPA Pesticide Factsheets

    The Air Pollution Control Technology Verification Center was established in 1995 as part of the EPA’s Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technologies through performance verification.

  6. Evaluating the performance of the two-phase flow solver interFoam

    NASA Astrophysics Data System (ADS)

    Deshpande, Suraj S.; Anumolu, Lakshman; Trujillo, Mario F.

    2012-01-01

    The performance of the open source multiphase flow solver, interFoam, is evaluated in this work. The solver is based on a modified volume of fluid (VoF) approach, which incorporates an interfacial compression flux term to mitigate the effects of numerical smearing of the interface. It forms a part of the C++ libraries and utilities of OpenFOAM and is gaining popularity in the multiphase flow research community. However, to the best of our knowledge, the evaluation of this solver is confined to the validation tests of specific interest to the users of the code and the extent of its applicability to a wide range of multiphase flow situations remains to be explored. In this work, we have performed a thorough investigation of the solver performance using a variety of verification and validation test cases, which include (i) verification tests for pure advection (kinematics), (ii) dynamics in the high Weber number limit and (iii) dynamics of surface tension-dominated flows. With respect to (i), the kinematics tests show that the performance of interFoam is generally comparable with the recent algebraic VoF algorithms; however, it is noticeably worse than the geometric reconstruction schemes. For (ii), the simulations of inertia-dominated flows with large density ratios ~O(10^3) yielded excellent agreement with analytical and experimental results. In regime (iii), where surface tension is important, consistency of pressure-surface tension formulation and accuracy of curvature are important, as established by Francois et al (2006 J. Comput. Phys. 213 141-73). Several verification tests were performed along these lines and the main findings are: (a) the algorithm of interFoam ensures a consistent formulation of pressure and surface tension; (b) the curvatures computed by the solver converge to a value slightly (10%) different from the analytical value and a scope for improvement exists in this respect. To reduce the disruptive effects of spurious currents, we followed the analysis of Galusinski and Vigneaux (2008 J. Comput. Phys. 227 6140-64) and arrived at the following criterion for stable capillary simulations for interFoam: Δt ≤ max(10τ_μ, 0.1τ_ρ), where τ_μ = μΔx/σ and τ_ρ = √(ρΔx³/σ). Finally, some capillary flows relevant to atomization were simulated, resulting in good agreement with the results from the literature.
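
    For convenience, the time-step criterion quoted above can be evaluated directly; the sketch below does so, with illustrative (roughly water/air scale) property values that are not taken from the paper.

    ```python
    # Minimal sketch of the capillary time-step criterion quoted above:
    #   dt <= max(10 * tau_mu, 0.1 * tau_rho),
    # with tau_mu = mu * dx / sigma and tau_rho = sqrt(rho * dx**3 / sigma).
    import math

    def capillary_dt_limit(mu, rho, sigma, dx):
        """Return the largest stable time step suggested by the criterion."""
        tau_mu = mu * dx / sigma
        tau_rho = math.sqrt(rho * dx**3 / sigma)
        return max(10.0 * tau_mu, 0.1 * tau_rho)

    # Illustrative values only (viscosity, density, surface tension, cell size):
    print(capillary_dt_limit(mu=1.0e-3, rho=1000.0, sigma=0.072, dx=50e-6))
    ```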

  7. The Electronic View Box: a software tool for radiation therapy treatment verification.

    PubMed

    Bosch, W R; Low, D A; Gerber, R L; Michalski, J M; Graham, M V; Perez, C A; Harms, W B; Purdy, J A

    1995-01-01

    We have developed a software tool for interactively verifying treatment plan implementation. The Electronic View Box (EVB) tool copies the paradigm of current practice but does so electronically. A portal image (online portal image or digitized port film) is displayed side by side with a prescription image (digitized simulator film or digitally reconstructed radiograph). The user can measure distances between features in prescription and portal images and "write" on the display, either to approve the image or to indicate required corrective actions. The EVB tool also provides several features not available in conventional verification practice using a light box. The EVB tool has been written in ANSI C using the X window system. The tool makes use of the Virtual Machine Platform and Foundation Library specifications of the NCI-sponsored Radiation Therapy Planning Tools Collaborative Working Group for portability into an arbitrary treatment planning system that conforms to these specifications. The present EVB tool is based on an earlier Verification Image Review tool, but with a substantial redesign of the user interface. A graphical user interface prototyping system was used in iteratively refining the tool layout to allow rapid modifications of the interface in response to user comments. Features of the EVB tool include 1) hierarchical selection of digital portal images based on physician name, patient name, and field identifier; 2) side-by-side presentation of prescription and portal images at equal magnification and orientation, and with independent grayscale controls; 3) "trace" facility for outlining anatomical structures; 4) "ruler" facility for measuring distances; 5) zoomed display of corresponding regions in both images; 6) image contrast enhancement; and 7) communication of portal image evaluation results (approval, block modification, repeat image acquisition, etc.). The EVB tool facilitates the rapid comparison of prescription and portal images and permits electronic communication of corrections in port shape and positioning.

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT FOR THE REDUCTION OF NITROGEN IN DOMESTIC WASTEWATER FROM INDIVIDUAL HOMES, F. R. MAHONEY & ASSOC., AMPHIDROME SYSTEM FOR SINGLE FAMILY HOMES - 02/05/WQPC-SWP

    EPA Science Inventory

    Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary sewerage from the base residential housing w...

  9. Inverse probability weighting estimation of the volume under the ROC surface in the presence of verification bias.

    PubMed

    Zhang, Ying; Alonzo, Todd A

    2016-11-01

    In diagnostic medicine, the volume under the receiver operating characteristic (ROC) surface (VUS) is a commonly used index to quantify the ability of a continuous diagnostic test to discriminate between three disease states. In practice, verification of the true disease status may be performed only for a subset of subjects under study since the verification procedure is invasive, risky, or expensive. The selection for disease examination might depend on the results of the diagnostic test and other clinical characteristics of the patients, which in turn can cause bias in estimates of the VUS. This bias is referred to as verification bias. Existing verification bias correction in three-way ROC analysis focuses on ordinal tests. We propose verification bias-correction methods to construct ROC surface and estimate the VUS for a continuous diagnostic test, based on inverse probability weighting. By applying U-statistics theory, we develop asymptotic properties for the estimator. A Jackknife estimator of variance is also derived. Extensive simulation studies are performed to evaluate the performance of the new estimators in terms of bias correction and variance. The proposed methods are used to assess the ability of a biomarker to accurately identify stages of Alzheimer's disease. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
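
    As an illustration of the inverse-probability-weighting idea described above (and only that; it is not the authors' exact estimator, variance theory, or Jackknife procedure), the sketch below computes a Horvitz-Thompson-weighted empirical VUS for three ordered classes, assuming the verification probabilities are already known or modelled.

    ```python
    # Hedged sketch: inverse-probability-weighted empirical VUS for a continuous
    # test with three ordered disease classes. Each verified subject is weighted
    # by 1/pi_i, the probability of having been verified.
    import numpy as np

    def ipw_vus(scores, classes, verify_prob, verified):
        """scores: continuous test values; classes: 1, 2 or 3 (meaningful only
        where verified); verify_prob: P(verified | test, covariates);
        verified: boolean mask of subjects whose true class was ascertained."""
        scores = np.asarray(scores, dtype=float)
        classes = np.asarray(classes)
        verify_prob = np.asarray(verify_prob, dtype=float)
        verified = np.asarray(verified, dtype=bool)

        w = np.zeros(len(scores))
        w[verified] = 1.0 / verify_prob[verified]          # Horvitz-Thompson weights

        idx = {k: np.where(verified & (classes == k))[0] for k in (1, 2, 3)}
        num = den = 0.0
        for i in idx[1]:
            for j in idx[2]:
                for k in idx[3]:
                    wijk = w[i] * w[j] * w[k]
                    den += wijk
                    if scores[i] < scores[j] < scores[k]:  # correctly ordered triple
                        num += wijk
        return num / den if den > 0 else float("nan")

    # Toy example with made-up data: two subjects per class, all verified with pi = 0.8.
    s = [0.2, 0.3, 0.5, 0.6, 0.8, 0.9]; c = [1, 1, 2, 2, 3, 3]
    print(ipw_vus(s, c, [0.8] * 6, [True] * 6))   # -> 1.0 (perfect ordering)
    ```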

  10. Microgravity Acceleration Measurement System (MAMS) Flight Configuration Verification and Status

    NASA Technical Reports Server (NTRS)

    Wagar, William

    2000-01-01

    The Microgravity Acceleration Measurement System (MAMS) is a precision spaceflight instrument designed to measure and characterize the microgravity environment existing in the US Lab Module of the International Space Station. Both vibratory and quasi-steady triaxial acceleration data are acquired and provided to an Ethernet data link. The MAMS Double Mid-Deck Locker (DMDL) EXPRESS Rack payload meets all the ISS IDD and ICD interface requirements, as discussed in the paper, which also presents flight configuration illustrations. The overall MAMS sensor and data acquisition performance and verification data are presented, in addition to a discussion of the Command and Data Handling features implemented via the ISS downlink and the GRC Telescience Center displays.

  11. Verification bias: an underrecognized source of error in assessing the efficacy of medical imaging.

    PubMed

    Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B

    2011-03-01

    Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias," and is a common problem in imaging research. The purpose of our study is to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present and searched for author recognition of verification bias in the design. During 3 years, these journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was respectively 36.4%, 23.4%, 29.5%, and 13.4% for AJR, Academic Radiology, Radiology, and EJR. The total fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged. Published by Elsevier Inc.

  12. Verification and Validation of Digitally Upgraded Control Rooms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Lau, Nathan

    2015-09-01

    As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation, which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design, early in the design cycle using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice: the propensity for late-stage V&V and the use of increasingly complex psychological assessment measures for V&V.

  13. Certification and verification for Northrup model NSC-01-0732 fresnel lens concentrating solar collector

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Structural analysis and certification of the collector system is presented. System verification against the interim performance criteria is presented and indicated by matrices. The verification discussion, analysis, and test results are also given.

  14. Test and analysis procedures for updating math models of Space Shuttle payloads

    NASA Technical Reports Server (NTRS)

    Craig, Roy R., Jr.

    1991-01-01

    Over the next decade or more, the Space Shuttle will continue to be the primary transportation system for delivering payloads to Earth orbit. Although a number of payloads have already been successfully carried by the Space Shuttle in the payload bay of the Orbiter vehicle, there continues to be a need for evaluation of the procedures used for verifying and updating the math models of the payloads. The verified payload math model is combined with an Orbiter math model for the coupled-loads analysis, which is required before any payload can fly. Several test procedures were employed to obtain data for verifying payload math models and for carrying out the updating of those models. Research was directed at the evaluation of test/update procedures for use in the verification of Space Shuttle payload math models. The following research tasks are summarized: (1) a study of free-interface test procedures; (2) a literature survey and evaluation of model update procedures; and (3) the design and construction of a laboratory payload simulator.

  15. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed for code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections in solvers of the advection-diffusion-reaction (ADR) equation, such as nonlinear advection, diffusion or source terms, as well as non-constant-coefficient equations. After that, we provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. Then, we use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite to uncover any imperfection in transport solvers via a hierarchical increase in test complexity. The test suite includes hundreds of unit tests and system tests that check individual portions of the code. Examples in the suite start by testing a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experiences in finding several errors that were not detectable with routine verification techniques.
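
    The abstract does not give code; as a generic illustration of the mesh-convergence checks such a suite automates, the sketch below computes the observed order of accuracy from errors on two grids and compares it against an expected order. The tolerance and numbers are chosen only for illustration.

    ```python
    # Minimal sketch of a grid-refinement check: compare the error against an
    # exact (MES) solution on two grids and report the observed order of accuracy,
    #   p_obs = ln(e_coarse / e_fine) / ln(r),   r = dx_coarse / dx_fine.
    import math

    def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
        return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

    def check_convergence(err_coarse, err_fine, expected_order, tol=0.2):
        """Pass if the observed order is within tol of the expected order."""
        p = observed_order(err_coarse, err_fine)
        return abs(p - expected_order) <= tol, p

    ok, p = check_convergence(err_coarse=4.1e-3, err_fine=1.05e-3, expected_order=2.0)
    print(f"observed order = {p:.2f}, pass = {ok}")
    ```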

  16. Test/QA Plan For Verification Of Anaerobic Digester For Energy Production And Pollution Prevention

    EPA Science Inventory

    The ETV-ESTE Program conducts third-party verification testing of commercially available technologies that improve the environmental conditions in the U.S. A stakeholder committee of buyers and users of such technologies guided the development of this test on anaerobic digesters...

  17. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

    The current philosophy of the GSFC regarding environmental verification of Shuttle payloads is reviewed. In the structures area, increased emphasis will be placed on the use of analysis for design verification, with selective testing performed as necessary. Furthermore, as a result of recent cost optimization analysis, the multitier test program will presumably give way to a comprehensive test program at the major payload subassembly level after adequate workmanship at the component level has been verified. In the thermal vacuum area, thought is being given to modifying the approaches used for conventional spacecraft.

  18. The role of the real-time simulation facility, SIMFAC, in the design, development and performance verification of the Shuttle Remote Manipulator System (SRMS) with man-in-the-loop

    NASA Technical Reports Server (NTRS)

    Mccllough, J. R.; Sharpe, A.; Doetsch, K. H.

    1980-01-01

    The SIMFAC has played a vital role in the design, development, and performance verification of the shuttle remote manipulator system (SRMS) to be installed in the space shuttle orbiter. The facility provides for realistic man-in-the-loop operation of the SRMS by an operator in the operator complex, a flightlike crew station patterned after the orbiter aft flight deck with all necessary man-machine interface elements, including SRMS displays and controls and simulated out-of-the-window and CCTV scenes. The characteristics of the manipulator system, including arm and joint servo dynamics and control algorithms, are simulated by a comprehensive mathematical model within the simulation subsystem of the facility. Major studies carried out using SIMFAC include: SRMS parameter sensitivity evaluations; the development, evaluation, and verification of operating procedures; and malfunction simulation and analysis of malfunction performance. Among the most important and comprehensive man-in-the-loop simulations carried out to date on SIMFAC are those which support SRMS performance verification and certification when the SRMS is part of the integrated orbiter-manipulator system.

  19. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    This user's guide to the verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools, since they were not updated and are not currently active. Additionally, the current descriptions of the AED V&V tools are included to augment the information in NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream, which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 print file.

  20. 65nm RadSafe™ Technology for RC64 and Advanced SOCs

    NASA Astrophysics Data System (ADS)

    Liran, Tuvia; Ginosar, Ran; Lange, Fredy; Mandler, Alberto; Aviely, Peleg; Meirov, Henri; Goldberg, Michael; Meister, Zeev; Oliel, Mickey

    2015-09-01

    The trend of scaling in microelectronics provides certain advantages for space components, as well as some challenges. It enables highly integrated, high-performance ASICs, reducing power, area and weight. Scaling also improves immunity to TID and SEL in most cases, but it increases the soft error rate significantly. Ramon Chips adopted the 65nm technology for implementing RC64 [1,2], a 64-core DSP for space applications, and for future products. The 65nm process node is widely used, very mature, and supported by a wide range of IP providers. Thus the need for full-custom design of cores and IPs is minimized, and radiation hardening is achievable by mitigating the radiation effects on the available IPs and by developing proprietary IPs only to complement them. The RadSafe_65TM technology includes hardened standard cell and I/O libraries, methods for mitigating radiation effects in COTS IP cores (SRAM, PLL, SERDES, DDR2/3 interface), and unique cores for monitoring radiation effects and junction temperature. We developed RADIC6, a technology development vehicle, for verification of all hard cores and of the methodologies and design flow required for RC64. RADIC6 includes the test structures for characterizing the IP cores' immunity to all radiation effects. This paper describes the main elements and IP cores of RadSafe_65TM, as well as the contents of the RADIC6 test chip.

  1. Visual exploration and analysis of human-robot interaction rules

    NASA Astrophysics Data System (ADS)

    Zhang, Hui; Boyles, Michael J.

    2013-01-01

    We present a novel interaction paradigm for the visual exploration, manipulation and analysis of human-robot interaction (HRI) rules; our development is implemented using a visual programming interface and exploits key techniques drawn from both information visualization and visual data mining to facilitate the interaction design and knowledge discovery process. HRI is often concerned with manipulations of multi-modal signals, events, and commands that form various kinds of interaction rules. Depicting, manipulating and sharing such design-level information is a compelling challenge. Furthermore, the closed loop between HRI programming and knowledge discovery from empirical data is a relatively long cycle. This, in turn, makes design-level verification nearly impossible to perform in an earlier phase. In our work, we exploit a drag-and-drop user interface and visual languages to support depicting responsive behaviors from social participants when they interact with their partners. For our principal test case of gaze-contingent HRI interfaces, this permits us to program and debug the robots' responsive behaviors through a graphical data-flow chart editor. We exploit additional program manipulation interfaces to provide still further improvement to our programming experience: by simulating the interaction dynamics between a human and a robot behavior model, we allow the researchers to generate, trace and study the perception-action dynamics with a social interaction simulation to verify and refine their designs. Finally, we extend our visual manipulation environment with a visual data-mining tool that allows the user to investigate interesting phenomena such as joint attention and sequential behavioral patterns from multiple multi-modal data streams. We have created instances of HRI interfaces to evaluate and refine our development paradigm. As far as we are aware, this paper reports the first program manipulation paradigm that integrates visual programming interfaces, information visualization, and visual data mining methods to facilitate designing, comprehending, and evaluating HRI interfaces.

  2. Certification and verification for Northrup Model NSC-01-0732 Fresnel lens concentrating solar collector

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1979-03-01

    The certification and verification of the Northrup Model NSC-01-0732 Fresnel lens tracking solar collector are presented. A certification statement is included with signatures and a separate report on the structural analysis of the collector system. System verification against the Interim Performance Criteria are indicated by matrices with verification discussion, analysis, and enclosed test results.

  3. A RESEARCH DATABASE FOR IMPROVED DATA MANAGEMENT AND ANALYSIS IN LONGITUDINAL STUDIES

    PubMed Central

    BIELEFELD, ROGER A.; YAMASHITA, TOYOKO S.; KEREKES, EDWARD F.; ERCANLI, EHAT; SINGER, LYNN T.

    2014-01-01

    We developed a research database for a five-year prospective investigation of the medical, social, and developmental correlates of chronic lung disease during the first three years of life. We used the Ingres database management system and the Statit statistical software package. The database includes records containing 1300 variables each, the results of 35 psychological tests, each repeated five times (providing longitudinal data on the child, the parents, and behavioral interactions), both raw and calculated variables, and both missing and deferred values. The four-layer menu-driven user interface incorporates automatic activation of complex functions to handle data verification, missing and deferred values, static and dynamic backup, determination of calculated values, display of database status, reports, bulk data extraction, and statistical analysis. PMID:7596250

  4. DFX via the Internet

    NASA Astrophysics Data System (ADS)

    Wagner, Rick; Castanotto, Giuseppe; Goldberg, Kenneth A.

    1995-11-01

    The Internet offers tremendous potential for rapid development of mechanical products to meet global competition. In the past several years, a number of geometric algorithms have been developed to evaluate manufacturing properties such as feedability, fixturability, assemblability, etc. This class of algorithms is sometimes termed `DFX: Design for X'. One problem is that most of these algorithms are tailored to a particular CAD system and format and so have not been widely tested by industry. The World Wide Web may offer a solution: its simple interface language may become a de facto standard for the exchange of geometric data. In this preliminary paper we describe one model for remote analysis of CAD models that we believe holds promise for use in industry (e.g. during the design cycle) and in research (e.g. to encourage verification of results).

  5. KSC-2013-2205

    NASA Image and Video Library

    2013-04-25

    VANDENBERG AIR FORCE BASE, Calif. -- The Interface Region Imaging Spectrograph, or IRIS, is being readied for mating to the Orbital Sciences Corp. Pegasus XL rocket that will launch the spacecraft. A fairing will be fitted to the nose of the Pegasus to protect the spacecraft from atmospheric heating and stress during launch. Upcoming work includes electrical verification testing. IRIS will open a new window of discovery by tracing the flow of energy and plasma through the chromosphere and transition region into the sun’s corona using spectrometry and imaging. IRIS fills a crucial gap in our ability to advance studies of the sun-to-Earth connection by tracing the flow of energy and plasma through the foundation of the corona and the region around the sun known as the heliosphere. Photo credit: VAFB/Randy Beaudoin

  6. KSC-2013-2206

    NASA Image and Video Library

    2013-04-25

    VANDENBERG AIR FORCE BASE, Calif. -- The Interface Region Imaging Spectrograph, or IRIS, is being readied for mating to the Orbital Sciences Corp. Pegasus XL rocket that will launch the spacecraft. A fairing will be fitted to the nose of the Pegasus to protect the spacecraft from atmospheric heating and stress during launch. Upcoming work includes electrical verification testing. IRIS will open a new window of discovery by tracing the flow of energy and plasma through the chromosphere and transition region into the sun’s corona using spectrometry and imaging. IRIS fills a crucial gap in our ability to advance studies of the sun-to-Earth connection by tracing the flow of energy and plasma through the foundation of the corona and the region around the sun known as the heliosphere. Photo credit: VAFB/Randy Beaudoin

  7. KSC-2013-2203

    NASA Image and Video Library

    2013-04-25

    VANDENBERG AIR FORCE BASE, Calif. -- The Interface Region Imaging Spectrograph, or IRIS, is being readied for mating to the Orbital Sciences Corp. Pegasus XL rocket that will launch the spacecraft. IRIS will be covered in a fairing after it's connected to the nose of the Pegasus to protect the spacecraft from atmospheric heating and stress during launch. Upcoming work includes electrical verification testing. IRIS will open a new window of discovery by tracing the flow of energy and plasma through the chromosphere and transition region into the sun’s corona using spectrometry and imaging. IRIS fills a crucial gap in our ability to advance studies of the sun-to-Earth connection by tracing the flow of energy and plasma through the foundation of the corona and the region around the sun known as the heliosphere. Photo credit: VAFB/Randy Beaudoin

  8. KSC-2013-2207

    NASA Image and Video Library

    2013-04-25

    VANDENBERG AIR FORCE BASE, Calif. -- The Interface Region Imaging Spectrograph, or IRIS, is being readied for mating to the Orbital Sciences Corp. Pegasus XL rocket that will launch the spacecraft. A fairing will be fitted to the nose of the Pegasus to protect the spacecraft from atmospheric heating and stress during launch. Upcoming work includes electrical verification testing. IRIS will open a new window of discovery by tracing the flow of energy and plasma through the chromosphere and transition region into the sun’s corona using spectrometry and imaging. IRIS fills a crucial gap in our ability to advance studies of the sun-to-Earth connection by tracing the flow of energy and plasma through the foundation of the corona and the region around the sun known as the heliosphere. Photo credit: VAFB/Randy Beaudoin

  9. Software Verification of Orion Cockpit Displays

    NASA Technical Reports Server (NTRS)

    Biswas, M. A. Rafe; Garcia, Samuel; Prado, Matthew; Hossain, Sadad; Souris, Matthew; Morin, Lee

    2017-01-01

    NASA's latest spacecraft, Orion, is being developed to take humans deeper into space. Orion is equipped with three main displays to monitor and control the spacecraft. To ensure the software behind the glass displays operates without faults, rigorous testing is needed. To conduct such testing, the Rapid Prototyping Lab at NASA's Johnson Space Center, along with the University of Texas at Tyler, employed a software verification tool, EggPlant Functional by TestPlant. It is an image-based test automation tool that allows users to create scripts to verify the functionality within a program. An edge key framework and a set of Common EggPlant Functions were developed to enable efficient script creation. This framework standardized the way user inputs are coded and simulated in the verification process. Moreover, the Common EggPlant Functions can be reused in the verification of different displays.
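
    The abstract does not include code, and the sketch below is deliberately not EggPlant/SenseTalk syntax; it only illustrates the "common functions" idea of wrapping recurring image-based checks behind shared helpers. find_on_screen and verify_indicator are hypothetical stand-ins, not part of any real tool's API.

    ```python
    # Hypothetical sketch of a shared verification-helper library: individual
    # display-test scripts call the same small set of helpers so checks stay
    # short and uniform. find_on_screen() is a stub standing in for whatever
    # image-search primitive the real automation tool provides.
    from dataclasses import dataclass

    @dataclass
    class VerificationResult:
        name: str
        passed: bool

    def find_on_screen(image_name: str) -> bool:
        """Stub for an image-search primitive; always 'finds' the image here."""
        return True

    def verify_indicator(name: str, image_name: str, results: list) -> None:
        """Common helper: look for an expected indicator image and record the outcome."""
        results.append(VerificationResult(name, find_on_screen(image_name)))

    results: list = []
    verify_indicator("cabin fan ON lamp", "cabin_fan_on.png", results)
    verify_indicator("O2 valve OPEN lamp", "o2_valve_open.png", results)
    print(all(r.passed for r in results), results)
    ```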

  10. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  11. 40 CFR 1066.215 - Summary of verification and calibration procedures for chassis dynamometers.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer... manufacturer instructions and good engineering judgment. (c) Automated dynamometer verifications and... accomplish the verifications and calibrations specified in this subpart. You may use these automated...

  12. WORDGRAPH: Keyword-in-Context Visualization for NETSPEAK's Wildcard Search.

    PubMed

    Riehmann, Patrick; Gruendl, Henning; Potthast, Martin; Trenkmann, Martin; Stein, Benno; Froehlich, Benno

    2012-09-01

    The WORDGRAPH helps writers choose phrases visually while writing a text. It checks the commonness of phrases and allows the retrieval of alternatives by means of wildcard queries. To support such queries, we implement a scalable retrieval engine, which returns high-quality results within milliseconds using a probabilistic retrieval strategy. The results are displayed as a WORDGRAPH visualization or as a textual list. The graphical interface provides an effective means for interactive exploration of search results using filter techniques, query expansion, and navigation. Our observations indicate that, of the three investigated retrieval tasks, the textual interface is sufficient for the phrase verification task, whereas both interfaces support context-sensitive word choice, and the WORDGRAPH best supports the exploration of a phrase's context or the underlying corpus. Our user study confirms these observations and shows that the WORDGRAPH is generally preferred over the textual result list for queries containing multiple wildcards.
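
    As a toy illustration of the wildcard phrase queries described above (not Netspeak's actual engine or index), the sketch below ranks matching phrases from a tiny invented n-gram table; "?" stands for one word and "*" for one or more words.

    ```python
    # Hedged sketch of wildcard phrase matching over an n-gram frequency table.
    # The table and counts are invented; a real system would query a large corpus.
    import re

    ngram_counts = {
        "waiting for your reply": 152000,
        "waiting for your response": 98000,
        "waiting on your reply": 4100,
    }

    def wildcard_query(pattern: str, table: dict) -> list:
        """Return (phrase, count) pairs matching the pattern, most frequent first."""
        regex = re.compile(
            "^" + r"\s+".join(
                r"\S+" if tok == "?"                     # one word
                else r"\S+(?:\s+\S+)*" if tok == "*"     # one or more words
                else re.escape(tok)
                for tok in pattern.split()
            ) + "$"
        )
        hits = [(phrase, n) for phrase, n in table.items() if regex.match(phrase)]
        return sorted(hits, key=lambda x: -x[1])

    print(wildcard_query("waiting ? your reply", ngram_counts))
    ```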

  13. Options and Risk for Qualification of Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)

    2002-01-01

    Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of system definition. Additionally, for each level of qualification the system-level risk implications will be developed. Also, the paper will explore the implications of analysis versus test for various levels of system definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk and value return. A successful verification program should establish controls and define the objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors that may impact options for system verification.

  14. Environmental Technology Verification--Baghouse Filtration Products: GE Energy QG061 Filtration Media (Tested September 2008)

    EPA Science Inventory

    This report reviews the filtration and pressure drop performance of GE Energy's QG061 filtration media. Environmental Technology Verification (ETV) testing of this technology/product was conducted during a series of tests in September 2008. The objective of the ETV Program is to ...

  15. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
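
    The guideline itself is not reproduced here; as a generic illustration of the Weibull statistics it refers to, the sketch below evaluates the weakest-link failure probability with a volume-scaling term, using placeholder numbers rather than guideline values.

    ```python
    # Hedged sketch of weakest-link Weibull scaling, commonly used to transfer
    # coupon (elementary-test) strength data to a full-scale ceramic part:
    #   P_f = 1 - exp( -(V / V0) * (sigma / sigma0)**m )
    # with Weibull modulus m and characteristic strength sigma0 from coupons of
    # reference volume V0. Numbers below are placeholders only.
    import math

    def weibull_failure_probability(sigma, m, sigma0, v_ratio=1.0):
        """Failure probability at uniform stress sigma for an effective volume
        v_ratio = V / V0 times the reference coupon volume."""
        return 1.0 - math.exp(-v_ratio * (sigma / sigma0) ** m)

    # e.g. a part with 50x the coupon effective volume, loaded to 60% of sigma0:
    print(weibull_failure_probability(sigma=0.6, m=10.0, sigma0=1.0, v_ratio=50.0))
    ```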

  16. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  17. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  18. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  19. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  20. 49 CFR Appendix D to Part 229 - Criteria for Certification of Crashworthy Event Recorder Memory Module

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... that the ERMM meets the performance criteria contained in this appendix and that test verification data are available to a railroad or to FRA upon request. 2. The test verification data shall contain, at a minimum, all pertinent original data logs and documentation that the test sample preparation, test set up...

  1. Aircraft/Stores Compatibility Symposium Proceedings, Held at Arlington, Virginia on 2-4 September 1975. Volume 2

    DTIC Science & Technology

    1975-01-01

    initialization. The switching section must switch high-bandwidth analog signals (i.e., video) and route certain hardwired functions. These functions are... definition of the electrical and mechanical interface required to mate the store with the F-4J... b. Analysis and wind tunnel verification of

  2. Shuttle payload interface verification equipment study. Volume 3: Specification data

    NASA Technical Reports Server (NTRS)

    1976-01-01

    A complete description is given of the IVE physical and performance design requirements as evolved in this study. The data are presented in a format to facilitate the development of an item specification. Data were used to support the development of the project plan data (schedules, cost, etc.) contained in Volume 4 of this report.

  3. Spacelab, Spacehab, and Space Station Freedom payload interface projects

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    1992-01-01

    Contributions were made to several projects. Howard Nguyen was assisted in developing the Space Station RPS (Rack Power Supply). The RPS is a computer controlled power supply that helps test equipment used for experiments before the equipment is installed on Space Station Freedom. Ron Bennett of General Electric Government Services was assisted in the design and analysis of the Standard Interface Rack Controller hardware and software. An analysis was made of the GPIB (General Purpose Interface Bus), looking for any potential problems while transmitting data across the bus, such as the interaction of the bus controller with a data talker and its listeners. An analysis was made of GPIB bus communications in general, including any negative impact the bus may have on transmitting data back to Earth. A study was made of transmitting digital data back to Earth over a video channel. A report was written about the study and a revised version of the report will be submitted for publication. Work was started on the design of a PC/AT compatible circuit board that will combine digital data with a video signal. Another PC/AT compatible circuit board is being designed to recover the digital data from the video signal. A proposal was submitted to support the continued development of the interface boards after the author returns to Memphis State University in the fall. A study was also made of storing circuit board design software and data on the hard disk server of a LAN (Local Area Network) that connects several IBM style PCs. A report was written that makes several recommendations. A preliminary design review was started of the AIVS (Automatic Interface Verification System). The summer was over before any significant contribution could be made to this project.

  4. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, workers in cherry pickers (left and right) help direct the offloading of the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  5. MPLM Donatello is offloaded at the SLF

    NASA Technical Reports Server (NTRS)

    2001-01-01

    At the Shuttle Landing Facility, cranes are poised to help offload the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.

  6. Benchmarking on Tsunami Currents with ComMIT

    NASA Astrophysics Data System (ADS)

    Sharghi vand, N.; Kanoglu, U.

    2015-12-01

    Before the 2004 Indian Ocean tsunami there were no standards for the validation and verification of tsunami numerical models, even though numerical models were being used for inundation mapping, evaluation of critical structures, and similar efforts without validation and verification. After 2004, the NOAA Center for Tsunami Research (NCTR) established standards for the validation and verification of tsunami numerical models (Synolakis et al. 2008 Pure Appl. Geophys. 165, 2197-2228), which are applied in the evaluation of critical structures such as nuclear power plants against tsunami attack. NCTR presented analytical, experimental, and field benchmark problems aimed at estimating maximum runup, and these have been widely accepted by the community. Recently, additional benchmark problems were suggested by the US National Tsunami Hazard Mitigation Program Mapping & Modeling Benchmarking Workshop: Tsunami Currents, held February 9-10, 2015 in Portland, Oregon, USA (http://nws.weather.gov/nthmp/index.html). These benchmark problems concentrate on validation and verification of tsunami numerical models with respect to tsunami currents. Three of the benchmark problems were: current measurements of the 2011 Japan tsunami in Hilo Harbor, Hawaii, USA and in Tauranga Harbor, New Zealand, and a single long-period wave propagating onto a small-scale experimental model of the town of Seaside, Oregon, USA. These benchmark problems were implemented in the Community Modeling Interface for Tsunamis (ComMIT) (Titov et al. 2011 Pure Appl. Geophys. 168, 2121-2131), a user-friendly interface, developed by NCTR, to the validated and verified Method of Splitting Tsunami (MOST) model (Titov and Synolakis 1995 J. Waterw. Port Coastal Ocean Eng. 121, 308-316). The modeling results are compared with the required benchmark data, show good agreement, and are discussed. Acknowledgment: The research leading to these results has received funding from the European Union's Seventh Framework Programme (FP7/2007-2013) under grant agreement no 603839 (Project ASTARTE - Assessment, Strategy and Risk Reduction for Tsunamis in Europe)
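
    As a minimal illustration of how a modeled current record can be scored against benchmark measurements, the sketch below computes a normalized RMSE between two short time series. The metric choice and the numbers are illustrative and are not taken from the NTHMP benchmark specifications.

    ```python
    import numpy as np

    def nrmse(modeled, observed):
        """Root-mean-square error normalized by the observed range, a simple
        way to summarize agreement between modeled and measured currents."""
        modeled, observed = np.asarray(modeled), np.asarray(observed)
        rmse = np.sqrt(np.mean((modeled - observed) ** 2))
        return rmse / (observed.max() - observed.min())

    # Hypothetical harbor current speeds (m/s) sampled at matching times
    observed = np.array([0.10, 0.45, 0.80, 0.62, 0.30])
    modeled = np.array([0.12, 0.40, 0.74, 0.66, 0.27])
    print(f"NRMSE = {nrmse(modeled, observed):.2%}")
    ```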

  7. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages, representing 30 percent of the total mass, are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity, discovered in the course of the test, necessitated running additional tests at unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test-verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.
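
    Test-analysis correlation of this kind is commonly quantified with the Modal Assurance Criterion (MAC); the abstract does not state which correlation metric was used, so the sketch below is only an illustration with hypothetical mode shapes.

    ```python
    import numpy as np

    def mac(phi_test, phi_fem):
        """Modal Assurance Criterion between a test mode shape and an
        analytical (FEM) mode shape; values near 1.0 indicate strong correlation."""
        num = np.abs(phi_test.conj() @ phi_fem) ** 2
        den = (phi_test.conj() @ phi_test) * (phi_fem.conj() @ phi_fem)
        return float(np.real(num / den))

    # Hypothetical 4-DOF mode shapes from test and from the pre-test model
    phi_t = np.array([1.00, 0.72, 0.41, 0.10])
    phi_a = np.array([0.98, 0.75, 0.38, 0.12])
    print(f"MAC = {mac(phi_t, phi_a):.3f}")
    ```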

  8. VERIFYING THE VOC CONTROL PERFORMANCE OF BIOREACTORS

    EPA Science Inventory

    The paper describes the verification testing approach used to collect high-quality, peer-reviewed data on the performance of bioreaction-based technologies for the control of volatile organic compounds (VOCs). The verification protocol that describes the approach for these tests ...

  9. BAGHOUSE FILTRATION PRODUCTS VERIFICATION TESTING, HOW IT BENEFITS THE BOILER BAGHOUSE OPERATOR

    EPA Science Inventory

    The paper describes the Environmental Technology Verification (ETV) Program for baghouse filtration products developed by the Air Pollution Control Technology Verification Center, one of six Centers under the ETV Program, and discusses how it benefits boiler baghouse operators. A...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION--GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    Under EPA's Environmental Technology Verification Program, Research Triangle Institute (RTI) will operate the Air Pollution Control Technology Center to verify the filtration efficiency and bioaerosol inactivation efficiency of heating, ventilation and air conditioning air cleane...

  11. Certification and verification for Calmac flat plate solar collector

    NASA Technical Reports Server (NTRS)

    1978-01-01

    Information used in the certification and verification of the Calmac Flat Plate Collector is presented. Contained are such items as test procedures and results, information on materials used, installation, operation, and maintenance manuals, and other information pertaining to the verification and certification.

  12. THIRD PARTY TECHNOLOGY PERFORMANCE VERIFICATION DATA FROM A STAKEHOLDER-DRIVEN TECHNOLOGY TESTING PROGRAM

    EPA Science Inventory

    The Greenhouse Gas (GHG) Technology Verification Center is one of 12 independently operated verification centers established by the U.S. Environmental Protection Agency. The Center provides third-party performance data to stakeholders interested in environmental technologies tha...

  13. 40 CFR 1066.275 - Daily dynamometer readiness verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.275 Daily... automated process for this verification procedure, perform this evaluation by setting the initial speed and... your dynamometer does not perform this verification with an automated process: (1) With the dynamometer...

  14. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for formalizing natural-language requirements, there is no standardized and accepted means of formal property definition to serve as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
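
    As a rough sketch of classification-tree-style test design, the snippet below enumerates concrete test cases from the leaf classes of a few hypothetical classifications; it does not reproduce the CTM/ES notation or the enhancements described in the article.

    ```python
    from itertools import product

    # Hypothetical classifications for a model-in-the-loop test (illustrative only)
    classes = {
        "vehicle_speed": ["low", "nominal", "high"],
        "set_point":     ["below_speed", "equal", "above_speed"],
        "road_grade":    ["flat", "uphill", "downhill"],
    }

    def enumerate_test_cases(classes):
        """Yield one abstract test case per combination of leaf classes,
        in the spirit of classification-tree test design."""
        names = list(classes)
        for combo in product(*(classes[name] for name in names)):
            yield dict(zip(names, combo))

    for case in list(enumerate_test_cases(classes))[:3]:
        print(case)
    ```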

  15. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The design and verification requirements appropriate to hardware at the detail, subassembly, component, and engine levels are defined and correlated to the development demonstrations that provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.
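
    A requirements verification matrix of this kind can be represented as a simple table that maps each requirement to its verification method and closing evidence. The sketch below is illustrative only; the identifiers and entries are made up rather than taken from the SSME program.

    ```python
    # Minimal sketch of a requirements-verification matrix: each requirement is
    # mapped to a verification method and the demonstration that closes it.
    verification_matrix = [
        {"req_id": "HPFTP-001", "requirement": "Deliver rated fuel flow at rated power",
         "method": "test", "evidence": "Component hot-fire test series A"},
        {"req_id": "HPFTP-014", "requirement": "Maintain rotor burst margin at maximum speed",
         "method": "analysis", "evidence": ""},
    ]

    def open_items(matrix):
        """Return requirement IDs that still lack closing verification evidence."""
        return [row["req_id"] for row in matrix if not row["evidence"]]

    print(open_items(verification_matrix))  # -> ['HPFTP-014'] until the analysis is closed
    ```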

  16. Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report

    NASA Technical Reports Server (NTRS)

    Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.

    2017-01-01

    This report provides an overview of and results from the verification of the specifications that define the operational capabilities of the airborne and ground, L-band and C-band, Control and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.

  17. Test/QA plan for the verification testing of alternative or reformulated liquid fuels, fuel additives, fuel emulsions, and lubricants for highway and nonroad use heavy-duty diesel engines

    EPA Science Inventory

    This Environmental Technology Verification Program test/QA plan for heavy-duty diesel engine testing at the Southwest Research Institute’s Department of Emissions Research describes how the Federal Test Procedure (FTP), as listed in 40 CFR Part 86 for highway engines and 40 CFR P...

  18. Knowledge Based Systems (KBS) Verification, Validation, Evaluation, and Testing (VVE&T) Bibliography: Topical Categorization

    DTIC Science & Technology

    2003-03-01

    Different?," Jour. of Experimental & Theoretical Artificial Intelligence, Special Issue on Al for Systems Validation and Verification, 12(4), 2000, pp...Hamilton, D., " Experiences in Improving the State of Practice in Verification and Validation of Knowledge-Based Systems," Workshop Notes of the AAAI...Unsuspected Power of the Standard Turing Test," Jour. of Experimental & Theoretical Artificial Intelligence., 12, 2000, pp3 3 1-3 4 0 . [30] Gaschnig

  19. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, which provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive the Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified only by analysis; consequently a test campaign is defined including mechanical tests to simulate the entry loads, thermal tests in the Mars environment and the simulation of Rover operations on a 'Mars like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) until the final verification close-out of the above requirements with the final verification reports.

  20. Experimental Verification of Integrity of Low-Pressure Injection Piles Structure - Pile Internal Capacity

    NASA Astrophysics Data System (ADS)

    Pachla, Henryk

    2017-12-01

    The idea of strengthening a foundation with injection piles is to transfer loads from the foundation to piles that are anchored in the existing structure and formed in the soil. Such a system has to be able to transfer loads from the foundation to the pile and from the pile onto the soil. The pile structure, often reinforced with a steel element, must also be able to transfer this loading. According to the rules of continuum mechanics, the bearing capacity of such a system and the deformation of its individual elements can be determined through an analysis of the contact problem at three interfaces, each determined by a different couple of materials. These interfaces are: the pile-foundation anchorage, the bond between the reinforcement and the material from which the pile is formed, and the pile-soil interface. What is essential is that at the contact surfaces the deformations of the adhering materials can differ, depending on the mechanical properties and geometry of these surfaces. Engineering practice and experimental research indicate that failure in such structures occurs at the interfaces. The paper concentrates on experiments on the interaction between cement grout and various types of steel reinforcement. The tests were conducted on special low-pressure injection piles that, owing to their formation technology and injection pressure, are widely used to strengthen the foundations of existing historical buildings.

  1. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BAGHOUSE FILTRATION PRODUCTS—SOUTHERN FILTER MEDIA, LLC, PE-16/M-SPES FILTER SAMPLE

    EPA Science Inventory

    The U.S. EPA has created the Environmental Technology Verification program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program tested the performance of baghouse filtrati...

  2. Validation of mesoscale models

    NASA Technical Reports Server (NTRS)

    Kuo, Bill; Warner, Tom; Benjamin, Stan; Koch, Steve; Staniforth, Andrew

    1993-01-01

    The topics discussed include the following: verification of cloud prediction from the PSU/NCAR mesoscale model; results from MAPS/NGM verification comparisons and MAPS observation sensitivity tests to ACARS and profiler data; systematic errors and mesoscale verification for a mesoscale model; and the COMPARE Project and the CME.

  3. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT--BAGHOUSE FILTRATION PRODUCTS, W.L. GORE ASSOC., INC.

    EPA Science Inventory

    The U.S. Environmental Protection Agency Air Pollution Control Technology (APCT) Verification Center evaluates the performance of baghouse filtration products used primarily to control PM2.5 emissions. This verification statement summarizes the test results for W.L. Gore & Assoc....

  4. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Torque transducer verification and...

  5. 40 CFR 1066.240 - Torque transducer verification and calibration.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ...) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification and calibration. Calibrate torque-measurement systems as described in 40 CFR 1065.310. ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Torque transducer verification and...

  6. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  7. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  8. 40 CFR 1066.250 - Base inertia verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...

  9. ETV REPORT AND VERIFICATION STATEMENT - KASELCO POSI-FLO ELECTROCOAGULATION TREATMENT SYSTEM

    EPA Science Inventory

    The Kaselco Electrocoagulation Treatment System (Kaselco system) in combination with an ion exchange polishing system was tested, under actual production conditions, processing metal finishing wastewater at Gull Industries in Houston, Texas. The verification test evaluated the a...

  10. GENERIC VERIFICATION PROTOCOL FOR AQUEOUS CLEANER RECYCLING TECHNOLOGIES

    EPA Science Inventory

    This generic verification protocol has been structured based on a format developed for ETV-MF projects. This document describes the intended approach and explains plans for testing with respect to areas such as test methodology, procedures, parameters, and instrumentation. Also ...

  11. Evaluation of backscatter dose from internal lead shielding in clinical electron beams using EGSnrc Monte Carlo simulations.

    PubMed

    De Vries, Rowen J; Marsh, Steven

    2015-11-08

    Internal lead shielding is utilized during superficial electron beam treatments of the head and neck, such as lip carcinoma. Methods for predicting backscattered dose include the use of empirical equations or performing physical measurements. The accuracy of these empirical equations required verification for the local electron beams. In this study, a Monte Carlo model of a Siemens Artiste linac was developed for 6, 9, 12, and 15 MeV electron beams using the EGSnrc MC package. The model was verified against physical measurements to an accuracy of better than 2% and 2 mm. Multiple MC simulations of lead interfaces at different depths, corresponding to mean electron energies in the range of 0.2-14 MeV at the interfaces, were performed to calculate electron backscatter values. The simulated electron backscatter was compared with current empirical equations to ascertain their accuracy. The major finding was that the current set of backscatter equations does not accurately predict electron backscatter, particularly in the lower energies region. A new equation was derived which enables estimation of electron backscatter factor at any depth upstream from the interface for the local treatment machines. The derived equation agreed to within 1.5% of the MC simulated electron backscatter at the lead interface and upstream positions. Verification of the equation was performed by comparing to measurements of the electron backscatter factor using Gafchromic EBT2 film. These results show a mean value of 0.997 ± 0.022 to 1σ of the predicted values of electron backscatter. The new empirical equation presented can accurately estimate electron backscatter factor from lead shielding in the range of 0.2 to 14 MeV for the local linacs.
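
    The report's derived equation is not reproduced in the abstract, but the general workflow of fitting an empirical electron backscatter factor (EBF) model to Monte Carlo data can be sketched as below. The functional form, parameters, and data points are hypothetical and are not the study's results.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def ebf_model(energy_mev, a, b):
        """Illustrative empirical form for the electron backscatter factor as a
        function of mean electron energy at the lead interface (assumed form)."""
        return 1.0 + a * np.exp(-b * energy_mev)

    # Hypothetical simulated EBF values at the interface (made-up numbers)
    energy = np.array([0.5, 1.0, 2.0, 4.0, 8.0, 12.0])   # MeV
    ebf_mc = np.array([1.62, 1.55, 1.45, 1.33, 1.20, 1.14])

    params, _ = curve_fit(ebf_model, energy, ebf_mc, p0=(0.6, 0.2))
    print("fitted (a, b):", np.round(params, 3))
    ```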

  12. Evaluation of backscatter dose from internal lead shielding in clinical electron beams using EGSnrc Monte Carlo simulations

    PubMed Central

    Marsh, Steven

    2015-01-01

    Internal lead shielding is utilized during superficial electron beam treatments of the head and neck, such as lip carcinoma. Methods for predicting backscattered dose include the use of empirical equations or performing physical measurements. The accuracy of these empirical equations required verification for the local electron beams. In this study, a Monte Carlo model of a Siemens Artiste linac was developed for 6, 9, 12, and 15 MeV electron beams using the EGSnrc MC package. The model was verified against physical measurements to an accuracy of better than 2% and 2 mm. Multiple MC simulations of lead interfaces at different depths, corresponding to mean electron energies in the range of 0.2–14 MeV at the interfaces, were performed to calculate electron backscatter values. The simulated electron backscatter was compared with current empirical equations to ascertain their accuracy. The major finding was that the current set of backscatter equations does not accurately predict electron backscatter, particularly in the lower energies region. A new equation was derived which enables estimation of electron backscatter factor at any depth upstream from the interface for the local treatment machines. The derived equation agreed to within 1.5% of the MC simulated electron backscatter at the lead interface and upstream positions. Verification of the equation was performed by comparing to measurements of the electron backscatter factor using Gafchromic EBT2 film. These results show a mean value of 0.997±0.022 to 1σ of the predicted values of electron backscatter. The new empirical equation presented can accurately estimate electron backscatter factor from lead shielding in the range of 0.2 to 14 MeV for the local linacs. PACS numbers: 87.53.Bn, 87.55.K‐, 87.56.bd PMID:26699566

  13. A new method to address verification bias in studies of clinical screening tests: cervical cancer screening assays as an example.

    PubMed

    Xue, Xiaonan; Kim, Mimi Y; Castle, Philip E; Strickler, Howard D

    2014-03-01

    Studies to evaluate clinical screening tests often face the problem that the "gold standard" diagnostic approach is costly and/or invasive. It is therefore common to verify only a subset of negative screening tests using the gold standard method. However, undersampling the screen negatives can lead to substantial overestimation of the sensitivity and underestimation of the specificity of the diagnostic test. Our objective was to develop a simple and accurate statistical method to address this "verification bias." We developed a weighted generalized estimating equation approach to estimate, in a single model, the accuracy (eg, sensitivity/specificity) of multiple assays and simultaneously compare results between assays while addressing verification bias. This approach can be implemented using standard statistical software. Simulations were conducted to assess the proposed method. An example is provided using a cervical cancer screening trial that compared the accuracy of human papillomavirus and Pap tests, with histologic data as the gold standard. The proposed approach performed well in estimating and comparing the accuracy of multiple assays in the presence of verification bias. The proposed approach is an easy to apply and accurate method for addressing verification bias in studies of multiple screening methods. Copyright © 2014 Elsevier Inc. All rights reserved.
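
    A minimal sketch of the underlying correction, reweighting the verified subjects by the inverse of their verification probability so that screen-negatives are not underrepresented, is shown below. It is not the authors' full weighted GEE formulation, and the data are simulated.

    ```python
    import numpy as np

    def ipw_sensitivity_specificity(screen_pos, disease, verified, p_verify):
        """Inverse-probability-weighted sensitivity and specificity when only a
        subset of subjects receives the gold-standard verification.

        screen_pos : 0/1 screening test result
        disease    : 0/1 gold-standard result (used only where verified == 1)
        verified   : 0/1 indicator that the gold standard was obtained
        p_verify   : verification probability assigned to each subject
        """
        w = verified / p_verify
        sens = np.sum(w * screen_pos * disease) / np.sum(w * disease)
        spec = np.sum(w * (1 - screen_pos) * (1 - disease)) / np.sum(w * (1 - disease))
        return sens, spec

    # Toy design: all screen-positives verified, 20% of screen-negatives verified
    rng = np.random.default_rng(0)
    n = 10_000
    disease = rng.binomial(1, 0.05, n)
    screen_pos = np.where(disease == 1, rng.binomial(1, 0.9, n), rng.binomial(1, 0.1, n))
    p_verify = np.where(screen_pos == 1, 1.0, 0.2)
    verified = rng.binomial(1, p_verify)
    print(ipw_sensitivity_specificity(screen_pos, disease, verified, p_verify))
    ```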

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION COATINGS AND COATING EQUIPMENT PROGRAM (ETV CCEP), FINAL TECHNOLOGY APPLICATIONS GROUP TAGNITE--TESTING AND QUALITY ASSURANCE PLAN (T/QAP)

    EPA Science Inventory

    The overall objective of the Environmental Testing and Verification Coatings and Coating Equipment Program is to verify pollution prevention and performance characteristics of coating technologies and make the results of the testing available to prospective coating technology use...

  15. ANDalyze Lead 100 Test Kit and AND1000 Fluorimeter Environmental Technology Verification Report and Statement

    EPA Science Inventory

    This report provides results for the verification testing of the Lead100/AND1000. The following is a description of the technology based on information provided by the vendor. The information provided below was not verified in this test. The ANDalyze Lead100/AND1000 was des...

  16. Standardized Definitions for Code Verification Test Problems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doebling, Scott William

    This document contains standardized definitions for several commonly used code verification test problems. These definitions are intended to contain sufficient information to set up the test problem in a computational physics code. These definitions are intended to be used in conjunction with exact solutions to these problems generated using ExactPack, www.github.com/lanl/exactpack.

  17. Verification and benchmark testing of the NUFT computer code

    NASA Astrophysics Data System (ADS)

    Lee, K. H.; Nitao, J. J.; Kulshrestha, A.

    1993-10-01

    This interim report presents results of work completed in the ongoing verification and benchmark testing of the NUFT (Nonisothermal Unsaturated-saturated Flow and Transport) computer code. NUFT is a suite of multiphase, multicomponent models for numerical solution of thermal and isothermal flow and transport in porous media, with application to subsurface contaminant transport problems. The code simulates the coupled transport of heat, fluids, and chemical components, including volatile organic compounds. Grid systems may be Cartesian or cylindrical, with one-, two-, or fully three-dimensional configurations possible. In this initial phase of testing, the NUFT code was used to solve seven one-dimensional unsaturated flow and heat transfer problems. Three verification and four benchmarking problems were solved. In the verification testing, excellent agreement was observed between NUFT results and the analytical or quasi-analytical solutions. In the benchmark testing, results of code intercomparison were very satisfactory. From these testing results, it is concluded that the NUFT code is ready for application to field and laboratory problems similar to those addressed here. Multidimensional problems, including those dealing with chemical transport, will be addressed in a subsequent report.
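
    Agreement with an analytical or quasi-analytical solution is typically summarized with a discrete error norm; the sketch below shows one such comparison using hypothetical profiles rather than actual NUFT output.

    ```python
    import numpy as np

    def relative_l2_error(numerical, analytical):
        """Discrete relative L2 error between a code solution and the
        analytical (or quasi-analytical) benchmark at matching nodes."""
        numerical, analytical = np.asarray(numerical), np.asarray(analytical)
        return np.linalg.norm(numerical - analytical) / np.linalg.norm(analytical)

    # Hypothetical 1-D temperature profiles at five node locations
    t_code = np.array([20.0, 24.8, 29.9, 35.2, 40.1])
    t_exact = np.array([20.0, 25.0, 30.0, 35.0, 40.0])
    print(f"relative L2 error = {relative_l2_error(t_code, t_exact):.3%}")
    ```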

  18. Current status of verification practices in clinical biochemistry in Spain.

    PubMed

    Gómez-Rioja, Rubén; Alvarez, Virtudes; Ventura, Montserrat; Alsina, M Jesús; Barba, Núria; Cortés, Mariano; Llopis, María Antonia; Martínez, Cecilia; Ibarz, Mercè

    2013-09-01

    Verification uses logical algorithms to detect potential errors before laboratory results are released to the clinician. Even though verification is one of the main processes in all laboratories, there is a lack of standardization, mainly in the algorithms used and in the criteria and verification limits applied. A survey of clinical laboratories in Spain was conducted in order to assess the verification process, particularly the use of autoverification. Questionnaires were sent to the laboratories involved in the External Quality Assurance Program organized by the Spanish Society of Clinical Biochemistry and Molecular Pathology. Seven common biochemical parameters were included (glucose, cholesterol, triglycerides, creatinine, potassium, calcium, and alanine aminotransferase). Completed questionnaires were received from 85 laboratories. Nearly all the laboratories reported using the following seven verification criteria: internal quality control, instrument warnings, sample deterioration, reference limits, clinical data, concordance between parameters, and verification of results. The use of all verification criteria varied according to the type of verification (automatic, technical, or medical). Verification limits for these parameters are similar to biological reference ranges. Delta Check was used in 24% of laboratories. Most laboratories (64%) reported using autoverification systems. Autoverification use was related to laboratory size, ownership, and type of laboratory information system, but the amount of use (percentage of tests autoverified) was not related to laboratory size. A total of 36% of Spanish laboratories do not use autoverification, despite the general implementation of laboratory information systems, most of which have autoverification capability. Criteria and rules for seven routine biochemical tests were obtained.
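
    As an illustration of the kind of rules the survey asks about, the sketch below combines a verification-limit check with a delta check against the patient's previous result. The analytes, limits, and thresholds are invented for the example and are not the survey's criteria.

    ```python
    # Hold a result for manual review if it falls outside verification limits or
    # fails a delta check against the previous value; otherwise autoverify it.
    VERIFICATION_LIMITS = {"glucose": (2.5, 25.0), "potassium": (2.0, 7.0)}  # mmol/L
    DELTA_LIMITS = {"glucose": 0.50, "potassium": 0.25}  # maximum relative change

    def autoverify(analyte, value, previous=None):
        low, high = VERIFICATION_LIMITS[analyte]
        if not (low <= value <= high):
            return "manual review: outside verification limits"
        if previous is not None:
            if abs(value - previous) / previous > DELTA_LIMITS[analyte]:
                return "manual review: delta check failed"
        return "autoverified"

    print(autoverify("potassium", 5.1, previous=4.8))  # -> autoverified
    print(autoverify("glucose", 14.0, previous=6.0))   # -> manual review: delta check failed
    ```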

  19. Reducing Wrong Patient Selection Errors: Exploring the Design Space of User Interface Techniques

    PubMed Central

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients’ identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed. PMID:25954415

  20. Reducing wrong patient selection errors: exploring the design space of user interface techniques.

    PubMed

    Sopan, Awalin; Plaisant, Catherine; Powsner, Seth; Shneiderman, Ben

    2014-01-01

    Wrong patient selection errors are a major issue for patient safety; from ordering medication to performing surgery, the stakes are high. Widespread adoption of Electronic Health Record (EHR) and Computerized Provider Order Entry (CPOE) systems makes patient selection using a computer screen a frequent task for clinicians. Careful design of the user interface can help mitigate the problem by helping providers recall their patients' identities, accurately select their names, and spot errors before orders are submitted. We propose a catalog of twenty seven distinct user interface techniques, organized according to a task analysis. An associated video demonstrates eighteen of those techniques. EHR designers who consider a wider range of human-computer interaction techniques could reduce selection errors, but verification of efficacy is still needed.

  1. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K; de Frias, Gabriel Jose

    2018-03-01

    Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.
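
    Mesh-convergence checks of this kind are usually summarized by an observed order of accuracy estimated from the errors on successively refined meshes; a minimal sketch with hypothetical error values follows (it is not Sierra/SM tooling).

    ```python
    import math

    def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
        """Observed order of accuracy from error norms on two meshes related by
        a uniform refinement ratio (a Richardson-style estimate)."""
        return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

    # Hypothetical L2 errors versus the analytic solution on meshes of size h and h/2
    print(f"observed order = {observed_order(4.1e-3, 1.05e-3):.2f}")  # ~2 for a second-order scheme
    ```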

  2. Sierra/SolidMechanics 4.48 Verification Tests Manual.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Plews, Julia A.; Crane, Nathan K.; de Frias, Gabriel Jose

    Presented in this document is a small portion of the tests that exist in the Sierra / SolidMechanics (Sierra / SM) verification test suite. Most of these tests are run nightly with the Sierra / SM code suite, and the results of the test are checked versus the correct analytical result. For each of the tests presented in this document, the test setup, a description of the analytic solution, and comparison of the Sierra / SM code results to the analytic solution is provided. Mesh convergence is also checked on a nightly basis for several of these tests. This document can be used to confirm that a given code capability is verified or referenced as a compilation of example problems. Additional example problems are provided in the Sierra / SM Example Problems Manual. Note, many other verification tests exist in the Sierra / SM test suite, but have not yet been included in this manual.

  3. Formally verifying human–automation interaction as part of a system model: limitations and tradeoffs

    PubMed Central

    Bass, Ellen J.

    2011-01-01

    Both the human factors engineering (HFE) and formal methods communities are concerned with improving the design of safety-critical systems. This work discusses a modeling effort that leveraged methods from both fields to perform formal verification of human–automation interaction with a programmable device. This effort utilizes a system architecture composed of independent models of the human mission, human task behavior, human-device interface, device automation, and operational environment. The goals of this architecture were to allow HFE practitioners to perform formal verifications of realistic systems that depend on human–automation interaction in a reasonable amount of time using representative models, intuitive modeling constructs, and decoupled models of system components that could be easily changed to support multiple analyses. This framework was instantiated using a patient-controlled analgesia pump in a two-phase process where models in each phase were verified using a common set of specifications. The first phase focused on the mission, human-device interface, and device automation, and included a simple, unconstrained human task behavior model. The second phase replaced the unconstrained task model with one representing normative pump programming behavior. Because models produced in the first phase were too large for the model checker to verify, a number of model revisions were undertaken that affected the goals of the effort. While the use of human task behavior models in the second phase helped mitigate model complexity, verification time increased. Additional modeling tools and technological developments are necessary for model checking to become a more usable technique for HFE. PMID:21572930

  4. Recovering from "amnesia" brought about by radiation. Verification of the "Over the air" (OTA) application software update mechanism On-Board Solar Orbiter's Energetic Particle Detector

    NASA Astrophysics Data System (ADS)

    Da Silva, Antonio; Sánchez Prieto, Sebastián; Rodriguez Polo, Oscar; Parra Espada, Pablo

    Computer memories are not supposed to forget, but they do. Because of the proximity of the Sun, the Solar Orbiter boot software must watch for permanent memory errors resulting from single-event latch-up (SEL) failures in the application binaries stored in EEPROM and in their SDRAM deployment areas. In this situation, the last line of defense established by the FDIR mechanisms is the capability of the boot software to provide an accurate report of the memory damage and to perform an application software update that avoids the damaged locations by flashing the EEPROM with a new binary. This paper describes the verification of the over-the-air (OTA) EEPROM firmware update procedure of the boot software that will run in the Instrument Control Unit (ICU) of the Energetic Particle Detector (EPD) on board Solar Orbiter. Since the maximum number of rewrites on real EEPROM is limited and permanent memory faults cannot easily be emulated in real hardware, the verification has been accomplished with a LEON2 virtual platform (Leon2ViP) with fault injection capabilities and real SpaceWire interfaces, developed by the Space Research Group (SRG) of the University of Alcalá. In this way it is possible to run exactly the same target binary software as would run on the real ICU platform. Furthermore, this virtual hardware-in-the-loop (VHIL) approach makes it possible to communicate with Electrical Ground Support Equipment (EGSE) through real SpaceWire interfaces in an agile, controlled and deterministic environment.
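
    A minimal sketch of the kind of integrity check a boot loader can apply before deploying an application image is given below. The CRC choice, image layout, and fallback policy are assumptions made for illustration and are not details of the EPD ICU boot software.

    ```python
    import zlib

    def image_is_intact(image_bytes: bytes, expected_crc: int) -> bool:
        """Compare a CRC-32 computed over the stored binary with the value
        recorded when the image was uploaded (e.g., by an OTA update)."""
        return (zlib.crc32(image_bytes) & 0xFFFFFFFF) == expected_crc

    def select_image(primary: bytes, backup: bytes, crc_primary: int, crc_backup: int) -> str:
        """Prefer the primary image; fall back to the backup copy when the
        primary shows permanent memory damage."""
        if image_is_intact(primary, crc_primary):
            return "primary"
        if image_is_intact(backup, crc_backup):
            return "backup"
        return "report damage and wait for a new upload"

    good = b"\x01\x02\x03\x04" * 1024
    bad = bytearray(good)
    bad[100] ^= 0xFF  # simulate a stuck bit in the primary copy
    crc = zlib.crc32(good) & 0xFFFFFFFF
    print(select_image(bytes(bad), good, crc, crc))  # -> backup
    ```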

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: Stormwater Source Area Treatment Device - Arkal Pressurized Stormwater Filtration System

    EPA Science Inventory

    Performance verification testing of the Arkal Pressurized Stormwater Filtration System was conducted under EPA's Environmental Technology Verification Program on a 5.5-acre parking lot and grounds of St. Mary's Hospital in Milwaukee, Wisconsin. The system consists of a water sto...

  6. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6282 Filtration Media (Tested March - April 2011)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  7. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6277 Filtration Media (Tested March 2011)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  8. Environmental Technology Verification: Baghouse filtration products--W.L. Gore & Associates L3650 filtration media (tested November--December 2009)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  9. Environmental Technology Verification Report: Baghouse Filtration Products, Donaldson Company, Inc. Tetratex® 6262 Filtration Media (Tested March 2011)

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  10. 47 CFR 25.132 - Verification of earth station antenna performance standards.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 2 2014-10-01 2014-10-01 false Verification of earth station antenna... Verification of earth station antenna performance standards. (a)(1) Except for applications for 20/30 GHz earth... the antenna manufacturer on representative equipment in representative configurations, and the test...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION, TEST REPORT OF CONTROL OF BIOAEROSOLS IN HVAC SYSTEMS, COLUMBUS INDUSTRIES HIGH EFFICIENCY MINI PLEAT

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  12. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuit solutions based on field-programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing the most probable errors into it and, unlike mutants in high-level programming languages, the corresponding test case is effectively derived based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
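
    The sketch below shows the core mutation-testing loop on a toy combinational function: a candidate test set is judged by how many hand-made mutants it distinguishes from the reference behavior. A real flow would mutate the HDL description and compare synthesized representations, as the paper describes.

    ```python
    # Toy reference behavior (a 2-of-3 majority voter) and two hand-made mutants.
    def reference(a, b, c):
        return (a and b) or (a and c) or (b and c)

    mutants = [
        lambda a, b, c: (a and b) or (a and c) or (b or c),  # AND replaced by OR
        lambda a, b, c: (a and b) or (a and c),              # dropped product term
    ]

    tests = [(0, 0, 1), (1, 0, 1), (1, 1, 0)]  # candidate test set

    def killed(mutant, tests):
        """A mutant is killed if some test makes it disagree with the reference."""
        return any(bool(mutant(*t)) != bool(reference(*t)) for t in tests)

    score = sum(killed(m, tests) for m in mutants) / len(mutants)
    print(f"mutation score = {score:.0%}")  # 50%: the test set misses the second fault
    ```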

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: ENVIROFUELS DIESEL FUEL CATALYZER FUEL ADDITIVE

    EPA Science Inventory

    EPA's Environmental Technology Verification Program has tested EnviroFuels diesel fuel additive, called the Diesel Fuel Catalyzer. EnviroFuels has stated that heavy-duty on and off road diesel engines are the intended market for the catalyzer. Preliminary tests conducted indicate...

  14. Test/QA Plan for Verification of Nitrate Sensors for Groundwater Remediation Monitoring

    EPA Science Inventory

    A submersible nitrate sensor is capable of collecting in-situ measurements of dissolved nitrate concentrations in groundwater. Although several types of nitrate sensors currently exist, this verification test will focus on submersible sensors equipped with a nitrate-specific ion...

  15. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF FOUR IMMUNOASSAY TEST KITS

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, beginning as an initiative of the U.S. Environmental Protection Agency (EPA) in 1995, verifies the performance of commercially available, innovative technologies that can be used to measure environmental quality. The ETV p...

  16. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification that assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?" In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using the verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts," which are the objective evidence needed to prove that the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A, NASA Systems Engineering Processes and Requirements, with regard to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  17. Verification and transfer of thermal pollution model. Volume 6: User's manual for 1-dimensional numerical model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Nwadike, E. V.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for significant surface wave heights compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for small surface wave heights compared to depth (e.g., natural or man-made inland lakes) because surface elevation has been removed as a parameter.

  18. A chronic generalized bi-directional brain-machine interface.

    PubMed

    Rouse, A G; Stanslaski, S R; Cong, P; Jensen, R M; Afshar, P; Ullestad, D; Gupta, R; Molnar, G F; Moran, D W; Denison, T J

    2011-06-01

    A bi-directional neural interface (NI) system was designed and prototyped by incorporating a novel neural recording and processing subsystem into a commercial neural stimulator architecture. The NI system prototype leverages the system infrastructure from an existing neurostimulator to ensure reliable operation in a chronic implantation environment. In addition to providing predicate therapy capabilities, the device adds key elements to facilitate chronic research, such as four channels of electrocorticogram/local field potential amplification and spectral analysis, a three-axis accelerometer, algorithm processing, event-based data logging, and wireless telemetry for data uploads and algorithm/configuration updates. The custom-integrated micropower sensor and interface circuits facilitate extended operation in a power-limited device. The prototype underwent significant verification testing to ensure reliability, and meets the requirements for a class CF instrument per IEC-60601 protocols. The ability of the device system to process and aid in classifying brain states was preclinically validated using an in vivo non-human primate model for brain control of a computer cursor (i.e. brain-machine interface or BMI). The primate BMI model was chosen for its ability to quantitatively measure signal decoding performance from brain activity that is similar in both amplitude and spectral content to other biomarkers used to detect disease states (e.g. Parkinson's disease). A key goal of this research prototype is to help broaden the clinical scope and acceptance of NI techniques, particularly real-time brain state detection. These techniques have the potential to be generalized beyond motor prosthesis, and are being explored for unmet needs in other neurological conditions such as movement disorders, stroke and epilepsy.

  19. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of using the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on expanding the set of software test tools and on a cost-effectiveness assessment.

  20. Modal Testing of Seven Shuttle Cargo Elements for Space Station

    NASA Technical Reports Server (NTRS)

    Kappus, Kathy O.; Driskill, Timothy C.; Parks, Russel A.; Patterson, Alan (Technical Monitor)

    2001-01-01

    From December 1996 to May 2001, the Modal and Control Dynamics Team at NASA's Marshall Space Flight Center (MSFC) conducted modal tests on seven large elements of the International Space Station. Each of these elements has been or will be launched as a Space Shuttle payload for transport to the International Space Station (ISS). Like other Shuttle payloads, modal testing of these elements was required for verification of the finite element models used in coupled loads analyses for launch and landing. The seven modal tests included three modules - Node, Laboratory, and Airlock, and four truss segments - P6, P3/P4, S1/P1, and P5. Each element was installed and tested in the Shuttle Payload Modal Test Bed at MSFC. This unique facility can accommodate any Shuttle cargo element for modal test qualification. Flexure assemblies were utilized at each Shuttle-to-payload interface to simulate a constrained boundary in the load carrying degrees of freedom. For each element, multiple-input, multiple-output burst random modal testing was the primary approach with controlled input sine sweeps for linearity assessments. The accelerometer channel counts ranged from 252 channels to 1251 channels. An overview of these tests, as well as some lessons learned, will be provided in this paper.

  1. Design and evaluation of a wireless sensor network based aircraft strength testing system.

    PubMed

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structural strength design and evaluation. However, current ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities, and their centralized data processing limits the efficiency and intelligence of the test programs. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents the design of a wireless sensor network based aircraft strength testing (AST) system and its evaluation on a real aircraft specimen. A miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and an application layer interface are designed in detail. To verify the strength testing capability of the designed wireless sensor network, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and to monitor results under different static test loads. The results show the efficiency of the WSN based AST system compared to a conventional AST system.
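    To make the cluster-star data path concrete, here is a minimal Python sketch of how a strain-gauge node might encode a multi-channel sample for transmission to its cluster head; the packet layout, field widths, and identifiers are assumptions for illustration, not the protocol defined in the paper.

      # Illustrative strain-gauge sample packet for a cluster-star WSN.
      # The field layout and IDs are hypothetical, not the paper's protocol.
      import struct

      PACKET_FMT = "<BBHI4h"   # cluster id, node id, sequence no., timestamp (ms), 4 strain channels

      def pack_sample(cluster_id, node_id, seq, t_ms, strains_ue):
          """Encode one multi-channel strain sample (microstrain, int16) for radio transmission."""
          return struct.pack(PACKET_FMT, cluster_id, node_id, seq, t_ms, *strains_ue)

      def unpack_sample(payload):
          cluster_id, node_id, seq, t_ms, *strains = struct.unpack(PACKET_FMT, payload)
          return {"cluster": cluster_id, "node": node_id, "seq": seq,
                  "t_ms": t_ms, "strain_ue": strains}

      if __name__ == "__main__":
          pkt = pack_sample(1, 7, 42, 123456, [210, -35, 990, 15])
          print(len(pkt), "bytes:", unpack_sample(pkt))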

  2. Design and Evaluation of a Wireless Sensor Network Based Aircraft Strength Testing System

    PubMed Central

    Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang

    2009-01-01

    The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structural strength design and evaluation. However, current ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities, and their centralized data processing limits the efficiency and intelligence of the test programs. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents the design of a wireless sensor network based aircraft strength testing (AST) system and its evaluation on a real aircraft specimen. A miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and an application layer interface are designed in detail. To verify the strength testing capability of the designed wireless sensor network, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and to monitor results under different static test loads. The results show the efficiency of the WSN based AST system compared to a conventional AST system. PMID:22408521

  3. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development of new design concepts - such as active structures - and new tools - such as a combined structure and control optimization algorithm - and their verification in ground and possibly flight tests. A focus mission spacecraft was designed based upon a space interferometer and is the basis for design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements, and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space-qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters, and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  4. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  5. Design and Calibration of a Flowfield Survey Rake for Inlet Flight Research

    NASA Technical Reports Server (NTRS)

    Flynn, Darin C.; Ratnayake, Nalin A.; Frederick, Michael

    2009-01-01

    The Propulsion Flight Test Fixture at the NASA Dryden Flight Research Center is a unique test platform available for use on NASA's F-15B aircraft, tail number 836, as a modular host for a variety of aerodynamics and propulsion research. For future flight data from this platform to be valid, more information must be gathered concerning the quality of the airflow underneath the body of the F-15B at various flight conditions, especially supersonic conditions. The flow angularity and Mach number must be known at multiple locations on any test article interface plane for measurement data at these locations to be valid. To determine this prerequisite information, flight data will be gathered in the Rake Airflow Gauge Experiment using a custom-designed flowfield rake to probe the airflow underneath the F-15B at the desired flight conditions. This paper addresses the design considerations of the rake and probe assembly, including the loads and stress analysis using analytical methods, computational fluid dynamics, and finite element analysis. It also details the flow calibration procedure, including the completed wind-tunnel test and posttest data reduction, calibration verification, and preparation for flight-testing.

  6. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 4 2011-10-01 2011-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  7. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 46 Shipping 4 2010-10-01 2010-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  8. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  9. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  10. 46 CFR 109.227 - Verification of vessel compliance with applicable stability requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Verification of vessel compliance with applicable stability requirements. 109.227 Section 109.227 Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) A-MOBILE OFFSHORE DRILLING UNITS OPERATIONS Tests, Drills, and Inspections § 109.227 Verification...

  11. 46 CFR 131.513 - Verification of compliance with applicable stability requirements.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 46 Shipping 4 2013-10-01 2013-10-01 false Verification of compliance with applicable stability...) OFFSHORE SUPPLY VESSELS OPERATIONS Tests, Drills, and Inspections § 131.513 Verification of compliance with applicable stability requirements. (a) After loading but before departure, and at other times necessary to...

  12. 46 CFR 131.513 - Verification of compliance with applicable stability requirements.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 46 Shipping 4 2014-10-01 2014-10-01 false Verification of compliance with applicable stability...) OFFSHORE SUPPLY VESSELS OPERATIONS Tests, Drills, and Inspections § 131.513 Verification of compliance with applicable stability requirements. (a) After loading but before departure, and at other times necessary to...

  13. 46 CFR 131.513 - Verification of compliance with applicable stability requirements.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 46 Shipping 4 2012-10-01 2012-10-01 false Verification of compliance with applicable stability...) OFFSHORE SUPPLY VESSELS OPERATIONS Tests, Drills, and Inspections § 131.513 Verification of compliance with applicable stability requirements. (a) After loading but before departure, and at other times necessary to...

  14. Environmental Technology Verification Program Materials Management and Remediation Center Generic Protocol for Verification of In Situ Chemical Oxidation

    EPA Science Inventory

    The protocol provides generic procedures for implementing a verification test for the performance of in situ chemical oxidation (ISCO), focused specifically to expand the application of ISCO at manufactured gas plants with polyaromatic hydrocarbon (PAH) contamination (MGP/PAH) an...

  15. Optimum-AIV: A planning and scheduling system for spacecraft AIV

    NASA Technical Reports Server (NTRS)

    Arentoft, M. M.; Fuchs, Jens J.; Parrod, Y.; Gasquet, Andre; Stader, J.; Stokes, I.; Vadon, H.

    1991-01-01

    A project undertaken for the European Space Agency (ESA) is presented. The project is developing a knowledge based software system for planning and scheduling of activities for spacecraft assembly, integration, and verification (AIV). The system extends into the monitoring of plan execution and the plan repair phase. The objectives are to develop an operational kernel of a planning, scheduling, and plan repair tool, called OPTIMUM-AIV, and to provide facilities which will allow individual projects to customize the kernel to suit their specific needs. The kernel shall consist of a set of software functionalities for assistance in initial specification of the AIV plan, in verification and generation of valid plans and schedules for the AIV activities, and in interactive monitoring and execution problem recovery for the detailed AIV plans. Embedded in OPTIMUM-AIV are external interfaces which allow integration with alternative scheduling systems and project databases. The current status of the OPTIMUM-AIV project, as of Jan. 1991, is that a further analysis of the AIV domain has taken place through interviews with satellite AIV experts, a software requirements document (SRD) for the full operational tool has been approved, and an architectural design document (ADD) for the kernel, excluding external interfaces, is ready for review.

  16. Application of software technology to automatic test data analysis

    NASA Technical Reports Server (NTRS)

    Stagner, J. R.

    1991-01-01

    The verification process for a major software subsystem was partially automated as part of a feasibility demonstration. The methods employed are generally useful and applicable to other types of subsystems. The effort resulted in substantial savings in test engineer analysis time and offers a method for inclusion of automatic verification as a part of regression testing.

  17. 40 CFR 1065.362 - Non-stoichiometric raw exhaust FID O2 interference verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... air source during testing, use zero air as the FID burner's air source for this verification. (4) Zero the FID analyzer using the zero gas used during emission testing. (5) Span the FID analyzer using a span gas that you use during emission testing. (6) Check the zero response of the FID analyzer using...

  18. 40 CFR 86.1847-01 - Manufacturer in-use verification and in-use confirmatory testing; submittal of information and...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... laboratory equipment calibrations and verifications as prescribed by subpart B of this part or by good... in-use confirmatory testing; submittal of information and maintenance of records. 86.1847-01 Section... confirmatory testing; submittal of information and maintenance of records. (a) The manufacturer who conducts or...

  19. Developing a Test for Assessing Elementary Students' Comprehension of Science Texts

    ERIC Educational Resources Information Center

    Wang, Jing-Ru; Chen, Shin-Feng; Tsay, Reuy-Fen; Chou, Ching-Ting; Lin, Sheau-Wen; Kao, Huey-Lien

    2012-01-01

    This study reports on the process of developing a test to assess students' reading comprehension of scientific materials and on the statistical results of the verification study. A combination of classic test theory and item response theory approaches was used to analyze the assessment data from a verification study. Data analysis indicates the…

  20. Strategic planning for the International Space Station

    NASA Technical Reports Server (NTRS)

    Griner, Carolyn S.

    1990-01-01

    The concept for utilization and operations planning for the International Space Station Freedom was developed in a NASA Space Station Operations Task Force in 1986. Since that time the concept has been further refined to definitize the process and products required to integrate the needs of the international user community with the operational capabilities of the Station in its evolving configuration. The keystone to the process is the development of individual plans by the partners, with the parameters and formats common to the degree that electronic communications techniques can be effectively utilized, while maintaining the proper level and location of configuration control. The integration, evaluation, and verification of the integrated plan, called the Consolidated Operations and Utilization Plan (COUP), is being tested in a multilateral environment to prove out the parameters, interfaces, and process details necessary to produce the first COUP for Space Station in 1991. This paper will describe the concept, process, and the status of the multilateral test case.

  1. ELRIS2D: A MATLAB Package for the 2D Inversion of DC Resistivity/IP Data

    NASA Astrophysics Data System (ADS)

    Akca, Irfan

    2016-04-01

    ELRIS2D is an open source code written in MATLAB for the two-dimensional inversion of direct current resistivity (DCR) and time domain induced polarization (IP) data. The user interface of the program is designed for functionality and ease of use. All available settings of the program can be reached from the main window. The subsurface is discretized using a hybrid mesh generated by the combination of structured and unstructured meshes, which reduces the computational cost of the whole inversion procedure. The inversion routine is based on the smoothness constrained least squares method. In order to verify the program, responses of two test models and field data sets were inverted. The models inverted from the synthetic data sets are consistent with the original test models in both DC resistivity and IP cases. A field data set acquired in an archaeological site is also used for the verification of outcomes of the program in comparison with the excavation results.
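    For readers unfamiliar with the smoothness constrained least squares method named above, the following Python sketch shows one regularized Gauss-Newton-style model update of that general form; the matrices are toy stand-ins, not ELRIS2D internals (which are written in MATLAB), and the symbol names are assumptions.

      # Sketch of one smoothness-constrained least-squares update.
      # J, r, R, m, lam are generic placeholders, not ELRIS2D variables.
      import numpy as np

      def smoothness_constrained_step(J, r, R, m, lam):
          """Solve (J^T J + lam R^T R) dm = J^T r - lam R^T R m for the model update dm.

          J   : Jacobian (sensitivity) matrix, data x model
          r   : data residual vector (observed - predicted)
          R   : first-difference roughness operator on the model cells
          lam : regularization (smoothness) weight
          """
          A = J.T @ J + lam * (R.T @ R)
          b = J.T @ r - lam * (R.T @ R) @ m
          return np.linalg.solve(A, b)

      if __name__ == "__main__":
          rng = np.random.default_rng(0)
          n_data, n_model = 12, 8
          J = rng.normal(size=(n_data, n_model))
          r = rng.normal(size=n_data)
          m = np.zeros(n_model)
          R = np.diff(np.eye(n_model), axis=0)        # simple 1-D roughness operator
          print(smoothness_constrained_step(J, r, R, m, lam=1.0))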

  2. Online Learning Flight Control for Intelligent Flight Control Systems (IFCS)

    NASA Technical Reports Server (NTRS)

    Niewoehner, Kevin R.; Carter, John (Technical Monitor)

    2001-01-01

    The research accomplishments for the cooperative agreement 'Online Learning Flight Control for Intelligent Flight Control Systems (IFCS)' include the following: (1) previous IFC program data collection and analysis; (2) IFC program support site (configured IFC systems support network, configured Tornado/VxWorks OS development system, made Configuration and Documentation Management Systems Internet accessible); (3) Airborne Research Test Systems (ARTS) II Hardware (developed hardware requirements specification, developing environmental testing requirements, hardware design, and hardware design development); (4) ARTS II software development laboratory unit (procurement of lab style hardware, configured lab style hardware, and designed interface module equivalent to ARTS II faceplate); (5) program support documentation (developed software development plan, configuration management plan, and software verification and validation plan); (6) LWR algorithm analysis (performed timing and profiling on algorithm); (7) pre-trained neural network analysis; (8) Dynamic Cell Structures (DCS) Neural Network Analysis (performing timing and profiling on algorithm); and (9) conducted technical interchange and quarterly meetings to define IFC research goals.

  3. Adaptation of a software development methodology to the implementation of a large-scale data acquisition and control system. [for Deep Space Network

    NASA Technical Reports Server (NTRS)

    Madrid, G. A.; Westmoreland, P. T.

    1983-01-01

    A progress report is presented on a program to upgrade the existing NASA Deep Space Network in terms of a redesigned computer-controlled data acquisition system for channelling tracking, telemetry, and command data between a California-based control center and three signal processing centers in Australia, California, and Spain. The methodology for the improvements is oriented towards single subsystem development with consideration for a multi-system and multi-subsystem network of operational software. Details of the existing hardware configurations and data transmission links are provided. The program methodology includes data flow design, interface design and coordination, incremental capability availability, increased inter-subsystem developmental synthesis and testing, system and network level synthesis and testing, and system verification and validation. The software has so far been implemented to about a 65 percent completion level, and the methodology being used to effect the changes, which will permit enhanced tracking and communication with spacecraft, has proven to be effective.

  4. Using Web 2.0 Techniques in NASA's Ares Engineering Operations Network (AEON) Environment - First Impressions

    NASA Technical Reports Server (NTRS)

    Scott, David W.

    2010-01-01

    The Mission Operations Laboratory (MOL) at Marshall Space Flight Center (MSFC) is responsible for Engineering Support capability for NASA's Ares rocket development and operations. In pursuit of this, MOL is building the Ares Engineering and Operations Network (AEON), a web-based portal to support and simplify two critical activities: (1) access and analyze Ares manufacturing, test, and flight performance data, with access to Shuttle data for comparison; and (2) establish and maintain collaborative communities within the Ares teams/subteams and with other projects, e.g., Space Shuttle, International Space Station (ISS). AEON seeks to provide a seamless interface to a) locally developed engineering applications and b) a Commercial-Off-The-Shelf (COTS) collaborative environment that includes Web 2.0 capabilities, e.g., blogging, wikis, and social networking. This paper discusses how Web 2.0 might be applied to the typically conservative engineering support arena, based on feedback from Integration, Verification, and Validation (IV&V) testing and on searching for their use in similar environments.

  5. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    NASA Astrophysics Data System (ADS)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the Glass and Façade Technology Research Group at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two Fused Silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  6. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. Clearer understanding of how verification is itself a political problem, and how players manipulate it to promote other goals is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  7. ETV REPORT AND VERIFICATION STATEMENT; EVALUATION OF LOBO LIQUIDS RINSE WATER RECOVERY SYSTEM

    EPA Science Inventory

    The Lobo Liquids Rinse Water Recovery System (Lobo Liquids system) was tested, under actual production conditions, processing metal finishing wastewater, at Gull Industries in Houston, Texas. The verification test evaluated the ability of the ion exchange (IX) treatment system t...

  8. Long-Term Pavement Performance Materials Characterization Program: Verification of Dynamic Test Systems with an Emphasis on Resilient Modulus

    DOT National Transportation Integrated Search

    2005-09-01

    This document describes a procedure for verifying a dynamic testing system (closed-loop servohydraulic). The procedure is divided into three general phases: (1) electronic system performance verification, (2) calibration check and overall system perf...

  9. High-order shock-fitted detonation propagation in high explosives

    NASA Astrophysics Data System (ADS)

    Romick, Christopher M.; Aslam, Tariq D.

    2017-03-01

    A highly accurate numerical shock and material interface fitting scheme composed of fifth-order spatial and third- or fifth-order temporal discretizations is applied to the two-dimensional reactive Euler equations in both slab and axisymmetric geometries. High rates of convergence are not typically possible with shock-capturing methods as the Taylor series analysis breaks down in the vicinity of discontinuities. Furthermore, for typical high explosive (HE) simulations, the effects of material interfaces at the charge boundary can also cause significant computational errors. Fitting a computational boundary to both the shock front and material interface (i.e. streamline) alleviates the computational errors associated with captured shocks and thus opens up the possibility of high rates of convergence for multi-dimensional shock and detonation flows. Several verification tests, including a Sedov blast wave, a Zel'dovich-von Neumann-Döring (ZND) detonation wave, and Taylor-Maccoll supersonic flow over a cone, are utilized to demonstrate high rates of convergence to nontrivial shock and reaction flows. Comparisons to previously published shock-capturing multi-dimensional detonations in a polytropic fluid with a constant adiabatic exponent (PF-CAE) are made, demonstrating significantly lower computational error for the present shock and material interface fitting method. For an error on the order of 10 m/s, which is similar to that observed in experiments, shock-fitting offers a computational savings on the order of 1000. In addition, the behavior of the detonation phase speed is examined for several slab widths to evaluate the detonation performance of PBX 9501 while utilizing the Wescott-Stewart-Davis (WSD) model, which is commonly used in HE modeling. It is found that the thickness effect curve resulting from this equation of state and reaction model using published values is dramatically steeper than observed in recent experiments. Utilizing the present fitting strategy, in conjunction with a nonlinear optimizer, a new set of reaction rate parameters improves the correlation of the model to experimental results. Finally, this new model is tested against two-dimensional slabs as a validation test.
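    The convergence claims above rest on the standard observed-order calculation; the short Python sketch below shows how such an order would be estimated from errors on successively refined grids. The error values are made up for illustration and are not results from the paper.

      # Estimating an observed order of convergence from errors on refined grids --
      # the standard check behind statements like "fifth-order convergence".
      import math

      def observed_order(e_coarse, e_fine, refinement_ratio=2.0):
          """Return p such that error ~ C * h^p between two successive resolutions."""
          return math.log(e_coarse / e_fine) / math.log(refinement_ratio)

      errors = [1.2e-3, 4.1e-5, 1.3e-6]      # hypothetical L2 errors on grids h, h/2, h/4
      for e1, e2 in zip(errors, errors[1:]):
          print(f"observed order ~ {observed_order(e1, e2):.2f}")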

  10. A study of compositional verification based IMA integration method

    NASA Astrophysics Data System (ADS)

    Huang, Hui; Zhang, Guoquan; Xu, Wanmeng

    2018-03-01

    The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, failures in an IMA system are more difficult to isolate. A critical problem in IMA system verification is therefore how to test resources shared by multiple applications. Traditional test methods can readily exercise a simple avionics system as a whole, but it is impractical to exhaustively test a large, integrated avionics system. This paper therefore applies compositional verification theory to IMA system test, reducing the test effort, improving efficiency, and consequently lowering the cost of IMA system integration.
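    As a toy illustration of the compositional (assume-guarantee) idea, the Python sketch below checks each hosted application against its own resource contract and checks the platform once against the sum of contracts, rather than testing every combination of applications together; the application names and budgets are hypothetical and do not come from the paper.

      # Toy compositional check: per-application contracts plus one platform-level check.
      # Budgets and names are hypothetical.
      APPS = {
          "flight_mgmt":  {"cpu_pct": 30, "ram_mb": 128},
          "displays":     {"cpu_pct": 25, "ram_mb": 256},
          "maintenance":  {"cpu_pct": 10, "ram_mb": 64},
      }
      PLATFORM = {"cpu_pct": 100, "ram_mb": 512}

      def check_contract(name, usage, contract):
          """Component-level check: measured usage must stay within the declared contract."""
          return name, all(usage[k] <= contract[k] for k in contract)

      def check_composition(contracts, platform):
          """Platform-level check: declared contracts must fit the shared resources."""
          totals = {k: sum(c[k] for c in contracts.values()) for k in platform}
          return all(totals[k] <= platform[k] for k in platform), totals

      if __name__ == "__main__":
          measured = {"flight_mgmt": {"cpu_pct": 28, "ram_mb": 120},
                      "displays":    {"cpu_pct": 24, "ram_mb": 250},
                      "maintenance": {"cpu_pct": 9,  "ram_mb": 60}}
          for app, use in measured.items():
              print(check_contract(app, use, APPS[app]))
          print(check_composition(APPS, PLATFORM))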

  11. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  12. A Software Defined Radio Based Airplane Communication Navigation Simulation System

    NASA Astrophysics Data System (ADS)

    He, L.; Zhong, H. T.; Song, D.

    2018-01-01

    Radio communication and navigation systems play an important role in ensuring the safety of civil airplanes in flight, and their function and performance should be tested before the systems are installed on board. Conventionally, a separate set of transmitters and receivers is needed for each system, so the test equipment occupies a lot of space and is costly. In this paper, software defined radio technology is applied to design a common-hardware communication and navigation ground simulation system that can host multiple airplane systems with different operating frequencies, such as HF, VHF, VOR, ILS, and ADF. We use a broadband analog front-end hardware platform, the universal software radio peripheral (USRP), to transmit and receive signals in different frequency bands. The software is written in LabVIEW on a computer that interfaces with the USRP through Ethernet and is responsible for communication and navigation signal processing and system control. An integrated testing system is established to perform functional tests and performance verification of the simulated signals, demonstrating the feasibility of our design. The system is a low-cost, common hardware platform for multiple airplane systems and provides a helpful reference for integrated avionics design.
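    As a hedged illustration only (the system described above is implemented in LabVIEW and streams samples through a USRP; none of that code is reproduced here), the following Python sketch shows the kind of composite baseband waveform such a ground simulator might synthesize for a VOR channel. The sample rate, modulation depths, and function names are assumptions.

      # Illustrative VOR-like composite baseband: 30 Hz variable AM tone plus a
      # 9960 Hz subcarrier frequency-modulated at 30 Hz (mod index ~16).
      import numpy as np

      def vor_baseband(bearing_deg, fs=200_000, duration=0.1):
          """Return a unit-envelope baseband signal encoding the given bearing."""
          t = np.arange(int(fs * duration)) / fs
          var_30hz = 0.3 * np.cos(2 * np.pi * 30 * t - np.deg2rad(bearing_deg))
          ref_sub = 0.3 * np.cos(2 * np.pi * 9960 * t + 16 * np.sin(2 * np.pi * 30 * t))
          return 1.0 + var_30hz + ref_sub

      if __name__ == "__main__":
          s = vor_baseband(bearing_deg=45.0)
          print(s.shape, s.min(), s.max())   # samples would be upconverted/streamed by the SDR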

  13. Angulated Dental Implants in Posterior Maxilla FEA and Experimental Verification

    PubMed Central

    Hamed, Hamed A.; Marzook, Hamdy A.; Ghoneem, Nahed E.; El–Anwar, Mohamed I.

    2018-01-01

    AIM: This study aimed to evaluate the effect of different implant angulations in the posterior maxilla on stress distribution by finite element analysis and to verify the results experimentally. METHODS: Two simplified models were prepared, one for an implant placed vertically and one tilted 25°, piercing the maxillary sinus. The geometric model components were prepared in Autodesk Inventor and then assembled in ANSYS for finite element analysis. The results of the finite element analysis were verified against experimental trial results, which were statistically analysed using Student's t-test (level of significance p < 0.05). RESULTS: The implant-abutment complex absorbed the load energy better in the case of the vertical implant than in the case of the angulated one, which was reflected in the cortical bone stress; both cases showed stress levels within physiological limits. Comparison of the FEA and experimental results showed full agreement. CONCLUSION: An implant tilted by 25° can be utilised in the posterior maxilla for replacing the maxillary first molar while avoiding sinus penetration. The implant-bone interface and peri-implant bone received the highest von Mises stress, and the implant-bone interface of the angulated implant received about 66% more stress than that of the straight one. PMID:29531612
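    The statistical comparison named in the abstract is an ordinary two-sample Student's t-test at the 5% level; a minimal Python sketch of that check is shown below. The strain values are hypothetical and are not the study's data.

      # Two-sample Student's t-test comparing FEA predictions with experimental readings.
      # The numbers are hypothetical illustrations.
      from scipy import stats

      fea_microstrain = [512, 530, 498, 521, 507]        # hypothetical FEA-derived values
      exp_microstrain = [505, 540, 489, 515, 512]        # hypothetical strain-gauge readings

      t_stat, p_value = stats.ttest_ind(fea_microstrain, exp_microstrain)
      print(f"t = {t_stat:.3f}, p = {p_value:.3f}")
      if p_value > 0.05:
          print("No significant difference: FEA and experiment agree at the 5% level.")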

  14. Prototype test article verification of the Space Station Freedom active thermal control system microgravity performance

    NASA Technical Reports Server (NTRS)

    Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.

    1993-01-01

    To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.

  15. 40 CFR 1065.545 - Verification of proportional flow control for batch sampling.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... control for batch sampling. 1065.545 Section 1065.545 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Performing an Emission Test Over Specified Duty Cycles § 1065.545 Verification of proportional flow control for batch sampling. For any...

  16. 42 CFR 493.1253 - Standard: Establishment and verification of performance specifications.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 5 2010-10-01 2010-10-01 false Standard: Establishment and verification of..., DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STANDARDS AND CERTIFICATION LABORATORY REQUIREMENTS... of test results for the test system. (vi) Reference intervals (normal values). (vii) Any other...

  17. Second order upwind Lagrangian particle method for Euler equations

    DOE PAGES

    Samulyak, Roman; Chen, Hsin -Chiang; Yu, Kwangmin

    2016-06-01

    A new second order upwind Lagrangian particle method for solving Euler equations for compressible inviscid fluid or gas flows is proposed. Similar to smoothed particle hydrodynamics (SPH), the method represents fluid cells with Lagrangian particles and is suitable for the simulation of complex free surface / multiphase flows. The main contributions of our method, which is different from SPH in all other aspects, are (a) significant improvement of approximation of differential operators based on a polynomial fit via weighted least squares approximation and the convergence of prescribed order, (b) an upwind second-order particle-based algorithm with limiter, providing accuracy and long term stability, and (c) accurate resolution of states at free interfaces. In conclusion, numerical verification tests demonstrating the convergence order for fixed domain and free surface problems are presented.
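    Contribution (a) above is a weighted least-squares polynomial fit used to approximate differential operators at a particle from scattered neighbors; the 1-D Python sketch below illustrates that idea for a first derivative. The weight function, stencil size, and polynomial degree are illustrative assumptions, not the paper's exact scheme.

      # 1-D weighted least-squares estimate of du/dx at a particle from its neighbors.
      # Weights and stencil choices are illustrative only.
      import numpy as np

      def wls_derivative(x0, x_nbrs, u_nbrs, degree=2, h=1.0):
          """Fit u(x) ~ sum_k c_k (x - x0)^k by weighted least squares; return c_1 ~ du/dx."""
          dx = x_nbrs - x0
          w = np.exp(-(dx / h) ** 2)                       # Gaussian-type weights
          V = np.vander(dx, degree + 1, increasing=True)   # columns [1, dx, dx^2, ...]
          coeffs, *_ = np.linalg.lstsq(np.sqrt(w)[:, None] * V, np.sqrt(w) * u_nbrs, rcond=None)
          return coeffs[1]

      if __name__ == "__main__":
          rng = np.random.default_rng(1)
          x0 = 0.3
          x_nbrs = x0 + rng.uniform(-0.2, 0.2, size=12)    # scattered Lagrangian neighbors
          u_nbrs = np.sin(2 * np.pi * x_nbrs)
          print(wls_derivative(x0, x_nbrs, u_nbrs), 2 * np.pi * np.cos(2 * np.pi * x0))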

  18. Development of a Software Tool to Automate ADCO Flight Controller Console Planning Tasks

    NASA Technical Reports Server (NTRS)

    Anderson, Mark G.

    2011-01-01

    This independent study project covers the development of the International Space Station (ISS) Attitude Determination and Control Officer (ADCO) Planning Exchange APEX Tool. The primary goal of the tool is to streamline existing manual and time-intensive planning tools into a more automated, user-friendly application that interfaces with existing products and allows the ADCO to produce accurate products and timelines more effectively. This paper will survey the current ISS attitude planning process and its associated requirements, goals, documentation and software tools and how a software tool could simplify and automate many of the planning actions which occur at the ADCO console. The project will be covered from inception through the initial prototype delivery in November 2011 and will include development of design requirements and software as well as design verification and testing.

  19. Second order upwind Lagrangian particle method for Euler equations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Samulyak, Roman; Chen, Hsin -Chiang; Yu, Kwangmin

    A new second order upwind Lagrangian particle method for solving Euler equations for compressible inviscid fluid or gas flows is proposed. Similar to smoothed particle hydrodynamics (SPH), the method represents fluid cells with Lagrangian particles and is suitable for the simulation of complex free surface / multiphase flows. The main contributions of our method, which is different from SPH in all other aspects, are (a) significant improvement of approximation of differential operators based on a polynomial fit via weighted least squares approximation and the convergence of prescribed order, (b) an upwind second-order particle-based algorithm with limiter, providing accuracy and long term stability, and (c) accurate resolution of states at free interfaces. In conclusion, numerical verification tests demonstrating the convergence order for fixed domain and free surface problems are presented.

  20. Bias in estimating accuracy of a binary screening test with differential disease verification

    PubMed Central

    Brinton, John T.; Ringham, Brandy M.; Glueck, Deborah H.

    2011-01-01

    SUMMARY Sensitivity, specificity, positive and negative predictive value are typically used to quantify the accuracy of a binary screening test. In some studies it may not be ethical or feasible to obtain definitive disease ascertainment for all subjects using a gold standard test. When a gold standard test cannot be used, an imperfect reference test that is less than 100% sensitive and specific may be used instead. In breast cancer screening, for example, follow-up for cancer diagnosis is used as an imperfect reference test for women where it is not possible to obtain gold standard results. This incomplete ascertainment of true disease, or differential disease verification, can result in biased estimates of accuracy. In this paper, we derive the apparent accuracy values for studies subject to differential verification. We determine how the bias is affected by the accuracy of the imperfect reference test, the percentage of subjects not receiving the gold standard who receive the imperfect reference test, the prevalence of the disease, and the correlation between the results for the screening test and the imperfect reference test. It is shown that designs with differential disease verification can yield biased estimates of accuracy. Estimates of sensitivity in cancer screening trials may be substantially biased. However, careful design decisions, including selection of the imperfect reference test, can help to minimize bias. A hypothetical breast cancer screening study is used to illustrate the problem. PMID:21495059
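    The bias mechanism described above can be reproduced with a short simulation: screen-positives get the gold standard, screen-negatives only the imperfect reference, and accuracy is computed from the partly erroneous ascertained status. The Python sketch below does this with made-up parameter values; it is not the paper's derivation or data.

      # Simulation of differential disease verification bias (hypothetical parameters).
      import numpy as np

      rng = np.random.default_rng(0)
      n, prev = 200_000, 0.01
      true_sens, true_spec = 0.85, 0.95          # screening test
      ref_sens, ref_spec = 0.70, 0.99            # imperfect reference (e.g., follow-up)

      disease = rng.random(n) < prev
      screen_pos = np.where(disease, rng.random(n) < true_sens, rng.random(n) > true_spec)

      # Differential verification: gold standard only for screen-positives.
      ascertained = np.where(
          screen_pos,
          disease,                                                   # gold standard result
          np.where(disease, rng.random(n) < ref_sens, rng.random(n) > ref_spec))

      apparent_sens = np.mean(screen_pos[ascertained])
      apparent_spec = np.mean(~screen_pos[~ascertained])
      print(f"apparent sensitivity {apparent_sens:.3f} vs true {true_sens}")
      print(f"apparent specificity {apparent_spec:.3f} vs true {true_spec}")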

  1. Verification survey report of the south waste tank farm training/test tower and hazardous waste storage lockers at the West Valley demonstration project, West Valley, New York

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, Phyllis C.

    A team from ORAU's Independent Environmental Assessment and Verification Program performed verification survey activities on the South Test Tower and four Hazardous Waste Storage Lockers. Scan data collected by ORAU determined that both the alpha and alpha-plus-beta activity were representative of radiological background conditions. The count rate distribution showed no outliers that would be indicative of alpha or alpha-plus-beta count rates in excess of background. It is the opinion of ORAU that the independent verification data collected support the site's conclusions that the South Tower and Lockers sufficiently meet the site criteria for release for recycle and reuse.

  2. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  3. 76 FR 60829 - Information Collection Being Reviewed by the Federal Communications Commission

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-30

    ... Authorization-Verification (Retention of Records). Form No.: N/A. Type of Review: Extension of a currently... verification, the responsible party, as shown in 47 CFR 2.909 shall maintain the records listed as follows: (1... laboratory, company, or individual performing the verification testing. The Commission may request additional...

  4. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT STORMWATER MANAGEMENT INC., STORMFILTER SYSTEM WITH ZPG MEDIA

    EPA Science Inventory

    Verification testing of the Stormwater Management, Inc. StormFilter Using ZPG Filter Media was conducted on a 0.19 acre portion of the eastbound highway surface of Interstate 794, at an area commonly referred to as the "Riverwalk" site near downtown Milwaukee, Wisconsin...

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, FLINT HILLS RESOURCES, LP, CCD15010 DIESEL FUEL FORMULATION WITH HITEC4121 ADDITIVE

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification (ETV) Program to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. ETV seeks to ach...

  6. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: PROTOCOL FOR THE VERIFICATION OF GROUTING MATERIALS FOR INFRASTRUCTURE REHABILITATION AT THE UNIVERSITY OF HOUSTON - CIGMAT

    EPA Science Inventory

    This protocol was developed under the Environmental Protection Agency's Environmental Technology Verification (ETV) Program, and is intended to be used as a guide in preparing laboratory test plans for the purpose of verifying the performance of grouting materials used for infra...

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, JCH FUEL SOLUTIONS, INC., JCH ENVIRO AUTOMATED FUEL CLEANING AND MAINTENANCE SYSTEM

    EPA Science Inventory

    The verification testing was conducted at the Cl facility in North Las Vegas, NV, on July 17 and 18, 2001. During this period, engine emissions, fuel consumption, and fuel quality were evaluated with contaminated and cleaned fuel.

    To facilitate this verification, JCH repre...

  8. Verification tests of durable TPS concepts

    NASA Technical Reports Server (NTRS)

    Shideler, J. L.; Webb, G. L.; Pittman, C. M.

    1984-01-01

    Titanium multiwall, superalloy honeycomb, and Advanced Carbon-carbon (ACC) multipost Thermal Protection System (TPS) concepts are being developed to provide durable protection for surfaces of future space transportation systems. Verification tests including thermal, vibration, acoustic, water absorption, lightning strike, and aerothermal tests are described. Preliminary results indicate that the three TPS concepts are viable up to a surface temperature in excess of 2300 F.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, REMOVAL OF ARSENIC IN DRINKING WATER: PHASE 1-ADI PILOT TEST UNIT NO. 2002-09 WITH MEDIA G2®

    EPA Science Inventory

    Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...

  10. Review of waste package verification tests. Semiannual report, October 1982-March 1983

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Soo, P.

    1983-08-01

    The current study is part of an ongoing task to specify tests that may be used to verify that engineered waste package/repository systems comply with NRC radionuclide containment and controlled release performance objectives. Work covered in this report analyzes verification tests for borosilicate glass waste forms and bentonite- and zeolite-based packing materials (discrete backfills). 76 references.

  11. Total systems design analysis of high performance structures

    NASA Technical Reports Server (NTRS)

    Verderaime, V.

    1993-01-01

    Designer-control parameters were identified at interdiscipline interfaces to optimize structural systems performance and downstream development and operations with reliability and least life-cycle cost. Interface tasks and iterations are tracked through a matrix of performance disciplines integration versus manufacturing, verification, and operations interactions for a total system design analysis. Performance integration tasks include shapes, sizes, environments, and materials. Integrity integrating tasks are reliability and recurring structural costs. Significant interface designer-control parameters were noted as shapes, dimensions, probability range factors, and cost. The structural failure concept is presented, and first-order reliability and deterministic methods, their benefits, and their limitations are discussed. A deterministic reliability technique combining the benefits of both, which is also timely and economically verifiable, is proposed for static structures. Though launch vehicle environments were primarily considered, the system design process is applicable to any surface system using its own unique field environments.
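    As a brief, hedged illustration of the first-order reliability approach mentioned above (a general textbook form, not the author's specific technique), the Python sketch below computes a reliability index and the corresponding failure probability from normal load and strength statistics; the numerical values are hypothetical.

      # First-order reliability index for normally distributed strength R and load S.
      # All statistics below are hypothetical.
      import math

      def reliability_index(mu_strength, sd_strength, mu_load, sd_load):
          """beta = (mu_R - mu_S) / sqrt(sd_R^2 + sd_S^2)."""
          return (mu_strength - mu_load) / math.sqrt(sd_strength**2 + sd_load**2)

      def failure_probability(beta):
          """P(R < S) = Phi(-beta) for the normal case."""
          return 0.5 * math.erfc(beta / math.sqrt(2))

      beta = reliability_index(mu_strength=480.0, sd_strength=30.0,   # MPa, hypothetical
                               mu_load=350.0, sd_load=25.0)
      print(f"beta = {beta:.2f}, Pf = {failure_probability(beta):.2e}")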

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, HVLP COATING EQUIPMENT, SHARPE MANUFACTURING COMPANY PLATINUM 2012 HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the Sharpe Platinum 2013 high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the Sharpe Platinum, which is designed for use in automotive refinishing. The test coating chosen by Sharpe Manufacturi...

  13. EPA/NSF ETV Equipment Verification Testing Plan for the Removal of Volatile Organic Chemical Contaminants by Adsorptive Media Processes

    EPA Science Inventory

    This document is the Environmental Technology Verification (ETV) Technology Specific Test Plan (TSTP) for evaluation of drinking water treatment equipment utilizing adsorptive media for synthetic organic chemical (SOC) removal. This TSTP is to be used within the structure provid...

  14. TEST DESIGN FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF ADD-ON NOX CONTROL UTILIZING OZONE INJECTION

    EPA Science Inventory

    The paper discusses the test design for environmental technology verification (ETV) of add-on nitrogen oxides (NOx) control utilizing ozone injection. (NOTE: ETV is an EPA-established program to enhance domestic and international market acceptance of new or improved commercially...

  15. STS-2: SAIL non-avionics subsystems math model requirements

    NASA Technical Reports Server (NTRS)

    Bennett, W. P.; Herold, R. W.

    1980-01-01

    Simulation of the STS-2 Shuttle nonavionics subsystems in the shuttle avionics integration laboratory (SAIL) is necessary for verification of the integrated shuttle avionics system. The math model (simulation) requirements for each of the nonavionics subsystems that interface with the Shuttle avionics system are documented, and a single source document is provided for controlling approved changes (by the SAIL change control panel) to the math models.

  16. MOD-1 Wind Turbine Generator Analysis and Design Report, Volume 2

    NASA Technical Reports Server (NTRS)

    1979-01-01

    The MOD-1 detail design is appended. The supporting analyses presented include a parametric system trade study, a verification of the computer codes used for rotor loads analysis, a metal blade study, and a definition of the design loads at each principal wind turbine generator interface for critical loading conditions. Shipping and assembly requirements, composite blade development, and electrical stability are also discussed.

  17. Virtual Manufacturing (la Fabrication virtuelle)

    DTIC Science & Technology

    1998-05-01

    ... with moving parts and subassemblies; verification of product subcomponents and systems operations through kinematics studies; and realism ... dimensions, parts moved in mechanism-based directions, and realism of interaction is increased through use of sound, touch and other parameters. For the ... direct converters from CAD systems. A simple kinematic package, able to simulate motions, is also high on the requirements, as well as an interface to ...

  18. Electromagnetic scattering from microwave absorbers - Laboratory verification of the coupled wave theory

    NASA Technical Reports Server (NTRS)

    Gasiewski, A. J.; Jackson, D. M.

    1992-01-01

    W-band measurements of the bistatic scattering function of some common microwave absorbing structures, including periodic wedge-type and pyramid-type iron-epoxy calibration loads and flat carbon-foam 'Echosorb' samples, were made using a network analyzer interface to a focused-lens scattering range. Swept frequency measurements over the 75-100 GHz band revealed specular and Bragg reflection characteristics in the measured data.

  19. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
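    A minimal Python sketch of the kind of computerized verification routine described above, driven by a screen file of criteria (range and rate-of-change checks), follows; the station identifier, parameter, and threshold values are hypothetical.

      # Screen-file-driven range and rate-of-change checks on unverified observations.
      # Station names and thresholds are hypothetical.
      SCREEN_FILE = {
          "station_0123": {"min": 0.0, "max": 1500.0, "max_step": 200.0},  # discharge, cfs
      }

      def verify_record(station, values, criteria=SCREEN_FILE):
          """Return a list of (index, value, reason) flags for a series of observations."""
          c = criteria[station]
          flags = []
          for i, v in enumerate(values):
              if not (c["min"] <= v <= c["max"]):
                  flags.append((i, v, "out of range"))
              elif i > 0 and abs(v - values[i - 1]) > c["max_step"]:
                  flags.append((i, v, "rate-of-change exceeds screen value"))
          return flags

      if __name__ == "__main__":
          series = [120.0, 131.0, 128.0, 900.0, 140.0, -5.0]
          for flag in verify_record("station_0123", series):
              print(flag)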

  20. Verification and transfer of thermal pollution model. Volume 4: User's manual for three-dimensional rigid-lid model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Nwadike, E. V.; Sinha, S. E.

    1982-01-01

    The theory of a three dimensional (3-D) mathematical thermal discharge model and a related one dimensional (1-D) model are described. Model verification at two sites and a separate user's manual for each model are included. The 3-D model has two forms: free surface and rigid lid. The former allows a free air/water interface and is suited for cases in which surface wave heights are significant compared to the mean water depth, such as estuaries and coastal regions. The latter is suited for cases in which surface wave heights are small compared to depth, because surface elevation is removed as a parameter. These models allow computation of time dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions. The free surface model also provides surface height variations with time.

  1. Verification and transfer of thermal pollution model. Volume 2: User's manual for 3-dimensional free-surface model

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Tuann, S. Y.; Lee, C. R.

    1982-01-01

    The six-volume report describes the theory of a three-dimensional (3-D) mathematical thermal discharge model and a related one-dimensional (1-D) model, includes model verification at two sites, and provides a separate user's manual for each model. The 3-D model has two forms: free surface and rigid lid. The former, verified at Anclote Anchorage (FL), allows a free air/water interface and is suited for cases in which surface wave heights are significant compared to the mean water depth, e.g., estuaries and coastal regions. The latter, verified at Lake Keowee (SC), is suited for cases in which surface wave heights are small compared to depth. These models allow computation of time-dependent velocity and temperature fields for given initial conditions and time-varying boundary conditions.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PHYSICAL REMOVAL OF PARTICULATE CONTAMINANTS IN DRINKING WATER, AQUASOURCE M1A35 ULTRAFILTRATION MEMBRANE SYSTEM AT AQUA2000 RESEARCH CENTER - NSF 00/03/EPADW395

    EPA Science Inventory

    Verification testing of the Aquasource UF unit was conducted over two test periods at the Aqua2000 Research Center in San Diego, CA. The first test period, from 3/5 - 4/19/99, represented winter/spring conditions. The second test period, from 8/25 - 9/28/99, represented summer/fall...

  3. Gender verification testing in sport.

    PubMed

    Ferris, E A

    1992-07-01

    Gender verification testing in sport, first introduced in 1966 by the International Amateur Athletic Federation (IAAF) in response to fears that males with a physical advantage in terms of muscle mass and strength were cheating by masquerading as females in women's competition, has led to unfair disqualifications of women athletes and untold psychological harm. The discredited sex chromatin test, which identifies only the sex chromosome component of gender and is therefore misleading, was abandoned in 1991 by the IAAF in favour of medical checks for all athletes, women and men, which preclude the need for gender testing. But, women athletes will still be tested at the Olympic Games at Albertville and Barcelona using polymerase chain reaction (PCR) to amplify DNA sequences on the Y chromosome which identifies genetic sex only. Gender verification testing may in time be abolished when the sporting community are fully cognizant of its scientific and ethical implications.

  4. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method; rather, it presents an unbiased survey of software assurance methods used to verify and validate small-satellite software and notes the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small-satellite software verification and validation. These methods need to be advanced further to deal with the state-explosion problem and to become usable enough for small-satellite software engineers to apply them routinely to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means of detecting and correcting software errors that escape the verification process or that are produced after launch through the effects of ionizing radiation.
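
    As a minimal sketch of the run-time-monitoring idea surveyed here, the Python fragment below checks telemetry invariants each control cycle and degrades to a safe mode when one is violated. The telemetry fields, limits, and fallback action are illustrative assumptions, not any flight implementation.

      # Minimal run-time monitoring sketch: check a telemetry invariant each cycle and
      # fall back to a safe mode when it is violated. The telemetry fields, limits, and
      # safe-mode action are illustrative assumptions, not a flight implementation.
      from dataclasses import dataclass

      @dataclass
      class Telemetry:
          battery_voltage: float   # volts
          wheel_speed: float       # rad/s

      def monitor(tm: Telemetry) -> list[str]:
          """Return the list of violated invariants for one telemetry frame."""
          violations = []
          if not 6.0 <= tm.battery_voltage <= 8.4:
              violations.append("battery_voltage out of range")
          if abs(tm.wheel_speed) > 600.0:
              violations.append("wheel_speed exceeds limit")
          return violations

      def control_cycle(tm: Telemetry) -> str:
          """Run the monitor; degrade to safe mode instead of continuing nominal ops."""
          violations = monitor(tm)
          if violations:
              return f"SAFE_MODE ({'; '.join(violations)})"
          return "NOMINAL"

      print(control_cycle(Telemetry(battery_voltage=7.4, wheel_speed=120.0)))  # NOMINAL
      print(control_cycle(Telemetry(battery_voltage=5.2, wheel_speed=120.0)))  # SAFE_MODE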

  5. Functions of social support and self-verification in association with loneliness, depression, and stress.

    PubMed

    Wright, Kevin B; King, Shawn; Rosenberg, Jenny

    2014-01-01

    This study investigated the influence of social support and self-verification on loneliness, depression, and stress among 477 college students. The authors propose and test a theoretical model using structural equation modeling. The results indicated empirical support for the model, with self-verification mediating the relation between social support and health outcomes. The results have implications for social support and self-verification research, which are discussed along with directions for future research and limitations of the study.
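
    The mediation structure being tested (social support -> self-verification -> distress) can be illustrated with a toy regression-based sketch on synthetic data; this is not the authors' structural equation model, and the variable names, coefficients, and data below are invented for illustration.

      # Toy illustration of a mediation structure (support -> self-verification -> distress)
      # fitted with ordinary least squares on synthetic data. This is not the study's SEM;
      # the data, coefficients, and variable names are assumptions for illustration only.
      import numpy as np

      rng = np.random.default_rng(0)
      n = 477
      support = rng.normal(size=n)
      self_verification = 0.6 * support + rng.normal(scale=0.8, size=n)
      distress = -0.5 * self_verification - 0.1 * support + rng.normal(scale=0.8, size=n)

      def ols(y, *xs):
          """Return OLS slopes of y on the given predictors (intercept included)."""
          X = np.column_stack([np.ones(len(y)), *xs])
          beta, *_ = np.linalg.lstsq(X, y, rcond=None)
          return beta[1:]

      (a,) = ols(self_verification, support)                   # support -> mediator path
      b, c_prime = ols(distress, self_verification, support)   # mediator & direct paths
      (c,) = ols(distress, support)                            # total effect
      print(f"total={c:.2f}  direct={c_prime:.2f}  indirect(a*b)={a*b:.2f}")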

  6. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own models of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes serves as both a data model and an executable specification.
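
    A minimal Python sketch of the model-driven, rule-based idea: a volume description is captured as a data model and checked against declarative rules. The fields, rules, and capacity figure below are assumptions for illustration, not the prototype's actual specification language.

      # Minimal sketch of model-driven, rule-based volume verification: a volume is
      # described as a data model and checked against declarative rules. The fields and
      # rules here are illustrative assumptions, not the prototype's specification.
      CAPACITY_BYTES = 650 * 1024 * 1024   # assumed CD-ROM capacity

      volume_model = {
          "volume_id": "PDS_TEST_0001",
          "block_size": 2048,
          "files": [
              {"name": "INDEX.TAB", "bytes": 4096},
              {"name": "DATA0001.IMG", "bytes": 10_485_760},
          ],
      }

      rules = [
          ("volume_id is non-empty and uppercase",
           lambda v: bool(v["volume_id"]) and v["volume_id"].isupper()),
          ("block size is 2048 bytes",
           lambda v: v["block_size"] == 2048),
          ("total file size fits on the volume",
           lambda v: sum(f["bytes"] for f in v["files"]) <= CAPACITY_BYTES),
          ("file names are upper-case and carry an extension",
           lambda v: all(f["name"].isupper() and "." in f["name"] for f in v["files"])),
      ]

      failures = [name for name, check in rules if not check(volume_model)]
      print("volume OK" if not failures else f"violations: {failures}")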

  7. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT: BROME AGRI SALES, LTD., MAXIMIZER SEPARATOR, MODEL MAX 1016 - 03/01/WQPC-SWP

    EPA Science Inventory

    Verification testing of the Brome Agri Sales Ltd. Maximizer Separator, Model MAX 1016 (Maximizer) was conducted at the Lake Wheeler Road Field Laboratory Swine Educational Unit in Raleigh, North Carolina. The Maximizer is an inclined screen solids separator that can be used to s...

  8. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    EPA Science Inventory

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  9. Direct Verification of School Meal Applications with Medicaid Data: A Pilot Evaluation of Feasibility, Effectiveness and Costs

    ERIC Educational Resources Information Center

    Logan, Christopher W.; Cole, Nancy; Kamara, Sheku G.

    2010-01-01

    Purpose/Objectives: The Direct Verification Pilot tested the feasibility, effectiveness, and costs of using Medicaid and State Children's Health Insurance Program (SCHIP) data to verify applications for free and reduced-price (FRP) school meals instead of obtaining documentation from parents and guardians. Methods: The Direct Verification Pilot…

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION--TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES, CUMMINS EMISSION SOLUTIONS AND CUMMINS FILTRATION DIESEL OXIDATION CATALYST AND CLOSED CRANKCASE VENTILATION SYSTEM

    EPA Science Inventory

    The U.S. EPA has created the Environmental Technology Verification (ETV) Program. ETV seeks to provide high-quality, peer-reviewed data on technology performance. The Air Pollution Control Technology (APCT) Verification Center, a center under the ETV Program, is operated by Res...

  11. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    NASA Technical Reports Server (NTRS)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  12. Evaluation of verification and testing tools for FORTRAN programs

    NASA Technical Reports Server (NTRS)

    Smith, K. A.

    1980-01-01

    Two automated software verification and testing systems were developed for use in the analysis of computer programs. An evaluation of the static analyzer DAVE and the dynamic analyzer PET, which are used in the analysis of FORTRAN programs on Control Data (CDC) computers, is described. Both systems were found to be effective and complementary, and both are recommended for use in testing FORTRAN programs.
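
    As a loose analogy to the kind of data-flow check a static analyzer such as DAVE performs, the Python sketch below flags names that are read but never assigned, using the standard ast module. DAVE itself analyzed FORTRAN, so this illustrates the technique rather than the tool.

      # Toy illustration of the kind of data-flow check a static analyzer such as DAVE
      # performs: flag names that are read but never assigned. Applied here to Python
      # source via the ast module purely as an analogy; DAVE itself analyzed FORTRAN.
      import ast
      import builtins

      SOURCE = """
      def mean(values):
          total = 0.0
          for v in values:
              total = total + v
          return totl / len(values)   # 'totl' is a typo: read but never defined
      """

      tree = ast.parse(SOURCE)
      assigned, loaded = set(), {}
      for node in ast.walk(tree):
          if isinstance(node, ast.FunctionDef):
              assigned.update(arg.arg for arg in node.args.args)
          elif isinstance(node, ast.Name):
              if isinstance(node.ctx, ast.Store):
                  assigned.add(node.id)
              else:
                  loaded.setdefault(node.id, node.lineno)

      undefined = {name: line for name, line in loaded.items()
                   if name not in assigned and not hasattr(builtins, name)}
      print("read but never assigned:", undefined)   # expected: {'totl': 6}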

  13. Verification and classification bias interactions in diagnostic test accuracy studies for fine-needle aspiration biopsy.

    PubMed

    Schmidt, Robert L; Walker, Brandon S; Cohen, Michael B

    2015-03-01

    Reliable estimates of accuracy are important for any diagnostic test. Diagnostic accuracy studies are subject to unique sources of bias; verification bias and classification bias are two sources that commonly occur. Statistical methods are available to estimate the impact of these sources of bias when they occur alone, but the impact of interactions when these types of bias occur together has not been investigated. We developed mathematical relationships to show the combined effect of verification bias and classification bias. A wide range of case scenarios was generated to assess the impact of bias components and interactions on total bias. Interactions between verification bias and classification bias caused overestimation of sensitivity and underestimation of specificity. Interactions had more effect on sensitivity than on specificity. Sensitivity was overestimated by at least 7% in approximately 6% of the tested scenarios; specificity was underestimated by at least 7% in less than 0.1% of the scenarios. Interactions between verification bias and classification bias create distortions in accuracy estimates that are greater than would be predicted from each source of bias acting independently. © 2014 American Cancer Society.
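
    The direction of the combined distortion can be reproduced with a small simulation: verify test-positives more often than test-negatives (verification bias) and score accuracy against an imperfect reference standard (classification bias). The prevalence, test characteristics, and verification probabilities in the Python sketch below are assumed values for illustration, not those of the paper.

      # Small simulation showing how partial verification (test-positives verified more
      # often) and an imperfect reference standard (classification bias) jointly distort
      # apparent sensitivity and specificity. All parameter values are assumptions for
      # illustration; they are not taken from the paper.
      import numpy as np

      rng = np.random.default_rng(1)
      n, prevalence = 200_000, 0.20
      true_se, true_sp = 0.80, 0.90            # index test (e.g., a biopsy-based test)
      ref_se, ref_sp = 0.95, 0.97              # imperfect reference standard
      p_verify_pos, p_verify_neg = 0.95, 0.30  # verification depends on index result

      disease = rng.random(n) < prevalence
      index_pos = np.where(disease, rng.random(n) < true_se, rng.random(n) > true_sp)
      ref_pos = np.where(disease, rng.random(n) < ref_se, rng.random(n) > ref_sp)
      verified = rng.random(n) < np.where(index_pos, p_verify_pos, p_verify_neg)

      # Naive ("apparent") accuracy computed only on verified cases, treating the
      # imperfect reference standard as if it were truth.
      v_pos, v_ref = index_pos[verified], ref_pos[verified]
      apparent_se = (v_pos & v_ref).sum() / v_ref.sum()
      apparent_sp = (~v_pos & ~v_ref).sum() / (~v_ref).sum()
      print(f"true Se/Sp     : {true_se:.2f} / {true_sp:.2f}")
      print(f"apparent Se/Sp : {apparent_se:.2f} / {apparent_sp:.2f}")

    With these assumed parameters the apparent sensitivity comes out well above 0.80 and the apparent specificity well below 0.90, matching the direction of distortion the authors report.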

  14. Control structural interaction testbed: A model for multiple flexible body verification

    NASA Technical Reports Server (NTRS)

    Chory, M. A.; Cohen, A. L.; Manning, R. A.; Narigon, M. L.; Spector, V. A.

    1993-01-01

    Conventional end-to-end ground tests for verification of control system performance become increasingly complicated with the development of large, multiple-flexible-body spacecraft structures. The expense of accurately reproducing the on-orbit dynamic environment and the attendant difficulties in reducing and accounting for ground-test effects limit the value of these tests. TRW has developed a building-block approach whereby a combination of analysis, simulation, and test replaces end-to-end performance verification by ground test. Tests are performed at the component, subsystem, and system level on engineering testbeds. These tests are aimed at authenticating the models to be used in end-to-end performance verification simulations: component and subassembly engineering tests and analyses establish models and critical parameters, unit-level engineering and acceptance tests refine the models, and subsystem-level tests confirm the models' overall behavior. The Precision Control of Agile Spacecraft (PCAS) project has developed a control structural interaction testbed with a multibody flexible structure to investigate new methods of precision control. This testbed is a model for TRW's approach to verifying control system performance. The approach has several advantages: (1) no allocation for test measurement errors is required, increasing flight hardware design allocations; (2) it permits greater latitude in investigating off-nominal conditions and parametric sensitivities; and (3) the simulation approach is cost effective, because the investment is in understanding the root behavior of the flight hardware rather than in the ground-test equipment and environment.

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DEVILBISS JGHV-531-46FF HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the DeVilbiss JGHV-531-46FF high-volume, low-pressure pressure-feed spray gun, hereafter referred to as the DeVilbiss JGHV, which is designed for use in industrial finishing. The test coating chosen by ITW Industrial Fi...

  16. 40 CFR 1065.308 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers not...

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... which you sample and record gas-analyzer concentrations. (b) Measurement principles. This test verifies... appropriate frequency to prevent loss of information. This test also verifies that the measurement system... instructions. Adjust the measurement system as needed to optimize performance. Run this verification with the...

  17. The Verification-based Analysis of Reliable Multicast Protocol

    NASA Technical Reports Server (NTRS)

    Wu, Yunqing

    1996-01-01

    Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding of the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
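
    A toy Python illustration of verification-based analysis in this style: enumerate the reachable states of a small protocol model and check a safety property on each one. The two-node token-passing model and property below are invented for illustration and are not the RMP specification.

      # Toy explicit-state verification sketch: enumerate reachable states of a small
      # protocol model and check a safety property on every state. The two-node
      # token-passing model below is illustrative; it is not the RMP specification.
      from collections import deque

      State = tuple  # (token holder, messages delivered by A, messages delivered by B)
      INITIAL: State = ("A", 0, 0)

      def successors(state: State):
          """Enabled transitions: the token holder may deliver a message or pass the token."""
          holder, da, db = state
          if holder == "A":
              yield ("A", da + 1, db)   # A delivers its next message
              yield ("B", da, db)       # A passes the token to B
          else:
              yield ("B", da, db + 1)
              yield ("A", da, db)

      def safe(state: State) -> bool:
          """Safety property: neither node gets more than 2 messages ahead of the other."""
          _, da, db = state
          return abs(da - db) <= 2

      # Breadth-first exploration of the (bounded) state space.
      seen, queue, violations = {INITIAL}, deque([INITIAL]), []
      while queue:
          state = queue.popleft()
          if not safe(state):
              violations.append(state)
              continue
          for nxt in successors(state):
              if nxt not in seen and nxt[1] <= 4 and nxt[2] <= 4:   # bound the model
                  seen.add(nxt)
                  queue.append(nxt)

      print(f"explored {len(seen)} states; property violations: {violations[:3]}")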

  18. Land Ice Verification and Validation Kit

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process for these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
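
    The bit-for-bit evaluation mentioned above can be sketched as an exact comparison with a tolerance-based fallback; the arrays, tolerance, and result labels in the Python sketch below are assumptions for illustration, not LIVV's actual interface.

      # Minimal sketch of a bit-for-bit check with a tolerance-based fallback, in the
      # spirit of the comparisons LIVV automates. The arrays stand in for model output
      # and benchmark data; the names and tolerance are assumptions, not LIVV's API.
      import numpy as np

      def compare(test: np.ndarray, benchmark: np.ndarray, rtol: float = 1e-12) -> str:
          """Classify a test/benchmark pair as bit-for-bit, within tolerance, or failing."""
          if test.shape != benchmark.shape:
              return "FAIL (shape mismatch)"
          if np.array_equal(test, benchmark):
              return "BIT-FOR-BIT"
          if np.allclose(test, benchmark, rtol=rtol, atol=0.0):
              max_rel = np.max(np.abs(test - benchmark) / np.abs(benchmark))
              return f"PASS (max rel. diff {max_rel:.2e})"
          return "FAIL (differences exceed tolerance)"

      benchmark = np.linspace(1.0, 100.0, 1_000)     # e.g., an ice-thickness benchmark
      perturbed = benchmark * (1.0 + 1e-14)          # same run on another platform
      print(compare(benchmark.copy(), benchmark))    # BIT-FOR-BIT
      print(compare(perturbed, benchmark))           # PASS (within tolerance)
      print(compare(benchmark + 1.0, benchmark))     # FAIL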

  19. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent

    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, the calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands off; cover more portions of the system; and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources toward equipment reliability needs. More importantly, the implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
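
    One common on-line monitoring technique is to compare each redundant transmitter against its peers and flag drift beyond a calibration acceptance band; the Python sketch below illustrates that idea with made-up readings and an assumed band, and is not the ATR/AMS implementation.

      # Sketch of one common on-line monitoring idea: compare each redundant transmitter
      # against the median of its peers and flag drift beyond an acceptance band. The
      # readings and the band are made-up illustrations, not the ATR/AMS implementation.
      import numpy as np

      ACCEPTANCE_BAND = 0.5   # assumed calibration acceptance criterion (process units)

      # Rows = redundant transmitters measuring the same process parameter,
      # columns = successive samples taken during normal operation.
      readings = np.array([
          [100.1, 100.2, 100.0, 100.1],
          [ 99.9, 100.0, 100.1, 100.0],
          [100.0, 100.1,  99.9, 100.0],
          [101.0, 101.1, 101.2, 101.3],   # this channel is drifting out of calibration
      ])

      for channel, row in enumerate(readings):
          peers = np.delete(readings, channel, axis=0)
          deviation = np.mean(row - np.median(peers, axis=0))   # median resists one bad channel
          status = "DRIFT - schedule calibration" if abs(deviation) > ACCEPTANCE_BAND else "OK"
          print(f"channel {channel}: mean deviation {deviation:+.2f} -> {status}")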

  20. Development of CFC-Free Cleaning Processes at the NASA White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Beeson, Harold; Kirsch, Mike; Hornung, Steven; Biesinger, Paul

    1995-01-01

    The NASA White Sands Test Facility (WSTF) is developing cleaning and verification processes to replace currently used chlorofluorocarbon-113- (CFC-113-) based processes. The processes being evaluated include both aqueous- and solvent-based techniques. The presentation will include the findings of investigations of aqueous cleaning and verification processes that are based on a draft of a proposed NASA Kennedy Space Center (KSC) cleaning procedure. Verification testing with known contaminants, such as hydraulic fluid and commonly used oils, established correlations between nonvolatile residue and CFC-113. Recoveries ranged from 35 to 60 percent of theoretical. WSTF is also investigating enhancements to aqueous sampling for organics and particulates. Although aqueous alternatives have been identified for several processes, a need still exists for nonaqueous solvent cleaning, such as the cleaning and cleanliness verification of gauges used for oxygen service. The cleaning effectiveness of tetrachloroethylene (PCE), trichloroethylene (TCE), ethanol, hydrochlorofluorocarbon-225 (HCFC-225), tert-butylmethylether, and n-Hexane was evaluated using aerospace gauges and precision instruments and then compared to the cleaning effectiveness of CFC-113. Solvents considered for use in oxygen systems were also tested for oxygen compatibility using high-pressure oxygen autoignition and liquid oxygen mechanical impact testing.
