Guidelines for mission integration, a summary report
NASA Technical Reports Server (NTRS)
1979-01-01
Guidelines are presented for instrument/experiment developers concerning hardware design, flight verification, operations, and mission implementation requirements. Interface requirements between the STS and instruments/experiments are defined. Interface constraints and design guidelines are presented along with integrated payload requirements for Spacelab Missions 1, 2, and 3. Interim data are suggested for use during hardware development until more detailed information becomes available once a complete mission and an integrated payload system are defined. Safety requirements, flight verification requirements, and operations procedures are defined.
NASA Astrophysics Data System (ADS)
Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.
2006-03-01
By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security as the demand for digital imaging informatics grows. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).
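A minimal sketch of how such a system might fuse the two signals, assuming hypothetical zone names, person IDs, and a similarity threshold (the published LTVS design details are not reproduced here):

```python
# Hypothetical sketch of LTVS-style fusion of Wi-Fi location and facial
# verification; names, zones, and thresholds are illustrative assumptions,
# not the published system's API.
from dataclasses import dataclass

@dataclass
class TrackEvent:
    person_id: str
    zone: str            # zone estimated by the Wi-Fi tracking system
    face_score: float    # similarity score from the facial-biometric match

AUTHORIZED_ZONES = {"staff01": {"reading_room", "modality_suite"}}
FACE_THRESHOLD = 0.80    # assumed verification threshold

def verify(event: TrackEvent) -> str:
    """Return an audit decision for one tracking event."""
    if event.face_score < FACE_THRESHOLD:
        return "alert: identity not verified"
    if event.zone not in AUTHORIZED_ZONES.get(event.person_id, set()):
        return "alert: verified person in unauthorized zone"
    return "ok"

print(verify(TrackEvent("staff01", "reading_room", 0.93)))  # -> ok
```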
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fitzgerald, T.J.; Carlos, R.C.; Argo, P.E.
As part of the integrated verification experiment (IVE), we deployed a network of HF ionospheric sounders to detect the effects of acoustic waves generated by surface ground motion following underground nuclear tests at the Nevada Test Site. The network sampled up to four geographic locations in the ionosphere, from almost directly over surface ground zero out to a horizontal range of 60 km. We present sample results for four of the IVEs: Misty Echo, Texarkana, Mineral Quarry, and Bexar.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Weaver, T.A.; Baker, D.F.; Edwards, C.L.
1993-10-01
Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25- and 100-g accelerometers, force-balanced accelerometers and, for some events, golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset corrected, and deglitched, but are otherwise unfiltered and unprocessed relative to their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.
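A minimal sketch of the deramp/offset-correct/integrate chain described above, with an assumed sample rate and a synthetic record standing in for the archived data:

```python
# Sketch of the kind of processing the report describes: remove a linear
# ramp/offset from an acceleration record, then integrate to velocity and
# displacement. Sampling rate and data are illustrative assumptions.
import numpy as np

def deramp(x: np.ndarray, t: np.ndarray) -> np.ndarray:
    """Subtract the best-fit line (offset + ramp) from the record."""
    slope, intercept = np.polyfit(t, x, 1)
    return x - (slope * t + intercept)

fs = 1000.0                          # assumed sample rate, Hz
t = np.arange(0, 2.0, 1.0 / fs)
accel = np.sin(2 * np.pi * 5 * t) + 0.02 * t + 0.1   # synthetic record

accel = deramp(accel, t)
vel = np.cumsum(accel) / fs          # simple rectangle-rule integration
disp = np.cumsum(vel) / fs
print(disp[-1])
```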
Structural verification for GAS experiments
NASA Technical Reports Server (NTRS)
Peden, Mark Daniel
1992-01-01
The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of the experiment's structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered, with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (stress corrosion cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verification (tests and analyses) will be outlined, especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.
Definition of ground test for Large Space Structure (LSS) control verification
NASA Technical Reports Server (NTRS)
Waites, H. B.; Doane, G. B., III; Tollison, D. K.
1984-01-01
An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.
78 FR 32010 - Pipeline Safety: Public Workshop on Integrity Verification Process
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process AGENCY: Pipeline and... announcing a public workshop to be held on the concept of ``Integrity Verification Process.'' The Integrity Verification Process shares similar characteristics with fitness for service processes. At this workshop, the...
78 FR 56268 - Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
.... PHMSA-2013-0119] Pipeline Safety: Public Workshop on Integrity Verification Process, Comment Extension... public workshop on ``Integrity Verification Process'' which took place on August 7, 2013. The notice also sought comments on the proposed ``Integrity Verification Process.'' In response to the comments received...
Integrated Disposal Facility FY 2016: ILAW Verification and Validation of the eSTOMP Simulator
DOE Office of Scientific and Technical Information (OSTI.GOV)
Freedman, Vicky L.; Bacon, Diana H.; Fang, Yilin
2016-05-13
This document describes two sets of simulations carried out to further verify and validate the eSTOMP simulator. In this report, a distinction is made between verification and validation, and the focus is on verifying eSTOMP through a series of published benchmarks on cementitious wastes, and validating eSTOMP based on a lysimeter experiment for the glassified waste. These activities are carried out within the context of a scientific view of validation that asserts that models can only be invalidated, and that model validation (and verification) is a subjective assessment.
Development of a Scalable Testbed for Mobile Olfaction Verification.
Zakaria, Syed Muhammad Mamduh Syed; Visvanathan, Retnam; Kamarudin, Kamarulzaman; Yeon, Ahmad Shakaff Ali; Md Shakaff, Ali Yeon; Zakaria, Ammar; Kamarudin, Latifah Munirah
2015-12-09
The lack of ground truth gas dispersion data and experiment verification information has impeded the development of mobile olfaction systems, especially for real-world conditions. In this paper, an integrated testbed for mobile gas sensing experiments is presented. The integrated 3 m × 6 m testbed was built to provide real-time ground truth information for mobile olfaction system development. The testbed consists of a 72-gas-sensor array, namely the Large Gas Sensor Array (LGSA), a camera-based localization system, and a wireless communication backbone for robot communication and integration into the testbed system. Furthermore, the data collected from the testbed may be streamed into a simulation environment to expedite development. Calibration results using ethanol have shown that using a large number of gas sensors in the LGSA is feasible and can produce coherent signals when exposed to the same concentrations. The results have shown that the testbed was able to capture the time-varying characteristics and the variability of the gas plume in a 2 h experiment, thus providing time-dependent ground truth concentration maps. The authors have demonstrated the ability of the mobile olfaction testbed to monitor, verify, and thus provide insight into gas distribution mapping experiments.
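A hedged sketch of how readings from a fixed sensor grid can be binned into time-dependent concentration maps; the 72-sensor count follows the abstract, while the 6 × 12 arrangement, bin width, and data are assumptions:

```python
# Turn per-second readings from a fixed gas-sensor grid into a sequence of
# ground-truth concentration maps. Grid layout, bin width, and data are
# assumed for illustration; only the 72-sensor total follows the abstract.
import numpy as np

n_rows, n_cols = 6, 12            # assumed arrangement of the 72 sensors
bin_seconds, duration = 10, 7200  # 2 h experiment, 10 s bins
n_bins = duration // bin_seconds

# readings[t, r, c]: one sample per second per sensor (synthetic here)
readings = np.random.rand(duration, n_rows, n_cols)

# Average each bin to get one concentration map per time step.
maps = readings.reshape(n_bins, bin_seconds, n_rows, n_cols).mean(axis=1)
print(maps.shape)   # (720, 6, 12): one 6x12 map every 10 s
```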
A Systematic Method for Verification and Validation of Gyrokinetic Microstability Codes
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bravenec, Ronald
My original proposal for the period Feb. 15, 2014 through Feb. 14, 2017 called for an integrated validation and verification effort carried out by myself with collaborators. The validation component would require experimental profile and power-balance analysis. In addition, it would require running the gyrokinetic codes varying the input profiles within experimental uncertainties to seek agreement with experiment before discounting a code as invalidated. Therefore, validation would require a major increase of effort over my previous grant periods, which covered only code verification (code benchmarking). Consequently, I had requested full-time funding. Instead, I am being funded at somewhat less than half time (5 calendar months per year). As a consequence, I decided to forego the validation component and to continue only the verification efforts.
NASA Astrophysics Data System (ADS)
Akers, James C.; Passe, Paul J.; Cooper, Beth A.
2005-09-01
The Acoustical Testing Laboratory (ATL) at the NASA John H. Glenn Research Center (GRC) in Cleveland, OH, provides acoustic emission testing and noise control engineering services for a variety of specialized customers, particularly developers of equipment and science experiments manifested for NASA's manned space missions. The ATL's primary customer has been the Fluids and Combustion Facility (FCF), a multirack microgravity research facility being developed at GRC for the USA Laboratory Module of the International Space Station (ISS). Since opening in September 2000, ATL has conducted acoustic emission testing of components, subassemblies, and partially populated FCF engineering model racks. The culmination of this effort has been the acoustic emission verification tests on the FCF Combustion Integrated Rack (CIR) and Fluids Integrated Rack (FIR), employing a procedure that incorporates ISO 11201 (``Acoustics-Noise emitted by machinery and equipment-Measurement of emission sound pressure levels at a work station and at other specified positions-Engineering method in an essentially free field over a reflecting plane''). This paper will provide an overview of the test methodology, software, and hardware developed to perform the acoustic emission verification tests on the CIR and FIR flight racks and lessons learned from these tests.
Metzger, Nicole L; Chesson, Melissa M; Momary, Kathryn M
2015-09-25
Objective. To create, implement, and assess a simulated medication reconciliation and an order verification activity using hospital training software. Design. A simulated patient with medication orders and home medications was built into existing hospital training software. Students in an institutional introductory pharmacy practice experience (IPPE) reconciled the patient's medications and determined whether or not to verify the inpatient orders based on his medical history and laboratory data. After reconciliation, students identified medication discrepancies and documented their rationale for rejecting inpatient orders. Assessment. For a 3-year period, the majority of students agreed the simulation enhanced their learning, taught valuable clinical decision-making skills, integrated material from previous courses, and stimulated their interest in institutional pharmacy. Overall feedback from student evaluations about the IPPE also was favorable. Conclusion. Use of existing hospital training software can affordably simulate the pharmacist's role in order verification and medication reconciliation, as well as improve clinical decision-making.
7 Processes that Enable NASA Software Engineering Technologies: Value-Added Process Engineering
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
The presentation reviews Agency process requirements and the purpose, benefits, and experiences of seven software engineering processes. The processes include: product integration, configuration management, verification, software assurance, measurement and analysis, requirements management, and planning and monitoring.
Environmental Verification Experiment for the Explorer Platform (EVEEP)
NASA Technical Reports Server (NTRS)
Norris, Bonnie; Lorentson, Chris
1992-01-01
Satellites and long-life spacecraft require effective contamination control measures to ensure data accuracy and maintain overall system performance margins. Satellite and spacecraft contamination can occur from either molecular or particulate matter. Some of the sources of the molecular species are as follows: mass loss from nonmetallic materials; venting of confined spacecraft or experiment volumes; exhaust effluents from attitude control systems; integration and test activities; and improper cleaning of surfaces. Some of the sources of particulates are as follows: leaks or purges which condense upon vacuum exposure; abrasion of movable surfaces; and micrometeoroid impacts. The Environmental Verification Experiment for the Explorer Platform (EVEEP) was designed to investigate the following aspects of spacecraft contamination control: materials selection; contamination modeling of existing designs; and thermal vacuum testing of a spacecraft with contamination monitors.
NASA Technical Reports Server (NTRS)
1978-01-01
The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of the CDV differs from the traditional directed-testing approach. With the CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to a device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
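UVM itself is SystemVerilog; as a language-neutral illustration, the following Python sketch models the CDV loop the abstract describes: constrained-random stimulus, a scoreboard checking the DUT against a golden model, and coverage bins that decide when verification is done. The 8-bit adder DUT and the coverage bins are invented for the example:

```python
# Language-agnostic model of a coverage-driven verification loop; the DUT,
# reference model, and coverage bins are assumptions for illustration.
import random

def dut_add(a: int, b: int) -> int:        # stand-in for the device under test
    return (a + b) & 0xFF                  # assumed 8-bit adder DUT

def reference(a: int, b: int) -> int:      # scoreboard's golden model
    return (a + b) % 256

coverage = set()                           # coverage monitor: bins hit so far
GOAL = {(small, carry) for small in (True, False) for carry in (True, False)}

trials = 0
while coverage < GOAL:                     # run until every coverage bin is hit
    a, b = random.randrange(256), random.randrange(256)   # random legal stimulus
    assert dut_add(a, b) == reference(a, b), "scoreboard mismatch"
    coverage.add((a < 128, a + b > 255))   # sample functional coverage
    trials += 1

print(f"all {len(GOAL)} bins covered after {trials} random transactions")
```

The loop terminates on coverage closure rather than on a fixed test list, which is the essential difference from directed testing.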
Engineering of the LISA Pathfinder mission—making the experiment a practical reality
NASA Astrophysics Data System (ADS)
Warren, Carl; Dunbar, Neil; Backler, Mike
2009-05-01
LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.
Integration and verification testing of the Large Synoptic Survey Telescope camera
NASA Astrophysics Data System (ADS)
Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.
2016-08-01
We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and now under construction, comprises a 3.2 Giga-pixel imager and a three-element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will be taking place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.
Putting the Laboratory at the Center of Teaching Chemistry
ERIC Educational Resources Information Center
Bopegedera, A. M. R. P.
2011-01-01
This article describes an effective approach to teaching chemistry by bringing the laboratory to the center of teaching, to bring the excitement of discovery to the learning process. The lectures and laboratories are closely integrated to provide a holistic learning experience. The laboratories progress from verification to open-inquiry and…
Control/structure interaction design methodology
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.; Layman, William E.
1989-01-01
The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include the development of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme, and analysts will be closely integrated with the CSI Test Bed laboratory. Components, concepts, tools, and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.
NASA Astrophysics Data System (ADS)
Arndt, J.; Kreimer, J.
2010-09-01
The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory has formed part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning but also configuration changes on short notice, which become necessary due to near real-time requests initiated by crew or Flight Control, and changes, most challenging since unpredictable, due to on-orbit anomalies. Subject to the safety verification is, on one hand, the on-orbit configuration itself, including the hardware and software products, and on the other hand the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process that has enough rigidity to provide reliable verification of on-board safety, yet enough flexibility for manned space operations with scientific objectives. In the period of COLUMBUS operations since launch, a number of lessons learned have already been implemented, especially in the IEHA, that improve the flexibility of on-board operations without degradation of safety.
The concept verification testing of materials science payloads
NASA Technical Reports Server (NTRS)
Griner, C. S.; Johnston, M. H.; Whitaker, A.
1976-01-01
The Concept Verification Testing (CVT) project at the Marshall Space Flight Center, Alabama, is a developmental activity that supports Shuttle payload projects such as Spacelab. It provides an operational 1-g environment for testing NASA and other-agency experiment and support system concepts that may be used in the shuttle. A dedicated Materials Science Payload was tested in the General Purpose Laboratory to assess the requirements of a space processing payload on a Spacelab-type facility. Physical and functional integration of the experiments into the facility was studied, and the impact of the experiments on the facility (and vice versa) was evaluated. A follow-up test, designated CVT Test IVA, was also held. The purpose of this test was to repeat Test IV experiments with a crew composed of selected and trained scientists. These personnel were not required to have prior knowledge of the materials science disciplines, but were required to have a basic knowledge of science and the scientific method.
A software engineering approach to expert system design and verification
NASA Technical Reports Server (NTRS)
Bochsler, Daniel C.; Goodwin, Mary Ann
1988-01-01
Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.
The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.
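A toy illustration of rolling an error budget up by root-sum-square, the usual way such contributions are allocated and tracked; the terms and numbers below are hypothetical, not JWST's actual allocations:

```python
# Toy error-budget roll-up by root-sum-square (RSS). All contributor names
# and values are hypothetical placeholders, not a real JWST budget.
import math

contributors_nm = {            # wavefront-error terms, RMS nm (assumed)
    "mirror figure": 90.0,
    "alignment": 60.0,
    "thermal distortion": 50.0,
    "image motion": 40.0,
}

total = math.sqrt(sum(v**2 for v in contributors_nm.values()))
print(f"RSS total: {total:.1f} nm RMS")   # compared against the top-level allocation
```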
Integrating Conceptual Knowledge Within and Across Representational Modalities
McNorgan, Chris; Reid, Jackie; McRae, Ken
2011-01-01
Research suggests that concepts are distributed across brain regions specialized for processing information from different sensorimotor modalities. Multimodal semantic models fall into one of two broad classes differentiated by the assumed hierarchy of convergence zones over which information is integrated. In shallow models, communication within- and between-modality is accomplished using either direct connectivity, or a central semantic hub. In deep models, modalities are connected via cascading integration sites with successively wider receptive fields. Four experiments provide the first direct behavioral tests of these models using speeded tasks involving feature inference and concept activation. Shallow models predict no within-modal versus cross-modal difference in either task, whereas deep models predict a within-modal advantage for feature inference, but a cross-modal advantage for concept activation. Experiments 1 and 2 used relatedness judgments to tap participants’ knowledge of relations for within- and cross-modal feature pairs. Experiments 3 and 4 used a dual feature verification task. The pattern of decision latencies across Experiments 1 to 4 is consistent with a deep integration hierarchy. PMID:21093853
Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1
NASA Technical Reports Server (NTRS)
1976-01-01
The technical analysis performed during the shuttle payload interface verification equipment study is reported. It describes: (1) the background and intent of the study; (2) the study approach and philosophy, covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) the preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule, and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.
Space transportation system payload interface verification
NASA Technical Reports Server (NTRS)
Everline, R. T.
1977-01-01
The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).
Software Independent Verification and Validation (SIV&V) Simplified
2006-12-01
[Acronym-list fragment] Configuration Item; I/O: Input/Output; I2V2: Independent Integrated Verification and Validation; IBM: International Business Machines; ICD: Interface...; IPT: Integrated Product Team; IRS: Interface Requirements Specification; ISD: Integrated System Diagram; ITD: Integrated Test Description; ITP: ...; programming languages such as COBOL (Common Business Oriented Language) (Codasyl committee 1960) and FORTRAN (FORmula TRANslator) (IBM 1952) (Robat 11
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
Baseline Assessment and Prioritization Framework for IVHM Integrity Assurance Enabling Capabilities
NASA Technical Reports Server (NTRS)
Cooper, Eric G.; DiVito, Benedetto L.; Jacklin, Stephen A.; Miner, Paul S.
2009-01-01
Fundamental to vehicle health management is the deployment of systems incorporating advanced technologies for predicting and detecting anomalous conditions in highly complex and integrated environments. Integrated structural integrity health monitoring, statistical algorithms for detection, estimation, prediction, and fusion, and diagnosis supporting adaptive control are examples of advanced technologies that present considerable verification and validation challenges. These systems necessitate interactions between physical and software-based systems that are highly networked with sensing and actuation subsystems, and incorporate technologies that are, in many respects, different from those employed in civil aviation today. A formidable barrier to deploying these advanced technologies in civil aviation is the lack of enabling verification and validation tools, methods, and technologies. The development of new verification and validation capabilities will not only enable the fielding of advanced vehicle health management systems, but will also provide new assurance capabilities for verification and validation of current generation aviation software which has been implicated in anomalous in-flight behavior. This paper describes the research focused on enabling capabilities for verification and validation underway within NASA's Integrated Vehicle Health Management project, discusses the state of the art of these capabilities, and includes a framework for prioritizing activities.
Verification and Validation Studies for the LAVA CFD Solver
NASA Technical Reports Server (NTRS)
Moini-Yekta, Shayan; Barad, Michael F; Sozer, Emre; Brehm, Christoph; Housman, Jeffrey A.; Kiris, Cetin C.
2013-01-01
The verification and validation of the Launch Ascent and Vehicle Aerodynamics (LAVA) computational fluid dynamics (CFD) solver is presented. A modern strategy for verification and validation is described, incorporating verification tests, validation benchmarks, continuous integration, and version control methods for automated testing in a collaborative development environment. The purpose of the approach is to integrate the verification and validation process into the development of the solver and improve productivity. This paper uses the Method of Manufactured Solutions (MMS) for the verification of the 2D Euler equations, the 3D Navier-Stokes equations, and turbulence models. A method for systematic refinement of unstructured grids is also presented. Verification using inviscid vortex propagation and flow over a flat plate is highlighted. Simulation results using laminar and turbulent flow past a NACA 0012 airfoil and an ONERA M6 wing are validated against experimental and numerical data.
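For readers unfamiliar with MMS, here is a self-contained example of the technique on a toy 1-D Poisson problem (not the LAVA solver itself): choose an exact solution, manufacture the matching source term, and confirm the discretization's observed order of accuracy.

```python
# Minimal Method of Manufactured Solutions (MMS) check on -u'' = f over
# [0,1] with u(0)=u(1)=0. Manufactured solution u(x)=sin(pi*x) implies the
# source term f(x)=pi^2*sin(pi*x); a second-order scheme should show
# errors shrinking by ~4x per grid doubling.
import numpy as np

def solve(n: int) -> float:
    """Solve -u'' = f with a central difference; return max-norm error."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = 1.0 / n
    f = np.pi**2 * np.sin(np.pi * x[1:-1])          # manufactured source term
    # Tridiagonal system from the second-order central difference.
    A = (np.diag(2.0 * np.ones(n - 1)) - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

e1, e2 = solve(32), solve(64)
print("observed order:", np.log2(e1 / e2))          # ~2.0 for this scheme
```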
NASA Technical Reports Server (NTRS)
Powell, John D.
2003-01-01
This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.
Test and Verification Approach for the NASA Constellation Program
NASA Technical Reports Server (NTRS)
Strong, Edward
2008-01-01
This viewgraph presentation is a test and verification approach for the NASA Constellation Program. The contents include: 1) The Vision for Space Exploration: Foundations for Exploration; 2) Constellation Program Fleet of Vehicles; 3) Exploration Roadmap; 4) Constellation Vehicle Approximate Size Comparison; 5) Ares I Elements; 6) Orion Elements; 7) Ares V Elements; 8) Lunar Lander; 9) Map of Constellation content across NASA; 10) CxP T&V Implementation; 11) Challenges in CxP T&V Program; 12) T&V Strategic Emphasis and Key Tenets; 13) CxP T&V Mission & Vision; 14) Constellation Program Organization; 15) Test and Evaluation Organization; 16) CxP Requirements Flowdown; 17) CxP Model Based Systems Engineering Approach; 18) CxP Verification Planning Documents; 19) Environmental Testing; 20) Scope of CxP Verification; 21) CxP Verification - General Process Flow; 22) Avionics and Software Integrated Testing Approach; 23) A-3 Test Stand; 24) Space Power Facility; 25) MEIT and FEIT; 26) Flight Element Integrated Test (FEIT); 27) Multi-Element Integrated Testing (MEIT); 28) Flight Test Driving Principles; and 29) Constellation's Integrated Flight Test Strategy: Low Earth Orbit Servicing Capability.
NASA Astrophysics Data System (ADS)
Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.
2013-02-01
In many research fields such as high energy physics (HEP), astrophysics, nuclear medicine, or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs, and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.
Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos
NASA Astrophysics Data System (ADS)
Long, Min; Li, You; Peng, Fei
Aiming to strike a balance between the security, efficiency, and availability of data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is implemented for node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operations can be made to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) from leaking the users' data privacy in a public auditing process. Performance analysis and discussion indicate that the scheme is secure and efficient, and that it supports dynamic operations and integrity verification of multiple copies of data. It has great potential to be implemented in cloud storage services.
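A sketch of binary-tree integrity checking in the spirit of this scheme; where the paper computes tree nodes with a spatiotemporal-chaos map, the sketch plainly substitutes SHA-256, and the block layout is assumed:

```python
# Binary hash-tree (Merkle-style) integrity check over data blocks. The
# paper's node function is spatiotemporal chaos; SHA-256 is substituted
# here, and the block layout is an assumption for illustration.
import hashlib

def h(data: bytes) -> bytes:
    return hashlib.sha256(data).digest()

def tree_root(blocks: list[bytes]) -> bytes:
    level = [h(b) for b in blocks]
    while len(level) > 1:
        if len(level) % 2:                 # duplicate last node on odd levels
            level.append(level[-1])
        level = [h(level[i] + level[i + 1]) for i in range(0, len(level), 2)]
    return level[0]

copy1 = [b"block0", b"block1", b"block2", b"block3"]   # one stored data copy
copy2 = [b"block0", b"blockX", b"block2", b"block3"]   # a corrupted copy
print(tree_root(copy1) == tree_root(copy2))            # False: corruption detected
```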
Integrated testing and verification system for research flight software
NASA Technical Reports Server (NTRS)
Taylor, R. N.
1979-01-01
The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification and test options are provided with special attention on real-time, multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
NASA Technical Reports Server (NTRS)
Hughes, David W.; Hedgeland, Randy J.
1994-01-01
A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.
Status of BOUT fluid turbulence code: improvements and verification
NASA Astrophysics Data System (ADS)
Umansky, M. V.; Lodestro, L. L.; Xu, X. Q.
2006-10-01
BOUT is an electromagnetic fluid turbulence code for tokamak edge plasma [1]. BOUT performs time integration of reduced Braginskii plasma fluid equations, using spatial discretization in realistic geometry and employing the standard ODE integration package PVODE. BOUT has been applied to several tokamak experiments, and in some cases calculated spectra of turbulent fluctuations compared favorably to experimental data. On the other hand, the desire to understand the code results better and to gain more confidence in the code motivated investing effort in its rigorous verification. In parallel with the testing, the code underwent substantial modification, mainly to improve its readability and the tractability of physical terms, with some algorithmic improvements as well. In the verification process, a series of linear and nonlinear test problems was applied to BOUT, targeting different subgroups of physical terms. The tests include reproducing basic electrostatic and electromagnetic plasma modes in simplified geometry, axisymmetric benchmarks against the 2D edge code UEDGE in real divertor geometry, and neutral fluid benchmarks against the hydrodynamic code LCPFCT. After completion of the testing, the new version of the code is being applied to actual tokamak edge turbulence problems, and the results will be presented. [1] X. Q. Xu et al., Contr. Plas. Phys., 36, 158 (1998). *Work performed for USDOE by Univ. Calif. LLNL under contract W-7405-ENG-48.
Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis
NASA Technical Reports Server (NTRS)
Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.
2009-01-01
Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller. The SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complementary in addressing both the system development risks and mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for the system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development (EDL and other integration labs) and the verification laboratory (CAIL).
NASA Technical Reports Server (NTRS)
Haapala, C.
1999-01-01
This is the Performance Verification Report, Antenna Drive Subassembly, Antenna Drive Subsystem, METSAT AMSU-A2 (P/N 1331200-2, SN: 108), for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A).
NASA Technical Reports Server (NTRS)
Pines, D.
1999-01-01
This is the Performance Verification Report, METSAT (Meteorological Satellites) Phase Locked Oscillator Assembly, P/N 1348360-1, S/N F09 and F10, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A).
Formal verification and testing: An integrated approach to validating Ada programs
NASA Technical Reports Server (NTRS)
Cohen, Norman H.
1986-01-01
An integrated set of tools called a validation environment is proposed to support the validation of Ada programs by a combination of methods. A Modular Ada Validation Environment (MAVEN) is described which proposes a context in which formal verification can fit into the industrial development of Ada software.
Two-Level Verification of Data Integrity for Data Storage in Cloud Computing
NASA Astrophysics Data System (ADS)
Xu, Guangwei; Chen, Chunlin; Wang, Hongya; Zang, Zhuping; Pang, Mugen; Jiang, Ping
Data storage in cloud computing can save capital expenditure and relieve the burden of storage management for users. Because stored files may be lost or corrupted, many researchers focus on the verification of data integrity. However, massive numbers of users bring large numbers of verification tasks to the auditor. Moreover, users also need to pay an extra fee for these verification tasks beyond the storage fee. Therefore, we propose a two-level verification of data integrity to alleviate these problems. The key idea is that users routinely verify the data integrity themselves, while the auditor arbitrates challenges between the user and the cloud provider according to the MACs and ϕ values. The extensive performance simulations show that the proposed scheme obviously decreases the auditor's verification tasks and the ratio of wrong arbitrations.
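A hedged sketch of the two-level idea under simplifying assumptions (a single user-held HMAC key; the paper's ϕ values are omitted): routine checks stay with the user, and only a failed check generates an auditor task.

```python
# Two-level integrity verification sketch: the user recomputes MACs on
# retrieved blocks (level 1) and escalates only mismatches to the auditor
# (level 2). Key handling and the paper's phi values are simplified away.
import hmac, hashlib

KEY = b"user-secret-key"                    # assumed user-held MAC key

def mac(block: bytes) -> bytes:
    return hmac.new(KEY, block, hashlib.sha256).digest()

stored = {"blk1": b"payload-1", "blk2": b"payload-2"}
macs = {name: mac(data) for name, data in stored.items()}   # kept by the user

def user_level_check(name: str, retrieved: bytes) -> bool:
    return hmac.compare_digest(mac(retrieved), macs[name])

# Level 1: routine user verification; Level 2: escalate only on failure.
for name, data in [("blk1", b"payload-1"), ("blk2", b"tampered")]:
    if user_level_check(name, data):
        print(f"{name}: verified locally, no auditor task generated")
    else:
        print(f"{name}: mismatch -> raise challenge for auditor arbitration")
```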
Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G
2014-08-01
In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification was assessed using a laser tracking device. The accuracy of calibration and image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and the patient verification system motion was proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the optical tracking system (OTS) were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.
Development of a verification program for deployable truss advanced technology
NASA Technical Reports Server (NTRS)
Dyer, Jack E.
1988-01-01
Use of large deployable space structures to satisfy the growth demands of space systems is contingent upon reducing the associated risks that pervade many related technical disciplines. The overall objectives of this program were to develop a detailed plan to verify deployable truss advanced technology applicable to future large space structures and to develop a preliminary design of a deployable truss reflector/beam structure for use as a technology demonstration test article. The planning is based on a Shuttle flight experiment program using deployable 5- and 15-meter-aperture tetrahedral truss reflectors and a 20 m long deployable truss beam structure. The plan addresses validation of analytical methods, the degree to which ground testing adequately simulates flight, and in-space testing requirements for large precision antenna designs. Based on an assessment of future NASA and DOD space system requirements, the program was developed to verify four critical technology areas: deployment, shape accuracy and control, pointing and alignment, and articulation and maneuvers. The flight experiment technology verification objectives can be met using two Shuttle flights, with the total experiment integrated on a single Shuttle Test Experiment Platform (STEP) and a Mission Peculiar Experiment Support Structure (MPESS). First flight of the experiment can be achieved 60 months after go-ahead, with a total program duration of 90 months.
A study of compositional verification based IMA integration method
NASA Astrophysics Data System (ADS)
Huang, Hui; Zhang, Guoquan; Xu, Wanmeng
2018-03-01
The rapid development of avionics systems is driving the application of integrated modular avionics (IMA) systems. While IMA improves avionics system integration, it also increases the complexity of system test, so the IMA system test method needs to be simplified. An IMA system provides a module platform that runs multiple applications and shares processing resources. Compared with a federated avionics system, an IMA system makes it difficult to isolate failures. Therefore, IMA system verification faces the critical problem of how to test resources shared by multiple applications. For a simple avionics system, traditional test methods can readily exercise the whole system, but for a complex system it is hard to test a huge, integrated avionics system exhaustively. This paper therefore proposes applying compositional-verification theory to IMA system test, reducing the test process and improving efficiency, and consequently economizing the costs of IMA system integration.
Robertson, Scott
2014-11-01
Analog gravity experiments make feasible the realization of black hole space-times in a laboratory setting and the observational verification of Hawking radiation. Since such analog systems are typically dominated by dispersion, efficient techniques for calculating the predicted Hawking spectrum in the presence of strong dispersion are required. In the preceding paper, an integral method in Fourier space is proposed for stationary 1+1-dimensional backgrounds which are asymptotically symmetric. Here, this method is generalized to backgrounds which are different in the asymptotic regions to the left and right of the scattering region.
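For orientation, the benchmark against which dispersive Hawking spectra are usually compared is the standard dispersionless result, stated below; this is textbook background, not the paper's Fourier-space integral method:

```latex
% Dispersionless benchmark for analogue Hawking radiation: a horizon at
% x_h, where the flow speed v(x) crosses the wave speed c(x), emits a
% thermal spectrum at the analogue Hawking temperature T_H.
\[
  k_B T_H = \frac{\hbar}{2\pi}
  \left| \frac{d\,(v - c)}{dx} \right|_{x_{\mathrm{h}}},
  \qquad
  \bar{n}(\omega) = \frac{1}{e^{\hbar \omega / k_B T_H} - 1}.
\]
```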
24 CFR 3286.207 - Process for obtaining installation license.
Code of Federal Regulations, 2010 CFR
2010-04-01
... installation license must submit verification of the experience required in § 3286.205(a). This verification may be in the form of statements by past or present employers or a self-certification that the applicant meets those experience requirements, but HUD may contact the applicant for additional verification...
24 CFR 3286.307 - Process for obtaining trainer's qualification.
Code of Federal Regulations, 2010 CFR
2010-04-01
... verification of the experience required in § 3286.305. This verification may be in the form of statements by past or present employers or a self-certification that the applicant meets those experience requirements, but HUD may contact the applicant for additional verification at any time. The applicant must...
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling...input-output contract satisfaction and absence of null pointer dereferences. Subject terms: Formal Methods, Software Verification, Model-Based...Domain-specific languages (DSLs) drive both implementation and formal verification
Study of techniques for redundancy verification without disrupting systems, phases 1-3
NASA Technical Reports Server (NTRS)
1970-01-01
The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.
2007-03-01
Characterisation. In Nanotechnology Aerospace Applications – 2006 (pp. 4-1 – 4-8). Educational Notes RTO-EN-AVT-129bis, Paper 4. Neuilly-sur-Seine, France: RTO. [Slide residue: commercialisation process stages (Concept, IDEA, Proof-of-Principle, Trial Samples, Engineering Verification Samples, Design Verification Samples) and a systems-engineering-for-commercialisation (SEIC) diagram linking design houses, engineering and R&D, users and integrators, and wafer-processing fabs.]
2009-01-01
Background The study of biological networks has led to the development of increasingly large and detailed models. Computer tools are essential for the simulation of the dynamical behavior of the networks from the model. However, as the size of the models grows, it becomes infeasible to manually verify the predictions against experimental data or identify interesting features in a large number of simulation traces. Formal verification based on temporal logic and model checking provides promising methods to automate and scale the analysis of the models. However, a framework that tightly integrates modeling and simulation tools with model checkers is currently missing, on both the conceptual and the implementational level. Results We have developed a generic and modular web service, based on a service-oriented architecture, for integrating the modeling and formal verification of genetic regulatory networks. The architecture has been implemented in the context of the qualitative modeling and simulation tool GNA and the model checkers NUSMV and CADP. GNA has been extended with a verification module for the specification and checking of biological properties. The verification module also allows the display and visual inspection of the verification results. Conclusions The practical use of the proposed web service is illustrated by means of a scenario involving the analysis of a qualitative model of the carbon starvation response in E. coli. The service-oriented architecture allows modelers to define the model and proceed with the specification and formal verification of the biological properties by means of a unified graphical user interface. This guarantees a transparent access to formal verification technology for modelers of genetic regulatory networks. PMID:20042075
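As a toy illustration of the kind of temporal-logic check such a framework automates, the sketch below runs a backward reachability fixpoint (an EF operator) over a hand-written qualitative state graph. It stands in for, and is far simpler than, the GNA/NUSMV/CADP tool chain described above; the network and property are invented.

```python
# Toy explicit-state model checking over a qualitative network's state graph:
# compute the states satisfying EF(goal) ("goal is reachable") by a backward
# fixpoint, then check an AG-style property. Illustrative only.

# States are tuples of qualitative gene levels; edges are possible transitions.
states = {(0, 0), (0, 1), (1, 0), (1, 1)}
edges = {
    (0, 0): {(0, 1)},   # gene B switches on
    (0, 1): {(1, 1)},   # gene A follows
    (1, 1): {(1, 0)},   # B decays under A's repression
    (1, 0): {(1, 0)},   # steady state
}

def ef(goal):
    """States satisfying EF(goal): least fixpoint of goal ∪ pre(sat)."""
    sat = set(goal)
    changed = True
    while changed:
        changed = False
        for s in states - sat:
            if edges.get(s, set()) & sat:   # s has a successor in sat
                sat.add(s)
                changed = True
    return sat

response_state = {(1, 0)}                # e.g. "A high, B off"
reachable_from = ef(response_state)
print((0, 0) in reachable_from)          # True: the response is reachable
print(states <= reachable_from)          # AG(EF(goal)) holds on this graph
```

A real model checker performs the same fixpoint computations symbolically, which is what makes the approach scale to large regulatory networks.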
Analytical Verifications in Cryogenic Testing of NGST Advanced Mirror System Demonstrators
NASA Technical Reports Server (NTRS)
Cummings, Ramona; Levine, Marie; VanBuren, Dave; Kegley, Jeff; Green, Joseph; Hadaway, James; Presson, Joan; Cline, Todd; Stahl, H. Philip (Technical Monitor)
2002-01-01
Ground based testing is a critical and costly part of component, assembly, and system verifications of large space telescopes. At such tests, however, with integral teamwork by planners, analysts, and test personnel, segments can be included to validate specific analytical parameters and algorithms at relatively low additional cost. This paper opens with the strategy of analytical verification segments added to vacuum cryogenic testing of Advanced Mirror System Demonstrator (AMSD) assemblies. These AMSD assemblies incorporate material and architecture concepts being considered in the Next Generation Space Telescope (NGST) design. The test segments for workmanship testing, cold survivability, and cold operation optical throughput are supplemented by segments for analytical verifications of specific structural, thermal, and optical parameters. Utilizing integrated modeling and separate materials testing, the paper continues with the support plan for analyses, data, and observation requirements during the AMSD testing, currently slated for late calendar year 2002 to mid calendar year 2003. The paper includes anomaly resolution as gleaned by the authors from similar analytical verification support of a previous large space telescope, then closes with a draft of plans for parameter extrapolations, to form a well-verified portion of the integrated modeling being done for NGST performance predictions.
Quantum integrability and functional equations
NASA Astrophysics Data System (ADS)
Volin, Dmytro
2010-03-01
In this thesis a general procedure to represent the integral Bethe Ansatz equations in the form of a Riemann-Hilbert problem is given. This allows us to study integrable spin chains in the thermodynamic limit in a simple way. Based on the functional equations, we give a procedure for finding the subleading orders in the solution of various integral equations solved to leading order by the Wiener-Hopf technique. The integral equations are studied in the context of the AdS/CFT correspondence, where their solution allows verification of the integrability conjecture up to two loops of the strong coupling expansion. In the context of two-dimensional sigma models we analyze the large-order behavior of the asymptotic perturbative expansion. The experience obtained with the functional representation of the integral equations also allowed us to solve explicitly the crossing equations that appear in the AdS/CFT spectral problem.
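For orientation only, the generic shape of the Wiener-Hopf step mentioned above is sketched below at textbook level; the thesis's actual kernels and equations are not reproduced here.

```latex
% Generic Wiener--Hopf step for an integral equation on a half-line
% (textbook outline; not the thesis's specific equations).
\begin{aligned}
& \text{Given } \int_{0}^{\infty} K(x-y)\, f(y)\, dy = g(x), \quad x > 0,\\
& \text{Fourier transform and factorize the kernel symbol}\\
& \qquad \hat{K}(\omega) = \hat{K}_{+}(\omega)\, \hat{K}_{-}(\omega),\\
& \text{with } \hat{K}_{\pm} \text{ analytic and nonvanishing in the upper/lower half-planes.}\\
& \text{The leading-order solution is assembled from } \hat{K}_{\pm}^{-1};\\
& \text{subleading orders arise as corrections to this factorized form.}
\end{aligned}
```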
ATM test and integration. [Skylab Apollo Telescope Mount
NASA Technical Reports Server (NTRS)
Moore, J. W.; Mitchell, J. R.
1974-01-01
The test and checkout philosophy of the test program for the Skylab ATM module and the overall test flow including in-process, post-manufacturing, vibration, thermal vacuum, and prelaunch checkout activities are described. Capabilities and limitations of the test complex and its use of automation are discussed. Experiences with the organizational principle of using a dedicated test team for all checkout activities are reported. Material on the development of the ATM subsystems, the experimental program and the requirements of the scientific community, and the integration and verification of the complex systems/subsystems of the ATM are presented. The performance of the ATM test program in such areas as alignment, systems and subsystems, contamination control, and experiment operation is evaluated. The conclusions and recommendations resulting from the ATM test program are enumerated.
2015-03-05
launched on its rocket - estimated completion date of May 2015. Air Force will require verification that SpaceX can meet payload integration...design and accelerate integration capability at Space Exploration Technologies Corporation (SpaceX) launch sites. The Air Force does not intend to...accelerate integration capabilities at SpaceX launch sites because of the studies it directed, but will require verification that SpaceX can meet
Mathematical Modeling of Ni/H2 and Li-Ion Batteries
NASA Technical Reports Server (NTRS)
Weidner, John W.; White, Ralph E.; Dougal, Roger A.
2001-01-01
The modeling effort outlined in this viewgraph presentation encompasses the following topics: 1) Electrochemical Deposition of Nickel Hydroxide; 2) Deposition rates of thin films; 3) Impregnation of porous electrodes; 4) Experimental Characterization of Nickel Hydroxide; 5) Diffusion coefficients of protons; 6) Self-discharge rates (i.e., oxygen-evolution kinetics); 7) Hysteresis between charge and discharge; 8) Capacity loss on cycling; 9) Experimental Verification of the Ni/H2 Battery Model; 10) Mathematical Modeling of Li-Ion Batteries; 11) Experimental Verification of the Li-Ion Battery Model; 12) Integrated Power System Models for Satellites; and 13) Experimental Verification of the Integrated-Systems Model.
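As a minimal illustration of how a proton diffusion coefficient enters such a model, the sketch below integrates 1-D Fickian diffusion through a thin film with an explicit finite-difference scheme. The parameter values are assumed for illustration and are not taken from the presentation.

```python
# Minimal 1-D Fickian diffusion sketch for proton transport in a thin
# nickel-hydroxide film (illustrative values; not the authors' model).
import numpy as np

D = 1e-10        # proton diffusion coefficient, cm^2/s (assumed value)
L = 1e-4         # film thickness, cm
N = 50           # grid points
dx = L / (N - 1)
dt = 0.4 * dx**2 / D          # explicit-scheme stability: dt <= dx^2/(2D)

c = np.zeros(N)               # normalized concentration, initially discharged
c[0] = 1.0                    # fixed concentration at the electrolyte interface

for _ in range(20000):        # march the explicit FTCS stencil in time
    c[1:-1] += D * dt / dx**2 * (c[2:] - 2 * c[1:-1] + c[:-2])
    c[0] = 1.0                # Dirichlet boundary at the interface
    c[-1] = c[-2]             # zero-flux boundary at the current collector

print(f"film utilization after {20000 * dt:.1f} s: {c.mean():.3f}")
```

Fitting the simulated concentration response to measured charge/discharge transients is one standard way such diffusion coefficients are extracted experimentally.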
NASA Technical Reports Server (NTRS)
Koga, Dennis (Technical Monitor); Penix, John; Markosian, Lawrence Z.; OMalley, Owen; Brew, William A.
2003-01-01
Attempts to achieve widespread use of software verification tools have been notably unsuccessful. Even 'straightforward', classic, and potentially effective verification tools such as lint-like tools face limits on their acceptance. These limits are imposed by the expertise required to apply the tools and interpret the results, the high false positive rate of many verification tools, and the need to integrate the tools into development environments. The barriers are even greater for more complex advanced technologies such as model checking. Web-hosted services for advanced verification technologies may mitigate these problems by centralizing tool expertise. The possible benefits of this approach include eliminating the need for software developer expertise in tool application and results filtering, and improving integration with other development tools.
Analytical and Experimental Verification of a Flight Article for a Mach-8 Boundary-Layer Experiment
NASA Technical Reports Server (NTRS)
Richards, W. Lance; Monaghan, Richard C.
1996-01-01
Preparations for a boundary-layer transition experiment to be conducted on a future flight mission of the air-launched Pegasus(TM) rocket are underway. The experiment requires a flight-test article called a glove to be attached to the wing of the Mach-8 first-stage booster. A three-dimensional, nonlinear finite-element analysis has been performed and significant small-scale laboratory testing has been accomplished to ensure the glove design integrity and quality of the experiment. Reliance on both the analysis and experiment activities has been instrumental in the success of the flight-article design. Results obtained from the structural analysis and laboratory testing show that all glove components are well within the allowable thermal stress and deformation requirements to satisfy the experiment objectives.
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
Formal Verification of the AAMP-FV Microcode
NASA Technical Reports Server (NTRS)
Miller, Steven P.; Greve, David A.; Wilding, Matthew M.; Srivas, Mandayam
1999-01-01
This report describes the experiences of Collins Avionics & Communications and SRI International in formally specifying and verifying the microcode in a Rockwell proprietary microprocessor, the AAMP-FV, using the PVS verification system. This project built extensively on earlier experiences using PVS to verify the microcode in the AAMP5, a complex, pipelined microprocessor designed for use in avionics displays and global positioning systems. While the AAMP5 experiment demonstrated the technical feasibility of formal verification of microcode, the steep learning curve encountered left unanswered the question of whether it could be performed at reasonable cost. The AAMP-FV project was conducted to determine whether the experience gained on the AAMP5 project could be used to make formal verification of microcode cost effective for safety-critical and high volume devices.
2001-02-03
An overhead crane lowers the Multi-Purpose Logistics Module Donatello onto a workstand. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004
Cassini's Test Methodology for Flight Software Verification and Operations
NASA Technical Reports Server (NTRS)
Wang, Eric; Brown, Jay
2007-01-01
The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). Development of the AACS Flight Software (FSW) has been an ongoing effort, from design and development through operations. As planned, major modifications to certain FSW functions were designed, tested, verified, and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).
Hypersonic CFD applications for the National Aero-Space Plane
NASA Technical Reports Server (NTRS)
Richardson, Pamela F.; Mcclinton, Charles R.; Bittner, Robert D.; Dilley, A. Douglas; Edwards, Kelvin W.
1989-01-01
Design and analysis of the NASP depends heavily upon developing the critical technology areas that cover the entire engineering design of the vehicle. These areas include materials, structures, propulsion systems, propellants, integration of airframe and propulsion systems, controls, subsystems, and aerodynamics. Currently, verification of many of the classical engineering tools relies heavily on computational fluid dynamics. Advances are being made in the development of CFD codes to accomplish nose-to-tail analyses for hypersonic aircraft. Additional details involving the partial development, analysis, verification, and application of the CFL3D code and the SPARK combustor code are discussed. A nonequilibrium version of CFL3D that is presently being developed and tested is also described. Examples of calculations for research hypersonic aircraft geometries are given, and comparisons with experimental data show good agreement.
A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events
NASA Astrophysics Data System (ADS)
Kholodovsky, V.
2017-12-01
Extreme weather and climate events such as heavy precipitation, heat waves, and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often used independently to model these phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, lack an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) thresholds clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly, and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
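A hypothetical sketch of the three-step pipeline is given below, with off-the-shelf stand-ins (PCA for dimension reduction, exceedance-area profiles for domain mapping, k-means for threshold clustering). The step names come from the abstract; the implementation choices and data are assumptions.

```python
# Hypothetical sketch of the three-step thresholding pipeline: 1) dimension
# reduction, 2) geometric domain mapping, 3) thresholds clustering. The
# implementations are stand-ins, not the paper's specific methods.
import numpy as np
from sklearn.decomposition import PCA
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
precip = rng.gamma(shape=0.5, scale=8.0, size=(365, 40, 50))  # synthetic field

# 1) dimension reduction: PCA on the flattened daily fields
pca = PCA(n_components=5)
components = pca.fit_transform(precip.reshape(365, -1))
print("leading PCs explain", round(pca.explained_variance_ratio_.sum(), 2))

# 2) geometric domain mapping stand-in: summarize each day's exceedance area
#    over a sweep of candidate high thresholds (90th to 99.5th percentiles)
candidates = np.percentile(precip, np.linspace(90, 99.5, 20))
areas = np.array([[(day > t).mean() for t in candidates] for day in precip])

# 3) thresholds clustering: group candidate thresholds by the geometry
#    (exceedance-area profile) they induce, then pick one per cluster
km = KMeans(n_clusters=3, n_init=10, random_state=0).fit(areas.T)
for label in range(3):
    members = candidates[km.labels_ == label]
    print(f"cluster {label}: thresholds {members.min():.1f}-{members.max():.1f} mm")
```

Each cluster then offers a representative high threshold tied to a distinct exceedance geometry, which is the flexibility the abstract describes.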
DOE Office of Scientific and Technical Information (OSTI.GOV)
Aronson, A.L.; Gordon, D.M.
In April 1996, the United States (US) added the Portsmouth Gaseous Diffusion Plant to the list of facilities eligible for the application of International Atomic Energy Agency (IAEA) safeguards. At that time, the US proposed that the IAEA carry out a ''verification experiment'' at the plant with respect to downblending of about 13 metric tons of highly enriched uranium (HEU) in the form of uranium hexafluoride (UF6). During the period December 1997 through July 1998, the IAEA carried out the requested verification experiment. The verification approach used for this experiment included, among other measures, the entry of process-operational data by the facility operator on a near-real-time basis into a ''mailbox'' computer located within a tamper-indicating enclosure sealed by the IAEA.
NASA Technical Reports Server (NTRS)
Huff, H.; You, Z.; Williams, T.; Nichols, T.; Attia, J.; Fogarty, T. N.; Kirby, K.; Wilkins, R.; Lawton, R.
1998-01-01
As integrated circuits become more sensitive to charged particles and neutrons, anomalous performance due to single event effects (SEE) is a concern and requires experimental verification and quantification. The Center for Applied Radiation Research (CARR) at Prairie View A&M University has developed experiments as a participant in the NASA ER-2 Flight Program, the APEX balloon flight program, and the Student Launch Program. Other high altitude and ground level experiments of interest to DoD and commercial applications are being developed. The experiment characterizes the SEE behavior of high-speed and high-density SRAMs. The system includes a PC-104 computer unit, an optical drive for storage, a test board with the components under test, and a latchup detection and reset unit. The test program will continuously monitor the stored checkerboard data pattern in the SRAMs and record errors. Since both the computer and the optical drive contain integrated circuits, they are also vulnerable to radiation effects. A latchup detection unit with discrete components will monitor the test program and reset the system when necessary. The first results will be obtained from the NASA ER-2 flights, which are now planned to take place in early 1998 from Dryden Research Center in California. The series of flights, at altitudes up to 70,000 feet, and a variety of flight profiles should yield a distribution of conditions for correlating SEEs. SEE measurements will be performed from the time of aircraft power-up on the ground throughout the flight regime until systems power-off after landing.
NASA Technical Reports Server (NTRS)
1999-01-01
This is the Performance Verification Report, METSAT (S/N 109) AMSU-A1 Receiver Assemblies, P/N 1356429-1 S/N F06 and P/N 1356409 S/N F06, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A).
Application of a Near-Field Water Quality Model.
1979-07-01
[Report front-matter residue: table-of-contents entries on verification (centerline temperature decrease, lateral variation of constituents, variation of plume width, general remarks on verification) and results of varying the entrainment and other coefficients.] Assumed profile forms along the plume axis are integrated within the basic conservation equations; this integration reduces the problem to a one-dimensional formulation.
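The sketch below integrates a generic entrainment-hypothesis plume model of this one-dimensional kind (top-hat profiles, classic Morton-Taylor-Turner closure). It illustrates how the profile integration reduces the problem to ODEs along the plume axis; it is not the report's exact near-field formulation, and all values are illustrative.

```python
# Generic integral plume model with the classic entrainment hypothesis
# (top-hat profiles, Morton-Taylor-Turner closure). Illustrative only.
import numpy as np
from scipy.integrate import solve_ivp

alpha = 0.08          # entrainment coefficient (typical assumed value)

def rhs(s, y):
    # y = [Q, M, F]: volume, momentum, and buoyancy fluxes along the axis s,
    # with top-hat radius b = Q/sqrt(M) and velocity w = M/Q.
    Q, M, F = y
    b = Q / np.sqrt(M)
    w = M / Q
    dQ = 2.0 * alpha * b * w      # entrainment of ambient fluid
    dM = F * Q / M                # buoyancy accelerates the plume
    dF = 0.0                      # uniform ambient: buoyancy flux conserved
    return [dQ, dM, dF]

y0 = [0.01, 0.02, 0.001]          # source fluxes (illustrative)
sol = solve_ivp(rhs, (0.0, 10.0), y0)
Q, M, F = sol.y[:, -1]
print(f"at s=10: radius b={Q / np.sqrt(M):.3f}, velocity w={M / Q:.3f}")
```

Varying `alpha`, as the report's sensitivity sections do for the entrainment coefficient, directly changes the predicted plume spreading and centerline decay.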
Advanced turboprop testbed systems study
NASA Technical Reports Server (NTRS)
Goldsmith, I. M.
1982-01-01
The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having a successfully expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor and modification of existing engine and propeller controls are adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.
PARALLEL MEASUREMENT AND MODELING OF TRANSPORT IN THE DARHT II BEAMLINE ON ETA II
DOE Office of Scientific and Technical Information (OSTI.GOV)
Chambers, F W; Raymond, B A; Falabella, S
To successfully tune the DARHT II transport beamline requires the close coupling of a model of the beam transport and the measurement of the beam observables as the beam conditions and magnet settings are varied. For the ETA II experiment using the DARHT II beamline components, this was achieved using the SUICIDE (Simple User Interface Connecting to an Integrated Data Environment) data analysis environment and the FITS (Fully Integrated Transport Simulation) model. The SUICIDE environment has direct access to the experimental beam transport data at acquisition and the FITS predictions of the transport for immediate comparison. The FITS model is coupled into the control system, where it can read magnet current settings for real-time modeling. We find this integrated coupling is essential for model verification and the successful development of a tuning aid for efficient convergence on a usable tune. We show real-time comparisons of simulation and experiment and explore the successes and limitations of this close-coupled approach.
NASA Technical Reports Server (NTRS)
Defeo, P.; Doane, D.; Saito, J.
1982-01-01
A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.
PHM for Ground Support Systems Case Study: From Requirements to Integration
NASA Technical Reports Server (NTRS)
Teubert, Chris
2015-01-01
This session will detail the experience of members of the NASA Ames Prognostic Center of Excellence (PCoE) producing PHM tools for NASA Advanced Ground Support Systems, including the challenges in applying their research in a production environment. Specifically, we will 1) go over the systems engineering and review process used; 2) discuss the challenges and pitfalls in this process; 3) discuss software architecting, documentation, and verification and validation activities; and 4) discuss challenges in communicating the benefits and limitations of PHM technologies.
Continuing Issues (FY 1979) Regarding DoD Use of the Space Transportation System.
1979-12-01
estimates) to the cost of launching experimental payloads in the sortie mode is the analytical verification of compatibility ("integration") of the experiment...with the Shuttle; the integration cost may be reduced by the Air Force by their proposed "class cargo" generalized integration analysis that, once...compactness and light weight (for a given experimental weight) rather than on intrinsic cost, to minimize launch costs as computed by the NASA volume and weight
Engineering within the assembly, verification, and integration (AIV) process in ALMA
NASA Astrophysics Data System (ADS)
Lopez, Bernhard; McMullin, Joseph P.; Whyborn, Nicholas D.; Duvall, Eugene
2010-07-01
The Atacama Large Millimeter/submillimeter Array (ALMA) is a joint project between astronomical organizations in Europe, North America, and East Asia, in collaboration with the Republic of Chile. ALMA will consist of at least 54 twelve-meter antennas and 12 seven-meter antennas operating as an interferometer in the millimeter and sub-millimeter wavelength range. It will be located at an altitude above 5000 m in the Chilean Atacama desert. As part of the ALMA construction phase, the Assembly, Verification and Integration (AIV) team receives antennas and instrumentation from Integrated Product Teams (IPTs), verifies that the sub-systems perform as expected, performs the assembly and integration of the scientific instrumentation, and verifies that functional and performance requirements are met. This paper describes the aspects related to the AIV Engineering team: its role within the 4-station AIV process, the different phases the group underwent, lessons learned, and potential areas for improvement. AIV Engineering initially focused on the preparation of the necessary site infrastructure for AIV activities, on the purchase of tools and equipment, and on the first ALMA system installations. With the first antennas arriving on site, the team started to gather experience with AIV Station 1 beacon holography measurements for the assessment of the overall antenna surface quality, and with optical pointing to confirm the antenna pointing and tracking capabilities. With the arrival of the first receiver, AIV Station 2 was developed; it focuses on the installation of electrical and cryogenic systems and incrementally establishes the full connectivity of the antenna as an observing platform. Further antenna deliveries then allowed the team to refine the related procedures, develop staff expertise, and transition towards a more routine production process. Stations 3 and 4 deal with verification of the antenna with integrated electronics by the AIV Science Team and are not covered directly in this paper. Both continuous improvement and the clear definition of the AIV 4-station model are believed to have been key factors in bringing the antennas into a state characterized well enough to smoothly start commissioning activities.
An Integrated Environment for Efficient Formal Design and Verification
NASA Technical Reports Server (NTRS)
1998-01-01
The general goal of this project was to improve the practicality of formal methods by combining techniques from model checking and theorem proving. At the time the project was proposed, the model checking and theorem proving communities were applying different tools to similar problems, but there was not much cross-fertilization. This project involved a group from SRI that had substantial experience in the development and application of theorem-proving technology, and a group at Stanford that specialized in model checking techniques. Now, over five years after the proposal was submitted, there are many research groups working on combining theorem-proving and model checking techniques, and much more communication between the model checking and theorem proving research communities. This project contributed significantly to this research trend. The research work under this project covered a variety of topics: new theory and algorithms; prototype tools; verification methodology; and applications to problems in particular domains.
A New Integrated Weighted Model in SNOW-V10: Verification of Categorical Variables
NASA Astrophysics Data System (ADS)
Huang, Laura X.; Isaac, George A.; Sheng, Grant
2014-01-01
This paper presents the verification results for nowcasts of seven categorical variables from an integrated weighted model (INTW) and the underlying numerical weather prediction (NWP) models. Nowcasting, or short range forecasting (0-6 h), over complex terrain with sufficient accuracy is highly desirable but a very challenging task. A weighting, evaluation, bias correction and integration system (WEBIS) for generating nowcasts by integrating NWP forecasts and high frequency observations was used during the Vancouver 2010 Olympic and Paralympic Winter Games as part of the Science of Nowcasting Olympic Weather for Vancouver 2010 (SNOW-V10) project. Forecast data from the Canadian high-resolution deterministic NWP system with three nested grids (at 15-, 2.5- and 1-km horizontal grid-spacing) were selected as the background gridded data for generating the integrated nowcasts. The seven forecast variables (temperature, relative humidity, wind speed, wind gust, visibility, ceiling, and precipitation rate) are treated as categorical variables for verifying the integrated weighted forecasts. By analyzing the verification of forecasts from INTW and the NWP models at 15 sites, the integrated weighted model was found to produce more accurate forecasts for the seven selected variables, regardless of location. This is based on the multi-categorical Heidke skill scores for the test period 12 February to 21 March 2010.
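For reference, a multi-categorical Heidke skill score of the kind used in this verification can be computed from a forecast/observation contingency table as in the sketch below (the textbook formula, with made-up counts).

```python
# Multi-category Heidke skill score (HSS) from a forecast/observation
# contingency table: the generic textbook formula, with illustrative counts.
import numpy as np

# rows = forecast category, columns = observed category
table = np.array([[50, 10,  2],
                  [ 8, 60, 12],
                  [ 3,  9, 46]], dtype=float)

n = table.sum()
p_correct = np.trace(table) / n                      # observed hit fraction
p_chance = (table.sum(axis=1) * table.sum(axis=0)).sum() / n**2  # expected by chance
hss = (p_correct - p_chance) / (1.0 - p_chance)
print(f"HSS = {hss:.3f}")   # 1 = perfect, 0 = no skill beyond chance
```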
NASA Astrophysics Data System (ADS)
Valenziano, L.; Gregorio, A.; Butler, R. C.; Amiaux, J.; Bonoli, C.; Bortoletto, F.; Burigana, C.; Corcione, L.; Ealet, A.; Frailis, M.; Jahnke, K.; Ligori, S.; Maiorano, E.; Morgante, G.; Nicastro, L.; Pasian, F.; Riva, M.; Scaramella, R.; Schiavone, F.; Tavagnacco, D.; Toledo-Moreo, R.; Trifoglio, M.; Zacchei, A.; Zerbi, F. M.; Maciaszek, T.
2012-09-01
Euclid is the future ESA mission, mainly devoted to Cosmology. Like WMAP and Planck, it is a survey mission, to be launched in 2019 and injected in orbit far away from the Earth, for a nominal lifetime of 7 years. Euclid has two instruments on-board, the Visible Imager (VIS) and the Near- Infrared Spectro-Photometer (NISP). The NISP instrument includes cryogenic mechanisms, active thermal control, high-performance Data Processing Unit and requires periodic in-flight calibrations and instrument parameters monitoring. To fully exploit the capability of the NISP, a careful control of systematic effects is required. From previous experiments, we have built the concept of an integrated instrument development and verification approach, where the scientific, instrument and ground-segment expertise have strong interactions from the early phases of the project. In particular, we discuss the strong integration of test and calibration activities with the Ground Segment, starting from early pre-launch verification activities. We want to report here the expertise acquired by the Euclid team in previous missions, only citing the literature for detailed reference, and indicate how it is applied in the Euclid mission framework.
Experimental Verification of Boyle's Law and the Ideal Gas Law
ERIC Educational Resources Information Center
Ivanov, Dragia Trifonov
2007-01-01
Two new experiments are offered concerning the experimental verification of Boyle's law and the ideal gas law. To carry out the experiments, glass tubes, water, a syringe and a metal manometer are used. The pressure of the saturated water vapour is taken into consideration. For educational purposes, the experiments are characterized by their…
Massouras, Andreas; Decouttere, Frederik; Hens, Korneel; Deplancke, Bart
2010-07-01
High-throughput sequencing (HTS) is revolutionizing our ability to obtain cheap, fast and reliable sequence information. Many experimental approaches are expected to benefit from the incorporation of such sequencing features in their pipeline. Consequently, software tools that facilitate such an incorporation should be of great interest. In this context, we developed WebPrInSeS, a web server tool allowing automated full-length clone sequence identification and verification using HTS data. WebPrInSeS encompasses two separate software applications. The first is WebPrInSeS-C which performs automated sequence verification of user-defined open-reading frame (ORF) clone libraries. The second is WebPrInSeS-E, which identifies positive hits in cDNA or ORF-based library screening experiments such as yeast one- or two-hybrid assays. Both tools perform de novo assembly using HTS data from any of the three major sequencing platforms. Thus, WebPrInSeS provides a highly integrated, cost-effective and efficient way to sequence-verify or identify clones of interest. WebPrInSeS is available at http://webprinses.epfl.ch/ and is open to all users.
Overhearers Use Addressee Backchannels in Dialog Comprehension.
Tolins, Jackson; Fox Tree, Jean E
2016-08-01
Observing others in conversation is a common format for comprehending language, yet little work has been done to understand dialog comprehension. We tested whether overhearers use addressee backchannels as predictive cues for how to integrate information across speaker turns during comprehension of spontaneously produced collaborative narration. In Experiment 1, words that followed specific backchannels (e.g., really, oh) were recognized more slowly than words that followed either generic backchannels (e.g., uh huh, mhm) or pauses. In Experiment 2, we found that when the turn after the backchannel was a continuation of the narrative, specific backchannels prompted the fastest verification of prior information. When the turn after was an elaboration, they prompted the slowest, indicating that overhearers took specific backchannels as cues to integrate preceding talk with subsequent talk. These findings demonstrate that overhearers capitalize on the predictive relationship between backchannels and the development of speakers' talk, coordinating information across conversational roles. Copyright © 2015 Cognitive Science Society, Inc.
NASA Technical Reports Server (NTRS)
Naumann, Robert J.
1995-01-01
This report covers the development of and results from three experiments that were flown in the Materials Science Glovebox on USML-1: Marangoni Convection in Closed Containers (MCCC), Double Float Zone (DFZ), and Fiber Pulling in Microgravity (FPM). The Glovebox provided a convenient, low cost method for doing simple 'try and see' experiments that could test new concepts or elucidate microgravity phenomena. Since the Glovebox provided essentially one (or possibly two) levels of confinement, many of the stringent verification and test requirements on the experiment apparatus could be relaxed and a streamlined test and verification plan for flight qualification could be implemented. Furthermore, the experiments were contained in their own carrying cases whose external configurations could be identified early in the integration sequence for stowage considerations, while delivery of the actual experiment apparatus could be postponed until only a few months before flight. This minimized the time fluids had to be contained and reduced the possibility of corrosive reactions that could ruin the experiment. In many respects, this exercise was as much about developing a simpler, cheaper way of doing crew-assisted science as it was about the actual scientific accomplishments of the individual experiments. The Marangoni Convection in Closed Containers experiment was designed to study the effects of a void space in a simulated Bridgman crystal growth configuration and to determine if surface tension driven convective flows that may result from thermal gradients along any free surfaces could affect the solidification process. The Fiber Pulling in Microgravity experiment sought to separate the role of gravity drainage from capillarity effects in the break-up of slender cylindrical liquid columns. The Stability of a Double Float Zone experiment explored the feasibility of a quasi-containerless process in which a solidifying material is suspended by two liquid bridges of its own melt.
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM-based audio-visual speaker verification system is described, and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT-based features extracted from the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM-based classifier. Fusion of the audio and video modalities for audio-visual speaker verification is compared with face verification and speaker verification systems alone. To improve the robustness of the multimodal biometric identity verification system, an audio-visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM-based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the newly developed PDAtabase within the scope of the SecurePhone project.
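A minimal sketch of the GMM verification score is given below: a test sample is accepted when the log-likelihood ratio between a client model and a background (world) model exceeds a threshold. The synthetic features and off-the-shelf sklearn models stand in for the DCT/speech features and the BECARS classifier; none of this is the paper's code.

```python
# Minimal sketch of GMM-based identity verification: score a test sample by
# the log-likelihood ratio between a client model and a background model.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(1)
client_train = rng.normal(loc=0.0, scale=1.0, size=(500, 12))   # client features
world_train = rng.normal(loc=1.5, scale=2.0, size=(5000, 12))   # background

client_gmm = GaussianMixture(n_components=8, random_state=0).fit(client_train)
world_gmm = GaussianMixture(n_components=8, random_state=0).fit(world_train)

def verify(features, threshold=0.0):
    """Accept when the average log-likelihood ratio exceeds the threshold."""
    llr = client_gmm.score(features) - world_gmm.score(features)
    return llr > threshold, llr

accepted, score = verify(rng.normal(0.0, 1.0, size=(100, 12)))  # genuine trial
print(f"genuine trial: accepted={accepted}, LLR={score:.2f}")
accepted, score = verify(rng.normal(1.5, 2.0, size=(100, 12)))  # impostor trial
print(f"impostor trial: accepted={accepted}, LLR={score:.2f}")
```

Fusing modalities then amounts to combining the audio and visual LLRs (e.g., a weighted sum) before applying the acceptance threshold.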
42 CFR 455.20 - Recipient verification procedure.
Code of Federal Regulations, 2010 CFR
2010-10-01
... 42 Public Health 4 2010-10-01 2010-10-01 false Recipient verification procedure. 455.20 Section 455.20 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) MEDICAL ASSISTANCE PROGRAMS PROGRAM INTEGRITY: MEDICAID Medicaid Agency Fraud Detection and...
33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).
Code of Federal Regulations, 2013 CFR
2013-07-01
... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...
33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).
Code of Federal Regulations, 2014 CFR
2014-07-01
... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...
33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).
Code of Federal Regulations, 2012 CFR
2012-07-01
... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...
33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).
Code of Federal Regulations, 2011 CFR
2011-07-01
... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...
Ada(R) Test and Verification System (ATVS)
NASA Technical Reports Server (NTRS)
Strelich, Tom
1986-01-01
The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.
HDL to verification logic translator
NASA Technical Reports Server (NTRS)
Gambles, J. W.; Windley, P. J.
1992-01-01
The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
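The flavor of such a translation can be sketched as follows: each gate in a structural netlist becomes a relation over its wires, and internal wires are existentially quantified, HOL-style. This toy Python rendering illustrates the idea only; it is not the paper's translator, and the netlist is invented.

```python
# Toy illustration of the translation idea: a structural netlist becomes a
# relational predicate, where each gate is a relation over its ports and
# internal wires are existentially quantified (a finite disjunction over
# booleans). My own sketch, not the paper's HDL-to-HOL translator.

def nand(a, b, out):        # each primitive is a relation over its ports
    return out == (not (a and b))

def xor_structure(a, b, out):
    # "exists w1 w2 w3. nand(a,b,w1) /\ nand(a,w1,w2) /\ nand(b,w1,w3)
    #  /\ nand(w2,w3,out)" -- enumerate the internal wires.
    return any(
        nand(a, b, w1) and nand(a, w1, w2) and
        nand(b, w1, w3) and nand(w2, w3, out)
        for w1 in (False, True)
        for w2 in (False, True)
        for w3 in (False, True)
    )

def xor_spec(a, b, out):    # behavioral specification
    return out == (a != b)

# Verification: the structure implements the specification on all valuations.
assert all(xor_structure(a, b, out) == xor_spec(a, b, out)
           for a in (False, True) for b in (False, True)
           for out in (False, True))
print("structure == specification for all port valuations")
```

In an actual HOL system the same definitions are terms in the logic, so the equivalence is proved once and for all rather than enumerated.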
NASA Technical Reports Server (NTRS)
Platt, R.
1999-01-01
This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-1A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oler, Kiri J.; Miller, Carl H.
In this paper, we present a methodology for reverse engineering integrated circuits, including a mathematical verification of a scalable algorithm used to generate minimal finite state machine representations of integrated circuits.
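For orientation, minimal finite state machine representations are classically obtained by partition refinement; the sketch below implements textbook Moore-style refinement on a toy machine. It illustrates the technique named in the abstract and is not the paper's scalable algorithm or its mathematical verification.

```python
# Sketch of minimal FSM generation by partition refinement (Moore's
# algorithm). States with identical outputs start in the same block; blocks
# are split until no two members disagree on successor blocks.

def minimize(states, alphabet, delta, outputs):
    """delta[(s, a)] -> next state; outputs[s] -> observable output.
    Returns the partition of states into equivalence classes."""
    partition = {}
    for s in states:
        partition.setdefault(outputs[s], set()).add(s)
    blocks = list(partition.values())

    changed = True
    while changed:
        changed = False
        new_blocks = []
        for block in blocks:
            def signature(s):   # successor-block index per input symbol
                return tuple(
                    next(i for i, b in enumerate(blocks) if delta[(s, a)] in b)
                    for a in alphabet)
            groups = {}
            for s in block:
                groups.setdefault(signature(s), set()).add(s)
            new_blocks.extend(groups.values())
            if len(groups) > 1:
                changed = True
        blocks = new_blocks
    return blocks

# 4-state machine where s2 and s3 are equivalent but s0 and s1 are not.
states = ["s0", "s1", "s2", "s3"]
alphabet = ["0", "1"]
delta = {("s0","0"):"s1", ("s0","1"):"s2", ("s1","0"):"s1", ("s1","1"):"s0",
         ("s2","0"):"s2", ("s2","1"):"s3", ("s3","0"):"s3", ("s3","1"):"s2"}
outputs = {"s0": 0, "s1": 0, "s2": 1, "s3": 1}
print(minimize(states, alphabet, delta, outputs))
# -> [{'s0'}, {'s1'}, {'s2', 's3'}] up to ordering
```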
NASA Technical Reports Server (NTRS)
Sweeney, Christopher; Bunnell, John; Chung, William; Giovannetti, Dean; Mikula, Julie; Nicholson, Bob; Roscoe, Mike
2001-01-01
Joint Shipboard Helicopter Integration Process (JSHIP) is a Joint Test and Evaluation (JT&E) program sponsored by the Office of the Secretary of Defense (OSD). Under the JSHIP program is a simulation effort referred to as the Dynamic Interface Modeling and Simulation System (DIMSS). The purpose of DIMSS is to develop and test the processes and mechanisms that facilitate ship-helicopter interface testing via man-in-the-loop ground-based flight simulators. Specifically, the DIMSS charter is to develop an accredited process for using a flight simulator to determine the wind-over-the-deck (WOD) launch and recovery flight envelope for the UH-60A ship/helicopter combination. DIMSS is a collaborative effort between the NASA Ames Research Center and OSD. OSD determines the T&E and warfighter training requirements, provides the programmatics and dynamic interface T&E experience, and conducts ship/aircraft interface tests for validating the simulation. NASA provides the research and development element, the simulation facility, and simulation technical experience. This paper highlights the benefits of the NASA/JSHIP collaboration and details achievements of the project in terms of modeling and simulation. The Vertical Motion Simulator (VMS) at NASA Ames Research Center offers the capability to simulate a wide range of simulation cueing configurations, including visual, aural, and body-force cueing devices. The system flexibility enables switching configurations to allow back-to-back evaluation and comparison of different levels of cueing fidelity in determining minimum training requirements. The investigation required development and integration of several major simulation systems at the VMS. A new UH-60A Black Hawk interchangeable cab that provides an out-the-window (OTW) field-of-view (FOV) of 220 degrees in azimuth and 70 degrees in elevation was built. Modeling efforts involved integrating Computational Fluid Dynamics (CFD) generated data of an LHA ship airwake and integrating a real-time ship motion model developed from a batch model from the Naval Surface Warfare Center. Engineering development and integration of a three degrees-of-freedom (DOF) dynamic seat to simulate high-frequency, rotor-dynamics-dependent motion cues, for use in conjunction with the large motion system, was accomplished. An LHA visual model in several different levels of resolution and an aural cueing system in which three separate fidelity levels could be selected were developed. VMS also integrated a PC-based E&S simFUSION system to investigate cost-effective IG alternatives. The DIMSS project consists of three phases that follow an approved Validation, Verification and Accreditation (VV&A) process. The first phase will support the accreditation of the individual subsystems and models. The second will follow the verification and validation of the integrated subsystems and models, and will address fidelity requirements of the integrated models and subsystems. The third and final phase will allow the verification and validation of the full system integration. This VV&A process will address the utility of the simulated WOD launch and recovery envelope. Simulations supporting the first two stages have been completed, and the data are currently being reviewed and analyzed.
Beam Extinction Monitoring in the Mu2e Experiment
DOE Office of Scientific and Technical Information (OSTI.GOV)
Prebys, Eric; Bartoszek, Larry; Gaponenko, Andrei
The Mu2e Experiment at Fermilab will search for the conversion of a muon to an electron in the field of an atomic nucleus with unprecedented sensitivity. The experiment requires a beam consisting of proton bunches approximately 200 ns (full width) long, separated by 1.7 microseconds, with no out-of-time protons at the 10⁻¹⁰ fractional level. The verification of this level of extinction is very challenging. The proposed technique uses a special-purpose spectrometer which will observe particles scattered from the production target of the experiment. The acceptance will be limited such that there will be no saturation effects from the in-time beam. The precise level and profile of the out-of-time beam can then be built up statistically, by integrating over many bunches.
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
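To show what a Horn-clause verification condition looks like, the sketch below encodes a toy counting loop (x=0; while x<10: x+=1; assert x==10) as constrained Horn clauses and solves them with z3py's CHC ("HORN") solver. It mirrors the style of encoding described above, not SeaHorn's actual output.

```python
# A verification condition as constrained Horn clauses: the program is safe
# iff some interpretation of the unknown invariant Inv makes all three
# clauses valid. Solved here with Z3's CHC engine (Spacer).
from z3 import (Int, Function, IntSort, BoolSort,
                ForAll, Implies, And, Not, SolverFor)

x = Int("x")
Inv = Function("Inv", IntSort(), BoolSort())   # unknown loop invariant

s = SolverFor("HORN")
s.add(ForAll([x], Implies(x == 0, Inv(x))))                     # initiation
s.add(ForAll([x], Implies(And(Inv(x), x < 10), Inv(x + 1))))    # consecution
s.add(ForAll([x], Implies(And(Inv(x), Not(x < 10)), x == 10)))  # safety

print(s.check())   # sat: an invariant exists, so the assertion holds
print(s.model())   # the solver's synthesized interpretation of Inv
```

Because the clauses are a standard logical format, any CHC solver can consume them, which is exactly the interoperability benefit the paper attributes to Horn clauses as an intermediate language.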
Integrated testing and verification system for research flight software design document
NASA Technical Reports Server (NTRS)
Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.
1979-01-01
The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced lifecycle costs as foremost goals. Capabilities have been included in the design for static detection of data flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.
Two high accuracy digital integrators for Rogowski current transducers.
Luo, Pan-dian; Li, Hong-bin; Li, Zhen-hua
2014-01-01
The Rogowski current transducers have been widely used in AC current measurement, but their accuracy is mainly limited by the analog integrators, which have typical problems such as poor long-term stability and susceptibility to environmental conditions. Digital integrators can be another choice, but they cannot produce a stable and accurate output because the DC component in the original signal accumulates, leading to output DC drift; unknown initial conditions can also result in an integral output DC offset. This paper proposes two improved digital integrators for use in Rogowski current transducers, in place of traditional analog integrators, for high measuring accuracy. A proportional-integral-derivative (PID) feedback controller and an attenuation coefficient are applied in improving the Al-Alaoui integrator to change its DC response and obtain an ideal frequency response. Owing to this dedicated digital-signal-processing design, the improved digital integrators outperform analog integrators. Simulation models are built for verification and comparison. Experiments prove that the designed integrators achieve higher accuracy than analog integrators in steady-state response, transient response, and under changing temperature conditions.
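A sketch of the kind of drift-compensated digital integrator described is given below: an Al-Alaoui integrator (a 3/4 rectangular, 1/4 trapezoidal blend) whose unit pole is moved inside the unit circle by an attenuation coefficient, so DC offsets decay rather than accumulate. The coefficient values are illustrative, not the paper's tuned design, and the PID feedback branch is omitted.

```python
# Drift-compensated digital integration in the spirit described: Al-Alaoui
# integrator with a leakage (attenuation) coefficient bounding the DC gain.
import numpy as np

def leaky_al_alaoui(x, fs, leak=0.9999):
    """Integrate signal x sampled at fs; leak < 1 bounds the DC response."""
    T = 1.0 / fs
    y = np.zeros_like(x)
    for n in range(1, len(x)):
        # Al-Alaoui update y[n] = y[n-1] + (T/8)*(7*x[n] + x[n-1]),
        # with the unit pole moved to `leak` to suppress drift.
        y[n] = leak * y[n - 1] + (T / 8.0) * (7.0 * x[n] + x[n - 1])
    return y

fs = 100_000.0
t = np.arange(0.0, 0.1, 1.0 / fs)
di_dt = np.cos(2 * np.pi * 50 * t) + 0.01   # Rogowski-like signal + DC offset
i_est = leaky_al_alaoui(di_dt, fs)
# A pure integrator would ramp without bound on the 0.01 offset; here the
# DC gain is T/(1 - leak) = 0.1, so the offset contributes only ~0.001.
print(f"output range: {i_est.min():.4f} to {i_est.max():.4f}")
```

The trade-off is a small low-frequency phase and gain error from the shifted pole, which is what the paper's PID branch and tuned coefficients are meant to correct.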
2001-02-03
The lid is off the shipping container with the Multi-Purpose Logistics Module Donatello inside. It sits on a transporter inside the Space Station Processing Facility. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004
2001-02-03
Workers in the Space Station Processing Facility attach an overhead crane to the Multi-Purpose Logistics Module Donatello to lift it out of the shipping container. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004
2001-02-03
In the Space Station Processing Facility, workers help guide the overhead crane as it lifts the Multi-Purpose Logistics Module Donatello out of the shipping container. In the SSPF, Donatello will undergo processing by the payload test team, including integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle’s payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo. Donatello will be launched on mission STS-130, currently planned for September 2004
2001-02-03
In the Space Station Processing Facility, workers help guide the Multi-Purpose Logistics Module Donatello as it moves the length of the SSPF toward a workstand.
2001-02-03
In the Space Station Processing Facility, workers wait for the Multi-Purpose Logistics Module Donatello, suspended by an overhead crane, to move onto a workstand.
NASA Astrophysics Data System (ADS)
Kunii, Masaru; Saito, Kazuo; Seko, Hiromu; Hara, Masahiro; Hara, Tabito; Yamaguchi, Munehiko; Gong, Jiandong; Charron, Martin; Du, Jun; Wang, Yong; Chen, Dehui
2011-05-01
During the period around the Beijing 2008 Olympic Games, the Beijing 2008 Olympics Research and Development Project (B08RDP) was conducted as part of the World Weather Research Program short-range weather forecasting research project. Mesoscale ensemble prediction (MEP) experiments were carried out by six organizations in near-real time, in order to share their experiences in the development of MEP systems. The purpose of this study is to verify these experiments objectively and to clarify the problems of current MEP systems through this shared exercise. Verification was performed using the MEP outputs interpolated onto a common verification domain with a horizontal resolution of 15 km. For all systems, the ensemble spreads grew as the forecast time increased, and the ensemble mean reduced the forecast errors compared with the individual control forecasts when verified against the analysis fields. However, each system exhibited individual characteristics according to its MEP method. Some participants used physical perturbation methods, and the significance of these methods was confirmed by the verification. However, the mean error (ME) of the ensemble forecast in some systems was worse than that of the individual control forecast. This result suggests that careful attention must be paid to physical perturbations.
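As an illustration of the kind of grid-point verification described here, a sketch on synthetic arrays (the variable names, the scores chosen, and the synthetic data are assumptions, not the B08RDP scoring code):

```python
import numpy as np

def ensemble_scores(forecasts, control, analysis):
    """Basic ensemble verification against an analysis field.

    forecasts: (n_members, ny, nx); control, analysis: (ny, nx).
    Returns mean spread, ensemble-mean RMSE, ensemble-mean ME, control RMSE.
    """
    ens_mean = forecasts.mean(axis=0)
    spread = forecasts.std(axis=0, ddof=1).mean()           # mean ensemble spread
    rmse = lambda f: np.sqrt(np.mean((f - analysis) ** 2))  # root-mean-square error
    me = np.mean(ens_mean - analysis)                       # mean error (bias)
    return spread, rmse(ens_mean), me, rmse(control)

rng = np.random.default_rng(0)
truth = rng.standard_normal((60, 60))                       # stand-in analysis field
members = truth + 0.5 * rng.standard_normal((10, 60, 60))   # 10-member ensemble
ctrl = truth + 0.5 * rng.standard_normal((60, 60))          # control forecast
print(ensemble_scores(members, ctrl, truth))  # ensemble mean beats control on RMSE
```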
Study on verifying the angle measurement performance of the rotary-laser system
NASA Astrophysics Data System (ADS)
Zhao, Jin; Ren, Yongjie; Lin, Jiarui; Yin, Shibin; Zhu, Jigui
2018-04-01
A method to verify the angle measurement performance of the rotary-laser system was developed. Angle measurement performance has a large impact on measuring accuracy. Although there is previous research on verifying the angle-measuring uncertainty of the rotary-laser system, it still has limitations. High-precision reference angles are used in the method, and an integrated verification platform is set up to evaluate the performance of the system. This paper also examines the error with the biggest influence on the verification system. Some errors of the verification system are avoided experimentally; others are compensated through a computational formula and curve fitting. Experimental results show that the angle measurement performance meets the requirements for coordinate measurement, and the verification platform can efficiently evaluate the angle measurement uncertainty of the rotary-laser system.
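A sketch of the compensation-by-curve-fitting step (assuming the residual angle error is modeled as a low-order harmonic of the measured angle; the model form, order, and names are illustrative, not the paper's formula):

```python
import numpy as np

def fit_harmonic_error(theta, residual, order=2):
    """Least-squares fit of residual = c0 + sum_k (a_k sin(k*theta) + b_k cos(k*theta))."""
    cols = [np.ones_like(theta)]
    for k in range(1, order + 1):
        cols += [np.sin(k * theta), np.cos(k * theta)]
    A = np.column_stack(cols)
    coef, *_ = np.linalg.lstsq(A, residual, rcond=None)
    return coef, A @ coef              # coefficients and fitted error to subtract

theta = np.linspace(0, 2 * np.pi, 180, endpoint=False)
resid = 2e-5 * np.sin(theta) - 1e-5 * np.cos(2 * theta)   # synthetic residuals (rad)
coef, fitted = fit_harmonic_error(theta, resid)
compensated = resid - fitted           # near zero once the fit is removed
```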
2005-05-01
Fragment of a contents and figure list from a report on BondJo, a bonded-joint analysis code: verification cases compare adhesive stresses from BondJo, Ansys solid-model FEA, and Delale & Erdogan plate theory for homogeneous isotropic and orthotropic bonded joints (six examples from the Delale & Erdogan publication).
Apollo experience report: Guidance and control systems. Engineering simulation program
NASA Technical Reports Server (NTRS)
Gilbert, D. W.
1973-01-01
The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.
Software verification plan for GCS. [guidance and control software
NASA Technical Reports Server (NTRS)
Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.
1990-01-01
This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Considerations in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step-by-step description of the testing procedures, and discusses all of the tools used throughout the verification process.
Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk
NASA Technical Reports Server (NTRS)
Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.
2014-01-01
The projected impact on aviation safety risk of compositional verification research conducted by the National Aeronautics and Space Administration System-Wide Safety and Assurance Technologies project was assessed. Software and compositional verification are described. Traditional verification techniques have two major problems: testing occurs at the prototype stage, where error discovery can be quite costly, and not all potential interactions can be tested, leaving some errors undetected until the system reaches the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution for addressing increasingly large and complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, one research action, five operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.
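A toy sketch of the "divide and conquer" idea (a hypothetical two-stage pipeline checked by exhaustive loops; real compositional verification uses model checkers and assume-guarantee proof rules, not brute force):

```python
# Contract for A: for any input x in 0..9, A's output lies in 0..18 (A's guarantee).
# Contract for B: assuming its input y lies in 0..18, B never divides by zero.
def A(x):
    return 2 * x

def B(y):
    return 100 // (19 - y)   # safe only while y <= 18 (B's assumption)

# Check each component against its own contract, in isolation...
assert all(0 <= A(x) <= 18 for x in range(10))            # A meets its guarantee
assert all(isinstance(B(y), int) for y in range(19))      # B is safe under its assumption

# ...and conclude B(A(x)) is safe for every x in 0..9 without ever
# enumerating the combined state space of A and B together.
```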
Formalization of the Integral Calculus in the PVS Theorem Prover
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
2004-01-01
The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Though fully equipped to support deduction in a very general logic framework, namely higher-order logic, the PVS prover must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion about why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
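For reference, the culminating theorem in the form typically proved from a Rosenlicht-style development (a standard statement, not a quotation of the PVS theory itself):

```latex
\textbf{Fundamental Theorem of Calculus.}
Let $f$ be Riemann integrable on $[a,b]$ and define
$F(x) = \int_a^x f(t)\,dt$ for $x \in [a,b]$. Then $F$ is continuous on
$[a,b]$, and if $f$ is continuous at $x_0 \in [a,b]$, then $F$ is
differentiable at $x_0$ with $F'(x_0) = f(x_0)$. Consequently, if $f$ is
continuous on $[a,b]$ and $G$ is any antiderivative of $f$, then
\[
  \int_a^b f(t)\,dt = G(b) - G(a).
\]
```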
NASA Technical Reports Server (NTRS)
1986-01-01
Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.
International Space Station Requirement Verification for Commercial Visiting Vehicles
NASA Technical Reports Server (NTRS)
Garguilo, Dan
2017-01-01
The COTS program demonstrated NASA could rely on commercial providers for safe, reliable, and cost-effective cargo delivery to ISS. The ISS Program has developed a streamlined process to safely integrate commercial visiting vehicles and ensure requirements are met: levy a minimum requirement set (down from thousands to hundreds) focusing on the ISS interface and safety, reducing the level of NASA oversight/insight and the burden on the commercial partner. Partners provide a detailed verification and validation plan documenting how they will show they have met NASA requirements. NASA conducts process sampling to ensure that the established verification processes are being followed. NASA participates in joint verification events and analyses for requirements that both parties must verify. Verification compliance is approved by NASA, and launch readiness is certified at mission readiness reviews.
Magnetic cleanliness verification approach on tethered satellite
NASA Technical Reports Server (NTRS)
Messidoro, Piero; Braghin, Massimo; Grande, Maurizio
1990-01-01
Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.
An evaluation of SEASAT-A candidate ocean industry economic verification experiments
NASA Technical Reports Server (NTRS)
1977-01-01
A description of the candidate economic verification experiments which could be performed with SEASAT is provided. Experiments have been identified in each of the areas of ocean-based activity that are expected to show an economic impact from the use of operational SEASAT data. Experiments have been identified in the areas of Arctic operations, the ocean fishing industry, the offshore oil and natural gas industry, as well as ice monitoring and coastal zone applications.
Accelerating functional verification of an integrated circuit
Deindl, Michael; Ruedinger, Jeffrey Joseph; Zoellin, Christian G.
2015-10-27
Illustrative embodiments include a method, system, and computer program product for accelerating functional verification in simulation testing of an integrated circuit (IC). Using a processor and a memory, a serial operation is replaced with a direct register access operation, where the serial operation is configured to perform a bit-shifting operation using a register in a simulation of the IC. The serial operation is blocked from manipulating the register in the simulation of the IC. Using the register in the simulation of the IC, the direct register access operation is performed in place of the serial operation.
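A sketch of the substitution on a toy simulated register (the class and method names are hypothetical; the abstract describes the idea generically, and a real implementation operates inside an HDL simulator):

```python
class SimRegister:
    """Toy model of a design register reached through a serial/scan chain."""
    def __init__(self, width):
        self.width = width
        self.bits = [0] * width

    def serial_shift_in(self, value):
        # Serial operation: one bit per simulated clock cycle -> 'width' cycles.
        for i in range(self.width):
            self.bits = [(value >> i) & 1] + self.bits[:-1]

    def direct_write(self, value):
        # Direct register access: the whole value lands in one simulated step.
        self.bits = [(value >> (self.width - 1 - k)) & 1 for k in range(self.width)]

reg = SimRegister(32)
reg.direct_write(0xDEADBEEF)   # replaces 32 shift cycles with a single access,
                               # which is the source of the simulation speedup
```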
2007-02-01
Fragment of a paper on the Augmented Generic Engine Model (AGEM): motivation for the modeling and simulation work, model verification and validation (V&V), and a V&V assessment of AGEM; the remainder of the page is Simulink block-diagram residue (engine station flow and balance signal names).
Off-line robot programming and graphical verification of path planning
NASA Technical Reports Server (NTRS)
Tonkay, Gregory L.
1989-01-01
The objective of this project was to develop or specify an integrated environment for off-line programming, graphical path verification, and debugging for robotic systems. Two alternatives were compared. The first was the integration of the ASEA Off-line Programming package with ROBSIM, a robotic simulation program. The second alternative was the purchase of the commercial product IGRIP. The needs of the RADL (Robotics Applications Development Laboratory) were explored and the alternatives were evaluated based on these needs. As a result, IGRIP was proposed as the best solution to the problem.
NASA Astrophysics Data System (ADS)
Barr, D.; Gilpatrick, J. D.; Martinez, D.; Shurter, R. B.
2004-11-01
The Los Alamos Neutron Science Center (LANSCE) facility at Los Alamos National Laboratory has constructed both an Isotope Production Facility (IPF) and a Switchyard Kicker (XDK) as additions to the H+ and H- accelerator. These additions contain eleven Beam Position Monitors (BPMs) that measure the beam's position throughout the transport. The analog electronics within each processing module determine the beam position using the log-ratio technique. For system reliability, calibrations compensate for various temperature drifts and other imperfections in the processing electronics components. Additionally, a PC running a National Instruments LabVIEW virtual instrument (VI) periodically verifies continued system and cable integrity. The VI communicates with the processor cards via a PCI/MXI-3 VXI-crate communication module. Previously, accelerator operators performed BPM system calibrations typically once per day while beam was explicitly turned off. One of this new measurement system's unique achievements is its automated calibration and verification capability. Taking advantage of the pulsed nature of the LANSCE-facility beams, the integrated electronics hardware and VI perform calibration and verification operations between beam pulses without interrupting production beam delivery. The design, construction, and performance results of the automated calibration and verification portion of this position measurement system are the topic of this paper.
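For a pair of opposing pickup electrodes, the log-ratio technique mentioned here reduces to a position estimate proportional to the logarithm of the signal ratio. A minimal sketch (the sensitivity constant is an illustrative placeholder; real modules derive it from calibration):

```python
import math

def log_ratio_position(v_a, v_b, k_mm_per_db=1.4):
    """Beam position from two opposing BPM electrode amplitudes.

    Position is approximately linear in log(Va/Vb) near the axis; k converts
    the log-ratio (in dB) to millimeters and comes from calibration
    (the 1.4 mm/dB default here is a made-up example value).
    """
    ratio_db = 20.0 * math.log10(v_a / v_b)
    return k_mm_per_db * ratio_db

print(log_ratio_position(1.00, 1.00))   # centered beam -> 0.0 mm
print(log_ratio_position(1.10, 0.90))   # beam offset toward electrode A
```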
Development of advanced seal verification
NASA Technical Reports Server (NTRS)
Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.
1992-01-01
The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or, especially, a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed to determine the feasibility of using optical fiber sensors for seal verification. In one study, interferometric measurements of the mechanical response of the optical fiber sensors to seal integrity were investigated. In the second study, an optical fiber was fitted to a typical vacuum chamber, and feasibility studies of microbend experiments in the vacuum chamber were performed. An attempt was also made to quantify the amount of pressure actually applied to the optical fiber using the Algor finite element analysis software.
Kraus, Michael W; Chen, Serena
2009-07-01
Extending research on the automatic activation of goals associated with significant others, the authors hypothesized that self-verification goals typically pursued with significant others are automatically elicited when a significant-other representation is activated. Supporting this hypothesis, the activation of a significant-other representation through priming (Experiments 1 and 3) or through a transference encounter (Experiment 2) led participants to seek feedback that verifies their preexisting self-views. Specifically, significant-other primed participants desired self-verifying feedback, in general (Experiment 1), from an upcoming interaction partner (Experiment 2), and relative to acquaintance-primed participants and favorable feedback (Experiment 3). Finally, self-verification goals were activated, especially for relational self-views deemed high in importance to participants' self-concepts (Experiment 2) and held with high certainty (Experiment 3). Implications for research on self-evaluative goals, the relational self, and the automatic goal activation literature are discussed, as are consequences for close relationships.
JWST Telescope Integration and Test Progress
NASA Technical Reports Server (NTRS)
Matthews, Gary W.; Whitman, Tony L.; Feinberg, Lee D.; Voyton, Mark F.; Lander, Juli A.; Keski-Kuha, Ritva
2016-01-01
The James Webb Space Telescope (JWST) is a 6.5m, segmented, IR telescope that will explore the first light of the universe after the big bang. The JWST Optical Telescope Element (Telescope) integration and test program is well underway. The telescope was completed in the spring of 2016 and the cryogenic test equipment has been through two optical test programs leading up to the final flight verification program. The details of the telescope mirror integration will be provided along with the current status of the flight observatory. In addition, the results of the two optical ground support equipment cryo tests will be shown and how these plans fold into the flight verification program.
Verification of a Remaining Flying Time Prediction System for Small Electric Aircraft
NASA Technical Reports Server (NTRS)
Hogge, Edward F.; Bole, Brian M.; Vazquez, Sixto L.; Celaya, Jose R.; Strom, Thomas H.; Hill, Boyd L.; Smalling, Kyle M.; Quach, Cuong C.
2015-01-01
This paper addresses the problem of building trust in online predictions of a battery powered aircraft's remaining available flying time. A set of ground tests is described that make use of a small unmanned aerial vehicle to verify the performance of remaining flying time predictions. The algorithm verification procedure described here uses a fully functional vehicle that is restrained to a platform for repeated run-to-functional-failure experiments. The vehicle under test is commanded to follow a predefined propeller RPM profile in order to create battery demand profiles similar to those expected in flight. The fully integrated aircraft is repeatedly operated until the charge stored in powertrain batteries falls below a specified lower-limit. The time at which the lower-limit on battery charge is crossed is then used to measure the accuracy of remaining flying time predictions. Accuracy requirements are considered in this paper for an alarm that warns operators when remaining flying time is estimated to fall below a specified threshold.
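A sketch of the accuracy bookkeeping used in such run-to-functional-failure tests (the array names, threshold framing, and numbers are illustrative, not the paper's data):

```python
import numpy as np

def prediction_errors(t, predicted_eod, measured_eod):
    """Compare online end-of-discharge (EOD) predictions with the measured EOD.

    t: times at which predictions were issued (s)
    predicted_eod: predicted time of the lower-limit charge crossing at each t (s)
    measured_eod: time at which the battery actually crossed the lower limit (s)
    Returns the remaining-flying-time prediction error at each issue time.
    """
    rft_pred = np.asarray(predicted_eod) - np.asarray(t)   # predicted time left
    rft_true = measured_eod - np.asarray(t)                # actual time left
    return rft_pred - rft_true                             # > 0 means optimistic

t = np.array([0.0, 120.0, 240.0, 360.0])
pred = np.array([540.0, 520.0, 505.0, 498.0])
errs = prediction_errors(t, pred, measured_eod=500.0)      # [40, 20, 5, -2] s
```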
A space station Structures and Assembly Verification Experiment, SAVE
NASA Technical Reports Server (NTRS)
Russell, R. A.; Raney, J. P.; Deryder, L. J.
1986-01-01
The Space Station structure has been baselined to be a 5 M (16.4 ft) erectable truss. This structure will provide the overall framework to attach laboratory modules and other systems, subsystems and utilities. The assembly of this structure represents a formidable EVA challenge. To validate this capability the Space Station Structures/Dynamics Technical Integration Panel (TIP) met to develop the necessary data for an integrated STS structures flight experiment. As a result of this meeting, the Langley Research Center initiated a joint Langley/Boeing Aerospace Company study which supported the structures/dynamics TIP in developing the preliminary definition and design of a 5 M erectable space station truss and the resources required for a proposed flight experiment. The purpose of the study was to: (1) devise methods of truss assembly by astronauts; (2) define a specific test matrix for dynamic characterization; (3) identify instrumentation and data system requirements; (4) determine the power, propulsion and control requirements for the truss on-orbit for 3 years; (5) study the packaging of the experiment in the orbiter cargo bay; (6) prepare a preliminary cost estimate and schedule for the experiment; and (7) provide a list of potential follow-on experiments using the structure as a free flyer. The results of this three month study are presented.
Integrating Model-Based Verification into Software Design Education
ERIC Educational Resources Information Center
Yilmaz, Levent; Wang, Shuo
2005-01-01
Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…
Feasibility analysis of marine ecological on-line integrated monitoring system
NASA Astrophysics Data System (ADS)
Chu, D. Z.; Cao, X.; Zhang, S. W.; Wu, N.; Ma, R.; Zhang, L.; Cao, L.
2017-08-01
In-situ water quality sensors are susceptible to biological fouling, sea water corrosion, and wave impact damage, and the scattered distribution of many sensors makes maintenance inconvenient. The paper proposes a highly integrated marine ecological on-line monitoring system that can be operated inside a monitoring station. All sensors are systematically grouped: similar sensors in series, and the groups in parallel. The system composition and workflow are described. In addition, the paper discusses issues to consider in the system design and corresponding solutions. Using multi-parameter water quality probes and five nutrient salts as the verification indices, comparison experiments between in-situ and system data were carried out. The results showed good data consistency for the nutrient salts, pH, and salinity; temperature and dissolved oxygen followed consistent trends but with deviations; turbidity fluctuated greatly, and chlorophyll behaved similarly. Based on these observations, three directions for system optimization are proposed.
Synesthesia affects verification of simple arithmetic equations.
Ghirardelli, Thomas G; Mills, Carol Bergfeld; Zilioli, Monica K C; Bailey, Leah P; Kretschmar, Paige K
2010-01-01
To investigate the effects of color-digit synesthesia on numerical representation, we presented a synesthete (referred to here as SE) and non-synesthete controls with mathematical equations for verification. In Experiment 1, SE verified addition equations made up of digits that either matched or mismatched her color-digit photisms or were in black. In Experiment 2A, the addends were presented in the different color conditions and the solution was presented in black, whereas in Experiment 2B the addends were presented in black and the solutions were presented in the different color conditions. In Experiment 3, multiplication and division equations were presented in the same color conditions as in Experiment 1. SE responded significantly faster to equations that matched her photisms than to those that did not; controls did not show this effect. These results suggest that photisms influence the processing of digits in arithmetic verification, replicating and extending previous findings.
On-chip continuous-variable quantum entanglement
NASA Astrophysics Data System (ADS)
Masada, Genta; Furusawa, Akira
2016-09-01
Entanglement is an essential feature of quantum theory and the core of the majority of quantum information science and technologies. Quantum computing is one of the most important fruits of quantum entanglement and requires not only a bipartite entangled state but also more complicated multipartite entanglement. In previous experimental works demonstrating various entanglement-based quantum information processing, light has been extensively used. Experiments utilizing such complicated states need highly complex optical circuits to propagate optical beams and a high level of spatial interference between different light beams to generate quantum entanglement or to efficiently perform balanced homodyne measurement. Current experiments have been performed in conventional free-space optics with large numbers of optical components and relatively large optical setups; they are therefore limited in stability and scalability. Integrated photonics offers new tools and additional capabilities for manipulating light in quantum information technology. Owing to integrated waveguide circuits, it is possible to stabilize and miniaturize complex optical circuits and achieve high interference of light beams. Integrated circuits were first developed for discrete-variable systems and then applied to continuous-variable systems. In this article, we review the currently developed scheme for generation and verification of continuous-variable quantum entanglement, such as Einstein-Podolsky-Rosen beams, using a photonic chip with integrated waveguide circuits. This includes balanced homodyne measurement of a squeezed state of light. As a simple example, we also review an experiment for generating discrete-variable quantum entanglement using integrated waveguide circuits.
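A commonly used sufficient condition for verifying such continuous-variable (EPR-type) entanglement is the Duan-Simon inseparability criterion, stated here for reference (in the convention $[\hat{x},\hat{p}] = i$ with vacuum quadrature variance $1/2$; the article's measurements are assessed against criteria of this kind, though the exact convention is an assumption here):

```latex
\langle \Delta(\hat{x}_A - \hat{x}_B)^2 \rangle
+ \langle \Delta(\hat{p}_A + \hat{p}_B)^2 \rangle \;<\; 2,
```

where both variances are obtained from balanced homodyne measurements of the two output beams, and dropping below the bound attainable by separable states certifies entanglement between modes $A$ and $B$.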
A digital flight control system verification laboratory
NASA Technical Reports Server (NTRS)
De Feo, P.; Saib, S.
1982-01-01
A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus in particular on increasing the number of software test tools and on assessing cost effectiveness.
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
Ver-i-Fus: an integrated access control and information monitoring and management system
NASA Astrophysics Data System (ADS)
Thomopoulos, Stelios C.; Reisman, James G.; Papelis, Yiannis E.
1997-01-01
This paper describes the Ver-i-Fus Integrated Access Control and Information Monitoring and Management (IAC-I2M) system that INTELNET Inc. has developed. The Ver-i-Fus IAC-I2M system has been designed to meet the most stringent security and information monitoring requirements while allowing two-way communication between the user and the system. The system offers a flexible interface that permits the integration of practically any sensing device, or combination of sensing devices, including a live-scan fingerprint reader, thus providing biometric verification for enhanced security. Different configurations of the system provide solutions to different sets of access control problems. The re-configurable hardware interface, tied together with biometric verification and a flexible interface that allows Ver-i-Fus to be integrated with an MIS, provides an integrated solution to security, time and attendance, labor monitoring, production monitoring, and payroll applications.
MPLM Donatello is offloaded at the SLF
NASA Technical Reports Server (NTRS)
2001-01-01
At the Shuttle Landing Facility, cranes help offload the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane. The third of three for the International Space Station, the module will be moved on a transporter to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
Integrated Formal Analysis of Time-Triggered Ethernet
NASA Technical Reports Server (NTRS)
Dutertre, Bruno; Shankar, Natarajan; Owre, Sam
2012-01-01
We present new results related to the verification of the Time-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization and propose an improvement. We compare the original design and the improved definition using the SAL model checker.
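A sketch in the spirit of such a compression function (a Welch-Lynch-style fault-tolerant midpoint over collected clock differences; this illustrates the role the function plays in synchronization, not the exact TTE definition under analysis):

```python
def fault_tolerant_midpoint(readings, k):
    """Compress clock-difference readings into one correction value.

    Discards the k largest and k smallest readings (suspected faulty), then
    averages the extremes of what remains, so up to k arbitrarily faulty
    clocks cannot drag the correction outside the range of the good clocks.
    """
    if len(readings) <= 2 * k:
        raise ValueError("need more than 2k readings to tolerate k faults")
    trimmed = sorted(readings)[k:len(readings) - k]
    return (trimmed[0] + trimmed[-1]) / 2.0

# Five clock differences (us), one byzantine outlier; tolerate k = 1 fault
print(fault_tolerant_midpoint([-3.0, -1.0, 0.5, 2.0, 900.0], k=1))  # -> 0.5
```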
GEOS-C noncoherent C-band transponder test procedure for spacecraft level tests
NASA Technical Reports Server (NTRS)
Selser, A. R.
1973-01-01
Test procedures necessary for the calibration and performance verification of the noncoherent C-band transponders after spacecraft hardware integration, but prior to spacecraft/launch vehicle integration are presented.
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine-vision-based verification system for steel rules is designed with reference to JJG 1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and cleans the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision required by the verification regulation, but also improve the reliability and efficiency of the verification system.
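The pixel equivalent is the scale factor between image pixels and physical length. A minimal sketch of calibrating it from a reference feature of known length (the function name, coordinates, and gauge length are illustrative, not the paper's calibration procedure):

```python
def pixel_equivalent(known_length_mm, p1, p2):
    """Scale factor (mm per pixel) from a reference feature of known length."""
    dx, dy = p2[0] - p1[0], p2[1] - p1[1]
    length_px = (dx * dx + dy * dy) ** 0.5
    return known_length_mm / length_px

# A 10 mm reference length imaged between two detected edge centroids (pixels)
k = pixel_equivalent(10.0, (412.3, 305.1), (1211.8, 306.0))
measured_mm = k * 799.6   # convert any pixel distance on the rule image to mm
```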
Li, Y; Nielsen, P V
2011-12-01
There has been rapid growth of scientific literature on the application of computational fluid dynamics (CFD) in research on ventilation and indoor air science. With a 1000- to 10,000-fold increase in computer hardware capability over the past 20 years, CFD has become an integral part of scientific research and engineering development of complex air distribution and ventilation systems in buildings. This review discusses the major and specific challenges of CFD in terms of turbulence modelling, numerical approximation, and boundary conditions relevant to building ventilation. We emphasize the growing need for CFD verification and validation, suggest ongoing needs for analytical and experimental methods to support the numerical solutions, and discuss the growing capacity of CFD in opening up new research areas. We suggest that CFD has not become a replacement for experiment and theoretical analysis in ventilation research; rather, it has become an increasingly important partner. We believe that an effective scientific approach for ventilation studies is still to combine experiments, theory, and CFD. We argue that CFD verification and validation are becoming more crucial than ever as more complex ventilation problems are solved. It is anticipated that ventilation problems at the city scale will be tackled by CFD in the next 10 years.
NASA Astrophysics Data System (ADS)
Wan, Junwei; Chen, Hongyan; Zhao, Jing
2017-08-01
To meet the real-time, reliability, and safety requirements of aerospace experiments, a single-center cloud computing application verification platform was constructed. At the IaaS level, the feasibility of applying cloud computing technology to aerospace experiments was tested and verified. Analysis of the test results supports a preliminary conclusion: the cloud computing platform can be applied to compute-intensive aerospace experiment workloads, while for I/O-intensive workloads the traditional physical machine is recommended.
JPL control/structure interaction test bed real-time control computer architecture
NASA Technical Reports Server (NTRS)
Briggs, Hugh C.
1989-01-01
The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts - such as active structure - and new tools - such as combined structure and control optimization algorithm - and their verification in ground and possibly flight test. A focus mission spacecraft was designed based upon a space interferometer and is the basis for design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.
Integrity verification testing of the ADI International Inc. Pilot Test Unit No. 2002-09 with MEDIA G2® arsenic adsorption media filter system was conducted at the Hilltown Township Water and Sewer Authority (HTWSA) Well Station No. 1 in Sellersville, Pennsylvania from October 8...
The 1991 3rd NASA Symposium on VLSI Design
NASA Technical Reports Server (NTRS)
Maki, Gary K.
1991-01-01
Papers from the symposium are presented from the following sessions: (1) featured presentations 1; (2) very large scale integration (VLSI) circuit design; (3) VLSI architecture 1; (4) featured presentations 2; (5) neural networks; (6) VLSI architectures 2; (7) featured presentations 3; (8) verification 1; (9) analog design; (10) verification 2; (11) design innovations 1; (12) asynchronous design; and (13) design innovations 2.
Hou, Guixue; Lou, Xiaomin; Sun, Yulin; Xu, Shaohang; Zi, Jin; Wang, Quanhui; Zhou, Baojin; Han, Bo; Wu, Lin; Zhao, Xiaohang; Lin, Liang; Liu, Siqi
2015-09-04
We propose an efficient integration of SWATH with MRM for biomarker discovery and verification when the corresponding ion library is well established. We strictly controlled the false positive rate associated with SWATH MS signals and carefully selected the target peptides for coupling SWATH with MRM. We collected 10 paired samples of esophageal squamous cell carcinoma (ESCC) tumor and adjacent tissue and quantified 1758 unique proteins at a protein-level FDR of 1% using SWATH, of which 467 proteins showed ESCC-dependent abundance changes. After carefully evaluating the SWATH MS signals of the up-regulated proteins, we selected 120 proteins for MRM verification. MRM analysis of the pooled and individual esophageal tissues confirmed 116 proteins whose abundance responses to ESCC were similar to those acquired with SWATH. Because the ESCC-related proteins included a high percentage of secreted proteins, we conducted the MRM assay on patient sera collected pre- and postoperation. Of the 116 target proteins, 42 were identified in the ESCC sera, including 11 with lowered abundances postoperation. Coupling SWATH and MRM is thus feasible and efficient for the discovery and verification of cancer-related protein biomarkers.
Enrichment Assay Methods Development for the Integrated Cylinder Verification System
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Misner, Alex C.; Hatchell, Brian K.
2009-10-22
International Atomic Energy Agency (IAEA) inspectors currently perform periodic inspections at uranium enrichment plants to verify UF6 cylinder enrichment declarations. Measurements are typically performed with handheld high-resolution sensors on a sampling of cylinders taken to be representative of the facility's entire product-cylinder inventory. Pacific Northwest National Laboratory (PNNL) is developing a concept to automate the verification of enrichment plant cylinders to enable 100 percent product-cylinder verification and potentially, mass-balance calculations on the facility as a whole (by also measuring feed and tails cylinders). The Integrated Cylinder Verification System (ICVS) could be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until IAEA inspector arrival. The three main objectives of this FY09 project are summarized here and described in more detail in the report: (1) develop a preliminary design for a prototype NDA system, (2) refine PNNL's MCNP models of the NDA system, and (3) procure and test key pulse-processing components. Progress against these tasks to date, and next steps, are discussed.
Space shuttle engineering and operations support. Avionics system engineering
NASA Technical Reports Server (NTRS)
Broome, P. A.; Neubaur, R. J.; Welsh, R. T.
1976-01-01
The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.
Verification and Validation for Flight-Critical Systems (VVFCS)
NASA Technical Reports Server (NTRS)
Graves, Sharon S.; Jacobsen, Robert A.
2010-01-01
On March 31, 2009 a Request for Information (RFI) was issued by NASA's Aviation Safety Program to gather input on the subject of Verification and Validation (V&V) of Flight-Critical Systems. The responses were provided to NASA on or before April 24, 2009. The RFI asked for comments in three topic areas: Modeling and Validation of New Concepts for Vehicles and Operations; Verification of Complex Integrated and Distributed Systems; and Software Safety Assurance. There were a total of 34 responses to the RFI, representing a cross-section of academia (26%), small and large industry (47%), and government agencies (27%).
Electric power system test and verification program
NASA Technical Reports Server (NTRS)
Rylicki, Daniel S.; Robinson, Frank, Jr.
1994-01-01
Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of tests performed on breadboard model hardware and the analyses completed to date have been evaluated and used to plan the design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.
NASA Technical Reports Server (NTRS)
Proud, Ryan; Adam, Jason
2011-01-01
As of Draft 4.0 of CCT-REQ-1130, the ISS Crew Transportation and Services Requirements Document for CCP, specific language for the verification of the abort capability requirement (3.3.1.4) was added. The abort capability requirement ensures that the CTS, under dispersed conditions, is always capable of aborting from a failed LV. The Integrated Aborts IPT was asked to author a memo on how this verification might be completed. The following memo describes one way that this requirement and its verification could be met, but it is not the only method.
Exomars Mission Verification Approach
NASA Astrophysics Data System (ADS)
Cassi, Carlo; Gilardi, Franco; Bethge, Boris
According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives of demonstrating the European capability to safely land a surface package on Mars, performing Mars atmosphere investigations, and providing communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples, and investigating them for signs of past and present life with exobiological experiments, as well as investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration, and verification of the ESA ExoMars modules, i.e., the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process for the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper focuses mainly on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive the Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter; the Rover Module has to perform a 180-sol mission in the Mars surface environment. These operating conditions cannot be verified by analysis alone; consequently, a test campaign is defined including mechanical tests to simulate the entry loads, thermal tests in a Mars environment, and simulation of Rover operations on a Mars-like terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design), up to the final verification close-out of those requirements with the final verification reports.
ERIC Educational Resources Information Center
Rao, Nyapati R.; Kodali, Rahul; Mian, Ayesha; Ramtekkar, Ujjwal; Kamarajan, Chella; Jibson, Michael D.
2012-01-01
Objective: The authors report on a pilot study of the experiences and perceptions of foreign international medical graduate (F-IMG), United States international medical graduate (US-IMG), and United States medical graduate (USMG) psychiatric residents with the newly mandated Clinical Skills Verification (CSV) process. The goal was to identify and…
NASA Astrophysics Data System (ADS)
Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB
2017-11-01
Increased demand for internet of things (IoT) applications has forced the move towards higher complexity in the integrated circuits supporting SoCs. This increase in complexity poses correspondingly complicated validation challenges, and has led researchers to develop various methodologies to overcome the problem, in essence bringing about dynamic verification, formal verification, and hybrid techniques. Moreover, it is important to discover bugs early in the SoC verification process in order to reduce time consumed and achieve a fast time to market. This paper therefore focuses on verification methodology at the Register Transfer Level (RTL) of an SoC based on the AMBA bus design. The Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an aid to fast time to market; OVM is thus proposed in this paper as the verification method for larger designs, averting bottlenecks in the validation platform.
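A language-neutral sketch of the driver/monitor/scoreboard pattern that methodologies like OVM standardize (plain Python here for illustration only; OVM itself is a SystemVerilog class library, and none of these names are OVM's):

```python
import random

def reference_model(a, b):          # golden model of the intended behavior
    return (a + b) & 0xFF

def dut(a, b):                      # stand-in for the RTL under verification;
    return (a + b) & 0xFF           # a real flow drives an HDL simulator instead

def run_test(n_transactions=1000, seed=1):
    rng = random.Random(seed)
    mismatches = 0
    for _ in range(n_transactions):
        a, b = rng.randrange(256), rng.randrange(256)   # constrained-random stimulus
        expected = reference_model(a, b)                # predictor
        actual = dut(a, b)                              # monitor samples DUT output
        if actual != expected:                          # scoreboard comparison
            mismatches += 1
    return mismatches

assert run_test() == 0   # regression passes when the scoreboard sees no mismatches
```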
Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System
NASA Astrophysics Data System (ADS)
Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won
2009-12-01
As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment, examining how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions for each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms needed to guarantee the security/privacy of the fingerprint data transmitted between the smart card and the card reader in a client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements for implementing fingerprint verification in the smart card in the client-server environment.
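A back-of-the-envelope sketch of the kind of partitioning analysis described (all instruction counts and processor speeds are hypothetical placeholders, not the paper's measurements):

```python
# Steps of a typical fingerprint verification pipeline, with rough
# instruction-count placeholders (hypothetical, for illustration only).
STEPS = {"preprocess": 80e6, "extract_minutiae": 40e6, "match": 2e6}
CARD_MIPS, READER_MIPS = 5, 500      # assumed card and reader processor speeds

def exec_time(steps_on_card):
    """Seconds to run the pipeline when 'steps_on_card' execute on the card."""
    return sum(
        instr / (CARD_MIPS if name in steps_on_card else READER_MIPS) / 1e6
        for name, instr in STEPS.items()
    )

print(exec_time({"match"}))      # match-on-card only: fast, keeps template on card
print(exec_time(set(STEPS)))     # everything on the card: most secure, far slower
```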
A novel environmental chamber for neuronal network multisite recordings.
Biffi, E; Regalia, G; Ghezzi, D; De Ceglia, R; Menegon, A; Ferrigno, G; Fiore, G B; Pedrocchi, A
2012-10-01
Environmental stability is a critical issue for neuronal networks in vitro. Hence, the ability to control the physical and chemical environment of cell cultures during electrophysiological measurements is an important requirement in the experimental design. In this work, we describe the development and the experimental verification of a closed chamber for multisite electrophysiology and optical monitoring. The chamber provides stable temperature, pH, and humidity and guarantees cell viability comparable to standard incubators. In addition, it integrates the electronics for long-term neuronal activity recording. The system is portable and adaptable to multiple network housings, which allows parallel experiments to be performed in the same environment. Our results show that this device can be a solution for long-term electrophysiology, for dual-network experiments, and for coupled optical and electrical measurements.
Ground Testing of a 10 K Sorption Cryocooler Flight Experiment (BETSCE)
NASA Technical Reports Server (NTRS)
Bard, S.; Wu, J.; Karlmann, P.; Cowgill, P.; Mirate, C.; Rodriguez, J.
1994-01-01
The Brilliant Eyes Ten-Kelvin Sorption Cryocooler Experiment (BETSCE) is a Space Shuttle side-wall-mounted flight experiment designed to demonstrate 10 K sorption cryocooler technology in a space environment. The BETSCE objectives are to: (1) provide a thorough end-to-end characterization and space performance validation of a complete, multistage, automated, closed-cycle hydride sorption cryocooler in the 10 to 30 K temperature range, (2) acquire the quantitative microgravity database required to provide confident engineering design, scaling, and optimization, (3) advance the enabling technologies and resolve integration issues, and (4) provide hardware qualification and safety verification heritage. BETSCE ground tests were the first-ever demonstration of a complete closed-cycle 10 K sorption cryocooler. Test results exceeded functional requirements. This paper summarizes functional and environmental ground test results, planned characterization tests, important development challenges that were overcome, and valuable lessons-learned.
Model-Based Verification and Validation of Spacecraft Avionics
NASA Technical Reports Server (NTRS)
Khan, M. Omair; Sievers, Michael; Standley, Shaun
2012-01-01
Verification and Validation (V&V) at JPL is traditionally performed on flight or flight-like hardware running flight software. For some time, the complexity of avionics has increased exponentially while the time allocated for system integration and associated V&V testing has remained fixed. There is an increasing need to perform comprehensive system level V&V using modeling and simulation, and to use scarce hardware testing time to validate models, as has been the norm for thermal and structural V&V for some time. Our approach extends model-based V&V to electronics and software through functional and structural models implemented in SysML. We develop component models of electronics and software that are validated by comparison with test results from actual equipment. The models are then simulated, enabling a more complete set of test cases than is possible on flight hardware. SysML simulations provide access to and control of internal nodes that may not be available in physical systems. This is particularly helpful in testing fault protection behaviors when injecting faults is either not possible or potentially damaging to the hardware. We can also model both hardware and software behaviors in SysML, which allows us to simulate hardware and software interactions. With an integrated model and simulation capability we can evaluate the hardware and software interactions and identify problems sooner. The primary missing piece is validating SysML model correctness against hardware; this experiment demonstrated that such an approach is possible.
Design Authority in the Test Programme Definition: The Alenia Spazio Experience
NASA Astrophysics Data System (ADS)
Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.
2004-08-01
Since the verification and test programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion mentioned above very often aims to optimize the verification campaign by deleting or limiting some testing activities. The increased market pressure to reduce project schedules and costs is generating a dialectic within project teams, involving programme management and design authorities, in order to optimize the verification and testing programme. The paper introduces the Alenia Spazio experience in this context, drawn from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent, and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 Part 2 "Verification" and ECSS-E-10 Part 3 "Testing" [1]) are tailored to the specific project on the basis of its particular mission constraints. The model philosophy and the associated verification and test programme are defined through an iterative process that suitably combines several aspects (including, for example, test requirements and facilities), as shown in Fig. 1 (from ECSS-E-10). The cases considered are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are most significant. For thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance Test is mandatory, and others, with more recurrent designs, where qualification by analysis could be envisaged. The importance of proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static, and Sine Vibration Tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic Test in acceptance. The apparent relative importance of the Sine Vibration Test for workmanship verification in specific circumstances is also highlighted.
Fig. 1. Model philosophy, verification and test programme definition.
The verification of the project requirements is planned through a combination of suitable verification methods (in particular analysis and test) at the different verification levels (from system down to equipment), in the proper verification stages (e.g. qualification and acceptance).
First Cryo-Vacuum Test of the JWST Integrated Science Instrument Module
NASA Astrophysics Data System (ADS)
Kimble, Randy A.; Antonille, S. R.; Balzano, V.; Comber, B. J.; Davila, P. S.; Drury, M. D.; Glasse, A.; Glazer, S. D.; Lundquist, R.; Mann, S. D.; McGuffey, D. B.; Novo-Gradac, K. J.; Penanen, K.; Ramey, D. D.; Sullivan, J.; Van Campen, J.; Vila, M. B.
2014-01-01
The integration and test program for the Integrated Science Instrument Module (ISIM) of the James Webb Space Telescope (JWST) calls for three cryo-vacuum tests of the ISIM hardware. The first is a risk-reduction test aimed at checking out the test hardware and procedures; this will be followed by two formal verification tests that will bracket other key aspects of the environmental test program (e.g. vibration and acoustics, EMI/EMC). The first of these cryo-vacuum tests, the risk-reduction test, was executed at NASA's Goddard Space Flight Center starting in late August 2013. Flight hardware under test included two (of the eventual four) flight instruments, the Mid-Infrared Instrument (MIRI) and the Fine Guidance Sensor/Near-Infrared Imager and Slitless Spectrograph (FGS/NIRISS), mounted to the ISIM structure, as well as the ISIM Electronics Compartment (IEC). The instruments were cooled to their flight operating temperatures (~40 K for FGS/NIRISS, ~6 K for MIRI) and optically tested against a cryo-certified telescope simulator. Key goals for the risk-reduction test included: 1) demonstration of controlled cooldown and warmup, stable control at operating temperature, and measurement of heat loads, 2) operation of the science instruments with ISIM electronics systems at temperature, 3) health trending of the science instruments against instrument-level test results, 4) measurement of the pupil positions and six-degree-of-freedom alignment of the science instruments against the simulated telescope focal surface, 5) detailed optical characterization of the NIRISS instrument, 6) verification of the signal-to-noise performance of the MIRI, and 7) exercise of the Onboard Script System that will be used to operate the instruments in flight. In addition, the execution of the test is expected to yield invaluable logistical experience - development and execution of procedures, communications, analysis of results - that will greatly benefit the subsequent verification tests. At the time of this submission, the hardware had reached operating temperature and was partway through the cryo test program. We report here on the test configuration, the overall process, and the results obtained to date.
Development and Verification of the Charring Ablating Thermal Protection Implicit System Solver
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan D.; Kirk, Benjamin S.
2010-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method with first and second order implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the fully implicit linear system is solved with the Generalized Minimal Residual method. Verification results from exact solutions and the Method of Manufactured Solutions are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
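As a hedged illustration of the verification methodology described above (not the solver's actual code), the sketch below computes the observed order of accuracy from error norms on successively refined grids, the standard check used with exact and manufactured solutions. The error values are invented for a nominally second-order scheme.

```python
import math

# Order-of-accuracy check used with the Method of Manufactured Solutions:
# run the solver on successively refined grids, measure the error norm
# against the exact (manufactured) solution, and compute the observed
# order p = log(e_coarse / e_fine) / log(h_coarse / h_fine).

def observed_order(h_coarse, e_coarse, h_fine, e_fine):
    return math.log(e_coarse / e_fine) / math.log(h_coarse / h_fine)

# Hypothetical error norms from three grid levels of a 2nd-order scheme:
levels = [(0.1, 4.1e-3), (0.05, 1.04e-3), (0.025, 2.6e-4)]
for (h1, e1), (h2, e2) in zip(levels, levels[1:]):
    print(f"h={h2}: observed order = {observed_order(h1, e1, h2, e2):.2f}")
```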
Development and Verification of the Charring, Ablating Thermal Protection Implicit System Simulator
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Calvert, Nathan; Kirk, Benjamin S.
2011-01-01
The development and verification of the Charring Ablating Thermal Protection Implicit System Solver (CATPISS) is presented. This work concentrates on the derivation and verification of the stationary grid terms in the equations that govern three-dimensional heat and mass transfer for charring thermal protection systems, including pyrolysis gas flow through the porous char layer. The governing equations are discretized according to the Galerkin finite element method (FEM) with first and second order fully implicit time integrators. The governing equations are fully coupled and are solved in parallel via Newton's method, while the linear system is solved via the Generalized Minimal Residual method (GMRES). Verification results from exact solutions and the Method of Manufactured Solutions (MMS) are presented to show spatial and temporal orders of accuracy as well as nonlinear convergence rates.
An Airbus arrives at KSC with third MPLM
NASA Technical Reports Server (NTRS)
2001-01-01
An Airbus "Beluga" air cargo plane, the Super Transporter, lands at KSC's Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
An Airbus arrives at KSC with third MPLM
NASA Technical Reports Server (NTRS)
2001-01-01
An Airbus "Beluga" air cargo plane, the Super Transporter, arrives at KSC's Shuttle Landing Facility from the factory of Alenia Aerospazio in Turin, Italy. Its cargo is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
Probabilistic Structural Analysis Methods (PSAM) for select space propulsion system components
NASA Technical Reports Server (NTRS)
1991-01-01
This annual report summarizes the work completed during the third year of technical effort on the referenced contract. Principal developments continue to focus on the Probabilistic Finite Element Method (PFEM), which has been under development for three years. Essentially all of the linear capabilities within the PFEM code are in place. Major progress was achieved in the application/verification phase. An EXPERT module architecture was designed and partially implemented. EXPERT is a user interface module which incorporates an expert system shell for the implementation of a rule-based interface utilizing the experience and expertise of the user community. The Fast Probability Integration (FPI) algorithm continues to demonstrate outstanding performance characteristics for the integration of probability density functions for multiple variables. Additionally, an enhanced Monte Carlo simulation algorithm was developed and demonstrated for a variety of numerical strategies.
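For illustration, a minimal Monte Carlo reliability estimate of the kind such simulation algorithms perform is sketched below; FPI itself is a faster, non-sampling method, and all distribution parameters here are invented.

```python
import numpy as np

# Hedged sketch of a probabilistic structural calculation: estimate the
# probability that a random stress exceeds a random strength by direct
# Monte Carlo sampling. Parameters are illustrative only.

rng = np.random.default_rng(42)
n = 1_000_000
stress = rng.normal(loc=300.0, scale=40.0, size=n)     # MPa
strength = rng.normal(loc=450.0, scale=30.0, size=n)   # MPa
p_fail = np.mean(stress > strength)
print(f"estimated failure probability: {p_fail:.2e}")
```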
NASA Astrophysics Data System (ADS)
McConkey, M. L.
1984-12-01
A complete CMOS/BULK design cycle has been implemented and fully tested to evaluate its effectiveness as a viable set of computer-aided design tools for the layout, verification, and simulation of CMOS/BULK integrated circuits. The design cycle supports p-well, n-well, or twin-well structures, although the fabrication techniques currently available limit this to p-well only. BANE, an integrated layout program from Stanford, is at the center of this design cycle and was shown to be simple to use in the layout of CMOS integrated circuits (it can also be used to lay out NMOS integrated circuits). A flowchart was developed showing the design cycle from initial layout, through design verification, to circuit simulation using NETLIST, PRESIM, and RNL from the University of Washington. A CMOS/BULK library was designed, including logic gates that were designed and completely tested by following this flowchart. An arithmetic logic unit was also designed as a more complex test of the CMOS/BULK design cycle.
Spectral Analysis of Forecast Error Investigated with an Observing System Simulation Experiment
NASA Technical Reports Server (NTRS)
Prive, N. C.; Errico, Ronald M.
2015-01-01
The spectra of analysis and forecast error are examined using the observing system simulation experiment (OSSE) framework developed at the National Aeronautics and Space Administration Global Modeling and Assimilation Office (NASA/GMAO). A global numerical weather prediction model, the Goddard Earth Observing System version 5 (GEOS-5) with Gridpoint Statistical Interpolation (GSI) data assimilation, is cycled for two months with once-daily forecasts to 336 hours to generate a control case. Verification of forecast errors using the Nature Run as truth is compared with verification of forecast errors using self-analysis; significant underestimation of forecast errors is seen using self-analysis verification for up to 48 hours. Likewise, self-analysis verification significantly overestimates the error growth rates of the early forecast, as well as mischaracterizing the spatial scales at which the strongest growth occurs. The Nature Run-verified error variances exhibit a complicated progression of growth, particularly for low wavenumber errors. In a second experiment, cycling of the model and data assimilation over the same period is repeated, but using synthetic observations with different explicitly added observation errors having the same error variances as the control experiment, thus creating a different realization of the control. The forecast errors of the two experiments become more correlated during the early forecast period, with correlations increasing for up to 72 hours before beginning to decrease.
NASA Technical Reports Server (NTRS)
Platt, R.
1998-01-01
This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.
An ORCID based synchronization framework for a national CRIS ecosystem.
Mendes Moreira, João; Cunha, Alcino; Macedo, Nuno
2015-01-01
PTCRIS (Portuguese Current Research Information System) is a program aiming at the creation and sustained development of a national integrated information ecosystem, to support research management according to the best international standards and practices. This paper reports on the experience of designing and prototyping a synchronization framework for PTCRIS based on ORCID (Open Researcher and Contributor ID). This framework embraces the "input once, re-use often" principle, and will enable a substantial reduction of the research output management burden by allowing automatic information exchange between the various national systems. The design of the framework followed best practices in rigorous software engineering, namely well-established principles from the research field of consistency management, and relied on formal analysis techniques and tools for its validation and verification. The notion of consistency between the services was formally specified and discussed with the stakeholders before the technical aspects of how to preserve said consistency were explored. Formal specification languages and automated verification tools were used to analyze the specifications and generate usage scenarios, useful for validation with the stakeholders and essential to certify compliant services.
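As a toy illustration of the consistency notion (not the PTCRIS specification), the sketch below treats a research-output set keyed by DOI and restores consistency by propagating records both ways; all names and structures are hypothetical.

```python
# Hypothetical sketch: one very simple notion of consistency between a
# local CRIS record set and an ORCID profile -- every output known to one
# service is known to the other, keyed by DOI.

def consistent(cris_outputs: set[str], orcid_outputs: set[str]) -> bool:
    return cris_outputs == orcid_outputs

def synchronize(cris_outputs: set[str], orcid_outputs: set[str]) -> set[str]:
    """Restore consistency by propagating each record to both services
    ('input once, re-use often')."""
    return cris_outputs | orcid_outputs

cris = {"10.1000/a", "10.1000/b"}
orcid = {"10.1000/b", "10.1000/c"}
merged = synchronize(cris, orcid)
assert consistent(merged, merged)
print(sorted(merged))
```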
The CHANDRA X-Ray Observatory: Thermal Design, Verification, and Early Orbit Experience
NASA Technical Reports Server (NTRS)
Boyd, David A.; Freeman, Mark D.; Lynch, Nicolie; Lavois, Anthony R. (Technical Monitor)
2000-01-01
The CHANDRA X-ray Observatory (formerly AXAF), one of NASA's "Great Observatories", was launched aboard the Shuttle in July 1999. CHANDRA comprises a grazing-incidence X-ray telescope of unprecedented focal length, collecting area and angular resolution -- better than two orders of magnitude improvement in imaging performance over any previous soft X-ray (0.1-10 keV) mission. Two focal-plane instruments, one with a 150 K passively-cooled detector, provide celestial X-ray images and spectra. Thermal control of CHANDRA includes active systems for the telescope mirror and environment and the optical bench, and largely passive systems for the focal-plane instruments. Performance testing of these thermal control systems required 1-1/2 years at increasing levels of integration, culminating in thermal-balance testing of the fully-configured observatory during the summer of 1998. This paper outlines details of thermal design tradeoffs and methods for both the Observatory and the two focal-plane instruments, describes the thermal verification philosophy of the Chandra program (what to test and at what level), and summarizes the results of the instrument, optical system and observatory testing.
NASA Technical Reports Server (NTRS)
Ryan, R. S.; Salter, L. D.; Young, G. M., III; Munafo, P. M.
1985-01-01
The planned missions for the space shuttle dictated a unique and technology-extending rocket engine. The high specific impulse requirements, in conjunction with a 55-mission lifetime plus volume and weight constraints, produced unique structural design, manufacturing, and verification requirements. Operations from Earth to orbit produce severe dynamic environments, which couple with the extreme pressure and thermal environments associated with the high performance, creating large low-cycle loads and high alternating stresses above the endurance limit, resulting in high sensitivity to alternating stresses. Combining all of these effects resulted in requirements for exotic materials, which are more susceptible to manufacturing problems, and the use of an all-welded structure. The challenge of integrating environments, dynamics, structures, and materials into a verified SSME structure is discussed. The verification program and developmental flight results are included. The first six shuttle flights had engine performance as predicted, with no failures. The engine system has met the basic design challenges.
NASA Technical Reports Server (NTRS)
Drury, Michael; Becker, Neil; Bos, Brent; Davila, Pamela; Frey, Bradley; Hylan, Jason; Marsh, James; McGuffey, Douglas; Novak, Maria; Ohl, Raymond;
2007-01-01
The James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (approximately 40 K). The JWST Observatory architecture includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) element that contains four science instruments (SI), including a Guider. The SIs and Guider are mounted to a composite metering structure with outer dimensions of 2.1x2.2x1.9m. The SI and Guider units are integrated to the ISIM structure and optically tested at NASA/Goddard Space Flight Center as an instrument suite using a high-fidelity, cryogenic JWST telescope simulator that features a 1.5m diameter powered mirror. The SIs are integrated and aligned to the structure under ambient, clean-room conditions. SI performance, including focus, pupil shear and wavefront error, is evaluated at the operating temperature. We present an overview of the ISIM integration within the context of Observatory-level construction. We describe the integration and verification plan for the ISIM element, including an overview of our incremental verification approach, ambient mechanical integration and test plans, and optical alignment and cryogenic test plans. We describe key ground support equipment and facilities.
Integrated fringe projection 3D scanning system for large-scale metrology based on laser tracker
NASA Astrophysics Data System (ADS)
Du, Hui; Chen, Xiaobo; Zhou, Dan; Guo, Gen; Xi, Juntong
2017-10-01
Large-scale components are widespread in the advanced manufacturing industry, and 3D profilometry plays a pivotal role in their quality control. This paper proposes a flexible, robust large-scale 3D scanning system that integrates a robot with a binocular structured-light scanner and a laser tracker. The measurement principle and construction of the integrated system are introduced, and a mathematical model is established for the global data fusion. Subsequently, a flexible and robust method and mechanism are introduced for establishing the end coordinate system; based on this method, a virtual robot model is constructed for hand-eye calibration, and the transformation matrix between the end coordinate system and the world coordinate system is solved. A validation experiment was implemented to verify the proposed algorithms: first, the hand-eye transformation matrix was solved; then a car body rear was measured 16 times to verify the global data fusion algorithm, and the 3D shape of the rear was reconstructed successfully.
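A brief numpy sketch of the global data fusion step may help: scanner-frame points are mapped to the tracker's world frame by composing the hand-eye transform with the tracker-measured pose. All matrices below are invented placeholders, not calibration results from the paper.

```python
import numpy as np

# Points measured in the scanner frame are mapped into the tracker's world
# frame by composing the hand-eye transform (scanner -> robot flange) with
# the tracker-measured flange pose (flange -> world).

def to_homogeneous(R, t):
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T

# Hypothetical calibration and pose data:
T_flange_scanner = to_homogeneous(np.eye(3), np.array([0.05, 0.0, 0.12]))  # hand-eye
T_world_flange = to_homogeneous(np.eye(3), np.array([1.2, -0.4, 0.9]))     # from tracker

T_world_scanner = T_world_flange @ T_flange_scanner

p_scanner = np.array([0.01, 0.02, 0.30, 1.0])     # one measured point (homogeneous)
p_world = T_world_scanner @ p_scanner
print(p_world[:3])
```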
UAS Integration in the NAS Project Overview: RTCA SC-228 Plenary DAA Working Group 5
NASA Technical Reports Server (NTRS)
Randall, Debra K.
2014-01-01
The presentation is intended to allow the public to know and understand NASA's plans for integrated testing, so they have the opportunity to provide feedback and suggestions. The integrated testing will support verification and validation of the RTCA SC-228 UAS minimum operational performance standards (MOPS) requirements.
Study of Measurement Strategies of Geometric Deviation of the Position of the Threaded Holes
NASA Astrophysics Data System (ADS)
Drbul, Mário; Martikan, Pavol; Sajgalik, Michal; Czan, Andrej; Broncek, Jozef; Babik, Ondrej
2017-12-01
Verification of product and quality control is an integral part of the current production process. In terms of functional requirements and product interoperability, it is necessary to analyze both dimensional and geometric specifications. Threaded holes are among the verified elements; they are a substantial part of detachable screw connections and have a broad presence in engineering products. This paper deals with the analysis of measurement strategies for verifying the geometric deviation of the position of threaded holes by the indirect method of measuring threaded pins, where the application of different measurement strategies can affect the result of product verification.
NASA Technical Reports Server (NTRS)
Zoutendyk, J. A.; Smith, L. S.; Soli, G. A.; Thieberger, P.; Wegner, H. E.
1985-01-01
Single-Event Upset (SEU) response of a bipolar low-power Schottky-diode-clamped TTL static RAM has been observed using Br ions in the 100-240 MeV energy range and O ions in the 20-100 MeV range. These data complete the experimental verification of circuit-simulation SEU modeling for this device. The threshold for onset of SEU has been observed by the variation of energy, ion species and angle of incidence. The results obtained from the computer circuit-simulation modeling and experimental model verification demonstrate a viable methodology for modeling SEU in bipolar integrated circuits.
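As a hedged aside for readers unfamiliar with how such heavy-ion test data are usually summarized (this is not the paper's model), a Weibull fit of SEU cross-section versus ion LET is the conventional form; all parameters below are invented.

```python
import numpy as np

# Conventional Weibull parameterization of SEU cross-section vs LET used
# to summarize heavy-ion test data. Parameters are illustrative only.

def weibull_xs(LET, sigma_sat=1e-7, L0=2.0, W=20.0, s=1.5):
    """SEU cross-section [cm^2/bit] vs LET [MeV*cm^2/mg]; L0 is threshold."""
    LET = np.asarray(LET, dtype=float)
    z = np.clip((LET - L0) / W, 0.0, None)   # no upsets below threshold
    return sigma_sat * (1.0 - np.exp(-z**s))

for L in (1.0, 5.0, 20.0, 60.0):
    print(f"LET {L:5.1f}: sigma = {weibull_xs(L):.2e} cm^2/bit")
```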
NASA Astrophysics Data System (ADS)
Rieben, James C., Jr.
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.
Fall 2012 Graduate Engineering Internship Summary
NASA Technical Reports Server (NTRS)
Ehrlich, Joshua
2013-01-01
In the fall of 2012, I participated in the National Aeronautics and Space Administration (NASA) Pathways Intern Employment Program at the Kennedy Space Center (KSC) in Florida. This was my second internship opportunity with NASA, a consecutive extension from a summer 2012 internship. During my four-month tenure, I gained valuable knowledge and extensive hands-on experience with payload design and testing as well as composite fabrication for repair design on future space vehicle structures. As a systems engineer, I supported the systems engineering and integration team with the testing of scientific payloads such as the Vegetable Production System (Veggie). Verification and validation (V&V) of the Veggie was carried out prior to qualification testing of the payload, which incorporated a lengthy process of confirming design requirements through one or more validation methods: inspection, analysis, demonstration, and testing. Additionally, I provided assistance in verifying design requirements outlined in the V&V plan against the requirements outlined by the scientists in the Science Requirements Envelope Document (SRED). The purpose of the SRED was to define the experiment requirements the payload was intended to meet and carry out.
Integrated Tokamak modeling: When physics informs engineering and research planning
NASA Astrophysics Data System (ADS)
Poli, Francesca Maria
2018-05-01
Modeling tokamaks enables a deeper understanding of how to run and control our experiments and how to design stable and reliable reactors. We model tokamaks to understand the nonlinear dynamics of plasmas embedded in magnetic fields and contained by finite-size conducting structures, and the interplay between turbulence, magneto-hydrodynamic instabilities, and wave propagation. This tutorial guides the reader through the components of a tokamak simulator, highlighting how high-fidelity simulations can guide the development of reduced models that can be used to understand how dynamics at small spatial scales and short time scales affects macroscopic transport and the global stability of plasmas. It discusses the important role that reduced models have in the modeling of an entire plasma discharge from startup to termination, the limits of these models, and how they can be improved. It also discusses the important role that efficient workflows have in the coupling between codes, in the validation of models against experiments, and in the verification of theoretical models. Finally, it reviews the status of integrated modeling and addresses the gaps and needs towards predictions of future devices and fusion reactors.
The IXV experience, from the mission conception to the flight results
NASA Astrophysics Data System (ADS)
Tumino, G.; Mancuso, S.; Gallego, J.-M.; Dussy, S.; Preaud, J.-P.; Di Vita, G.; Brunner, P.
2016-07-01
The atmospheric re-entry domain is a cornerstone of a wide range of space applications, ranging from reusable launcher stage development, robotic planetary exploration and human space flight to innovative applications such as reusable research platforms for in-orbit validation of multiple space-application technologies. The Intermediate eXperimental Vehicle (IXV) is an advanced demonstrator which has performed in-flight experimentation on atmospheric re-entry enabling systems and technologies, with significant advancements over Europe's previous flight experiences, consolidating Europe's autonomous position in the strategic field of atmospheric re-entry. The IXV mission objectives were the design, development, manufacturing, assembly and on-ground to in-flight verification of an autonomous European lifting and aerodynamically controlled re-entry system, integrating critical re-entry technologies at system level. Among such critical technologies of interest, special attention was paid to aerodynamic and aerothermodynamic experimentation, including advanced instrumentation for the investigation of aerothermodynamic phenomena, thermal protections and hot structures, and guidance, navigation and flight control through combined jets and aerodynamic surfaces (i.e. flaps), focusing in particular on the integration of these technologies at system level for flight, successfully performed on February 11th, 2015.
Crowd-Sourced Help with Emergent Knowledge for Optimized Formal Verification (CHEKOFV)
2016-03-01
…up game Binary Fission, which was deployed during Phase Two of CHEKOFV. Xylem: The Code of Plants is a casual game for players using mobile … there are the design and engineering challenges of building a game infrastructure that integrates verification technology with crowd participation … the backend processes that annotate the originating software. Allowing players to construct their own equations opened up the flexibility to receive…
Verification of the databases EXFOR and ENDF
NASA Astrophysics Data System (ADS)
Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel
2017-09-01
The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions and data in the resonance region of both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.
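For illustration, a resonance integral of the form RI = ∫ σ(E)/E dE (conventionally taken from 0.5 eV upward) can be computed from pointwise data and compared against the compiled value; the cross-section model in the sketch below is invented.

```python
import numpy as np

# Hedged sketch of the cross-check described above: compute a resonance
# integral RI = integral of sigma(E)/E dE from 0.5 eV to a high-energy
# cutoff, for comparison against the value compiled in the database.
# The cross-section data below are a toy 1/v term plus one resonance.

E = np.logspace(np.log10(0.5), 7, 2000)                          # eV
sigma = 10.0 / np.sqrt(E) + 50.0 / (1 + ((E - 6.7) / 0.5) ** 2)  # barns

RI = np.trapz(sigma / E, E)                                      # barns
print(f"resonance integral = {RI:.1f} b")
```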
Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model
NASA Astrophysics Data System (ADS)
Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal
How to capture and preserve digital evidence securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On one side, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of providing integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions to this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as the latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
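As a hedged sketch of the underlying cryptographic idea (not the PKIDEV protocol itself), the following signs the hash of an evidence file together with a timestamp using Ed25519 from the `cryptography` package, so any later modification is detectable.

```python
import hashlib, time
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Bind a digital-evidence blob to a time by signing its hash together
# with a timestamp; tampering with either is then detectable.

def sign_evidence(private_key, evidence: bytes):
    timestamp = time.time()
    digest = hashlib.sha256(evidence).digest()
    signature = private_key.sign(digest + str(timestamp).encode())
    return digest, timestamp, signature

def verify_evidence(public_key, evidence, digest, timestamp, signature) -> bool:
    if hashlib.sha256(evidence).digest() != digest:
        return False                      # evidence bytes were altered
    try:
        public_key.verify(signature, digest + str(timestamp).encode())
        return True
    except Exception:
        return False                      # signature (or timestamp) invalid

sk = Ed25519PrivateKey.generate()
digest, ts, sig = sign_evidence(sk, b"disk image bytes ...")
print(verify_evidence(sk.public_key(), b"disk image bytes ...", digest, ts, sig))
print(verify_evidence(sk.public_key(), b"tampered bytes", digest, ts, sig))
```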
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sethuraman, TKR; Sherif, M; Subramanian, N
Purpose: The complexity of IMRT delivery requires pre-treatment quality assurance and plan verification. KCCC has implemented IMRT clinically in a few sites and will extend it to all sites. Recently, our Varian linear accelerator and Eclipse planning system were upgraded from the Millennium 80 to the 120 Multileaf Collimator (MLC) and from v8.6 to v11.0, respectively. Our preliminary experience with pre-treatment quality assurance verification is discussed. Methods: Eight breast, three prostate and one hypopharynx cancer patients were planned with step-and-shoot IMRT. All breast cases were planned before the upgrade, with 60% of the cases treated. The ICRU 83 recommendations were followed for the dose prescription and constraints to OAR for all cases. Point dose measurement was done with a CIRS cylindrical phantom and a PTW 0.125 cc ionization chamber. Measured dose was compared with calculated dose at the point of measurement. A MapCHECK diode array phantom was used for the plan verification. Planned and measured doses were compared by applying a gamma index of 3% (dose difference) / 3 mm DTA (average distance to agreement). For all cases, a plan is considered successful if more than 95% of the tested diodes pass the gamma test. A prostate case was chosen to compare the plan verification before and after the upgrade. Results: Point dose measurement results were in agreement with the calculated doses. The maximum deviation observed was 2.3%. The passing rate of the average gamma index was higher than 97% for the plan verification of all cases. A similar result was observed for plan verification of the chosen prostate case before and after the upgrade. Conclusion: Our preliminary experience with the obtained results validates the accuracy of our QA process and provides confidence to extend IMRT to all sites in Kuwait.
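For readers unfamiliar with the gamma test, a simplified one-dimensional sketch is given below (commercial tools implement full 2-D/3-D versions); the dose profiles are invented.

```python
import numpy as np

# Simplified 1-D sketch of the gamma-index comparison (global 3% / 3 mm):
# a reference point passes if some measured point is close enough in the
# combined dose-difference / distance-to-agreement metric.

def gamma_pass_rate(x, dose_ref, dose_meas, dd=0.03, dta=3.0):
    """Fraction of reference points with gamma <= 1."""
    norm = dose_ref.max()                 # global normalization
    passed = 0
    for xi, di in zip(x, dose_ref):
        gamma_sq = ((x - xi) / dta) ** 2 + ((dose_meas - di) / (dd * norm)) ** 2
        if np.sqrt(gamma_sq.min()) <= 1.0:
            passed += 1
    return passed / len(x)

x = np.linspace(0, 100, 201)                     # mm
ref = np.exp(-((x - 50) / 20) ** 2)              # toy reference profile
meas = np.exp(-((x - 50.5) / 20) ** 2) * 1.01    # slightly shifted/scaled
print(f"gamma pass rate: {100 * gamma_pass_rate(x, ref, meas):.1f}%")
```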
The Yes-No Question Answering System and Statement Verification.
ERIC Educational Resources Information Center
Akiyama, M. Michael; And Others
1979-01-01
Two experiments investigated the relationship of verification to the answering of yes-no questions. Subjects verified simple statements or answered simple questions. Various proposals concerning the relative difficulty of answering questions and verifying statements were considered, and a model was proposed. (SW)
A probabilistic verification score for contours demonstrated with idealized ice-edge forecasts
NASA Astrophysics Data System (ADS)
Goessling, Helge; Jung, Thomas
2017-04-01
We introduce a probabilistic verification score for ensemble-based forecasts of contours: the Spatial Probability Score (SPS). Defined as the spatial integral of local (Half) Brier Scores, the SPS can be considered the spatial analog of the Continuous Ranked Probability Score (CRPS). Applying the SPS to idealized seasonal ensemble forecasts of the Arctic sea-ice edge in a global coupled climate model, we demonstrate that the SPS responds properly to ensemble size, bias, and spread. When applied to individual forecasts or ensemble means (or quantiles), the SPS reduces to the 'volume' of mismatch, which in the case of the ice edge corresponds to the Integrated Ice Edge Error (IIEE).
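A minimal sketch of the score on a uniform toy grid, assuming the definition above (spatial integral of local Brier scores), might look as follows; the fields are random placeholders.

```python
import numpy as np

# Spatial Probability Score: spatial integral of local Brier scores between
# the ensemble-derived probability field P and the binary observed field O
# (e.g., ice / no ice).

def sps(p_forecast, o_observed, cell_area):
    """SPS = sum over grid cells of area * (P - O)^2."""
    return np.sum(cell_area * (p_forecast - o_observed) ** 2)

ny, nx = 50, 80
rng = np.random.default_rng(0)
members = rng.random((10, ny, nx)) > 0.5     # 10-member binary ensemble (toy)
p = members.mean(axis=0)                     # local event probability
o = (rng.random((ny, nx)) > 0.5).astype(float)
area = np.full((ny, nx), 25.0)               # km^2 per cell (uniform toy grid)
print(f"SPS = {sps(p, o, area):.1f} km^2")
```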
NASA Lighting Research, Test, & Analysis
NASA Technical Reports Server (NTRS)
Clark, Toni
2015-01-01
The Habitability and Human Factors Branch at Johnson Space Center in Houston, TX, provides technical guidance for the development of spaceflight lighting requirements, verification of light system performance, analysis of integrated environmental lighting systems, and research of lighting-related human performance issues. The Habitability & Human Factors Lighting Team maintains two physical facilities that are integrated to provide support. The Lighting Environment Test Facility (LETF) provides a controlled darkroom environment for physical verification of lighting systems with photometric and spectrographic measurement systems. The Graphics Research & Analysis Facility (GRAF) maintains the capability for computer-based analysis of operational lighting environments. The combined capabilities of the Lighting Team at Johnson Space Center have been used for a wide range of lighting-related issues.
P91-1 ARGOS spacecraft thermal control
NASA Astrophysics Data System (ADS)
Sadunas, Jonas; Baginski, Ben; McCarthy, Daniel
1993-07-01
The P91-1, or ARGOS, is a Department of Defense (DOD) funded Space Test Program (STP) satellite managed by the Space and Missile Systems Center Space and Small Launch Vehicle Programs Office (SMC/CUL). Rockwell International Space Systems Division is the space vehicle prime contractor. The P91-1 mission is to fly a suite of eight experiments in a 450-nautical-mile sun-synchronous orbit dedicated to three-dimensional UV imaging of the ionosphere, X-ray source mapping, navigation, space debris characterization, performance characterization of high-temperature superconductivity RF devices, and on-orbit demonstration of an electric propulsion system. The primary purpose of this paper is to acquaint the thermal control community, and potential future follow-on mission users, with the thermal control characteristics of the spacecraft, experiment/SV thermal integration aspects, and test verification plans.
NASA Technical Reports Server (NTRS)
Martinez, Pedro A.; Dunn, Kevin W.
1987-01-01
This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.
Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia
2014-11-01
Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required.
NASA Astrophysics Data System (ADS)
Huang, Lei; Zhou, Chenlu; Zhao, Wenchuan; Choi, Heejoo; Graves, Logan; Kim, Daewook
2017-06-01
We present a high-precision deflectometry system (DS) controlled deformable mirror (DM) solution for optical systems. Different from wavefront and non-wavefront systems, the DS and the DM form a single integrated DCDM unit that can be installed on one base plate. In the DCDM unit, the DS can directly provide the influence functions and surface shape of the DM to the industrial computer of any adaptive optics system. As an integrated adaptive unit, the DCDM unit can be inserted into various optical systems to realize aberration compensation. In this paper, the configuration and principle of the DCDM unit are introduced first. A theoretical simulation of the closed-loop performance of the DCDM unit is carried out. Finally, a verification experiment is proposed to verify the compensation capability of the DCDM unit.
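As an illustration of the closed-loop step such a unit performs (a sketch under generic adaptive-optics assumptions, not the authors' implementation), actuator commands can be solved from the measured influence-function matrix in the least-squares sense:

```python
import numpy as np

# Given the influence-function matrix A measured by the deflectometry
# system (surface response per unit actuator command) and the current
# surface error, solve for actuator commands that best cancel the error.

rng = np.random.default_rng(1)
n_pix, n_act = 400, 37
A = rng.normal(size=(n_pix, n_act))          # measured influence functions (toy)
surface_error = A @ rng.normal(size=n_act)   # an error the DM can correct

commands, *_ = np.linalg.lstsq(A, -surface_error, rcond=None)
residual = surface_error + A @ commands
print(f"RMS before: {surface_error.std():.3e}, after: {residual.std():.3e}")
```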
An Airbus arrives at KSC with third MPLM
NASA Technical Reports Server (NTRS)
2001-01-01
An Airbus "Beluga" air cargo plane, the Super Transporter, taxis onto the parking apron at KSC's Shuttle Landing Facility. Its cargo, from the factory of Alenia Aerospazio in Turin, Italy, is the Italian Space Agency's Multi-Purpose Logistics Module Donatello, the third of three for the International Space Station. The module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
MPLM Donatello is offloaded at the SLF
NASA Technical Reports Server (NTRS)
2001-01-01
At the Shuttle Landing Facility, workers in cherry pickers (right) help guide offloading of the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
Knowledge-based assistance in costing the space station DMS
NASA Technical Reports Server (NTRS)
Henson, Troy; Rone, Kyle
1988-01-01
The Software Cost Engineering (SCE) methodology developed over the last two decades at IBM Systems Integration Division (SID) in Houston is utilized to cost the NASA Space Station Data Management System (DMS). An ongoing project to capture this methodology, which is built on a foundation of experiences and lessons learned, has resulted in the development of an internal-use-only, PC-based prototype that integrates algorithmic tools with knowledge-based decision support assistants. This prototype Software Cost Engineering Automation Tool (SCEAT) is being employed to assist in the DMS costing exercises. At the same time, DMS costing serves as a forcing function and provides a platform for the continuing, iterative development, calibration, and validation and verification of SCEAT. The data that forms the cost engineering database is derived from more than 15 years of development of NASA Space Shuttle software, ranging from low criticality, low complexity support tools to highly complex and highly critical onboard software.
Toward an integrated software platform for systems pharmacology
Ghosh, Samik; Matsuoka, Yukiko; Asai, Yoshiyuki; Hsin, Kun-Yi; Kitano, Hiroaki
2013-01-01
Understanding complex biological systems requires the extensive support of computational tools. This is particularly true for systems pharmacology, which aims to understand the action of drugs and their interactions in a systems context. Computational models play an important role, as they can be viewed as an explicit representation of biological hypotheses to be tested. A series of software and data resources are used for model development, verification, and model-based exploration of possible system behaviors that may not be feasible or cost-effective to probe by experiment. Software platforms play a dominant role in creativity and productivity support and have transformed many industries; these techniques can be applied to biology as well. Establishing an integrated software platform will be the next important step in the field. PMID:24150748
The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope
NASA Technical Reports Server (NTRS)
Mosler, Gary E.; Howard, Joseph M.; Johnston, John D.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, A. Marcel; Kim, Kevin; Ha, Kong Q.
2004-01-01
This viewgraph presentation gives an overview of the architecture of the James Webb Space Telescope, and explains how integrated modeling is useful for analyzing wavefront, thermal distortion, subsystems, and image motion/jitter for the telescope design.
Integrated vehicle-based safety systems heavy-truck on-road test report
DOT National Transportation Integrated Search
2008-08-01
This report presents results from a series of on-road verification tests performed to determine the readiness of a prototype integrated warning system to advance to field testing, as well as to identify areas of system performance that should be improved.
Integrated vehicle-based safety systems light-vehicle on-road test report
DOT National Transportation Integrated Search
2008-08-01
This report presents results from a series of on-road verification tests performed to determine the readiness of a prototype : integrated warning system to advance to field testing, as well as to identify areas of system performance that should be im...
Ng, Candy K S; Osuna-Sanchez, Hector; Valéry, Eric; Sørensen, Eva; Bracewell, Daniel G
2012-06-15
An integrated experimental and modeling approach for the design of high-productivity protein A chromatography is presented to maximize productivity in bioproduct manufacture. The approach consists of four steps: (1) small-scale experimentation, (2) model parameter estimation, (3) productivity optimization and (4) model validation with process verification. The integrated use of process experimentation and modeling enables fewer experiments to be performed, and thus minimizes the time and materials required in order to gain process understanding, which is of key importance during process development. The application of the approach is demonstrated for the capture of antibody by a novel silica-based high-performance protein A adsorbent named AbSolute. In the example, a series of pulse injections and breakthrough experiments were performed to develop a lumped parameter model, which was then used to find the best design that optimizes the productivity of a batch protein A chromatographic process for human IgG capture. An optimum productivity of 2.9 kg L⁻¹ day⁻¹ for a column of 5 mm diameter and 8.5 cm length was predicted, and subsequently verified experimentally, completing the whole process design approach in only 75 person-hours (or approximately 2 weeks).
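A toy sketch of the step (3) productivity optimization is given below, using an idealized cycle model rather than the authors' lumped-parameter model; every numerical value is illustrative (the column volume merely approximates the 5 mm x 8.5 cm geometry).

```python
import numpy as np

# Productivity = mass captured per litre of resin per day for one cycle.
# A longer load captures more product but lengthens the cycle; capacity
# caps the capture, so an optimum loading time exists.

def productivity(load_min, flow=1.0, feed=5.0, col_vol=1.67,
                 capacity=40.0, overhead_min=60.0):
    """kg per L per day; flow [mL/min], feed titre [mg/mL],
    col_vol [mL], capacity [mg/mL], overhead = wash/elute/regenerate."""
    mass_mg = min(flow * feed * load_min, capacity * col_vol)
    cycle_min = load_min + overhead_min
    return (mass_mg / col_vol) * 1e-3 * 1440.0 / cycle_min

load_times = np.linspace(5, 120, 200)
p = [productivity(t) for t in load_times]
best = load_times[int(np.argmax(p))]
print(f"optimal load time = {best:.1f} min, productivity = {max(p):.2f} kg/L/day")
```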
Formal Verification of Complex Systems based on SysML Functional Requirements
2014-12-23
Mehrpouyan, Hoda; Tumer, Irem Y.; Hoyle, Chris; Giannakopoulou, Dimitra … requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements … methods and tools to support the integration of safety into the design solution.
Clinical Skills Verification, Formative Feedback, and Psychiatry Residency Trainees
ERIC Educational Resources Information Center
Dalack, Gregory W.; Jibson, Michael D.
2012-01-01
Objective: The authors describe the implementation of Clinical Skills Verification (CSV) in their program as an in-training assessment intended primarily to provide formative feedback to trainees, strengthen the supervisory experience, and identify the need for remediation of interviewing skills, and secondarily to demonstrate resident competence…
Statement Verification: A Stochastic Model of Judgment and Response.
ERIC Educational Resources Information Center
Wallsten, Thomas S.; Gonzalez-Vallejo, Claudia
1994-01-01
A stochastic judgment model (SJM) is presented as a framework for addressing issues in statement verification and probability judgment. Results of 5 experiments with 264 undergraduates support the validity of the model and provide new information that is interpreted in terms of the SJM. (SLD)
NASA Technical Reports Server (NTRS)
Belcastro, Christine M.
2010-01-01
Loss of control remains one of the largest contributors to aircraft fatal accidents worldwide. Aircraft loss-of-control accidents are highly complex in that they can result from numerous causal and contributing factors acting alone or (more often) in combination. Hence, there is no single intervention strategy to prevent these accidents, and reducing them will require a holistic integrated intervention capability. Future onboard integrated system technologies developed for preventing loss-of-vehicle-control accidents must be able to assure safe operation under the associated off-nominal conditions. The transition of these technologies into the commercial fleet will require their extensive validation and verification (V&V) and ultimate certification. The V&V of complex integrated systems poses major nontrivial technical challenges, particularly for safety-critical operation under the highly off-nominal conditions associated with aircraft loss-of-control events. This paper summarizes the V&V problem and presents a proposed process that could be applied to complex integrated safety-critical systems developed for preventing aircraft loss-of-control accidents. A summary of recent research accomplishments in this effort is also provided.
Cluster man/system design requirements and verification. [for Skylab program
NASA Technical Reports Server (NTRS)
Watters, H. H.
1974-01-01
Discussion of the procedures employed for determining the man/system requirements that guided Skylab design, and review of the techniques used for implementing the man/system design verification. The foremost lesson learned from this requirements-anticipation and design-verification experience is the necessity of allowing for human in-flight maintenance and repair capabilities. It is now known that the entire program was salvaged by a series of unplanned maintenance and repair events which were implemented in spite of poor design provisions for maintenance.
Results from an Independent View on The Validation of Safety-Critical Space Systems
NASA Astrophysics Data System (ADS)
Silva, N.; Lopes, R.; Esper, A.; Barbosa, R.
2013-08-01
Independent verification and validation (IV&V) has been a key process for decades and is considered in several international standards. One of the activities described in the "ESA ISVV Guide" is independent test verification (stated as Integration/Unit Test Procedures and Test Data Verification). This activity is commonly overlooked, since customers do not really see the added value of thoroughly checking the validation team's work (it could be seen as testing the testers' work). This article presents the consolidated results of a large set of independent test verification activities, including the main difficulties, the results obtained, and the advantages/disadvantages of these activities for industry. This study will support customers in opting in or out of this task in future IV&V contracts, since we provide concrete results from real case studies in the space embedded systems domain.
CASL Verification and Validation Plan
DOE Office of Scientific and Technical Information (OSTI.GOV)
Mousseau, Vincent Andrew; Dinh, Nam
2016-06-30
This report documents the Consortium for Advanced Simulation of LWRs (CASL) verification and validation plan. The document builds upon input from CASL subject matter experts, most notably the CASL Challenge Problem Product Integrators, CASL Focus Area leaders, and CASL code development and assessment teams. This document will be a living document that will track progress on CASL verification and validation for both the CASL codes (including MPACT, CTF, BISON, MAMBA) and for the CASL challenge problems (CIPS, PCI, DNB). The CASL codes and the CASL challenge problems are at differing levels of maturity with respect to validation and verification. The gap analysis will summarize additional work that needs to be done. Additional VVUQ work will be done as resources permit. This report is prepared for the Department of Energy’s (DOE’s) CASL program in support of milestone CASL.P13.02.
Location-assured, multifactor authentication on smartphones via LTE communication
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan A.; Al-Assam, Hisham
2013-05-01
With the added security provided by LTE, geographical location has become an important factor for authentication to enhance the security of remote client authentication during mCommerce applications using Smartphones. Tightly combining geographical location with classic authentication factors like PINs/biometrics in a real-time, remote verification scheme over the LTE connection assures the authenticator of the client's identity (via PIN/biometric) as well as the client's current location, thus defining the important "who", "when", and "where" aspects of the authentication attempt without eavesdropping or man-in-the-middle attacks. To securely integrate location as an authentication factor into the remote authentication scheme, the client's location must be verified independently, i.e. the authenticator should not rely solely on the location determined on and reported by the client's Smartphone. The latest wireless data communication technology for mobile phones (4G LTE, Long-Term Evolution), recently rolled out in various networks, can be employed to meet this requirement of independent location verification. LTE's Control Plane LBS provisions, when integrated with user-based authentication factors and an independent source of localisation, ensure secure, efficient, continuous location tracking of the Smartphone. This tracking can be performed during normal operation of the LTE-based communication between client and network operator, enabling the authenticator to verify the client's claimed location more securely and accurately. Trials and experiments show that such an algorithm implementation is viable for today's Smartphone-based banking via LTE communication.
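A minimal sketch of the server-side decision rule such a scheme implies, assuming the server receives the PIN result, a biometric score, the device-reported location, and an independent network-derived fix; the function names, thresholds, and distance rule are illustrative assumptions, not the paper's protocol:

```python
import math

# Hypothetical decision rule combining the "who/when/where" factors
# described above; thresholds and the distance test are assumptions.
def authenticate(pin_ok: bool, biometric_score: float,
                 claimed_loc: tuple, network_loc: tuple,
                 bio_threshold: float = 0.8, max_drift_m: float = 500.0) -> bool:
    """Grant access only if the PIN, the biometric, and an independently
    verified location (e.g. an LTE control-plane fix) all agree."""
    def haversine_m(a, b):
        # Great-circle distance between two (lat, lon) pairs in metres.
        lat1, lon1, lat2, lon2 = map(math.radians, (*a, *b))
        h = (math.sin((lat2 - lat1) / 2) ** 2
             + math.cos(lat1) * math.cos(lat2) * math.sin((lon2 - lon1) / 2) ** 2)
        return 2 * 6371000 * math.asin(math.sqrt(h))

    location_ok = haversine_m(claimed_loc, network_loc) <= max_drift_m
    return pin_ok and biometric_score >= bio_threshold and location_ok

# Example: the device-reported fix must fall within 500 m of the network fix.
print(authenticate(True, 0.91, (48.2082, 16.3738), (48.2079, 16.3741)))
```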
Integrated heat pipe-thermal storage system performance evaluation
NASA Technical Reports Server (NTRS)
Keddy, E.; Sena, J. T.; Merrigan, M.; Heidenreich, Gary
1987-01-01
An integrated thermal energy storage (TES) system, developed as a part of an organic Rankine cycle solar dynamic power system is described, and the results of the performance verification tests of this TES system are presented. The integrated system consists of potassium heat-pipe elements that incorporate TES canisters within the vapor space, along with an organic fluid heater tube used as the condenser region of the heat pipe. The heat pipe assembly was operated through the range of design conditions from the nominal design input of 4.8 kW to a maximum of 5.7 kW. The performance verification tests show that the system meets the functional requirements of absorbing the solar energy reflected by the concentrator, transporting the energy to the organic Rankine heater, providing thermal storage for the eclipse phase, and allowing uniform discharge from the thermal storage to the heater.
Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smith, Leon E.; Miller, Karen A.; Garner, James R.
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA) and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community’s understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and of the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.
Speed and Accuracy in the Processing of False Statements About Semantic Information.
ERIC Educational Resources Information Center
Ratcliff, Roger
1982-01-01
A standard reaction time procedure and a response signal procedure were used on data from eight experiments on semantic verification. Results suggest that simple models of the semantic verification task that assume a single yes/no dimension on which discrimination is made are not correct. (Author/PN)
Formulating face verification with semidefinite programming.
Yan, Shuicheng; Liu, Jianzhuang; Tang, Xiaoou; Huang, Thomas S
2007-11-01
This paper presents a unified solution to three unsolved problems existing in face verification with subspace learning techniques: selection of verification threshold, automatic determination of subspace dimension, and deducing feature fusing weights. In contrast to previous algorithms which search for the projection matrix directly, our new algorithm investigates a similarity metric matrix (SMM). With a certain verification threshold, this matrix is learned by a semidefinite programming approach, along with the constraints of the kindred pairs with similarity larger than the threshold, and inhomogeneous pairs with similarity smaller than the threshold. Then, the subspace dimension and the feature fusing weights are simultaneously inferred from the singular value decomposition of the derived SMM. In addition, the weighted and tensor extensions are proposed to further improve the algorithmic effectiveness and efficiency, respectively. Essentially, the verification is conducted within an affine subspace in this new algorithm and is, hence, called the affine subspace for verification (ASV). Extensive experiments show that the ASV can achieve encouraging face verification accuracy in comparison to other subspace algorithms, even without the need to explore any parameters.
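A minimal sketch of learning a similarity metric matrix by semidefinite programming, in the spirit of the approach above; the bilinear similarity form, the unit margin, and the slack penalty are illustrative assumptions, not the paper's exact formulation:

```python
import numpy as np
import cvxpy as cp

# Learn a PSD similarity metric matrix (SMM) so kindred pairs score above
# a threshold b and inhomogeneous pairs below it; slack keeps it feasible.
def learn_smm(kindred, inhomogeneous, b=0.0, dim=8):
    M = cp.Variable((dim, dim), PSD=True)    # SMM constrained to be PSD
    cons, slack = [], []
    for x, y in kindred:                     # same-subject pairs
        s = cp.Variable(nonneg=True)
        cons.append(x @ M @ y >= b + 1 - s)  # similarity above threshold
        slack.append(s)
    for x, y in inhomogeneous:               # different-subject pairs
        s = cp.Variable(nonneg=True)
        cons.append(x @ M @ y <= b - 1 + s)  # similarity below threshold
        slack.append(s)
    cost = cp.norm(M, "fro") + 10 * cp.sum(cp.hstack(slack))
    cp.Problem(cp.Minimize(cost), cons).solve()
    # Subspace dimension read off the significant singular values of M.
    sv = np.linalg.svd(M.value, compute_uv=False)
    return M.value, int((sv > 1e-3 * sv[0]).sum())

rng = np.random.default_rng(0)
pairs = lambda n: [(rng.standard_normal(8), rng.standard_normal(8)) for _ in range(n)]
M, subspace_dim = learn_smm(pairs(5), pairs(5))
```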
Alternative sample sizes for verification dose experiments and dose audits
NASA Astrophysics Data System (ADS)
Taylor, W. A.; Hansen, J. M.
1999-01-01
ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection. These sampling plans can significantly reduce the cost of testing, and they have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the costs associated with the different plans are provided. This paper includes additional guidance, not included in the technical report, for selecting between the original and alternative sampling plans.
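The notion of "equivalent protection" can be made concrete by comparing operating-characteristic curves under a binomial model; the plan parameters (n, c) below are hypothetical examples, not the ISO 11137 or draft-report values:

```python
from math import comb

# Probability of passing a verification dose experiment that tests n
# samples and accepts at most c positives, under a binomial model.
def p_accept(n: int, c: int, p_positive: float) -> float:
    return sum(comb(n, k) * p_positive**k * (1 - p_positive)**(n - k)
               for k in range(c + 1))

# Two plans offer roughly equivalent protection if their curves coincide
# across the relevant range of per-sample positive probabilities.
for p in (0.01, 0.05, 0.10, 0.20):
    print(f"p={p:.2f}  plan(100,2)={p_accept(100, 2, p):.3f}  "
          f"plan(55,1)={p_accept(55, 1, p):.3f}")
```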
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meyer, Ryan M.; Suter, Jonathan D.; Jones, Anthony M.
2014-09-12
This report documents FY14 efforts for two instrumentation subtasks under storage and transportation. These instrumentation tasks relate to developing effective nondestructive evaluation (NDE) methods and techniques to (1) verify the integrity of metal canisters for the storage of used nuclear fuel (UNF) and to (2) verify the integrity of dry storage cask internals.
Overview of the TOPEX/Poseidon Platform Harvest Verification Experiment
NASA Technical Reports Server (NTRS)
Morris, Charles S.; DiNardo, Steven J.; Christensen, Edward J.
1995-01-01
An overview is given of the in situ measurement system installed on Texaco's Platform Harvest for verification of the sea level measurement from the TOPEX/Poseidon satellite. The prelaunch error budget suggested that the total root mean square (RMS) error due to measurements made at this verification site would be less than 4 cm. The actual error budget for the verification site is within these original specifications. However, evaluation of the sea level data from three measurement systems at the platform has resulted in unexpectedly large differences between the systems. Comparison of the sea level measurements from the different tide gauge systems has led to a better understanding of the problems of measuring sea level in relatively deep ocean. As of May 1994, the Platform Harvest verification site has successfully supported 60 TOPEX/Poseidon overflights.
CFD Code Validation of Wall Heat Fluxes for a GO2/GH2 Single Element Combustor
NASA Technical Reports Server (NTRS)
Lin, Jeff; West, Jeff S.; Williams, Robert W.; Tucker, P. Kevin
2005-01-01
This paper puts forth the case for the need for improved injector design tools to meet NASA's Vision for Space Exploration goals. Requirements for this improved tool are outlined and discussed. The potential for Computational Fluid Dynamics (CFD) to meet these requirements is noted, along with its current shortcomings, especially relative to demonstrated solution accuracy. The concept of verification and validation is introduced as the primary process for building and quantifying the confidence necessary for CFD to be useful as an injector design tool. The verification and validation process is considered in the context of the Marshall Space Flight Center (MSFC) Combustion Devices CFD Simulation Capability Roadmap via the Simulation Readiness Level (SRL) concept. The portion of the validation process which demonstrates the ability of a CFD code to simulate heat fluxes to a rocket engine combustor wall is the focus of the current effort. The FDNS and Loci-CHEM codes are used to simulate a shear coaxial single element GO2/GH2 injector experiment. The experiment was conducted at a chamber pressure of 750 psia using hot propellants from preburners. A measured wall temperature profile is used as a boundary condition to facilitate the calculations. Converged solutions, obtained from both codes by using wall functions with the k-ε turbulence model and by integrating to the wall using Menter's baseline turbulence model, are compared to the experimental data. The initial solutions from both codes revealed significant issues with the wall function implementation associated with the recirculation zone between the shear coaxial jet and the chamber wall. The FDNS solution with a corrected implementation shows marked improvement in the overall character and level of comparison to the data. With the FDNS code, integrating to the wall with Menter's baseline turbulence model actually produced a degraded solution compared to the wall function solution with the k-ε model. The Loci-CHEM solution, produced by integrating to the wall with Menter's baseline turbulence model, matches both the heat flux rise rate in the near-injector region and the peak heat flux level very well, although it moderately overpredicts the heat fluxes downstream of the reattachment point. The Loci-CHEM solution achieved by integrating to the wall with Menter's baseline turbulence model was clearly superior to the other solutions produced in this effort.
Distilling the Verification Process for Prognostics Algorithms
NASA Technical Reports Server (NTRS)
Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai
2013-01-01
The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.
Optical/digital identification/verification system based on digital watermarking technology
NASA Astrophysics Data System (ADS)
Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.
2000-06-01
This paper presents a new approach for the secure integrity verification of driver licenses, passports or other analogue identification documents. The system embeds (detects) the reference number of the identification document with DCT watermark technology in (from) the photo of the identification document holder. During verification the reference number is extracted and compared with the reference number printed on the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation, and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents applied in many European countries.
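A minimal numeric sketch of the verification idea: bits of a reference number are embedded in mid-frequency DCT coefficients of the holder photo and re-extracted for comparison with the printed number. The band choice, quantisation step, and parity rule are assumptions, not the paper's DCT watermark scheme:

```python
import numpy as np
from scipy.fft import dctn, idctn

STEP = 24.0  # quantisation step (assumed)
BAND = [(3, 4), (4, 3), (5, 2), (2, 5), (4, 4), (3, 5), (5, 3), (5, 4)]

def embed(photo, bits):
    c = dctn(photo.astype(float), norm="ortho")
    for (i, j), bit in zip(BAND, bits):
        q = int(np.round(c[i, j] / STEP))    # quantisation-index embedding
        if q % 2 != bit:
            q += 1                           # force coefficient parity = bit
        c[i, j] = q * STEP
    return idctn(c, norm="ortho")

def extract(photo):
    c = dctn(photo.astype(float), norm="ortho")
    return [int(np.round(c[i, j] / STEP)) % 2 for (i, j) in BAND]

photo = np.arange(64.0 * 64.0).reshape(64, 64)   # stand-in photo
bits = [1, 0, 1, 1, 0, 0, 1, 0]                  # e.g. part of a reference number
assert extract(embed(photo, bits)) == bits       # unmodified document verifies
```

A tampered photo or mismatched printed number would break this equality, which is the verification condition the paper describes.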
Design and Verification of Critical Pressurised Windows for Manned Spaceflight
NASA Astrophysics Data System (ADS)
Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.
2014-06-01
The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.
Hand Grasping Synergies As Biometrics.
Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana
2017-01-01
Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. These include iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement, which integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies: postural synergies. In this proof-of-concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security.
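A sketch of the synergy-as-biometric pipeline described above: each grasp trial is the concatenated angular-velocity profile of 10 joints, PCA yields the spatiotemporal synergies, and a probe is verified by the distance of its synergy weights from the enrolled user's mean. The array shapes, the five-synergy choice, and the threshold are illustrative assumptions:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
trials = rng.standard_normal((60, 10 * 100))  # 60 grasps, 10 joints x 100 samples

pca = PCA(n_components=5)
scores = pca.fit_transform(trials)            # per-trial synergy weights
synergies = pca.components_                   # spatiotemporal synergies

enrolled_mean = scores[:30].mean(axis=0)      # claimed user's enrollment trials

def verify(probe_trial, threshold=25.0):
    # Accept if the probe's synergy weights lie close to the enrolled mean.
    w = pca.transform(probe_trial.reshape(1, -1))[0]
    return bool(np.linalg.norm(w - enrolled_mean) < threshold)

print(verify(trials[0]))
```

Sweeping the threshold trades false accepts against false rejects, which is how an equal error rate such as the reported 8.19% would be located.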
Calvo, Roque; D’Amato, Roberto; Gómez, Emilio; Domingo, Rosario
2016-01-01
Coordinate measuring machines (CMM) are main instruments of measurement in laboratories and in industrial quality control. A compensation error model has been formulated (Part I). It integrates error and uncertainty in the feature measurement model. Experimental implementation for the verification of this model is carried out based on the direct testing on a moving bridge CMM. The regression results by axis are quantified and compared to CMM indication with respect to the assigned values of the measurand. Next, testing of selected measurements of length, flatness, dihedral angle, and roundness features are accomplished. The measurement of calibrated gauge blocks for length or angle, flatness verification of the CMM granite table and roundness of a precision glass hemisphere are presented under a setup of repeatability conditions. The results are analysed and compared with alternative methods of estimation. The overall performance of the model is endorsed through experimental verification, as well as the practical use and the model capability to contribute in the improvement of current standard CMM measuring capabilities. PMID:27754441
SPR Hydrostatic Column Model Verification and Validation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bettin, Giorgia; Lord, David; Rudeen, David Keith
2015-10-01
A Hydrostatic Column Model (HCM) was developed to help differentiate between normal "tight" well behavior and small-leak behavior under nitrogen for testing the pressure integrity of crude oil storage wells at the U.S. Strategic Petroleum Reserve. This effort was motivated by steady, yet distinct, pressure behavior of a series of Big Hill caverns that had been placed under nitrogen for extended periods of time. This report describes the HCM model, its functional requirements, the model structure, and the verification and validation process. Different modes of operation are also described, which illustrate how the software can be used to model extended nitrogen monitoring and Mechanical Integrity Tests by predicting wellhead pressures along with nitrogen interface movements. Model verification has shown that the program runs correctly and is implemented as intended. The cavern BH101 long-term nitrogen test was used to validate the model, which showed very good agreement with measured data. This supports the claim that the model is, in fact, capturing the relevant physical phenomena and can be used to make accurate predictions of both wellhead pressure and interface movements.
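A minimal sketch of the hydrostatic bookkeeping such a model performs: the predicted wellhead pressure is the cavern pressure minus the weight of the stacked fluid column. The densities, heights, and cavern pressure below are illustrative assumptions, not SPR well data:

```python
G = 9.81  # gravitational acceleration, m/s^2

def wellhead_pressure(p_cavern_pa, segments):
    """segments: (density_kg_m3, height_m) pairs from the cavern upward,
    e.g. brine, then crude oil, then the nitrogen column."""
    p = p_cavern_pa
    for rho, h in segments:
        p -= rho * G * h   # subtract the hydrostatic head of each segment
    return p

# A slowly moving nitrogen/oil interface redistributes these heights, so
# comparing repeated predictions against measured wellhead pressure can
# separate "tight" well behavior from a small leak.
p_wh = wellhead_pressure(12.0e6, [(1200.0, 100.0), (850.0, 200.0), (180.0, 600.0)])
print(f"predicted wellhead pressure: {p_wh / 1e6:.2f} MPa")
```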
Social cognitive theory of posttraumatic recovery: the role of perceived self-efficacy.
Benight, Charles C; Bandura, Albert
2004-10-01
The present article integrates findings from diverse studies on the generalized role of perceived coping self-efficacy in recovery from different types of traumatic experiences. They include natural disasters, technological catastrophes, terrorist attacks, military combat, and sexual and criminal assaults. The various studies apply multiple controls for diverse sets of potential contributors to posttraumatic recovery. In these different multivariate analyses, perceived coping self-efficacy emerges as a focal mediator of posttraumatic recovery. Verification of its independent contribution to posttraumatic recovery across a wide range of traumas lends support to the centrality of the enabling and protective function of belief in one's capability to exercise some measure of control over traumatic adversity.
NASA Technical Reports Server (NTRS)
Luck, Rogelio; Ray, Asok
1990-01-01
The implementation and verification of the delay-compensation algorithm are addressed. The delay compensator has been experimentally verified on an IEEE 802.4 network testbed for velocity control of a DC servomotor. The performance of the delay-compensation algorithm was also examined by combined discrete-event and continuous-time simulation of the flight control system of an advanced aircraft that uses the SAE (Society of Automotive Engineers) linear token passing bus for data communications.
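For flavor, here is a Smith-predictor-style sketch of network delay compensation for a velocity loop: the controller acts on an undelayed model prediction and corrects the model as the late measurements arrive. The first-order motor model, gains, and correction factor are assumptions, not the algorithm verified in the paper:

```python
from collections import deque

class DelayCompensatedLoop:
    def __init__(self, delay_steps, a=0.95, b=0.05, kp=1.2):
        self.buf = deque([0.0] * delay_steps, maxlen=delay_steps)  # delay line
        self.a, self.b, self.kp = a, b, kp
        self.x_model = 0.0                   # predicted (undelayed) velocity

    def step(self, setpoint, delayed_meas):
        model_delayed = self.buf[0]          # model output delay_steps ago
        self.x_model += 0.1 * (delayed_meas - model_delayed)  # model correction
        u = self.kp * (setpoint - self.x_model)               # act on prediction
        self.x_model = self.a * self.x_model + self.b * u     # advance model
        self.buf.append(self.x_model)
        return u

loop = DelayCompensatedLoop(delay_steps=5)
u = loop.step(setpoint=1.0, delayed_meas=0.0)
```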
Validation of the F-18 high alpha research vehicle flight control and avionics systems modifications
NASA Technical Reports Server (NTRS)
Chacon, Vince; Pahle, Joseph W.; Regenie, Victoria A.
1990-01-01
The verification and validation process is a critical portion of the development of a flight system. Verification, the steps taken to assure the system meets the design specification, has become a reasonably understood and straightforward process. Validation is the method used to ensure that the system design meets the needs of the project. As systems become more integrated and more critical in their functions, the validation process becomes more complex and important. The tests, tools, and techniques which are being used for the validation of the high alpha research vehicle (HARV) turning vane control system (TVCS) are discussed, and the problems and their solutions are documented. The emphasis of this paper is on the validation of integrated systems.
NASA Technical Reports Server (NTRS)
Platt, R.
1999-01-01
This is the Performance Verification Report, Initial Comprehensive Performance Test Report, P/N 1331200-2-IT, S/N 105/A2, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The specification establishes the requirements for the Comprehensive Performance Test (CPT) and Limited Performance Test (LPT) of the Advanced Microwave Sounding Unit-A2 (AMSU-A2), referred to herein as the unit. The unit is defined on Drawing 1331200. The sequence in which the several phases of this test procedure take place is shown in Figure 1, but the phases can be performed in any order.
An Integrated Approach to Exploration Launch Office Requirements Development
NASA Technical Reports Server (NTRS)
Holladay, Jon B.; Langford, Gary
2006-01-01
The proposed paper will focus on the Project Management and Systems Engineering approach utilized to develop a set of both integrated and cohesive requirements for the Exploration Launch Office within the Constellation Program. A summary of the programmatic drivers which influenced the approach, along with details of the resulting implementation, will be discussed, as well as metrics evaluating the efficiency and accuracy of the various requirements development activities. Requirements development activities will focus on the procedures utilized to ensure that technical content was valid and mature in preparation for the Crew Launch Vehicle and Constellation Systems Requirements Reviews. This discussion will begin at initial requirements development during the Exploration Systems Architecture Study and progress through formal development of the program structure. Specific emphasis will be given to development and validation of the requirements. This discussion will focus on approaches to garner the appropriate requirement owners (or customers), the project infrastructure utilized to emphasize proper integration, and finally the procedure to technically mature, verify, and validate the requirements. Examples of requirements being implemented on the Launch Vehicle (systems, interfaces, test & verification) will be utilized to demonstrate the various processes and also provide a top-level understanding of the launch vehicle(s) performance goals. Details may also be provided on the approaches for verification, which range from typical aerospace hardware development (qualification/acceptance) through flight certification (flight test, etc.). The primary intent of this paper is to provide a demonstrated procedure for the development of a mature, effective, integrated set of requirements on a complex system, which also has the added intricacies of integrating both heritage and new hardware development. An ancillary focus of the paper will include discussion of test and verification approaches along with top-level systems/elements performance capabilities.
IMRT verification using a radiochromic/optical-CT dosimetry system
NASA Astrophysics Data System (ADS)
Oldham, Mark; Guo, Pengyi; Gluckman, Gary; Adamovics, John
2006-12-01
This work represents our first experiences relating to IMRT verification using a relatively new 3D dosimetry system consisting of a PRESAGE™ dosimeter (Heuris Inc, Pharma LLC) and an optical-CT scanning system (OCTOPUS™, MGS Inc). This work builds in a step-wise manner on prior work in our lab.
A proposed USAF fatigue evaluation program based upon recent systems experience
NASA Technical Reports Server (NTRS)
Haviland, G. P.; Purkey, G. F.
1972-01-01
The United States Air Force has published a document entitled Aircraft Structural Integrity Program. One phase of the program is concerned with the fatigue life certification of all types of military aircraft. The document describes the criteria, analyses, and tests that are necessary in order to satisfy the USAF fatigue life requirement. Some recent and valid criticism has been directed toward the document, particularly the fatigue-life requirements contained in it. Some changes are proposed based on surveys conducted in the United States and abroad as well as some recent systems' experience. The surveys covered both military and civilian organizations. The fatigue certification case histories of selected military and commercial aircraft are presented. The design development element tests, preproduction design verification tests, and full-scale fatigue tests of each are described. A brief status report on the revisions to the MIL-A-008860 series specifications is included.
Beam Loss Monitoring for LHC Machine Protection
NASA Astrophysics Data System (ADS)
Holzer, Eva Barbara; Dehning, Bernd; Effnger, Ewald; Emery, Jonathan; Grishin, Viatcheslav; Hajdu, Csaba; Jackson, Stephen; Kurfuerst, Christoph; Marsili, Aurelien; Misiowiec, Marek; Nagel, Markus; Busto, Eduardo Nebot Del; Nordt, Annika; Roderick, Chris; Sapinski, Mariusz; Zamantzas, Christos
The energy stored in the nominal LHC beams is two times 362 MJ, 100 times the energy of the Tevatron. As little as 1 mJ/cm3 deposited energy quenches a magnet at 7 TeV and 1 J/cm3 causes magnet damage. The beam dumps are the only places to safely dispose of this beam. One of the key systems for machine protection is the beam loss monitoring (BLM) system. About 3600 ionization chambers are installed at likely or critical loss locations around the LHC ring. The losses are integrated in 12 time intervals ranging from 40 μs to 84 s and compared to threshold values defined in 32 energy ranges. A beam abort is requested when potentially dangerous losses are detected or when any of the numerous internal system validation tests fails. In addition, loss data are used for machine set-up and operational verifications. The collimation system for example uses the loss data for set-up and regular performance verification. Commissioning and operational experience of the BLM are presented: The machine protection functionality of the BLM system has been fully reliable; the LHC availability has not been compromised by false beam aborts.
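A sketch of the multi-window abort logic described above: each new sample updates running sums over progressively longer windows, and any sum above its threshold requests a beam abort. The window lengths (in 40 μs samples) and thresholds are illustrative placeholders, not the LHC threshold tables:

```python
import numpy as np

WINDOWS = [1, 2, 8, 64, 2048, 16384]              # subset of the 12 intervals
THRESHOLDS = [50.0, 90.0, 300.0, 2e3, 5e4, 3e5]   # arbitrary units

def abort_requested(history):
    """history: 1-D array of loss samples from one monitor, newest last.
    Request an abort if any running sum exceeds its window threshold."""
    return any(history[-n:].sum() > thr
               for n, thr in zip(WINDOWS, THRESHOLDS))

losses = np.abs(np.random.default_rng(1).normal(0.0, 5.0, 20000))
print(abort_requested(losses))
```

In the real system the thresholds are additionally selected per beam-energy range, so a single monitor carries a table of limits rather than one vector.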
Comments for A Conference on Verification in the 21st Century
DOE Office of Scientific and Technical Information (OSTI.GOV)
Doyle, James E.
2012-06-12
The author offers 5 points for the discussion of Verification and Technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is ''are they effective in supporting the objectives of the treaty or agreement?'' In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep: how does one verify limitations on nuclear warheads in national stockpiles? (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provides benefits for addressing future verification challenges.
Discriminative Features Mining for Offline Handwritten Signature Verification
NASA Astrophysics Data System (ADS)
Neamah, Karrar; Mohamad, Dzulkifli; Saba, Tanzila; Rehman, Amjad
2014-03-01
Signature verification is an active research area in the field of pattern recognition. It is employed to identify a particular person by the characteristics of his/her signature, such as pen pressure, loop shape, writing speed, and the up/down motion of the pen. In the entire process, the feature extraction and selection stage is of prime importance, since several signatures have similar strokes, characteristics and sizes. Accordingly, this paper presents a combination of skeleton orientation and the gravity centre point to extract accurate pattern features of signature data in an offline signature verification system. Promising results have proved the success of the integration of the two methods.
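A rough sketch of the two feature families being combined: the gravity (mass) centre of the signature ink plus a coarse orientation histogram standing in for the skeleton orientation. The binarisation rule and the 4-bin histogram are assumptions, and no true skeletonisation is performed here:

```python
import numpy as np

def signature_features(img):
    ink = img < img.mean()                  # dark pixels treated as ink
    ys, xs = np.nonzero(ink)
    # Gravity centre, normalised to the image size.
    centre = (xs.mean() / img.shape[1], ys.mean() / img.shape[0])

    gy, gx = np.gradient(ink.astype(float))
    theta = np.arctan2(gy, gx)[ink]         # local stroke orientations
    hist, _ = np.histogram(theta, bins=4, range=(-np.pi, np.pi))
    hist = hist / max(hist.sum(), 1)
    return np.array([*centre, *hist])       # 6-D feature vector

# Verification then reduces to comparing feature vectors, e.g. by distance.
img = np.random.default_rng(2).random((120, 300))
print(signature_features(img))
```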
Requirements, Verification, and Compliance (RVC) Database Tool
NASA Technical Reports Server (NTRS)
Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale
2001-01-01
This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
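An illustrative schema for the kind of information such a tool manages; the table and column names below are assumptions for the sketch, not the ISWE project's actual COTS database design:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
CREATE TABLE requirement (
    req_id    TEXT PRIMARY KEY,
    text      TEXT NOT NULL,
    parent_id TEXT REFERENCES requirement(req_id)   -- traceability link
);
CREATE TABLE verification (
    ver_id           TEXT PRIMARY KEY,
    req_id           TEXT REFERENCES requirement(req_id),
    method           TEXT CHECK (method IN
                     ('test','analysis','inspection','demonstration')),
    success_criteria TEXT,
    status           TEXT DEFAULT 'open'            -- compliance status
);
""")
db.execute("INSERT INTO requirement VALUES ('R-101', 'Chamber shall hold vacuum', NULL)")
db.execute("INSERT INTO verification VALUES "
           "('V-27', 'R-101', 'test', 'pressure <= 1e-4 torr for 1 hr', 'open')")
# A tailored status report is then just a query over the shared tables:
for row in db.execute("SELECT req_id, ver_id, status FROM verification"):
    print(row)
```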
Model based verification of the Secure Socket Layer (SSL) Protocol for NASA systems
NASA Technical Reports Server (NTRS)
Powell, John D.; Gilliam, David
2004-01-01
The National Aeronautics and Space Administration (NASA) has tens of thousands of networked computer systems and applications. Software security vulnerabilities present risks such as lost or corrupted data, information theft, and unavailability of critical systems. These risks represent potentially enormous costs to NASA. The NASA Code Q research initiative 'Reducing Software Security Risk (RSSR) Through an Integrated Approach' offers formal verification of information technology (IT), through the creation of a Software Security Assessment Instrument (SSAI), to address software security risks.
TORMES-BEXUS 17 and 19: Precursor of the 6U CubeSat 3CAT-2
NASA Astrophysics Data System (ADS)
Carreno-Luengo, H.; Amezaga, A.; Bolet, A.; Vidal, D.; Jane, J.; Munoz, J. F.; Olive, R.; Camps, A.; Carola, J.; Catarino, N.; Hagenfeldt, M.; Palomo, P.; Cornara, S.
2015-09-01
3Cat-2 Assembly, Integration and Verification (AIV) activities of the Engineering Model (EM) and the Flight Model (FM) are being carried out at present. The Attitude Determination and Control System (ADCS) and Flight Software (FSW) validation campaigns will be performed at Universitat Politècnica de Catalunya (UPC) during the coming months. An analysis and verification of the 3Cat-2 key mission requirements has been performed. The main results are summarized in this work.
NASA Astrophysics Data System (ADS)
Nomaguch, Yutaka; Fujita, Kikuo
This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures the design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. The linkage of the three levels enables iterative hypothesis-and-verification processes to be captured and managed automatically and efficiently through design operations over design tools. In DRIFT, such a linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of future issues.
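A toy data model for the three linked layers; the class and field names are illustrative assumptions, not DRIFT's actual schema:

```python
from dataclasses import dataclass, field

@dataclass
class Action:                      # action level: one operation in a tool
    tool: str
    command: str

@dataclass
class ModelState:                  # model operation level: a design snapshot
    snapshot_id: int
    produced_by: list = field(default_factory=list)   # Actions

@dataclass
class Argument:                    # argumentation level: gIBIS-style node
    issue: str
    alternatives: list
    supported_by: list = field(default_factory=list)  # ModelStates

# Linking the layers lets a hypothesis point at the snapshots that verify
# it, and each snapshot at the operations that produced it.
state = ModelState(1, [Action("CAD", "fillet r=2mm")])
arg = Argument("reduce stress concentration", ["fillet", "rib"], [state])
print(arg)
```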
First results of the wind evaluation breadboard for ELT primary mirror design
NASA Astrophysics Data System (ADS)
Reyes García-Talavera, Marcos; Viera, Teodora; Núñez, Miguel
2010-07-01
The Wind Evaluation Breadboard (WEB) is a primary mirror and telescope simulator formed by seven aluminium segments, including position sensors, electromechanical support systems and support structures. WEB has been developed to evaluate technologies for primary mirror wavefront control and to evaluate the performance of the control of wind buffeting disturbance on ELT segmented mirrors. For this purpose the WEB electro-mechanical set-up simulates the real operational constraints applied to large segmented mirrors. This paper describes the WEB assembly, integration and verification, the instrument characterisation and the closed-loop control design, including the dynamical characterization of the instrument and the control architecture. The performance of the new technologies developed for position sensing, actuation and control is evaluated. The integration of the instrument in the observatory and the results of the first experiments are summarised for different wind conditions and elevation and azimuth angles of incidence. Conclusions are drawn with respect to the wind rejection performance and the control strategy for an ELT. WEB has been designed and developed by IAC, ESO, ALTRAN and JUPASA, with the integration of subsystems from FOGALE and TNO.
NASA Astrophysics Data System (ADS)
Andrianova, Olga; Lomakov, Gleb; Manturov, Gennady
2017-09-01
The report presents the results of an analysis of benchmark experiments from the international ICSBEP Handbook (HEU-MET-INTER-005), carried out at the SSC RF - IPPE in cooperation with the Idaho National Laboratory (INL, USA), applicable to the verification of calculations for a wide range of tasks related to the safe storage of vitrified radioactive waste. Experiments on the BFS assemblies make it possible to perform a large series of studies needed for neutron data refinement, including measurements of reactivity effects which allow testing of the neutron cross section resonance structure. This series of studies is offered as a sample framework for the joint analysis of differential and integral experiments required to correct nuclear data files of the ROSFOND evaluated neutron data library. Thus, it is shown that, despite the wide range of available experimental data, the reactivity-measurement experiments make it possible to reflect the resonance structure peculiarities more subtly, complementing the time-of-flight measurement method, in so far as the resonance region refinement is concerned.
Experimental evaluation of fingerprint verification system based on double random phase encoding
NASA Astrophysics Data System (ADS)
Suzuki, Hiroyuki; Yamaguchi, Masahiro; Yachida, Masuyoshi; Ohyama, Nagaaki; Tashima, Hideaki; Obi, Takashi
2006-03-01
We proposed a smart card holder authentication system that combines fingerprint verification with PIN verification by applying a double random phase encoding scheme. In this system, the probability of accurate verification of an authorized individual reduces when the fingerprint is shifted significantly. In this paper, a review of the proposed system is presented and preprocessing for improving the false rejection rate is proposed. In the proposed method, the position difference between two fingerprint images is estimated by using an optimized template for core detection. When the estimated difference exceeds the permissible level, the user inputs the fingerprint again. The effectiveness of the proposed method is confirmed by a computational experiment; its results show that the false rejection rate is improved.
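A minimal numerical sketch of double random phase encoding: the fingerprint image is multiplied by one random phase in the input plane and another in the Fourier plane, and decryption reverses both with the keys. The image and keys here are synthetic stand-ins:

```python
import numpy as np

rng = np.random.default_rng(1)

def drpe_encrypt(img, key1, key2):
    # Input-plane phase, Fourier transform, Fourier-plane phase, inverse.
    return np.fft.ifft2(np.fft.fft2(img * np.exp(2j * np.pi * key1))
                        * np.exp(2j * np.pi * key2))

def drpe_decrypt(cipher, key1, key2):
    # Undo the Fourier-plane phase, then the input-plane phase.
    plane = np.fft.ifft2(np.fft.fft2(cipher) * np.exp(-2j * np.pi * key2))
    return np.abs(plane * np.exp(-2j * np.pi * key1))

img = rng.random((64, 64))                 # stand-in fingerprint image
k1, k2 = rng.random((2, 64, 64))           # the two random phase keys
restored = drpe_decrypt(drpe_encrypt(img, k1, k2), k1, k2)
assert np.allclose(restored, img)          # correct keys recover the image
```

A laterally shifted fingerprint no longer lines up with the enrolled template after this pipeline, which is exactly the false-rejection problem the proposed core-detection preprocessing addresses.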
MC2-3: Multigroup Cross Section Generation Code for Fast Reactor Analysis
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lee, Changho; Yang, Won Sik
This paper presents the methods and performance of the MC2-3 code, a multigroup cross-section generation code for fast reactor analysis, developed to improve the resonance self-shielding and spectrum calculation methods of MC2-2 and to simplify the current multistep schemes for generating region-dependent broad-group cross sections. Using the basic neutron data from ENDF/B data files, MC2-3 solves the consistent P1 multigroup transport equation to determine the fundamental mode spectra for use in generating multigroup neutron cross sections. A homogeneous medium or a heterogeneous slab or cylindrical unit cell problem is solved at ultrafine (2082) or hyperfine (~400 000) group levels. In the resolved resonance range, pointwise cross sections are reconstructed with Doppler broadening at specified temperatures. The pointwise cross sections are directly used in the hyperfine group calculation, whereas for the ultrafine group calculation, self-shielded cross sections are prepared by numerical integration of the pointwise cross sections based upon the narrow resonance approximation. For both the hyperfine and ultrafine group calculations, unresolved resonances are self-shielded using the analytic resonance integral method. The ultrafine group calculation can also be performed for a two-dimensional whole-core problem to generate region-dependent broad-group cross sections. Verification tests have been performed using benchmark problems for various fast critical experiments, including Los Alamos National Laboratory critical assemblies; Zero-Power Reactor, Zero-Power Physics Reactor, and Bundesamt für Strahlenschutz experiments; the Monju start-up core; and the Advanced Burner Test Reactor. Verification and validation results with ENDF/B-VII.0 data indicated that eigenvalues from MC2-3/DIF3D agreed with MCNP5 or VIM Monte Carlo solutions within 200 pcm, and that regionwise one-group fluxes were in good agreement with Monte Carlo solutions.
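A sketch of the group-collapse step such codes perform: pointwise cross sections are averaged into group constants with a flux weighting, sigma_g = int(sigma * phi dE) / int(phi dE) over each group. The 1/E weighting spectrum and the toy 1/v cross section below are assumptions for illustration, not MC2-3's computed spectra:

```python
import numpy as np

def _trapz(y, x):
    # Trapezoidal integration (kept explicit for NumPy-version portability).
    return float(np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(x)))

def collapse(energy_eV, sigma_pointwise, group_bounds_eV):
    phi = 1.0 / energy_eV                       # assumed weighting spectrum
    sig_g = []
    for lo, hi in zip(group_bounds_eV[:-1], group_bounds_eV[1:]):
        m = (energy_eV >= lo) & (energy_eV < hi)
        sig_g.append(_trapz(sigma_pointwise[m] * phi[m], energy_eV[m])
                     / _trapz(phi[m], energy_eV[m]))
    return np.array(sig_g)

E = np.logspace(0, 7, 5000)                     # 1 eV to 10 MeV grid
sigma = 10.0 / np.sqrt(E) + 2.0                 # toy 1/v cross section, barns
print(collapse(E, sigma, np.array([1e0, 1e3, 1e6, 1e7])))
```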
Interfacing and Verifying ALHAT Safe Precision Landing Systems with the Morpheus Vehicle
NASA Technical Reports Server (NTRS)
Carson, John M., III; Hirsh, Robert L.; Roback, Vincent E.; Villalpando, Carlos; Busa, Joseph L.; Pierrottet, Diego F.; Trawny, Nikolas; Martin, Keith E.; Hines, Glenn D.
2015-01-01
The NASA Autonomous precision Landing and Hazard Avoidance Technology (ALHAT) project developed a suite of prototype sensors to enable autonomous and safe precision landing of robotic or crewed vehicles under any terrain lighting conditions. Development of the ALHAT sensor suite was a cross-NASA effort, culminating in integration and testing on-board a variety of terrestrial vehicles toward infusion into future spaceflight applications. Terrestrial tests were conducted on specialized test gantries, moving trucks, helicopter flights, and a flight test onboard the NASA Morpheus free-flying, rocket-propulsive flight-test vehicle. To accomplish these tests, a tedious integration process was developed and followed, which included both command and telemetry interfacing, as well as sensor alignment and calibration verification to ensure valid test data to analyze ALHAT and Guidance, Navigation and Control (GNC) performance. This was especially true for the flight test campaign of ALHAT onboard Morpheus. For interfacing of ALHAT sensors to the Morpheus flight system, an adaptable command and telemetry architecture was developed to allow for the evolution of per-sensor Interface Control Design/Documents (ICDs). Additionally, individual-sensor and on-vehicle verification testing was developed to ensure functional operation of the ALHAT sensors onboard the vehicle, as well as precision-measurement validity for each ALHAT sensor when integrated within the Morpheus GNC system. This paper provides some insight into the interface development and the integrated-systems verification that were a part of the build-up toward success of the ALHAT and Morpheus flight test campaigns in 2014. These campaigns provided valuable performance data that is refining the path toward spaceflight infusion of the ALHAT sensor suite.
Definition of ground test for verification of large space structure control
NASA Technical Reports Server (NTRS)
Doane, G. B., III; Glaese, J. R.; Tollison, D. K.; Howsman, T. G.; Curtis, S. (Editor); Banks, B.
1984-01-01
Control theory and design, dynamic system modelling, and simulation of test scenarios are the main ideas discussed. The overall effort is aimed at achieving, at Marshall Space Flight Center, a successful ground test experiment of a large space structure. A simplified planar model for ground test verification was developed, and the elimination from that model of the uncontrollable rigid-body modes was examined. Also studied were the hardware and software aspects of computation speed.
A Management Model for International Participation in Space Exploration Missions
NASA Technical Reports Server (NTRS)
George, Patrick J.; Pease, Gary M.; Tyburski, Timothy E.
2005-01-01
This paper proposes an engineering management model for NASA's future space exploration missions based on past experience working with the International Partners of the International Space Station. The authors have over 25 years of combined experience working with the European Space Agency, Japan Aerospace Exploration Agency, Canadian Space Agency, Italian Space Agency, Russian Space Agency, and their respective contractors in the design, manufacturing, verification, and integration of their elements' electric power systems into the United States on-orbit segment. The perspective presented is one from a specific subsystem integration role and is offered so that the lessons learned from solving issues of a technical and cultural nature may be taken into account during the formulation of international partnerships. Descriptions of the types of unique problems encountered in interactions between international partners are reviewed. Solutions to the problems are offered, taking into consideration the technical implications. Through the process of investigating each solution, the important and significant issues associated with working with international engineers and managers are outlined. Potential solutions are then characterized by proposing a set of specific methodologies to jointly develop spacecraft configurations that benefit all international participants and maximize mission success and vehicle interoperability while minimizing cost.
The Babushka Concept--An Instructional Sequence to Enhance Laboratory Learning in Science Education
ERIC Educational Resources Information Center
Gårdebjer, Sofie; Larsson, Anette; Adawi, Tom
2017-01-01
This paper deals with a novel method for improving the traditional "verification" laboratory in science education. Drawing on the idea of integrated instructional units, we describe an instructional sequence which we call the Babushka concept. This concept consists of three integrated instructional units: a start-up lecture, a laboratory…
Design of a front-end integrated circuit for 3D acoustic imaging using 2D CMUT arrays.
Ciçek, Ihsan; Bozkurt, Ayhan; Karaman, Mustafa
2005-12-01
Integration of front-end electronics with 2D capacitive micromachined ultrasonic transducer (CMUT) arrays has been a challenging issue due to the small element size and large channel count. We present the design and verification of a front-end drive-readout integrated circuit for 3D ultrasonic imaging using 2D CMUT arrays. The circuit cell dedicated to a single CMUT array element consists of a high-voltage pulser and a low-noise readout amplifier. To analyze the circuit cell together with the CMUT element, we developed an electrical CMUT model with parameters derived through finite element analysis, and performed both pre- and post-layout verification. An experimental chip consisting of a 4 × 4 array of the designed circuit cells, each cell occupying a 200 × 200 μm² area, was formed for the initial test studies and scheduled for fabrication in 0.8 μm, 50 V CMOS technology. The designed circuit is suitable for integration with CMUT arrays through flip-chip bonding and the CMUT-on-CMOS process.
VEG-01: Veggie Hardware Verification Testing
NASA Technical Reports Server (NTRS)
Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond
2013-01-01
The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept, and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted and lettuce plants were successfully grown in prototype Veggie hardware; microbial samples were taken, and plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.
Ham, Y.; Kerr, P.; Sitaraman, S.; ...
2016-05-05
Here, the need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over a few decades in the safeguards communities, as diverted spent fuel pins can be the source of nuclear terrorism or devices. The need is increasingly important and even urgent as many countries have started to transfer spent fuel to so-called "difficult-to-access" areas such as dry storage casks, reprocessing facilities or geological repositories. Partial defect verification is required by the IAEA before spent fuel is placed into "difficult-to-access" areas. Earlier, Lawrence Livermore National Laboratory (LLNL) reported the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies that does not use operator data, and further reported validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust, as it is relatively invariant to characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17×17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development is presented, followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies, performed at a commercial spent fuel storage site, the Central Interim Spent fuel Storage Facility (CLAB) in Sweden. In addition to partial defect detection, initial studies have determined that the tool can be used to verify the operator-declared average burnup of the assembly as well as intra-assembly burnup levels.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ham, Y.; Kerr, P.; Sitaraman, S.
Here, the need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over a few decades in the safeguards communities as the diverted spent fuel pins can be the source of nuclear terrorism or devices. The need is increasingly more important and even urgent as many countries have started to transfer spent fuel to so called "difficult-to-access" areas such as dry storage casks, reprocessing or geological repositories. Partial defect verification is required by IAEA before spent fuel is placed into "difficult-to-access" areas. Earlier, Lawrence Livermore National Laboratory (LLNL) has reportedmore » the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies without use of operator data, and further reported the validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust as the method is relatively invariant to the characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17×17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies performed in a commercial spent fuel storage site, known as Central Interim Spent fuel Storage Facility (CLAB) in Sweden will be presented. In addition to partial defect detection initial studies have determined that the tool can be used to verify the operator declared average burnup of the assembly as well as intra-assembly bunrup levels.« less
Integrating image quality in 2nu-SVM biometric match score fusion.
Vatsa, Mayank; Singh, Richa; Noore, Afzel
2007-10-01
This paper proposes an intelligent 2nu-support vector machine based match score fusion algorithm to improve the performance of face and iris recognition by integrating the quality of images. The proposed algorithm applies the redundant discrete wavelet transform to evaluate the underlying linear and non-linear features present in the image. A composite quality score is computed to determine the extent of smoothness, sharpness, noise, and other pertinent features present in each subband of the image. The match score and the corresponding quality score of an image are fused using the 2nu-support vector machine to improve the verification performance. The proposed algorithm is experimentally validated using the FERET face database and the CASIA iris database. The verification performance and statistical evaluation show that the proposed algorithm outperforms existing fusion algorithms.
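A minimal sketch of the fusion idea described above, assuming scikit-learn's NuSVC as a stand-in for the paper's 2nu-SVM and a crude gradient-based quality proxy in place of the authors' RDWT-derived composite score; all data below are random placeholders, not FERET/CASIA results.

```python
import numpy as np
from sklearn.svm import NuSVC

def quality_score(img):
    """Crude quality proxy: gradient energy (sharpness) penalized by a
    rough high-frequency noise estimate. A stand-in for the paper's
    RDWT-based composite quality score."""
    gy, gx = np.gradient(img.astype(float))
    sharpness = np.mean(gx**2 + gy**2)
    noise = np.std(img - np.median(img))
    return sharpness / (1.0 + noise)

# Fuse (match score, quality) pairs and train a genuine/impostor classifier.
# In practice each quality value would come from quality_score(probe_image).
rng = np.random.default_rng(0)
imp = rng.normal(0.3, 0.1, 200)                      # toy impostor match scores
gen = rng.normal(0.7, 0.1, 200)                      # toy genuine match scores
X = np.column_stack([np.concatenate([imp, gen]),
                     rng.uniform(0.2, 1.0, 400)])    # (score, quality) features
y = np.repeat([0, 1], 200)                           # 0 = impostor, 1 = genuine

clf = NuSVC(nu=0.3, kernel="rbf", gamma="scale").fit(X, y)
print("training accuracy:", clf.score(X, y))
```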
Viswanathan, P; Krishna, P Venkata
2014-05-01
Teleradiology allows transmission of medical images for clinical data interpretation to provide improved e-health care access, delivery, and standards. Remote transmission raises various ethical and legal issues such as image retention, fraud, privacy, and malpractice liability. A joint FED (fingerprint/encryption/dual) watermarking system is proposed for addressing these issues. The system combines a region-based substitution dual watermarking algorithm using spatial fusion, a stream cipher algorithm using a symmetric key, and a fingerprint verification algorithm using invariants. This paper aims to provide access to medical images with confidentiality, availability, integrity, and verified origin. The watermarking, encryption, and fingerprint enrollment are conducted jointly in the protection stage, so that the extraction, decryption, and verification can be applied independently. The dual watermarking system, introducing two different embedding schemes, one for patient data and the other for fingerprint features, reduces the difficulty of maintaining multiple documents such as authentication data, personnel and diagnosis data, and medical images. The spatial fusion algorithm, which determines the embedding region using a threshold derived from the image, embeds the encrypted patient data following the exact rules of fusion, resulting in better quality than other fusion techniques. The four-step stream cipher algorithm with a symmetric key for encrypting the patient data, together with the fingerprint verification system using algebraic invariants, improves the robustness of the medical information. The proposed scheme was evaluated for security and quality on DICOM medical images and performed well in terms of resistance to attacks, quality index, and imperceptibility.
ERIC Educational Resources Information Center
Acharya, Sushil; Manohar, Priyadarshan; Wu, Peter; Schilling, Walter
2017-01-01
Imparting real world experiences in a software verification and validation (SV&V) course is often a challenge due to the lack of effective active learning tools. This pedagogical requirement is important because graduates are expected to develop software that meets rigorous quality standards in functional and application domains. Realizing the…
Experimental verification of vapor deposition rate theory in high velocity burner rigs
NASA Technical Reports Server (NTRS)
Gokoglu, Suleyman A.; Santoro, Gilbert J.
1985-01-01
The main objective has been the experimental verification of the corrosive vapor deposition theory in high-temperature, high-velocity environments. Towards this end, a Mach 0.3 burner-rig apparatus was built to measure deposition rates from salt-seeded (mostly Na salts) combustion gases on an internally cooled cylindrical collector. Deposition experiments are underway.
Formal verification of an avionics microprocessor
NASA Technical Reports Server (NTRS)
Srivas, Mandayam K.; Miller, Steven P.
1995-01-01
Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding their use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions, the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying, in the PVS language, a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels, and using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ricci, P., E-mail: paolo.ricci@epfl.ch; Riva, F.; Theiler, C.
In the present work, a Verification and Validation procedure is presented and applied showing, through a practical example, how it can contribute to advancing our physics understanding of plasma turbulence. Bridging the gap between plasma physics and other scientific domains, in particular, the computational fluid dynamics community, a rigorous methodology for the verification of a plasma simulation code is presented, based on the method of manufactured solutions. This methodology assesses that the model equations are correctly solved, within the order of accuracy of the numerical scheme. The technique to carry out a solution verification is described to provide a rigorous estimate of the uncertainty affecting the numerical results. A methodology for plasma turbulence code validation is also discussed, focusing on quantitative assessment of the agreement between experiments and simulations. The Verification and Validation methodology is then applied to the study of plasma turbulence in the basic plasma physics experiment TORPEX [Fasoli et al., Phys. Plasmas 13, 055902 (2006)], considering both two-dimensional and three-dimensional simulations carried out with the GBS code [Ricci et al., Plasma Phys. Controlled Fusion 54, 124047 (2012)].The validation procedure allows progress in the understanding of the turbulent dynamics in TORPEX, by pinpointing the presence of a turbulent regime transition, due to the competition between the resistive and ideal interchange instabilities.
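To illustrate the method of manufactured solutions mentioned above, here is a minimal, self-contained sketch (plain Python/NumPy, not the GBS code): a manufactured solution u(x) = sin(pi x) is forced into a second-order Poisson solver and the observed convergence order is checked against the scheme's formal order.

```python
import numpy as np

def solve_poisson(n):
    """Second-order central-difference solve of -u'' = f on (0,1), u(0)=u(1)=0,
    with the forcing manufactured so that u(x) = sin(pi x) is the exact solution."""
    x = np.linspace(0.0, 1.0, n + 1)
    h = x[1] - x[0]
    f = np.pi**2 * np.sin(np.pi * x[1:-1])          # manufactured forcing
    A = (np.diag(2.0 * np.ones(n - 1)) -
         np.diag(np.ones(n - 2), 1) - np.diag(np.ones(n - 2), -1)) / h**2
    return x[1:-1], np.linalg.solve(A, f)

errors, hs = [], []
for n in (16, 32, 64, 128):                         # successively refined grids
    x, u = solve_poisson(n)
    errors.append(np.max(np.abs(u - np.sin(np.pi * x))))
    hs.append(1.0 / n)

orders = (np.log(np.array(errors[:-1]) / errors[1:]) /
          np.log(np.array(hs[:-1]) / hs[1:]))
print("observed orders:", orders)   # should approach 2 for this 2nd-order scheme
```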
Face verification with balanced thresholds.
Yan, Shuicheng; Xu, Dong; Tang, Xiaoou
2007-01-01
The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection, and customer-specified marks, and we check at the end that the scribe frame design conforms to the alignment and inspection mark specifications. Recently, in COT (customer-owned tooling) business and new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. We therefore tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We show the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection, and others, and a mark library is created for pattern matching. Next, DRC verification, which includes pattern matching against the mark library, is performed on the scribe frame data. Our experiments demonstrated that, by use of pattern matching and DRC verification, the new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. The method delivers both short processing time and excellent accuracy when checking many marks, is easy to maintain, and provides an easy way for COT customers to use original marks. We believe that this new DRC verification method for scribe frame data is indispensable and mutually beneficial.
Hankemeier, Dorice A.; Van Lunen, Bonnie L.
2013-01-01
Context: As evidence-based practice (EBP) becomes prevalent in athletic training education, the barriers that Approved Clinical Instructors (ACIs) experience in implementing it with students need to be understood. Objective: To investigate barriers ACIs face when implementing EBP concepts in clinical practice and in teaching EBP to professional athletic training students and to investigate the educational emphases to improve the barriers. Design: Qualitative study. Setting: Telephone interviews. Patients or Other Participants: Sixteen ACIs (11 men, 5 women; experience as an athletic trainer = 10 ± 4.7 years, experience as an ACI = 6.81 ± 3.9 years) were interviewed. Data Collection and Analysis: We interviewed each participant by telephone. Interview data were analyzed and coded for common themes and subthemes regarding barriers and educational emphases. Themes were triangulated through multiple-analyst triangulation and interpretive verification. Results: Barriers to EBP incorporation and educational emphasis placed on EBP were the main themes reported. Resources, personnel, and student characteristics were subthemes identified as barriers. Resource barriers included time, equipment, access to current literature, and knowledge. Coworkers, clinicians, and coaches who were unwilling to accept evidence regarding advancements in treatment were identified as personnel barriers. Programmatic improvement and communication improvement were subthemes of the educational emphasis placed on EBP theme. The ACIs reported the need for better integration between the clinical setting and the classroom and expressed the need for EBP to be integrated throughout the athletic training education program. Conclusions: Integration of the classroom and clinical experience is important in advancing ACIs' use of EBP with their students. Collaborative efforts within the clinical and academic program could help address the barriers ACIs face when implementing EBP. This collaboration could positively affect the ability of ACIs to implement EBP within their clinical practices. PMID:23675798
Exploration of Uncertainty in Glacier Modelling
NASA Technical Reports Server (NTRS)
Thompson, David E.
1999-01-01
There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed by two separate tasks: the verification, which is a mathematical issue targeted to assess that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed by the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, that quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for the code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate the plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve Vlasov equation in the investigation of a number of plasma physics phenomena.
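A compact sketch of the solution-verification step mentioned above: Richardson extrapolation applied to a scalar output computed on three systematically refined grids with a constant refinement ratio. The values below are toy numbers for a second-order scheme, not GBS results.

```python
import math

def richardson(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p, extrapolated value, and an estimate of
    the numerical error on the fine grid, from three grid solutions with
    constant refinement ratio r."""
    p = math.log((f_coarse - f_medium) / (f_medium - f_fine)) / math.log(r)
    f_exact = f_fine + (f_fine - f_medium) / (r**p - 1.0)
    return p, f_exact, abs(f_exact - f_fine)

# Toy values behaving like a 2nd-order scheme whose exact answer is 1.0:
p, f_ex, err = richardson(1.04, 1.01, 1.0025, r=2.0)
print(p, f_ex, err)   # p ~ 2, extrapolated value ~ 1.0
```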
Strategies for Validation Testing of Ground Systems
NASA Technical Reports Server (NTRS)
Annis, Tammy; Sowards, Stephanie
2009-01-01
In order to accomplish the full Vision for Space Exploration announced by former President George W. Bush in 2004, NASA will have to develop a new space transportation system and supporting infrastructure. The main portion of this supporting infrastructure will reside at the Kennedy Space Center (KSC) in Florida and will either be newly developed or a modification of existing vehicle processing and launch facilities, including Ground Support Equipment (GSE). This type of large-scale launch site development is unprecedented since the time of the Apollo Program. In order to accomplish this successfully within the limited budget and schedule constraints a combination of traditional and innovative strategies for Verification and Validation (V&V) have been developed. The core of these strategies consists of a building-block approach to V&V, starting with component V&V and ending with a comprehensive end-to-end validation test of the complete launch site, called a Ground Element Integration Test (GEIT). This paper will outline these strategies and provide the high level planning for meeting the challenges of implementing V&V on a large-scale development program. KEY WORDS: Systems, Elements, Subsystem, Integration Test, Ground Systems, Ground Support Equipment, Component, End Item, Test and Verification Requirements (TVR), Verification Requirements (VR)
Enhancing the Human Factors Engineering Role in an Austere Fiscal Environment
NASA Technical Reports Server (NTRS)
Stokes, Jack W.
2003-01-01
An austere fiscal environment in the aerospace community creates pressure to reduce program costs, often minimizing or even deleting human interface requirements from the design process. On the assumption that the flight crew can recover in real time from a poorly human-factored space vehicle design, the classical crew interface requirements have either not been included in the design or not been properly funded, though carried as requirements. Cost cuts have also affected the quality of retained human factors engineering personnel. In response to this concern, planning is ongoing to correct these issues. Herein are techniques for ensuring that human interface requirements are integrated into a flight design, from proposal through verification and launch activation. These include human factors requirements refinement and consolidation across flight programs; keyword phrases in proposals; closer ties with systems engineering and other classical disciplines; early planning for crew-interface verification; and an Agency-integrated human factors verification program, under the One NASA theme. Importance is given to communication within the aerospace human factors discipline, and to utilizing the strengths of all government, industry, and academic human factors organizations in a unified research and engineering approach. A list of recommendations and concerns is provided in closing.
Hand Grasping Synergies As Biometrics
Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K.; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana
2017-01-01
Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. This includes iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement that integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies—postural synergies. In this proof of concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security. PMID:28512630
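A hedged sketch of the synergy-extraction step the abstract outlines: joint angular velocity profiles are flattened into feature vectors and the principal components serve as the synergy basis. The array shapes and the 10-joint setup mirror the abstract; the data here are random placeholders, not the grasping dataset.

```python
import numpy as np
from sklearn.decomposition import PCA

n_grasps, n_joints, n_samples = 120, 10, 200
# Placeholder for recorded joint angular velocity profiles per grasp trial:
velocities = np.random.default_rng(1).normal(size=(n_grasps, n_joints, n_samples))

X = velocities.reshape(n_grasps, -1)        # one row per grasp trial
pca = PCA(n_components=5).fit(X)            # first few movement synergies
synergies = pca.components_.reshape(5, n_joints, n_samples)
weights = pca.transform(X)                  # per-trial synergy activations

# The low-dimensional 'weights' would then feed a verification classifier
# whose false-accept/false-reject trade-off yields the equal error rate.
print(pca.explained_variance_ratio_)
```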
Signature Verification Based on Handwritten Text Recognition
NASA Astrophysics Data System (ADS)
Viriri, Serestina; Tapamo, Jules-R.
Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures even when they are composed of special unconstrained cursive characters that are superimposed and embellished. The algorithm extends the character-based signature verification technique. Experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.
A progress report on a NASA research program for embedded computer systems software
NASA Technical Reports Server (NTRS)
Foudriat, E. C.; Senn, E. H.; Will, R. W.; Straeter, T. A.
1979-01-01
The paper presents the results of the second stage of the Multipurpose User-oriented Software Technology (MUST) program. Four primary areas of activities are discussed: programming environment, HAL/S higher-order programming language support, the Integrated Verification and Testing System (IVTS), and distributed system language research. The software development environment is provided by the interactive software invocation system. The higher-order programming language (HOL) support chosen for consideration is HAL/S mainly because at the time it was one of the few HOLs with flight computer experience and it is the language used on the Shuttle program. The overall purpose of IVTS is to provide a 'user-friendly' software testing system which is highly modular, user controlled, and cooperative in nature.
Space shuttle electrical power generation and reactant supply system
NASA Technical Reports Server (NTRS)
Simon, W. E.
1985-01-01
The design philosophy and development experience of fuel cell power generation and cryogenic reactant supply systems are reviewed, beginning with the state of technology at the conclusion of the Apollo Program. Technology advancements span a period of 10 years from initial definition phase to the most recent space transportation system (STS) flights. The development program encompassed prototype, verification, and qualification hardware, as well as post-STS-1 design improvements. Focus is on the problems encountered, the scientific and engineering approaches employed to meet the technological challenges, and the results obtained. Major technology barriers are discussed, and the evolving technology development paths are traced from their conceptual beginnings to the fully man-rated systems which are now an integral part of the shuttle vehicle.
A 3-D turbulent flow analysis using finite elements with k-ɛ model
NASA Astrophysics Data System (ADS)
Okuda, H.; Yagawa, G.; Eguchi, Y.
1989-03-01
This paper describes a finite element turbulent flow analysis suitable for three-dimensional, large-scale problems. The k-ɛ turbulence model as well as the conservation equations of mass and momentum are discretized in space using rather low-order elements. The resulting coefficient matrices are evaluated by one-point quadrature in order to reduce computational storage and CPU cost. A time integration scheme based on the velocity correction method is employed to obtain steady-state solutions. For verification of the FEM program, two-dimensional plenum flow is simulated and compared with experiment. As an application to three-dimensional practical problems, the turbulent flows in the upper plenum of a fast breeder reactor are calculated for various boundary conditions.
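For reference, the transport equations being discretized are presumably the standard high-Reynolds-number k-ɛ model, shown here in its textbook Launder-Spalding form; the paper may use variant constants or wall treatments.

```latex
\[
\nu_t = C_\mu \frac{k^2}{\varepsilon}, \qquad
\frac{\partial k}{\partial t} + u_j \frac{\partial k}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_k}\right)
    \frac{\partial k}{\partial x_j}\right] + P_k - \varepsilon,
\]
\[
\frac{\partial \varepsilon}{\partial t} + u_j \frac{\partial \varepsilon}{\partial x_j}
  = \frac{\partial}{\partial x_j}\!\left[\left(\nu + \frac{\nu_t}{\sigma_\varepsilon}\right)
    \frac{\partial \varepsilon}{\partial x_j}\right]
  + C_{1\varepsilon}\frac{\varepsilon}{k}P_k - C_{2\varepsilon}\frac{\varepsilon^2}{k},
\]
```

with the standard constants C_mu = 0.09, C_1e = 1.44, C_2e = 1.92, sigma_k = 1.0, sigma_e = 1.3, and P_k the turbulence production term.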
FY17 ISCR Scholar End-of-Assignment Report - Robbie Sadre
DOE Office of Scientific and Technical Information (OSTI.GOV)
Sadre, R.
2017-10-20
Throughout this internship assignment, I did various tasks that contributed towards the starting of the SASEDS (Safe Active Scanning for Energy Delivery Systems) and CES-21 (California Energy Systems for the 21st Century) projects in the SKYFALL laboratory. The goal of the SKYFALL laboratory is to perform modeling and simulation verification of transmission power system devices, while integrating with high-performance computing. The first thing I needed to do was acquire official Online LabVIEW training from National Instruments. Through these online tutorial modules, I learned the basics of LabVIEW, gaining experience in connecting to NI devices through the DAQmx API as well as LabVIEW basic programming techniques (structures, loops, state machines, front panel GUI design, etc.).
A Secure Framework for Location Verification in Pervasive Computing
NASA Astrophysics Data System (ADS)
Liu, Dawei; Lee, Moon-Chuen; Wu, Dan
The way people use computing devices has been changed in some way by the relatively new pervasive computing paradigm. For example, a person can use a mobile device to obtain its location information at anytime and anywhere. There are several security issues concerning whether this information is reliable in a pervasive environment. For example, a malicious user may disable the localization system by broadcasting a forged location, and it may impersonate other users by eavesdropping their locations. In this paper, we address the verification of location information in a secure manner. We first present the design challenges for location verification, and then propose a two-layer framework VerPer for secure location verification in a pervasive computing environment. Real world GPS-based wireless sensor network experiments confirm the effectiveness of the proposed framework.
DOE Office of Scientific and Technical Information (OSTI.GOV)
NONE
1995-11-01
This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and The Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.
The Hawaiian Electric Companies | Energy Systems Integration Facility
Verification of Voltage Regulation Operating Strategies: NREL has studied how the Hawaiian Electric Companies can best manage voltage regulation functions from distributed technologies.
Stathakarou, Natalia; Zary, Nabil; Kononowicz, Andrzej A
2014-01-01
Background. Massive Open Online Courses (MOOCs) are an emerging trend in online learning. However, their technology is not yet completely adjusted to the needs of healthcare education. Integration of Virtual Patients within MOOCs to increase interactivity and foster clinical reasoning skills training, has been discussed in the past, but not verified by a practical implementation. Objective. To investigate the technical feasibility of integrating MOOCs with Virtual Patients for the purpose of enabling further research into the potential pedagogical benefits of this approach. Methods. We selected OpenEdx and Open Labyrinth as representative constituents of a MOOC platform and Virtual Patient system integration. Based upon our prior experience we selected the most fundamental technical requirement to address. Grounded in the available literature we identified an e-learning standard to guide the integration. We attempted to demonstrate the feasibility of the integration by designing a "proof-of-concept" prototype. The resulting pilot implementation was subject of verification by two test cases. Results. A Single Sign-On mechanism connecting Open Labyrinth with OpenEdx and based on the IMS LTI standard was successfully implemented and verified. Conclusion. We investigated the technical perspective of integrating Virtual Patients with MOOCs. By addressing this crucial technical requirement we set a base for future research on the educational benefits of using virtual patients in MOOCs. This provides new opportunities for integrating specialized software in healthcare education at massive scale.
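As context for the Single Sign-On result above, the sketch below shows the general shape of an IMS LTI 1.1 launch: the tool consumer (here, the MOOC platform) signs a form POST with OAuth 1.0 HMAC-SHA1 so the tool provider (the virtual patient system) can authenticate the user without a separate login. This is a generic illustration of the standard, not the authors' implementation; the URL, key, and secret are hypothetical placeholders.

```python
import base64, hashlib, hmac, time, uuid
from urllib.parse import quote

def sign_lti_launch(url, params, key, secret):
    """Add OAuth 1.0 HMAC-SHA1 body-signing fields to an LTI 1.1 launch."""
    params = dict(params,
                  oauth_consumer_key=key,
                  oauth_nonce=uuid.uuid4().hex,
                  oauth_signature_method="HMAC-SHA1",
                  oauth_timestamp=str(int(time.time())),
                  oauth_version="1.0")
    enc = lambda s: quote(str(s), safe="")          # OAuth percent-encoding
    norm = "&".join(f"{enc(k)}={enc(v)}" for k, v in sorted(params.items()))
    base = "&".join(["POST", enc(url), enc(norm)])  # signature base string
    digest = hmac.new(f"{enc(secret)}&".encode(), base.encode(),
                      hashlib.sha1).digest()
    params["oauth_signature"] = base64.b64encode(digest).decode()
    return params   # POST these as form fields to the tool provider

launch = sign_lti_launch(
    "https://openlabyrinth.example/lti/launch",     # hypothetical endpoint
    {"lti_message_type": "basic-lti-launch-request",
     "lti_version": "LTI-1p0",
     "resource_link_id": "vp-case-1",
     "user_id": "student-42"},
    key="edx-consumer-key", secret="shared-secret") # hypothetical credentials
```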
Breast Mass Detection in Digital Mammogram Based on Gestalt Psychology
Bu, Qirong; Liu, Feihong; Zhang, Min; Ren, Yu; Lv, Yi
2018-01-01
Inspired by gestalt psychology, we combine human cognitive characteristics with the knowledge of radiologists in medical image analysis. In this paper, a novel framework is proposed to detect breast masses in digitized mammograms. It can be divided into three modules: sensation integration, semantic integration, and verification. After analyzing the process of radiologists' mammography screening, a series of visual rules based on the morphological characteristics of breast masses are presented and quantified by mathematical methods. The framework can be seen as an effective trade-off between bottom-up sensation and top-down recognition methods, and is a new exploratory method for the automatic detection of lesions. The experiments are performed on the Mammographic Image Analysis Society (MIAS) and Digital Database for Screening Mammography (DDSM) data sets. The sensitivity reached 92% at 1.94 false positives per image (FPI) on MIAS and 93.84% at 2.21 FPI on DDSM. Our framework has achieved a better performance compared with other algorithms. PMID:29854359
Macro-fingerprint analysis-through-separation of licorice based on FT-IR and 2DCOS-IR
NASA Astrophysics Data System (ADS)
Wang, Yang; Wang, Ping; Xu, Changhua; Yang, Yan; Li, Jin; Chen, Tao; Li, Zheng; Cui, Weili; Zhou, Qun; Sun, Suqin; Li, Huifen
2014-07-01
In this paper, a step-by-step analysis-through-separation method guided by multi-step IR macro-fingerprinting (FT-IR integrated with second-derivative IR (SD-IR) and 2DCOS-IR) was developed to comprehensively characterize the hierarchical chemical fingerprints of licorice, from the entirety down to single active components. The chemical profile variations of three fractions (flavonoids, saponins, and saccharides) through the separation process were holistically revealed, with the number of matching peaks and the correlation coefficients against pure-compound standards increasing along the extraction direction. The findings were supported by UPLC results and a verification experiment on the aqueous separation process. The developed multi-step IR macro-fingerprint analysis-through-separation approach is demonstrated to be a rapid, effective, and integrated method, not only for objectively providing comprehensive chemical characterization of licorice and all its separated fractions, but also for rapidly revealing the global enrichment trend of the active components during licorice separation.
NASA Astrophysics Data System (ADS)
Kowalski, John B.; Herring, Craig; Baryschpolec, Lisa; Reger, John; Patel, Jay; Feeney, Mary; Tallentire, Alan
2002-08-01
The International and European standards for radiation sterilization require evidence of the effectiveness of a minimum sterilization dose of 25 kGy but do not provide detailed guidance on how this evidence can be generated. An approach, designated VD max, has recently been described and computer evaluated to provide safe and unambiguous substantiation of a 25 kGy sterilization dose. The approach has been further developed into a practical method, which has been subjected to field evaluations at three manufacturing facilities which produce different types of medical devices. The three facilities each used a different overall evaluation strategy: Facility A used VD max for quarterly dose audits; Facility B compared VD max and Method 1 in side-by-side parallel experiments; and Facility C, a new facility at start-up, used VD max for initial substantiation of 25 kGy and subsequent quarterly dose audits. A common element at all three facilities was the use of 10 product units for irradiation in the verification dose experiment. The field evaluations of the VD max method were successful at all three facilities; they included many different types of medical devices/product families with a wide range of average bioburden and sample item portion values used in the verification dose experiments. Overall, around 500 verification dose experiments were performed and no failures were observed. In the side-by-side parallel experiments, the outcomes of the VD max experiments were consistent with the outcomes observed with Method 1. The VD max approach has been extended to sterilization doses >25 and <25 kGy; verification doses have been derived for sterilization doses of 15, 20, 30, and 35 kGy. Widespread application of the VD max method for doses other than 25 kGy must await controlled field evaluations and the development of appropriate specifications/standards.
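A minimal sketch of the pass/fail logic of the verification dose experiment as the abstract describes it: ten product units are irradiated at the verification dose and sterility-tested, and the sterilization dose is accepted if the positives stay within the allowed limit. The at-most-one-positive acceptance limit follows the published VDmax procedure as I understand it; consult ISO 11137-2 for the authoritative rules.

```python
def vdmax_verification(n_positive_sterility_tests, max_allowed=1, n_units=10):
    """Accept the sterilization dose if the verification dose experiment on
    10 units yields no more than the allowed number of positives."""
    assert n_units == 10, "the published VDmax experiment irradiates 10 units"
    return n_positive_sterility_tests <= max_allowed

print(vdmax_verification(0))  # True  -> dose substantiated
print(vdmax_verification(2))  # False -> investigate/retest per the standard
```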
Tethered satellite system dynamics and control review panel and related activities, phase 3
NASA Technical Reports Server (NTRS)
1991-01-01
Two major tests of the Tethered Satellite System (TSS) engineering and flight units were conducted to demonstrate the functionality of the hardware and software. Deficiencies in the hardware/software integration tests (HSIT) led to a recommendation for more testing to be performed. Selected problem areas of tether dynamics were analyzed, including verification of the severity of skip rope oscillations, verification or comparison runs to explore dynamic phenomena observed in other simulations, and data generation runs to explore the performance of the time domain and frequency domain skip rope observers.
Minimum accommodation for aerobrake assembly, phase 2
NASA Technical Reports Server (NTRS)
Katzberg, Stephen J.; Haynes, Davy A.; Tutterow, Robin D.; Watson, Judith J.; Russell, James W.
1994-01-01
A multi-element study was done to assess the practicality of a Space Station Freedom-based aerobrake system for the Space Exploration Initiative. The study was organized into six parts related to structure, aerodynamics, robotics and assembly, thermal protection system, inspection, and verification, all tied together by an integration study. The integration activity managed the broad issues related to meeting mission requirements. This report is a summary of the issues addressed by the integration team.
NASA Astrophysics Data System (ADS)
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned in Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' data collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA; point measurements were collected with a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
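A minimal sketch of the kind of independent MU check described above, following the general TG-114 factor-product form for an SAD setup. The factor names are the conventional ones; the numeric values are hypothetical stand-ins for the beam-data tables the authors collected, not their commissioned data.

```python
def mu_sad(dose_cgy, dose_rate_ref, Sc, Sp, TPR, wedge=1.0, tray=1.0, oar=1.0):
    """Independent MU estimate for an SAD plan:
    MU = D / (D'_ref * Sc * Sp * TPR * WF * TF * OAR),
    where D'_ref is the reference output (cGy/MU), Sc/Sp are collimator and
    phantom scatter factors, TPR the tissue-phantom ratio at depth, and
    WF/TF/OAR the wedge, tray, and off-axis factors."""
    return dose_cgy / (dose_rate_ref * Sc * Sp * TPR * wedge * tray * oar)

# Hypothetical example: 200 cGy at isocenter, 1 cGy/MU reference output:
mu = mu_sad(200.0, dose_rate_ref=1.0, Sc=0.998, Sp=1.002, TPR=0.762)
print(round(mu, 1), "MU")   # compare against the TPS within a TG-114 tolerance
```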
Yamaoka, S
1995-06-01
Uniqueness theory holds that extremely high perceived similarity between self and others evokes negative emotional reactions and causes uniqueness-seeking behavior. However, the theory conceptualizes similarity so ambiguously that it appears to suffer from low predictive validity. The purpose of the current article is to propose an alternative explanation of uniqueness-seeking behavior. It posits that perceived uniqueness deprivation is a threat to self-concepts and therefore causes self-verification behavior. Two levels of self-verification are conceived: one based on personal categorization and the other on social categorization. The present approach regards uniqueness-seeking behavior as personal-level self-verification. To test these propositions, a 2 (very high or moderate similarity information) × 2 (with or without outgroup information) × 2 (high or low need for uniqueness) between-subjects factorial-design experiment was conducted with 95 university students. Results supported the self-verification approach and were discussed in terms of the effects of uniqueness deprivation, levels of self-categorization, and individual differences in need for uniqueness.
You Can't See the Real Me: Attachment Avoidance, Self-Verification, and Self-Concept Clarity.
Emery, Lydia F; Gardner, Wendi L; Carswell, Kathleen L; Finkel, Eli J
2018-03-01
Attachment shapes people's experiences in their close relationships and their self-views. Although attachment avoidance and anxiety both undermine relationships, past research has primarily emphasized detrimental effects of anxiety on the self-concept. However, as partners can help people maintain stable self-views, avoidant individuals' negative views of others might place them at risk for self-concept confusion. We hypothesized that avoidance would predict lower self-concept clarity and that less self-verification from partners would mediate this association. Attachment avoidance was associated with lower self-concept clarity (Studies 1-5), an effect that was mediated by low self-verification (Studies 2-3). The association between avoidance and self-verification was mediated by less self-disclosure and less trust in partner feedback (Study 4). Longitudinally, avoidance predicted changes in self-verification, which in turn predicted changes in self-concept clarity (Study 5). Thus, avoidant individuals' reluctance to trust or become too close to others may result in hidden costs to the self-concept.
Formal verification of medical monitoring software using Z language: a representative sample.
Babamir, Seyed Morteza; Borhani, Mehdi
2012-08-01
Medical monitoring systems are useful aids that assist physicians in keeping patients under constant surveillance; however, whether the systems take sound decisions is a physician's concern. As a result, verification of system behavior in monitoring patients is a matter of significance. In modern medical systems, patient monitoring is undertaken by software, so software verification of modern medical systems has received attention. Such verification can be achieved with formal languages, which have mathematical foundations. Among others, the Z language is a suitable formal language that has been used for formal verification of systems. This study presents a constructive method to verify a representative sample of a medical system, in which the system is visually specified and formally verified against patient constraints stated in the Z language. Exploiting our past experience in formally modeling the Continuous Infusion Insulin Pump (CIIP), we take the CIIP system as a representative sample of medical systems in the present study. The system is responsible for monitoring a diabetic's blood sugar.
The MCNP6 Analytic Criticality Benchmark Suite
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Forrest B.
2016-06-16
Analytical benchmarks provide an invaluable tool for verifying computer codes used to simulate neutron transport. Several collections of analytical benchmark problems [1-4] are used routinely in the verification of production Monte Carlo codes such as MCNP® [5,6]. Verification of a computer code is a necessary prerequisite to the more complex validation process. The verification process confirms that a code performs its intended functions correctly. The validation process involves determining the absolute accuracy of code results vs. nature. In typical validations, results are computed for a set of benchmark experiments using a particular methodology (code, cross-section data with uncertainties, and modeling) and compared to the measured results from the set of benchmark experiments. The validation process determines bias, bias uncertainty, and possibly additional margins. Verification is generally performed by the code developers, while validation is generally performed by code users for a particular application space. The VERIFICATION_KEFF suite of criticality problems [1,2] was originally a set of 75 criticality problems found in the literature for which exact analytical solutions are available. Even though the spatial and energy detail is necessarily limited in analytical benchmarks, typically to a few regions or energy groups, the exact solutions obtained can be used to verify that the basic algorithms, mathematics, and methods used in complex production codes perform correctly. The present work has focused on revisiting this benchmark suite. A thorough review of the problems resulted in discarding some of them as not suitable for MCNP benchmarking. For the remaining problems, many of them were reformulated to permit execution in either multigroup mode or in the normal continuous-energy mode for MCNP. Execution of the benchmarks in continuous-energy mode provides a significant advance to MCNP verification methods.
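A tiny sketch of the verification comparison itself: a Monte Carlo k_eff with its statistical uncertainty is checked against the exact analytical benchmark value within a chosen confidence multiple. The numbers are illustrative only.

```python
def verify_keff(k_mc, sigma, k_exact, n_sigma=3.0):
    """True if the Monte Carlo k_eff agrees with the analytical benchmark
    value to within n_sigma statistical standard deviations."""
    return abs(k_mc - k_exact) <= n_sigma * sigma

print(verify_keff(k_mc=0.99987, sigma=0.00008, k_exact=1.0))  # True
```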
Development of Onboard Computer Complex for Russian Segment of ISS
NASA Technical Reports Server (NTRS)
Branets, V.; Brand, G.; Vlasov, R.; Graf, I.; Clubb, J.; Mikrin, E.; Samitov, R.
1998-01-01
This report presents a description of the Onboard Computer Complex (CC) that was developed during 1994-1998 for the Russian Segment of the ISS. The system was developed in cooperation with NASA and ESA. ESA developed a new computation system under the RSC Energia Technical Assignment, called DMS-R. The CC also includes elements developed by Russian experts and organizations. The general architecture of the computer system and the characteristics of its primary elements are described. The system was integrated at RSC Energia with the participation of American and European specialists. The report contains information on the software simulators and the verification and debugging facilities which were developed for both stand-alone and integrated tests and verification. This CC serves as the basis for the Russian Segment Onboard Control Complex on the ISS.
NASA Technical Reports Server (NTRS)
Ghatas, Rania W.; Jack, Devin P.; Tsakpinis, Dimitrios; Sturdy, James L.; Vincent, Michael J.; Hoffler, Keith D.; Myer, Robert R.; DeHaven, Anna M.
2017-01-01
As Unmanned Aircraft Systems (UAS) make their way to mainstream aviation operations within the National Airspace System (NAS), research efforts are underway to develop a safe and effective environment for their integration into the NAS. Detect and Avoid (DAA) systems are required to account for the lack of "eyes in the sky" due to having no human on-board the aircraft. The technique, results, and lessons learned from a detailed End-to-End Verification and Validation (E2-V2) simulation study of a DAA system representative of RTCA SC-228's proposed Phase I DAA Minimum Operational Performance Standards (MOPS), based on specific test vectors and encounter cases, will be presented in this paper.
Time dependent pre-treatment EPID dosimetry for standard and FFF VMAT.
Podesta, Mark; Nijsten, Sebastiaan M J J G; Persoon, Lucas C G G; Scheib, Stefan G; Baltes, Christof; Verhaegen, Frank
2014-08-21
Methods to calibrate Megavoltage electronic portal imaging devices (EPIDs) for dosimetry have been previously documented for dynamic treatments such as intensity modulated radiotherapy (IMRT) using flattened beams and typically using integrated fields. While these methods verify the accumulated field shape and dose, the dose rate and differential fields remain unverified. The aim of this work is to provide an accurate calibration model for time dependent pre-treatment dose verification using amorphous silicon (a-Si) EPIDs in volumetric modulated arc therapy (VMAT) for both flattened and flattening filter free (FFF) beams. A general calibration model was created using a Varian TrueBeam accelerator, equipped with an aS1000 EPID, for each photon spectrum 6 MV, 10 MV, 6 MV-FFF, 10 MV-FFF. As planned VMAT treatments use control points (CPs) for optimization, measured images are separated into corresponding time intervals for direct comparison with predictions. The accuracy of the calibration model was determined for a range of treatment conditions. Measured and predicted CP dose images were compared using a time dependent gamma evaluation using criteria (3%, 3 mm, 0.5 sec). Time dependent pre-treatment dose verification is possible without an additional measurement device or phantom, using the on-board EPID. Sufficient data is present in trajectory log files and EPID frame headers to reliably synchronize and resample portal images. For the VMAT plans tested, significantly more deviation is observed when analysed in a time dependent manner for FFF and non-FFF plans than when analysed using only the integrated field. We show EPID-based pre-treatment dose verification can be performed on a CP basis for VMAT plans. This model can measure pre-treatment doses for both flattened and unflattened beams in a time dependent manner which highlights deviations that are missed in integrated field verifications.
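A hedged sketch of the gamma comparison applied per control point, as in the abstract (criteria 3%/3 mm; shown here on a 1D dose profile for brevity). A production implementation, or a library such as pymedphys, additionally handles 2D/3D grids, local normalization, interpolation, and the 0.5 s time binning.

```python
import numpy as np

def gamma_1d(x, dose_ref, dose_eval, dd=0.03, dta=3.0):
    """Global gamma index: for each reference point, minimize the combined
    dose-difference / distance-to-agreement metric over evaluated points.
    dd is the dose criterion (fraction of max dose), dta is in mm."""
    norm = dd * dose_ref.max()
    gam = np.empty_like(dose_ref)
    for i, (xi, di) in enumerate(zip(x, dose_ref)):
        gam[i] = np.min(np.sqrt(((dose_eval - di) / norm) ** 2 +
                                ((x - xi) / dta) ** 2))
    return gam

x = np.linspace(0, 100, 201)                 # position in mm
ref = np.exp(-((x - 50) / 20) ** 2)          # toy reference CP dose profile
ev = 1.01 * np.exp(-((x - 51) / 20) ** 2)    # shifted, rescaled measurement
print("pass rate:", np.mean(gamma_1d(x, ref, ev) <= 1.0))
```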
NASA Technical Reports Server (NTRS)
1978-01-01
The performance, design, and verification requirements for the space construction automated fabrication experiment (SCAFE) are defined and the source of each imposed or derived requirement is identified.
SEASAT - A candidate ocean industry economic verification experiments
NASA Technical Reports Server (NTRS)
Miller, B. P.
1976-01-01
The economic benefits of an operational SEASAT system are discussed in the areas of marine transportation, offshore oil and natural gas exploration and development, ocean fishing, and Arctic operations. A description of the candidate economic verification experiments which could be performed with SEASAT-A is given. With the exception of the area of Arctic operations, experiments have been identified in each of the areas of ocean based activity that are expected to show an economic impact from the use of operational SEASAT data. Experiments have been identified in the areas of the offshore oil and natural gas industry, as well as ice monitoring and coastal zone applications. Emphasis has been placed on the identification and the development of those experiments which meet criteria for: (1) end user participation; (2) SEASAT-A data utility; (3) measurability of operational parameters to demonstrate economic effect; and (4) non-proprietary nature of results.
NASA Astrophysics Data System (ADS)
Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.
2018-05-01
The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this purpose, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment has first guided the design of digital architectures, optimized for processing and buffering very high particle rates, and secondly how it has been reused for the functional verification of the first large scale demonstrator chip designed by the collaboration, which has recently been submitted.
Design of verification platform for wireless vision sensor networks
NASA Astrophysics Data System (ADS)
Ye, Juanjuan; Shang, Fei; Yu, Chuang
2017-08-01
At present, the majority of research on wireless vision sensor networks (WVSNs) still remains in the software simulation stage, and few verification platforms for WVSNs are available. This situation seriously restricts the transfer of WVSNs from theoretical research to practical application, so it is necessary to study the construction of a WVSN verification platform. This paper combines a wireless transceiver module, a visual information acquisition module, and a power acquisition module to design a high-performance wireless vision sensor node built around an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform for WVSNs called AdvanWorks. Experiments show that AdvanWorks can successfully achieve image acquisition, coding, and wireless transmission, and can obtain effective inter-node distance parameters, which lays a good foundation for subsequent applications of WVSNs.
NASA Astrophysics Data System (ADS)
Kim, A. A.; Klochkov, D. V.; Konyaev, M. A.; Mihaylenko, A. S.
2017-11-01
The article considers the problem of control and verification of laser ceilometers' basic performance parameters and describes an alternative method based on the use of a multi-length fiber-optic delay line simulating an atmospheric track. The results of the described experiment demonstrate the great potential of this method for inspection and verification procedures for laser ceilometers.
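A worked example of why a fiber delay line can emulate an atmospheric track for such a check: the transit delay through a fiber of length L and group index n equals the delay of a free-space round trip from range R = nL/2 (the factor of two because the ceilometer assumes the pulse travels out and back). The group index value is a typical silica-fiber figure assumed for illustration.

```python
c = 299_792_458.0      # speed of light in vacuum, m/s
n_fiber = 1.468        # typical group index of silica fiber (assumed)

def simulated_range(fiber_length_m):
    """Apparent target range a ceilometer infers from a fiber delay line."""
    delay = n_fiber * fiber_length_m / c    # one-way transit through the fiber
    return c * delay / 2.0                  # range assuming a round trip

for L in (100.0, 500.0, 2000.0):
    print(f"{L:6.0f} m of fiber -> {simulated_range(L):7.1f} m apparent range")
```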
TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) development
NASA Technical Reports Server (NTRS)
Shimamoto, Mike S.
1993-01-01
The development of an anthropomorphic, undersea manipulator system, the TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) is described. The TOPS system's design philosophy, which results from NRaD's experience in undersea vehicles and manipulator systems development and operations, is presented. The TOPS design approach, task teams, manipulator, and vision system development and results, conclusions, and recommendations are presented.
NASA Technical Reports Server (NTRS)
Yew, Calinda; Whitehouse, Paul; Lui, Yan; Banks, Kimberly
2016-01-01
JWST Integrated Science Instruments Module (ISIM) has completed its system-level testing program at the NASA Goddard Space Flight Center (GSFC). In March 2016, ISIM was successfully delivered for integration with the Optical Telescope Element (OTE) after the successful verification of the system through a series of three cryo-vacuum (CV) tests. The first test served as a risk reduction test; the second test provided the initial verification of the fully-integrated flight instruments; and the third test verified the system in its final flight configuration. The complexity of the mission has generated challenging requirements that demand highly reliable system performance and capabilities from the Space Environment Simulator (SES) vacuum chamber. As JWST progressed through its CV testing campaign, deficiencies in the test configuration and support equipment were uncovered from one test to the next. Subsequent upgrades and modifications were implemented to improve the facility support capabilities required to achieve test requirements. This paper: (1) provides an overview of the integrated mechanical and thermal facility systems required to achieve the objectives of JWST ISIM testing, (2) compares the overall facility performance and instrumentation results from the three ISIM CV tests, and (3) summarizes lessons learned from the ISIM testing campaign.
Chen, Runlin; Wei, Yangyang; Shi, Zhaoyang; Yuan, Xiaoyang
2016-01-01
The identification accuracy of dynamic characteristics coefficients is difficult to guarantee because of the errors of the measurement system itself. A novel dynamic calibration method for such measurement systems is proposed in this paper to eliminate these errors. Unlike calibration based on a suspended mass, the verification device here is a spring-mass system, which can simulate the dynamic characteristics of a sliding bearing. The verification device was built and the calibration experiment implemented over a wide frequency range, with the bearing stiffness simulated by disc springs. The experimental results show that the amplitude errors of the measurement system are small in the frequency range of 10 Hz–100 Hz, while the phase errors increase with frequency. A simulated experiment on dynamic characteristics coefficient identification in the 10 Hz–30 Hz range preliminarily verified that the calibration data can support dynamic characteristics testing of sliding bearings in this range. Bearing experiments over wider frequency ranges will require higher manufacturing and installation precision of the calibration device, as well as improved calibration procedures. PMID:27483283
Katz, Jennifer; Joiner, Thomas E
2002-02-01
We contend that close relationships provide adults with optimal opportunities for personal growth when relationship partners provide accurate, honest feedback. Accordingly, it was predicted that young adults would experience greater relationship quality with partners who evaluated them in a manner consistent with their own self-evaluations. Three empirical tests of this self-verification hypothesis as applied to close dyads were conducted. In Study 1, young adults in dating relationships were most intimate with, and somewhat more committed to, partners when they perceived that partners evaluated them as they evaluated themselves. Self-verification effects were pronounced for those involved in more serious dating relationships. In Study 2, men reported the greatest esteem for same-sex roommates who evaluated them in a self-verifying manner. Results from Study 2 were replicated and extended to both male and female roommate dyads in Study 3. Further, self-verification effects were most pronounced for young adults with high emotional empathy. Results suggest that self-verification theory is useful for understanding dyadic adjustment across a variety of relational contexts in young adulthood. Implications of self-verification processes for adult personal development are outlined within an identity negotiation framework.
Validation and verification of expert systems
NASA Technical Reports Server (NTRS)
Gilstrap, Lewey
1991-01-01
Validation and verification (V&V) are procedures used to evaluate system structure or behavior with respect to a set of requirements. Although expert systems are often developed as a series of prototypes without requirements, it is not possible to perform V&V on any system for which requirements have not been prepared. In addition, there are special problems associated with the evaluation of expert systems that do not arise in the evaluation of conventional systems, such as verification of the completeness and accuracy of the knowledge base. The criticality of most NASA missions makes it important to be able to certify the performance of the expert systems used to support these missions. Recommendations for the most appropriate method for integrating V&V into the Expert System Development Methodology (ESDM) and suggestions for the most suitable approaches for each stage of ESDM development are presented.
Annual verifications--a tick-box exercise?
Walker, Gwen; Williams, David
2014-09-01
With the onus on healthcare providers and their staff to protect patients against all elements of 'avoidable harm' perhaps never greater, Gwen Walker, a highly experienced infection prevention control nurse specialist, and David Williams, MD of Approved Air, who has 30 years' experience in validation and verification of ventilation and ultraclean ventilation systems, examine changing requirements for, and trends in, operating theatre ventilation. Validation and verification reporting on such vital HVAC equipment should not, they argue, merely be viewed as a 'tick-box exercise'; it should instead 'comprehensively inform key stakeholders, and ultimately form part of clinical governance, thus protecting those ultimately named responsible for organisation-wide safety at Trust board level'.
Experimental verification of layout physical verification of silicon photonics
NASA Astrophysics Data System (ADS)
El Shamy, Raghi S.; Swillam, Mohamed A.
2018-02-01
Silicon photonics has proven to be one of the best platforms for dense integration of photonic integrated circuits (PICs) due to the high refractive index contrast among its materials. Silicon on insulator (SOI) is a widespread photonics technology that supports a variety of devices for many applications. As the photonics market grows, the number of components in PICs increases, which increases the need for an automated physical verification (PV) process. This PV process assures reliable fabrication of the PICs, as it checks both the manufacturability and the reliability of the circuit. However, PV is challenging in the case of PICs because it requires running exhaustive electromagnetic (EM) simulations. Our group has recently proposed empirical closed-form models for the directional coupler and the waveguide bend in SOI technology. The models have shown very good agreement with both finite element method (FEM) and finite difference time domain (FDTD) solvers. These models save the huge time cost of 3D EM simulations and can easily be included in any electronic design automation (EDA) flow, as the model parameters can be extracted directly from the layout. In this paper we present experimental verification of our previously proposed models. SOI directional couplers with different dimensions were fabricated using electron beam lithography and measured. The measurement results for the fabricated devices were compared to the derived models and show very good agreement; the match can reach 100% by calibrating certain parameters in the model.
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
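The paper's own tooling is not reproduced here; as a rough illustration of the kind of automation described, the sketch below (with hypothetical file and channel names, and an illustrative tolerance) loads one flight software telemetry channel and the matching HiFi simulation output, resamples them onto a common time base, auto-generates a comparison plot, and applies a pass/fail tolerance check.

```python
# Hypothetical sketch of automated test verification in the spirit of the
# MAP approach described above. CSV layout, channel names, and tolerance
# are assumptions for illustration only.
import numpy as np
import matplotlib.pyplot as plt

def load_channel(path, channel):
    """Load (time, value) samples for one channel from a headed CSV file."""
    data = np.genfromtxt(path, delimiter=",", names=True)
    return data["time"], data[channel]

def verify_channel(flight_csv, hifi_csv, channel, tol):
    """Compare flight software test output against the HiFi simulation."""
    t, y_flight = load_channel(flight_csv, channel)
    t_sim, y_sim = load_channel(hifi_csv, channel)
    y_sim_on_t = np.interp(t, t_sim, y_sim)   # synchronize time bases
    max_err = float(np.max(np.abs(y_flight - y_sim_on_t)))
    # automatically generate the comparison plot
    fig, ax = plt.subplots()
    ax.plot(t, y_flight, label="flight software test")
    ax.plot(t, y_sim_on_t, "--", label="HiFi simulation")
    ax.set_xlabel("time [s]"); ax.set_ylabel(channel); ax.legend()
    fig.savefig(f"{channel}_comparison.png")
    return max_err <= tol, max_err
```

In practice the channel list and tolerances would come from something like the central database of spacecraft parameters the abstract mentions, rather than being hard-coded.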
Using Automation to Improve the Flight Software Testing Process
NASA Technical Reports Server (NTRS)
ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.
2001-01-01
One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
US corn and soybeans exploratory experiment
NASA Technical Reports Server (NTRS)
Carnes, J. G. (Principal Investigator)
1981-01-01
The results from the U.S. corn/soybeans exploratory experiment which was completed during FY 1980 are summarized. The experiment consisted of two parts: the classification procedures verification test and the simulated aggregation test. Evaluations of labeling, proportion estimation, and aggregation procedures are presented.
Development of a Response Surface Thermal Model for Orion Mated to the International Space Station
NASA Technical Reports Server (NTRS)
Miller, Stephen W.; Meier, Eric J.
2010-01-01
A study was performed to determine if a Design of Experiments (DOE)/Response Surface Methodology could be applied to on-orbit thermal analysis and produce a set of Response Surface Equations (RSE) that accurately predict vehicle temperatures. The study used an integrated thermal model of the International Space Station and the Orion outer mold line model. Five separate factors were identified for study: yaw, pitch, roll, beta angle, and the environmental parameters. Twenty external Orion temperatures were selected as the responses. A DOE case matrix of 110 runs was developed. The data from these cases were analyzed to produce an RSE for each of the temperature responses. The initial agreement between the engineering data and the RSE predictions was encouraging, although many RSEs had large uncertainties on their predictions. Fourteen verification cases were developed to test the predictive powers of the RSEs. The verification showed mixed results, with some RSEs predicting temperatures that matched the engineering data within the uncertainty bands, while others had very large errors. While this study did not irrefutably prove that the DOE/RSM approach can be applied to on-orbit thermal analysis, it does demonstrate that the technique has the potential to predict temperatures. Additional work is needed to better identify the cases needed to produce the RSEs.
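The study's actual tool chain is not specified here; as a sketch of how such RSEs can be produced, the following fits a quadratic response surface in the five named factors by least squares. The DOE case matrix X (rows are runs, columns are yaw, pitch, roll, beta, environment) and the measured temperatures T are assumed given.

```python
# Illustrative sketch: fit one quadratic response surface equation (RSE)
# per temperature response from a DOE case matrix. Not the study's tools.
import numpy as np
from itertools import combinations_with_replacement

def quadratic_design_matrix(X):
    """Columns: intercept, linear terms, then all second-order terms."""
    n, k = X.shape
    cols = [np.ones(n)]
    cols += [X[:, i] for i in range(k)]
    cols += [X[:, i] * X[:, j]
             for i, j in combinations_with_replacement(range(k), 2)]
    return np.column_stack(cols)

def fit_rse(X, T):
    """Least-squares coefficients of the quadratic RSE for responses T."""
    A = quadratic_design_matrix(X)
    coef, *_ = np.linalg.lstsq(A, T, rcond=None)
    return coef

def predict_rse(coef, X):
    return quadratic_design_matrix(X) @ coef

# e.g., 110-run matrix over 5 normalized factors:
# rng = np.random.default_rng(0); X = rng.uniform(-1, 1, (110, 5))
```

For five factors the quadratic model has 21 coefficients, so a 110-run matrix like the one in the abstract leaves ample degrees of freedom for checking fit quality on the 14 verification cases.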
Analog Video Authentication and Seal Verification Equipment Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gregory Lancaster
Under contract to the US Department of Energy in support of arms control treaty verification activities, the Savannah River National Laboratory, in conjunction with the Pacific Northwest National Laboratory, the Idaho National Laboratory, and Milagro Consulting, LLC, developed equipment for use within a chain-of-custody regime. This paper discusses two specific devices: the Authentication Through the Lens (ATL) analog video authentication system and a photographic multi-seal reader. Both of these devices have been demonstrated in a field trial, and the experience gained throughout is also discussed. Typically, cryptographic methods are used to prove the authenticity of digital images and video used in arms control chain-of-custody applications. However, in some applications analog cameras are used. Since cryptographic authentication methods will not work on analog video streams, a simple method of authenticating analog video was developed and tested. A photographic multi-seal reader was developed to image different types of visual unique identifiers for use in chain-of-custody and authentication activities. This seal reader is unique in its ability to image various types of seals, including the Cobra Seal, Reflective Particle Tags, and adhesive seals. Flicker comparison is used to compare before and after images collected with the seal reader in order to detect tampering and verify the integrity of the seal.
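A minimal sketch of the flicker-comparison idea, assuming the before and after seal images are already registered and of equal size; the change threshold is illustrative, not a value from the paper.

```python
# Toy before/after comparison in the spirit of flicker comparison:
# flag the fraction of pixels whose normalized intensity changed
# beyond a threshold, as a crude tamper indicator.
import numpy as np

def tamper_score(before, after, threshold=0.1):
    """Fraction of pixels that changed noticeably between two uint8 images."""
    b = before.astype(float) / 255.0
    a = after.astype(float) / 255.0
    changed = np.abs(a - b) > threshold
    return float(changed.mean())
```

An actual flicker comparison alternates the two images rapidly for a human reviewer; the score above is just an automated proxy for the same difference signal.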
Timing analysis by model checking
NASA Technical Reports Server (NTRS)
Naydich, Dimitri; Guaspari, David
2000-01-01
The safety of modern avionics relies on high integrity software that can be verified to meet hard real-time requirements. The limits of verification technology therefore determine acceptable engineering practice. To simplify verification problems, safety-critical systems are commonly implemented under the severe constraints of a cyclic executive, which make design an expensive trial-and-error process highly intolerant of change. Important advances in analysis techniques, such as rate monotonic analysis (RMA), have provided a theoretical and practical basis for easing these onerous restrictions. But RMA and its kindred have two limitations: they apply only to verifying the requirement of schedulability (that tasks meet their deadlines) and they cannot be applied to many common programming paradigms. We address both these limitations by applying model checking, a technique with successful industrial applications in hardware design. Model checking algorithms analyze finite state machines, either by explicit state enumeration or by symbolic manipulation. Since quantitative timing properties involve a potentially unbounded state variable (a clock), our first problem is to construct a finite approximation that is conservative for the properties being analyzed: if the approximation satisfies the properties of interest, so does the infinite model. To reduce the potential for state space explosion we must further optimize this finite model. Experiments with some simple optimizations have yielded a hundred-fold efficiency improvement over published techniques.
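As a toy illustration of the clock-bounding idea (not the paper's algorithm), the sketch below enumerates (location, clock) states explicitly and merges all states whose clock exceeds a cap, keeping the state space finite. The abstraction is conservative for deadline checks with deadlines below the cap, because capping never moves a clock from above a deadline to below it.

```python
# Explicit-state exploration of a timed task model with a capped clock.
# `successors(loc)` yields (next_location, time_delta) pairs, and
# `deadline_ok(loc, clk)` is the timing property; both are user-supplied.
from collections import deque

CLOCK_CAP = 100  # abstraction bound; must exceed every deadline checked

def check_timing(initial, successors, deadline_ok):
    """BFS over (location, clock); True iff every reachable state is safe."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        loc, clk = frontier.popleft()
        if not deadline_ok(loc, clk):
            return False                         # violation found
        for nloc, dt in successors(loc):
            nstate = (nloc, min(clk + dt, CLOCK_CAP))  # merge beyond cap
            if nstate not in seen:
                seen.add(nstate)
                frontier.append(nstate)
    return True
```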
Payload crew training complex simulation engineer's handbook
NASA Technical Reports Server (NTRS)
Shipman, D. L.
1984-01-01
The Simulation Engineer's Handbook is a guide for new engineers assigned to Experiment Simulation and a reference for engineers previously assigned. The experiment simulation process, development of experiment simulator requirements, development of experiment simulator hardware and software, and the verification of experiment simulators are discussed. The training required for experiment simulation is extensive and is only referenced in the handbook.
NASA Technical Reports Server (NTRS)
1995-01-01
The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.
Validation and verification of a virtual environment for training naval submarine officers
NASA Astrophysics Data System (ADS)
Zeltzer, David L.; Pioch, Nicholas J.
1996-04-01
A prototype virtual environment (VE) has been developed for training a submarine officer of the deck (OOD) to perform in-harbor navigation on a surfaced submarine. The OOD, stationed on the conning tower of the vessel, is responsible for monitoring the progress of the boat as it negotiates a marked channel, as well as verifying the navigational suggestions of the below-deck piloting team. The VE system allows an OOD trainee to view a particular harbor and associated waterway through a head-mounted display, receive spoken reports from a simulated piloting team, give spoken commands to the helmsman, and receive verbal confirmation of command execution from the helm. The task analysis of in-harbor navigation and the derivation of application requirements are briefly described. This is followed by a discussion of the implementation of the prototype. This implementation underwent a series of validation and verification assessment activities, including operational validation, data validation, and software verification of individual software modules as well as the integrated system. Validation and verification procedures are discussed with respect to the OOD application in particular, and with respect to VE applications in general.
NASA Astrophysics Data System (ADS)
Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.
2008-08-01
The technology verification satellite TET (Technologie-Erprobungsträger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support the German space industry and research facilities in the on-orbit verification of satellite technologies. TET is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite bus is based on the successfully operated satellite BIRD and a newly developed payload platform with a new payload handling system called NVS (Nutzlastversorgungssystem). The NVS consists of three major parts: the power supply, the processor boards, and the I/O interfaces. It is realized as several PCBs in Europe format, connected to each other via an integrated backplane; the payloads are connected to the NVS by front connectors. This paper describes the concept, architecture, and hardware/software of the NVS. Phase B of this project was successfully finished last year.
Assume-Guarantee Verification of Source Code with Design-Level Assumptions
NASA Technical Reports Server (NTRS)
Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.
2004-01-01
Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
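The authors' tools are not reproduced here, but the underlying assume-guarantee rule is easy to state over a simple trace semantics. Modeling components as sets of traces over one shared alphabet, with composition as intersection, a minimal sketch is:

```python
# Toy assume-guarantee rule over finite trace sets (same alphabet):
# if M1 under assumption A satisfies P, and M2's behavior stays within A,
# then the composition M1 || M2 (here: intersection) satisfies P,
# without ever building the composition explicitly.
def check_assume_guarantee(M1, M2, A, P):
    premise1 = (M1 & A) <= P   # <A> M1 <P>
    premise2 = M2 <= A         # M2 discharges the assumption
    return premise1 and premise2

# Example with traces as tuples of actions:
M1 = {("req", "ack"), ("req", "err")}
M2 = {("req", "ack")}
A  = {("req", "ack")}
P  = {("req", "ack"), ("idle",)}
assert check_assume_guarantee(M1, M2, A, P)   # hence M1 & M2 <= P
```

Real assume-guarantee frameworks operate on labeled transition systems with distinct alphabets and automatically learned assumptions; the set version above only conveys the shape of the decomposition.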
Development and Assessment of CTF for Pin-resolved BWR Modeling
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salko, Robert K; Wysocki, Aaron J; Collins, Benjamin S
2017-01-01
CTF is the modernized and improved version of the subchannel code COBRA-TF. It has been adopted by the Consortium for Advanced Simulation of Light Water Reactors (CASL) for subchannel analysis applications and thermal-hydraulic feedback calculations in the Virtual Environment for Reactor Applications Core Simulator (VERA-CS). CTF is now jointly developed by Oak Ridge National Laboratory and North Carolina State University. Until now, CTF has been used for pressurized water reactor modeling and simulation in CASL, but in the future it will be extended to boiling water reactor designs. This required development activities to integrate the code into the VERA-CS workflow and to make it more efficient for full-core, pin-resolved simulations. Additionally, there is a significant emphasis on producing high-quality tools that follow a regimented software quality assurance plan in CASL. Part of this plan involves performing validation and verification assessments on the code that are easily repeatable and tied to specific code versions. This work has resulted in the CTF validation and verification matrix being expanded to include several two-phase flow experiments, including the General Electric 3x3 facility and the BWR Full-Size Fine-Mesh Bundle Tests (BFBT). Agreement with both experimental databases is reasonable, but the BFBT analysis reveals a tendency of CTF to overpredict void, especially in the slug flow regime. The execution of these tests is fully automated, the analysis is documented in the CTF Validation and Verification manual, and the tests have become part of the CASL continuous regression testing system. This paper summarizes these recent developments and some of the two-phase assessments that have been performed on CTF.
Towards SMOS: The 2006 National Airborne Field Experiment Plan
NASA Astrophysics Data System (ADS)
Walker, J. P.; Merlin, O.; Panciera, R.; Kalma, J. D.
2006-05-01
The 2006 National Airborne Field Experiment (NAFE) is the second in a series of two intensive experiments to be conducted in different parts of Australia. The NAFE'05 experiment was undertaken in the Goulburn River catchment during November 2005, with the objective to provide high resolution data for process level understanding of soil moisture retrieval, scaling and data assimilation. The NAFE'06 experiment will be undertaken in the Murrumbidgee catchment during November 2006, with the objective to provide data for SMOS (Soil Moisture and Ocean Salinity) level soil moisture retrieval, downscaling and data assimilation. To meet this objective, PLMR (Polarimetric L-band Multibeam Radiometer) and supporting instruments (TIR and NDVI) will be flown at an altitude of 10,000 ft AGL to provide 1km resolution passive microwave data (and 20m TIR) across a 50km x 50km area every 2-3 days. This will both simulate a SMOS pixel and provide the 1km soil moisture data required for downscaling verification, allowing downscaling and near-surface soil moisture assimilation techniques to be tested with remote sensing data which is consistent with that from current (MODIS) and planned (SMOS) satellite sensors. Additionally, two transects will be flown across the area to provide both 1km multi-angular passive microwave data for SMOS algorithm development and, on the same day, 50m resolution passive microwave data for algorithm verification. The study area contains a total of 13 soil moisture profile and rainfall monitoring sites for assimilation verification, and the transect flight lines are planned to go through 5 of these. Ground monitoring of surface soil moisture and vegetation for algorithm verification will be targeted at these 5 focus farms, with soil moisture measurements made at 250m spacing for 1km resolution flights and 50m spacing for 50m resolution flights. While this experiment has a particular emphasis on the remote sensing of soil moisture, it is open for collaboration from interested scientists from all disciplines of environmental remote sensing and its application. See www.nafe.unimelb.edu.au for more detailed information on these experiments.
Integration Testing of Space Flight Systems
NASA Technical Reports Server (NTRS)
Honeycutt, Timothy; Sowards, Stephanie
2008-01-01
Based on the previous successes of Multi-Element Integration Testing (MEIT) for the International Space Station Program, these types of integrated tests have also been planned for the Constellation Program. Multi-Element Integration Tests (MEIT): (1) CEV to ISS (emulated); (2) CEV to Lunar Lander/EDS (emulated); (3) future: Lunar Surface Systems and Mars missions. Finite Element Integration Tests (FEIT): (1) CEV/CLV; (2) Lunar Lander/EDS/CaLV. Integrated Verification Tests (IVT): performed as a subset of the FEITs during the flight tests, and then performed for every flight after Full Operational Capability (FOC) has been obtained with the flight and ground systems.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hayes, Birchard P; Michel, Kelly D; Few, Douglas A
From stereophonic, positional sound to high-definition imagery that is crisp and clean, high-fidelity computer graphics enhance our view, insight, and intuition regarding our environments and conditions. Contemporary 3-D modeling tools offer an open architecture framework that enables integration with other technologically innovative arenas. One innovation of great interest is Augmented Reality, the merging of virtual, digital environments with physical, real-world environments, creating a mixed reality where relevant data and information augment the real experience in real time by spatial or semantic context. Pairing 3-D virtual immersive models with a dynamic platform such as semi-autonomous robotics or personnel odometry systems to create a mixed reality offers a new and innovative design-information verification and inspection capability, improved evaluation accuracy, and enhanced information-gathering capability for nuclear facilities. Our paper discusses the integration of two innovative technologies, 3-D visualizations with inertial positioning systems, and the resulting augmented reality offered to the human inspector. The discussion in the paper includes an exploration of human and non-human (surrogate) inspections of a nuclear facility, integrated safeguards knowledge within a synchronized virtual model operated, or worn, by a human inspector, and the anticipated benefits to safeguards evaluations of facility operations.
Age verification cards fail to fully prevent minors from accessing tobacco products.
Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi
2011-03-01
Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchase cigarettes, whether they had used age verification cards, and if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Someone outside the family was the most common source of cards. Surprisingly, around 5% of males and females belonging to the group with the highest smoking levels applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban on tobacco vending machines, not an age verification system, is needed to prevent sales to minors.
MVP-CA Methodology for the Expert System Advocate's Advisor (ESAA)
DOT National Transportation Integrated Search
1997-11-01
The Multi-Viewpoint Clustering Analysis (MVP-CA) tool is a semi-automated tool to provide a valuable aid for comprehension, verification, validation, maintenance, integration, and evolution of complex knowledge-based software systems. In this report,...
Evaluation and Research for Technology: Not Just Playing Around.
ERIC Educational Resources Information Center
Baker, Eva L.; O'Neil, Harold F., Jr.
2003-01-01
Discusses some of the challenges of technology-based training and education, the role of quality verification and evaluation, and strategies to integrate evaluation into the everyday design of technology-based systems for education and training. (SLD)
DOT National Transportation Integrated Search
2018-01-01
Connected vehicles (CVs) and their integration with transportation infrastructure provide new approaches to wrong-way driving (WWD) detection, warning, verification, and intervention that will help practitioners further reduce the occurrence and seve...
Comparison of OpenFOAM and EllipSys3D actuator line methods with (NEW) MEXICO results
NASA Astrophysics Data System (ADS)
Nathan, J.; Meyer Forsting, A. R.; Troldborg, N.; Masson, C.
2017-05-01
The Actuator Line Method has existed for more than a decade and has become a well-established choice for simulating wind rotors in computational fluid dynamics. Numerous implementations exist and are used in the wind energy research community. These codes have been verified against experimental data such as the MEXICO experiment, but verification against other codes has often been made only on a very broad scale. This study therefore first attempts a validation by comparing two different implementations, namely an adapted version of SOWFA/OpenFOAM and EllipSys3D, and then a verification by comparing against experimental results from the MEXICO and NEW MEXICO experiments.
Data requirements for verification of ram glow chemistry
NASA Technical Reports Server (NTRS)
Swenson, G. R.; Mende, S. B.
1985-01-01
A set of questions is posed regarding the surface chemistry producing the ram glow on the Space Shuttle. The questions concern verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of the measurements required to answer most of them, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV through the visible to the IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the Shuttle Orbiter cabin is also discussed.
Figures of Merit for Control Verification
NASA Technical Reports Server (NTRS)
Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.
2008-01-01
This paper proposes a methodology for evaluating a controller's ability to satisfy a set of closed-loop specifications when the plant has an arbitrary functional dependency on uncertain parameters. Control verification metrics applicable to deterministic and probabilistic uncertainty models are proposed. These metrics, which result from sizing the largest uncertainty set of a given class for which the specifications are satisfied, enable systematic assessment of competing control alternatives regardless of the methods used to derive them. A particularly attractive feature of the tools derived is that their efficiency and accuracy do not depend on the robustness of the controller. This is in sharp contrast to Monte Carlo based methods where the number of simulations required to accurately approximate the failure probability grows exponentially with its closeness to zero. This framework allows for the integration of complex, high-fidelity simulations of the integrated system and only requires standard optimization algorithms for its implementation.
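A simplified sketch of the set-sizing idea (the paper's deterministic and probabilistic metrics are more general than this): bisect on the scale of a hyper-rectangular uncertainty set about the nominal parameters, declaring a scale verified when every sampled parameter combination satisfies the closed-loop specifications. The spec check `specs_ok`, the sampling scheme, and the sample count are placeholders supplied by the user.

```python
# Bisection for the largest verified scale of a hyper-rectangular
# uncertainty set around nominal plant parameters. A sampled check is a
# crude stand-in for the paper's exact set-sizing machinery.
import numpy as np

def largest_safe_scale(specs_ok, nominal, half_widths, n_samples=500,
                       lo=0.0, hi=1.0, tol=1e-3, seed=0):
    rng = np.random.default_rng(seed)
    def all_ok(scale):
        # sample parameter points uniformly in the scaled hyper-rectangle
        U = rng.uniform(-1.0, 1.0, size=(n_samples, len(nominal)))
        points = nominal + scale * U * half_widths
        return all(specs_ok(p) for p in points)
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        lo, hi = (mid, hi) if all_ok(mid) else (lo, mid)
    return lo  # largest scale at which no sampled violation was found
```

Note the contrast the abstract draws: this search costs roughly the same whether the controller is robust or fragile, whereas estimating a near-zero failure probability by plain Monte Carlo would need enormous sample counts.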
TRAC-PF1 code verification with data from the OTIS test facility. [Once-Through Integral System]
DOE Office of Scientific and Technical Information (OSTI.GOV)
Childerson, M.T.; Fujita, R.K.
1985-01-01
A computer code (TRAC-PF1/MOD1) developed for predicting the transient thermal and hydraulic response of an integral nuclear steam supply system (NSSS) was benchmarked. Post-small-break loss-of-coolant accident (LOCA) data from a scaled experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small-break LOCA data set for TRAC verification. The major phases of a small-break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater-initiated boiler-condenser mode heat transfer.
NASA Technical Reports Server (NTRS)
Haigh, R.; Krimchansky, S. (Technical Monitor)
2000-01-01
This is the Performance Verification Report, METSAT (S/N 108) AMSU-A1 Receiver Assemblies P/N 1356429-1 S/N F05 and P/N 1356409-1 S/N F05, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The ATP for the AMSU-A Receiver Subsystem, AE-26002/6A, describes in detail the configuration of the test setups and the test procedures used to verify that the receiver subsystem meets the specifications required either in the AMSU-A Instrument Performance and Operation Specifications, S-480-80, or in the AMSU-A Receiver Subsystem Specifications, AE-26608, derived by Aerojet System Engineering. Test results that verify conformance to the specifications demonstrate the acceptability of that particular receiver subsystem.
NASA Technical Reports Server (NTRS)
Tamayo, Tak Chai
1987-01-01
Quality of software is not only vital to the successful operation of the space station, it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software exhibits no wearout and makes redundancy costly, so traditional statistical analysis is not suitable for evaluating software reliability. A statistical model was developed to represent the number and types of failures occurring during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives, and methods to estimate the expected number of fixes required, are also presented.
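The abstract does not name its statistical model; purely as a concrete illustration, the sketch below fits the classic Jelinski-Moranda reliability growth model to inter-failure times by maximum likelihood, from which the expected number of remaining faults (fixes still required) is the estimated fault count minus the failures already seen.

```python
# Jelinski-Moranda fit as an illustrative stand-in for the paper's model.
# Hazard before failure i is phi * (N - i + 1); the MLE for N is the root
# of the score below, which exists only when the data actually show
# reliability growth (later failures spaced wider apart on average).
import numpy as np
from scipy.optimize import brentq

def jm_fit(times):
    """times: inter-failure times t_1..t_n; returns (N_hat, phi_hat)."""
    t = np.asarray(times, dtype=float)
    n = len(t)
    i = np.arange(1, n + 1)
    def score(N):
        w = N - i + 1
        return np.sum(1.0 / w) - n * np.sum(t) / np.sum(w * t)
    N_hat = brentq(score, n + 1e-6, 1e6 * n)  # bracket is an assumption
    phi_hat = n / np.sum((N_hat - i + 1) * t)
    return N_hat, phi_hat

# Expected remaining fixes after n observed failures: N_hat - n
```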
LIHE Spectral Dynamics and Jaguar Data Acquisition System Measurement Assurance Results 2014.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Covert, Timothy T.; Willis, Michael David; Radtke, Gregg Arthur
2015-06-01
The Light Initiated High Explosive (LIHE) facility performs high-rigor, high-consequence impulse testing for the nuclear weapons (NW) community. To support the facility mission, LIHE's extensive data acquisition system (DAS) comprises several discrete components as well as a fully integrated system. Because of the high consequence and high rigor of the testing performed at LIHE, a measurement assurance plan (MAP) was developed in collaboration with NW system customers to meet their data quality needs and to provide assurance of the robustness of the LIHE DAS. While individual components of the DAS have been calibrated by the SNL Primary Standards Laboratory (PSL), the integrated nature of this complex system requires verification of the complete system from end to end. This MAP report documents the results of the verification and validation procedures used to ensure that the data quality meets customer requirements.
Direct and full-scale experimental verifications towards ground-satellite quantum key distribution
NASA Astrophysics Data System (ADS)
Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei
2013-05-01
Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method for communication, based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications to a typical low Earth orbit satellite. To achieve direct and full-scale verifications of our set-up, we carried out three independent experiments with a decoy-state QKD system, covering all of the demanding conditions. The system was operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.
Du, Hui; Chen, Xiaobo; Xi, Juntong; Yu, Chengyi; Zhao, Bao
2017-12-12
Large-scale surfaces are prevalent in advanced manufacturing industries, and 3D profilometry of these surfaces plays a pivotal role in quality control. This paper proposes a novel and flexible large-scale 3D scanning system assembled by combining a robot, a binocular structured-light scanner, and a laser tracker. The measurement principle and system construction of the integrated system are introduced. A mathematical model is established for the global data fusion. Subsequently, a robust method is introduced for the establishment of the end coordinate system. For hand-eye calibration, the calibration ball is observed by the scanner and the laser tracker simultaneously; with these data, the hand-eye relationship is solved, and an algorithm is built to obtain the transformation matrix between the end coordinate system and the world coordinate system. A validation experiment is designed to verify the proposed algorithms. First, a hand-eye calibration experiment is performed and the transformation matrix is computed. Then a car body rear is measured 22 times to verify the global data fusion algorithm, and its 3D shape is reconstructed successfully. To evaluate the precision of the proposed method, a metric tool is built and the results are presented.
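The paper's specific algorithms are not reproduced here; a common building block for this kind of calibration is the least-squares rigid transform between corresponding 3-D point sets (the Kabsch/SVD method), e.g. calibration-ball centers expressed in the scanner frame and in the laser-tracker frame.

```python
# Least-squares rigid transform (R, t) with Q ~= R @ P + t, computed by
# the Kabsch/SVD method from corresponding (n, 3) point arrays P and Q.
import numpy as np

def rigid_transform(P, Q):
    cp, cq = P.mean(axis=0), Q.mean(axis=0)
    H = (P - cp).T @ (Q - cq)          # cross-covariance of centered sets
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))
    D = np.diag([1.0, 1.0, d])         # guard against a reflection
    R = Vt.T @ D @ U.T
    t = cq - R @ cp
    return R, t
```

Given three or more non-collinear corresponding ball centers, this yields the frame-to-frame transformation matrix of the kind the abstract's hand-eye and world-alignment steps require.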
NASA Astrophysics Data System (ADS)
Herbuś, K.; Ociepka, P.
2017-08-01
This work analyses the sequential control system of a machine for separating and grouping workpieces for processing. The considered problem concerns verification of the operation of the actuator system of an electro-pneumatic control system equipped with a PLC controller; the operation of the actuators is verified against the logic relationships assumed in the control system. The actuators of the considered control system are three linear-motion drives (pneumatic cylinders), and the logical structure of the control system's operation is based on a signal flow graph. The tested logical structure of the electro-pneumatic control system was implemented in the Automation Studio software of B&R, which is used to create programs for PLC controllers. Next, a model of the actuator system of the machine's control system was created in the FluidSIM software. To verify the created PLC program by simulating the operation of the created model, the two programs were integrated using an OPC server as the tool for data exchange.
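As a hypothetical sketch of the OPC coupling (the server address and node identifiers are assumptions, and the python-opcua client stands in for whatever OPC interface the two packages actually expose), one cycle of the exchange might look like this:

```python
# Hypothetical OPC exchange between the PLC program and the pneumatic
# simulation: forward the commanded valve state to the simulated cylinder,
# then feed the simulated limit-switch state back to the PLC.
from opcua import Client

client = Client("opc.tcp://localhost:4840")   # address is an assumption
client.connect()
try:
    # PLC -> simulation: commanded valve state for cylinder A
    cmd = client.get_node("ns=2;s=PLC.CylinderA.Extend").get_value()
    client.get_node("ns=2;s=Sim.CylinderA.ValveCmd").set_value(cmd)
    # simulation -> PLC: limit-switch feedback
    at_end = client.get_node("ns=2;s=Sim.CylinderA.AtEnd").get_value()
    client.get_node("ns=2;s=PLC.CylinderA.LimitSwitch").set_value(at_end)
finally:
    client.disconnect()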
The greenhouse theory of climate change - A test by an inadvertent global experiment
NASA Technical Reports Server (NTRS)
Ramanathan, V.
1988-01-01
The greenhouse theory of climate change has reached the crucial stage of verification. Surface warming as large as that predicted by models would be unprecedented during an interglacial period such as the present. The theory, its scope for verification, and the emerging complexities of the climate feedback mechanisms are discussed in this paper. The evidence for change is described and competing nonclimatic forcings are discussed.
An IBM PC-based math model for space station solar array simulation
NASA Technical Reports Server (NTRS)
Emanuel, E. M.
1986-01-01
This report discusses and documents the design, development, and verification of a microcomputer-based solar cell math model for simulating the Space Station's solar array Initial Operational Capability (IOC) reference configuration. The array model is developed utilizing a linear solar cell dc math model requiring only five input parameters: short circuit current, open circuit voltage, maximum power voltage, maximum power current, and orbit inclination. The accuracy of this model is investigated using actual on-orbit solar array electrical data derived from the Solar Array Flight Experiment/Dynamic Augmentation Experiment (SAFE/DAE), conducted during the STS-41D mission. This simulator provides real-time simulated performance data during the steady-state portion of the Space Station orbit (i.e., array fully exposed to sunlight). Eclipse-to-sunlight transients and shadowing effects are not included in the analysis, but are discussed briefly. Integrating the Solar Array Simulator (SAS) into the Power Management and Distribution (PMAD) subsystem is also discussed.
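A minimal sketch of one possible piecewise-linear I-V approximation built from the four electrical input parameters named above (orbit inclination, which drives the illumination profile, is omitted); the report's actual linear model may differ in form.

```python
# Two-segment linear I-V curve through (0, Isc), (Vmp, Imp), (Voc, 0),
# as an illustrative stand-in for the report's linear dc cell model.
def cell_current(v, isc, voc, vmp, imp):
    """Cell current at terminal voltage v (same units as the parameters)."""
    if v <= vmp:
        return isc + (imp - isc) * v / vmp       # segment up to max power
    return imp * (voc - v) / (voc - vmp)          # segment down to Voc

# Example I-V sweep with illustrative (not Space Station) cell values:
params = dict(isc=2.6, voc=0.55, vmp=0.45, imp=2.4)
curve = [(0.0055 * k, cell_current(0.0055 * k, **params)) for k in range(101)]
```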
WTS-4 system verification unit for wind/hydroelectric integration study
NASA Technical Reports Server (NTRS)
Watts, A. W.
1982-01-01
The Bureau of Reclamation (Reclamation) initiated a study to investigate the concept of integrating 100 MW of wind energy from megawatt-size wind turbines with the Federal hydroelectric system. As a part of the study, one large wind turbine was purchased through the competitive bid process and is now being installed to serve as a system verification unit (SVU). Reclamation negotiated an agreement with NASA to provide technical management of the project for the design, fabrication, installation, testing, and initial operation. Hamilton Standard was awarded a contract to furnish and install its WTS-4 wind turbine rated at 4 MW at a site near Medicine Bow, Wyoming. The purposes for installing the SVU are to fully evaluate the wind/hydro integration concept, make technical evaluation of the hardware design, train personnel in the technology, evaluate operation and maintenance aspects, and evaluate associated environmental impacts. The SVU will be operational in June 1982. Data from the WTS-4 and from a second SVU, Boeing's MOD-2, will be used to prepare a final design for a 100-MW farm if Congress authorizes the project.
Collusion-aware privacy-preserving range query in tiered wireless sensor networks.
Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping
2014-12-11
Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals.
Collusion-Aware Privacy-Preserving Range Query in Tiered Wireless Sensor Networks†
Zhang, Xiaoying; Dong, Lei; Peng, Hui; Chen, Hong; Zhao, Suyun; Li, Cuiping
2014-01-01
Wireless sensor networks (WSNs) are indispensable building blocks for the Internet of Things (IoT). With the development of WSNs, privacy issues have drawn more attention. Existing work on the privacy-preserving range query mainly focuses on privacy preservation and integrity verification in two-tiered WSNs in the case of compromised master nodes, but neglects the damage of node collusion. In this paper, we propose a series of collusion-aware privacy-preserving range query protocols in two-tiered WSNs. To the best of our knowledge, this paper is the first to consider collusion attacks for a range query in tiered WSNs while fulfilling the preservation of privacy and integrity. To preserve the privacy of data and queries, we propose a novel encoding scheme to conceal sensitive information. To preserve the integrity of the results, we present a verification scheme using the correlation among data. In addition, two schemes are further presented to improve result accuracy and reduce communication cost. Finally, theoretical analysis and experimental results confirm the efficiency, accuracy and privacy of our proposals. PMID:25615731
NASA Technical Reports Server (NTRS)
Davies, Misty D.; Gundy-Burlet, Karen
2010-01-01
A useful technique for the validation and verification of complex flight systems is Monte Carlo Filtering -- a global sensitivity analysis that tries to find the inputs and ranges that are most likely to lead to a subset of the outputs. A thorough exploration of the parameter space for complex integrated systems may require thousands of experiments and hundreds of controlled and measured variables. Tools for analyzing this space often have limitations caused by the numerical problems associated with high dimensionality and by the assumption that all of the dimensions are independent. To combat both of these limitations, we propose a technique that uses a combination of the original variables with the derived variables obtained during a principal component analysis.
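A minimal sketch of the proposed combination, assuming a user-supplied matrix of input samples X, model outputs y, and a behavioral predicate defining the output subset of interest: each original input column and each principal-component score is ranked by a two-sample Kolmogorov-Smirnov statistic between behavioral and non-behavioral runs.

```python
# Monte Carlo filtering over original inputs augmented with their
# principal components; larger KS statistics indicate variables whose
# distribution differs most between behavioral and other runs.
import numpy as np
from scipy.stats import ks_2samp

def mc_filter(Z, behavioral_mask):
    """One KS statistic per column of Z, behavioral vs. the rest."""
    return np.array([ks_2samp(Z[behavioral_mask, j],
                              Z[~behavioral_mask, j]).statistic
                     for j in range(Z.shape[1])])

def augmented_sensitivities(X, y, is_behavioral):
    mask = np.asarray([is_behavioral(v) for v in y])
    Xc = X - X.mean(axis=0)
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    scores = Xc @ Vt.T                 # principal-component coordinates
    Z = np.hstack([X, scores])         # original + derived variables
    return mc_filter(Z, mask)
```

Ranking the derived variables alongside the originals is what addresses the independence assumption: a sensitivity concentrated in a principal component reveals an influential combination of inputs that no single original column would expose.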
Firing Room Remote Application Software Development
NASA Technical Reports Server (NTRS)
Liu, Kan
2015-01-01
The Engineering and Technology Directorate (NE) at the National Aeronautics and Space Administration (NASA) Kennedy Space Center (KSC) is designing a new command and control system for the checkout and launch of the Space Launch System (SLS) and future rockets. The purposes of the semester-long internship as a remote application software developer included the design, development, integration, and verification of the software and hardware in the firing rooms, in particular for the Mobile Launcher (ML) Launch Accessories (LACC) subsystem. In addition, a software test verification procedure document was created to verify and check out the LACC software for Launch Equipment Test Facility (LETF) testing.
User-friendly design approach for analog layout design
NASA Astrophysics Data System (ADS)
Li, Yongfu; Lee, Zhao Chuan; Tripathi, Vikas; Perez, Valerio; Ong, Yoong Seang; Hui, Chiu Wing
2017-03-01
Analog circuits are sensitive to changes in layout environment conditions, manufacturing processes, and variations. This paper presents an analog verification flow with five types of analog-focused layout constraint checks to assist engineers in identifying potential device mismatches and layout drawing mistakes. Compared to several existing solutions, our approach requires only the layout design, which is sufficient to recognize all the matched devices. Our approach simplifies data preparation and allows seamless integration into the layout environment with minimum disruption to the custom layout flow. This user-friendly analog verification flow gives the engineer more confidence in the quality of their layouts.
STS-2: SAIL non-avionics subsystems math model requirements
NASA Technical Reports Server (NTRS)
Bennett, W. P.; Herold, R. W.
1980-01-01
Simulation of the STS-2 Shuttle nonavionics subsystems in the Shuttle Avionics Integration Laboratory (SAIL) is necessary for verification of the integrated Shuttle avionics system. The math model (simulation) requirements for each of the nonavionics subsystems that interface with the Shuttle avionics system are documented, and a single source document is provided for controlling changes to the math models approved by the SAIL change control panel.
1. Exterior view of Signal Transfer Building (T28A), looking southwest. ...
1. Exterior view of Signal Transfer Building (T-28A), looking southwest. This structure houses controls for propellant transfer, instrumentation for testing, test data transmission receivers, data verification equipment, and centralized utilities for the Systems Integration Laboratory complex. The gantries of the Systems Integration Laboratory Building (T-28) are visible to the rear of this structure. - Air Force Plant PJKS, Systems Integration Laboratory, Signal Transfer Building, Waterton Canyon Road & Colorado Highway 121, Lakewood, Jefferson County, CO
Protection of the electronic components of measuring equipment from the X-ray radiation
NASA Astrophysics Data System (ADS)
Perez Vasquez, N. O.; Kostrin, D. K.; Uhov, A. A.
2018-02-01
In this work, the effect of X-ray radiation on the operation of the integrated circuits of measurement equipment is discussed. The results of calculations for a shielding system, which allows integrated circuits with a high degree of integration to be used in the vicinity of an X-ray source, are shown. The results of the verification of two measurement devices that were used for more than five years in a facility for the training and testing of X-ray tubes are presented.
NASA Technical Reports Server (NTRS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.;
2016-01-01
NASA's James Webb Space Telescope (JWST) is a 6.6m diameter, segmented, deployable telescope for cryogenic IR space astronomy (40 K). The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM) that contains four science instruments (SI) and the fine guider. The SIs are mounted to a composite metering structure. The SI and guider units were integrated to the ISIM structure and optically tested at the NASA Goddard Space Flight Center as a suite using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
NASA Astrophysics Data System (ADS)
Antonille, Scott R.; Miskey, Cherie L.; Ohl, Raymond G.; Rohrbach, Scott O.; Aronstein, David L.; Bartoszyk, Andrew E.; Bowers, Charles W.; Cofie, Emmanuel; Collins, Nicholas R.; Comber, Brian J.; Eichhorn, William L.; Glasse, Alistair C.; Gracey, Renee; Hartig, George F.; Howard, Joseph M.; Kelly, Douglas M.; Kimble, Randy A.; Kirk, Jeffrey R.; Kubalak, David A.; Landsman, Wayne B.; Lindler, Don J.; Malumuth, Eliot M.; Maszkiewicz, Michael; Rieke, Marcia J.; Rowlands, Neil; Sabatke, Derek S.; Smith, Corbett T.; Smith, J. Scott; Sullivan, Joseph F.; Telfer, Randal C.; Te Plate, Maurice; Vila, M. Begoña.; Warner, Gerry D.; Wright, David; Wright, Raymond H.; Zhou, Julia; Zielinski, Thomas P.
2016-09-01
NASA's James Webb Space Telescope (JWST) is a 6.5m diameter, segmented, deployable telescope for cryogenic IR space astronomy. The JWST Observatory includes the Optical Telescope Element (OTE) and the Integrated Science Instrument Module (ISIM), that contains four science instruments (SI) and the Fine Guidance Sensor (FGS). The SIs are mounted to a composite metering structure. The SIs and FGS were integrated to the ISIM structure and optically tested at NASA's Goddard Space Flight Center using the Optical Telescope Element SIMulator (OSIM). OSIM is a full-field, cryogenic JWST telescope simulator. SI performance, including alignment and wavefront error, was evaluated using OSIM. We describe test and analysis methods for optical performance verification of the ISIM Element, with an emphasis on the processes used to plan and execute the test. The complexity of ISIM and OSIM drove us to develop a software tool for test planning that allows for configuration control of observations, implementation of associated scripts, and management of hardware and software limits and constraints, as well as tools for rapid data evaluation, and flexible re-planning in response to the unexpected. As examples of our test and analysis approach, we discuss how factors such as the ground test thermal environment are compensated in alignment. We describe how these innovative methods for test planning and execution and post-test analysis were instrumental in the verification program for the ISIM element, with enough information to allow the reader to consider these innovations and lessons learned in this successful effort in their future testing for other programs.
Study for verification testing of the helmet-mounted display in the Japanese Experimental Module.
Nakajima, I; Yamamoto, I; Kato, H; Inokuchi, S; Nemoto, M
2000-02-01
Our purpose is to propose a research and development project in the field of telemedicine. The proposed Multimedia Telemedicine Experiment for Extra-Vehicular Activity will entail experiments designed to support astronaut health management during Extra-Vehicular Activity (EVA). Experiments will have relevant applications to the Japanese Experimental Module (JEM) operated by National Space Development Agency of Japan (NASDA) for the International Space Station (ISS). In essence, this is a proposal for verification testing of the Helmet-Mounted Display (HMD), which enables astronauts to verify their own blood pressures and electrocardiograms, and to view a display of instructions from the ground station and listings of work procedures. Specifically, HMD is a device designed to project images and data inside the astronaut's helmet. We consider this R&D proposal to be one of the most suitable projects under consideration in response to NASDA's open invitation calling for medical experiments to be conducted on JEM.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Samuel, D; Testa, M; Park, Y
Purpose: In-vivo dose and beam range verification in proton therapy could play significant roles in proton treatment validation and improvement. In-vivo beam range verification, in particular, could enable new treatment techniques, one of which, for example, could be the use of anterior fields for prostate treatment instead of the opposed lateral fields used in current practice. We have developed and commissioned an integrated system with hardware, software, and workflow protocols to provide a complete solution simultaneously for both in-vivo dosimetry and range verification for proton therapy. Methods: The system uses a matrix of diodes, up to 12 in total, but separable into three groups for flexibility in application. A special amplifier was developed to capture extremely small signals from very low proton beam current. The software was developed within iMagX, a general platform for image processing in radiation therapy applications. The range determination exploits the inherent relationship between the internal range modulation clock of the proton therapy system and the radiological depth at the point of measurement. The commissioning of the system, for in-vivo dosimetry and for range verification, was conducted separately using an anthropomorphic phantom. EBT films and TLDs were used for dose comparisons, and a range scan of the beam's distal fall-off was used as ground truth for range verification. Results: For in-vivo dose measurement, the results were in agreement with TLDs and EBT films and were within 3% of treatment planning calculations. For range verification, a precision of 0.5mm is achieved in homogeneous phantoms, and a precision of 2mm for the anthropomorphic pelvic phantom, except at points with significant range mixing. Conclusion: We completed the commissioning of our system for in-vivo dosimetry and range verification in proton therapy. The results suggest that the system is ready for clinical trials on patients.
Verification and Implementation of Operations Safety Controls for Flight Missions
NASA Technical Reports Server (NTRS)
Jones, Cheryl L.; Smalls, James R.; Carrier, Alicia S.
2010-01-01
Approximately eleven years ago, the International Space Station program launched its first module from Russia, the Functional Cargo Block (FGB). Safety and Mission Assurance (S&MA) Operations (Ops) Engineers played an integral part in that endeavor by executing strict flight product verification and by continuously staffing S&MA's console in the Mission Evaluation Room (MER) for that flight mission. How were these engineers able to conduct such a complicated task? They based it on product verification that consisted of ensuring that safety requirements were adequately contained in all flight products affecting crew safety. S&MA Ops engineers apply both systems engineering and project management principles in order to gain an appropriate level of technical knowledge for thorough reviews covering the affected subsystem(s). They also ensured that mission priorities were carried out with great attention to detail and with success.
NASA Astrophysics Data System (ADS)
Guo, Bing; Documet, Jorge; Liu, Brent; King, Nelson; Shrestha, Rasu; Wang, Kevin; Huang, H. K.; Grant, Edward G.
2006-03-01
The paper describes the methodology for the clinical design and implementation of a Location Tracking and Verification System (LTVS) that has distinct benefits for the Imaging Department at the Healthcare Consultation Center II (HCCII), an outpatient imaging facility located on the USC Health Science Campus. A novel system was developed that tracks and verifies patients and staff in a clinical environment using wireless and facial biometric technology, in order to streamline patient workflow, protect against erroneous examinations, and create a security zone that prevents and audits unauthorized access to patient healthcare data under the HIPAA mandate. This paper describes the system design and integration methodology, based on initial clinical workflow studies within a clinical environment. An outpatient center was chosen as the first step for the development and implementation of this system.
Automated Analysis of Stateflow Models
NASA Technical Reports Server (NTRS)
Bourbouh, Hamza; Garoche, Pierre-Loic; Garion, Christophe; Gurfinkel, Arie; Kahsaia, Temesghen; Thirioux, Xavier
2017-01-01
Stateflow is a widely used modeling framework for embedded and cyber-physical systems in which control software interacts with physical processes. In this work, we present a framework and a fully automated safety verification technique for Stateflow models. Our approach is twofold: (i) we faithfully compile Stateflow models into hierarchical state machines, and (ii) we use an automated logic-based verification engine to decide the validity of safety properties. The starting point of our approach is a denotational semantics of Stateflow. We propose a compilation process using continuation-passing style (CPS) denotational semantics. Our compilation technique preserves the structural and modal behavior of the system. The overall approach is implemented as an open-source toolbox that can be integrated into the existing MathWorks Simulink/Stateflow modeling framework. We present preliminary experimental evaluations that illustrate the effectiveness of our approach in code generation and safety verification of industrial-scale Stateflow models.
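As a toy illustration of the kind of safety question such a verification engine decides (shown here with explicit-state reachability rather than the paper's logic-based engines, and with a purely hypothetical flattened machine):

    from collections import deque

    # Hypothetical flattened transition relation: state -> successor states.
    transitions = {
        "Init": {"Run"},
        "Run": {"Run", "Brake"},
        "Brake": {"Init"},
    }
    unsafe = {"Overspeed"}  # states that would violate the safety property

    def is_safe(initial="Init"):
        # The safety property holds iff no unsafe state is reachable
        # from the initial state (breadth-first exploration).
        seen, frontier = {initial}, deque([initial])
        while frontier:
            state = frontier.popleft()
            if state in unsafe:
                return False
            for succ in transitions.get(state, ()):
                if succ not in seen:
                    seen.add(succ)
                    frontier.append(succ)
        return True

    print(is_safe())  # True: "Overspeed" is unreachable in this toy model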
LISA verification binaries with updated distances from Gaia Data Release 2
NASA Astrophysics Data System (ADS)
Kupfer, T.; Korol, V.; Shah, S.; Nelemans, G.; Marsh, T. R.; Ramsay, G.; Groot, P. J.; Steeghs, D. T. H.; Rossi, E. M.
2018-06-01
Ultracompact binaries with orbital periods less than a few hours will dominate the gravitational wave signal in the mHz regime. Until recently, 10 systems were expected to have a predicted gravitational wave signal strong enough to be detectable by the Laser Interferometer Space Antenna (LISA), the so-called `verification binaries'. System parameters, including distances, are needed to provide an accurate prediction of the expected gravitational wave strength to be measured by LISA. Using parallaxes from Gaia Data Release 2, we calculate signal-to-noise ratios (SNR) for ≈50 verification binary candidates. We find that 11 binaries reach an SNR ≥ 20, two further binaries reach an SNR ≥ 5, and three more systems are expected to reach an SNR ≈ 5 after four years of integration with LISA. For these 16 systems we present predictions of the gravitational wave amplitude (A), together with Fisher-information-matrix uncertainties on the amplitude (A) and inclination (ι).
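For orientation, the amplitude being predicted here takes the standard form for a monochromatic binary of chirp mass M_c at distance d emitting at frequency f (symbols follow common usage; geometric averaging factors in the SNR are omitted):

    \mathcal{A} = \frac{2\,(G\mathcal{M}_c)^{5/3}\,(\pi f)^{2/3}}{c^{4}\,d},
    \qquad
    \mathcal{M}_c = \frac{(m_1 m_2)^{3/5}}{(m_1 + m_2)^{1/5}},
    \qquad
    \mathrm{SNR} \propto \mathcal{A}\,\sqrt{\frac{T_{\mathrm{obs}}}{S_n(f)}}

where S_n(f) is the LISA noise power spectral density. A revised Gaia parallax thus propagates directly into the predicted amplitude, and the SNR grows with the square root of the observation time, consistent with the four-year figures quoted above.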
Guidance and Control Software Project Data - Volume 3: Verification Documents
NASA Technical Reports Server (NTRS)
Hayhurst, Kelly J. (Editor)
2008-01-01
The Guidance and Control Software (GCS) project was the last in a series of software reliability studies conducted at Langley Research Center between 1977 and 1994. The technical results of the GCS project were recorded after the experiment was completed. Some of the support documentation produced as part of the experiment, however, is serving an unexpected role far beyond its original project context. Some of the software used as part of the GCS project was developed to conform to the RTCA/DO-178B software standard, "Software Considerations in Airborne Systems and Equipment Certification," used in the civil aviation industry. That standard requires extensive documentation throughout the software development life cycle, including plans, software requirements, design and source code, verification cases and results, and configuration management and quality control data. The project documentation that includes this information is open for public scrutiny without the legal or safety implications associated with comparable data from an avionics manufacturer. This public availability has afforded an opportunity to use the GCS project documents for DO-178B training. This report provides a brief overview of the GCS project, describes the 4-volume set of documents and the role they are playing in training, and includes the verification documents from the GCS project. Volume 3 contains four appendices: A. Software Verification Cases and Procedures for the Guidance and Control Software Project; B. Software Verification Results for the Pluto Implementation of the Guidance and Control Software; C. Review Records for the Pluto Implementation of the Guidance and Control Software; and D. Test Results Logs for the Pluto Implementation of the Guidance and Control Software.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Criscenti, Louise Jacqueline; Sassani, David Carl; Arguello, Jose Guadalupe, Jr.
2011-02-01
This report describes the progress in fiscal year 2010 in developing the Waste Integrated Performance and Safety Codes (IPSC) in support of the U.S. Department of Energy (DOE) Office of Nuclear Energy Advanced Modeling and Simulation (NEAMS) Campaign. The goal of the Waste IPSC is to develop an integrated suite of computational modeling and simulation capabilities to quantitatively assess the long-term performance of waste forms in the engineered and geologic environments of a radioactive waste storage or disposal system. The Waste IPSC will provide this simulation capability (1) for a range of disposal concepts, waste form types, engineered repository designs, and geologic settings, (2) for a range of time scales and distances, (3) with appropriate consideration of the inherent uncertainties, and (4) in accordance with robust verification, validation, and software quality requirements. Waste IPSC activities in fiscal year 2010 focused on specifying a challenge problem to demonstrate proof of concept, developing a verification and validation plan, and performing an initial gap analysis to identify candidate codes and tools to support the development and integration of the Waste IPSC. The current Waste IPSC strategy is to acquire and integrate the necessary Waste IPSC capabilities wherever feasible, and to develop only those capabilities that cannot be acquired or suitably integrated, verified, or validated. This year-end progress report documents the FY10 status of acquisition, development, and integration of thermal-hydrologic-chemical-mechanical (THCM) code capabilities, frameworks, and enabling tools and infrastructure.
Practical Application of Model Checking in Software Verification
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Skakkebaek, Jens Ulrik
1999-01-01
This paper presents our experiences in applying JAVA PATHFINDER (JPF), a recently developed JAVA-to-SPIN translator, to find synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of JPF and the subset of JAVA that it supports, and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary to obtain sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.
Design and Verification Guidelines for Vibroacoustic and Transient Environments
NASA Technical Reports Server (NTRS)
1986-01-01
Design and verification guidelines for vibroacoustic and transient environments contain many basic methods that are common throughout the aerospace industry. However, there are some significant differences in methodology between NASA/MSFC and others, both government agencies and contractors. The purpose of this document is to provide the general guidelines used by the Component Analysis Branch, ED23, at MSFC for applying vibroacoustic and transient technology to all launch vehicle and payload components and experiments managed by NASA/MSFC. This document is intended as a tool to be utilized by MSFC program management and their contractors as a guide for the design and verification of flight hardware.
6th Annual CMMI Technology Conference and User Group
2006-11-17
Fragments recoverable from this presentation extract: a Decision Table (DT) is a tabular representation with tailoring options; the material is written to reflect the experience of the author; software engineering led the process charge in the '80s, using flowcharts and CASE tools; verification steps include EPG configuration audits and EPG configuration status reports, together with flowcharts and Entry, Task, Verification, and eXit (ETVX) descriptions.
Which Accelerates Faster--A Falling Ball or a Porsche?
ERIC Educational Resources Information Center
Rall, James D.; Abdul-Razzaq, Wathiq
2012-01-01
An introductory physics experiment has been developed to address issues seen in conventional physics lab classes, including assumption verification, technological dependencies, and real-world motivation for the experiment. The experiment has little technology dependence and compares the acceleration due to gravity by using position versus time…
NASA Astrophysics Data System (ADS)
Amyay, Omar
A method defined in terms of synthesis and verification steps is presented. The specification of the services and protocols of communication within a multilayered architecture of the Open Systems Interconnection (OSI) type is an essential issue in the design of computer networks. The aim is to obtain an operational specification of the service-protocol couple of a given layer. The planned synthesis and verification steps constitute a specification trajectory, based on the progressive integration of the 'initial data' constraints and on verification of the specification originating from each synthesis step against validity constraints that characterize an admissible solution. Two types of trajectory are proposed, according to the style of the initial specification of the service-protocol couple: an operational type (service-supplier viewpoint) and a knowledge-property-oriented type (service viewpoint). Synthesis and verification activities were developed and formalized in terms of labeled transition systems, temporal logic, and epistemic logic. The originality of the second specification trajectory and the use of epistemic logic are shown. An 'artificial intelligence' approach enables a conceptual model to be defined for a knowledge-based system implementing the proposed method. It is structured in three levels of representation: the knowledge relating to the domain; the reasoning characterizing synthesis and verification activities; and the planning of the steps of a specification trajectory.
NASA Astrophysics Data System (ADS)
Da Silva, A.; Sánchez Prieto, S.; Polo, O.; Parra Espada, P.
2013-05-01
Because of the tough robustness requirements in space software development, it is imperative to carry out verification tasks at a very early development stage to ensure that the implemented exception mechanisms work properly. All this should be done long before the real hardware is available. Even when real hardware is available, verifying software fault tolerance mechanisms can be difficult, since real faulty situations must be brought about systematically and artificially, which can be impossible on real hardware. To solve this problem, the Alcala Space Research Group (SRG) has developed a LEON2 virtual platform (Leon2ViP) with fault injection capabilities. This makes it possible to run exactly the same target binary software as runs on the physical system, in a more controlled and deterministic environment, allowing stricter requirements verification. Leon2ViP enables unmanned and tightly focused fault injection campaigns, not possible otherwise, to expose and diagnose flaws in the software implementation early. Furthermore, the use of a virtual hardware-in-the-loop approach makes it possible to carry out preliminary integration tests with the spacecraft emulator or the sensors. The use of Leon2ViP has meant a significant improvement, in both time and cost, in the development and verification of the Instrument Control Unit boot software on board Solar Orbiter's Energetic Particle Detector.
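A fault-injection campaign of this kind can be sketched in a few lines; `run_emulator` stands in for the virtual platform and is purely hypothetical, since Leon2ViP's actual interface is not described here:

    import random

    def seu_campaign(run_emulator, ram_size, n_faults, seed=0):
        # Minimal single-event-upset campaign: flip one random RAM bit per
        # run and record how the boot software reacts. `run_emulator` is a
        # hypothetical callback wrapping the virtual platform; it takes a
        # (byte_address, bit_index) fault and returns an outcome string,
        # e.g. "ok", "detected", or "hang".
        rng = random.Random(seed)  # fixed seed keeps the campaign repeatable
        results = []
        for _ in range(n_faults):
            fault = (rng.randrange(ram_size), rng.randrange(8))
            results.append((fault, run_emulator(fault)))
        return results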
DOT National Transportation Integrated Search
2012-03-01
The Federal Motor Vehicle Safety Standards (FMVSS) establish minimum levels for vehicle safety, and manufacturers of motor vehicle and equipment items must comply with these standards. The National Highway Traffic Safety Administration (NHTSA) contra...
In-use testing of diesel emission control technologies is an integral component of EPA's verification program. EPA identified and recovered a variety of retrofit devices installed on heavy-duty vehicles for testing.
Human Support Technology Research to Enable Exploration
NASA Technical Reports Server (NTRS)
Joshi, Jitendra
2003-01-01
Contents include the following: Advanced life support. System integration, modeling, and analysis. Progressive capabilities. Water processing. Air revitalization systems. Why advanced CO2 removal technology? Solid waste resource recovery systems: lyophilization. ISRU technologies for Mars life support. Atmospheric resources of Mars. N2 consumable/make-up for Mars life. Integrated test beds. Monitoring and controlling the environment. Ground-based commercial technology. Optimizing size vs capability. Water recovery systems. Flight verification topics.
Supporting the Use of CERT (registered trademark) Secure Coding Standards in DoD Acquisitions
2012-07-01
Morrow, Tim (Software Engineering Institute); Seacord, Robert (Software Engineering Institute). Report CMU/SEI-2012-TN-016. Recoverable fragments reference Capability Maturity Model Integration (CMMI) [Davis 2009], the Team Software Process (TSP), a Software Test Plan (STP), a Test and Evaluation Plan (TEP), and verification and validation (V&V).
Review of Data Integrity Models in Multi-Level Security Environments
2011-02-01
Recoverable fragments describe Clark-Wilson-style integrity rules: E-2 (an extension of E-1) allows only executions described in a (User, TP, (CDIs)) relation; E-3 requires users to be authenticated before a TP may be invoked; and authentication and verification procedures are provided for upgrading the integrity of certain objects. Integrity here covers the self-consistency of interdependent data and the consistency of data with the real-world environment, as well as the prevention of authorized users from making improper modifications.
Residual Negative Pressure in Vacuum Tubes Might Increase the Risk of Spurious Hemolysis.
Xiao, Tong-Tong; Zhang, Qiao-Xin; Hu, Jing; Ouyang, Hui-Zhen; Cai, Ying-Mu
2017-05-01
We planned a study to establish whether spurious hemolysis may occur when negative pressure remains in vacuum tubes. Four tubes with different vacuum levels (-54, -65, -74, and -86 kPa) were used to examine blood drawn from one healthy volunteer; the tubes were allowed to stand for different times (1, 2, 3, and 4 hours). The plasma was separated and immediately tested for free hemoglobin (FHb). Thirty patients were enrolled in a verification experiment. The degree of hemolysis observed was greater when the remaining negative pressure was higher. Significant differences were recorded in the verification experiment. The results suggest that residual negative pressure might increase the risk of spurious hemolysis.
NASA Technical Reports Server (NTRS)
Mccllough, J. R.; Sharpe, A.; Doetsch, K. H.
1980-01-01
The SIMFAC has played a vital role in the design, development, and performance verification of the shuttle remote manipulator system (SRMS) to be installed in the space shuttle orbiter. The facility provides for realistic man-in-the-loop operation of the SRMS by an operator in the operator complex, a flightlike crew station patterned after the orbiter aft flight deck with all necessary man-machine interface elements, including SRMS displays and controls and simulated out-of-the-window and CCTV scenes. The characteristics of the manipulator system, including arm and joint servo dynamics and control algorithms, are simulated by a comprehensive mathematical model within the simulation subsystem of the facility. Major studies carried out using SIMFAC include: SRMS parameter sensitivity evaluations; the development, evaluation, and verification of operating procedures; and malfunction simulation and analysis of malfunction performance. Among the most important and comprehensive man-in-the-loop simulations carried out to date on SIMFAC are those supporting SRMS performance verification and certification when the SRMS is part of the integrated orbiter-manipulator system.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent
The Advanced Test Reactor (ATR) has always had a comprehensive procedure for verifying the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, calibration verification and response time testing of these transmitters and sensors are performed remotely, automatically, and hands-off; cover more of the system; and can be carried out at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that has been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources on other equipment reliability needs. More importantly, implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
NASA Astrophysics Data System (ADS)
Fox, Neil I.; Micheas, Athanasios C.; Peng, Yuqiang
2016-07-01
This paper introduces the use of Bayesian full Procrustes shape analysis in object-oriented meteorological applications. In particular, the Procrustes methodology is used to generate mean forecast precipitation fields from a set of ensemble forecasts. This approach has advantages over other ensemble averaging techniques in that it can produce a forecast that retains the morphological features of the precipitation structures and present the range of forecast outcomes represented by the ensemble. The production of the ensemble mean avoids the problems of smoothing that result from simple pixel or cell averaging, while producing credible sets that retain information on ensemble spread. Also in this paper, the full Bayesian Procrustes scheme is used as an object verification tool for precipitation forecasts. This is an extension of a previously presented Procrustes shape analysis based verification approach into a full Bayesian format designed to handle the verification of precipitation forecasts that match objects from an ensemble of forecast fields to a single truth image. The methodology is tested on radar reflectivity nowcasts produced in the Warning Decision Support System - Integrated Information (WDSS-II) by varying parameters in the K-means cluster tracking scheme.
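For a flavor of the underlying operation, the classical (non-Bayesian) Procrustes mean of a set of matched object outlines can be computed as below; this is only the point-estimate core of the full Bayesian scheme the paper develops, and all names are illustrative:

    import numpy as np
    from scipy.spatial import procrustes

    def procrustes_mean(outlines, n_iter=5):
        # outlines: list of (n_points, 2) arrays with corresponding vertices,
        # e.g. matched boundary points of forecast precipitation objects.
        ref = outlines[0]
        for _ in range(n_iter):
            # Align every member to the current reference (removing
            # translation, scale, and rotation), then re-average; the mean
            # shape retains the members' morphological features.
            aligned = [procrustes(ref, shape)[1] for shape in outlines]
            ref = np.mean(aligned, axis=0)
        return ref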
Implementing Kanban for agile process management within the ALMA Software Operations Group
NASA Astrophysics Data System (ADS)
Reveco, Johnny; Mora, Matias; Shen, Tzu-Chiang; Soto, Ruben; Sepulveda, Jorge; Ibsen, Jorge
2014-07-01
After the inauguration of the Atacama Large Millimeter/submillimeter Array (ALMA), the Software Operations Group in Chile has refocused its objectives on: (1) providing software support for tasks related to System Integration, Scientific Commissioning and Verification, and Early Science observations; (2) testing the remaining software features, still under development by the Integrated Computing Team across the world; and (3) designing and developing processes to optimize and increase the level of automation of operational tasks. Because of their different stakeholders, these tasks vary widely in importance, lifespan, and complexity. Aiming to give every task proper priority and traceability without stressing our engineers, we introduced the Kanban methodology into our processes in order to balance the demand on the team against the throughput of the delivered work. The aim of this paper is to share experiences gained during the implementation of Kanban in our processes, describing the difficulties we found and the solutions and adaptations that led to our current, still-evolving implementation, which has greatly improved our throughput, prioritization, and problem traceability.
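The core Kanban rule balancing demand against throughput is simple enough to state in code: a column refuses new work once its work-in-progress limit is hit, pushing the decision upstream. A minimal sketch (names illustrative, not ALMA's actual tooling):

    class KanbanColumn:
        # A WIP-limited column on a Kanban board: pulling new work is
        # refused once the work-in-progress limit is reached.
        def __init__(self, name, wip_limit):
            self.name, self.wip_limit, self.items = name, wip_limit, []

        def pull(self, task):
            if len(self.items) >= self.wip_limit:
                return False  # signal upstream: column is full
            self.items.append(task)
            return True

    col = KanbanColumn("In Progress", wip_limit=3)
    accepted = [col.pull(t) for t in ("T1", "T2", "T3", "T4")]
    print(accepted)  # [True, True, True, False]: the fourth task must wait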
MPLM Donatello is offloaded at the SLF
NASA Technical Reports Server (NTRS)
2001-01-01
At the KSC Shuttle Landing Facility, the Italian Space Agency's Multi-Purpose Logistics Module Donatello begins rolling out of the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
MPLM Donatello is offloaded at the SLF
NASA Technical Reports Server (NTRS)
2001-01-01
At the KSC Shuttle Landing Facility, an Airbus "Beluga" air cargo plane opens to reveal its cargo, the Italian Space Agency's Multi-Purpose Logistics Module Donatello, from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
MPLM Donatello is offloaded at the SLF
NASA Technical Reports Server (NTRS)
2001-01-01
At the KSC Shuttle Landing Facility, the Italian Space Agency's Multi-Purpose Logistics Module Donatello rolls out of the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
Using computer graphics to enhance astronaut and systems safety
NASA Technical Reports Server (NTRS)
Brown, J. W.
1985-01-01
Computer graphics is being employed at the NASA Johnson Space Center as a tool to perform rapid, efficient, and economical analyses for man-machine integration, flight operations development, and systems engineering. The Operator Station Design System (OSDS), a computer-based facility featuring a highly flexible and versatile interactive software package, PLAID, is described. This unique evaluation tool, with its expanding database of Space Shuttle elements, various payloads, experiments, crew equipment, and man models, supports a multitude of technical evaluations, including spacecraft and workstation layout, definition of astronaut visual access, flight techniques development, cargo integration, and crew training. As OSDS is applied to the Space Shuttle, Orbiter payloads (including the European Space Agency's Spacelab), and future space vehicles and stations, astronaut and systems safety are being enhanced. Typical OSDS examples are presented. By performing physical and operational evaluations during early conceptual phases, supporting systems verification for flight readiness, and applying its capabilities to real-time mission support, the OSDS provides the wherewithal to satisfy a growing need of current and future space programs for efficient, economical analyses.
Modeling interfacial fracture in Sierra.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang
2013-09-01
This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using experimental results on asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.
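As background, a commonly used bilinear traction-separation law for such cohesive surface elements takes the form below (a generic example; the report does not state which law was used in SIERRA/SM):

    T(\delta) =
    \begin{cases}
    K\,\delta, & \delta \le \delta_0,\\[2pt]
    \sigma_{\max}\,\dfrac{\delta_f-\delta}{\delta_f-\delta_0}, & \delta_0 < \delta \le \delta_f,\\[2pt]
    0, & \delta > \delta_f,
    \end{cases}
    \qquad
    G_c = \tfrac{1}{2}\,\sigma_{\max}\,\delta_f

where δ is the crack-face separation, σ_max the cohesive strength, and G_c the fracture energy (the area under the curve). Parameter identification as described above then amounts to fitting (σ_max, G_c) so that simulations match the measured ADCB and ENF load-displacement curves.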
DOE Office of Scientific and Technical Information (OSTI.GOV)
Damiani, Rick
This manual summarizes the theory and preliminary verifications of the JacketSE module, which is an offshore jacket sizing tool that is part of the Wind-Plant Integrated System Design & Engineering Model toolbox. JacketSE is based on a finite-element formulation and on user-prescribed inputs and design standards' criteria (constraints). The physics are highly simplified, with a primary focus on satisfying ultimate limit states and modal performance requirements. Preliminary validation work included comparing industry data and verification against ANSYS, a commercial finite-element analysis package. The results are encouraging, and future improvements to the code are recommended in this manual.
Technology of welding aluminum alloys-II
NASA Technical Reports Server (NTRS)
1978-01-01
Step-by-step procedures were developed for high integrity manual and machine welding of aluminum alloys. Detailed instructions are given for each step with tables and graphs to specify materials and dimensions. Throughout work sequence, processing procedure designates manufacturing verification points and inspection points.
Integrating Security into the Curriculum
1998-12-01
predicate calculus, discrete math, and finite-state machine theory. In addition to applying standard mathematical foundations to constructing hardware and...models, specifications, and the use of formal methods for verification and covert channel analysis. The means for analysis is based on discrete math, information
Measuring Data Quality Through a Source Data Verification Audit in a Clinical Research Setting.
Houston, Lauren; Probst, Yasmine; Humphries, Allison
2015-01-01
Health data have long been scrutinised in relation to data quality and integrity problems. Currently, no internationally accepted or "gold standard" method exists for measuring data quality and error rates within datasets. We conducted a source data verification (SDV) audit on a prospective clinical trial dataset. An audit plan was applied to conduct 100% manual verification checks on a 10% random sample of participant files. A quality assurance rule was developed whereby, if more than 5% of data variables were incorrect, a second 10% random sample would be extracted from the trial dataset. Each data variable was coded as: correct, incorrect (valid or invalid), not recorded, or not entered. Audit-1 had a total error rate of 33% and audit-2 of 36%. The physiological section was the only audit section with less than 5% error. Data not recorded on case report forms had the greatest impact on the error calculations. A significant association (p=0.00) was found between audit-1 and audit-2 and whether data were deemed correct or incorrect. Our study developed a straightforward method for performing an SDV audit. An audit rule was identified and error coding was implemented. The findings demonstrate that monitoring data quality through an SDV audit can identify data quality and integrity issues within clinical research settings, allowing quality improvements to be made. The authors suggest this approach be implemented in future research.
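The audit plan lends itself to a short sketch; `check_file` is a hypothetical callback that yields one verification code per data variable in a participant file:

    import random

    def sdv_audit(files, check_file, frac=0.10, threshold=0.05, seed=1):
        # Mirror the plan described above: verify a 10% random sample of
        # participant files; if the fraction of non-correct data variables
        # exceeds 5%, flag that a second 10% sample should be drawn.
        sample = random.Random(seed).sample(files, max(1, int(frac * len(files))))
        counts = {"correct": 0, "incorrect": 0, "not_recorded": 0, "not_entered": 0}
        for f in sample:
            for code in check_file(f):  # one code per data variable
                counts[code] += 1
        total = sum(counts.values())
        error_rate = 1 - counts["correct"] / total if total else 0.0
        return error_rate, error_rate > threshold  # (rate, need second sample?)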
Singh, Anushikha; Dutta, Malay Kishore
2017-12-01
The authentication and integrity verification of medical images is a critical and growing issue for patients in e-health services. Accurate identification of medical images and patient verification is an essential requirement to prevent errors in medical diagnosis. The proposed work presents an imperceptible watermarking system to address the security of medical fundus images for tele-ophthalmology applications and computer-aided automated diagnosis of retinal diseases. In the proposed work, the patient identity is embedded in the fundus image in the singular value decomposition domain, with an adaptive quantization parameter to maintain perceptual transparency for a variety of fundus images, whether healthy or disease-affected. Insertion of the watermark does not affect automatic image-processing diagnosis of retinal objects and pathologies, which ensures uncompromised computer-based diagnosis from the fundus image. The patient ID is correctly recovered from the watermarked fundus image for integrity verification at the diagnosis centre. The proposed watermarking system was tested on a comprehensive database of fundus images, and the results are convincing: they indicate that the proposed method is imperceptible and does not affect computer-vision-based automated diagnosis of retinal diseases. Correct recovery of the patient ID from the watermarked fundus image makes the proposed system applicable to the authentication of fundus images for computer-aided diagnosis and tele-ophthalmology applications.
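To make the embedding idea concrete, here is a minimal sketch of SVD-domain watermarking via quantization of a block's largest singular value; the fixed step q stands in for the paper's adaptive quantization parameter, and the scheme shown is a generic quantization-index-modulation example rather than the authors' exact method:

    import numpy as np

    def embed_bit(block, bit, q=12.0):
        # Quantize the largest singular value so its quantization index has
        # the parity of the watermark bit, then reconstruct the block.
        u, s, vt = np.linalg.svd(block.astype(float), full_matrices=False)
        k = np.floor(s[0] / q)
        if (k % 2) != bit:          # force parity of the quantization index
            k += 1
        s[0] = (k + 0.5) * q        # centre of the chosen quantization cell
        return u @ np.diag(s) @ vt

    def extract_bit(block, q=12.0):
        # Recover the bit from the parity of the quantization index.
        s = np.linalg.svd(block.astype(float), compute_uv=False)
        return int(np.floor(s[0] / q) % 2)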
Strict integrity control of biomedical images
NASA Astrophysics Data System (ADS)
Coatrieux, Gouenou; Maitre, Henri; Sankur, Bulent
2001-08-01
The control of the integrity and authentication of medical images is becoming ever more important within Medical Information Systems (MIS). The intra- and inter-hospital exchange of images, such as in PACS (Picture Archiving and Communication Systems), and the ease of copying, manipulating, and distributing images have brought security aspects to the fore. In this paper we focus on the role of watermarking for MIS security and address the problem of integrity control of medical images. We discuss alternative schemes for extracting verification signatures and compare their tamper detection performance.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wendt, Fabian F; Yu, Yi-Hsiang; Nielsen, Kim
This is the first joint reference paper for the Ocean Energy Systems (OES) Task 10 Wave Energy Converter modeling verification and validation group. The group is established under the OES Energy Technology Network program under the International Energy Agency. OES was founded in 2001, and Task 10 was proposed by Bob Thresher (National Renewable Energy Laboratory) in 2015 and approved by the OES Executive Committee (EXCO) in 2016. The kickoff workshop took place in September 2016, wherein the initial baseline task was defined. Experience from similar offshore wind validation/verification projects (OC3-OC5, conducted within the International Energy Agency Wind Task 30) [1], [2] showed that a simple test case would help the initial cooperation to present results in a comparable way. A heaving sphere was chosen as the first test case. The team of project participants simulated different numerical experiments, such as heave decay tests and regular and irregular wave cases. The simulation results are presented and discussed in this paper.
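As a point of reference, the linearized equation of motion underlying a heave decay test of a floating sphere can be written as follows (a textbook form, not taken from the Task 10 paper):

    (m + a_{33})\,\ddot{z} + b_{33}\,\dot{z} + \rho g A_w\, z = 0

where m is the sphere mass, a_33 the heave added mass, b_33 a linearized radiation/viscous damping coefficient, and ρgA_w the hydrostatic restoring stiffness from the waterplane area A_w. Participating codes differ chiefly in how the added mass and damping contributions are modeled, from linear potential flow up to CFD.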
High-resolution face verification using pore-scale facial features.
Li, Dong; Zhou, Huiling; Lam, Kin-Man
2015-08-01
Face recognition methods, which usually represent face images using holistic or local facial features, rely heavily on alignment. Their performance also suffers severe degradation under variations in expression or pose, especially when there is only one gallery image per subject. With the easy access to high-resolution (HR) face images nowadays, some HR face databases have recently been developed. However, few studies have tackled the use of HR information for face recognition or verification. In this paper, we propose a pose-invariant face-verification method, robust to alignment errors, that uses the HR information in pore-scale facial features. A new keypoint descriptor, namely pore-Principal Component Analysis-Scale Invariant Feature Transform (PPCASIFT), adapted from PCA-SIFT, is devised for the extraction of a compact set of distinctive pore-scale facial features. Having matched the pore-scale features of two face regions, an effective robust-fitting scheme is proposed for the face-verification task. Experiments show that, with only one frontal-view gallery image per subject, our proposed method outperforms a number of standard verification methods and can achieve excellent accuracy even when faces exhibit large variations in expression and pose.
Subvisual-thin cirrus lidar dataset for satellite verification and climatological research
NASA Technical Reports Server (NTRS)
Sassen, Kenneth; Cho, Byung S.
1992-01-01
A polarization (0.694 micron wavelength) lidar dataset for subvisual and thin (bluish-colored) cirrus clouds is drawn from project FIRE (First ISCCP Regional Experiment) extended-time observations. The clouds are characterized by their day/night visual appearance; base, top, and optical midcloud heights and temperatures; measured physical and estimated optical cloud thicknesses; integrated linear depolarization ratios; and derived k/2 eta ratios. A subset of the data supporting 30 NOAA polar-orbiting satellite overpasses is given in tabular form to provide investigators with the means to test cloud retrieval algorithms and establish the limits of cirrus detectability from satellite measurements under various conditions. Climatologically, subvisual-thin cirrus appear to be higher, colder, and more strongly depolarizing than previously reported multilatitude cirrus, although similar k/2 eta values, decreasing with height and temperature, are found.
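For reference, the integrated linear depolarization ratio reported here is conventionally formed from the perpendicular- and parallel-polarized return powers integrated over the cloud layer (standard lidar usage; symbols need not match the paper's):

    \delta = \frac{\int_{z_b}^{z_t} P_{\perp}(z)\,dz}{\int_{z_b}^{z_t} P_{\parallel}(z)\,dz}

with z_b and z_t the cloud base and top heights; values well above the molecular background indicate scattering by nonspherical ice crystals.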
NASA Technical Reports Server (NTRS)
Bensalem, Saddek; Ganesh, Vijay; Lakhnech, Yassine; Munoz, Cesar; Owre, Sam; Ruess, Harald; Rushby, John; Rusu, Vlad; Saiedi, Hassen; Shankar, N.
2000-01-01
To become practical for assurance, automated formal methods must be made more scalable, automatic, and cost-effective. Such an increase in scope, scale, automation, and utility can be derived from an emphasis on a systematic separation of concerns during verification. SAL (Symbolic Analysis Laboratory) attempts to address these issues. It is a framework for combining different tools to calculate properties of concurrent systems. The heart of SAL is a language, developed in collaboration with Stanford, Berkeley, and Verimag, for specifying concurrent systems in a compositional way. Our instantiation of the SAL framework augments PVS with tools for abstraction, invariant generation, program analysis (such as slicing), theorem proving, and model checking to separate concerns as well as calculate properties (i.e., perform symbolic analysis) of concurrent systems. We describe the motivation, the language, the tools, their integration in SAL/PVS, and some preliminary experience of their use.
Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang
2014-01-01
Microgrids are an effective way to integrate distributed energy resources into utility networks. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, a small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control mitigates these problems without affecting steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and of digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, verifying the small-signal model analysis and the effectiveness of the proposed method.
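For orientation, the steady-state power transfer relations that such small-signal models linearize are the standard ones for an inverter of voltage E coupled to a grid of voltage V through reactance X at power angle δ (textbook form, not the paper's specific model):

    P = \frac{EV}{X}\sin\delta,
    \qquad
    Q = \frac{EV\cos\delta - V^{2}}{X}

Linearizing about small δ gives P ≈ EVδ/X and Q ≈ V(E − V)/X, which is the active/reactive coupling whose damping and transient response the paper's analysis addresses.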
Jinno, Shunta; Tachibana, Hidenobu; Moriya, Shunsuke; Mizuno, Norifumi; Takahashi, Ryo; Kamima, Tatsuya; Ishibashi, Satoru; Sato, Masanori
2018-05-21
In inhomogeneous media, there is often a large systematic difference in the dose between the conventional Clarkson algorithm (C-Clarkson) for independent calculation verification and the superposition-based algorithms of treatment planning systems (TPSs). These treatment site-dependent differences increase the complexity of the radiotherapy planning secondary check. We developed a simple and effective method of heterogeneity correction integrated with the Clarkson algorithm (L-Clarkson) to account for the effects of heterogeneity in the lateral dimension, and performed a multi-institutional study to evaluate the effectiveness of the method. In the method, a 2D image reconstructed from computed tomography (CT) images is divided according to lines extending from the reference point to the edge of the multileaf collimator (MLC) or jaw collimator for each pie sector, and the radiological path length (RPL) of each line is calculated on the 2D image to obtain a tissue maximum ratio and phantom scatter factor, allowing the dose to be calculated. A total of 261 plans (1237 beams) for conventional breast and lung treatments and lung stereotactic body radiotherapy were collected from four institutions. Disagreements in dose between the on-site TPSs and a verification program using the C-Clarkson and L-Clarkson algorithms were compared. Systematic differences with the L-Clarkson method were within 1% for all sites, while the C-Clarkson method resulted in systematic differences of 1-5%. The L-Clarkson method showed smaller variations. This heterogeneity correction integrated with the Clarkson algorithm would provide a simple evaluation within the range of -5% to +5% for a radiotherapy plan secondary check.
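The radiological path length at the heart of both Clarkson variants can be sketched in a few lines; pixel spacing, sampling density, and all names are illustrative assumptions:

    import numpy as np

    def radiological_path(density, start, end, n=256):
        # Radiological path length between two points on a 2D map of
        # relative electron density (reconstructed from CT), approximated
        # by sampling the map along the ray: RPL = sum(density * step).
        # Nearest-neighbour sampling; pixel size assumed to be 1 mm.
        start, end = np.asarray(start, float), np.asarray(end, float)
        t = np.linspace(0.0, 1.0, n)
        pts = start + t[:, None] * (end - start)
        rows = np.clip(pts[:, 0].round().astype(int), 0, density.shape[0] - 1)
        cols = np.clip(pts[:, 1].round().astype(int), 0, density.shape[1] - 1)
        step = np.linalg.norm(end - start) / (n - 1)
        return float(density[rows, cols].sum() * step)

In the L-Clarkson scheme described above, one such ray would be evaluated per pie sector, from the reference point out to the MLC or jaw edge, before looking up the tissue maximum ratio and phantom scatter factor.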
Systems Integration Processes for NASA Ares I Crew Launch Vehicle
NASA Technical Reports Server (NTRS)
Taylor, James L.; Reuter, James L.; Sexton, Jeffrey D.
2006-01-01
NASA's Exploration Initiative will require development of many new elements to constitute a robust system of systems. New launch vehicles are needed to place cargo and crew in stable Low Earth Orbit (LEO). This paper examines the systems integration processes NASA is utilizing to ensure integration and control of propulsion and non-propulsion elements within NASA's Crew Launch Vehicle (CLV), now known as the Ares I. The objective of the Ares I is to provide the transportation capabilities to meet the Constellation Program requirements for delivering a Crew Exploration Vehicle (CEV) or other payload to LEO in support of the lunar and Mars missions. The Ares I must provide this capability within cost and schedule and with an acceptable risk approach. This paper describes the systems engineering management processes that will be applied to assure Ares I Project success through complete and efficient technical integration, including technical review and management processes for requirements development and verification, integrated design and analysis, integrated simulation and testing, and the integration of reliability, maintainability, and supportability (RMS) into the design. The Ares I Project is logically divided into elements by the major hardware groupings and the associated management, systems engineering, and integration functions. The processes described herein are designed to integrate within these Ares I elements and among the other Constellation projects. Also discussed is launch vehicle stack integration (Ares I to CEV, and ground and flight operations integration) throughout the life cycle, including integrated vehicle performance through orbital insertion, recovery of the first stage, and reentry of the upper stage. Finally, the paper discusses the processes for decomposing requirements to the elements and for ensuring that requirements are correctly validated, decomposed, and allocated, and that verification requirements are properly defined so that the system design meets its requirements.
Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo
2012-01-01
Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' (IMPROVER), consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community (www.sbvimprover.com). Implementation: This methodology could become the preferred choice for verifying systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044
Observations on CFD Verification and Validation from the AIAA Drag Prediction Workshops
NASA Technical Reports Server (NTRS)
Morrison, Joseph H.; Kleb, Bil; Vassberg, John C.
2014-01-01
The authors provide observations from the AIAA Drag Prediction Workshops, which have spanned over a decade, and from a recent validation experiment at NASA Langley. These workshops provide an assessment of the predictive capability of forces and moments, focused on drag, for transonic transports. It is very difficult to manage the consistency of results in a workshop setting so as to perform verification and validation at the scientific level, but it may be sufficient to assess them at the level of practice. Observations thus far: 1) due to simplifications in the workshop test cases, wind tunnel data are not necessarily the "correct" results that CFD should match; 2) an average of core CFD data is not necessarily a better estimate of the true solution, as it is merely an average of other solutions and has many coupled sources of variation; 3) outlier solutions should be investigated and understood; and 4) the DPW series does not have the systematic build-up and definition, on both the computational and experimental sides, required for detailed verification and validation. Several observations regarding the importance of the grid, effects of physical modeling, benefits of open forums, and guidance for validation experiments are discussed. The increased variation in results when predicting regions of flow separation, and the increased variation due to interaction effects, e.g., fuselage and horizontal tail, point out the need for validation datasets for these important flow phenomena. Experiences with a recent validation experiment at NASA Langley are included to provide guidance on validation experiments.
1981-06-01
design of manufacturing systems, validation and verification of ICAM modules, integration of ICAM modules and the orderly transition of ICAM modules into...Function Model of "Manufacture Product" (MFGO) VIII - Composite Function Model of "Design Product" (DESIGNO) IX - Composite Information Model of...User Interface Requirements; and the Architecture of Design. This work was performed during the period of 29 September 1978 through 10
Space station System Engineering and Integration (SE and I). Volume 2: Study results
NASA Technical Reports Server (NTRS)
1987-01-01
This volume summarizes significant study results produced by the Phase B conceptual design task. Major elements are addressed, and study results applicable to each major element or area of design are summarized and included where appropriate. Areas addressed include: system engineering and integration; customer accommodations; test and program verification; product assurance; conceptual design; operations and planning; the technical and management information system (TMIS); and advanced development.
Orion European Service Module (ESM) Development, Integration and Qualification Status
NASA Technical Reports Server (NTRS)
Berthe, Philippe; Over, Ann P.; Picardo, Michelle; Byers, Anthony W.
2017-01-01
ESA and the European Industry are supplying the European Service Module for Orion. An overview of the system and subsystem configuration of the Orion European Service Module (ESM) as designed and built for the EM-1 mission is provided as well as an outline of its development, assembly, integration and verification process performed by ESA and NASA in coordination with their respective Industrial prime contractors, Airbus Defence and Space and Lockheed Martin.
Information Security and Integrity Systems
NASA Technical Reports Server (NTRS)
1990-01-01
Viewgraphs from the Information Security and Integrity Systems seminar held at the University of Houston-Clear Lake on May 15-16, 1990 are presented. A tutorial on computer security is included, with the following goals: to review security requirements imposed by government and by common sense; to examine risk analysis methods that help keep sight of the forest while in the trees; to discuss the current hot topic of viruses (which will stay hot); to examine network security, now and from the next year out to 30 years; to give a brief overview of encryption; to review protection methods in operating systems; to review database security problems; to review the Trusted Computer System Evaluation Criteria (the Orange Book); to comment on formal verification methods; to consider new approaches (like intrusion detection and biometrics); to review the old, low-tech, and still good solutions; and to give pointers to the literature and to where to get help. Other topics covered include security in software applications and development; risk management; trust, formal methods, and associated techniques; secure distributed operating systems and verification; trusted Ada; a conceptual model for supporting B3+ dynamic multilevel security and integrity in the Ada runtime environment; and information intelligence sciences.
The Hyper-X Flight Systems Validation Program
NASA Technical Reports Server (NTRS)
Redifer, Matthew; Lin, Yohan; Bessent, Courtney Amos; Barklow, Carole
2007-01-01
For the Hyper-X/X-43A program, the development of a comprehensive validation test plan played an integral part in the success of the mission. The goal was to demonstrate hypersonic propulsion technologies by flight testing an airframe-integrated scramjet engine. Preparation for flight involved both verification and validation testing. By definition, verification is the process of assuring that the product meets design requirements; whereas validation is the process of assuring that the design meets mission requirements for the intended environment. This report presents an overview of the program with emphasis on the validation efforts. It includes topics such as hardware-in-the-loop, failure modes and effects, aircraft-in-the-loop, plugs-out, power characterization, antenna pattern, integration, combined systems, captive carry, and flight testing. Where applicable, test results are also discussed. The report provides a brief description of the flight systems onboard the X-43A research vehicle and an introduction to the ground support equipment required to execute the validation plan. The intent is to provide validation concepts that are applicable to current, follow-on, and next generation vehicles that share the hybrid spacecraft and aircraft characteristics of the Hyper-X vehicle.
Deformation and Flexibility Equations for ARIS Umbilicals Idealized as Planar Elastica
NASA Technical Reports Server (NTRS)
Hampton, R. David; Leamy, Michael J.; Bryant, Paul J.; Quraishi, Naveed
2005-01-01
The International Space Station relies on the Active Rack Isolation System (ARIS) as the central component of an integrated, station-wide strategy to isolate microgravity space-science experiments. ARIS uses electromechanical actuators to isolate an International Standard Payload Rack from disturbances due to the motion of the Space Station. Disturbances to microgravity experiments on ARIS-isolated racks are transmitted primarily via the ARIS power and vacuum umbilicals. Experimental tests indicate that these umbilicals resonate, at frequencies outside the ARIS controller's bandwidth, at levels of potential concern for certain microgravity experiments. Reducing the umbilical resonant frequencies could help to address this issue. This work documents the development and verification of equations for the in-plane deflections and flexibilities of an idealized umbilical (a thin, flexible, inextensible cantilever beam) under end-point, in-plane loading (inclined force and moment). The effect of gravity is neglected due to the on-orbit application. The analysis assumes an initially curved (not necessarily circular) cantilevered umbilical of uniform cross-section, which undergoes large deflections with no plastic deformation, such that the umbilical slope changes monotonically. The treatment is applicable to the ARIS power and vacuum umbilicals under the indicated assumptions.
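For completeness, under the stated assumptions the planar elastica model reduces to the standard moment-curvature relation along the arc length s (generic textbook form; sign conventions and the initial-curvature term vary by source):

    EI\left(\frac{d\theta}{ds} - \frac{d\theta_0}{ds}\right) = M(s)
    = M_L + F_y\,[x_L - x(s)] - F_x\,[y_L - y(s)],
    \qquad
    x(s) = \int_0^s \cos\theta\,d\sigma, \quad
    y(s) = \int_0^s \sin\theta\,d\sigma

where θ0(s) describes the stress-free (initially curved) shape, (F_x, F_y, M_L) are the end-point loads, and (x_L, y_L) is the end-point position. End-point flexibilities then follow by differentiating the resulting deflections with respect to the applied loads.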
NASA Astrophysics Data System (ADS)
Zamani, K.; Bombardelli, F. A.
2013-12-01
The ADR equation describes many physical phenomena of interest in the fields of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear, so numerical tools are the only practical choice for solving these PDEs. All numerical solvers developed for the transport equation must undergo a code verification procedure before being put into practice. Code verification is a mathematical activity to uncover failures and to check for rigorous discretization of the PDEs and implementation of initial/boundary conditions. In the computational PDE context, verification is not a well-defined procedure with a clear path. Verification tests must therefore be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution; even test results need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. It is thus well known that code verification remains something of an art, in which innovative methods and case-based tricks are common. This study presents a full verification of a general transport code. To that end, a complete test suite was designed to probe the ADR solver comprehensively and discover all possible imperfections, and we convey our experiences in finding several errors that were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests, increasing gradually in complexity from simple tests to the most sophisticated. Appropriate verification metrics are defined for the required capabilities of the solver, as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. We thereby provide objective, quantitative values for those metrics, as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement. Testing starts from a simple case of unidirectional advection, proceeds through bidirectional advection and tidal flow, and builds up to nonlinear cases; tests are designed to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests, which compare the observed order of accuracy of the results against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs. Detailed discussions of the capabilities of these code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were also designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities; such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
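One of the metrics named here, convergence order, is obtained in mesh-convergence tests by comparing errors on successively refined grids; a minimal sketch:

    import math

    def observed_order(err_coarse, err_fine, refinement=2.0):
        # Observed order of accuracy p from errors on two grids whose
        # spacings differ by the given refinement ratio:
        #   p = log(e_coarse / e_fine) / log(refinement)
        return math.log(err_coarse / err_fine) / math.log(refinement)

    # Example: halving h reduces the error from 4.0e-3 to 1.0e-3 -> p = 2.0,
    # which is then compared against the scheme's formal order.
    print(observed_order(4.0e-3, 1.0e-3))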
Overview of the NAVAIR Spinal Injury Mitigation Program
2005-10-01
for the same ejection / impact loads. Furthermore, human tolerance data are obtained from elderly specimens (more than 65 years of age). Clear...differences in anthropometry and gender. As the model is developed, an integrated model verification / validation procedure is employed to test model
ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, MARIAH ENERGY CORPORATION HEAT PLUS POWER SYSTEM
The Greenhouse Gas Technology Center (GHG Center) has recently evaluated the performance of the Heat Plus Power(TM) System (Mariah CDP System), which integrates microturbine technology with a heat recovery system. Electric power is generated with a Capstone MicroTurbine(TM) Model ...
Verification of the enhanced integrated climatic module soil subgrade input parameters in the MEPDG.
DOT National Transportation Integrated Search
2015-10-01
At the beginning of 2009, INDOT adopted the Mechanistic-Empirical Pavement Design Guide (MEPDG) method to study long-term pavement performance. The implementation of this new design approach led to difficulties for the pavement to pass the INDO...
REMOTE MONITORING AND DATA VERIFICATION WHEN USING A PACKAGE PLANT
A remote telemetry system (RTS) has been fabricated, laboratory tested, and integrated into the field operation of a 10,000 gal/day ultrafiltration package plant (UFPP). The UFPP utilizes bag filtration, disinfection by chlorination, and an ultrafiltration membrane to produce fin...
The performance evaluation of innovative and alternative environmental technologies is an integral part of the U.S. Environmental Protection Agency's (EPA) mission. Early efforts focused on evaluating technologies that supported the implementation of the Clean Air and Clean Wate...
A clocking discipline for two-phase digital integrated circuits
NASA Astrophysics Data System (ADS)
Noice, D. C.
1983-09-01
Sooner or later a designer of digital circuits must face the problem of timing verification so that errors caused by clock skew, critical races, and hazards can be avoided. Unlike previous verification methods, such as timing simulation and timing analysis, the approach presented here guarantees correct operation despite uncertainty about delays in the circuit. The result is a clocking discipline that deals only with timing abstractions. It is not based on delay calculations; it is concerned only with correct, synchronous operation at some clock rate. Accordingly, it may be used earlier in the design cycle, which is particularly important for integrated circuit designs. The clocking discipline consists of a notation of clocking types and composition rules for using the types. Together, the notation and rules define a formal theory of two-phase clocking. The notation defines the names and exact characteristics of the different signals used in a two-phase digital system, and it makes it possible to develop rules for propagating the clocking types through particular circuits.
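The idea of propagating clocking types through a circuit can be conveyed with a toy checker. The type names, rules, and netlist below are invented and much cruder than the discipline's actual notation; the point is only that type propagation plus composition rules catch phase misuse with no delay calculation at all.

```python
# Toy propagation of two-phase clocking types through a small netlist.
# Types are simplified to 'S1'/'S2' (stable during phase 1/2); the real
# discipline uses a richer notation, so treat this as illustrative only.
NETLIST = [
    ("logic", ["a", "b"], "c"),      # combinational: output type = input type
    ("latch_phi2", ["c"], "d"),      # phi2 latch: needs S1 input, emits S2
    ("latch_phi1", ["d"], "a_next"), # phi1 latch: needs S2 input, emits S1
]

def check(types):
    for kind, ins, out in NETLIST:
        tin = {types[i] for i in ins}
        if kind == "logic":
            if len(tin) != 1:
                raise ValueError(f"mixed clocking types at {out}: {tin}")
            types[out] = tin.pop()
        elif kind == "latch_phi2":
            assert tin == {"S1"}, f"phi2 latch {out} must see S1 inputs"
            types[out] = "S2"
        elif kind == "latch_phi1":
            assert tin == {"S2"}, f"phi1 latch {out} must see S2 inputs"
            types[out] = "S1"
    return types

print(check({"a": "S1", "b": "S1"}))
```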
NASA Technical Reports Server (NTRS)
Luu, D.
1999-01-01
This is the Performance Verification Report, AMSU-A1 Antenna Drive Subsystem, P/N 1331720-2, S/N 106, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). The antenna drive subsystem of the METSAT AMSU-A1, S/N 106, P/N 1331720-2, completed acceptance testing per A-ES Test Procedure AE-26002/1D. The tests included: Scan Motion and Jitter, Pulse Load Bus Peak Current and Rise Time, Resolver Reading and Position Error, Gain/Phase Margin, and Operational Gain Margin. The drive motors and electronic circuitry were also tested at the component level. The drive motor tests included: Starting Torque Test, Motor Commutation Test, Resolver Operation/No-Load Speed Test, and Random Vibration. The electronic circuitry was tested at the Circuit Card Assembly (CCA) level of production; each test exercised all circuit functions. The transistor assembly was tested during the W3 cable assembly (1356941-1) test.
Lourenço, Anália; Coenye, Tom; Goeres, Darla M; Donelli, Gianfranco; Azevedo, Andreia S; Ceri, Howard; Coelho, Filipa L; Flemming, Hans-Curt; Juhna, Talis; Lopes, Susana P; Oliveira, Rosário; Oliver, Antonio; Shirtliff, Mark E; Sousa, Ana M; Stoodley, Paul; Pereira, Maria Olivia; Azevedo, Nuno F
2014-04-01
The minimum information about a biofilm experiment (MIABiE) initiative has arisen from the need to find an adequate and scientifically sound way to control the quality of the documentation accompanying the public deposition of biofilm-related data, particularly data obtained using high-throughput devices and techniques. To this end, the MIABiE consortium has initiated the identification and organization of a set of modules containing the minimum information that needs to be reported to guarantee the interpretability and independent verification of experimental results and their integration with knowledge coming from other fields. MIABiE does not intend to propose specific standards on how biofilm experiments should be performed, because it is acknowledged that specific research questions require specific conditions which may deviate from any standardization. Instead, MIABiE presents guidelines about the data to be recorded and published in order for the procedure and results to be easily and unequivocally interpreted and reproduced. Overall, MIABiE opens up the discussion about a number of particular areas of interest and attempts to achieve a broad consensus about which biofilm data and metadata should be reported in scientific journals in a systematic, rigorous and understandable manner. © 2014 Federation of European Microbiological Societies. Published by John Wiley & Sons Ltd. All rights reserved.
An unattended verification station for UF6 cylinders: Field trial findings
NASA Astrophysics Data System (ADS)
Smith, L. E.; Miller, K. A.; McDonald, B. S.; Webster, J. B.; Zalavadia, M. A.; Garner, J. R.; Stewart, S. L.; Branney, S. J.; Todd, L. C.; Deshmukh, N. S.; Nordquist, H. A.; Kulisek, J. A.; Swinhoe, M. T.
2017-12-01
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. Analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 "typical" Type 30B cylinders, and the viability of an "NDA Fingerprint" concept as a high-fidelity means to periodically verify that material diversion has not occurred.
Engineering support activities for the Apollo 17 Surface Electrical Properties Experiment.
NASA Technical Reports Server (NTRS)
Cubley, H. D.
1972-01-01
Description of the engineering support activities which were required to ensure fulfillment of objectives specified for the Apollo 17 SEP (Surface Electrical Properties) Experiment. Attention is given to procedural steps involving verification of hardware acceptability to the astronauts, computer simulation of the experiment hardware, field trials, receiver antenna pattern measurements, and the qualification test program.
ERIC Educational Resources Information Center
Rieben, James C., Jr.
2010-01-01
This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect…
DOE Office of Scientific and Technical Information (OSTI.GOV)
CARTER, R.P.
1999-11-19
The U.S. Department of Energy (DOE) commits to accomplishing its mission safely. To ensure this objective is met, DOE issued DOE P 450.4, Safety Management System Policy, and incorporated safety management into the DOE Acquisition Regulations ([DEAR] 48 CFR 970.5204-2 and 90.5204-78). Integrated Safety Management (ISM) requires contractors to integrate safety into management and work practices at all levels so that missions are achieved while protecting the public, the worker, and the environment. The contractor is required to describe the Integrated Safety Management System (ISMS) to be used to implement the safety performance objective.
NASA Technical Reports Server (NTRS)
Amar, Adam J.; Blackwell, Ben F.; Edwards, Jack R.
2007-01-01
The development and verification of a one-dimensional material thermal response code with ablation is presented. The implicit time integrator, control volume finite element spatial discretization, and Newton's method for nonlinear iteration on the entire system of residual equations have been implemented and verified for the thermochemical ablation of internally decomposing materials. This study is a continuation of the work presented in "One-Dimensional Ablation with Pyrolysis Gas Flow Using a Full Newton's Method and Finite Control Volume Procedure" (AIAA-2006-2910), which described the derivation, implementation, and verification of the constant density solid energy equation terms and boundary conditions. The present study extends the model to decomposing materials including decomposition kinetics, pyrolysis gas flow through the porous char layer, and a mixture (solid and gas) energy equation. Verification results are presented for the thermochemical ablation of a carbon-phenolic ablator which involves the solution of the entire system of governing equations.
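To make "Newton's method for nonlinear iteration on the entire system of residual equations" concrete, here is a stripped-down sketch: one implicit time step of 1-D nonlinear heat conduction (a stand-in for the far richer ablation/pyrolysis system of the paper), with the Jacobian built by finite differences. Material values and grid are illustrative only.

```python
import numpy as np

def implicit_step(T, dt, dx, k_of_T, tol=1e-10, max_newton=20):
    """One backward-Euler step of 1-D nonlinear heat conduction
    dT/dt = d/dx( k(T) dT/dx ), fixed-temperature ends, Newton iteration
    on the full residual with a finite-difference Jacobian (dense is
    fine for a sketch)."""
    n = T.size
    Tn = T.copy()                                 # previous time level
    Tk = T.copy()                                 # Newton iterate

    def residual(U):
        R = U - Tn
        k = k_of_T(0.5 * (U[:-1] + U[1:]))        # face conductivities
        flux = k * (U[1:] - U[:-1]) / dx          # interior fluxes
        R[1:-1] -= dt * (flux[1:] - flux[:-1]) / dx
        R[0] = U[0] - Tn[0]; R[-1] = U[-1] - Tn[-1]   # Dirichlet ends
        return R

    for _ in range(max_newton):
        R = residual(Tk)
        if np.linalg.norm(R) < tol:
            break
        J = np.zeros((n, n)); eps = 1e-7
        for j in range(n):                        # FD Jacobian, column by column
            Up = Tk.copy(); Up[j] += eps
            J[:, j] = (residual(Up) - R) / eps
        Tk -= np.linalg.solve(J, R)               # full Newton update
    return Tk

T = np.linspace(300.0, 1500.0, 21)                # initial profile, K
T1 = implicit_step(T, dt=0.1, dx=0.05, k_of_T=lambda T: 1.0 + 1e-3 * T)
print(T1[:5])
```

The paper's solver couples decomposition kinetics and pyrolysis gas flow into the same residual vector, but the Newton structure is the same: assemble residuals, linearize, solve, repeat.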
The FoReVer Methodology: A MBSE Framework for Formal Verification
NASA Astrophysics Data System (ADS)
Baracchi, Laura; Mazzini, Silvia; Cimatti, Alessandro; Tonetta, Stefano; Garcia, Gerald
2013-08-01
The need for a high level of confidence and operational integrity in critical space (software) systems is well recognized in the space industry and has been addressed so far through rigorous system and software development processes and stringent verification and validation regimes. The Model Based Space System Engineering (MBSSE) process derived in the System and Software Functional Requirement Techniques (SSFRT) study focused on the application of model-based engineering technologies to support the space system and software development processes, from mission-level requirements to software implementation, through model refinements and translations. In this paper we report on our work in the ESA-funded FoReVer project, where we aim at developing methodological, theoretical and technological support for a systematic approach to space avionics system development in phases 0/A/B/C. FoReVer enriches the MBSSE process with contract-based formal verification of properties, at different stages from system to software, through a step-wise refinement approach, with support for a Software Reference Architecture.
Collaborative Localization and Location Verification in WSNs
Miao, Chunyu; Dai, Guoyong; Ying, Kezhen; Chen, Qingzhang
2015-01-01
Localization is one of the most important technologies in wireless sensor networks. A lightweight distributed node localization scheme is proposed by considering the limited computational capacity of WSNs. The proposed scheme introduces the virtual force model to determine the location by incremental refinement. Aiming at solving the drifting problem and the malicious anchor problem, a location verification algorithm based on the virtual force model is presented. In addition, an anchor promotion algorithm using the localization reliability model is proposed to re-locate the drifted nodes. Extended simulation experiments indicate that the localization algorithm has relatively high precision and the location verification algorithm has relatively high accuracy. The communication overhead of these algorithms is relatively low, and the whole set of reliable localization methods is practical as well as comprehensive. PMID:25954948
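As an illustration of the virtual-force idea (not the paper's algorithm, whose force model, verification step, and anchor promotion are more involved), the sketch below refines a position estimate by letting each anchor push or pull the estimate along the line joining them until measured ranges are matched:

```python
import numpy as np

def virtual_force_locate(anchors, dists, steps=200, gain=0.1):
    """Incremental refinement of an unknown node position: each anchor
    exerts a 'virtual force' proportional to the mismatch between its
    measured range and the distance to the current estimate."""
    pos = np.mean(anchors, axis=0)              # start at anchor centroid
    for _ in range(steps):
        force = np.zeros(2)
        for a, d in zip(anchors, dists):
            v = pos - a
            r = np.linalg.norm(v) + 1e-12
            force += (d - r) * (v / r)          # push out if too close, pull in if too far
        pos += gain * force
    return pos

anchors = np.array([[0.0, 0.0], [10.0, 0.0], [0.0, 10.0]])
true = np.array([3.0, 4.0])
dists = np.linalg.norm(anchors - true, axis=1) + np.random.normal(0, 0.05, 3)
print(virtual_force_locate(anchors, dists))     # close to (3, 4)
```

A residual force that stays large at convergence is the kind of signal a verification algorithm can use to flag a drifted node or a lying anchor.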
2002-06-07
Continue to Develop and Refine Emerging Technology: some emerging biometric devices, such as iris scan, facial recognition, and speaker verification systems.
33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).
Code of Federal Regulations, 2010 CFR
2010-07-01
... Water Management District to conduct assessment, evaluation, and planning and integration activities... ensuring that the goals and purposes of the Plan are achieved. RECOVER has been organized into a Leadership Group that provides management and coordination for the activities of RECOVER and teams that accomplish...
Capability Portfolio Analysis Tool (CPAT) Verification and Validation Report
2013-01-01
Acronym-list residue from the report: BFSB (Battlefield Surveillance Brigade); BFV (Bradley Fighting Vehicle); BMOD (Bradley Modernization); C2 (H) (Command and Control, HBCT); C2 (S...); Infantry Fighting Vehicle (IFV); Fire Integrated Support Team (FIST); Engineer (Eng); Cavalry (CAV); BFV FOV; CDD Block II, 16 Apr 2010; GCV FOV.
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2012 CFR
2012-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR § 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2013 CFR
2013-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2014 CFR
2014-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2010 CFR
2010-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
36 CFR 1237.28 - What special concerns apply to digital photographs?
Code of Federal Regulations, 2011 CFR
2011-07-01
... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...
Sediment Ecotoxicity Assessment Ring Verification Report and Statement
The SEA Ring (U.S. Patent No. 8,011,239) is an integrated, field-tested toxicity and bioavailability assessment device. The device was developed at SPAWAR in San Diego, California, and is commercially available from Zebra-Tech, Ltd. The SEA Ring was designed to evaluate toxicity...
NASA Astrophysics Data System (ADS)
Delle Monache, L.; Rodriguez, L. M.; Meech, S.; Hahn, D.; Betancourt, T.; Steinhoff, D.
2016-12-01
It is necessary to accurately estimate the initial source characteristics in the event of an accidental or intentional release of a Chemical, Biological, Radiological, or Nuclear (CBRN) agent into the atmosphere. Accurate estimation of the source characteristics is important because they are often unknown, and Atmospheric Transport and Dispersion (AT&D) models rely heavily on these estimates to create hazard assessments. To correctly assess the source characteristics in an operational environment where time is critical, the National Center for Atmospheric Research (NCAR) has developed a Source Term Estimation (STE) method known as the Variational Iterative Refinement STE Algorithm (VIRSA). VIRSA consists of a combination of modeling systems: an AT&D model, its corresponding STE model, a Hybrid Lagrangian-Eulerian Plume Model (H-LEPM), and its mathematical adjoint model. In an operational scenario where we have information on the infrastructure of a city, the AT&D model used is the Urban Dispersion Model (UDM), and when using this model in VIRSA we refer to the system as uVIRSA. In all other scenarios, where city infrastructure information is not readily available, the AT&D model used is the Second-order Closure Integrated PUFF model (SCIPUFF) and the system is referred to as sVIRSA. VIRSA was originally developed using SCIPUFF 2.4 for the Defense Threat Reduction Agency and integrated into the Hazard Prediction and Assessment Capability and Joint Program for Information Systems Joint Effects Model. The results discussed here are the verification and validation of the upgraded system with SCIPUFF 3.0 and the newly implemented UDM capability. To verify uVIRSA and sVIRSA, synthetic concentration observation scenarios were created in urban and rural environments, and the results of this verification are shown. Finally, we validate the STE performance of uVIRSA using scenarios from the Joint Urban 2003 (JU03) experiment, held in Oklahoma City, and validate the performance of sVIRSA using scenarios from the FUsing Sensor Integrated Observing Network (FUSION) Field Trial 2007 (FFT07), held at Dugway Proving Grounds in rural Utah.
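The core inverse problem can be conveyed with a toy example. The sketch below substitutes a single Gaussian puff for the AT&D model and a brute-force search for the variational iteration, recovering a source strength and location from noisy synthetic sensor readings. Everything here (forward model, numbers, grid) is invented for illustration and is far simpler than VIRSA.

```python
import numpy as np

def gaussian_puff(x, y, q, x0, y0, sigma=1.5):
    """Toy forward model: concentration at (x, y) from a source of
    strength q at (x0, y0). Stands in for a real AT&D model."""
    r2 = (x - x0) ** 2 + (y - y0) ** 2
    return q * np.exp(-r2 / (2 * sigma ** 2))

# Synthetic observations from a hidden source
rng = np.random.default_rng(0)
sx, sy = rng.uniform(0, 10, (2, 25))            # sensor coordinates
obs = gaussian_puff(sx, sy, q=5.0, x0=4.0, y0=6.0) + rng.normal(0, 0.02, 25)

# Brute-force fit: minimize data misfit over (q, x0, y0)
best, best_cost = None, np.inf
for x0 in np.linspace(0, 10, 51):
    for y0 in np.linspace(0, 10, 51):
        g = gaussian_puff(sx, sy, 1.0, x0, y0)
        q = max(g @ obs / (g @ g), 0.0)          # optimal strength in closed form
        cost = np.sum((q * g - obs) ** 2)
        if cost < best_cost:
            best, best_cost = (q, x0, y0), cost
print("estimated (q, x0, y0):", best)
```

An adjoint model, as used in VIRSA, replaces this exhaustive search with gradient information, which is what makes the approach fast enough for time-critical operation.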
Integrated Short Range, Low Bandwidth, Wearable Communications Networking Technologies
2012-04-30
[Table-of-contents residue; section headings include: Specifications; SAN Radio; R.F. Design Improvements; LNA Characterization and Verification Testing; Digital Design Improvements; Improve Processor Access to Memory Resources.] ...integrated and tested. A hybrid architecture of the automatic gain control (AGC) was designed to ...
Ares I Integrated Test Approach
NASA Technical Reports Server (NTRS)
Taylor, Jim
2008-01-01
This slide presentation reviews the testing approach that NASA is developing for the Ares I launch vehicle. NASA is planning a complete series of development, qualification and verification tests. These include: (1) Upper stage engine sea-level and altitude testing (2) First stage development and qualification motors (3) Upper stage structural and thermal development and qualification test articles (4) Main Propulsion Test Article (MPTA) (5) Upper stage green run testing (6) Integrated Vehicle Ground Vibration Testing (IVGVT) and (7) Aerodynamic characterization testing.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa, and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study explores the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with community-based organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
Verification of Space Station Secondary Power System Stability Using Design of Experiment
NASA Technical Reports Server (NTRS)
Karimi, Kamiar J.; Booker, Andrew J.; Mong, Alvin C.; Manners, Bruce
1998-01-01
This paper describes analytical methods used in the verification of large DC power systems, with applications to the International Space Station (ISS). Large DC power systems contain many switching power converters with negative resistance characteristics. The ISS power system presents numerous challenges with respect to system stability, such as complex sources and undefined loads. The Space Station program has developed impedance specifications for sources and loads. The overall approach to system stability consists of specific hardware requirements coupled with extensive system analysis and testing. Testing of large, complex distributed power systems is not practical due to the size and complexity of the system. Computer modeling has been used extensively to develop hardware specifications as well as to identify system configurations for lab testing. The statistical method of Design of Experiments (DoE) is used as an analysis tool for verification of these large systems. DoE reduces the number of computer runs necessary to analyze the performance of a complex power system consisting of hundreds of DC/DC converters. DoE also provides valuable information about the effect of changes in system parameters on the performance of the system, about various operating scenarios, and about identification of the scenarios with potential for instability. In this paper we describe how we have used computer modeling to analyze a large DC power system. A brief description of DoE is given, and examples applying DoE to analysis and verification of the ISS power system are provided.
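The run-count saving in DoE comes from fractional factorial designs. As a hedged sketch of the idea (the ISS study's actual factors, levels, and response are not given in this abstract, so everything below is invented): a 2^(4-1) design estimates the main effects of four converter parameters on a stand-in stability margin with 8 simulator runs instead of 16, at the cost of aliasing each main effect with a three-factor interaction.

```python
import numpy as np
from itertools import product

# 2^(4-1) fractional factorial: factors A, B, C are free; D = A*B*C (generator).
runs = [(a, b, c, a * b * c) for a, b, c in product([-1, 1], repeat=3)]
X = np.array(runs)                               # 8 runs instead of 16

def stability_margin(a, b, c, d):
    """Stand-in for a full system simulation run (e.g., the gain margin of
    a distributed DC power system at one parameter combination)."""
    return 6.0 - 1.2 * a + 0.4 * b - 0.1 * c + 0.8 * d + 0.05 * a * b

y = np.array([stability_margin(*r) for r in X])

# Main effect of each factor = mean(y at +1) - mean(y at -1)
for name, col in zip("ABCD", X.T):
    effect = y[col == 1].mean() - y[col == -1].mean()
    print(f"main effect {name}: {effect:+.3f}")
```

Factors with large negative effects on the margin mark the operating scenarios worth screening for instability with full simulations.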
Advanced Plant Habitat Test Harvest
2017-08-24
John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, harvests half the Arabidopsis thaliana plants inside the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.
Verification Failures: What to Do When Things Go Wrong
NASA Astrophysics Data System (ADS)
Bertacco, Valeria
Every integrated circuit is released with latent bugs. The damage and risk implied by an escaped bug range from almost imperceptible to potentially tragic; unfortunately, it is impossible to discern where in this range a bug falls before it has been exposed and analyzed. While the past few decades have witnessed significant efforts to improve verification methodology for hardware systems, these efforts have been far outstripped by the massive complexity of modern digital designs, leading to product releases in which an ever smaller fraction of a system's states has been verified. News of escaped bugs in large-market designs and/or safety-critical domains is alarming because of the safety and cost implications (due to replacements, lawsuits, etc.).
NASA Technical Reports Server (NTRS)
Muheim, Danniella; Menzel, Michael; Mosier, Gary; Irish, Sandra; Maghami, Peiman; Mehalick, Kimberly; Parrish, Keith
2010-01-01
The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2014. System-level verification of critical performance requirements will rely on integrated observatory models that predict the wavefront error accurately enough to verify that the allocated top-level wavefront error of 150 nm root-mean-squared (rms) through to the wavefront sensor focal plane is met. The assembled models themselves are complex and require the insight of technical experts to assess their ability to meet their objectives. This paper describes the systems engineering and modeling approach used on JWST through the detailed design phase.
Cryogenic Fluid Management Experiment (CFME) trunnion verification testing
NASA Technical Reports Server (NTRS)
Bailey, W. J.; Fester, D. A.
1983-01-01
The Cryogenic Fluid Management Experiment (CFME) was designed to characterize subcritical liquid hydrogen storage and expulsion in the low-g space environment. The CFME has now become the storage and supply tank for the Cryogenic Fluid Management Facility, which includes transfer line and receiver tanks as well. The liquid hydrogen storage and supply vessel is supported within a vacuum jacket by two fiberglass/epoxy composite trunnions, which were analyzed and designed for this purpose. Analysis using the limited available data indicated the trunnion was the most fatigue-critical component in the storage vessel. Before committing the complete storage tank assembly to environmental testing, an experimental assessment was performed to verify the capability of the trunnion design to withstand expected vibration and loading conditions. Three tasks were conducted to evaluate trunnion integrity. The first determined the fatigue properties of the trunnion composite laminate materials; tests at both ambient and liquid hydrogen temperatures showed composite material fatigue properties far in excess of those expected. Next, an assessment of the adequacy of the trunnion designs was performed, based on the tested material properties.
Vibration modelling and verifications for whole aero-engine
NASA Astrophysics Data System (ADS)
Chen, G.
2015-08-01
In this study, a new rotor-ball-bearing-casing coupling dynamic model for a practical aero-engine is established. In the coupling system, the rotor and casing systems are modelled using the finite element method, the support systems are modelled as lumped-parameter models, nonlinear factors of the ball bearings and faults are included, and four types of support and connection models are defined to model the complex rotor-support-casing coupling system of the aero-engine. A new numerical integration method that combines the Newmark-β method and the improved Newmark-β method (Zhai method) is used to obtain the system responses. Finally, the new model is verified in three ways: (1) a modal experiment on a rotor-ball-bearing rig, (2) a modal experiment on a rotor-ball-bearing-casing rig, and (3) fault simulations for the vibration of a certain type of missile turbofan aero-engine. The results show that the proposed model can not only simulate the natural vibration characteristics of the whole aero-engine but also effectively perform nonlinear dynamic simulations of a whole aero-engine with faults.
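For readers unfamiliar with the base scheme, a plain Newmark-β integrator (before any Zhai-style modification, which the paper combines with it) looks roughly like this for a single degree of freedom; parameters and forcing below are illustrative only.

```python
import numpy as np

def newmark_beta(m, c, k, f, dt, u0=0.0, v0=0.0, beta=0.25, gamma=0.5):
    """Newmark-beta integration of m*u'' + c*u' + k*u = f(t) for one DOF.
    beta=1/4, gamma=1/2 is the unconditionally stable
    average-acceleration variant."""
    n = len(f)
    u = np.zeros(n); v = np.zeros(n); a = np.zeros(n)
    u[0], v[0] = u0, v0
    a[0] = (f[0] - c * v0 - k * u0) / m
    keff = k + gamma * c / (beta * dt) + m / (beta * dt**2)   # effective stiffness
    for i in range(n - 1):
        peff = (f[i + 1]
                + m * (u[i] / (beta * dt**2) + v[i] / (beta * dt)
                       + (0.5 / beta - 1.0) * a[i])
                + c * (gamma * u[i] / (beta * dt) + (gamma / beta - 1.0) * v[i]
                       + dt * (0.5 * gamma / beta - 1.0) * a[i]))
        u[i + 1] = peff / keff
        a[i + 1] = ((u[i + 1] - u[i]) / (beta * dt**2)
                    - v[i] / (beta * dt) - (0.5 / beta - 1.0) * a[i])
        v[i + 1] = v[i] + dt * ((1.0 - gamma) * a[i] + gamma * a[i + 1])
    return u, v, a

t = np.arange(0, 5, 0.01)
u, v, a = newmark_beta(m=1.0, c=0.1, k=40.0, f=np.sin(2 * np.pi * t), dt=0.01)
print(u[-5:])
```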
Verification of the SENTINEL-4 Focal Plane Subsystem
NASA Astrophysics Data System (ADS)
Williges, C.; Hohn, R.; Rossmann, H.; Hilbert, S.; Uhlig, M.; Buchwinkler, K.; Reulke, R.
2017-05-01
The Sentinel-4 payload is a multi-spectral camera system designed to monitor atmospheric conditions over Europe. The German Aerospace Center (DLR) in Berlin, Germany conducted the verification campaign of the Focal Plane Subsystem (FPS) on behalf of Airbus Defence and Space GmbH, Ottobrunn, Germany. The FPS consists, inter alia, of two Focal Plane Assemblies (FPAs), one for the UV-VIS spectral range (305 nm … 500 nm) and the second for the NIR (750 nm … 775 nm). In this publication, we present in detail the opto-mechanical laboratory set-up of the verification campaign of the Sentinel-4 Qualification Model (QM), which will also be used for the upcoming Flight Model (FM) verification. The test campaign consists mainly of radiometric tests performed with an integrating sphere as a homogeneous light source. The FPAs have to be operated mainly at 215 K ± 5 K, making it necessary to use a thermal vacuum chamber (TVC) for the tests. This publication focuses on the challenge of remotely illuminating both Sentinel-4 detectors, as well as a reference detector, homogeneously over a distance of approximately 1 m from outside the TVC. Furthermore, selected test analyses and results are presented, showing that the Sentinel-4 FPS meets its specifications.
Computer Modeling and Simulation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Pronskikh, V. S.
2014-05-09
Verification and validation of computer codes and models used in simulation are two aspects of scientific practice of high importance that have recently been discussed by philosophers of science. While verification is predominantly associated with the correctness of the way a model is represented by a computer code or algorithm, validation more often refers to a model's relation to the real world and its intended use. It has been argued that because complex simulations are generally not transparent to a practitioner, the Duhem problem can arise for verification and validation due to their entanglement; such an entanglement makes it impossible to distinguish whether a coding error or the model's general inadequacy to its target should be blamed in the case of model failure. I argue that in order to disentangle verification and validation, a clear distinction needs to be made between computer modeling (construction of mathematical computer models of elementary processes) and simulation (construction of models of composite objects and processes by means of numerical experimenting with them). Holding to that distinction, I propose to relate verification (based on theoretical strategies such as inference) to modeling, and validation, which shares a common epistemology with experimentation, to simulation. To explain the reasons for their intermittent entanglement, I propose a Weberian ideal-typical model of modeling and simulation as roles in practice. I suggest an approach to alleviating the Duhem problem for verification and validation that is generally applicable in practice and based on differences in epistemic strategies and scopes.
Verification of road databases using multiple road models
NASA Astrophysics Data System (ADS)
Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian
2017-08-01
In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first for the state of a database object (correct or incorrect), and the second for the state of the underlying road model (applicable or not applicable). In accordance with Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with greater completeness. Additional experiments reveal that a highly reliable semi-automatic approach for road database verification can be designed based on the proposed method.
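Combining per-module evidence over {correct, incorrect, unknown} is what Dempster's rule of combination does when "unknown" is read as mass on the whole frame. A small sketch (the mass values are invented, and the paper's mapping from its two probability distributions to these masses is more specific):

```python
from itertools import product

def dempster(m1, m2):
    """Dempster's rule of combination over the frame {C, I} (road object
    correct / incorrect); mass on the whole frame, keyed 'CI', plays the
    role of the class 'unknown'."""
    sets = {"C": {"C"}, "I": {"I"}, "CI": {"C", "I"}}
    combined = {"C": 0.0, "I": 0.0, "CI": 0.0}
    conflict = 0.0
    for (a, pa), (b, pb) in product(m1.items(), m2.items()):
        inter = sets[a] & sets[b]
        if not inter:
            conflict += pa * pb                  # contradictory evidence
        else:
            key = "CI" if len(inter) == 2 else next(iter(inter))
            combined[key] += pa * pb
    return {k: v / (1.0 - conflict) for k, v in combined.items()}

# Module 1 is fairly sure the road object is correct; module 2's road model
# barely applies, so most of its mass stays on 'unknown' (the full frame).
m1 = {"C": 0.7, "I": 0.1, "CI": 0.2}
m2 = {"C": 0.2, "I": 0.1, "CI": 0.7}
print(dempster(m1, m2))
```

Mass left on the full frame by a weakly applicable road model dilutes its vote rather than forcing a correct/incorrect decision, which is exactly the behavior the two-distribution design above is after.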
A Protocol-Analytic Study of Metacognition in Mathematical Problem Solving.
ERIC Educational Resources Information Center
Cai, Jinfa
1994-01-01
Metacognitive behaviors of subjects having high (n=2) and low (n=2) levels of mathematical experience were compared across four cognitive processes in mathematical problem solving: orientation, organization, execution, and verification. High-experience subjects engaged in self-regulation and spent more time on orientation and organization. (36…
40 CFR 80.1450 - What are the registration requirements under the RFS program?
Code of Federal Regulations, 2013 CFR
2013-07-01
... professional work experience in the chemical engineering field or related to renewable fuel production. (B) For... third-party engineering review and written report and verification of the information provided pursuant... professional engineer licensed in the United States with professional work experience in the chemical...
Shuttle payload interface verification equipment study. Volume 1: Executive summary
NASA Technical Reports Server (NTRS)
1976-01-01
A preliminary design analysis is provided of stand-alone payload interface verification equipment (IVE) capable of verifying payload compatibility in form, fit and function with the shuttle orbiter prior to on-line payload/orbiter operations. The IVE is a high-fidelity replica of the orbiter payload accommodations capable of supporting payload functional checkout and mission simulation. A top-level payload integration analysis developed detailed functional flow block diagrams of the payload integration process for the broad spectrum of payloads, and identified the degree of orbiter data required by the payload user and potential applications of the IVE.
Magnetic force microscopy method and apparatus to detect and image currents in integrated circuits
Campbell, Ann. N.; Anderson, Richard E.; Cole, Jr., Edward I.
1995-01-01
A magnetic force microscopy method and improved magnetic tip for detecting and quantifying internal magnetic fields resulting from currents in integrated circuits. Detection of the current is used for failure analysis, design verification, and model validation. The interaction of the current on the integrated chip with a magnetic field can be detected using a cantilevered magnetic tip. Enhanced sensitivity for both ac and dc current and voltage detection is achieved by ac coupling or a heterodyne technique. The techniques can be used to extract information from analog circuits.
Magnetic force microscopy method and apparatus to detect and image currents in integrated circuits
Campbell, A.N.; Anderson, R.E.; Cole, E.I. Jr.
1995-11-07
A magnetic force microscopy method and improved magnetic tip for detecting and quantifying internal magnetic fields resulting from currents in integrated circuits are disclosed. Detection of the current is used for failure analysis, design verification, and model validation. The interaction of the current on the integrated chip with a magnetic field can be detected using a cantilevered magnetic tip. Enhanced sensitivity for both ac and dc current and voltage detection is achieved by ac coupling or a heterodyne technique. The techniques can be used to extract information from analog circuits. 17 figs.
Integrated Safety Analysis Tiers
NASA Technical Reports Server (NTRS)
Shackelford, Carla; McNairy, Lisa; Wetherholt, Jon
2009-01-01
Commercial partnerships and organizational constraints, combined with complex systems, may lead to division of hazard analysis across organizations. This division could cause important hazards to be overlooked, causes to be missed, controls for a hazard to be incomplete, or verifications to be inefficient. Each organization's team must understand at least one level beyond the interface well enough to comprehend integrated hazards. This paper discusses various ways to properly divide analysis among organizations. The Ares I launch vehicle integrated safety analysis effort is used to illustrate an approach that addresses the key issues and concerns arising from multiple analysis responsibilities.
NASA Technical Reports Server (NTRS)
1976-01-01
The performance, design, verification and operational requirements for Labcraft equipment and Labcraft integrated payloads are described. The requirements are based on the current definition of Spacelab and Space Transportation System equipment and the constraints associated with their use.
UAS Integration in the NAS Project: Part Task 6 V & V Simulation: Primary Results
NASA Technical Reports Server (NTRS)
Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor
2016-01-01
This is a presentation of the preliminary results on final V and V (Verification and Validation) activity of [RTCA (Radio Technical Commission for Aeronautics)] SC (Special Committee)-228 DAA (Detect and Avoid) HMI (Human-Machine Interface) requirements for display alerting and guidance.
IT Project Success w\\7120 and 7123 NPRs to Achieve Project Success
NASA Technical Reports Server (NTRS)
Walley, Tina L.
2009-01-01
This slide presentation reviews management techniques for assuring the success of information technology development projects. Details include the work products, the work breakdown structure (WBS), system integration, independent verification and validation (IV&V), and deployment and operations. An example, the NASA Consolidated Active Directory (NCAD), is reviewed.
The performance evaluation of innovative and alternative environmental technologies is an integral part of the U.S. Environmental Protection Agency's (EPA) mission. In the spirit of the technology policy, the Agency began to direct a portion of its resources toward the promotion...
2011-08-01
design space is large. His research contributions are to the field of Decision-based Design, specifically in linking consumer preferences and...Integrating Consumer Preferences into Engineering Design, to be published in 2012. He received his PhD from Northwestern University in Mechanical
NASA Astrophysics Data System (ADS)
Kagami, Hiroyuki
2007-05-01
We have proposed and refined a model of the drying process of a polymer solution coated on a flat substrate for flat polymer film fabrication, and have presented the results through Photomask Japan 2002, 2003 and 2004, Smart Materials, Nano-, and Micro-Smart Systems 2006, and so on. Numerical simulation of the model qualitatively reproduces the typical thickness profile of the polymer film formed after drying, namely a profile in which the edge of the film is thicker and the region just inside the edge's bump is thinner. We have also clarified, based on analysis of many numerical simulations, how the distribution of polymer molecules on a flat substrate depends on various parameters. We then carried out several kinds of experiments to verify the modified model and reported the results through Photomask Japan 2005 and 2006. Some results supported the modified model, but we could not observe the characteristic valley region next to the edge's bump of the polymer film after drying. After several improved experiments, we concluded that the characteristic region did not appear because water, which evaporates more slowly than organic solvents, was used as the solvent. In this study we therefore adopted an organic solvent instead of water for the experiments. As a result, the characteristic region could be seen, and the model could be verified more accurately. In this paper, we present verification of the model through these improved experiments using an organic solvent.
Large Scale Skill in Regional Climate Modeling and the Lateral Boundary Condition Scheme
NASA Astrophysics Data System (ADS)
Veljović, K.; Rajković, B.; Mesinger, F.
2009-04-01
Several points are made concerning the somewhat controversial issue of regional climate modeling: should a regional climate model (RCM) be expected to maintain the large-scale skill of the driver global model that is supplying its lateral boundary condition (LBC)? Given that this is normally desired, is it able to do so without help via the fairly popular large-scale nudging? Specifically, without such nudging, will the RCM kinetic energy necessarily decrease with time compared to that of the driver model or analysis data, as suggested by a study using the Regional Atmospheric Modeling System (RAMS)? Finally, can the lateral boundary condition scheme make a difference: is the almost universally used but somewhat costly relaxation scheme necessary for desirable RCM performance? Experiments are made to explore these questions by running the Eta model in two versions differing in the lateral boundary scheme used. One of these schemes is the traditional relaxation scheme; the other is the Eta model scheme, in which information is used at the outermost boundary only, and not all variables are prescribed at the outflow boundary. Forecast lateral boundary conditions are used, and results are verified against the analyses. Thus, the skill of the two RCM forecasts can be and is compared not only against each other but also against that of the driver global forecast. A novel verification method is used in the manner of customary precipitation verification: the forecast spatial wind speed distribution is verified against analyses by calculating bias-adjusted equitable threat scores and bias scores for wind speeds greater than chosen thresholds. In this way, focusing on a high wind speed value in the upper troposphere, we suggest that verification of large-scale features can be done in a manner that may be more physically meaningful than verification via spectral decomposition, which is a standard RCM verification method. The results we have at this point are somewhat limited, in view of the integrations having been done only for 10-day forecasts. Even so, one should note that they are among the very few done using forecast, as opposed to reanalysis or analysis, global driving data. Our results suggest that (1) running the Eta as an RCM, no significant loss of large-scale kinetic energy with time seems to take place; (2) there is no disadvantage to using the Eta LBC scheme compared to the relaxation scheme, while the Eta scheme is significantly less demanding than relaxation, given that it needs driver model fields at the outermost domain boundary only; and (3) the Eta RCM skill in forecasting large scales, with no large-scale nudging, seems to be just about the same as that of the driver model, or, in the terminology of Castro et al., the Eta RCM does not lose the "value of the large scale" which exists in the larger global analyses used for the initial condition and for verification.
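The threshold-based wind-speed verification described above uses standard contingency-table scores. A minimal sketch of the computation on synthetic grids (the study's bias-adjusted ETS variant adds a correction for frequency bias that is omitted here):

```python
import numpy as np

def ets_and_bias(fcst, obs, threshold):
    """Equitable threat score and frequency bias for the event
    'wind speed > threshold', from matched forecast/analysis grids."""
    f = fcst > threshold
    o = obs > threshold
    hits = np.sum(f & o)
    fa = np.sum(f & ~o)                           # false alarms
    miss = np.sum(~f & o)
    hits_rand = (hits + fa) * (hits + miss) / f.size   # chance hits
    ets = (hits - hits_rand) / (hits + fa + miss - hits_rand)
    bias = (hits + fa) / (hits + miss)
    return ets, bias

rng = np.random.default_rng(1)
analysis = rng.gamma(4.0, 8.0, size=(50, 50))     # synthetic upper-level speeds, m/s
forecast = analysis + rng.normal(0, 4.0, analysis.shape)
print(ets_and_bias(forecast, analysis, threshold=45.0))
```

Treating "wind speed above a high threshold" as the event makes large-scale verification behave exactly like familiar precipitation verification, which is the appeal the abstract points to.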
Mars 2020 Model Based Systems Engineering Pilot
NASA Technical Reports Server (NTRS)
Dukes, Alexandra Marie
2017-01-01
The pilot study is led by the Integration Engineering group in NASA's Launch Services Program (LSP). The Integration Engineering (IE) group is responsible for managing the interfaces between the spacecraft and launch vehicle. This pilot investigates the utility of Model-Based Systems Engineering (MBSE) with respect to managing and verifying interface requirements. The main objectives of the pilot are to model several key aspects of the Mars 2020 integrated operations and interface requirements, based on the design and verification artifacts from the Mars Science Laboratory (MSL), and to demonstrate how MBSE could be used by LSP to gain further insight into the interface between the spacecraft and launch vehicle as well as to enhance how LSP manages the launch service. The work began with familiarization with SysML, MagicDraw, and the Mars 2020 and MSL systems through books, tutorials, and NASA documentation. MSL was chosen as the focus of the model since its processes and verifications translate easily to the Mars 2020 mission. The study was further focused by modeling specialized systems and processes within MSL in order to demonstrate the utility of MBSE for the rest of the mission. The systems chosen were the In-Flight Disconnect (IFD) system and the Mass Properties process. The IFD was chosen since it is an interface between the spacecraft and launch vehicle that can demonstrate the usefulness of MBSE from a system perspective. The Mass Properties process was chosen since its verifications occur throughout the lifecycle and can demonstrate the usefulness of MBSE from a multi-discipline perspective. Several iterations of both perspectives have been modeled and evaluated. While the pilot study will continue for another two weeks, pros and cons of using MBSE for LSP IE have already been identified. The pros include an integrated view of the disciplines, requirements, and verifications leading up to launch: the model allows IE to understand the relationships between disciplines throughout test activities and verifications, and, because the relationships between disciplines and integration tasks are generally consistent, the generic relationships and tasks can be captured and reused across multiple mission models should LSP further pursue MBSE. A con of MBSE is the amount of time it takes up front to understand MBSE and create a useful model; this up-front cost is heavily discussed in the MBSE literature and is a consistent con throughout the known applications of MBSE. The need to understand SysML and the chosen software also poses the risk of a bottleneck, with one person being the sole MBSE user in the working group. The utility of MBSE will continue to be evaluated through the remainder of the study. In conclusion, the original objectives of the pilot study were to use artifacts from MSL to model key aspects of Mars 2020 and to demonstrate how MBSE could be used by LSP to gain insight into the spacecraft and launch vehicle interfaces. Progress has been made in modeling and identifying the utility of MBSE for LSP IE and will continue until the pilot study's conclusion in mid-August. The results of this study will produce initial models, modeling instructions and examples, and a summary of MBSE's utility for future use by LSP.
Universal Verification Methodology Based Register Test Automation Flow.
Woo, Jae Hun; Cho, Yong Kwan; Park, Sun Kyu
2016-05-01
In today's SoC designs, the number of registers has increased along with the complexity of hardware blocks. Register validation is a time-consuming and error-prone task; therefore, we need an efficient way to perform verification with less effort in a shorter time. In this work, we suggest a register test automation flow based on UVM (Universal Verification Methodology). UVM provides a standard mechanism, called a register model, to facilitate stimulus generation and functional checking of registers. However, it is not easy for designers to create register models for their functional blocks or to integrate the models in a test-bench environment, because doing so requires knowledge of SystemVerilog and the UVM libraries. For the creation of register models, many commercial tools support register model generation from a register specification described in IP-XACT, but it is time-consuming to describe register specifications in IP-XACT format. For easy creation of register models, we propose a spreadsheet-based register template which is translated to an IP-XACT description, from which register models can be easily generated using commercial tools. We also automate all the steps involved in integrating the test-bench and generating test-cases, so that designers may use the register model without detailed knowledge of UVM or SystemVerilog. This automation flow involves generating and connecting test-bench components (e.g., driver, checker, bus adaptor, etc.) and writing a test sequence for each type of register test-case. With the proposed flow, designers can save a considerable amount of time when verifying the functionality of registers.
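The spreadsheet-to-IP-XACT step is easy to picture. The sketch below turns a CSV register description into an IP-XACT-like XML fragment that a register-model generator could consume; the column names, register fields, and XML skeleton are all invented, and real IP-XACT is the much richer IEEE 1685 schema with namespaces and many required elements.

```python
import csv, io
from xml.etree.ElementTree import Element, SubElement, tostring

# Hypothetical spreadsheet export: one row per register.
SPEC = """name,offset,width,access,reset
CTRL,0x00,32,RW,0x00000000
STATUS,0x04,32,RO,0x00000001
IRQ_EN,0x08,32,RW,0x00000000
"""

def to_ipxact(csv_text, block_name="my_block"):
    """Translate a spreadsheet register description into a bare-bones
    IP-XACT-like <memoryMap> fragment (illustrative skeleton only)."""
    root = Element("memoryMap")
    SubElement(root, "name").text = block_name
    for row in csv.DictReader(io.StringIO(csv_text)):
        reg = SubElement(root, "register")
        SubElement(reg, "name").text = row["name"]
        SubElement(reg, "addressOffset").text = row["offset"]
        SubElement(reg, "size").text = row["width"]
        SubElement(reg, "access").text = {"RW": "read-write",
                                          "RO": "read-only"}[row["access"]]
        SubElement(reg, "reset").text = row["reset"]
    return tostring(root, encoding="unicode")

print(to_ipxact(SPEC))
```

Keeping the spreadsheet as the single source and generating the XML removes the hand-editing of IP-XACT that the abstract identifies as the time sink.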
A web-based system for supporting global land cover data production
NASA Astrophysics Data System (ADS)
Han, Gang; Chen, Jun; He, Chaoying; Li, Songnian; Wu, Hao; Liao, Anping; Peng, Shu
2015-05-01
Global land cover (GLC) data production and verification is a complicated, time-consuming and labor-intensive process, requiring a huge amount of imagery and ancillary data and involving many people, often in different geographic locations. The efficient integration of various kinds of ancillary data and effective collaborative classification in large-area land cover mapping require advanced supporting tools. This paper presents the design and development of a web-based system for supporting 30-m resolution GLC data production, combining geo-spatial web services and Computer Supported Cooperative Work (CSCW) technology. Based on an analysis of the functional and non-functional requirements of GLC mapping, a three-tier system model is proposed with four major parts: multisource data resources, data and function services, interactive mapping, and production management. The prototyping and implementation of the system were realised with a combination of Open Source Software (OSS) and commercially available off-the-shelf systems. This web-based system not only facilitates the integration of the heterogeneous data and services required by GLC data production, but also provides online access, visualization and analysis of the images, ancillary data and interim 30-m global land-cover maps. The system further supports online collaborative quality-check and verification workflows. It has been successfully applied in China's 30-m resolution GLC mapping project, where it significantly improved the efficiency of GLC data production and verification.
The Seismic Aftershock Monitoring System (SAMS) for OSI - Experiences from IFE14
NASA Astrophysics Data System (ADS)
Gestermann, Nicolai; Sick, Benjamin; Häge, Martin; Blake, Thomas; Labak, Peter; Joswig, Manfred
2016-04-01
An on-site inspection (OSI) is the third of four elements of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The sole purpose of an OSI is to confirm whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of the treaty and to gather any facts which might assist in identifying a possible violator. It thus constitutes the final verification measure under the CTBT if all other available measures cannot confirm the nature of a suspicious event. The Provisional Technical Secretariat (PTS) carried out the Integrated Field Exercise 2014 (IFE14) in the Dead Sea area of Jordan from 3 November to 9 December 2014. It was a fictitious OSI whose aim was to test the inspection capabilities in an integrated manner. The technologies allowed during an OSI are listed in the treaty. The aim of the Seismic Aftershock Monitoring System (SAMS) is to detect and localize low-magnitude aftershocks of the triggering event, or collapses of underground cavities. The locations of these events are expected in the vicinity of a possible previous explosion and help to narrow down the search area within an inspection area (IA) of an OSI. The success of SAMS depends on its main elements: hardware, software, deployment strategy, search logic and, not least, the effective use of personnel. All elements of SAMS were tested and improved during the Build-Up Exercises (BUE), which took place in Austria and Hungary. IFE14 provided more realistic climatic and hazardous terrain conditions with limited resources. Significant variations in the topography of the IFE14 IA, in the mountainous Dead Sea area of Jordan, led to considerable challenges that were not expected from the experiences of the BUE. SAMS uses mini-arrays with an aperture of about 100 meters and a total of four elements each. The station network deployed during IFE14 and the results of the data analysis are presented. Possible aftershocks of the triggering event are expected in a very low magnitude range; the detection threshold of the network is therefore one of the key parameters of SAMS and crucial for the success of the monitoring. One of the objectives was to record magnitudes down to -2.0 ML. The threshold values have been compared with the historical seismicity in the region and with that monitored during IFE14. Results of the detection threshold estimation and experiences from the exercise are presented.
Experiences applying Formal Approaches in the Development of Swarm-Based Space Exploration Systems
NASA Technical Reports Server (NTRS)
Rouff, Christopher A.; Hinchey, Michael G.; Truszkowski, Walter F.; Rash, James L.
2006-01-01
NASA is researching advanced technologies for future exploration missions using intelligent swarms of robotic vehicles. One of these missions is the Autonomous Nano Technology Swarm (ANTS) mission that will explore the asteroid belt using 1,000 cooperative autonomous spacecraft. The emergent properties of intelligent swarms make them a potentially powerful concept, but at the same time more difficult to design and to ensure that the proper behaviors will emerge. NASA is investigating formal methods and techniques for verification of such missions. The advantage of using formal methods is the ability to mathematically verify the behavior of a swarm, emergent or otherwise. Using the ANTS mission as a case study, we have evaluated multiple formal methods to determine their effectiveness in modeling and ensuring desired swarm behavior. This paper discusses the results of this evaluation and proposes an integrated formal method for ensuring correct behavior of future NASA intelligent swarms.
Counterintuitive Religious Ideas and Metaphoric Thinking: An Event-Related Brain Potential Study.
Fondevila, Sabela; Aristei, Sabrina; Sommer, Werner; Jiménez-Ortega, Laura; Casado, Pilar; Martín-Loeches, Manuel
2016-05-01
It has been shown that counterintuitive ideas from mythological and religious texts are more acceptable than other (non-religious) world-knowledge violations. In the present experiment we explored whether this relates to the way they are interpreted (literal vs. metaphorical). Participants were presented with verification questions that referred to either the literal or a metaphorical meaning of the sentence previously read (counterintuitive religious, counterintuitive non-religious and intuitive), in a block-wise design. Behavioral and electrophysiological results converged. In contrast to the literal interpretation of the sentences, the induced metaphorical interpretation specifically facilitated the integration (N400 amplitude decrease) of religious counterintuitions, whereas the semantic processing of non-religious counterintuitions was not affected by the interpretation mode. We suggest that religious ideas tend to operate like other instances of figurative language, such as metaphors, facilitating their acceptability despite their counterintuitive nature. Copyright © 2015 Cognitive Science Society, Inc.
The visual communication in the optometric scales.
Dantas, Rosane Arruda; Pagliuca, Lorita Marlena Freitag
2006-01-01
Communication through vision involves visual learning, which demands ocular integrity; hence the importance of evaluating visual acuity. The scale of images, formed by optotypes, is a method for verifying visual acuity in kindergarten children. To identify an optotype, the child needs to know the image under analysis. Given the importance of visual communication in the construction of the scale of images, a bibliographic, analytical study is presented that reflects on the principles for the construction of those tables. The drawing inserted as an optotype is considered a non-verbal symbolic expression of the body and/or of the environment, constructed from the individual's capture of experiences. The indiscriminate use of images is contested, on the understanding that prior knowledge of them is required. Despite the subjectivity of the optotypes, the scales remain valid if the images are adapted to the universe of the children to be examined.
Automated Environment Generation for Software Model Checking
NASA Technical Reports Server (NTRS)
Tkachuk, Oksana; Dwyer, Matthew B.; Pasareanu, Corina S.
2003-01-01
A key problem in model checking open systems is environment modeling (i.e., representing the behavior of the execution context of the system under analysis). Software systems are fundamentally open since their behavior is dependent on patterns of invocation of system components and values defined outside the system but referenced within the system. Whether reasoning about the behavior of whole programs or about program components, an abstract model of the environment can be essential in enabling sufficiently precise yet tractable verification. In this paper, we describe an approach to generating environments of Java program fragments. This approach integrates formally specified assumptions about environment behavior with sound abstractions of environment implementations to form a model of the environment. The approach is implemented in the Bandera Environment Generator (BEG) which we describe along with our experience using BEG to reason about properties of several non-trivial concurrent Java programs.
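BEG generates Java environment stubs, which are not shown in the abstract; as an illustrative analogue in Python (not the tool's actual output), the sketch below captures the idea of a "universal environment" that drives a component through every sequence of public calls up to a bounded depth and checks an assumed property:

    import itertools

    def environment_drivers(api, depth):
        """Enumerate all call sequences up to the given depth: the most general environment."""
        for n in range(1, depth + 1):
            yield from itertools.product(api, repeat=n)

    class Account:
        """Component under analysis; assumed property: balance must never go negative."""
        def __init__(self):
            self.balance = 0
        def deposit(self):
            self.balance += 10
        def withdraw(self):
            self.balance -= 10

    # Drive the component with every environment behavior and check the property
    for calls in environment_drivers(["deposit", "withdraw"], depth=2):
        acct = Account()
        for name in calls:
            getattr(acct, name)()
        if acct.balance < 0:
            print("property violated by environment sequence:", calls)
            break

Formally specified environment assumptions then act as a filter on this unconstrained behavior, which is how a tool like BEG keeps the composed model tractable.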
Theory, simulation and experiments for precise deflection control of radiotherapy electron beams.
Figueroa, R; Leiva, J; Moncada, R; Rojas, L; Santibáñez, M; Valente, M; Velásquez, J; Young, H; Zelada, G; Yáñez, R; Guillen, Y
2018-03-08
Conventional radiotherapy is mainly delivered by linear accelerators. Although linear accelerators provide dual (electron/photon) radiation beam modalities, both are intrinsically produced by a megavoltage electron current. Modern radiotherapy treatment techniques are based on suitable devices inserted in or attached to conventional linear accelerators; precise control of the delivered beam therefore becomes a key issue. This work presents an integral description of electron beam deflection control as required for a novel radiotherapy technique based on convergent photon beam production. Theoretical and Monte Carlo approaches were initially used for designing and optimizing the device's components. Dedicated instrumentation was then developed for experimental verification of the electron beam deflection produced by the designed magnets. Both Monte Carlo simulations and experimental results support the reliability of the electrodynamics models used to predict megavoltage electron beam control. Copyright © 2018 Elsevier Ltd. All rights reserved.
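The abstract quotes no beam or magnet parameters; as a worked illustration of the electrodynamics involved, the relativistic bending radius r = p/(qB) of a megavoltage electron in a uniform transverse field can be computed as follows (the 6 MeV energy and 0.1 T field are hypothetical values, not from the paper):

    import math

    # Physical constants (SI)
    c = 299_792_458.0          # speed of light, m/s
    e = 1.602_176_634e-19      # elementary charge, C
    E0 = 0.511                 # electron rest energy, MeV

    def bend_radius(kinetic_mev, b_tesla):
        """Radius of curvature of an electron in a uniform transverse magnetic field."""
        total = kinetic_mev + E0                  # total energy, MeV
        p_mev = math.sqrt(total**2 - E0**2)       # relativistic momentum, MeV/c
        p_si = p_mev * 1e6 * e / c                # momentum, kg*m/s
        return p_si / (e * b_tesla)               # r = p / (qB)

    # Hypothetical example: 6 MeV beam in a 0.1 T deflection field
    print(f"r = {bend_radius(6.0, 0.1):.3f} m")   # about 0.22 m

The strong energy dependence of this radius is why the deflection control must be verified experimentally across the accelerator's operating range.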
Gu, Herong; Guan, Yajuan; Wang, Huaibao; Wei, Baoze; Guo, Xiaoqiang
2014-01-01
The microgrid is an effective way to integrate distributed energy resources into the utility networks. One of the most important issues is the power flow control of the grid-connected voltage-source inverter in a microgrid. In this paper, the small-signal model of the power flow control for the grid-connected inverter is established, from which it can be observed that the conventional power flow control may suffer from poor damping and slow transient response, while the new power flow control can mitigate these problems without affecting the steady-state power flow regulation. Results of continuous-domain simulations in MATLAB and digital control experiments based on a 32-bit fixed-point TMS320F2812 DSP are in good agreement, which verifies the small-signal model analysis and the effectiveness of the proposed method. PMID:24672304
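The paper's small-signal model is not reproduced in the abstract. As background, the active power exchanged over a mainly inductive grid link is commonly written P = (V_inv * V_grid / X) * sin(delta), whose linearization dP/d(delta) = (V_inv * V_grid / X) * cos(delta0) is the usual starting point for such models; a minimal sketch with hypothetical numbers:

    import math

    def active_power(v_inv, v_grid, x_line, delta):
        """Steady-state active power transfer across an inductive link (per phase, W)."""
        return v_inv * v_grid / x_line * math.sin(delta)

    def small_signal_gain(v_inv, v_grid, x_line, delta0):
        """dP/d(delta) around the operating point delta0 (W/rad)."""
        return v_inv * v_grid / x_line * math.cos(delta0)

    # Hypothetical operating point: 230 V sources, 0.5 ohm reactance, 5 degree angle
    d0 = math.radians(5.0)
    print(f"P0    = {active_power(230, 230, 0.5, d0):.1f} W")
    print(f"dP/dd = {small_signal_gain(230, 230, 0.5, d0):.1f} W/rad")

The damping and transient response discussed in the paper come from how the controller closes the loop around this gain.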
NASA Astrophysics Data System (ADS)
Dang, Haizheng; Zhao, Yibo
2016-09-01
This paper presents the CFD modeling and experimental verification of a single-stage inertance tube coaxial Stirling-type pulse tube cryocooler operating at 30-35 K using mixed stainless steel mesh regenerator matrices, without either double-inlet or multi-bypass. A two-dimensional axisymmetric CFD model with the thermal non-equilibrium mode is developed to simulate the internal process, and the underlying mechanism by which mixed matrices significantly reduce the regenerator losses is discussed in detail based on the given six cases. The modeling also indicates that the combination of the given different mesh segments can be optimized to achieve the highest cooling efficiency or the largest exergy ratio; verification experiments were then conducted, in which satisfactory agreement between simulated and measured results is observed. The experiments achieve a no-load temperature of 27.2 K and a cooling power of 0.78 W at 35 K, or 0.29 W at 30 K, with an input electric power of 220 W and a reject temperature of 300 K.
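Using only the figures quoted in the abstract, the cooler's coefficient of performance and its fraction of the Carnot limit can be checked in a few lines (a worked-arithmetic sketch, not taken from the paper):

    def percent_carnot(q_cold, w_input, t_cold, t_reject):
        """Measured COP as a percentage of the Carnot COP between the two temperatures."""
        cop = q_cold / w_input                      # actual coefficient of performance
        cop_carnot = t_cold / (t_reject - t_cold)   # ideal reversed-cycle limit
        return 100.0 * cop / cop_carnot

    # Figures from the abstract: 0.78 W at 35 K for 220 W input, 300 K reject
    print(f"{percent_carnot(0.78, 220.0, 35.0, 300.0):.2f} % of Carnot")   # ~2.7 %
    # and 0.29 W at 30 K
    print(f"{percent_carnot(0.29, 220.0, 30.0, 300.0):.2f} % of Carnot")   # ~1.2 %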
Verification of Space Weather Forecasts using Terrestrial Weather Approaches
NASA Astrophysics Data System (ADS)
Henley, E.; Murray, S.; Pope, E.; Stephenson, D.; Sharpe, M.; Bingham, S.; Jackson, D.
2015-12-01
The Met Office Space Weather Operations Centre (MOSWOC) provides a range of 24/7 operational space weather forecasts, alerts, and warnings, which provide valuable information on space weather that can degrade electricity grids, radio communications, and satellite electronics. Forecasts issued include arrival times of coronal mass ejections (CMEs), and probabilistic forecasts for flares, geomagnetic storm indices, and energetic particle fluxes and fluences. These forecasts are produced twice daily using a combination of output from models such as Enlil, near-real-time observations, and forecaster experience. Verification of forecasts is crucial for users, researchers, and forecasters to understand the strengths and limitations of forecasts, and to assess forecaster added value. To this end, the Met Office (in collaboration with Exeter University) has been adapting verification techniques from terrestrial weather, and has been working closely with the International Space Environment Service (ISES) to standardise verification procedures. We will present the results of part of this work, analysing forecast and observed CME arrival times, assessing skill using 2x2 contingency tables. These MOSWOC forecasts can be objectively compared to those produced by the NASA Community Coordinated Modelling Center, a useful benchmark. This approach cannot be taken for the other forecasts, as they are probabilistic and categorical (e.g., geomagnetic storm forecasts give probabilities of exceeding levels from minor to extreme). We will present appropriate verification techniques being developed to address these forecasts, such as the ranked probability skill score, and comparing forecasts against climatology and persistence benchmarks. As part of this, we will outline the use of discrete-time Markov chains to assess and improve the performance of our geomagnetic storm forecasts. We will also discuss work to adapt a terrestrial verification visualisation system to space weather, to help MOSWOC forecasters view verification results in near real-time; plans to objectively assess flare forecasts under the EU Horizon 2020 FLARECAST project; and summarise ISES efforts to achieve consensus on verification.
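For the CME arrival-time verification described above, a 2x2 contingency table yields standard scores such as the probability of detection, false alarm ratio and Heidke skill score; the sketch below uses the generic textbook formulas with invented counts (not MOSWOC data):

    def contingency_scores(hits, misses, false_alarms, correct_negatives):
        """Standard 2x2 verification scores used in (space) weather forecasting."""
        n = hits + misses + false_alarms + correct_negatives
        pod = hits / (hits + misses)                 # probability of detection
        far = false_alarms / (hits + false_alarms)   # false alarm ratio
        # Heidke skill score: accuracy relative to the number correct by chance
        expected = ((hits + misses) * (hits + false_alarms)
                    + (correct_negatives + misses) * (correct_negatives + false_alarms)) / n
        hss = (hits + correct_negatives - expected) / (n - expected)
        return pod, far, hss

    # Hypothetical counts for illustration only
    pod, far, hss = contingency_scores(hits=18, misses=7, false_alarms=5, correct_negatives=70)
    print(f"POD={pod:.2f}  FAR={far:.2f}  HSS={hss:.2f}")

Probabilistic products such as the geomagnetic storm forecasts need the ranked probability skill score and benchmark comparisons instead, as noted above.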
ESTEST: A Framework for the Verification and Validation of Electronic Structure Codes
NASA Astrophysics Data System (ADS)
Yuan, Gary; Gygi, Francois
2011-03-01
ESTEST is a verification and validation (V&V) framework for electronic structure codes that supports Qbox, Quantum Espresso, ABINIT and the Exciting Code, with support planned for many more. We discuss various approaches to the electronic structure V&V problem implemented in ESTEST, related to parsing, formats, data management, search, comparison and analyses. Additionally, an early experiment in the distribution of ESTEST V&V servers among the electronic structure community will be presented. Supported by NSF-OCI 0749217 and DOE FC02-06ER25777.
NASA Astrophysics Data System (ADS)
Arakelian, S.; Kucherik, A.; Kutrovskaya, S.; Osipov, A.; Istratov, A.; Skryabin, I.
2018-01-01
A clear physical model for the verification of quantum states in nanocluster structures with jump/tunneling electroconductivity is studied in both theory and experiment. Emphasis is placed on low-dimensional structures in which structural phase transitions occur and a tendency toward strongly enhanced electroconductivity is observed. The results provide a basis for new physical principles for creating functional elements for optoelectronics and photonics in a hybrid set-up (optics + electrophysics) using the nanocluster technology approach.
NASA Astrophysics Data System (ADS)
Tyagi, N.; Curran, B. H.; Roberson, P. L.; Moran, J. M.; Acosta, E.; Fraass, B. A.
2008-02-01
IMRT often requires delivering small fields, which may suffer from electronic disequilibrium effects. The presence of heterogeneities, particularly low-density tissues in patients, complicates such situations. In this study, we report on verification of the DPM MC code for IMRT treatment planning in heterogeneous media, using a previously developed model of the Varian 120-leaf MLC. The purpose of this study is twofold: (a) to design a comprehensive list of experiments in heterogeneous media for verification of any dose calculation algorithm and (b) to verify our MLC model in heterogeneous geometries that mimic an actual patient geometry for IMRT treatment. The measurements were made using an IMRT head and neck phantom (CIRS phantom) and slab phantom geometries. Verification of the MLC model has been carried out using point doses measured with an A14 slim line (SL) ion chamber inside a tissue-equivalent and a bone-equivalent material using the CIRS phantom. Planar doses using lung- and bone-equivalent slabs have been measured and compared using EDR films (Kodak, Rochester, NY).
SEPAC software configuration control plan and procedures, revision 1
NASA Technical Reports Server (NTRS)
1981-01-01
The SEPAC Software Configuration Control Plan and Procedures are presented. The objective of the software configuration control is to establish the process for maintaining configuration control of the SEPAC software, beginning with the baselining of SEPAC Flight Software Version 1 and encompassing the integration and verification tests through Spacelab Level IV Integration. The procedures are designed to provide a simplified but complete configuration control process. The intent is to require a minimum amount of paperwork while providing total traceability of the SEPAC software.
NASA Technical Reports Server (NTRS)
Jones, J. E.; Richmond, J. H.
1974-01-01
An integral equation formulation is applied to predict pitch- and roll-plane radiation patterns of a thin VHF/UHF (very high frequency/ultra high frequency) annular slot communications antenna operating at several locations in the nose region of the space shuttle orbiter. Digital computer programs used to compute radiation patterns are given and the use of the programs is illustrated. Experimental verification of computed patterns is given from measurements made on 1/35-scale models of the orbiter.
Seven Processes that Enable NASA Software Engineering Technologies
NASA Technical Reports Server (NTRS)
Housch, Helen; Godfrey, Sally
2011-01-01
This slide presentation reviews seven processes that NASA uses to ensure that software is developed, acquired and maintained as specified in the NPR 7150.2A requirement, which calls for all software to be appraised against the Capability Maturity Model Integration (CMMI). The enumerated processes are: (7) Product Integration, (6) Configuration Management, (5) Verification, (4) Software Assurance, (3) Measurement and Analysis, (2) Requirements Management and (1) Planning & Monitoring. Each of these is described and the responsible group(s) are identified.
NASA Technical Reports Server (NTRS)
Duncan, L. M.; Reddell, J. P.; Schoonmaker, P. B.
1975-01-01
Techniques and support software for the efficient performance of simulation validation are discussed. Topics include overall validation software structure, the performance of validation at various levels of simulation integration, guidelines for check case formulation, methods for real-time acquisition and formatting of data from an all-up operational simulator, and methods and criteria for comparison and evaluation of simulation data. Vehicle subsystem modules, module integration, special test requirements, and reference data formats are also described.
EPA ENVIRONMENTAL TECHNOLOGY EXPERIENCE
The USEPA's Environmental Technology Verification for Metal Finishing Pollution Prevention Technologies (ETV-MF) Program verifies the performance of innovative, commercial-ready technologies designed to improve industry performance and achieve cost-effective pollution prevention ...
Power System Test and Verification at Satellite Level
NASA Astrophysics Data System (ADS)
Simonelli, Giulio; Mourra, Olivier; Tonicello, Ferdinando
2008-09-01
Most articles on power systems deal with the architecture and technical solutions related to the functionalities of the power system and their performances. Very few, if any, address integration and verification of the Power System at satellite level and the related issues with the Power EGSE (Electrical Ground Support Equipment), which also has to support the AIT/AIV (Assembly, Integration, Test and Verification) program of the satellite and, eventually, the launch campaign. In recent years a more complex development and testing concept based on MDVE (Model Based Development and Verification Environment) has been introduced. In the MDVE approach, simulation software is used to simulate the satellite environment and, in the early stages, the satellite's units. This approach has changed the Power EGSE requirements significantly. Power EGSEs or, better, Power SCOEs (Special Check Out Equipment) are now required to provide the instantaneous power generated by the solar array throughout the orbit. To achieve that, the Power SCOE interfaces to the RTS (Real Time Simulator) of the MDVE. The RTS provides the instantaneous settings belonging to that point along the orbit to the Power SCOE, so that the Power SCOE generates the instantaneous {I,V} curve of the SA (Solar Array). This yields a real-time test of the power system, which is even more valuable for EO (Earth Observation) satellites, where the solar array aspect angle to the sun is rarely fixed and the power load profile can be particularly complex (for example, in radar applications). In this article the major issues related to integration and testing of Power Systems are discussed, taking into account different power system topologies (i.e. regulated bus, unregulated bus, battery bus, based on MPPT or S3R…). Aspects of the Power System AIT I/Fs (interfaces), the umbilical I/Fs with the launcher, and the Power SCOE I/Fs are also addressed. Last but not least, the protection strategy of the Power System during the AIT/AIV program is discussed. The objective of this discussion is also to provide the Power System Engineer with a checklist of key aspects, linked to the satellite AIT/AIV program, that have to be considered in the early phases of a new power system development.
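The article does not specify how the Power SCOE synthesizes the instantaneous {I,V} curve; one common basis for such synthesis is the single-diode photovoltaic model, with the photocurrent scaled by the sun aspect angle supplied by the RTS. A minimal sketch under that assumption (all parameters are hypothetical):

    import math

    def sa_current(v, i_ph, i_0=1e-9, n=1.3, t_k=300.0, cells=100):
        """Single-diode model: string current at terminal voltage v (series resistance ignored)."""
        v_t = 1.380649e-23 * t_k / 1.602176634e-19   # thermal voltage kT/q, volts
        return i_ph - i_0 * (math.exp(v / (cells * n * v_t)) - 1.0)

    # Sweep the curve for one hypothetical orbit point (photocurrent scales with sun aspect)
    i_ph = 2.4 * math.cos(math.radians(30.0))        # 30 deg off-sun, 2.4 A at normal incidence
    for v in range(0, 121, 20):
        print(f"V = {v:3d} V  I = {max(sa_current(v, i_ph), 0.0):5.2f} A")

Regenerating this curve at each RTS time step is what turns the SCOE from a static power supply into a real-time solar array simulator.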
An unattended verification station for UF 6 cylinders: Field trial findings
Smith, L. E.; Miller, K. A.; McDonald, B. S.; ...
2017-08-26
In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS), which could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass, and identification for all declared uranium hexafluoride cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The first phase of the UCVS viability study was centered on a long-term field trial of a prototype UCVS system at a fuel fabrication facility. A key outcome of the study was a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA), and Passive Neutron Enrichment Meter (PNEM). This paper provides a description of the UCVS prototype design and an overview of the long-term field trial. In conclusion, analysis results and interpretation are presented with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that material diversion has not occurred.
NASA Astrophysics Data System (ADS)
Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin
2010-04-01
The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location applications, and biometric authentication such as fingerprint verification and face and handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques in real-time biometric-based authentication are key factors for successful identity verification solutions, but are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes the use of built-in self-test techniques to ensure that no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques utilise the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time, i.e. ensuring that the necessary authentication steps and algorithms running on the mobile device application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
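The built-in self-test scheme is only outlined above; one way to realize "the PIN as a seed to generate a unique signature" is a PIN-derived HMAC over the verifier code and templates, checked before each biometric match. A minimal sketch under that assumption (the key-derivation choice and all names are illustrative, not the SecurePhone design):

    import hashlib
    import hmac

    def bist_tag(pin, verifier_blob, salt=b"bist-demo-salt"):
        """Derive a key from the user's PIN and tag the verifier code/data with HMAC-SHA256."""
        key = hashlib.pbkdf2_hmac("sha256", pin.encode(), salt, 100_000)
        return hmac.new(key, verifier_blob, hashlib.sha256).digest()

    def bist_check(pin, verifier_blob, expected_tag):
        """Constant-time integrity check of the verification process before authentication."""
        return hmac.compare_digest(bist_tag(pin, verifier_blob), expected_tag)

    # Enrolment: tag the (hypothetical) verifier binary; later, check before each use
    blob = b"...biometric verifier code and templates..."
    tag = bist_tag("1234", blob)
    print("intact:  ", bist_check("1234", blob, tag))
    print("tampered:", bist_check("1234", blob + b"x", tag))

Only if the self-test passes would the device proceed to the (more expensive) multimodal biometric match, which is the processing-time trade-off the paper targets.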
Analysis of large system black box verification test data
NASA Technical Reports Server (NTRS)
Clapp, Kenneth C.; Iyer, Ravishankar Krishnan
1993-01-01
Issues regarding black box verification of large systems are explored. The study begins by collecting data from several testing teams. An integrated database containing test, fault, repair, and source file information is generated. Intuitive effectiveness measures are generated using conventional black box testing results analysis methods. Conventional analysis methods indicate that the testing was effective in the sense that as more tests were run, more faults were found. Average behavior and individual data points are analyzed. The data is categorized, and average behavior shows a very wide variation in the number of tests run and in pass rates (pass rates ranged from 71 percent to 98 percent). The 'white box' data contained in the integrated database is studied in detail. Conservative measures of effectiveness are discussed. Testing efficiency (ratio of repairs to number of tests) is measured at 3 percent, fault record effectiveness (ratio of repairs to fault records) is measured at 55 percent, and test script redundancy (ratio of number of failed tests to minimum number of tests needed to find the faults) ranges from 4.2 to 15.8. Error-prone source files and subsystems are identified. A correlational mapping of test functional area to product subsystem is completed. A new adaptive testing process based on real-time generation of the integrated database is proposed.
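The effectiveness measures quoted above are simple ratios; the sketch below reproduces them from invented counts chosen to land near the figures in the abstract (the field names are illustrative, not the study's database schema):

    def testing_metrics(tests_run, failed_tests, fault_records, repairs, min_tests_needed):
        """Effectiveness ratios of the kind used in the black-box test data analysis."""
        return {
            "pass_rate": 1.0 - failed_tests / tests_run,
            "testing_efficiency": repairs / tests_run,            # repairs per test
            "fault_record_effectiveness": repairs / fault_records,
            "script_redundancy": failed_tests / min_tests_needed,
        }

    # Hypothetical counts for illustration
    m = testing_metrics(tests_run=1000, failed_tests=210, fault_records=55,
                        repairs=30, min_tests_needed=50)
    for name, value in m.items():
        print(f"{name}: {value:.2f}")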
Pinheiro, Alexandre; Dias Canedo, Edna; de Sousa Junior, Rafael Timoteo; de Oliveira Albuquerque, Robson; García Villalba, Luis Javier; Kim, Tai-Hoon
2018-03-02
Cloud computing is considered an interesting paradigm due to its scalability, availability and virtually unlimited storage capacity. However, it is challenging to organize a cloud storage service (CSS) that is safe from the client's point of view and to implement this CSS in public clouds, since it is not advisable to blindly consider this configuration as fully trustworthy. Ideally, owners of large amounts of data should trust their data to be in the cloud for a long period of time, without the burden of keeping copies of the original data, nor of accessing the whole content for verifications regarding data preservation. Due to these requirements, integrity, availability, privacy and trust are still challenging issues for the adoption of cloud storage services, especially when losing or leaking information can bring significant damage, be it legal or business-related. With such concerns in mind, this paper proposes an architecture for periodically monitoring both the information stored in the cloud and the service provider behavior. The architecture operates with a proposed protocol based on trust and encryption concepts to ensure cloud data integrity without compromising confidentiality and without overloading storage services. Extensive tests and simulations of the proposed architecture and protocol validate their functional behavior and performance.
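The paper's protocol is described only at a high level here; a common building block for such cloud-integrity monitoring is a salted challenge-response over the stored bytes, so the provider must actually read the data for every proof. The sketch below illustrates only that idea; note that, unlike this simplification, the paper's protocol avoids requiring the owner to retain a full copy of the data:

    import hashlib
    import os

    def prove(data, nonce):
        """Provider side: hash the fresh nonce together with the stored object's bytes."""
        return hashlib.sha256(nonce + data).hexdigest()

    def spot_check(local_data, remote_prove):
        """Owner side: issue a random challenge and compare both proofs."""
        nonce = os.urandom(16)                       # fresh per challenge: thwarts replay
        return remote_prove(nonce) == prove(local_data, nonce)

    stored = b"archived dataset bytes"
    honest = lambda nonce: prove(stored, nonce)
    lossy = lambda nonce: prove(stored[:-1], nonce)  # provider silently lost a byte
    print("honest provider:", spot_check(stored, honest))
    print("lossy provider: ", spot_check(stored, lossy))

Periodic randomized challenges of this kind let the monitor detect silent data loss or provider misbehavior long before the owner next reads the data.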
NASA Technical Reports Server (NTRS)
Amason, David L.
2008-01-01
The goal of the Solar Dynamics Observatory (SDO) is to understand and, ideally, predict the solar variations that influence life and society. Its instruments will measure the properties of the Sun and will take high-definition images of the Sun every few seconds, all day, every day. The FlatSat is a high-fidelity electrical and functional representation of the SDO spacecraft bus, serving as a test bed for Integration & Test (I&T), flight software, and flight operations. For I&T purposes, FlatSat will be a driver for developing and dry-running electrical integration procedures, STOL test procedures, page displays, and the command and telemetry database. FlatSat will also serve as a platform for flight software acceptance and systems testing of the flight software system components, including the spacecraft main processors, power supply electronics, attitude control electronics, gimbal control electronics and the S-band communications card. FlatSat will also benefit the flight operations team through post-launch flight software code and table update development and verification, and verification of new and updated flight operations products. This document highlights the benefits of FlatSat; describes the building of FlatSat; provides FlatSat facility requirements, access roles and responsibilities; and discusses FlatSat mechanical and electrical integration and functional testing.
Authentication of digital video evidence
NASA Astrophysics Data System (ADS)
Beser, Nicholas D.; Duerr, Thomas E.; Staisiunas, Gregory P.
2003-11-01
In response to a requirement from the United States Postal Inspection Service, the Technical Support Working Group tasked The Johns Hopkins University Applied Physics Laboratory (JHU/APL) to develop a technique that will ensure the authenticity, or integrity, of digital video (DV). Verifiable integrity is needed if DV evidence is to withstand a challenge to its admissibility in court on the grounds that it can be easily edited. Specifically, the verification technique must detect additions, deletions, or modifications to DV and satisfy the two-part criteria pertaining to scientific evidence as articulated in Daubert et al. v. Merrell Dow Pharmaceuticals Inc., 43 F3d (9th Circuit, 1995). JHU/APL has developed a prototype digital video authenticator (DVA) that generates digital signatures based on public key cryptography at the frame level of the DV. Signature generation and recording are accomplished at the same time as DV is recorded by the camcorder. Throughput supports the consumer-grade camcorder data rate of 25 Mbps. The DVA software is implemented on a commercial laptop computer, which is connected to a commercial digital camcorder via the IEEE-1394 serial interface. A security token provides agent identification and the interface to the public key infrastructure (PKI) that is needed for management of the public keys central to DV integrity verification.
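The DVA's exact signature format is not given beyond "public key cryptography at the frame level"; the sketch below illustrates the idea with Ed25519 from the third-party cryptography package, signing each frame together with its index so insertions, deletions and edits break verification (frame data and key handling are mocked; the real system keeps the key on the security token):

    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    signing_key = Ed25519PrivateKey.generate()   # agent's key, normally held on the token
    verify_key = signing_key.public_key()

    def sign_frames(frames):
        """Sign each DV frame with its index so reordering and deletion are detectable."""
        return [(f, signing_key.sign(i.to_bytes(4, "big") + f))
                for i, f in enumerate(frames)]

    def verify_frames(signed):
        """Detect additions, deletions, or modifications of frames."""
        for i, (frame, sig) in enumerate(signed):
            try:
                verify_key.verify(sig, i.to_bytes(4, "big") + frame)
            except InvalidSignature:
                return f"frame {i} fails verification"
        return "all frames authentic"

    tape = sign_frames([b"frame-0", b"frame-1", b"frame-2"])
    print(verify_frames(tape))
    tape[1] = (b"frame-1 (edited)", tape[1][1])  # simulate a post-hoc edit
    print(verify_frames(tape))

Signing at recording time, as the DVA does, is what ties the evidence to the moment of capture rather than to any later copy.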
Gershon, P D
2010-12-15
Two tools are described for integrating LC elution position with mass-based data in hydrogen-deuterium exchange (HDX) experiments by nano-liquid chromatography/matrix-assisted laser desorption/ionization mass spectrometry (nanoLC/MALDI-MS, a novel approach to HDX-MS). The first of these, 'TOF2H-Z Comparator', highlights peptides in HDX experiments that are potentially misidentified on the basis of mass alone. The program first calculates normalized values for the organic solvent concentration responsible for the elution of ions in nanoLC/MALDI HDX experiments. It then allows the solvent gradients for the multiple experiments contributing to an MS/MS-confirmed peptic peptide library to be brought into mutual alignment by iteratively re-modeling variables among LC parameters such as gradient shape, solvent species, fraction duration and LC dead time. Finally, using the program, high-probability chromatographic outliers can be flagged within HDX experimental data. The role of the second tool, 'TOF2H-XIC Comparator', is to normalize the LC chromatograms corresponding to all deuteration timepoints of all HDX experiments of a project, to a common reference. Accurate normalization facilitates the verification of chromatographic consistency between all ions whose spectral segments contribute to particular deuterium uptake plots. Gradient normalization in this manner revealed chromatographic inconsistencies between ions whose masses were either indistinguishable or separated by precise isotopic increments. Copyright © 2010 John Wiley & Sons, Ltd.
Integration of the instrument control electronics for the ESPRESSO spectrograph at ESO-VLT
NASA Astrophysics Data System (ADS)
Baldini, V.; Calderone, G.; Cirami, R.; Coretti, I.; Cristiani, S.; Di Marcantonio, P.; Mégevand, D.; Riva, M.; Santin, P.
2016-07-01
ESPRESSO, the Echelle SPectrograph for Rocky Exoplanet and Stable Spectroscopic Observations at the ESO Very Large Telescope site, is now in its integration phase. The large number of functions of this complex instrument are fully controlled by a Beckhoff PLC-based control electronics architecture. Four small cabinets and one large cabinet host the main electronic parts that control all the sensors, motorized stages and other analogue and digital functions of ESPRESSO. The Instrument Control Electronics (ICE) is built following the latest ESO standards and requirements. Two main PLC CPUs are used and are programmed through the TwinCAT Beckhoff dedicated software. The assembly, integration and verification phase of ESPRESSO is quite challenging, owing to its distributed nature and the different geographical locations of the consortium partners. After the preliminary assembly and test of the electronic components at the Astronomical Observatory of Trieste and the test of some electronics and software parts at ESO (Garching), the complete system for the control of the four Front End Unit (FEU) arms of ESPRESSO was fully assembled and tested in Merate (Italy) at the beginning of 2016. After these first tests, the system will be located at the Geneva Observatory (Switzerland) until the Preliminary Acceptance Europe (PAE) and finally shipped to Chile for commissioning. This paper describes the integration strategy of the ICE work package of ESPRESSO and the hardware and software tests that have been performed, with an overall view of the experience gained during these project phases.
Principles of Sterilization of Mars Descent Vehicle Elements
NASA Astrophysics Data System (ADS)
Trofimov, Vladislav; Deshevaya, Elena; Khamidullina, N.; Kalashnikov, Viktor
Due to the severe COSPAR requirements on permissible microbiological contamination of the elements of a Mars descent spacecraft, as well as the complexity of their chemical composition and structure, exposing such spacecraft elements to antimicrobial treatment (sterilization) during integration requires a wide set of methods: chemical, ultraviolet and radiation. The report analyses all aspects of the applicable treatment methods for cleaning element surfaces and interiors of microbiota. The analysis showed that the most important, predictable and controllable method is radiation processing (for elements whose properties do not change after effective treatment). Experience with the application of ionizing radiation for the sterilization of medical and other products shows that, depending on the initial microbial contamination of the lander elements, the required absorbed dose lies in the range of 12-35 kGy. The effect of irregular radiation absorption in elements of complex structure on the choice of radiation methodology was analysed, and an algorithm was suggested for choosing effective radiation treatment conditions and controlling sterilization efficiency. An important phase in establishing the effective treatment conditions for each structural element is experimental verification of the actual microbiological contamination under spacecraft integration conditions, maximum reduction of contamination using other cleaning procedures (mechanical, chemical, ultraviolet), and determination of the radiation resistance of spore-forming microorganisms typical of the shops where space hardware is manufactured and assembled. Proceeding from three parameters (irregularity of radiation absorption in a given element, its initial microbial contamination, and the resistance of the microorganisms to radiation), the sterilization conditions for the packed object are chosen such that secondary contamination is prevented and the given treatment reliability is ensured without final experimental microbiological verification, using only simple control of the absorbed dose at critical points. All process phases (from the choice of treatment conditions to provision of procedural safety) are strictly regulated by Russian legislation in accordance with international standards.
NASA Astrophysics Data System (ADS)
Ehret, G.; Amediek, A.; Wirth, M.; Fix, A.; Kiemle, C.; Quatrevalet, M.
2016-12-01
We report on a new method, and its first demonstration, to quantify emission rates from strong greenhouse gas (GHG) point sources using airborne Integrated Path Differential Absorption (IPDA) lidar measurements. In order to build trust in the emission rates self-reported by countries, verification against independent monitoring systems is a prerequisite for checking the reported budget. A significant fraction of the total anthropogenic emission of CO2 and CH4 originates from localized strong point sources at large energy production sites or landfills, neither of which is monitored with sufficient accuracy by the current observation system. There is a debate whether airborne remote sensing could fill this gap by inferring emission rates from budgeting or from Gaussian plume inversion approaches, whereby measurements of the GHG column abundance beneath the aircraft are used to constrain inverse models. In contrast to passive sensors, the use of an active instrument like CHARM-F for such emission verification measurements is new. CHARM-F is a new airborne IPDA lidar devised for the German research aircraft HALO for the simultaneous measurement of the column-integrated dry-air mixing ratios of CO2 and CH4, commonly denoted as XCO2 and XCH4, respectively. It has successfully been tested in a series of flights over Central Europe to assess its performance under various reflectivity conditions and in strongly varying topography like the Alps. The analysis of a methane plume measured in the crosswind direction of a coal mine ventilation shaft revealed an instantaneous emission rate of 9.9 ± 1.7 kt CH4 yr-1. We discuss the methodology of our point source estimation approach and give an outlook on the CoMet field experiment scheduled in 2017 for the measurement of anthropogenic and natural GHG emissions by a combination of active and passive remote sensing instruments on research aircraft.
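For the crosswind transect method mentioned above, the emission rate follows from the wind speed times the column enhancement integrated across the plume, E = u ∫ ΔΩ dx; a minimal numeric sketch with an invented Gaussian plume, scaled so the result lands near the quoted 9.9 kt CH4 yr-1 (this is not the CHARM-F retrieval):

    import numpy as np

    def emission_rate(x_m, delta_col_kg_m2, wind_m_s):
        """Mass flux through a crosswind transect: u times the integrated column enhancement."""
        return wind_m_s * np.trapz(delta_col_kg_m2, x_m)   # kg/s

    # Hypothetical Gaussian plume transect, 2 km wide
    x = np.linspace(-1000.0, 1000.0, 201)                  # crosswind distance, m
    delta = 1.4e-4 * np.exp(-(x / 250.0) ** 2)             # CH4 column enhancement, kg/m^2
    e = emission_rate(x, delta, wind_m_s=5.0)
    print(f"{e * 86400 * 365 / 1e6:.2f} kt CH4/yr")        # convert kg/s to kt/yr

The dominant error terms in such estimates are the wind speed and the column noise, which is consistent with the ±1.7 kt CH4 yr-1 uncertainty quoted above.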
Experiences of giving and receiving care in traumatic brain injury: An integrative review.
Kivunja, Stephen; River, Jo; Gullick, Janice
2018-04-01
To synthesise the literature on the experiences of giving or receiving care for traumatic brain injury, for people with traumatic brain injury, their family members and nurses in hospital and rehabilitation settings. Traumatic brain injury represents a major source of physical, social and economic burden. In the hospital setting, people with traumatic brain injury feel excluded from decision-making processes and perceive impatient care. Families describe inadequate information and support for psychological distress. Nurses find the care of people with traumatic brain injury challenging, particularly when experiencing heavy workloads. To date, a contemporary synthesis of the literature on people with traumatic brain injury, family and nurse experiences of traumatic brain injury care has not been conducted. Integrative literature review. A systematic search strategy guided by the PRISMA statement was conducted in CINAHL, PubMed, Proquest, EMBASE and Google Scholar. Whittemore and Knafl's (Journal of Advanced Nursing, 52, 2005, 546) integrative review framework guided data reduction, data display, data comparison and conclusion verification. Across the three participant categories (people with traumatic brain injury/family members/nurses) and sixteen subcategories, six cross-cutting themes emerged: seeking personhood, navigating challenging behaviour, valuing skills and competence, struggling with changed family responsibilities, maintaining productive partnerships and reflecting on workplace culture. Traumatic brain injury creates changes in physical, cognitive and emotional function that challenge people's known ways of being in the world. This alters relationship dynamics within families and requires a specific skill set among nurses. Recommendations include the following: (i) formal inclusion of people with traumatic brain injury and families in care planning, (ii) routine risk screening for falls and challenging behaviour to ensure that controls are based on accurate assessment, (iii) formal orientation and training for novice nurses in the management of challenging behaviour, (iv) professional case management to guide access to services and funding and (v) personal skill development to optimise family functioning. © 2018 John Wiley & Sons Ltd.
Modality Switching Cost during Property Verification by 7 Years of Age
ERIC Educational Resources Information Center
Ambrosi, Solene; Kalenine, Solene; Blaye, Agnes; Bonthoux, Francoise
2011-01-01
Recent studies in neuroimagery and cognitive psychology support the view of sensory-motor based knowledge: when processing an object concept, neural systems would re-enact previous experiences with this object. In this experiment, a conceptual switching cost paradigm derived from Pecher, Zeelenberg, and Barsalou (2003, 2004) was used to…
Conservation of Mechanical and Electric Energy: Simple Experimental Verification
ERIC Educational Resources Information Center
Ponikvar, D.; Planinsic, G.
2009-01-01
Two similar experiments on conservation of energy and transformation of mechanical into electrical energy are presented. Both can be used in classes, as they offer numerous possibilities for discussion with students and are simple to perform. Results are presented and are precise within 20% for the version of the experiment where measured values…
NASA Astrophysics Data System (ADS)
Andrina, G.; Basso, V.; Saitta, L.
2004-08-01
The effort to optimise the AIV process has in recent years been focused mainly on the standardisation of approaches and on the application of new methodologies. But the earlier the intervention, the greater the benefits in terms of cost and schedule; early phases of the AIV process have until now relied on standards that need to be tailored through company and personal expertise. A study was therefore conducted to explore the possibility of developing an expert system to help make choices in the early, conceptual phase of Assembly, Integration and Verification, namely the model philosophy and the test definition. The work focused on a hybrid approach, allowing interaction between historical data and human expertise. The prototyped expert system exploits both information elicited from domain experts and the results of a Data Mining activity on existing databases of verification data from completed projects. The Data Mining algorithms allow the extraction of past experience resident in the ESA/MATD database, which contains information in the form of statistical summaries, costs, and frequencies of on-ground and in-flight failures. Non-trivial associations found in this way can then be used by the experts to manage new decisions in a controlled (standards-driven) way at the beginning of, or during, the AIV process. Moreover, the Expert AIV could allow compilation of a set of feasible AIV schedules to support further programmatic-driven choices.
Predictive Capability Maturity Model for computational modeling and simulation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.
2007-10-01
The Predictive Capability Maturity Model (PCMM) is a new model that can be used to assess the level of maturity of computational modeling and simulation (M&S) efforts. The development of the model is based on both the authors' experience and their analysis of similar investigations in the past. The perspective taken in this report is one of judging the usefulness of a predictive capability that relies on the numerical solution to partial differential equations to better inform and improve decision making. The review of past investigations, such as the Software Engineering Institute's Capability Maturity Model Integration and the National Aeronautics and Space Administration and Department of Defense Technology Readiness Levels, indicates that a more restricted, more interpretable method is needed to assess the maturity of an M&S effort. The PCMM addresses six contributing elements to M&S: (1) representation and geometric fidelity, (2) physics and material model fidelity, (3) code verification, (4) solution verification, (5) model validation, and (6) uncertainty quantification and sensitivity analysis. For each of these elements, attributes are identified that characterize four increasing levels of maturity. Importantly, the PCMM is a structured method for assessing the maturity of an M&S effort that is directed toward an engineering application of interest. The PCMM does not assess whether the M&S effort, the accuracy of the predictions, or the performance of the engineering system satisfies or does not satisfy specified application requirements.
Li, Tingwen; Rogers, William A.; Syamlal, Madhava; ...
2016-07-29
Here, the MFiX suite of multiphase computational fluid dynamics (CFD) codes is being developed at U.S. Department of Energy's National Energy Technology Laboratory (NETL). It includes several different approaches to multiphase simulation: MFiX-TFM, a two-fluid (Eulerian–Eulerian) model; MFiX-DEM, an Eulerian fluid model with a Lagrangian Discrete Element Model for the solids phase; and MFiX-PIC, an Eulerian fluid model with Lagrangian particle ‘parcels’ representing particle groups. These models are undergoing continuous development and application, with verification, validation, and uncertainty quantification (VV&UQ) as integrated activities. After a brief summary of recent progress in VV&UQ, this article highlights two recent accomplishments in the application of MFiX-TFM to fossil energy technology development. First, recent application of MFiX to the pilot-scale KBR TRIG™ Transport Gasifier located at DOE's National Carbon Capture Center (NCCC) is described. Gasifier performance over a range of operating conditions was modeled and compared to NCCC operational data to validate the ability of the model to predict parametric behavior. Second, comparison of code predictions at a detailed fundamental scale is presented, studying solid sorbents for the post-combustion capture of CO2 from flue gas. Specifically designed NETL experiments are being used to validate hydrodynamics and chemical kinetics for the sorbent-based carbon capture process.
MPLM Donatello is offloaded at the SLF
NASA Technical Reports Server (NTRS)
2001-01-01
At the Shuttle Landing Facility, workers in cherry pickers (left and right) help direct the offloading of the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
MPLM Donatello is offloaded at the SLF
NASA Technical Reports Server (NTRS)
2001-01-01
At the Shuttle Landing Facility, cranes are poised to help offload the Italian Space Agency's Multi-Purpose Logistics Module Donatello from the Airbus "Beluga" air cargo plane that brought it from the factory of Alenia Aerospazio in Turin, Italy. The third of three for the International Space Station, the module will be transported to the Space Station Processing Facility for processing. Among the activities for the payload test team are integrated electrical tests with other Station elements in the SSPF, leak tests, electrical and software compatibility tests with the Space Shuttle (using the Cargo Integrated Test equipment) and an Interface Verification Test once the module is installed in the Space Shuttle's payload bay at the launch pad. The most significant mechanical task to be performed on Donatello in the SSPF is the installation and outfitting of the racks for carrying the various experiments and cargo.
Advanced Plant Habitat Test Harvest
2017-08-24
John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, opens the door to the growth chamber of the Advanced Plant Habitat (APH) Flight Unit No. 1 for a test harvest of half of the Arabidopsis thaliana plants growing within. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.
Advanced Plant Habitat Test Harvest
2017-08-24
John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, places Arabidopsis thaliana plants harvested from the Advanced Plant Habitat (APH) Flight Unit No. 1 into a Mini ColdBag that quickly freezes the plants. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.
Advanced Plant Habitat Test Harvest
2017-08-24
John "JC" Carver, a payload integration engineer with NASA Kennedy Space Center's Test and Operations Support Contract, places Arabidopsis thaliana plants harvested from the Advanced Plant Habitat (APH) Flight Unit No. 1 into an Ultra-low Freezer chilled to -150 degrees Celsius. The harvest is part of an ongoing verification test of the APH unit, which is located inside the International Space Station Environmental Simulator in Kennedy's Space Station Processing Facility. The APH undergoing testing at Kennedy is identical to one on the station and uses red, green and broad-spectrum white LED lights to grow plants in an environmentally controlled chamber. The seeds grown during the verification test will be grown on the station to help scientists understand how these plants adapt to spaceflight.
NASA Astrophysics Data System (ADS)
Podkościelny, P.; Nieszporek, K.
2007-01-01
The surface heterogeneity of activated carbons is usually characterized by adsorption energy distribution (AED) functions, which can be estimated from experimental adsorption isotherms by inverting an integral equation. The experimental data for phenol adsorption from aqueous solution on activated carbons prepared from polyacrylonitrile (PAN) and polyethylene terephthalate (PET) have been taken from the literature. The AED functions for phenol adsorption, generated by application of the regularization method, have been verified using the Grand Canonical Monte Carlo (GCMC) simulation technique as the verification tool. The definitive stage of verification was the comparison of experimental adsorption data with those obtained from the GCMC simulations. The information necessary for performing the simulations was provided by the parameters of the AED functions calculated by the regularization method.
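Recovering an AED means inverting the adsorption integral equation θ(p) = ∫ θ_local(p, ε) f(ε) dε, an ill-posed problem that the regularization method stabilizes; the sketch below shows a discretized Tikhonov inversion with a Langmuir local isotherm as the kernel (grids, parameters and the synthetic data are illustrative, not the paper's):

    import numpy as np

    # Discretize energies and pressures (illustrative grids)
    energies = np.linspace(5.0, 30.0, 60)       # adsorption energy grid, kJ/mol
    pressures = np.logspace(-4, 0, 40)          # relative pressure grid
    RT = 2.5                                    # kJ/mol, roughly room temperature

    # Langmuir local isotherm as kernel: K[i, j] = theta_local(p_i, e_j)
    b = np.exp(energies / RT)[None, :]
    K = (b * pressures[:, None]) / (1.0 + b * pressures[:, None])

    # Synthetic "measured" isotherm from a Gaussian AED, plus noise
    f_true = np.exp(-((energies - 15.0) / 3.0) ** 2)
    theta = K @ f_true + np.random.default_rng(1).normal(0, 1e-3, pressures.size)

    # Tikhonov regularization: minimize ||K f - theta||^2 + alpha ||f||^2
    alpha = 1e-3
    f_est = np.linalg.solve(K.T @ K + alpha * np.eye(energies.size), K.T @ theta)
    print("peak energy (true vs estimated):",
          energies[f_true.argmax()], energies[f_est.argmax()])

The regularization parameter alpha trades fidelity to the noisy isotherm against smoothness of the recovered distribution, which is why an independent check such as GCMC simulation is a valuable verification step.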
Development tests for the 2.5 megawatt Mod-2 wind turbine generator
NASA Technical Reports Server (NTRS)
Andrews, J. S.; Baskin, J. M.
1982-01-01
The 2.5-megawatt MOD-2 wind turbine generator test program is discussed. The development of the 2.5-megawatt MOD-2 wind turbine generator included an extensive program of testing which encompassed verification of analytical procedures, component development, and integrated system verification. The test program was intended to assure achievement of the thirty-year design operational life of the wind turbine system as well as to minimize costly design modifications which would otherwise have been required during on-site system testing. Computer codes were modified, the fatigue life of structural and dynamic components was verified, mechanical and electrical components and subsystems were functionally checked and modified where necessary to meet system specifications, and measured dynamic responses of coupled systems confirmed analytical predictions.
NASA Technical Reports Server (NTRS)
Page, J.
1981-01-01
The effects of an independent verification and integration (V and I) methodology on one class of application are described. Resource profiles are discussed and the development environment is reviewed. Seven measures are presented to test the hypothesis that V and I improve development and the product. The V and I methodology provided: (1) a decrease in requirements ambiguities and misinterpretation; (2) no decrease in design errors; (3) no decrease in the cost of correcting errors; (4) a decrease in the cost of system and acceptance testing; (5) an increase in early discovery of errors; (6) no improvement in the quality of software put into operation; and (7) a decrease in productivity and an increase in cost.
Commander Lousma works with EEVT experiment and cryogenic tube on aft middeck
1982-03-31
Commander Jack Lousma works with Electrophoresis Equipment Verification Test (EEVT) electrophoresis unit, cryogenic freezer and tube, and stowage locker equipment located on crew compartment middeck aft bulkhead.
ERIC Educational Resources Information Center
Gollin, George D.
2009-01-01
The global demand for higher education currently exceeds the world's existing university capacity. This shortfall is likely to persist for the foreseeable future, raising concerns that frustrated students might choose to purchase fraudulent credentials from counterfeiters or diploma mills. International efforts to encourage the development of…
ERIC Educational Resources Information Center
London, Manuel; Polzer, Jeffrey T.; Omoregie, Heather
2005-01-01
This article presents a multilevel model of group learning that focuses on antecedents and consequences of interpersonal congruence, transactive memory, and feedback processes. The model holds that members' self-verification motives and situational conditions (e.g., member diversity and task demands) give rise to identity negotiation behaviors…
ERIC Educational Resources Information Center
Ding, Lin
2014-01-01
This study seeks to test the causal influences of reasoning skills and epistemologies on student conceptual learning in physics. A causal model, integrating multiple variables that were investigated separately in the prior literature, is proposed and tested through path analysis. These variables include student preinstructional reasoning skills…