Sample records for final verification success

  1. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties using temporal logic and model checking. It is then decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.
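
    As a minimal illustration of the kind of property checking described above, the sketch below explores the reachability graph of a small place/transition Petri net and checks a safety invariant over every reachable marking. This is hypothetical Python, not the authors' tool; the net and the property are invented for illustration.

      from collections import deque

      # A marking maps each place to its token count.
      # Each transition is (pre, post): input places consumed, output places produced.
      transitions = {
          "t1": ({"p_idle": 1}, {"p_busy": 1}),
          "t2": ({"p_busy": 1}, {"p_idle": 1}),
      }

      def enabled(marking, pre):
          # A transition is enabled if every input place holds enough tokens.
          return all(marking.get(p, 0) >= n for p, n in pre.items())

      def fire(marking, pre, post):
          m = dict(marking)
          for p, n in pre.items():
              m[p] = m.get(p, 0) - n
          for p, n in post.items():
              m[p] = m.get(p, 0) + n
          return m

      def check_invariant(initial, invariant):
          # Breadth-first exploration of the reachability graph; returns a
          # counterexample marking if the invariant fails, else None.
          seen, queue = set(), deque([initial])
          while queue:
              m = queue.popleft()
              key = frozenset(m.items())
              if key in seen:
                  continue
              seen.add(key)
              if not invariant(m):
                  return m
              for pre, post in transitions.values():
                  if enabled(m, pre):
                      queue.append(fire(m, pre, post))
          return None

      # Safety property (an AG-style invariant): p_busy never holds 2 tokens.
      bad = check_invariant({"p_idle": 1}, lambda m: m.get("p_busy", 0) < 2)
      print("invariant holds" if bad is None else f"violated at {bad}")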

  2. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties using temporal logic and model checking. It is then decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.

  3. Options and Risk for Qualification of Electric Propulsion System

    NASA Technical Reports Server (NTRS)

    Bailey, Michelle; Daniel, Charles; Cook, Steve (Technical Monitor)

    2002-01-01

    Electric propulsion vehicle systems encompass a wide range of propulsion alternatives, including solar and nuclear, which present unique circumstances for qualification. This paper will address the alternatives for qualification of electric propulsion spacecraft systems. The approach taken will be to address the considerations for qualification at the various levels of systems definition. Additionally, for each level of qualification the system-level risk implications will be developed. The paper will also explore the implications of analysis versus test at various levels of systems definition, while retaining the objectives of a verification program. The limitations of terrestrial testing will be explored along with the risk and implications of orbital demonstration testing. The paper will seek to develop a template for structuring a verification program based on cost, risk and value return. A successful verification program should establish controls and define objectives of the verification compliance program. Finally, the paper will seek to address the political and programmatic factors which may impact options for system verification.

  4. Cleared for Launch - Lessons Learned from the OSIRIS-REx System Requirements Verification Program

    NASA Technical Reports Server (NTRS)

    Stevens, Craig; Adams, Angela; Williams, Bradley; Goodloe, Colby

    2017-01-01

    Requirements verification of a large flight system is a challenge. It is especially challenging for engineers taking on their first role in space systems engineering. This paper describes our approach to verification of the Origins, Spectral Interpretation, Resource Identification, Security-Regolith Explorer (OSIRIS-REx) system requirements. It also captures lessons learned along the way by the developing systems engineers embroiled in this process. We begin with an overview of the mission and science objectives as well as the project requirements verification program strategy. A description of the requirements flow down is presented, including our implementation for managing the thousands of program and element level requirements and associated verification data. We discuss both successes and methods to improve the management of this data across multiple organizational interfaces. Our approach to verifying system requirements at multiple levels of assembly is presented using examples from our work at instrument, spacecraft, and ground segment levels. We include a discussion of system end-to-end testing limitations and their impacts to the verification program. Finally, we describe lessons learned that are applicable to all emerging space systems engineers, using our unique perspectives across multiple organizations of a large NASA program.

  5. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be iterative, with verification activities interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  6. 78 FR 52085 - VA Veteran-Owned Small Business Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-22

    ... DEPARTMENT OF VETERANS AFFAIRS 38 CFR Part 74 RIN 2900-AO49 VA Veteran-Owned Small Business Verification Guidelines AGENCY: Department of Veterans Affairs. ACTION: Final rule. SUMMARY: This document... Domestic Assistance This final rule affects the verification guidelines of veteran- owned small businesses...

  7. Cosmic Ray Muon Imaging of Spent Nuclear Fuel in Dry Storage Casks

    DOE PAGES

    Durham, J. Matthew; Guardincerri, Elena; Morris, Christopher L.; ...

    2016-04-29

    In this paper, cosmic ray muon radiography has been used to identify the absence of spent nuclear fuel bundles inside a sealed dry storage cask. The large amounts of shielding that dry storage casks use to contain radiation from the highly radioactive contents impede typical imaging methods, but the penetrating nature of cosmic ray muons allows them to be used as an effective radiographic probe. This technique was able to successfully identify missing fuel bundles inside a sealed Westinghouse MC-10 cask. This method of fuel cask verification may prove useful for international nuclear safeguards inspectors. Finally, muon radiography may find other safety, security or safeguards applications, such as arms control verification.

  8. Use of maxillofacial laboratory materials to construct a tissue-equivalent head phantom with removable titanium implantable devices for use in verification of the dose of intensity-modulated radiotherapy.

    PubMed

    Morris, K

    2017-06-01

    The dose of radiotherapy is often verified by measuring the dose of radiation at specific points within a phantom. The presence of high-density implant materials such as titanium, however, may cause complications both during calculation and delivery of the dose. Numerous studies have reported photon/electron backscatter and alteration of the dose by high-density implants, but we know of no evidence of a dosimetry phantom that incorporates high-density implants or fixtures. The aim of the study was to design and manufacture a tissue-equivalent head phantom for use in verification of the dose in radiotherapy, using a combination of traditional laboratory materials and techniques and 3-dimensional technology, that can incorporate titanium maxillofacial devices. Digital designs were used together with Mimics® 18.0 (Materialise NV) and FreeForm® software. DICOM data were downloaded and manipulated into the final pieces of the phantom mould. Three-dimensional digital objects were converted into STL files and exported for additional stereolithography. Phantoms were constructed in four stages: material testing and selection, design of a 3-dimensional mould, manufacture of implants, and final fabrication of the phantom using traditional laboratory techniques. Three tissue-equivalent materials were found and used to successfully manufacture a suitable phantom with interchangeable sections that contained three versions of titanium maxillofacial implants. Maxillofacial and other materials can be used to successfully construct a head phantom with interchangeable titanium implant sections for use in verification of doses of radiotherapy.

  9. Verification of operational solar flare forecast: Case of Regional Warning Center Japan

    NASA Astrophysics Data System (ADS)

    Kubo, Yûki; Den, Mitsue; Ishii, Mamoru

    2017-08-01

    In this article, we discuss a verification study of an operational solar flare forecast in the Regional Warning Center (RWC) Japan. The RWC Japan has been issuing four-categorical deterministic solar flare forecasts for a long time. In this forecast verification study, we used solar flare forecast data accumulated over 16 years (from 2000 to 2015). We compiled the forecast data together with solar flare data obtained with the Geostationary Operational Environmental Satellites (GOES). Using the compiled data sets, we estimated some conventional scalar verification measures with 95% confidence intervals. We also estimated a multi-categorical scalar verification measure. These scalar verification measures were compared with those obtained by the persistence method and recurrence method. As solar activity varied during the 16 years, we also applied verification analyses to four subsets of forecast-observation pair data with different solar activity levels. We cannot conclude definitely that there are significant performance differences between the forecasts of RWC Japan and the persistence method, although a slightly significant difference is found for some event definitions. We propose to use a scalar verification measure to assess the judgment skill of the operational solar flare forecast. Finally, we propose a verification strategy for deterministic operational solar flare forecasting. For dichotomous forecasts, the proposed verification measures are frequency bias for bias; proportion correct and critical success index for accuracy; probability of detection for discrimination; false alarm ratio for reliability; Peirce skill score for forecast skill; and symmetric extremal dependence index for association. For multi-categorical forecasts, we propose the marginal distributions of forecast and observation for bias; proportion correct for accuracy; correlation coefficient and joint probability distribution for association; the likelihood distribution for discrimination; the calibration distribution for reliability and resolution; and the Gandin-Murphy-Gerrity score and judgment skill score for skill.
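
    For the dichotomous case, the scalar measures named above all follow from a 2x2 contingency table of forecasts against observed flare events. A minimal sketch in Python; the counts are invented for illustration, not RWC Japan data:

      # 2x2 contingency table for a dichotomous flare forecast
      # (hypothetical counts, for illustration only).
      hits, false_alarms, misses, correct_negatives = 42, 23, 18, 917
      n = hits + false_alarms + misses + correct_negatives

      frequency_bias = (hits + false_alarms) / (hits + misses)     # bias
      proportion_correct = (hits + correct_negatives) / n          # accuracy
      csi = hits / (hits + misses + false_alarms)                  # critical success index
      pod = hits / (hits + misses)                                 # discrimination
      far = false_alarms / (hits + false_alarms)                   # reliability
      pofd = false_alarms / (false_alarms + correct_negatives)
      peirce = pod - pofd                                          # Peirce skill score

      for name, value in [("bias", frequency_bias), ("PC", proportion_correct),
                          ("CSI", csi), ("POD", pod), ("FAR", far), ("PSS", peirce)]:
          print(f"{name}: {value:.3f}")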

  10. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Final technical report, University of Washington, March 2016, on the project Verification Games: Crowd-Sourced Formal Verification (contract number FA8750...; dates covered June 2012 - September 2015). From the abstract: "Over the more than three years of the project Verification Games: Crowd-sourced...

  11. Grazing-Angle Fourier Transform Infrared Spectroscopy for Surface Cleanliness Verification

    DTIC Science & Technology

    2003-03-01

    ...coating. North Island personnel were also interested in using the portable FTIR instrument to detect a trivalent chromium conversion coating on... trivalent chromium coating on aluminum panels. Following the successful field-test at NADEP North Island in December 2000, a second demonstration of... contaminated, the panels were allowed to dry under a fume hood to evaporate the solvent. They were then placed in a desiccator for final drying.

  12. Design and Mechanical Evaluation of a Capacitive Sensor-Based Indexed Platform for Verification of Portable Coordinate Measuring Instruments

    PubMed Central

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-01

    In recent years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility in accomplishing in-line measuring tasks as well as their reduced cost and operational advantages compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on capturing data with the measuring instrument from a calibrated gauge object fixed successively in various positions, so that most of the instrument measuring volume is covered; this results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification procedures for the geometrical parameters of PCMMs. The IMP allows the calibrated gauge object to be fixed while the measuring instrument is moved in such a way that most of the instrument working volume is covered, reducing the time and operator fatigue required to carry out these types of procedures. PMID:24451458

  13. Design and mechanical evaluation of a capacitive sensor-based indexed platform for verification of portable coordinate measuring instruments.

    PubMed

    Avila, Agustín Brau; Mazo, Jorge Santolaria; Martín, Juan José Aguilar

    2014-01-02

    In recent years, the use of Portable Coordinate Measuring Machines (PCMMs) in industry has increased considerably, mostly due to their flexibility in accomplishing in-line measuring tasks as well as their reduced cost and operational advantages compared to traditional coordinate measuring machines (CMMs). However, their operation has a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on capturing data with the measuring instrument from a calibrated gauge object fixed successively in various positions, so that most of the instrument measuring volume is covered; this results in time-consuming, tedious and expensive verification procedures. In this work the mechanical design of an indexed metrology platform (IMP) is presented. The aim of the IMP is to increase the final accuracy and to radically simplify the calibration, identification and verification procedures for the geometrical parameters of PCMMs. The IMP allows the calibrated gauge object to be fixed while the measuring instrument is moved in such a way that most of the instrument working volume is covered, reducing the time and operator fatigue required to carry out these types of procedures.

  14. INF and IAEA: A comparative analysis of verification strategy. [Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.

  15. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon formal requirements, including changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, are discussed.
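
    A plan of this kind is commonly captured as a verification matrix mapping every requirement to a method and a responsible organization. A minimal sketch of that structure in Python; the field names and sample requirements are invented, not from the presentation:

      from dataclasses import dataclass

      # The four verification methods named above.
      METHODS = {"test", "inspection", "analysis", "demonstration"}

      @dataclass
      class Requirement:
          req_id: str
          text: str
          method: str          # one of METHODS
          owner: str           # responsible organization
          verified: bool = False

      def plan_is_complete(requirements):
          # Every requirement needs a recognized method and a named owner.
          return all(r.method in METHODS and r.owner for r in requirements)

      reqs = [
          Requirement("SYS-001", "Operate from a 28 V +/- 4 V bus", "test", "Avionics"),
          Requirement("SYS-002", "Total mass below 150 kg", "analysis", "Systems"),
      ]
      assert plan_is_complete(reqs)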

  16. VINCI: the VLT Interferometer commissioning instrument

    NASA Astrophysics Data System (ADS)

    Kervella, Pierre; Coudé du Foresto, Vincent; Glindemann, Andreas; Hofmann, Reiner

    2000-07-01

    The Very Large Telescope Interferometer (VLTI) is a complex system made of a large number of separate elements. To prepare for early successful operation, it will require a period of extensive testing and verification to ensure that the many devices involved work properly together and can produce meaningful data. This paper describes the concept chosen for the VLTI commissioning instrument, LEONARDO da VINCI, and details its functionalities. It is a fiber-based two-way beam combiner, associated with an artificial star and an alignment verification unit. The technical commissioning of the VLTI is foreseen as a stepwise process: fringes will first be obtained with the commissioning instrument in an autonomous mode (no other parts of the VLTI involved); then the VLTI telescopes and optical trains will be tested in autocollimation; finally, fringes will be observed on the sky.

  17. 76 FR 50164 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-12

    ...-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing... correct certain portions of the Protocol Gas Verification Program and Minimum Competency Requirements for... final rule that amends the Agency's Protocol Gas Verification Program (PGVP) and the minimum competency...

  18. Overview of RICOR's reliability theoretical analysis, accelerated life demonstration test results and verification by field data

    NASA Astrophysics Data System (ADS)

    Vainshtein, Igor; Baruch, Shlomi; Regev, Itai; Segal, Victor; Filis, Avishai; Riabzev, Sergey

    2018-05-01

    The growing demand for EO applications that work around the clock, 24 hours a day and 7 days a week, such as border surveillance systems, emphasizes the need for a highly reliable cryocooler with increased operational availability and optimized system-level Integrated Logistic Support (ILS). To meet this need, RICOR developed linear and rotary cryocoolers which successfully achieved this goal. Cryocooler MTTF was analyzed by theoretical reliability evaluation methods, demonstrated by normal and accelerated life tests at the cryocooler level, and finally verified by analysis of field data from cryocoolers operating at the system level. The following paper reviews theoretical reliability analysis methods together with reliability test results derived from standard and accelerated life demonstration tests performed at RICOR's advanced reliability laboratory. To summarize the work process, reliability verification data are presented as feedback from fielded systems.

  19. 76 FR 23861 - Documents Acceptable for Employment Eligibility Verification; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-29

    ... Documents Acceptable for Employment Eligibility Verification; Correction AGENCY: U.S. Citizenship and... titled Documents Acceptable for Employment Eligibility Verification published in the Federal Register on... a final rule in the Federal Register at 76 FR 21225 establishing Documents Acceptable for Employment...

  20. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science, which identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake in the coming decade. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw states. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component in systems engineering and is vital to the success of any space mission. V&V is a process that is used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.

  1. A tool for hearing aid and cochlear implant users to judge the usability of cellular telephones in field conditions

    NASA Astrophysics Data System (ADS)

    Deer, Maria Soledad

    The auditory experience of using a hearing aid or a cochlear implant simultaneously with a cell phone is driven by a number of factors: radiofrequency and baseband interference, speech intelligibility, sound quality, handset design, volume control and signal strength. The purpose of this study was to develop a tool to be used by hearing aid and cochlear implant users in retail stores as they try cell phones before buying them. The tool is meant to be an efficient, practical and systematic consumer selection aid that captures and documents information on all the domains that play a role in the auditory experience of using a cell phone with a hearing aid or cochlear implant. The development of this consumer tool involved three steps: preparation, verification and measurement of success according to a predefined criterion. First, the consumer tool, consisting of a comparison chart and speech material, was prepared. Second, the consumer tool was evaluated by groups of subjects in a two-step verification process: Phase I was conducted in a controlled setting, followed by Phase II in real-world (field) conditions. To perform a systematic evaluation of the consumer tool, two questionnaires were developed, one for each phase. Both questionnaires involved five quantitative variables scored with rating scales. These ratings were averaged, yielding an Overall Consumer Performance Score, and a qualitative performance category corresponding to the Mean Opinion Score (MOS) was allocated to each final score on a scale ranging from 1 to 5 (where 5 = excellent and 1 = bad). Finally, the consumer tool development was determined to be successful if at least 80% of the participants in verification Phase II rated the comparison chart as excellent or good according to the qualitative MOS score. The results for verification Phase II (field conditions) indicated that the Overall Consumer Performance Score for 92% of the subjects (11/12) was 3.7 or above, corresponding to the Good and Excellent MOS qualitative categories. It was concluded that this is a practical and efficient tool for hearing aid/cochlear implant users as they approach a cell phone selection process.
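
    A worked sketch of the scoring scheme described above, in Python; the ratings are invented for illustration:

      # Hypothetical ratings for one subject on the five quantitative
      # variables, each on the 1-5 MOS scale (5 = excellent, 1 = bad).
      ratings = [4, 5, 3, 4, 4]
      overall = sum(ratings) / len(ratings)   # Overall Consumer Performance Score

      categories = {5: "excellent", 4: "good", 3: "fair", 2: "poor", 1: "bad"}
      print(overall, categories[round(overall)])   # 4.0 good
      # A score of 3.7 or above falls in the good/excellent range that
      # counted toward the 80% success criterion.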

  2. Experimental validation of thermo-chemical algorithm for a simulation of pultrusion processes

    NASA Astrophysics Data System (ADS)

    Barkanov, E.; Akishin, P.; Miazza, N. L.; Galvez, S.; Pantelelis, N.

    2018-04-01

    To provide a better understanding of pultrusion processes with or without temperature control, and to support pultrusion tooling design, an algorithm based on a mixed time integration scheme and the nodal control volumes method has been developed. In the present study its experimental validation is carried out using newly developed cure sensors that measure electrical resistivity and temperature on the profile surface. Through this verification process the set of initial data used for simulating the pultrusion of a rod profile has been successfully corrected and finally defined.

  3. Biological weapons and bioterrorism in the first years of the twenty-first century.

    PubMed

    Leitenberg, Milton

    2002-09-01

    This paper evaluates four recent developments in biological-weapons politics and bioterrorism. First is American opposition to finalization of a verification protocol for the Biological Weapons Convention; second, a successful attempt at mass-casualty terrorism; third, an ongoing investigation into the bioterrorist capabilities of the al Qaeda network; and, fourth, a series of fatal anthrax attacks in the United States. The first of these evaluations is informed by interviews conducted between 2000 and 2002 with policy principals in the United States and elsewhere.

  4. Formal methods technology transfer: Some lessons learned

    NASA Technical Reports Server (NTRS)

    Hamilton, David

    1992-01-01

    IBM has a long history in the application of formal methods to software development and verification. There have been many successes in the development of methods, tools and training to support formal methods. And formal methods have been very successful on several projects. However, the use of formal methods has not been as widespread as hoped. This presentation summarizes several approaches that have been taken to encourage more widespread use of formal methods, and discusses the results so far. The basic problem is one of technology transfer, which is a very difficult problem. It is even more difficult for formal methods. General problems of technology transfer, especially the transfer of formal methods technology, are also discussed. Finally, some prospects for the future are mentioned.

  5. 76 FR 75994 - Homeless Emergency Assistance and Rapid Transition to Housing: Defining “Homeless”

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-05

    .... Verification of homeless status by providers serving individuals and families fleeing, or attempting to flee... final rule imposes additional verification requirements for oral statements by individuals or families... includes: (1) Written verification from a professional who is licensed by the state to diagnose and treat...

  6. Using Small-Step Refinement for Algorithm Verification in Computer Science Education

    ERIC Educational Resources Information Center

    Simic, Danijela

    2015-01-01

    Stepwise program refinement techniques can be used to simplify program verification. Programs are better understood since their main properties are clearly stated, and verification of rather complex algorithms is reduced to proving simple statements connecting successive program specifications. Additionally, it is easy to analyse similar…

  7. Improved Detection Technique for Solvent Rinse Cleanliness Verification

    NASA Technical Reports Server (NTRS)

    Hornung, S. D.; Beeson, H. D.

    2001-01-01

    The NASA White Sands Test Facility (WSTF) has an ongoing effort to reduce or eliminate usage of cleaning solvents such as CFC-113 and its replacements. These solvents are used in the final clean and cleanliness verification processes for flight and ground support hardware, especially for oxygen systems where organic contaminants can pose an ignition hazard. For the final cleanliness verification in the standard process, the equivalent of one square foot of surface area of parts is rinsed with the solvent, and the final 100 mL of the rinse is captured. The amount of nonvolatile residue (NVR) in the solvent is determined by weight after the evaporation of the solvent. An improved process of sampling this rinse, developed at WSTF, requires evaporation of less than 2 mL of the solvent to make the cleanliness verification. Small amounts of the solvent are evaporated in a clean stainless steel cup, and the cleanliness of the stainless steel cup is measured using a commercially available surface quality monitor. The effectiveness of this new cleanliness verification technique was compared to the accepted NVR sampling procedures. Testing with known contaminants in solution, such as hydraulic fluid, fluorinated lubricants, and cutting and lubricating oils, was performed to establish a correlation between amount in solution and the process response. This report presents the approach and results and discusses the issues in establishing the surface quality monitor-based cleanliness verification.
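
    The gravimetric NVR determination described above reduces to a simple calculation; a minimal sketch in Python, with invented masses for illustration:

      # Gravimetric NVR from the captured final rinse (values invented).
      surface_area_ft2 = 1.0            # one square foot of parts rinsed
      cup_mass_before_mg = 25_000.00    # clean evaporation cup
      cup_mass_after_mg = 25_000.35     # after evaporating the captured solvent

      nvr_mg = cup_mass_after_mg - cup_mass_before_mg
      print(f"NVR: {nvr_mg / surface_area_ft2:.2f} mg/ft^2")  # compare to cleanliness spec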

  8. VEG-01: Veggie Hardware Verification Testing

    NASA Technical Reports Server (NTRS)

    Massa, Gioia; Newsham, Gary; Hummerick, Mary; Morrow, Robert; Wheeler, Raymond

    2013-01-01

    The Veggie plant/vegetable production system is scheduled to fly on ISS at the end of 2013. Since much of the technology associated with Veggie has not been previously tested in microgravity, a hardware validation flight was initiated. This test will allow data to be collected about Veggie hardware functionality on ISS, allow crew interactions to be vetted for future improvements, validate the ability of the hardware to grow and sustain plants, and collect data that will be helpful to future Veggie investigators as they develop their payloads. Additionally, food safety data on the lettuce plants grown will be collected to help support the development of a pathway for the crew to safely consume produce grown on orbit. Significant background research has been performed on the Veggie plant growth system, with early tests focusing on the development of the rooting pillow concept and the selection of fertilizer, rooting medium and plant species. More recent testing has been conducted to integrate the pillow concept into the Veggie hardware and to ensure that adequate water is provided throughout the growth cycle. Seed sanitation protocols have been established for flight, and hardware sanitation between experiments has been studied. Methods for shipping and storage of rooting pillows and the development of crew procedures and crew training videos for plant activities on-orbit have been established. Science verification testing was conducted: lettuce plants were successfully grown in prototype Veggie hardware, microbial samples were taken, and plants were harvested, frozen, stored and later analyzed for microbial growth, nutrients, and ATP levels. An additional verification test, prior to the final payload verification testing, is desired to demonstrate similar growth in the flight hardware and also to test a second set of pillows containing zinnia seeds. Issues with root mat water supply are being resolved, with final testing and flight scheduled for later in 2013.

  9. Verification of S&D Solutions for Network Communications and Devices

    NASA Astrophysics Data System (ADS)

    Rudolph, Carsten; Compagna, Luca; Carbone, Roberto; Muñoz, Antonio; Repp, Jürgen

    This chapter describes the tool-supported verification of S&D Solutions at the level of network communications and devices. First, the general goals and challenges of verification in the context of AmI systems are highlighted, and the role of verification and validation within the SERENITY processes is explained. Then, SERENITY extensions to the SH Verification tool are explained using small examples. Finally, the applicability of existing verification tools is discussed in the context of the AVISPA toolset. The two different tools show that relevant, complementary approaches exist for the security analysis of S&D Patterns for networks and devices, and that they can be used.

  10. Probabilistic Sizing and Verification of Space Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit

    2012-07-01

    Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
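
    The Weibull approach referred to above ties the failure probability of a ceramic part to the applied stress through the flaw-population parameters. In its simplest uniaxial, uniform-stress form (a standard textbook formulation, not Astrium's specific model):

      % Two-parameter Weibull probability of failure for volume V under
      % uniform tensile stress \sigma (m: Weibull modulus; \sigma_0, V_0:
      % characteristic strength and reference volume):
      P_f(\sigma) = 1 - \exp\!\left[-\frac{V}{V_0}\left(\frac{\sigma}{\sigma_0}\right)^{m}\right]
      % Proof testing at stress \sigma_p screens out the weakest parts; a
      % survivor's conditional failure probability at service stress
      % \sigma > \sigma_p becomes:
      P_f(\sigma \mid \text{survived } \sigma_p)
        = \frac{P_f(\sigma) - P_f(\sigma_p)}{1 - P_f(\sigma_p)}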

  11. Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.

    PubMed

    Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L

    1995-01-01

    This work presents the verification of an on-line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification was made in the form of an intercomparison with two independent thermoluminescent (TLD) dosimetry systems under TBI conditions.
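
    The signal-to-dose conversion mentioned above is typically a calibration factor applied together with empirical correction factors; a minimal sketch in Python, with invented names and values, not the authors' software:

      # Convert an in vivo diode reading to absorbed dose (illustrative only).
      def diode_dose_cgy(charge_nc, cal_cgy_per_nc,
                         k_temp=1.0, k_ssd=1.0, k_field=1.0):
          # cal_cgy_per_nc: calibration factor against an ionization chamber;
          # k_*: empirical corrections (temperature, distance, field size).
          return charge_nc * cal_cgy_per_nc * k_temp * k_ssd * k_field

      print(diode_dose_cgy(41.8, 2.87, k_temp=1.02))  # about 122 cGy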

  12. Correlation of DCPI with deformation under proof roller loading to assess soft subgrade stabilization criterion : addendum to NCDOT final report 2011-05, entitled : "Field verification of undercut criteria and alternatives for subgrade stabilization in th

    DOT National Transportation Integrated Search

    2016-11-21

    Work presented herein is an addendum to the final report for NCDOT Project 2011-05, entitled "Field Verification of Undercut Criteria and Alternatives for Subgrade Stabilization in the Piedmont Area." The objective of the addendum work is to p...

  13. Specification and Verification of Web Applications in Rewriting Logic

    NASA Astrophysics Data System (ADS)

    Alpuente, María; Ballis, Demis; Romero, Daniel

    This paper presents a Rewriting Logic framework that formalizes the interactions between Web servers and Web browsers through a communicating protocol abstracting HTTP. The proposed framework includes a scripting language that is powerful enough to model the dynamics of complex Web applications by encompassing the main features of the most popular Web scripting languages (e.g. PHP, ASP, Java Servlets). We also provide a detailed characterization of browser actions (e.g. forward/backward navigation, page refresh, and new window/tab openings) via rewrite rules, and show how our models can be naturally model-checked by using the Linear Temporal Logic of Rewriting (LTLR), which is a Linear Temporal Logic specifically designed for model-checking rewrite theories. Our formalization is particularly suitable for verification purposes, since it allows one to perform in-depth analyses of many subtle aspects related to Web interaction. Finally, the framework has been completely implemented in Maude, and we report on some successful experiments that we conducted by using the Maude LTLR model-checker.

  14. James Webb Space Telescope (JWST) Integrated Science Instruments Module (ISIM) Cryo-Vacuum (CV) Test Campaign Summary

    NASA Technical Reports Server (NTRS)

    Yew, Calinda; Whitehouse, Paul; Lui, Yan; Banks, Kimberly

    2016-01-01

    JWST Integrated Science Instruments Module (ISIM) has completed its system-level testing program at the NASA Goddard Space Flight Center (GSFC). In March 2016, ISIM was successfully delivered for integration with the Optical Telescope Element (OTE) after the successful verification of the system through a series of three cryo-vacuum (CV) tests. The first test served as a risk reduction test; the second test provided the initial verification of the fully-integrated flight instruments; and the third test verified the system in its final flight configuration. The complexity of the mission has generated challenging requirements that demand highly reliable system performance and capabilities from the Space Environment Simulator (SES) vacuum chamber. As JWST progressed through its CV testing campaign, deficiencies in the test configuration and support equipment were uncovered from one test to the next. Subsequent upgrades and modifications were implemented to improve the facility support capabilities required to achieve test requirements. This paper: (1) provides an overview of the integrated mechanical and thermal facility systems required to achieve the objectives of JWST ISIM testing, (2) compares the overall facility performance and instrumentation results from the three ISIM CV tests, and (3) summarizes lessons learned from the ISIM testing campaign.

  15. INF and IAEA: A comparative analysis of verification strategy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Scheinman, L.; Kratzer, M.

    1992-07-01

    This is the final report of a study on the relevance and possible lessons of Intermediate Range Nuclear Force (INF) verification to the International Atomic Energy Agency (IAEA) international safeguards activities.

  16. A field study of the accuracy and reliability of a biometric iris recognition system.

    PubMed

    Latman, Neal S; Herb, Emily

    2013-06-01

    The iris of the eye appears to satisfy the criteria for a good anatomical characteristic for use in a biometric system. The purpose of this study was to evaluate a biometric iris recognition system: Mobile-Eyes™. The enrollment, verification, and identification applications were evaluated in a field study for accuracy and reliability using both irises of 277 subjects. Independent variables included a wide range of subject demographics, ambient light, and ambient temperature. A sub-set of 35 subjects had alcohol-induced nystagmus. There were 2710 identification and verification attempts, which resulted in 1,501,340 and 5540 iris comparisons respectively. In this study, the system successfully enrolled all subjects on the first attempt. All 277 subjects were successfully verified and identified on the first day of enrollment. None of the current or prior eye conditions prevented enrollment, verification, or identification. All 35 subjects with alcohol-induced nystagmus were successfully verified and identified. There were no false verifications or false identifications. Two conditions were identified that potentially could circumvent the use of iris recognition systems in general. The Mobile-Eyes™ iris recognition system exhibited accurate and reliable enrollment, verification, and identification applications in this study. It may have special applications in subjects with nystagmus.

  17. Integrated guidance, navigation and control verification plan primary flight system. [space shuttle avionics integration

    NASA Technical Reports Server (NTRS)

    1978-01-01

    The verification process and requirements for the ascent guidance interfaces and the ascent integrated guidance, navigation and control system for the space shuttle orbiter are defined as well as portions of supporting systems which directly interface with the system. The ascent phase of verification covers the normal and ATO ascent through the final OMS-2 circularization burn (all of OPS-1), the AOA ascent through the OMS-1 burn, and the RTLS ascent through ET separation (all of MM 601). In addition, OPS translation verification is defined. Verification trees and roadmaps are given.

  18. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  19. Electronic cigarette sales to minors via the internet.

    PubMed

    Williams, Rebecca S; Derrick, Jason; Ribisl, Kurt M

    2015-03-01

    Importance: Electronic cigarettes (e-cigarettes) entered the US market in 2007 and, with little regulatory oversight, grew into a $2-billion-a-year industry by 2013. The Centers for Disease Control and Prevention has reported a trend of increasing e-cigarette use among teens, with use rates doubling from 2011 to 2012. While several studies have documented that teens can and do buy cigarettes online, to our knowledge, no studies have yet examined age verification among Internet tobacco vendors selling e-cigarettes. Objective: To estimate the extent to which minors can successfully purchase e-cigarettes online and assess compliance with North Carolina's 2013 e-cigarette age-verification law. Design, Setting, and Participants: In this cross-sectional study conducted from February 2014 to June 2014, 11 nonsmoking minors aged 14 to 17 years made supervised e-cigarette purchase attempts from 98 Internet e-cigarette vendors. Purchase attempts were made at the University of North Carolina Internet Tobacco Vendors Study project offices using credit cards. Main Outcome and Measure: Rate at which minors can successfully purchase e-cigarettes on the Internet. Results: Minors successfully received deliveries of e-cigarettes from 76.5% of purchase attempts, with no attempts by delivery companies to verify their ages at delivery and 95% of delivered orders simply left at the door. All delivered packages came from shipping companies that, according to company policy or federal regulation, do not ship cigarettes to consumers. Of the total orders, 18 failed for reasons unrelated to age verification. Only 5 of the remaining 80 youth purchase attempts were rejected owing to age verification, resulting in a youth buy rate of 93.7%. None of the vendors complied with North Carolina's e-cigarette age-verification law. Conclusions and Relevance: Minors are easily able to purchase e-cigarettes from the Internet because of an absence of age-verification measures used by Internet e-cigarette vendors. Federal law should require and enforce rigorous age verification for all e-cigarette sales as with the federal PACT (Prevent All Cigarette Trafficking) Act's requirements for age verification in Internet cigarette sales.
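
    The reported rates follow from simple counts; a quick check of the arithmetic in Python, using the figures quoted above:

      attempts = 98
      failed_unrelated = 18            # failures unrelated to age verification
      rejected_by_age_check = 5

      relevant = attempts - failed_unrelated          # 80 attempts
      successes = relevant - rejected_by_age_check    # 75 successful purchases
      print(f"delivery rate: {successes / attempts:.1%}")   # 76.5% of all attempts
      print(f"youth buy rate: {successes / relevant:.2%}")  # 93.75%, reported as 93.7%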

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR AIR POLLUTION CONTROL TECHNOLOGIES: FINAL REPORT

    EPA Science Inventory

    The technical objective of the Environmental Technology Verification (ETV) Program's Air Pollution Control Technology (APCT) Center is to verify environmental technology performance by obtaining objective quality-assured data, thus providing potential purchasers and permitters wi...

  1. A Practitioners Perspective on Verification

    NASA Astrophysics Data System (ADS)

    Steenburgh, R. A.

    2017-12-01

    NOAA's Space Weather Prediction Center offers a wide range of products and services to meet the needs of an equally wide range of customers. A robust verification program is essential to the informed use of model guidance and other tools by forecasters and end users alike. In this talk, we present current SWPC practices and results, and examine emerging requirements and potential approaches to satisfy them. We explore the varying verification needs of forecasters and end users, as well as the role of subjective and objective verification. Finally, we describe a vehicle used in the meteorological community to unify approaches to model verification and facilitate intercomparison.

  2. Fundamentals of Successful Monitoring, Reporting, and Verification under a Cap and Trade Program

    EPA Pesticide Factsheets

    Learn about the monitoring, reporting, and verification (MRV) elements as they apply to the Acid Rain Program and the NOx Budget Trading Program, and how they can potentially be used in other programs.

  3. Program to Optimize Simulated Trajectories II (POST2) Surrogate Models for Mars Ascent Vehicle (MAV) Performance Assessment

    NASA Technical Reports Server (NTRS)

    Zwack, M. R.; Dees, P. D.; Thomas, H. D.; Polsgrove, T. P.; Holt, J. B.

    2017-01-01

    The primary purpose of the multiPOST tool is to enable the execution of much larger sets of vehicle cases to allow for broader trade space exploration. However, this exploration is not achieved solely with the increased case throughput. The multiPOST tool is applied to carry out a Design of Experiments (DOE), which is a set of cases that have been structured to capture a maximum amount of information about the design space with minimal computational effort. The results of the DOE are then used to fit a surrogate model, ultimately enabling parametric design space exploration. The approach used for the MAV study includes both DOE and surrogate modeling. First, the primary design considerations for the vehicle were used to develop the variables and ranges for the multiPOST DOE. The final set of DOE variables were carefully selected in order to capture the desired vehicle trades and take into account any special considerations for surrogate modeling. Next, the DOE sets were executed through multiPOST. Following successful completion of the DOE cases, a manual verification trial was performed. The trial involved randomly selecting cases from the DOE set and running them by hand. The results from the human analyst's run and multiPOST were then compared to ensure that the automated runs were being executed properly. Completion of the verification trials was then followed by surrogate model fitting. After fits to the multiPOST data were successfully created, the surrogate models were used as a stand-in for POST2 to carry out the desired MAV trades. Using the surrogate models in lieu of POST2 allowed for visualization of vehicle sensitivities to the input variables as well as rapid evaluation of vehicle performance. Although the models introduce some error into the output of the trade study, they were very effective at identifying areas of interest within the trade space for further refinement by human analysts. The next section will cover all of the ground rules and assumptions associated with DOE setup and multiPOST execution. Section 3.1 gives the final DOE variables and ranges, while section 3.2 addresses the POST2 specific assumptions. The results of the verification trials are given in section 4. Section 5 gives the surrogate model fitting results, including the goodness-of-fit metrics for each fit. Finally, the MAV specific results are discussed in section 6.
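
    A minimal sketch of the DOE-plus-surrogate pattern described above, in illustrative Python with NumPy; the two design variables and the stand-in response are invented, not the MAV trade space, and the POST2 run is replaced by a cheap function:

      import numpy as np

      rng = np.random.default_rng(0)

      # Simple random DOE over two normalized design variables (a structured
      # design such as a Latin hypercube would be used in practice).
      n = 50
      doe = rng.uniform(0.0, 1.0, size=(n, 2))

      def expensive_simulation(x):
          # Stand-in for a POST2 trajectory run (invented response).
          return 3.0 + 2.0 * x[0] - 1.5 * x[1] + 0.8 * x[0] * x[1]

      y = np.array([expensive_simulation(x) for x in doe])

      # Fit a quadratic polynomial surrogate by least squares.
      X = np.column_stack([np.ones(n), doe[:, 0], doe[:, 1],
                           doe[:, 0] * doe[:, 1],
                           doe[:, 0] ** 2, doe[:, 1] ** 2])
      coef, *_ = np.linalg.lstsq(X, y, rcond=None)

      def surrogate(x):
          return coef @ np.array([1.0, x[0], x[1],
                                  x[0] * x[1], x[0] ** 2, x[1] ** 2])

      # Rapid trade-space evaluation with the surrogate instead of the simulation.
      print(surrogate([0.5, 0.5]), expensive_simulation([0.5, 0.5]))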

  4. End-to-End Verification of Information-Flow Security for C and Assembly Programs

    DTIC Science & Technology

    2016-04-01

    The seL4 security verification [18] avoids this issue in the same way. In that work, the authors frame their solution as a restriction that disallows... identical: (σ, σ′_1) ∈ T_M ∧ (σ, σ′_2) ∈ T_M ⟹ O_l(σ′_1) = O_l(σ′_2). The successful security verifications of both seL4 and mCertiKOS provide reasonable evidence that this restriction on specifications is not a major hindrance for usability. Unlike the seL4 verification, however, our framework runs into a...

  5. 33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...

  6. 33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...

  7. 33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...

  8. 33 CFR 385.20 - Restoration Coordination and Verification (RECOVER).

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... Verification (RECOVER). 385.20 Section 385.20 Navigation and Navigable Waters CORPS OF ENGINEERS, DEPARTMENT OF THE ARMY, DEPARTMENT OF DEFENSE PROGRAMMATIC REGULATIONS FOR THE COMPREHENSIVE EVERGLADES RESTORATION... technical team described in the “Final Integrated Feasibility Report and Programmatic Environmental Impact...

  9. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Report: Final Comprehensive Performance Test Report, P/N 1331720-2TST, S/N 105/A1

    NASA Technical Reports Server (NTRS)

    Platt, R.

    1999-01-01

    This is the Performance Verification Report, Final Comprehensive Performance Test (CPT) Report, for the Integrated Advanced Microwave Sounding Unit-A (AMSU-A). This specification establishes the requirements for the CPT and Limited Performance Test (LPT) of the AMSU-A, referred to herein as the unit. The sequence in which the several phases of this test procedure shall take place is shown.

  10. Improvement on post-OPC verification efficiency for contact/via coverage check by final CD biasing of metal lines and considering their location on the metal layout

    NASA Astrophysics Data System (ADS)

    Kim, Youngmi; Choi, Jae-Young; Choi, Kwangseon; Choi, Jung-Hoe; Lee, Sooryong

    2011-04-01

    As IC design complexity keeps increasing, it is more and more difficult to ensure pattern transfer after optical proximity correction (OPC), due to the continuous reduction of layout dimensions and the lithographic limit on the k1 factor. To guarantee imaging fidelity, resolution enhancement technologies (RET) such as off-axis illumination (OAI), different types of phase shift masks, and OPC techniques have been developed. In the case of model-based OPC, to cross-confirm the contour image against the target layout, post-OPC verification solutions continue to be developed: contour generation and matching to the target structure, and methods for filtering and sorting patterns to eliminate false errors and duplicates. Detecting only real errors while excluding false errors is the most important requirement for an accurate and fast verification process, saving not only review time and engineering resources but also wafer process time. In the general case of post-OPC verification for metal-contact/via coverage (CC) checks, the verification solution outputs a huge number of errors due to borderless design, making it impractical to review and correct all of them. This can cause OPC engineers to miss real defects and, at a minimum, delays time to market. In this paper, we studied a method for increasing the efficiency of post-OPC verification, especially for the CC check. For metal layers, the final CD after the etch process shows a varying CD bias that depends on the distance to neighboring patterns, so it is more reasonable to consider the final metal shape when confirming contact/via coverage. Through optimization of the biasing rule for different pitches and shapes of metal lines, we obtained more accurate and efficient verification results and decreased the review time needed to find real errors. In this paper, a suggestion for increasing the efficiency of the OPC verification process by applying a simple biasing rule to the metal layout, instead of an etch model, is presented.
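
    A minimal sketch of the kind of pitch-dependent biasing rule described above, in hypothetical Python; the bias table values are invented for illustration, not the authors' rule:

      # Pitch-dependent etch bias for metal lines (nm); values are invented.
      # Final CD = drawn CD + bias, selected by distance to neighboring patterns.
      BIAS_TABLE = [
          (100, -8.0),           # pitch < 100 nm -> CD shrinks 8 nm after etch
          (200, -4.0),           # pitch < 200 nm -> shrinks 4 nm
          (float("inf"), -1.0),  # isolated lines barely change
      ]

      def final_metal_cd(drawn_cd_nm, pitch_nm):
          for max_pitch, bias in BIAS_TABLE:
              if pitch_nm < max_pitch:
                  return drawn_cd_nm + bias
          raise ValueError("unreachable")

      def contact_covered(contact_cd_nm, drawn_metal_cd_nm, pitch_nm, margin_nm=2.0):
          # Coverage check against the biased (post-etch) metal, not the drawn layout.
          return final_metal_cd(drawn_metal_cd_nm, pitch_nm) >= contact_cd_nm + margin_nm

      print(contact_covered(40.0, 50.0, 120.0))  # True: 46 nm metal >= 42 nm needed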

  11. Electronic Cigarette Sales to Minors via the Internet

    PubMed Central

    Williams, Rebecca S.; Derrick, Jason; Ribisl, Kurt M.

    2015-01-01

    Importance: Electronic cigarettes (e-cigarettes) entered the US market in 2007 and, with little regulatory oversight, grew into a $2-billion-a-year industry by 2013. The Centers for Disease Control and Prevention has reported a trend of increasing e-cigarette use among teens, with use rates doubling from 2011 to 2012. While several studies have documented that teens can and do buy cigarettes online, to our knowledge, no studies have yet examined age verification among Internet tobacco vendors selling e-cigarettes. Objective: To estimate the extent to which minors can successfully purchase e-cigarettes online and assess compliance with North Carolina's 2013 e-cigarette age-verification law. Design, Setting, and Participants: In this cross-sectional study conducted from February 2014 to June 2014, 11 nonsmoking minors aged 14 to 17 years made supervised e-cigarette purchase attempts from 98 Internet e-cigarette vendors. Purchase attempts were made at the University of North Carolina Internet Tobacco Vendors Study project offices using credit cards. Main Outcome and Measure: Rate at which minors can successfully purchase e-cigarettes on the Internet. Results: Minors successfully received deliveries of e-cigarettes from 76.5% of purchase attempts, with no attempts by delivery companies to verify their ages at delivery and 95% of delivered orders simply left at the door. All delivered packages came from shipping companies that, according to company policy or federal regulation, do not ship cigarettes to consumers. Of the total orders, 18 failed for reasons unrelated to age verification. Only 5 of the remaining 80 youth purchase attempts were rejected owing to age verification, resulting in a youth buy rate of 93.7%. None of the vendors complied with North Carolina's e-cigarette age-verification law. Conclusions and Relevance: Minors are easily able to purchase e-cigarettes from the Internet because of an absence of age-verification measures used by Internet e-cigarette vendors. Federal law should require and enforce rigorous age verification for all e-cigarette sales as with the federal PACT (Prevent All Cigarette Trafficking) Act's requirements for age verification in Internet cigarette sales. PMID:25730697

  12. Assessment of Intralaminar Progressive Damage and Failure Analysis Using an Efficient Evaluation Framework

    NASA Technical Reports Server (NTRS)

    Hyder, Imran; Schaefer, Joseph; Justusson, Brian; Wanthal, Steve; Leone, Frank; Rose, Cheryl

    2017-01-01

    Reducing the timeline for development and certification of composite structures has been a long-standing objective of the aerospace industry. This timeline can be further extended when attempting to integrate new fiber-reinforced composite materials because of the large amount of testing required at every level of design. Computational progressive damage and failure analysis (PDFA) attempts to mitigate this effect; however, new PDFA methods have been slow to be adopted in industry because material model evaluation techniques have not been fully defined. This study presents an efficient evaluation framework that uses a piecewise verification and validation (V&V) approach for PDFA methods. Specifically, the framework is applied to evaluate PDFA research codes within the context of intralaminar damage. Methods are incrementally taken through various V&V exercises specifically tailored to study PDFA intralaminar damage modeling capability. Finally, methods are evaluated against a defined set of success criteria to highlight successes and limitations.

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION: A VEHICLE FOR INDEPENDENT, CREDIBLE PERFORMANCE RESULTS ON COMMERCIALLY READY TECHNOLOGIES

    EPA Science Inventory

    The paper discusses the U. S. Environmental Protection Agency's Environmental Technology Verification (ETV) Program: its history, operations, past successes, and future plans. Begun in 1995 in response to President Clinton's "Bridge to a Sustainable Future" as a means to work wit...

  14. 78 FR 5409 - Ongoing Equivalence Verifications of Foreign Food Regulatory Systems

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-25

    ... of data shared. Finally, with respect to POE re-inspections, NACMPI recommended the targeting of high-risk product and high-risk imports for sampling and other verification activities during reinspection... authority; the availability of contingency plans in the country for containing and mitigating the effects of...

  15. Role Delineation Refinement and Verification. The Comprehensive Report. Final Report, October 1, 1978-July 31, 1980.

    ERIC Educational Resources Information Center

    Garrett, Gary L.; Zinsmeister, Joanne T.

    This document reports research focusing on physical therapists and physical therapist assistant role delineation refinement and verification; entry-level role determinations; and translation of these roles into an examination development protocol and examination blueprint specifications. Following an introduction, section 2 describes the survey…

  16. Verifying the INF and START treaties

    NASA Astrophysics Data System (ADS)

    Ifft, Edward

    2014-05-01

    The INF and START Treaties form the basis for constraints on nuclear weapons. Their verification provisions are one of the great success stories of modern arms control and will be an important part of the foundation upon which the verification regime for further constraints on nuclear weapons will be constructed.

  17. Verifying the INF and START treaties

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ifft, Edward

    The INF and START Treaties form the basis for constraints on nuclear weapons. Their verification provisions are one of the great success stories of modern arms control and will be an important part of the foundation upon which the verification regime for further constraints on nuclear weapons will be constructed.

  18. Letter Report - Verification Survey of Final Grids at the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2009-02-17

    The objective was to conduct verification surveys of grids at the DWI 1630 Site in Knoxville, Tennessee. The independent verification team (IVT) from ORISE conducted verification activities in whole and partial grids completed by BJC. ORISE site activities included gamma surface scans and soil sampling within 33 grids: G11 through G14; H11 through H15; X14, X15, X19, and X21; J13 through J15 and J17 through J21; K7 through K9 and K13 through K15; L13 through L15; and M14 through M16.

  19. Space station data management system - A common GSE test interface for systems testing and verification

    NASA Technical Reports Server (NTRS)

    Martinez, Pedro A.; Dunn, Kevin W.

    1987-01-01

    This paper examines the fundamental problems and goals associated with test, verification, and flight-certification of man-rated distributed data systems. First, a summary of the characteristics of modern computer systems that affect the testing process is provided. Then, verification requirements are expressed in terms of an overall test philosophy for distributed computer systems. This test philosophy stems from previous experience that was gained with centralized systems (Apollo and the Space Shuttle), and deals directly with the new problems that verification of distributed systems may present. Finally, a description of potential hardware and software tools to help solve these problems is provided.

  20. Rocket Propulsion 21 Steering Committee Meeting (RP21) NASA In-Space Propulsion Update

    NASA Technical Reports Server (NTRS)

    Klem, Mark

    2015-01-01

    In-house support of the NEXT-C contract. Thruster: NEXT Long Duration Test post-test destructive evaluation is in progress; findings will be used to verify service life models and identify potential design improvements. Cathode heater fabrication has been initiated for cyclic life testing. Thruster operating algorithm definition verification has been initiated to provide operating procedures for mission users. The high-voltage propellant isolator life test was voluntarily terminated after successfully operating 51,200 h. Power processor unit (PPU): all problematic stacked multilayer ceramic dual-inline-pin capacitors within the PPU were replaced. Test bed: the discharge power supply primary power board was rebuilt and installed, and a full functional performance characterization was completed; the final test report is in progress. The PPU test bed was transferred to the contractor to support the prototype design effort.

  1. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION COATINGS AND COATING EQUIPMENT PROGRAM (ETV CCEP), FINAL TECHNOLOGY APPLICATIONS GROUP TAGNITE--TESTING AND QUALITY ASSURANCE PLAN (T/QAP)

    EPA Science Inventory

    The overall objective of the Environmental Testing and Verification Coatings and Coating Equipment Program is to verify pollution prevention and performance characteristics of coating technologies and make the results of the testing available to prospective coating technology use...

  3. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... Electronic Signature and Storage of Form I-9, Employment Eligibility Verification AGENCY: U.S. Immigration... published an interim final rule to permit electronic signature and storage of the Form I-9. 71 FR 34510..., or a combination of paper and electronic systems; Employers may change electronic storage systems as...

  4. 77 FR 64480 - Notice of Final Determination of Sales at Less Than Fair Value: Circular Welded Carbon-Quality...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-22

    ... INFORMATION CONTACT: John Drury or Ericka Ukrow, AD/CVD Operations, Office 7, Import Administration... conducted sales and cost verifications between June 18 and 28, 2012 of the questionnaire responses submitted by Al Jazeera. We used standard verification procedures, including examination of relevant accounting...

  5. Ares I-X Range Safety Simulation Verification and Analysis Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Merry, Carl M.; Tarpley, Ashley F.; Craig, A. Scott; Tartabini, Paul V.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    NASA's Ares I-X vehicle launched on a suborbital test flight from the Eastern Range in Florida on October 28, 2009. To obtain approval for launch, a range safety final flight data package was generated to meet the data requirements defined in the Air Force Space Command Manual 91-710 Volume 2. The delivery included products such as a nominal trajectory, trajectory envelopes, stage disposal data and footprints, and a malfunction turn analysis. The Air Force's 45th Space Wing uses these products to ensure public and launch area safety. Due to the criticality of these data, an independent validation and verification effort was undertaken to ensure data quality and adherence to requirements. As a result, the product package was delivered with the confidence that independent organizations using separate simulation software generated data to meet the range requirements and yielded consistent results. This document captures Ares I-X final flight data package verification and validation analysis, including the methodology used to validate and verify simulation inputs, execution, and results, and presents lessons learned during the process.

  6. Customized Nudging to Improve FAFSA Completion and Income Verification

    ERIC Educational Resources Information Center

    Page, Lindsay; Castleman, Benjamin L.

    2016-01-01

    For most students from low- or moderate-income families, successfully completing the Free Application for Federal Student Aid (FAFSA) is a crucial gateway on the path to college access. However, FAFSA filing and income verification tasks pose substantial barriers to college access for low-income students. In this paper, the authors report on a…

  7. Formal Methods Specification and Verification Guidebook for Software and Computer Systems. Volume 1; Planning and Technology Insertion

    NASA Technical Reports Server (NTRS)

    1995-01-01

    The Formal Methods Specification and Verification Guidebook for Software and Computer Systems describes a set of techniques called Formal Methods (FM), and outlines their use in the specification and verification of computer systems and software. Development of increasingly complex systems has created a need for improved specification and verification techniques. NASA's Safety and Mission Quality Office has supported the investigation of techniques such as FM, which are now an accepted method for enhancing the quality of aerospace applications. The guidebook provides information for managers and practitioners who are interested in integrating FM into an existing systems development process. Information includes technical and administrative considerations that must be addressed when establishing the use of FM on a specific project. The guidebook is intended to aid decision makers in the successful application of FM to the development of high-quality systems at reasonable cost. This is the first volume of a planned two-volume set. The current volume focuses on administrative and planning considerations for the successful application of FM.

  8. The New Payload Handling System for the G erman On- Orbit Verification Satellite TET with the Sensor Bus as Example for Payloads

    NASA Astrophysics Data System (ADS)

    Heyer, H.-V.; Föckersperger, S.; Lattner, K.; Moldenhauer, W.; Schmolke, J.; Turk, M.; Willemsen, P.; Schlicker, M.; Westerdorff, K.

    2008-08-01

    The technology verification satellite TET (Technologie ErprobungsTräger) is the core element of the German On-Orbit Verification (OOV) program for new technologies and techniques. The goal of this program is to support the German space industry and research facilities in the on-orbit verification of satellite technologies. The TET satellite is a small satellite developed and built in Germany under the leadership of Kayser-Threde. The satellite bus is based on the successfully operated satellite BIRD and a newly developed payload platform with the new payload handling system called NVS (Nutzlastversorgungssystem). The NVS consists of three major parts: the power supply, the processor boards, and the I/O interfaces. The NVS is realized via several Europe-format PCBs which are connected to each other via an integrated backplane. The payloads are connected to the NVS by front connectors. This paper describes the concept, architecture, and hardware/software of the NVS. Phase B of this project was successfully finished last year.

  9. Mathematical calibration procedure of a capacitive sensor-based indexed metrology platform

    NASA Astrophysics Data System (ADS)

    Brau-Avila, A.; Santolaria, J.; Acero, R.; Valenzuela-Galvan, M.; Herrera-Jimenez, V. M.; Aguilar, J. J.

    2017-03-01

    The demand for faster and more reliable measuring tasks for the control and quality assurance of modern production systems has created new challenges for the field of coordinate metrology. Thus, the search for new solutions in coordinate metrology systems and the need for the development of existing ones still persists. One example of such a system is the portable coordinate measuring machine (PCMM), the use of which in industry has considerably increased in recent years, mostly due to its flexibility for accomplishing in-line measuring tasks as well as its reduced cost and operational advantages compared to traditional coordinate measuring machines. Nevertheless, PCMMs have a significant drawback derived from the techniques applied in the verification and optimization procedures of their kinematic parameters. These techniques are based on the capture of data with the measuring instrument from a calibrated gauge object, fixed successively in various positions so that most of the instrument measuring volume is covered, which results in time-consuming, tedious and expensive verification and optimization procedures. In this work the mathematical calibration procedure of a capacitive sensor-based indexed metrology platform (IMP) is presented. This calibration procedure is based on the readings and geometric features of six capacitive sensors and their targets with nanometer resolution. The final goal of the IMP calibration procedure is to optimize the geometric features of the capacitive sensors and their targets in order to use the optimized data in the verification procedures of PCMMs.
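    The abstract does not state the objective function; a generic formulation consistent with the described optimization (an assumption on our part, with N readings collected over the platform's indexed positions) is the nonlinear least-squares problem

    \[
    \hat{\boldsymbol\theta} \;=\; \arg\min_{\boldsymbol\theta} \sum_{i=1}^{N} \left( d_i^{\mathrm{meas}} - d_i^{\mathrm{model}}(\boldsymbol\theta) \right)^{2},
    \]

    where \(\boldsymbol\theta\) collects the geometric features of the six capacitive sensors and their targets, and \(d_i^{\mathrm{meas}}\) and \(d_i^{\mathrm{model}}(\boldsymbol\theta)\) are the measured and modeled sensor readings.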

  10. Overview of the TOPEX/Poseidon Platform Harvest Verification Experiment

    NASA Technical Reports Server (NTRS)

    Morris, Charles S.; DiNardo, Steven J.; Christensen, Edward J.

    1995-01-01

    An overview is given of the in situ measurement system installed on Texaco's Platform Harvest for verification of the sea level measurement from the TOPEX/Poseidon satellite. The prelaunch error budget suggested that the total root mean square (RMS) error due to measurements made at this verification site would be less than 4 cm. The actual error budget for the verification site is within these original specifications. However, evaluation of the sea level data from three measurement systems at the platform has resulted in unexpectedly large differences between the systems. Comparison of the sea level measurements from the different tide gauge systems has led to a better understanding of the problems of measuring sea level in relatively deep ocean. As of May 1994, the Platform Harvest verification site has successfully supported 60 TOPEX/Poseidon overflights.

  11. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

    According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry, Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples, and investigating them for signs of past and present life with exobiological experiments, as well as investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process for the above products is quite complex and includes some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the flow of verification activities and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission on the Martian surface. These operative conditions cannot be verified by analysis alone; consequently, a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in a Mars environment, and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) until the final verification close-out of those requirements with the final verification reports.

  12. Safety Verification of a Fault Tolerant Reconfigurable Autonomous Goal-Based Robotic Control System

    NASA Technical Reports Server (NTRS)

    Braman, Julia M. B.; Murray, Richard M; Wagner, David A.

    2007-01-01

    Fault tolerance and safety verification of control systems are essential for the success of autonomous robotic systems. A control architecture called Mission Data System (MDS), developed at the Jet Propulsion Laboratory, takes a goal-based control approach. In this paper, a method for converting goal network control programs into linear hybrid systems is developed. The linear hybrid system can then be verified for safety in the presence of failures using existing symbolic model checkers. An example task is simulated in MDS and successfully verified using HyTech, a symbolic model checking software for linear hybrid systems.

  13. The capability of lithography simulation based on MVM-SEM® system

    NASA Astrophysics Data System (ADS)

    Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong

    2015-10-01

    Lithography at the 1X-nm technology node uses SMO-ILT, NTD, and other complex patterns. Consequently, in mask defect inspection, defect verification becomes more difficult because many nuisance defects are detected in aggressive mask features. One key technology of mask manufacturing is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology correlates excellently with the wafer and is the standard tool for defect verification; however, it is difficult to use for verifying a hundred defects or more. We previously reported the capability of defect verification based on lithography simulation with an SEM system, whose architecture and software showed excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.

  14. Advanced technology development multi-color holography

    NASA Technical Reports Server (NTRS)

    Vikram, Chandra S.

    1994-01-01

    Several key aspects of multi-color holography and some non-conventional ways to study the holographic reconstructions are considered. The error analysis of three-color holography is considered in detail, with the particular example of a typical triglycine sulfate crystal growth situation. For the numerical analysis of the fringe patterns, a new algorithm is introduced, with experimental verification using a sugar-water solution. The role of the phase difference among component holograms is also critically considered, with examples of several two- and three-color situations. The status of experimentation on two-color holography and the fabrication of a small breadboard system are also reported. Finally, some successful demonstrations of unconventional ways to study holographic reconstructions are described. These methods are deflectometry and confocal optical processing using some Spacelab III holograms.

  15. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2009-04-29

    The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified “hot spot” cleanup criteria of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one-foot layer of soil on the site was removed in its entirety.

  16. Systematic study of source mask optimization and verification flows

    NASA Astrophysics Data System (ADS)

    Ben, Yu; Latypov, Azat; Chua, Gek Soon; Zou, Yi

    2012-06-01

    Source mask optimization (SMO) has emerged as a powerful resolution enhancement technique (RET) for advanced technology nodes. However, there is a plethora of flows and verification metrics in the field, confounding the end user of the technique. A systematic study of the different flows and their possible unification has been missing. This contribution is intended to reveal the pros and cons of different SMO approaches and verification metrics, understand their commonalities and differences, and provide a generic guideline for RET selection via SMO. The paper discusses three types of variation that commonly arise in SMO, namely pattern preparation and selection, the availability of a relevant OPC recipe for a freeform source, and finally the metrics used in source verification. Several pattern selection algorithms are compared and the advantages of systematic pattern selection algorithms are discussed. In the absence of a full resist model for SMO, an alternative SMO flow without a full resist model is reviewed. A preferred verification flow with DOF and MEEF as quality metrics is examined.

  17. 30 CFR 250.911 - If my platform is subject to the Platform Verification Program, what must I do?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... a project management timeline, Gantt Chart, that depicts when interim and final reports required by... 30 Mineral Resources 2 2010-07-01 2010-07-01 false If my platform is subject to the Platform Verification Program, what must I do? 250.911 Section 250.911 Mineral Resources MINERALS MANAGEMENT SERVICE...

  18. Automated Installation Verification of COMSOL via LiveLink for MATLAB

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Crowell, Michael W

    Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory's High Flux Isotope Reactor (HFIR).

  19. IT Project Success with 7120 and 7123 NPRs to Achieve Project Success

    NASA Technical Reports Server (NTRS)

    Walley, Tina L.

    2009-01-01

    This slide presentation reviews management techniques for assuring the success of information technology development projects. Details include the work products, the work breakdown structure (WBS), system integration, independent verification and validation (IV&V), and deployment and operations. An example, the NASA Consolidated Active Directory (NCAD), is reviewed.

  20. Partner verification: restoring shattered images of our intimates.

    PubMed

    De La Ronde, C; Swann, W B

    1998-08-01

    When spouses received feedback that disconfirmed their impressions of their partners, they attempted to undermine that feedback during subsequent interactions with these partners. Such partner verification activities occurred whether partners construed the feedback as overly favorable or overly unfavorable. Furthermore, because spouses tended to see their partners as their partners saw themselves, their efforts to restore their impressions of partners often worked hand-in-hand with partners' efforts to verify their own views. Finally, support for self-verification theory emerged in that participants were more intimate with spouses who verified their self-views, whether their self-views happened to be positive or negative.

  1. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models: Appendices

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Verification and validation (V&V) is a highly challenging undertaking for SLS structural dynamics models due to the magnitude and complexity of SLS assemblies and subassemblies. Responses to challenges associated with V&V of Space Launch System (SLS) structural dynamics models are presented in Volume I of this paper. Four methodologies addressing specific requirements for V&V are discussed: (1) Residual Mode Augmentation (RMA); (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976); (3) Mode Consolidation (MC); and finally (4) Experimental Mode Verification (EMV). This document contains the appendices to Volume I.
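    For reference, the classical Guyan (static) condensation that MGR modifies is a standard structural dynamics result (not restated in this report): the stiffness equation is partitioned into master (m) and slave (s) degrees of freedom and the slave set is condensed out,

    \[
    \begin{bmatrix} K_{mm} & K_{ms}\\ K_{sm} & K_{ss} \end{bmatrix}
    \begin{Bmatrix} u_m\\ u_s \end{Bmatrix}
    =
    \begin{Bmatrix} f_m\\ 0 \end{Bmatrix}
    \;\Rightarrow\;
    u_s = -K_{ss}^{-1} K_{sm}\, u_m,
    \]

    so that with the transformation \(T = \begin{bmatrix} I \\ -K_{ss}^{-1} K_{sm} \end{bmatrix}\) the reduced matrices are \(K_R = T^{\mathsf T} K\, T\) and \(M_R = T^{\mathsf T} M\, T\).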

  2. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the tests performed on breadboard model hardware and the analyses completed to date have been evaluated and used to plan for the design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  3. Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)

    NASA Astrophysics Data System (ADS)

    Selvy, Brian M.; Claver, Charles; Angeli, George

    2014-08-01

    This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
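    As a rough sketch of the verification-planning structure described above (hypothetical names; the project's actual SysML/Enterprise Architect model is far richer), each requirement carries a plan, plans decompose into method-specific activities, and activities are grouped into concurrently executable events:

```python
# Minimal sketch (hypothetical schema): the Requirement -> Verification Plan ->
# Activity -> Event decomposition described in the paper.
from dataclasses import dataclass, field
from enum import Enum

class Method(Enum):
    TEST = "Test"
    ANALYSIS = "Analysis"
    INSPECTION = "Inspection"
    DEMONSTRATION = "Demonstration"

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str   # what must be shown
    success_criteria: str           # how "verified" is judged
    methods: list                   # one or more Method values
    level: str                      # e.g. "subsystem", "system"
    owner: str

@dataclass
class VerificationActivity:
    plan: VerificationPlan
    method: Method
    description: str                # maps to a step in a PMCS planned activity

@dataclass
class VerificationEvent:
    name: str
    activities: list = field(default_factory=list)  # activities run together
```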

  4. Neoclassical toroidal viscosity calculations in tokamaks using a δf Monte Carlo simulation and their verifications.

    PubMed

    Satake, S; Park, J-K; Sugama, H; Kanno, R

    2011-07-29

    Neoclassical toroidal viscosities (NTVs) in tokamaks are investigated using a δf Monte Carlo simulation, and are successfully verified with a combined analytic theory over a wide range of collisionality. A Monte Carlo simulation has been required in the study of NTV since the complexities in guiding-center orbits of particles and their collisions cannot be fully investigated by any means of analytic theories alone. Results yielded the details of the complex NTV dependency on particle precessions and collisions, which were predicted roughly in a combined analytic theory. Both numerical and analytic methods can be utilized and extended based on these successful verifications.

  5. Comments for A Conference on Verification in the 21st Century

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E.

    2012-06-12

    The author offers five points for the discussion of verification and technology: (1) Experience with the implementation of arms limitation and arms reduction agreements confirms that technology alone has never been relied upon to provide effective verification. (2) The historical practice of verification of arms control treaties between Cold War rivals may constrain the cooperative and innovative use of technology for transparency, verification and confidence building in the future. (3) An area that has been identified by many, including the US State Department and NNSA, as being rich for exploration for potential uses of technology for transparency and verification is information and communications technology (ICT). This includes social media, crowd-sourcing, the internet of things, and the concept of societal verification, but there are issues. (4) On the issue of the extent to which verification technologies are keeping pace with the demands of future protocols and agreements, I think the more direct question is "are they effective in supporting the objectives of the treaty or agreement?" In this regard it is important to acknowledge that there is a verification grand challenge at our doorstep, namely "how does one verify limitations on nuclear warheads in national stockpiles?" (5) Finally, while recognizing the daunting political and security challenges of such an approach, multilateral engagement and cooperation at the conceptual and technical levels provide benefits for addressing future verification challenges.

  6. Landing System Development- Design and Test Prediction of a Lander Leg Using Nonlinear Analysis

    NASA Astrophysics Data System (ADS)

    Destefanis, Stefano; Buchwald, Robert; Pellegrino, Pasquale; Schroder, Silvio

    2014-06-01

    Several mission studies have been performed focusing on soft and precision landing using landing legs. Examples of such missions are Mars Sample Return scenarios (MSR), lunar landing scenarios (MoonNEXT, Lunar Lander) and small-body sample return studies (Marco Polo, MMSR, Phootprint). Such missions foresee a soft landing on the planetary surface to deliver payload in a controlled manner while limiting the landing loads. To ensure a successful final landing phase, a landing system is needed that is capable of absorbing the residual velocities (vertical, horizontal and angular) at touchdown and ensuring a controlled attitude after landing. These requirements can be fulfilled by using landing legs with adequate damping. The Landing System Development (LSD) study, currently in its phase 2, foresees the design, analysis, verification, manufacturing and testing of a representative landing leg breadboard based on the Phase B design of the ESA Lunar Lander. Drop tests of a single leg will be performed both on rigid and soft ground, at several impact angles. The activity is covered under ESA contract with TAS-I as prime contractor, responsible for analysis and verification, Astrium GmbH for design and test, and QinetiQ Space for manufacturing. Drop tests will be performed at the Institute of Space Systems of the German Aerospace Center (DLR-RY) in Bremen. This paper presents an overview of the analytical simulations (test predictions and design verification) performed, comparing the results produced by the Astrium-built multi-body model (rigid bodies, with nonlinearities accounted for in mechanical joints and force definitions, based on development tests) and the TAS-I-built nonlinear explicit model (fully deformable bodies).

  7. Integrated Advanced Microwave Sounding Unit-A (AMSU-A). Performance Verification Reports: Final Comprehensive Performance Test Report, P/N: 1356006-1, S.N: 202/A2

    NASA Technical Reports Server (NTRS)

    Platt, R.

    1998-01-01

    This is the Performance Verification Report. The process specification establishes the requirements for the comprehensive performance test (CPT) and limited performance test (LPT) of the Earth Observing System Advanced Microwave Sounding Unit-A2 (EOS/AMSU-A2), referred to as the unit. The unit is defined on drawing 1356006.

  8. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information, normally contained within documents (e.g., specifications and plans), is instead maintained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software with networking capabilities, the RVC was developed not only with cost savings in mind but primarily to provide a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meet the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
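    A minimal sketch of the kind of records and tailored reports such a database manages follows; the schema and field names are hypothetical, not the actual ISWE RVC design.

```python
# Minimal sketch (hypothetical schema): requirement records with traceability,
# verification method, success criteria, and compliance status, plus a simple
# tailored report of the open (non-compliant) items.
from dataclasses import dataclass
from typing import Optional

@dataclass
class RvcRecord:
    req_id: str
    requirement: str
    parent: Optional[str]    # traceability to the parent requirement, if any
    method: str              # Test / Analysis / Inspection / Demonstration
    success_criteria: str
    compliant: bool

records = [
    RvcRecord("SYS-001", "Example system requirement", None, "Test",
              "Test passes per procedure", True),
    RvcRecord("SYS-002", "Example derived requirement", "SYS-001", "Analysis",
              "Analytical margin demonstrated", False),
]

def open_items(rows):
    """Tailored report: all requirements not yet shown compliant."""
    return [r.req_id for r in rows if not r.compliant]

print(open_items(records))   # -> ['SYS-002']
```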

  9. Mars Exploration Rover Entry, Descent, and Landing: A Thermal Perspective

    NASA Technical Reports Server (NTRS)

    Tsuyuki, Glenn T.; Sunada, Eric T.; Novak, Keith S.; Kinsella, Gary M.; Phillip, Charles J.

    2005-01-01

    Perhaps the most challenging mission phase for the Mars Exploration Rovers was Entry, Descent, and Landing (EDL). During this phase, the entry vehicle attached to its cruise stage was transformed, through a series of complex events, into a stowed tetrahedral lander surrounded by inflated airbags. There was only one opportunity to successfully execute the automated command sequence, without any possibility of ground intervention. The success of EDL was reliant upon the system thermal design: 1) to thermally condition EDL hardware from cruise storage temperatures to operating temperature ranges; 2) to maintain the Rover electronics within operating temperature ranges without the benefit of the cruise single-phase cooling loop, which had been evacuated in preparation for EDL; and 3) to maintain the cruise stage propulsion components for the critical turn to entry attitude. Since the EDL architecture was inherited from Mars Pathfinder (MPF), the initial EDL thermal design would be inherited from MPF. However, hardware and implementation differences from MPF ultimately changed the MPF inheritance approach for the EDL thermal design. With the lack of full inheritance, the verification and validation of the EDL thermal design took on increased significance. This paper summarizes the verification and validation approach for the EDL thermal design, along with applicable system-level thermal testing results and appropriate thermal analyses. In addition, the lessons learned during the system-level testing are discussed. Finally, the in-flight EDL experiences of both the MER-A and -B missions (Spirit and Opportunity, respectively) are presented, demonstrating how lessons learned from Spirit were applied to Opportunity.

  10. Integration and verification testing of the Large Synoptic Survey Telescope camera

    NASA Astrophysics Data System (ADS)

    Lange, Travis; Bond, Tim; Chiang, James; Gilmore, Kirk; Digel, Seth; Dubois, Richard; Glanzman, Tom; Johnson, Tony; Lopez, Margaux; Newbry, Scott P.; Nordby, Martin E.; Rasmussen, Andrew P.; Reil, Kevin A.; Roodman, Aaron J.

    2016-08-01

    We present an overview of the Integration and Verification Testing activities of the Large Synoptic Survey Telescope (LSST) Camera at the SLAC National Accelerator Lab (SLAC). The LSST Camera, the sole instrument for LSST and under construction now, is comprised of a 3.2 Giga-pixel imager and a three element corrector with a 3.5 degree diameter field of view. LSST Camera Integration and Test will be taking place over the next four years, with final delivery to the LSST observatory anticipated in early 2020. We outline the planning for Integration and Test, describe some of the key verification hardware systems being developed, and identify some of the more complicated assembly/integration activities. Specific details of integration and verification hardware systems will be discussed, highlighting some of the technical challenges anticipated.

  11. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan that sets the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
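    To illustrate the coverage-driven loop in miniature (a toy Python analogy; real UVM testbenches are written in SystemVerilog and are far richer): random legal stimulus drives a DUT model, a scoreboard checks it against a reference, and coverage bins decide when the simulation may stop.

```python
# Toy sketch of coverage-driven verification: constrained-random stimulus,
# a scoreboard comparing DUT output to a reference model, and coverage bins
# that close the loop once every bin has been exercised.
import random

def dut_add(a: int, b: int) -> int:       # stand-in "device under test": 8-bit adder
    return (a + b) & 0xFF

def reference(a: int, b: int) -> int:     # golden reference model
    return (a + b) % 256

cross_bins: set = set()                   # which (a, b) quadrant pairs were hit
carry_bins: set = set()                   # carry-out observed both asserted and not

def sample_coverage(a: int, b: int) -> None:
    cross_bins.add((a // 64, b // 64))
    carry_bins.add(a + b > 255)

random.seed(0)
while len(cross_bins) < 16 or len(carry_bins) < 2:
    a, b = random.randrange(256), random.randrange(256)   # legal random stimulus
    assert dut_add(a, b) == reference(a, b), f"scoreboard mismatch at {a}, {b}"
    sample_coverage(a, b)

print(f"coverage closed: {len(cross_bins)} cross bins, {len(carry_bins)} carry bins")
```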

  12. A preliminary study on the use of FX-Glycine gel and an in-house optical cone beam CT readout for IMRT and RapidArc verification

    NASA Astrophysics Data System (ADS)

    Ravindran, Paul B.; Ebenezer, Suman Babu S.; Winfred, Michael Raj; Amalan, S.

    2017-05-01

    The radiochromic FX gel with Optical CT readout has been investigated by several authors and has shown promising results for 3D dosimetry. One of the applications of the gel dosimeters is their use in 3D dose verification for IMRT and RapidArc quality assurance. Though polymer gel has been used successfully for clinical dose verification, the use of FX gel for clinical dose verification with optical cone beam CT needs further validation. In this work, we have used FX gel and an in-house optical readout system for gamma analysis between the dose matrices of the measured dose distribution and a treatment planning system (TPS) calculated dose distribution for a few test cases.
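    For reference, the gamma analysis mentioned compares the measured and TPS-calculated dose matrices with the standard gamma index (a standard definition in radiotherapy QA, not specific to this work):

    \[
    \Gamma(\mathbf r_m,\mathbf r_c)=\sqrt{\frac{\lVert \mathbf r_c-\mathbf r_m\rVert^{2}}{\Delta d_M^{2}}+\frac{\bigl[D_c(\mathbf r_c)-D_m(\mathbf r_m)\bigr]^{2}}{\Delta D_M^{2}}},
    \qquad
    \gamma(\mathbf r_m)=\min_{\mathbf r_c}\,\Gamma(\mathbf r_m,\mathbf r_c),
    \]

    where a measured point \(\mathbf r_m\) passes when \(\gamma \le 1\) for the chosen distance and dose tolerances \(\Delta d_M\) and \(\Delta D_M\) (commonly 3 mm and 3%).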

  13. Application of computer vision to automatic prescription verification in pharmaceutical mail order

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.

    2005-05-01

    In large-volume pharmaceutical mail order, before shipping out prescriptions, licensed pharmacists ensure that the drug in the bottle matches the information provided in the patient prescription. Typically, the pharmacist has about 2 s to complete the verification of one prescription. Performing about 1800 prescription verifications per hour is tedious and can generate human errors as a result of visual and brain fatigue. Available automatic drug verification systems are limited to a single pill at a time. This is not suitable for large-volume pharmaceutical mail order, where a prescription can have as many as 60 pills and where thousands of prescriptions are filled every day. In an attempt to reduce human fatigue and cost and to limit human error, the automatic prescription verification system (APVS) was invented to meet the needs of large-scale pharmaceutical mail order. This paper deals with the design and implementation of the first prototype online automatic prescription verification machine to perform the same task currently done by a pharmacist. The emphasis here is on the visual aspects of the machine. The system has been successfully tested on 43,000 prescriptions.
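    As a toy illustration only of the kind of visual check involved (a hypothetical approach, not the paper's actual algorithm): compare a simple color-histogram signature of the imaged pills against the reference signature stored for the prescribed drug, and route mismatches to a pharmacist for review.

```python
# Toy illustration (hypothetical method): verify a filled bottle by comparing
# a color-histogram signature of the captured image against the reference
# signature for the drug on the prescription.
import numpy as np

def color_signature(image: np.ndarray, bins: int = 8) -> np.ndarray:
    """Normalized per-channel histogram of an RGB image (H x W x 3, uint8)."""
    hists = [np.histogram(image[..., c], bins=bins, range=(0, 255))[0]
             for c in range(3)]
    sig = np.concatenate(hists).astype(float)
    return sig / sig.sum()

def matches(candidate_sig: np.ndarray, reference_sig: np.ndarray,
            threshold: float = 0.9) -> bool:
    """Histogram-intersection similarity; below threshold -> pharmacist review."""
    return float(np.minimum(candidate_sig, reference_sig).sum()) >= threshold

# usage sketch: sig = color_signature(captured_image); matches(sig, db[drug_code])
```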

  14. TRMM Solar Array Panels

    NASA Technical Reports Server (NTRS)

    1998-01-01

    This final report presents conclusions/recommendations concerning the TRMM Solar Array; deliverable list and schedule summary; waivers and deviations; as-shipped performance data, including flight panel verification matrix, panel output detail, shadow test summary, humidity test summary, reverse bias test panel; and finally, quality assurance summary.

  15. Building the Qualification File of EGNOS with DOORS

    NASA Astrophysics Data System (ADS)

    Fabre, J.

    2008-08-01

    EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is reaching its final deployment and is being initially operated towards qualification and certification to reach operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS system design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at the QR is a consolidated, consistent and complete Qualification file. The information included shall give confidence to the QR reviewers that the performed qualification activities are complete. Therefore, an important issue for the project team is to focus on concise and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, references to details shall be available, and the reviewer shall have the possibility to link automatically to the documents containing this detailed information. In that frame, Thales Alenia Space has implemented strong methodological and tool support, providing the System Engineering and Verification teams with a single reference technical database in which all team members consult the applicable requirements, compliance, justification and design data, and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the contents of the Qualification file, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.

  16. Verification of the databases EXFOR and ENDF

    NASA Astrophysics Data System (ADS)

    Berton, Gottfried; Damart, Guillaume; Cabellos, Oscar; Beauzamy, Bernard; Soppera, Nicolas; Bossant, Manuel

    2017-09-01

    The objective of this work is the verification of large experimental (EXFOR) and evaluated nuclear reaction databases (JEFF, ENDF, JENDL, TENDL…). The work is applied to neutron reactions in EXFOR data, including threshold reactions, isomeric transitions, angular distributions and data in the resonance region for both isotopes and natural elements. Finally, a comparison of the resonance integrals compiled in the EXFOR database with those derived from the evaluated libraries is also performed.
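    For reference, the dilute resonance integral being compared is conventionally defined with a cadmium cutoff energy (a standard definition, not specific to this paper):

    \[
    I \;=\; \int_{E_{\mathrm{Cd}}}^{\infty} \sigma(E)\, \frac{\mathrm{d}E}{E},
    \qquad E_{\mathrm{Cd}} \approx 0.5\ \mathrm{eV},
    \]

    where \(\sigma(E)\) is the reaction cross section and the \(1/E\) weighting reflects an epithermal flux spectrum.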

  17. 40 CFR 1065.308 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers not...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ...) AIR POLLUTION CONTROLS ENGINE-TESTING PROCEDURES Calibrations and Verifications § 1065.308 Continuous... adjusted to account for the dilution from ambient air drawn into the probe. We recommend you use the final... gases diluted in air. You may use a multi-gas span gas, such as NO-CO-CO2-C3H8-CH4, to verify multiple...

  18. An assessment of space shuttle flight software development processes

    NASA Technical Reports Server (NTRS)

    1993-01-01

    In early 1991, the National Aeronautics and Space Administration's (NASA's) Office of Space Flight commissioned the Aeronautics and Space Engineering Board (ASEB) of the National Research Council (NRC) to investigate the adequacy of the current process by which NASA develops and verifies changes and updates to the Space Shuttle flight software. The Committee for Review of Oversight Mechanisms for Space Shuttle Flight Software Processes was convened in Jan. 1992 to accomplish the following tasks: (1) review the entire flight software development process from the initial requirements definition phase to final implementation, including object code build and final machine loading; (2) review and critique NASA's independent verification and validation process and mechanisms, including NASA's established software development and testing standards; (3) determine the acceptability and adequacy of the complete flight software development process, including the embedded validation and verification processes through comparison with (1) generally accepted industry practices, and (2) generally accepted Department of Defense and/or other government practices (comparing NASA's program with organizations and projects having similar volumes of software development, software maturity, complexity, criticality, lines of code, and national standards); (4) consider whether independent verification and validation should continue. An overview of the study, independent verification and validation of critical software, and the Space Shuttle flight software development process are addressed. Findings and recommendations are presented.

  19. Six-man, self-contained carbon dioxide concentrator subsystem for Space Station Prototype (SSP) application

    NASA Technical Reports Server (NTRS)

    Kostell, G. D.; Schubert, F. H.; Shumar, J. W.; Hallick, T. M.; Jensen, F. C.

    1974-01-01

    A six man, self contained, electrochemical carbon dioxide concentrating subsystem for space station prototype use was successfully designed, fabricated, and tested. A test program was successfully completed which covered shakedown testing, design verification testing, and acceptance testing.

  20. Restoration of mangrove forest landscape in Babulu Laut village, sub district of Babulu, Penajam Paser Utara district

    NASA Astrophysics Data System (ADS)

    Febrina, W. K.; Marjenah; Sumaryono

    2018-04-01

    The success of mangrove reforestation activities carried out in various regions, and their influence on the landscape of the rehabilitation areas, are not well known. Land use along the coast of Babulu Laut Village has steadily reduced the area of mangrove forest, and land use by the community without regard for conservation has caused the loss of mangrove forest. This study aims to determine the final condition and success rate of forest and land rehabilitation, the land cover, and the benefits of mangrove forest restoration for the surrounding people. The research method comprised preparation and orientation of the research location, data input, codification, data processing, field verification, and data analysis. The mangrove inventory at 22 research locations in Babulu Laut Village, Babulu Subdistrict, Penajam Paser Utara District, covering 125 ha planted entirely with Rhizophora sp., used a sampling intensity of 2% and found 65.92% of plants growing, or 2,175 stems/ha; the success rate of mangrove forest rehabilitation at Babulu Laut Village, Babulu Subdistrict, is therefore rated as medium (55-75%).

  1. Calibration and verification of models of organic carbon removal kinetics in Aerated Submerged Fixed-Bed Biofilm Reactors (ASFBBR): a case study of wastewater from an oil-refinery.

    PubMed

    Trojanowicz, Karol; Wójcik, Włodzimierz

    2011-01-01

    The article presents a case-study on the calibration and verification of mathematical models of organic carbon removal kinetics in biofilm. The chosen Harremöes and Wanner & Reichert models were calibrated with a set of model parameters obtained both during dedicated studies conducted at pilot- and lab-scales for petrochemical wastewater conditions and from the literature. Next, the models were successfully verified through studies carried out utilizing a pilot ASFBBR type bioreactor installed in an oil-refinery wastewater treatment plant. During verification the pilot biofilm reactor worked under varying surface organic loading rates (SOL), dissolved oxygen concentrations and temperatures. The verification proved that the models can be applied in practice to petrochemical wastewater treatment engineering for e.g. biofilm bioreactor dimensioning.
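    For context, the classical Harremöes result (a standard biofilm kinetics relation, not restated in the abstract) is that zero-order intrinsic kinetics in a partially penetrated biofilm yields half-order removal with respect to the bulk substrate concentration:

    \[
    r_A \;=\; \sqrt{2\, D_f\, k_{0f}}\;\, S_b^{1/2} \;=\; k_{1/2A}\, S_b^{1/2},
    \]

    where \(D_f\) is the effective diffusivity of the substrate in the biofilm, \(k_{0f}\) the intrinsic zero-order rate, and \(S_b\) the bulk concentration.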

  2. Design of verification platform for wireless vision sensor networks

    NASA Astrophysics Data System (ADS)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation from theoretical research on WVSNs to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module, designs a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve the functions of image acquisition, coding and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.

  3. EURATOM safeguards efforts in the development of spent fuel verification methods by non-destructive assay

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matloch, L.; Vaccaro, S.; Couland, M.

    The back end of the nuclear fuel cycle continues to develop. The European Commission, particularly the Nuclear Safeguards Directorate of the Directorate General for Energy, implements Euratom safeguards and needs to adapt to this situation. The verification methods for spent nuclear fuel, which EURATOM inspectors can use, require continuous improvement. Whereas the Euratom on-site laboratories provide accurate verification results for fuel undergoing reprocessing, the situation is different for spent fuel which is destined for final storage. In particular, new needs arise from the increasing number of cask loadings for interim dry storage and the advanced plans for the construction of encapsulation plants and geological repositories. Various scenarios present verification challenges. In this context, EURATOM Safeguards, often in cooperation with other stakeholders, is committed to further improvement of NDA methods for spent fuel verification. In this effort EURATOM plays various roles, ranging from definition of inspection needs to direct participation in development of measurement systems, including support of research in the framework of international agreements and via the EC Support Program to the IAEA. This paper presents recent progress in selected NDA methods. These methods have been conceived to satisfy different spent fuel verification needs, ranging from attribute testing to pin-level partial defect verification. (authors)

  4. Final report from VFL Technologies for the pilot-scale thermal treatment of Lower East Fork Poplar Creek floodplain soils. LEFPC appendices. Volume 5. Appendix V-D

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-09-01

    This final report from VFL Technologies for the pilot-scale thermal treatment of lower East Fork Poplar Creek floodplain soils dated September 1994 contains LEFPC Appendices, Volume 5, Appendix V-D. This appendix includes the final verification run data package (PAH, TCLP herbicides, TCLP pesticides).

  5. Automatic Methods and Tools for the Verification of Real Time Systems

    DTIC Science & Technology

    1997-07-31

    real-time systems. This was accomplished by extending techniques, based on automata theory and temporal logic, that have been successful for the verification of time-independent reactive systems. As a system specification language for embedded real-time systems, we introduced hybrid automata, which equip traditional discrete automata with real-numbered clock variables and continuous environment variables. As requirements specification languages, we introduced temporal logics with clock variables for expressing timing constraints.
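
    To give a flavor of such requirements languages: a bounded-response property in a clock-constrained temporal logic might be written as follows (an illustrative textbook-style formula, not one taken from the report):

        \[ \Box\,\big(\mathit{request} \rightarrow \Diamond_{\leq 5}\,\mathit{response}\big) \]

    i.e., every request must be followed by a response within 5 time units.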

  6. Practical Application of Model Checking in Software Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Skakkebaek, Jens Ulrik

    1999-01-01

    This paper presents our experiences in applying JAVA PATHFINDER (J(sub PF)), a recently developed JAVA-to-SPIN translator, to find synchronization bugs in a Chinese Chess game server application written in JAVA. We give an overview of J(sub PF) and the subset of JAVA that it supports and describe the abstraction and verification of the game server. Finally, we analyze the results of the effort. We argue that abstraction by under-approximation is necessary to obtain sufficiently small models for verification purposes; that user guidance is crucial for effective abstraction; and that current model checkers do not conveniently support the computational models of software in general and JAVA in particular.

  7. SU-E-J-115: Graticule for Verification of Treatment Position in Neutron Therapy.

    PubMed

    Halford, R; Snyder, M

    2012-06-01

    Until recently, treatment verification for patients undergoing fast neutron therapy at our facility was accomplished through a combination of neutron beam portal films aligned with a graticule mounted on an orthogonal x-ray tube. To eliminate uncertainty with respect to the relative positions of the x-ray graticule and the therapy beam, we have developed a graticule which is placed in the neutron beam itself. For a graticule to be visible on the portal film, the attenuation of the neutron beam by the graticule landmarks must be significantly greater than that of the material in which the landmarks are mounted. Various materials, thicknesses, and mounting points were tried to gain the largest contrast between the graticule landmarks and the mounting material. The final design involved 2-inch steel pins of 0.125-inch diameter captured between two parallel plates of 0.25-inch-thick clear acrylic plastic. The distance between the two acrylic plates was 1.625 inches, held together at the perimeter with acrylic sidewall spacers. This allowed the majority of the length of the steel pins to be surrounded by air. The pins were set 1 cm apart and mounted at angles parallel to the divergence of the beam dependent on their position within the array. The entire steel pin and acrylic plate assembly was mounted on an acrylic accessory tray to allow for graticule alignment. Despite the inherent difficulties in attenuating fast neutrons, our simple graticule design produces the required difference of attenuation between the arrays of landmarks and the mounting material. The graticule successfully provides an in-beam frame of reference for patient portal verification. © 2012 American Association of Physicists in Medicine.

  8. Final report from VFL Technologies for the pilot-scale thermal treatment of lower East Fork Poplar Creek floodplain soils. LEFPC appendices, Volume 4, Appendix V-C

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1994-09-01

    This is the final verification run data package for pilot-scale thermal treatment of lower East Fork Poplar Creek floodplain soils. Included are data on volatiles, semivolatiles, and TCLP volatiles.

  9. Expert system verification and validation study. Delivery 3A and 3B: Trip summaries

    NASA Technical Reports Server (NTRS)

    French, Scott

    1991-01-01

    Key results are documented from attending the 4th workshop on verification, validation, and testing. The most interesting part of the workshop was when representatives from the U.S., Japan, and Europe presented surveys of VV&T within their respective regions. Another interesting part focused on current efforts to define industry standards for artificial intelligence and how that might affect approaches to VV&T of expert systems. The next part of the workshop focused on VV&T methods of applying mathematical techniques to verification of rule bases and techniques for capturing information relating to the process of developing software. The final part focused on software tools. A summary is also presented of the EPRI conference on 'Methodologies, Tools, and Standards for Cost Effective Reliable Software Verification and Validation'. The conference was divided into discussion sessions on the following issues: development process, automated tools, software reliability, methods, standards, and cost/benefit considerations.

  10. Simulation validation and management

    NASA Astrophysics Data System (ADS)

    Illgen, John D.

    1995-06-01

    Illgen Simulation Technologies, Inc., has been working on interactive verification and validation programs for the past six years. As a result, they have evolved a methodology that has been adopted and successfully implemented by a number of different verification and validation programs. This methodology employs a unique set of computer-aided software engineering (CASE) tools to reverse-engineer source code and produce analytical outputs (flow charts and tables) that aid the engineer/analyst in the verification and validation process. We have found that the use of CASE tools saves time, which equates to improvements in both schedule and cost. This paper will describe the ISTI-developed methodology and how CASE tools are used in its support. Case studies will be discussed.

  11. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Latty, Drew, E-mail: drew.latty@health.nsw.gov.au; Stuart, Kirsty E; Westmead Breast Cancer Institute, Sydney, New South Wales

    Radiation treatment to the left breast is associated with increased cardiac morbidity and mortality. The deep inspiration breath-hold (DIBH) technique can decrease the radiation dose delivered to the heart, and this may facilitate the treatment of the internal mammary chain nodes. The aim of this review is to critically analyse the literature available in relation to breath-hold methods, implementation, utilisation, patient compliance, planning methods and treatment verification of the DIBH technique. Despite variation in the literature regarding the DIBH delivery method, patient coaching, visual feedback mechanisms and treatment verification, all methods of DIBH delivery reduce radiation dose to the heart. Further research is required to determine optimum protocols for patient training and treatment verification to ensure the technique is delivered successfully.

  12. Abstract Model of the SATS Concept of Operations: Initial Results and Recommendations

    NASA Technical Reports Server (NTRS)

    Dowek, Gilles; Munoz, Cesar; Carreno, Victor A.

    2004-01-01

    An abstract mathematical model of the concept of operations for the Small Aircraft Transportation System (SATS) is presented. The Concept of Operations consists of several procedures that describe nominal operations for SATS. Several safety properties of the system are proven using formal techniques. The final goal of the verification effort is to show that under nominal operations, aircraft are safely separated. The abstract model was written and formally verified in the Prototype Verification System (PVS).

  13. Using a Modular Open Systems Approach in Defense Acquisitions: Implications for the Contracting Process

    DTIC Science & Technology

    2006-01-30

    He has taught contract management courses for the UCLA Government Contracts Certificate program and is also a senior faculty member for the Keller...standards for its key interfaces, and has been subjected to successful validation and verification tests to ensure the openness of its key interfaces...widely supported and consensus based standards for its key interfaces, and is subject to validation and verification tests to ensure the openness of its

  14. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    NASA Astrophysics Data System (ADS)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two Fused Silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  15. The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.

  16. Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine: Part 4: Tissue Tools for Quality Assurance in Immunohistochemistry.

    PubMed

    Cheung, Carol C; D'Arrigo, Corrado; Dietel, Manfred; Francis, Glenn D; Fulton, Regan; Gilks, C Blake; Hall, Jacqueline A; Hornick, Jason L; Ibrahim, Merdol; Marchetti, Antonio; Miller, Keith; van Krieken, J Han; Nielsen, Soren; Swanson, Paul E; Taylor, Clive R; Vyberg, Mogens; Zhou, Xiaoge; Torlakovic, Emina E

    2017-04-01

    The numbers of diagnostic, prognostic, and predictive immunohistochemistry (IHC) tests are increasing; the implementation and validation of new IHC tests, revalidation of existing tests, as well as the on-going need for daily quality assurance monitoring present significant challenges to clinical laboratories. There is a need for proper quality tools, specifically tissue tools that will enable laboratories to successfully carry out these processes. This paper clarifies, through the lens of laboratory tissue tools, how validation, verification, and revalidation of IHC tests can be performed in order to develop and maintain high quality "fit-for-purpose" IHC testing in the era of precision medicine. This is the final part of the 4-part series "Evolution of Quality Assurance for Clinical Immunohistochemistry in the Era of Precision Medicine."

  17. The influence of verification jig on framework fit for nonsegmented fixed implant-supported complete denture.

    PubMed

    Ercoli, Carlo; Geminiani, Alessandro; Feng, Changyong; Lee, Heeje

    2012-05-01

    The purpose of this retrospective study was to assess if there was a difference in the likelihood of achieving passive fit when an implant-supported full-arch prosthesis framework is fabricated with or without the aid of a verification jig. This investigation was approved by the University of Rochester Research Subject Review Board (protocol #RSRB00038482). Thirty edentulous patients, 49 to 73 years old (mean 61 years old), rehabilitated with a nonsegmented fixed implant-supported complete denture were included in the study. During the restorative process, final impressions were made using the pickup impression technique and elastomeric impression materials. For 16 patients, a verification jig was made (group J), while for the remaining 14 patients, a verification jig was not used (group NJ) and the framework was fabricated directly on the master cast. During the framework try-in appointment, the fit was assessed by clinical (Sheffield test) and radiographic inspection and recorded as passive or nonpassive. When a verification jig was used (group J, n = 16), all frameworks exhibited clinically passive fit, while when a verification jig was not used (group NJ, n = 14), only two frameworks fit. This difference was statistically significant (p < .001). Within the limitations of this retrospective study, the fabrication of a verification jig ensured clinically passive fit of metal frameworks in nonsegmented fixed implant-supported complete denture. © 2011 Wiley Periodicals, Inc.
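
    As a side note, the reported significance can be reproduced from the published counts (16/16 passive fits with a jig versus 2/14 without). The abstract does not name the statistical test, so the sketch below assumes a two-sided Fisher exact test on the 2 x 2 table, which indeed yields a p-value well below .001:

        # Sketch: check the reported group difference, assuming a Fisher
        # exact test (the article states p < .001 but does not name its test).
        from scipy.stats import fisher_exact

        #                passive fit  non-passive fit
        table = [[16, 0],   # group J  (verification jig used, n = 16)
                 [2, 12]]   # group NJ (no jig, n = 14)

        _, p_value = fisher_exact(table, alternative="two-sided")
        print(f"p = {p_value:.2e}")  # far below .001, consistent with the report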

  18. Optical/digital identification/verification system based on digital watermarking technology

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.

    2000-06-01

    This paper presents a new approach for the secure integrity verification of driver licenses, passports or other analogue identification documents. The system embeds (detects) the reference number of the identification document with the DCT watermark technology in (from) the owner photo of the identification document holder. During verification the reference number is extracted and compared with the reference number printed in the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for the watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents used in many European countries.
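
    The verification logic described above reduces to a compare-after-extract flow. The following sketch illustrates it; the dict-based document model and the helper function are hypothetical stand-ins for the paper's optical scanning and DCT watermark detection steps:

        # Minimal, hypothetical sketch of the check described above; the document
        # is modeled as a dict, standing in for the scanned image and the DCT
        # watermark detector, which are not reproduced here.
        from typing import Optional

        def extract_watermark_payload(doc: dict) -> Optional[str]:
            # Placeholder for the DCT watermark detector run on the owner photo.
            return doc.get("embedded_reference")

        def verify_identity_document(doc: dict) -> bool:
            payload = extract_watermark_payload(doc)
            printed = doc.get("printed_reference")  # number printed on the document
            # Accept only if the embedded and printed reference numbers match; a
            # mismatch indicates the photo or the printed data has been altered.
            return payload is not None and payload == printed

        print(verify_identity_document(
            {"embedded_reference": "A123", "printed_reference": "A123"}))  # True
        print(verify_identity_document(
            {"embedded_reference": "B999", "printed_reference": "A123"}))  # False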

  19. PANIC: A General-purpose Panoramic Near-infrared Camera for the Calar Alto Observatory

    NASA Astrophysics Data System (ADS)

    Cárdenas Vázquez, M.-C.; Dorner, B.; Huber, A.; Sánchez-Blanco, E.; Alter, M.; Rodríguez Gómez, J. F.; Bizenberger, P.; Naranjo, V.; Ibáñez Mengual, J.-M.; Panduro, J.; García Segura, A. J.; Mall, U.; Fernández, M.; Laun, W.; Ferro Rodríguez, I. M.; Helmling, J.; Terrón, V.; Meisenheimer, K.; Fried, J. W.; Mathar, R. J.; Baumeister, H.; Rohloff, R.-R.; Storz, C.; Verdes-Montenegro, L.; Bouy, H.; Ubierna, M.; Fopp, P.; Funke, B.

    2018-02-01

    PANIC is the new PAnoramic Near-Infrared Camera for Calar Alto and is a project jointly developed by the MPIA in Heidelberg, Germany, and the IAA in Granada, Spain, for the German-Spanish Astronomical Center at Calar Alto Observatory (CAHA; Almería, Spain). This new instrument works with the 2.2 m and 3.5 m CAHA telescopes, covering a field of view of 30 × 30 arcmin and 15 × 15 arcmin, respectively, with a sampling of 4096 × 4096 pixels. It is designed for the spectral bands from Z to Ks and can also be equipped with narrowband filters. The instrument was delivered to the observatory in 2014 October and was commissioned at both telescopes between 2014 November and 2015 June. Science verification at the 2.2 m telescope was carried out during the second semester of 2015 and the instrument is now in full operation. We describe the design, assembly, integration, and verification process, the final laboratory tests and the PANIC instrument performance. We also present first-light data obtained during the commissioning and preliminary results of the scientific verification. The final optical model and the theoretical performance of the camera were updated according to the as-built data. The laboratory tests were made with a star simulator. Finally, the commissioning phase was done at both telescopes to validate the camera's real performance on sky. The final laboratory tests confirmed the expected camera performance, complying with the scientific requirements. The commissioning phase on sky has been accomplished.

  20. Baseline and Verification Tests of the Electric Vehicle Associates’ Current Fare Station Wagon.

    DTIC Science & Technology

    1983-01-01

    Final Test Report: Electric Vehicle Associates' Current Fare Station Wagon, 27 March 1980 - 6 November 1981. The electric and hybrid vehicle test was conducted by the U.S. Army Mobility Equipment Research and Development... Table-of-contents fragments: coast-down; electric and hybrid vehicle verification procedures.

  1. Development of a pilot-scale kinetic extruder feeder system and test program. Phase II. Verification testing. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1984-01-12

    This report describes the work done under Phase II, the verification testing of the Kinetic Extruder. The main objective of the test program was to determine failure modes and wear rates. Only minor auxiliary equipment malfunctions were encountered. Wear rates indicate useful life expectancy of from 1 to 5 years for wear-exposed components. Recommendations are made for adapting the equipment for pilot plant and commercial applications. 3 references, 20 figures, 12 tables.

  2. A framework of multitemplate ensemble for fingerprint verification

    NASA Astrophysics Data System (ADS)

    Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li

    2012-12-01

    How to improve the performance of an automatic fingerprint verification system (AFVS) is always a big challenge in the biometric verification field. Recently, it has become popular to improve the performance of an AFVS using ensemble learning approaches to fuse related information from fingerprints. In this article, we propose a novel framework for fingerprint verification which is based on the multitemplate ensemble method. This framework consists of three stages. In the first stage, the enrollment stage, we adopt an effective template selection method to select those fingerprints which best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is given. In the second stage, the verification stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. The experimental results on the FVC2004 database prove the improvement in the effectiveness of the new framework in fingerprint verification. With a minutiae-based matching method, the average EER of the four databases in FVC2004 drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER of these four databases also decreases from 14.58 to 2.51.
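
    A heavily simplified sketch of the three-stage idea follows. The toy match_score, feature representation, and max-distance fusion rule below are placeholders invented for illustration; the article's polyhedron construction and fusion rule are more elaborate:

        # Simplified sketch of the multitemplate ensemble idea: enroll several
        # templates per finger, represent them by a centroid in match-score
        # space, and verify a query by its distance to that centroid.
        from statistics import mean

        def match_score(a, b) -> float:
            # Placeholder matcher: in practice a minutiae- or ridge-based score.
            return 1.0 - abs(a - b)  # toy similarity on toy 1-D features

        def enroll(templates):
            # Score vector of each template against all templates; the centroid
            # approximates the article's "virtual centroid of the polyhedron".
            vectors = [[match_score(t, u) for u in templates] for t in templates]
            centroid = [mean(col) for col in zip(*vectors)]
            return templates, centroid

        def verify(query, enrolled, threshold=0.15) -> bool:
            templates, centroid = enrolled
            qvec = [match_score(query, t) for t in templates]
            dist = max(abs(q - c) for q, c in zip(qvec, centroid))  # toy fusion
            return dist <= threshold

        finger = enroll([0.50, 0.52, 0.48])  # toy 1-D "fingerprints"
        print(verify(0.51, finger))          # genuine attempt -> True
        print(verify(0.90, finger))          # impostor attempt -> False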

  3. INF verification: a guide for the perplexed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less-stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  4. Discriminative Features Mining for Offline Handwritten Signature Verification

    NASA Astrophysics Data System (ADS)

    Neamah, Karrar; Mohamad, Dzulkifli; Saba, Tanzila; Rehman, Amjad

    2014-03-01

    Signature verification is an active research area in the field of pattern recognition. It is employed to identify a particular person from signature characteristics such as pen pressure, loop shape, writing speed and the up-and-down motion of the pen. In the entire process, however, the feature extraction and selection stage is of prime importance, since several signatures have similar strokes, characteristics and sizes. Accordingly, this paper presents a combination of skeleton orientation and gravity centre point to extract accurate pattern features of signature data in an offline signature verification system. Promising results have proved the success of the integration of the two methods.

  5. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, Michael A.

    1997-01-01

    A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.

  6. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, M.A.

    1997-01-07

    A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded ``D`` character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the ``D`` interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.
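
    The digit-timed anti-collision scheme in these two patent records lends itself to a small simulation. The sketch below keeps only the core idea, tags delaying their replies by successive serial-number digits until each transmits alone in a slot, and omits all radio-level details:

        # Toy simulation of the digit-timed anti-collision idea from the patent:
        # each unresolved tag times its reply by successive digits of its serial
        # number, so uniqueness of the serial numbers guarantees every tag
        # eventually gets a clear time slot.

        def read_all_tags(serials):
            unresolved, heard = set(serials), []
            digits = 1
            while unresolved:
                # Each tag responds in a slot keyed by its first `digits` digits.
                slots = {}
                for s in unresolved:
                    slots.setdefault(s[:digits], []).append(s)
                for slot_tags in slots.values():
                    if len(slot_tags) == 1:         # a clear, collision-free reply
                        heard.append(slot_tags[0])  # reader repeats it to verify
                        unresolved.discard(slot_tags[0])
                digits += 1                         # colliding tags extend timing
            return heard

        tags = ["4417", "4423", "4519", "7001"]
        print(sorted(read_all_tags(tags)) == sorted(tags))  # True: all collected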

  7. Technology verification phase. Dynamic isotope power system. Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Halsey, D.G.

    1982-03-10

    The Phase I requirements of the Kilowatt Isotope Power System (KIPS) program were to make a detailed Flight System Conceptual Design (FSCD) for an isotope fueled organic Rankine cycle power system and to build and test a Ground Demonstration System (GDS) which simulated as closely as possible the operational characteristics of the FSCD. The activities and results of Phase II, the Technology Verification Phase, of the program are reported. The objectives of this phase were to increase system efficiency to 18.1% by component development, to demonstrate system reliability by a 5000 h endurance test and to update the flight system design. During Phase II, system performance was improved from 15.1% to 16.6%, an endurance test of 2000 h was performed while the flight design analysis was limited to a study of the General Purpose Heat Source, a study of the regenerator manufacturing technique and analysis of the hardness of the system to a laser threat. It was concluded from these tests that the GDS is basically prototypic of a flight design; all components necessary for satisfactory operation were demonstrated successfully at the system level; over 11,000 total h of operation without any component failure attested to the inherent reliability of this type of system; and some further development is required, specifically in the area of performance. (LCL)

  8. 77 FR 50855 - Oil and Gas and Sulphur Operations on the Outer Continental Shelf-Increased Safety Measures for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... according to the design. The third-party verification must include subsea function and pressure tests...; Requires new casing and cementing integrity tests; Establishes new requirements for subsea secondary BOP... that, for the final casing string (or liner if it is the final string), an operator must install one...

  9. 75 FR 45467 - Certain Magnesia Carbon Bricks From the People's Republic of China: Final Determination of Sales...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-08-02

    ... the People's Republic of China: Verification of Fengchi Import & Export Co., Ltd. of Haicheng City... Certain Magnesia Carbon Bricks from the People's Republic of China and Mexico: Initiation of Antidumping... from the People's Republic of China; Final Determination of Sales at Less Than Fair Value and Critical...

  10. Design, analysis and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Garcia, A.; Minning, C.

    1982-01-01

    Analytical models were developed to perform optical, thermal, electrical and structural analyses on candidate encapsulation systems. Qualification testing of specimens of various types and a finalized optimum design are projected.

  11. Exploring the Possible Use of Information Barriers for future Biological Weapons Verification Regimes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Luke, S J

    2011-12-20

    This report describes a path forward for implementing information barriers in a future generic biological arms-control verification regime. Information barriers have become a staple of discussion in the area of arms control verification approaches for nuclear weapons and components. Information barriers when used with a measurement system allow for the determination that an item has sensitive characteristics without releasing any of the sensitive information. Over the last 15 years the United States (with the Russian Federation) has led on the development of information barriers in the area of the verification of nuclear weapons and nuclear components. The work of the US and the Russian Federation has prompted other states (e.g., UK and Norway) to consider the merits of information barriers for possible verification regimes. In the context of a biological weapons control verification regime, the dual-use nature of the biotechnology will require protection of sensitive information while allowing for the verification of treaty commitments. A major question that has arisen is whether - in a biological weapons verification regime - the presence or absence of a weapon pathogen can be determined without revealing any information about possible sensitive or proprietary information contained in the genetic materials being declared under a verification regime. This study indicates that a verification regime could be constructed using a small number of pathogens that spans the range of known biological weapons agents. Since the number of possible pathogens is small it is possible and prudent to treat these pathogens as analogies to attributes in a nuclear verification regime. This study has determined that there may be some information that needs to be protected in a biological weapons control verification regime. To protect this information, the study concludes that the Lawrence Livermore Microbial Detection Array may be a suitable technology for the detection of the genetic information associated with the various pathogens. In addition, it has been determined that a suitable information barrier could be applied to this technology when the verification regime has been defined. Finally, the report posits a path forward for additional development of information barriers in a biological weapons verification regime. This path forward has shown that a new analysis approach coined as Information Loss Analysis might need to be pursued so that a numerical understanding of how information can be lost in specific measurement systems can be achieved.

  12. Final Report - Regulatory Considerations for Adaptive Systems

    NASA Technical Reports Server (NTRS)

    Wilkinson, Chris; Lynch, Jonathan; Bharadwaj, Raj

    2013-01-01

    This report documents the findings of a preliminary research study into new approaches to the software design assurance of adaptive systems. We suggest a methodology to overcome the software validation and verification difficulties posed by the underlying assumption of non-adaptive software in the requirements-based testing verification methods in RTCA/DO-178B and C. An analysis of the relevant RTCA/DO-178B and C objectives is presented showing the reasons for the difficulties that arise in showing satisfaction of the objectives and suggested additional means by which they could be satisfied. We suggest that the software design assurance problem for adaptive systems is principally one of developing correct and complete high level requirements and system level constraints that define the necessary system functional and safety properties to assure the safe use of adaptive systems. We show how analytical techniques such as model based design, mathematical modeling and formal or formal-like methods can be used to both validate the high level functional and safety requirements, establish necessary constraints and provide the verification evidence for the satisfaction of requirements and constraints that supplements conventional testing. Finally the report identifies the follow-on research topics needed to implement this methodology.

  13. Exploration of Uncertainty in Glacier Modelling

    NASA Technical Reports Server (NTRS)

    Thompson, David E.

    1999-01-01

    There are procedures and methods for verification of coding algebra and for validations of models and calculations that are in use in the aerospace computational fluid dynamics (CFD) community. These methods would be efficacious if used by the glacier dynamics modelling community. This paper is a presentation of some of those methods, and how they might be applied to uncertainty management supporting code verification and model validation for glacier dynamics. The similarities and differences between their use in CFD analysis and the proposed application of these methods to glacier modelling are discussed. After establishing sources of uncertainty and methods for code verification, the paper looks at a representative sampling of verification and validation efforts that are underway in the glacier modelling community, and establishes a context for these within overall solution quality assessment. Finally, an information architecture and interactive interface is introduced and advocated. This Integrated Cryospheric Exploration (ICE) Environment is proposed for exploring and managing sources of uncertainty in glacier modelling codes and methods, and for supporting scientific numerical exploration and verification. The details and functionality of this Environment are described based on modifications of a system already developed for CFD modelling and analysis.

  14. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post-New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100's of warheads, and then 10's of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels: 1000, 100's, 10's. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  15. 78 FR 70499 - An Inquiry Into the Commission's Policies and Rules Regarding AM Radio Service Directional...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-26

    ... Antenna Performance Verification AGENCY: Federal Communications Commission. ACTION: Final rule; correction... as follows: Subpart BB--Disturbance of AM Broadcast Station Antenna Patterns * * * * * Federal...

  16. Alternative sample sizes for verification dose experiments and dose audits

    NASA Astrophysics Data System (ADS)

    Taylor, W. A.; Hansen, J. M.

    1999-01-01

    ISO 11137 (1995), "Sterilization of Health Care Products—Requirements for Validation and Routine Control—Radiation Sterilization", provides sampling plans for performing initial verification dose experiments and quarterly dose audits. Alternative sampling plans are presented which provide equivalent protection. These sampling plans can significantly reduce the cost of testing. These alternative sampling plans have been included in a draft ISO Technical Report (type 2). This paper examines the rationale behind the proposed alternative sampling plans. The protection provided by the current verification and audit sampling plans is first examined. Then methods for identifying equivalent plans are highlighted. Finally, methods for comparing the costs associated with the different plans are provided. This paper includes additional guidance, not included in the technical report, for selecting between the original and alternative sampling plans.
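
    Equivalence between plans of this kind is usually argued via operating-characteristic (OC) curves under a binomial model. The sketch below compares two hypothetical accept-on-c plans; the plan parameters are illustrative and are not taken from ISO 11137 or the draft technical report:

        # Sketch: compare operating-characteristic (OC) curves of two
        # hypothetical verification-dose sampling plans under a binomial model.
        from math import comb

        def accept_probability(n: int, c: int, p: float) -> float:
            # P(accept) = P(at most c positives among n tested units), where p
            # is the per-unit probability of a positive (non-sterile) result.
            return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(c + 1))

        plan_a = (100, 2)  # test 100 units, accept if <= 2 positives (illustrative)
        plan_b = (40, 0)   # test 40 units, accept only if no positives (illustrative)

        for p in (0.005, 0.01, 0.02, 0.05):
            pa = accept_probability(*plan_a, p)
            pb = accept_probability(*plan_b, p)
            print(f"p={p:.3f}:  plan A accept={pa:.3f}  plan B accept={pb:.3f}")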

  17. Alloy-assisted deposition of three-dimensional arrays of atomic gold catalyst for crystal growth studies

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Yin; Jiang, Yuanwen; Cherukara, Mathew J.

    Large-scale assembly of individual atoms over smooth surfaces is difficult to achieve. A configuration of an atom reservoir, in which individual atoms can be readily extracted, may successfully address this challenge. In this work, we demonstrate that a liquid gold-silicon alloy established in classical vapor-liquid-solid growth can deposit ordered and three-dimensional rings of isolated gold atoms over silicon nanowire sidewalls. Here, we perform ab initio molecular dynamics simulation and unveil a surprising single atomic gold-catalyzed chemical etching of silicon. Experimental verification of this catalytic process in silicon nanowires yields dopant-dependent, massive and ordered 3D grooves with spacing down to ~5 nm. Finally, we use these grooves as self-labeled and ex situ markers to resolve several complex silicon growths, including the formation of nodes, kinks, scale-like interfaces, and curved backbones.

  18. The Unparalleled Systems Engineering of MSL's Backup Entry, Descent, and Landing System: Second Chance

    NASA Technical Reports Server (NTRS)

    Roumeliotis, Chris; Grinblat, Jonathan; Reeves, Glenn

    2013-01-01

    Second Chance (SECC) was a bare bones version of Mars Science Laboratory's (MSL) Entry Descent & Landing (EDL) flight software that ran on Curiosity's backup computer, which could have taken over swiftly in the event of a reset of Curiosity's prime computer, in order to land her safely on Mars. Without SECC, a reset of Curiosity's prime computer would have led to catastrophic mission failure. Even though a reset of the prime computer never occurred, SECC had the important responsibility as EDL's guardian angel, and this responsibility would not have seen such success without unparalleled systems engineering. This paper will focus on the systems engineering behind SECC: Covering a brief overview of SECC's design, the intense schedule to use SECC as a backup system, the verification and validation of the system's "Do No Harm" mandate, the system's overall functional performance, and finally, its use on the fateful day of August 5th, 2012.

  19. Alloy-assisted deposition of three-dimensional arrays of atomic gold catalyst for crystal growth studies

    DOE PAGES

    Fang, Yin; Jiang, Yuanwen; Cherukara, Mathew J.; ...

    2017-12-08

    Large-scale assembly of individual atoms over smooth surfaces is difficult to achieve. A configuration of an atom reservoir, in which individual atoms can be readily extracted, may successfully address this challenge. In this work, we demonstrate that a liquid gold-silicon alloy established in classical vapor-liquid-solid growth can deposit ordered and three-dimensional rings of isolated gold atoms over silicon nanowire sidewalls. Here, we perform ab initio molecular dynamics simulation and unveil a surprising single atomic gold-catalyzed chemical etching of silicon. Experimental verification of this catalytic process in silicon nanowires yields dopant-dependent, massive and ordered 3D grooves with spacing down to ~5 nm. Finally, we use these grooves as self-labeled and ex situ markers to resolve several complex silicon growths, including the formation of nodes, kinks, scale-like interfaces, and curved backbones.

  20. Artificial neural networks for document analysis and recognition.

    PubMed

    Marinai, Simone; Gori, Marco; Soda, Giovanni

    2005-01-01

    Artificial neural networks have been extensively applied to document analysis and recognition. Most efforts have been devoted to the recognition of isolated handwritten and printed characters, with widely recognized successful results. However, many other document processing tasks, like preprocessing, layout analysis, character segmentation, word recognition, and signature verification, have also been effectively addressed, with very promising results. This paper surveys the most significant problems in the area of offline document image processing where connectionist-based approaches have been applied. Similarities and differences between approaches belonging to different categories are discussed. Particular emphasis is placed on the crucial role of prior knowledge in the conception of both appropriate architectures and learning algorithms. Finally, the paper provides a critical analysis of the reviewed approaches and depicts the most promising research guidelines in the field. In particular, a second generation of connectionist-based models is foreseen, based on appropriate graphical representations of the learning environment.

  1. Understanding of Leaf Development-the Science of Complexity.

    PubMed

    Malinowski, Robert

    2013-06-25

    The leaf is the major organ involved in light perception and conversion of solar energy into organic carbon. In order to adapt to different natural habitats, plants have developed a variety of leaf forms, ranging from simple to compound, with various forms of dissection. Due to the enormous cellular complexity of leaves, understanding the mechanisms regulating development of these organs is difficult. In recent years there has been a dramatic increase in the use of technically advanced imaging techniques and computational modeling in studies of leaf development. Additionally, molecular tools for manipulation of morphogenesis were successfully used for in planta verification of developmental models. Results of these interdisciplinary studies show that global growth patterns influencing final leaf form are generated by cooperative action of genetic, biochemical, and biomechanical inputs. This review summarizes recent progress in integrative studies on leaf development and illustrates how intrinsic features of leaves (including their cellular complexity) influence the choice of experimental approach.

  2. Understanding of Leaf Development—the Science of Complexity

    PubMed Central

    Malinowski, Robert

    2013-01-01

    The leaf is the major organ involved in light perception and conversion of solar energy into organic carbon. In order to adapt to different natural habitats, plants have developed a variety of leaf forms, ranging from simple to compound, with various forms of dissection. Due to the enormous cellular complexity of leaves, understanding the mechanisms regulating development of these organs is difficult. In recent years there has been a dramatic increase in the use of technically advanced imaging techniques and computational modeling in studies of leaf development. Additionally, molecular tools for manipulation of morphogenesis were successfully used for in planta verification of developmental models. Results of these interdisciplinary studies show that global growth patterns influencing final leaf form are generated by cooperative action of genetic, biochemical, and biomechanical inputs. This review summarizes recent progress in integrative studies on leaf development and illustrates how intrinsic features of leaves (including their cellular complexity) influence the choice of experimental approach. PMID:27137383

  3. Consistency, Verification, and Validation of Turbulence Models for Reynolds-Averaged Navier-Stokes Applications

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2009-01-01

    In current practice, it is often difficult to draw firm conclusions about turbulence model accuracy when performing multi-code CFD studies ostensibly using the same model because of inconsistencies in model formulation or implementation in different codes. This paper describes an effort to improve the consistency, verification, and validation of turbulence models within the aerospace community through a website database of verification and validation cases. Some of the variants of two widely-used turbulence models are described, and two independent computer codes (one structured and one unstructured) are used in conjunction with two specific versions of these models to demonstrate consistency with grid refinement for several representative problems. Naming conventions, implementation consistency, and thorough grid resolution studies are key factors necessary for success.
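
    The consistency checks described here rest on grid refinement studies; one standard ingredient is the observed order of accuracy computed from solutions on three systematically refined grids. A minimal sketch follows, with invented sample values (this illustrates the generic Richardson-style check, not data from the paper):

        # Sketch: observed order of accuracy from three systematically refined
        # grids (fine f1, medium f2, coarse f3) with constant refinement ratio r.
        from math import log

        def observed_order(f1: float, f2: float, f3: float, r: float) -> float:
            # p = ln((f3 - f2) / (f2 - f1)) / ln(r), for monotone convergence.
            return log((f3 - f2) / (f2 - f1)) / log(r)

        # Invented drag-coefficient values on grids refined by a factor of 2:
        f_fine, f_medium, f_coarse = 0.02831, 0.02840, 0.02876
        p = observed_order(f_fine, f_medium, f_coarse, r=2.0)
        print(f"observed order of accuracy ~ {p:.2f}")  # ~2 for a 2nd-order scheme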

  4. Verification bias: an under-recognized source of error in assessing the efficacy of MRI of the meniscii.

    PubMed

    Richardson, Michael L; Petscavage, Jonelle M

    2011-11-01

    The sensitivity and specificity of magnetic resonance imaging (MRI) for diagnosis of meniscal tears has been studied extensively, with tears usually verified by surgery. However, surgically unverified cases are often not considered in these studies, leading to verification bias, which can falsely increase the sensitivity and decrease the specificity estimates. Our study suggests that such bias may be very common in the meniscal MRI literature, and illustrates techniques to detect and correct for such bias. PubMed was searched for articles estimating sensitivity and specificity of MRI for meniscal tears. These were assessed for verification bias, deemed potentially present if a study included any patients whose MRI findings were not surgically verified. Retrospective global sensitivity analysis (GSA) was performed when possible. Thirty-nine of the 314 studies retrieved from PubMed specifically dealt with meniscal tears. All 39 included unverified patients, and hence, potential verification bias. Only seven articles included sufficient information to perform GSA. Of these, one showed definite verification bias, two showed no bias, and four others showed bias within certain ranges of disease prevalence. Only 9 of 39 acknowledged the possibility of verification bias. Verification bias is underrecognized and potentially common in published estimates of the sensitivity and specificity of MRI for the diagnosis of meniscal tears. When possible, it should be avoided by proper study design. If unavoidable, it should be acknowledged. Investigators should tabulate unverified as well as verified data. Finally, verification bias should be estimated; if present, corrected estimates of sensitivity and specificity should be used. Our online web-based calculator makes this process relatively easy. Copyright © 2011 AUR. Published by Elsevier Inc. All rights reserved.
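
    For correcting published estimates, a classical option is the Begg and Greenes (1983) adjustment, valid when the decision to verify (here, to operate) depends only on the MRI result. The sketch below illustrates that standard correction with invented counts; it is not necessarily the GSA procedure the authors applied:

        # Sketch of the Begg-Greenes (1983) correction for verification bias,
        # applicable when verification depends only on the MRI result.
        # All counts below are invented for illustration.

        def corrected_se_sp(n_pos, n_neg, v_pos, tp, v_neg, fn):
            # n_pos/n_neg: all patients with positive/negative MRI;
            # v_pos/v_neg: verified (operated) subsets; tp/fn: diseased among them.
            p_d_pos = tp / v_pos  # P(tear | MRI+), from verified MRI+ cases
            p_d_neg = fn / v_neg  # P(tear | MRI-), from verified MRI- cases
            se = p_d_pos * n_pos / (p_d_pos * n_pos + p_d_neg * n_neg)
            sp = (1 - p_d_neg) * n_neg / (
                (1 - p_d_neg) * n_neg + (1 - p_d_pos) * n_pos)
            return se, sp

        # Invented example: MRI+ patients are operated on far more often than MRI-.
        se, sp = corrected_se_sp(n_pos=200, n_neg=300, v_pos=180, tp=150,
                                 v_neg=30, fn=6)
        print(f"corrected sensitivity = {se:.3f}, specificity = {sp:.3f}")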

  5. Industrial methodology for process verification in research (IMPROVER): toward systems biology verification

    PubMed Central

    Meyer, Pablo; Hoeng, Julia; Rice, J. Jeremy; Norel, Raquel; Sprengel, Jörg; Stolle, Katrin; Bonk, Thomas; Corthesy, Stephanie; Royyuru, Ajay; Peitsch, Manuel C.; Stolovitzky, Gustavo

    2012-01-01

    Motivation: Analyses and algorithmic predictions based on high-throughput data are essential for the success of systems biology in academic and industrial settings. Organizations, such as companies and academic consortia, conduct large multi-year scientific studies that entail the collection and analysis of thousands of individual experiments, often over many physical sites and with internal and outsourced components. To extract maximum value, the interested parties need to verify the accuracy and reproducibility of data and methods before the initiation of such large multi-year studies. However, systematic and well-established verification procedures do not exist for automated collection and analysis workflows in systems biology, which could lead to inaccurate conclusions. Results: We present here a review of the current state of systems biology verification and a detailed methodology to address its shortcomings. This methodology, named 'Industrial Methodology for Process Verification in Research' or IMPROVER, consists of evaluating a research program by dividing a workflow into smaller building blocks that are individually verified. The verification of each building block can be done internally by members of the research program or externally by 'crowd-sourcing' to an interested community. www.sbvimprover.com Implementation: This methodology could become the preferred choice to verify systems biology research workflows that are becoming increasingly complex and sophisticated in industrial and academic settings. Contact: gustavo@us.ibm.com PMID:22423044

  6. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Doane, G. B., III; Glaese, J. R.; Tollison, D. K.; Howsman, T. G.; Curtis, S. (Editor); Banks, B.

    1984-01-01

    Control theory and design, dynamic system modelling, and simulation of test scenarios are the main ideas discussed. The overall effort is the achievement at Marshall Space Flight Center of a successful ground test experiment of a large space structure. A simplified planar model for ground test verification was developed. The elimination from that model of the uncontrollable rigid body modes was also examined. Also studied were the hardware/software aspects of computation speed.

  7. Advanced turboprop testbed systems study

    NASA Technical Reports Server (NTRS)

    Goldsmith, I. M.

    1982-01-01

    The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having a successfully expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor, and modifications of existing engine and propeller controls are adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.

  8. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the service platform for Orion spacecraft processing. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  9. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the top level of the service platform for Orion spacecraft processing. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  10. 78 FR 67335 - Solid Urea From the Russian Federation: Final Results of Antidumping Duty Administrative Review...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-11-12

    ... (Russia). For the final results, we continue to find that MCC EuroChem has not sold subject merchandise at... of the antidumping duty order on solid urea from Russia.\\1\\ We invited interested parties to comment... conducted a verification of the sales information reported by MCC EuroChem in Russia.\\2\\ \\2\\ See Memorandum...

  11. Prototype test article verification of the Space Station Freedom active thermal control system microgravity performance

    NASA Technical Reports Server (NTRS)

    Chen, I. Y.; Ungar, E. K.; Lee, D. Y.; Beckstrom, P. S.

    1993-01-01

    To verify the on-orbit operation of the Space Station Freedom (SSF) two-phase external Active Thermal Control System (ATCS), a test and verification program will be performed prior to flight. The first system level test of the ATCS is the Prototype Test Article (PTA) test that will be performed in early 1994. All ATCS loops will be represented by prototypical components and the line sizes and lengths will be representative of the flight system. In this paper, the SSF ATCS and a portion of its verification process are described. The PTA design and the analytical methods that were used to quantify the gravity effects on PTA operation are detailed. Finally, the gravity effects are listed, and the applicability of the 1-g PTA test results to the validation of on-orbit ATCS operation is discussed.

  12. Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies

    NASA Technical Reports Server (NTRS)

    Shum, C. K.

    2000-01-01

    This Final Technical Report summarizes the research work conducted under NASA's Physical Oceanography Program entitled, Verification and Improvement of ERS-1/2 Altimeter Geophysical Data Records for Global Change Studies, for the time period from January 1, 2000 through June 30, 2000. This report also provides a summary of the investigation from July 1, 1997 - June 30, 2000. The primary objectives of this investigation include verification and improvement of the ERS-1 and ERS-2 radar altimeter geophysical data records for distribution of the data to the ESA-approved U.S. ERS-1/-2 investigators for global climate change studies. Specifically, the investigation is to verify and improve the ERS geophysical data record products by calibrating the instrument and assessing accuracy for the ERS-1/-2 orbital, geophysical, media, and instrument corrections. The purpose is to ensure the consistency of constants, standards and algorithms with the TOPEX/POSEIDON radar altimeter for global climate change studies such as the monitoring and interpretation of long-term sea level change. This investigation has provided the current best precise orbits, with the radial orbit accuracy for ERS-1 (Phases C-G) and ERS-2 estimated at the 3-5 cm rms level, a 30-fold improvement compared to the 1993 accuracy. We have finalized the production and verification of the value-added ERS-1 mission (Phases A, B, C, D, E, F, and G), in collaboration with JPL PODAAC and the University of Texas. Orbit and data verification and improvement of algorithms led to the best data product available to date. ERS-2 altimeter data have been improved and we have been active in Envisat (2001 launch) GDR algorithm review and improvement. The data improvement of ERS-1 and ERS-2 led to improvement in the global mean sea surface, marine gravity anomaly and bathymetry models, and a study of Antarctica mass balance, which was published in Science in 1998.

  13. Lightweight Small Arms Technologies

    DTIC Science & Technology

    2006-11-01

    conducted using several methods. Initial measurements were obtained using a strand burner, followed by closed bomb measurements using both pressed... pellets and entire cases. Specialized fixtures were developed to measure primer and booster combustion properties. The final verification of interior

  14. Verification of the ODOT overlay design procedure : final report, June 1996.

    DOT National Transportation Integrated Search

    1996-06-01

    The current ODOT overlay design procedure sometimes indicates additional pavement thickness is needed right after the overlay construction. Evaluation of the current procedure reveals that using spreadability to back-calculate existing pavement modulu...

  15. Preparation of the House Bill 3624 report : final report.

    DOT National Transportation Integrated Search

    2010-03-01

    Senate Bill 1080 (2008 Special Session) tightened documentation and identity verification requirements for the issuance, replacement and renewal of Oregon driver licenses, driver permits and identification cards. The law was signed by the Governor on...

  16. Clean assembly and integration techniques for the Hubble Space Telescope High Fidelity Mechanical Simulator

    NASA Technical Reports Server (NTRS)

    Hughes, David W.; Hedgeland, Randy J.

    1994-01-01

    A mechanical simulator of the Hubble Space Telescope (HST) Aft Shroud was built to perform verification testing of the Servicing Mission Scientific Instruments (SI's) and to provide a facility for astronaut training. All assembly, integration, and test activities occurred under the guidance of a contamination control plan, and all work was reviewed by a contamination engineer prior to implementation. An integrated approach was followed in which materials selection, manufacturing, assembly, subsystem integration, and end product use were considered and controlled to ensure that the use of the High Fidelity Mechanical Simulator (HFMS) as a verification tool would not contaminate mission critical hardware. Surfaces were cleaned throughout manufacturing, assembly, and integration, and reverification was performed following major activities. Direct surface sampling was the preferred method of verification, but access and material constraints led to the use of indirect methods as well. Although surface geometries and coatings often made contamination verification difficult, final contamination sampling and monitoring demonstrated the ability to maintain a class M5.5 environment with surface levels less than 400B inside the HFMS.

  17. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.

  18. Property-Based Monitoring of Analog and Mixed-Signal Systems

    NASA Astrophysics Data System (ADS)

    Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan

    In the recent past, there has been steady growth in the market for consumer embedded devices such as cell phones, GPS receivers, and portable multimedia systems. In embedded systems, digital, analog, and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential for inserting errors into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated into the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from the digital to the analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lighter-weight approach to formal verification, in which the system is seen as a "black box" that generates sets of traces whose correctness is checked against a property, that is, its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
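    A minimal sketch of what such a monitor can look like for a sampled analog trace, assuming a simple "enter the band and stay there" settling property; the signal, thresholds, and deadline are invented for illustration, and industrial monitors check far richer PSL/SVA-style specifications:

    ```python
    import numpy as np

    def monitor_bounded_settling(t, y, trigger_times, y_lo, y_hi, deadline):
        """Offline monitor for the property: after each trigger time, the
        signal y must enter the band [y_lo, y_hi] within `deadline` seconds
        and stay in it for the remainder of the window."""
        violations = []
        for t0 in trigger_times:
            in_window = (t > t0) & (t <= t0 + deadline)
            settled = (y[in_window] >= y_lo) & (y[in_window] <= y_hi)
            # in-band from the first entry to the end of the window
            ok = settled.any() and settled[np.argmax(settled):].all()
            if not ok:
                violations.append(t0)
        return violations

    # Toy trace: a first-order step response sampled at 1 kHz
    t = np.linspace(0.0, 1.0, 1000)
    y = 1.0 - np.exp(-t / 0.05)
    print(monitor_bounded_settling(t, y, trigger_times=[0.0],
                                   y_lo=0.95, y_hi=1.05, deadline=0.5))
    # [] -- this trace satisfies the property
    ```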

  19. Expose : procedure and results of the joint experiment verification tests

    NASA Astrophysics Data System (ADS)

    Panitz, C.; Rettberg, P.; Horneck, G.; Rabbow, E.; Baglioni, P.

    The International Space Station will carry the EXPOSE facility, accommodated at the universal workplace URM-D located outside the Russian Service Module. The launch will be effected in 2005, and the facility is planned to stay in space for 1.5 years. The tray-like structure will accommodate 2 chemical and 6 biological PI-experiments or experiment systems of the ROSE (Response of Organisms to Space Environment) consortium. EXPOSE will support long-term in situ studies of microbes in artificial meteorites, as well as of microbial communities from special ecological niches, such as endolithic and evaporitic ecosystems. The either vented or sealed experiment pockets will be covered by an optical filter system to control the intensity and spectral range of solar UV irradiation. Control of sun exposure will be achieved by the use of individual shutters. To test the compatibility of the different biological systems and their adaptation to the opportunities and constraints of space conditions, a profound ground support program has been developed. The procedure and first results of these joint Experiment Verification Tests (EVT) will be presented. The results are essential for the success of the EXPOSE mission, and the tests have been conducted in parallel with the development and construction of the final hardware design of the facility. The results of the mission will contribute to the understanding of organic chemistry processes in space, biological adaptation strategies to extreme conditions, e.g., on early Earth and Mars, and the distribution of life beyond its planet of origin.

  20. Quality control of recycled asphaltic concrete : final report.

    DOT National Transportation Integrated Search

    1982-07-01

    This study examined the variations found in recycled asphaltic concrete mix based upon plant quality control data and verification testing. The data was collected from four recycled hot-mix projects constructed in 1981. All plant control and acceptan...

  1. Proving autonomous vehicle and advanced driver assistance systems safety : final research report.

    DOT National Transportation Integrated Search

    2016-02-15

    The main objective of this project was to provide technology for answering crucial safety and correctness questions about verification of autonomous vehicle and advanced driver assistance systems based on logic. In synergistic activities, we ha...

  2. Successful MPPF Pneumatics Verification and Validation Testing

    NASA Image and Video Library

    2017-03-28

    Engineers and technicians completed verification and validation testing of several pneumatic systems inside and outside the Multi-Payload Processing Facility (MPPF) at NASA's Kennedy Space Center in Florida. In view is the service platform for Orion spacecraft processing. To the left are several pneumatic panels. The MPPF will be used for offline processing and fueling of the Orion spacecraft and service module stack before launch. Orion also will be de-serviced in the MPPF after a mission. The Ground Systems Development and Operations Program (GSDO) is overseeing upgrades to the facility. The Engineering Directorate led the recent pneumatic tests.

  3. Authoring and verification of clinical guidelines: a model driven approach.

    PubMed

    Pérez, Beatriz; Porres, Ivan

    2010-08-01

    The goal of this research is to provide a framework enabling the authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality, and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked against them, making the verification process faster and more cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined on the basis of a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
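    The kind of check such a model checker performs can be illustrated with a toy explicit-state search; the guideline states, transitions, and the "no drug A after drug B" safety property below are hypothetical and are not taken from the paper:

    ```python
    from collections import deque

    # Hypothetical guideline statechart flattened to a transition relation,
    # with a deliberate inconsistency of the kind such verification detects.
    transitions = {
        "assess":      ["prescribe_A", "prescribe_B"],
        "prescribe_A": ["monitor"],
        "prescribe_B": ["prescribe_A", "monitor"],  # allows A after B
        "monitor":     ["discharge"],
        "discharge":   [],
    }

    def violates_no_A_after_B(start="assess"):
        """Breadth-first search for a trace violating the safety property
        'drug A is never prescribed after drug B'."""
        queue = deque([(start, False, [start])])   # (state, B seen?, path)
        visited = set()
        while queue:
            state, seen_b, path = queue.popleft()
            if state == "prescribe_A" and seen_b:
                return path                         # counterexample trace
            if (state, seen_b) in visited:
                continue
            visited.add((state, seen_b))
            for nxt in transitions[state]:
                queue.append((nxt, seen_b or state == "prescribe_B",
                              path + [nxt]))
        return None

    print(violates_no_A_after_B())
    # ['assess', 'prescribe_B', 'prescribe_A'] -- an inconsistency to fix
    ```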

  4. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

    In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification has been assessed using a laser tracking device. The accuracy of calibration and of image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the patient positioning system and of the patient verification system motion proved to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the OTS were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and consistently verified in daily clinical operation.

  5. Simulation of Laboratory Tests of Steel Arch Support

    NASA Astrophysics Data System (ADS)

    Horyl, Petr; Šňupárek, Richard; Maršálek, Pavel; Pacześniowski, Krzysztof

    2017-03-01

    The total load-bearing capacity of yielding steel arch roadway supports is among their most important characteristics. These values can be obtained in two ways: experimental measurements in a specialized laboratory or computer modelling by FEM. Experimental measurements are significantly more expensive and more time-consuming; a properly tuned computer model is therefore very valuable, but it requires verification by experiment. In the cooperating workplaces of GIG Katowice, VSB-Technical University of Ostrava and the Institute of Geonics ASCR, this verification was successful. The present article discusses the conditions and results of this verification for static problems. The output is a tuned computer model, which may be used for further calculations to obtain the load-bearing capacity of other types of steel arch supports. The effects of changes in other parameters, such as the material properties of the steel, torque values, and friction coefficient values, can then be determined relatively quickly for the investigated steel arch supports.

  6. Verification and Implementation of Operations Safety Controls for Flight Missions

    NASA Technical Reports Server (NTRS)

    Jones, Cheryl L.; Smalls, James R.; Carrier, Alicia S.

    2010-01-01

    Approximately eleven years ago, the International Space Station launched its first module from Russia, the Functional Cargo Block (FGB). Safety and Mission Assurance (S&MA) Operations (Ops) engineers played an integral part in that endeavor by executing strict flight product verification as well as providing continued staffing of S&MA's console in the Mission Evaluation Room (MER) for that flight mission. How were these engineers able to conduct such a complicated task? They did so through product verification that consisted of ensuring that safety requirements were adequately contained in all flight products affecting crew safety. S&MA Ops engineers apply both systems engineering and project management principles in order to gain the appropriate level of technical knowledge necessary to perform thorough reviews covering the affected subsystem(s). They also ensured that mission priorities were carried out with great detail and success.

  7. Verification and Validation Methodology of Real-Time Adaptive Neural Networks for Aerospace Applications

    NASA Technical Reports Server (NTRS)

    Gupta, Pramod; Loparo, Kenneth; Mackall, Dale; Schumann, Johann; Soares, Fola

    2004-01-01

    Recent research has shown that adaptive neural-based control systems are very effective in restoring stability and control of an aircraft in the presence of damage or failures. The application of an adaptive neural network within a flight-critical control system requires a thorough and proven process to ensure safe and proper flight operation. Unique testing tools have been developed as part of a process to perform verification and validation (V&V) of the real-time adaptive neural networks used in recent adaptive flight control systems and to evaluate the performance of the online-trained neural networks. The tools will help in certification by the FAA and in the successful deployment of neural-network-based adaptive controllers in safety-critical applications. The process for performing verification and validation is evaluated against a typical neural adaptive controller, and the results are discussed.
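    One ingredient of such a V&V process is a runtime monitor on the adaptive element. The sketch below assumes a simple moving-average error envelope; the window, bound, and simulated degradation are invented, and the actual tools use more elaborate performance measures:

    ```python
    import numpy as np

    class AdaptationMonitor:
        """Track the moving average of the adaptive network's prediction
        error and flag when it leaves a pre-verified envelope.
        Thresholds here are illustrative only."""
        def __init__(self, window=50, error_bound=0.2):
            self.window, self.error_bound = window, error_bound
            self.errors = []

        def update(self, predicted, measured):
            self.errors.append(abs(predicted - measured))
            mean_err = float(np.mean(self.errors[-self.window:]))
            return mean_err <= self.error_bound, mean_err

    # Simulated check: the network tracks well, then degrades after sample 300
    mon = AdaptationMonitor()
    rng = np.random.default_rng(0)
    for k in range(400):
        truth = np.sin(0.05 * k)
        pred = truth + rng.normal(0, 0.05) + (0.5 if k > 300 else 0.0)
        ok, err = mon.update(pred, truth)
        if not ok:
            print(f"envelope exceeded at sample {k}: mean error {err:.3f}")
            break
    ```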

  8. Formal verification of human-automation interaction

    NASA Technical Reports Server (NTRS)

    Degani, Asaf; Heymann, Michael

    2002-01-01

    This paper discusses a formal and rigorous approach to the analysis of operator interaction with machines. It addresses the acute problem of detecting design errors in human-machine interaction and focuses on verifying the correctness of the interaction in complex and automated control systems. The paper describes a systematic methodology for evaluating whether the interface provides the necessary information about the machine to enable the operator to perform a specified task successfully and unambiguously. It also addresses the adequacy of information provided to the user via training material (e.g., user manual) about the machine's behavior. The essentials of the methodology, which can be automated and applied to the verification of large systems, are illustrated by several examples and through a case study of pilot interaction with an autopilot aboard a modern commercial aircraft. The expected application of this methodology is an augmentation and enhancement, by formal verification, of human-automation interfaces.

  9. Field verification process for open-graded HMAC mixes : final report.

    DOT National Transportation Integrated Search

    2002-07-01

    The State of Oregon uses significant amounts of open-graded HMAC mixes as primary wearing courses on state highways. The primary materials design system for these mixes relies heavily on laboratory draindown to select the design asphalt content. Subs...

  10. Video Vehicle Detector Verification System (V2DVS) operators manual and project final report.

    DOT National Transportation Integrated Search

    2012-03-01

    The accurate detection of the presence, speed and/or length of vehicles on roadways is recognized as critical for : effective roadway congestion management and safety. Vehicle presence sensors are commonly used for traffic : volume measurement and co...

  11. A Compact, Portable, Reduced-Cost, Gamma Ray Spectroscopic System for Nuclear Verification Final Report CRADA No. TSB-1551-98

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lavietes, A.; Kalkhoran, N.

    The overall goal of this project was to demonstrate a compact gamma-ray spectroscopic system with better energy resolution and lower costs than scintillator-based detector systems for uranium enrichment analysis applications.

  12. Intermediate Experimental Vehicle (IXV): Avionics and Software of the ESA Reentry Demonstrator

    NASA Astrophysics Data System (ADS)

    Malucchi, Giovanni; Dussy, Stephane; Camuffo, Fabrizio

    2012-08-01

    The IXV project is conceived as a technology platform to take a step forward with respect to the Atmospheric Reentry Demonstrator (ARD) by increasing system maneuverability and verifying critical technology performance against a wider reentry corridor. The main objective is to design, develop, and perform an in-flight verification of an autonomous lifting and aerodynamically controlled (by a combined use of thrusters and aerodynamic surfaces) reentry system. The project also includes the verification and experimentation of a set of critical reentry technologies and disciplines: Thermal Protection System (TPS), for verification and characterization of thermal protection technologies in a representative operational environment; Aerodynamics-Aerothermodynamics (AED-ATD), for understanding and validation of aerodynamic and aerothermodynamic phenomena with improvement of design tools; Guidance, Navigation and Control (GNC), for verification of guidance, navigation, and control techniques in a representative operational environment (i.e., reentry from Low Earth Orbit); and flight dynamics, to update and validate the vehicle model during actual flight, focused on stability and control derivatives. The above activities are being performed through the implementation of a strict system design-to-cost approach with a proto-flight model development philosophy. In 2008 and 2009, the IXV project activities reached the successful completion of the project Phase-B, including the System PDR, and early project Phase-C. In 2010, following a re-organization of the industrial consortium, the IXV project successfully completed a design consolidation leading to an optimization of the technical baseline, including the GNC, avionics (i.e., power, data handling, radio frequency and telemetry), measurement sensors, hot and cold composite structures, and thermal protection and control, with significant improvements of the main system budgets. The project successfully closed the System CDR during 2011 and is currently running Phase-D, with the target of being launched on Vega from Kourou in 2014. The paper will provide an overview of the IXV design and mission objectives in the frame of the overall atmospheric reentry activities, focusing on the avionics and software architecture and design.

  13. Evidence flow graph methods for validation and verification of expert systems

    NASA Technical Reports Server (NTRS)

    Becker, Lee A.; Green, Peter G.; Bhatnagar, Jayant

    1988-01-01

    This final report describes the results of an investigation into the use of evidence flow graph techniques for performing validation and verification of expert systems. This was approached by developing a translator to convert horn-clause rule bases into evidence flow graphs, a simulation program, and methods of analysis. These tools were then applied to a simple rule base which contained errors. It was found that the method was capable of identifying a variety of problems, for example that the order of presentation of input data or small changes in critical parameters could affect the output from a set of rules.

  14. Palmprint verification using Lagrangian decomposition and invariant interest points

    NASA Astrophysics Data System (ADS)

    Gupta, P.; Rattani, A.; Kisku, D. R.; Hwang, C. J.; Sing, J. K.

    2011-06-01

    This paper presents a palmprint-based verification system using SIFT features and a Lagrangian network graph technique. We employ SIFT for feature extraction from palmprint images, where the region of interest (ROI), extracted from the wide palm texture at the preprocessing stage, is used for invariant point extraction. Finally, identity is established by finding a permutation matrix for a pair of reference and probe palm graphs drawn on the extracted SIFT features. The permutation matrix is used to minimize the distance between the two graphs. The proposed system has been tested on the CASIA and IITK palmprint databases, and experimental results reveal the effectiveness and robustness of the system.
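    The permutation-based matching step can be sketched with a standard assignment solver standing in for the paper's Lagrangian network technique; the descriptors, noise levels, and acceptance threshold below are invented for illustration:

    ```python
    import numpy as np
    from scipy.optimize import linear_sum_assignment
    from scipy.spatial.distance import cdist

    def palm_match_score(ref_desc, probe_desc):
        """Find the permutation (assignment) of probe features to reference
        features that minimizes total descriptor distance, and return the
        mean distance of the matched pairs."""
        cost = cdist(ref_desc, probe_desc)         # pairwise L2 distances
        rows, cols = linear_sum_assignment(cost)   # optimal permutation
        return cost[rows, cols].mean()

    rng = np.random.default_rng(1)
    ref = rng.normal(size=(40, 128))               # 40 SIFT-like descriptors
    genuine = ref + rng.normal(0, 0.1, ref.shape)  # same palm, slight noise
    impostor = rng.normal(size=(40, 128))          # different palm

    threshold = 5.0                                # illustrative only
    for name, probe in [("genuine", genuine), ("impostor", impostor)]:
        score = palm_match_score(ref, probe)
        print(name, round(score, 2), "accept" if score < threshold else "reject")
    ```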

  15. Power Performance Verification of a Wind Farm Using the Friedman's Test.

    PubMed

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L

    2016-06-03

    In this paper, a method for verification of the power performance of a wind farm is presented. The method is based on the Friedman test, a nonparametric statistical inference technique, and it uses the information collected by the SCADA system from sensors embedded in the wind turbines to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable.
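    A minimal sketch of the statistical core of the method, with the guaranteed power curve treated as one more "turbine" as described above; the per-bin power values are invented, not measured data:

    ```python
    import numpy as np
    from scipy.stats import friedmanchisquare

    # Rows of each array: power output (kW) in matched wind-speed bins.
    # The guaranteed curve acts as a virtual turbine among the real ones.
    guaranteed = np.array([120, 310, 640, 980, 1300, 1480])
    turbine_1  = np.array([118, 305, 630, 975, 1290, 1470])
    turbine_2  = np.array([115, 300, 620, 960, 1280, 1460])
    turbine_3  = np.array([ 90, 250, 540, 870, 1150, 1320])  # underperformer

    stat, p = friedmanchisquare(guaranteed, turbine_1, turbine_2, turbine_3)
    print(f"Friedman chi-square = {stat:.2f}, p = {p:.4f}")
    if p < 0.05:
        print("power performance differs significantly across turbines")
    ```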

  16. Power Performance Verification of a Wind Farm Using the Friedman’s Test

    PubMed Central

    Hernandez, Wilmar; López-Presa, José Luis; Maldonado-Correa, Jorge L.

    2016-01-01

    In this paper, a method for verification of the power performance of a wind farm is presented. The method is based on the Friedman test, a nonparametric statistical inference technique, and it uses the information collected by the SCADA system from sensors embedded in the wind turbines to carry out the power performance verification of a wind farm. Here, the guaranteed power curve of the wind turbines is used as one more wind turbine of the wind farm under assessment, and a multiple comparison method is used to investigate differences between pairs of wind turbines with respect to their power performance. The proposed method indicates whether the power performance of the specific wind farm under assessment differs significantly from what would be expected, and it also allows wind farm owners to know whether their wind farm has a perfect power performance or an acceptable power performance. Finally, the power performance verification of an actual wind farm is carried out. The results of the application of the proposed method showed that the power performance of the specific wind farm under assessment was acceptable. PMID:27271628

  17. Assessment of Galileo modal test results for mathematical model verification

    NASA Technical Reports Server (NTRS)

    Trubert, M.

    1984-01-01

    The modal test program for the Galileo Spacecraft was completed at the Jet Propulsion Laboratory in the summer of 1983. The multiple sine dwell method was used for the baseline test. The Galileo Spacecraft is a rather complex 2433 kg structure made of a central core on which seven major appendages representing 30 percent of the total mass are attached, resulting in a high modal density structure. The test revealed a strong nonlinearity in several major modes. This nonlinearity discovered in the course of the test necessitated running additional tests at the unusually high response levels of up to about 21 g. The high levels of response were required to obtain a model verification valid at the level of loads for which the spacecraft was designed. Because of the high modal density and the nonlinearity, correlation between the dynamic mathematical model and the test results becomes a difficult task. Significant changes in the pre-test analytical model are necessary to establish confidence in the upgraded analytical model used for the final load verification. This verification, using a test verified model, is required by NASA to fly the Galileo Spacecraft on the Shuttle/Centaur launch vehicle in 1986.

  18. Life cycle management of analytical methods.

    PubMed

    Parr, Maria Kristina; Schmidt, Alexander H

    2018-01-05

    In modern process management, the life cycle concept is gaining more and more importance. It focuses on the total costs of the process, from investment through operation to retirement. In recent years, increasing interest in this concept has also developed for analytical procedures. The life cycle of an analytical method consists of design, development, validation (including instrumental qualification, continuous method performance verification and method transfer) and, finally, retirement of the method. Regulatory bodies, too, appear to have increased their awareness of life cycle management for analytical methods. Thus, the International Council for Harmonisation of Technical Requirements for Pharmaceuticals for Human Use (ICH), as well as the United States Pharmacopeial Forum, are discussing the adoption of new guidelines that include life cycle management of analytical methods. The US Pharmacopeia (USP) Validation and Verification expert panel has already proposed a new General Chapter 〈1220〉, "The Analytical Procedure Lifecycle", for integration into the USP. Furthermore, a growing interest in life cycle management is also seen in the non-regulated environment. Quality-by-design-based method development results in increased method robustness. This reduces the effort needed for method performance verification and post-approval changes, and minimizes the risk of method-related out-of-specification results, strongly contributing to reduced costs of the method during its life cycle. Copyright © 2017 Elsevier B.V. All rights reserved.

  19. ON-LINE MONITORING OF I&C TRANSMITTERS AND SENSORS FOR CALIBRATION VERIFICATION AND RESPONSE TIME TESTING WAS SUCCESSFULLY IMPLEMENTED AT ATR

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Erickson, Phillip A.; O'Hagan, Ryan; Shumaker, Brent

    The Advanced Test Reactor (ATR) has always had a comprehensive procedure to verify the performance of its critical transmitters and sensors, including RTDs and pressure, level, and flow transmitters. These transmitters and sensors have been periodically tested for response time and calibration verification to ensure accuracy. With the implementation of online monitoring (OLM) techniques at ATR, the calibration and response time of these transmitters and sensors are now verified remotely, automatically, and hands-off; the verification covers more portions of the system and can be performed at almost any time during process operations. The work was done under a DOE-funded SBIR project carried out by AMS. As a result, ATR is now able to save the manpower that had been spent over the years on manual calibration verification and response time testing of its temperature and pressure sensors and refocus those resources toward other equipment reliability needs. More importantly, the implementation of OLM will help enhance overall availability, safety, and efficiency. Together with ATR's equipment reliability programs, the integration of OLM will also support the I&C aging management goals of the Department of Energy and the long-term operation of ATR.
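    As a rough illustration of one generic OLM technique (assumed here; not necessarily the specific method deployed at ATR), the sketch below flags calibration drift by comparing redundant channels against their ensemble median, with invented readings:

    ```python
    import numpy as np

    def drift_check(readings, tol):
        """Compare each redundant channel against the per-sample ensemble
        median and flag channels whose mean deviation exceeds a
        calibration tolerance."""
        readings = np.asarray(readings, dtype=float)
        best_estimate = np.median(readings, axis=0)     # per-sample estimate
        deviation = np.mean(np.abs(readings - best_estimate), axis=1)
        return {f"channel_{i}": round(float(dev), 3)
                for i, dev in enumerate(deviation) if dev > tol}

    rng = np.random.default_rng(2)
    truth = 250.0 + 2.0 * np.sin(np.linspace(0, 6, 500))   # process temp, C
    channels = [truth + rng.normal(0, 0.1, truth.size) for _ in range(3)]
    channels.append(truth + 0.8 + rng.normal(0, 0.1, truth.size))  # drifted

    print(drift_check(channels, tol=0.5))   # flags only the drifted channel
    ```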

  20. Nonlinear 3D MHD verification study: SpeCyl and PIXIE3D codes for RFP and Tokamak plasmas

    NASA Astrophysics Data System (ADS)

    Bonfiglio, D.; Cappello, S.; Chacon, L.

    2010-11-01

    A strong emphasis is presently placed in the fusion community on reaching predictive capability of computational models. An essential requirement of such an endeavor is the process of assessing the mathematical correctness of computational tools, termed verification [1]. We present here a successful nonlinear cross-benchmark verification study between the 3D nonlinear MHD codes SpeCyl [2] and PIXIE3D [3]. Excellent quantitative agreement is obtained in both 2D and 3D nonlinear visco-resistive dynamics for reversed-field pinch (RFP) and tokamak configurations [4]. RFP dynamics, in particular, lends itself as an ideal nontrivial test-bed for 3D nonlinear verification. Perspectives for future application of the fully-implicit parallel code PIXIE3D to RFP physics, in particular to address open issues in RFP helical self-organization, will be provided. [1] M. Greenwald, Phys. Plasmas 17, 058101 (2010) [2] S. Cappello and D. Biskamp, Nucl. Fusion 36, 571 (1996) [3] L. Chacón, Phys. Plasmas 15, 056103 (2008) [4] D. Bonfiglio, L. Chacón and S. Cappello, Phys. Plasmas 17 (2010)
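    A cross-benchmark study of this kind ultimately reduces to quantitative comparison of the two codes' solutions on shared cases. A minimal sketch of one such comparison metric, applied to synthetic stand-in fields rather than actual SpeCyl/PIXIE3D output:

    ```python
    import numpy as np

    def relative_l2_difference(field_a, field_b):
        """Global relative L2 difference between two codes' solutions of
        the same case -- a typical quantitative agreement metric in a
        cross-code verification study."""
        return np.linalg.norm(field_a - field_b) / np.linalg.norm(field_a)

    # Synthetic stand-ins for, e.g., a field component on a shared grid
    x = np.linspace(0, 2 * np.pi, 128)
    code_a_field = np.sin(x) * np.exp(-0.1 * x)
    code_b_field = code_a_field + 1e-6 * np.cos(3 * x)   # tiny discrepancy

    print(f"relative L2 difference: "
          f"{relative_l2_difference(code_a_field, code_b_field):.2e}")
    ```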

  1. Design and experimental verification for optical module of optical vector-matrix multiplier.

    PubMed

    Zhu, Weiwei; Zhang, Lei; Lu, Yangyang; Zhou, Ping; Yang, Lin

    2013-06-20

    Optical computing is a new method to implement signal processing functions. The multiplication of a vector by a matrix is an important arithmetic operation in the signal processing domain. The optical vector-matrix multiplier (OVMM) is an optoelectronic system that carries out this operation, consisting of an electronic module and an optical module. In this paper, we propose an optical module for an OVMM. To eliminate cross talk and make full use of the optical elements, an elaborately designed structure involving spherical lenses and cylindrical lenses is utilized in this optical system. The optical design software package ZEMAX is used to optimize the parameters and simulate the whole system. Finally, experimental data are obtained to evaluate the overall performance of the system. The results of both simulation and experiment indicate that the constructed system can successfully implement the multiplication of a 16-element vector by a 16-by-16 matrix.
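    Numerically, the operation the optical module implements is an ordinary vector-matrix product. The sketch below adds an assumed 8-bit quantization of the modulator transmittances to illustrate one non-ideality such a design must budget for; the resolution is illustrative, not taken from the paper:

    ```python
    import numpy as np

    rng = np.random.default_rng(3)
    x = rng.random(16)                   # source intensities (input vector)
    M = rng.random((16, 16))             # spatial-light-modulator pattern

    levels = 255                         # assumed 8-bit transmittance control
    M_quant = np.round(M * levels) / levels

    y_ideal = M @ x                      # the multiplication the OVMM performs
    y_optical = M_quant @ x              # with quantized transmittances
    print("max relative error from quantization:",
          np.max(np.abs(y_optical - y_ideal) / np.abs(y_ideal)))
    ```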

  2. POGO ground simulation test of H-I launch vehicle's second stage

    NASA Astrophysics Data System (ADS)

    Ono, Yoshio; Kohsetsu, Yuji; Shibukawa, Kiwao

    This paper describes a POGO ground simulation test of the Japanese new second stage for the H-I launch vehicle. It was the final prelaunch verification test of POGO prevention for the H-I. The test was planned to examine POGO stability and was conducted as a Captive Firing Test (CFT) by mounting a flight-type second stage on the CFT test stand via a soft suspension system, which gave the vehicle a pseudo in-flight free-free boundary condition in terms of the vehicle's structural dynamics. There was no indication of POGO in the data measured during the CFT. Consequently, this test suggested that the new second stage of the H-I was POGO-free. Therefore, it was decided that the first test flight (TF no. 1) of the H-I would be made without a POGO suppression device. TF no. 1 was launched successfully on August 13, 1986, and its telemetry data showed no evidence of the POGO phenomenon.

  3. Online 3D EPID-based dose verification: Proof of concept.

    PubMed

    Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel

    2016-07-01

    Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
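    The comparison logic described above (mean dose in the target and non-target volumes, plus the near-maximum dose D2 in the non-target volume) can be sketched as follows; the dose grids, masks, and 5% tolerance are invented for illustration and are not the clinical alert criteria:

    ```python
    import numpy as np

    def verify_dose(planned, reconstructed, target_mask, nontarget_mask,
                    tol=0.05):
        """Compare planned vs. reconstructed dose: mean dose in both
        volumes, and D2 (dose to the hottest 2% of voxels, i.e. the 98th
        percentile) in the non-target volume."""
        checks = {
            "target_mean": (planned[target_mask].mean(),
                            reconstructed[target_mask].mean()),
            "nontarget_mean": (planned[nontarget_mask].mean(),
                               reconstructed[nontarget_mask].mean()),
            "nontarget_D2": (np.percentile(planned[nontarget_mask], 98),
                             np.percentile(reconstructed[nontarget_mask], 98)),
        }
        return {name: abs(rec - plan) / plan <= tol
                for name, (plan, rec) in checks.items()}

    rng = np.random.default_rng(4)
    planned = rng.gamma(5.0, 10.0, size=(40, 40, 40))   # toy dose grid, cGy
    reconstructed = planned * 1.02                      # 2% global offset
    target = planned > 80
    nontarget = (~target) & (planned >= 10)             # >= 10 cGy region
    print(verify_dose(planned, reconstructed, target, nontarget))
    ```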

  4. Side impact test and analyses of a DOT-111 tank car : final report.

    DOT National Transportation Integrated Search

    2015-10-01

    Transportation Technology Center, Inc. conducted a side impact test on a DOT-111 tank car to evaluate the performance of the : tank car under dynamic impact conditions and to provide data for the verification and refinement of a computational model. ...

  5. FINAL REPORT FOR VERIFICATION OF THE METAL FINISHING FACILITY POLLUTION PREVENTION TOOL (MFFPPT)

    EPA Science Inventory

    The United States Environmental Protection Agency (USEPA) has prepared a computer process simulation package for the metal finishing industry that enables users to predict process outputs based upon process inputs and other operating conditions. This report documents the developm...

  6. Field verification of geogrid properties for base course reinforcement applications : final report.

    DOT National Transportation Integrated Search

    2013-11-01

    The proposed field study is a continuation of a recently concluded, ODOT-funded project titled: Development of ODOT Guidelines for the Use of Geogrids in Aggregate Bases, which is aimed at addressing the need for improved guidelines for base reinforc...

  7. Calibration of region-specific gates pile driving formula for LRFD : final report 561.

    DOT National Transportation Integrated Search

    2016-05-01

    This research project proposes new DOTD pile driving formulas for pile capacity verification using pile driving blow : counts obtained at either end-of-initial driving (EOID) or at the beginning-of-restrike (BOR). The pile driving : formulas were dev...

  8. 76 FR 17287 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-03-28

    ...EPA is finalizing rule revisions that modify existing requirements for sources affected by the federally administered emission trading programs including the NOX Budget Trading Program, the Acid Rain Program, and the Clean Air Interstate Rule. EPA is amending its Protocol Gas Verification Program (PGVP) and the minimum competency requirements for air emission testing (formerly air emission testing body requirements) to improve the accuracy of emissions data. EPA is also amending other sections of the Acid Rain Program continuous emission monitoring system regulations by adding and clarifying certain recordkeeping and reporting requirements, removing the provisions pertaining to mercury monitoring and reporting, removing certain requirements associated with a class-approved alternative monitoring system, disallowing the use of a particular quality assurance option in EPA Reference Method 7E, adding two incorporations by reference that were inadvertently left out of the January 24, 2008 final rule, adding two new definitions, revising certain compliance dates, and clarifying the language and applicability of certain provisions.

  9. Sandia National Laboratories: Malware Technical Exchange Meeting (MTEM)

    Science.gov Websites


  10. Pretest information for a test to validate plume simulation procedures (FA-17)

    NASA Technical Reports Server (NTRS)

    Hair, L. M.

    1978-01-01

    The results of an effort to plan a final verification wind tunnel test to validate the recommended correlation parameters and application techniques were presented. The test planning effort was complete except for test site finalization and the associated coordination. Two suitable test sites were identified. Desired test conditions were shown. Subsequent sections of this report present the selected model and test site, instrumentation of this model, planned test operations, and some concluding remarks.

  11. A Multiscale Computational Model Combining a Single Crystal Plasticity Constitutive Model with the Generalized Method of Cells (GMC) for Metallic Polycrystals.

    PubMed

    Ghorbani Moghaddam, Masoud; Achuthan, Ajit; Bednarcyk, Brett A; Arnold, Steven M; Pineda, Evan J

    2016-05-04

    A multiscale computational model is developed for determining the elasto-plastic behavior of polycrystal metals by employing a single crystal plasticity constitutive model that can capture the microstructural-scale stress field in a finite element analysis (FEA) framework. The generalized method of cells (GMC) micromechanics model is used for homogenizing the local field quantities. At first, the stand-alone GMC is applied to simple material microstructures such as a repeating unit cell (RUC) containing a single grain or two grains under uniaxial loading conditions. For verification, the results obtained by the stand-alone GMC are compared to those from an analogous FEA model incorporating the same single crystal plasticity constitutive model. This verification is then extended to samples containing tens to hundreds of grains. The results demonstrate that GMC homogenization combined with the crystal plasticity constitutive framework is a promising approach for failure analysis of structures, as it allows properly predicting the von Mises stress in the entire RUC, in an average sense, as well as at the local microstructural level, i.e., in each individual grain. Two to three orders of magnitude of savings in computational cost, at the expense of some accuracy in prediction, especially in the prediction of the components of local tensor field quantities and the quantities near the grain boundaries, was obtained with GMC. Finally, the capability of the developed multiscale model linking FEA and GMC to solve real-life-sized structures is demonstrated by successfully analyzing an engine disc component and determining the microstructural-scale details of the field quantities.
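    The field quantity checked in this verification, the von Mises equivalent stress, is computed the same way at the grain and RUC levels; a minimal sketch with invented subcell stresses rather than GMC output:

    ```python
    import numpy as np

    def von_mises(sigma):
        """Von Mises equivalent stress from a 3x3 Cauchy stress tensor."""
        dev = sigma - np.trace(sigma) / 3.0 * np.eye(3)   # deviatoric part
        return np.sqrt(1.5 * np.sum(dev * dev))

    # Toy homogenization: average the per-subcell (per-grain) stresses,
    # then compare grain-level and RUC-level von Mises values.
    rng = np.random.default_rng(5)
    grain_stresses = []
    for _ in range(8):                         # 8 hypothetical subcells
        a = rng.normal(0, 50, (3, 3))
        grain_stresses.append((a + a.T) / 2)   # symmetric stress tensor, MPa
    ruc_stress = np.mean(grain_stresses, axis=0)

    print("per-grain von Mises:",
          [round(von_mises(s), 1) for s in grain_stresses])
    print("homogenized RUC von Mises:", round(von_mises(ruc_stress), 1))
    ```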

  12. A Multiscale Computational Model Combining a Single Crystal Plasticity Constitutive Model with the Generalized Method of Cells (GMC) for Metallic Polycrystals

    PubMed Central

    Ghorbani Moghaddam, Masoud; Achuthan, Ajit; Bednarcyk, Brett A.; Arnold, Steven M.; Pineda, Evan J.

    2016-01-01

    A multiscale computational model is developed for determining the elasto-plastic behavior of polycrystal metals by employing a single crystal plasticity constitutive model that can capture the microstructural-scale stress field in a finite element analysis (FEA) framework. The generalized method of cells (GMC) micromechanics model is used for homogenizing the local field quantities. At first, the stand-alone GMC is applied to simple material microstructures such as a repeating unit cell (RUC) containing a single grain or two grains under uniaxial loading conditions. For verification, the results obtained by the stand-alone GMC are compared to those from an analogous FEA model incorporating the same single crystal plasticity constitutive model. This verification is then extended to samples containing tens to hundreds of grains. The results demonstrate that GMC homogenization combined with the crystal plasticity constitutive framework is a promising approach for failure analysis of structures, as it allows properly predicting the von Mises stress in the entire RUC, in an average sense, as well as at the local microstructural level, i.e., in each individual grain. Two to three orders of magnitude of savings in computational cost, at the expense of some accuracy in prediction, especially in the prediction of the components of local tensor field quantities and the quantities near the grain boundaries, was obtained with GMC. Finally, the capability of the developed multiscale model linking FEA and GMC to solve real-life-sized structures is demonstrated by successfully analyzing an engine disc component and determining the microstructural-scale details of the field quantities. PMID:28773458

  13. INTRA-AND INTERANNUAL VARIABILITY OF ECOSYSTEM PROCESSES IN SHORTGRASS STEPPE: NEW MODEL, VERIFICATION, SIMULATIONS. (R824993)

    EPA Science Inventory

    The perspectives, information and conclusions conveyed in research project abstracts, progress reports, final reports, journal abstracts and journal publications convey the viewpoints of the principal investigator and may not represent the views and policies of ORD and EPA. Concl...

  14. Determination of somatropin charged variants by capillary zone electrophoresis - optimisation, verification and implementation of the European pharmacopoeia method.

    PubMed

    Storms, S M; Feltus, A; Barker, A R; Joly, M-A; Girard, M

    2009-03-01

    Measurement of somatropin charged variants by isoelectric focusing was replaced with capillary zone electrophoresis in the January 2006 European Pharmacopoeia Supplement 5.3, based on results from an interlaboratory collaborative study. Due to incompatibilities and method-robustness issues encountered prior to verification, a number of method parameters required optimisation. As the use of a diode array detector at 195 nm or 200 nm led to a loss of resolution, a variable wavelength detector using a 200 nm filter was employed. Improved injection repeatability was obtained by increasing the injection time and pressure, and changing the sample diluent from water to running buffer. Finally, definition of capillary pre-treatment and rinse procedures resulted in more consistent separations over time. Method verification data are presented demonstrating linearity, specificity, repeatability, intermediate precision, limit of quantitation, sample stability, solution stability, and robustness. Based on these experiments, several modifications to the current method have been recommended and incorporated into the European Pharmacopoeia to help improve method performance across laboratories globally.
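    Two of the verification computations named above, linearity and repeatability, reduce to standard statistics; a minimal sketch with invented calibration and replicate data:

    ```python
    import numpy as np

    # Linearity: least-squares fit with coefficient of determination
    conc = np.array([0.25, 0.5, 1.0, 1.5, 2.0])          # mg/mL
    area = np.array([12.1, 24.5, 49.2, 73.0, 98.4])      # peak area

    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

    # Repeatability: relative standard deviation of replicate injections
    replicates = np.array([49.2, 49.6, 48.9, 49.4, 49.1, 49.5])
    rsd = 100 * replicates.std(ddof=1) / replicates.mean()

    print(f"linearity: R^2 = {r2:.4f}")
    print(f"repeatability: RSD = {rsd:.2f}% over {replicates.size} injections")
    ```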

  15. A Design Rationale Capture Tool to Support Design Verification and Re-use

    NASA Technical Reports Server (NTRS)

    Hooey, Becky Lee; Da Silva, Jonny C.; Foyle, David C.

    2012-01-01

    A design rationale tool (DR tool) was developed to capture design knowledge to support design verification and design knowledge re-use. The design rationale tool captures design drivers and requirements, and documents the design solution including: intent (why it is included in the overall design); features (why it is designed the way it is); information about how the design components support design drivers and requirements; and, design alternatives considered but rejected. For design verification purposes, the tool identifies how specific design requirements were met and instantiated within the final design, and which requirements have not been met. To support design re-use, the tool identifies which design decisions are affected when design drivers and requirements are modified. To validate the design tool, the design knowledge from the Taxiway Navigation and Situation Awareness (T-NASA; Foyle et al., 1996) system was captured and the DR tool was exercised to demonstrate its utility for validation and re-use.

  16. Striving to be known by significant others: automatic activation of self-verification goals in relationship contexts.

    PubMed

    Kraus, Michael W; Chen, Serena

    2009-07-01

    Extending research on the automatic activation of goals associated with significant others, the authors hypothesized that self-verification goals typically pursued with significant others are automatically elicited when a significant-other representation is activated. Supporting this hypothesis, the activation of a significant-other representation through priming (Experiments 1 and 3) or through a transference encounter (Experiment 2) led participants to seek feedback that verifies their preexisting self-views. Specifically, significant-other primed participants desired self-verifying feedback, in general (Experiment 1), from an upcoming interaction partner (Experiment 2), and relative to acquaintance-primed participants and favorable feedback (Experiment 3). Finally, self-verification goals were activated, especially for relational self-views deemed high in importance to participants' self-concepts (Experiment 2) and held with high certainty (Experiment 3). Implications for research on self-evaluative goals, the relational self, and the automatic goal activation literature are discussed, as are consequences for close relationships. (PsycINFO Database Record (c) 2009 APA, all rights reserved).

  17. Safeguardability of the vitrification option for disposal of plutonium

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pillay, K.K.S.

    1996-05-01

    Safeguardability of the vitrification option for plutonium disposition is rather complex, and there is no experience base in either domestic or international safeguards for this approach. In the present treaty regime between the US and the states of the former Soviet Union, bilateral verifications are considered more likely, with potential for third-party verification of safeguards. There are serious technological limitations to applying conventional bulk-handling-facility safeguards techniques to achieve independent verification of plutonium in borosilicate glass. If vitrification is the final disposition option chosen, maintaining continuity of knowledge of plutonium in glass matrices, especially those containing boron and those spiked with high-level wastes or 137Cs, is beyond the capability of present-day safeguards technologies and nondestructive assay techniques. The alternative to quantitative measurement of fissile content is to maintain continuity of knowledge through a combination of containment and surveillance, which is not the international norm for bulk handling facilities.

  18. Experiencing the "Community" in Community College Teaching through Mural Making.

    ERIC Educational Resources Information Center

    Elmes, Ellen

    2002-01-01

    Uses the five stages of creativity described by psychologist Jacob Getzels--first insight, saturation, incubation, illumination, and verification--to describe a mural project in a rural community college. Reports that the project successfully paired students with community members to paint two local murals. (NB)

  19. ETV/ESTCP Demonstration Plan - Demonstration and Verification of a Turbine Power Generation System Utilizing Renewable Fuel: Landfill Gas

    EPA Science Inventory

    This Test and Quality Assurance Plan (TQAP) provides data quality objectives for the success factors validated during this demonstration, including energy production, emissions and emission reductions compared to alternative systems, economics, and operability, including r...

  20. 48 CFR 32.1007 - Administration and payment of performance-based payments.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... Government, post-payment reviews and verifications should normally be arranged as considered appropriate by...-based payment until the specified event or performance criterion has been successfully accomplished in accordance with the contract. If an event is cumulative, the contracting officer shall not approve the...

  1. 48 CFR 32.1007 - Administration and payment of performance-based payments.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Government, post-payment reviews and verifications should normally be arranged as considered appropriate by...-based payment until the specified event or performance criterion has been successfully accomplished in accordance with the contract. If an event is cumulative, the contracting officer shall not approve the...

  2. 48 CFR 32.1007 - Administration and payment of performance-based payments.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Government, post-payment reviews and verifications should normally be arranged as considered appropriate by...-based payment until the specified event or performance criterion has been successfully accomplished in accordance with the contract. If an event is cumulative, the contracting officer shall not approve the...

  3. Learning Deep Representations for Ground to Aerial Geolocalization (Open Access)

    DTIC Science & Technology

    2015-10-15

    proposed approach, Where-CNN, is inspired by deep learning success in face verification and achieves significant improvements over traditional hand-crafted features and existing deep features learned from other large-scale databases. We show the effectiveness of Where-CNN in finding matches

  4. Loads and Structural Dynamics Requirements for Spaceflight Hardware

    NASA Technical Reports Server (NTRS)

    Schultz, Kenneth P.

    2011-01-01

    The purpose of this document is to establish requirements relating to the loads and structural dynamics technical discipline for NASA and commercial spaceflight launch vehicle and spacecraft hardware. Requirements are defined for the development of structural design loads and recommendations regarding methodologies and practices for the conduct of load analyses are provided. As such, this document represents an implementation of NASA STD-5002. Requirements are also defined for structural mathematical model development and verification to ensure sufficient accuracy of predicted responses. Finally, requirements for model/data delivery and exchange are specified to facilitate interactions between Launch Vehicle Providers (LVPs), Spacecraft Providers (SCPs), and the NASA Technical Authority (TA) providing insight/oversight and serving in the Independent Verification and Validation role. In addition to the analysis-related requirements described above, a set of requirements are established concerning coupling phenomena or other interaction between structural dynamics and aerodynamic environments or control or propulsion system elements. Such requirements may reasonably be considered structure or control system design criteria, since good engineering practice dictates consideration of and/or elimination of the identified conditions in the development of those subsystems. The requirements are included here, however, to ensure that such considerations are captured in the design space for launch vehicles (LV), spacecraft (SC) and the Launch Abort Vehicle (LAV). The requirements in this document are focused on analyses to be performed to develop data needed to support structural verification. As described in JSC 65828, Structural Design Requirements and Factors of Safety for Spaceflight Hardware, implementation of the structural verification requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item for the applicable requirements. The requirement for and expected contents of the SVP are defined in JSC 65828. The SVP may also document unique verifications that meet or exceed these requirements with Technical Authority approval.

  5. The Use of Remote Sensing Satellites for Verification in International Law

    NASA Astrophysics Data System (ADS)

    Hettling, J. K.

    This contribution addresses a very sensitive topic that is currently gaining significance and importance in the international community. It involves questions of international law as well as the contemplation of new developments and decisions in international politics. The paper begins with the meaning and current status of verification in international law as well as the legal basis of satellite remote sensing in international treaties and resolutions. For the verification part, this implies giving a definition of verification and naming its fields of application and the different means of verification. For the remote sensing part, it involves the identification of relevant provisions in the Outer Space Treaty and the United Nations General Assembly Principles on Remote Sensing. Furthermore, practical examples are considered: to what extent have remote sensing satellites been used to verify international obligations? Are there treaties which would considerably profit from the use of remote sensing satellites? In this respect, there are various examples which can be contemplated, such as the ABM Treaty (even though out of force now), the SALT and START Agreements, the Chemical Weapons Convention, and the Comprehensive Test Ban Treaty. It is also noted that NGOs have started to verify international conventions; e.g., Landmine Monitor is verifying the Mine-Ban Convention. Apart from verifying arms control and disarmament treaties, satellites can also strengthen the negotiation of peace agreements (such as the Dayton Peace Talks) and help prevent international conflicts from arising. Verification has played an increasingly prominent role in high-profile UN operations. Verification and monitoring can be applied to the whole range of elements that constitute a peace implementation process, from military aspects through electoral monitoring and human rights monitoring, and from negotiating an accord to finally monitoring it. Last but not least, the problem of enforcing international obligations needs to be addressed, especially the dependence of international law on the will of political leaders and their respective national interests.

  6. Verification and accreditation schemes for climate change activities: A review of requirements for verification of greenhouse gas reductions and accreditation of verifiers—Implications for long-term carbon sequestration

    NASA Astrophysics Data System (ADS)

    Roed-Larsen, Trygve; Flach, Todd

    The purpose of this chapter is to provide a review of existing national and international requirements for the verification of greenhouse gas reductions and the associated accreditation of independent verifiers. The credibility of results claimed to reduce or remove anthropogenic emissions of greenhouse gases (GHG) is of utmost importance for the success of emerging schemes to reduce such emissions. Requirements include transparency, accuracy, consistency, and completeness of the GHG data. The many independent verification processes that have developed recently now make up a quite elaborate tool kit of best practices. The UN Framework Convention on Climate Change and the Kyoto Protocol specifications for project mechanisms initiated this work, but other national and international actors are also working intensely on these issues. One initiative gaining wide application is that taken by the World Business Council for Sustainable Development with the World Resources Institute to develop a "GHG Protocol" to assist companies in arranging for auditable monitoring and reporting processes for their GHG activities. A set of new international standards developed by the International Organization for Standardization (ISO) provides specifications for the quantification, monitoring, and reporting of company-entity- and project-based activities. The ISO is also developing specifications for recognizing independent GHG verifiers. This chapter covers this background with the intent of providing a common understanding of the efforts undertaken in different parts of the world to secure the reliability of GHG emission reduction and removal activities. These verification schemes may provide valuable input to current efforts to secure a comprehensive, trustworthy, and robust framework for the verification of CO2 capture, transport, and storage activities.

  7. Verification Image of The Veins on The Back Palm with Modified Local Line Binary Pattern (MLLBP) and Histogram

    NASA Astrophysics Data System (ADS)

    Prijono, Agus; Darmawan Hangkawidjaja, Aan; Ratnadewi; Saleh Ahmar, Ansari

    2018-01-01

    Verification methods in use today, such as fingerprints, signatures, personal identification numbers (PINs) in banking systems, identity cards, and attendance records, are easily copied and forged. This leaves a system insecure and vulnerable to unauthorized persons gaining access. In this research, a verification system is implemented using images of the blood vessels on the back of the palm; this pattern is more difficult to imitate because it lies inside the human body, making it safer to use. The blood vessel pattern on the back of the human hand is unique; even identical twins have different vessel patterns. Moreover, the pattern does not depend on a person's age, so it can be used long term, except in cases of accident or disease. Because the vein pattern is unique, it can be used to recognize a person. In this paper, we use a modified method, the Modified Local Line Binary Pattern (MLLBP), to recognize a person from a blood vessel image, and we match the extracted features using the Hamming distance. Verification was first tested by calculating the percentage of correct acceptances of the same person; a rejection error occurs when a person is not matched by the system against his or her own data. For 10 persons, with 15 probe images compared against 5 enrolled vein images each, 80.67% were verified successfully. A second test presented forged images from different persons, where verification is correct when the system rejects the forgery; for ten different persons, 94% were correctly rejected.
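
    The abstract does not give the MLLBP feature details, but the matching step it describes, comparing binarized feature vectors with the Hamming distance against an acceptance threshold, can be sketched as below. This is a minimal illustration: the 256-bit vectors, the noise model, and the 0.25 threshold are assumptions for demonstration, not values from the paper.

    ```python
    import numpy as np

    def hamming_distance(a: np.ndarray, b: np.ndarray) -> float:
        """Fraction of differing bits between two binary feature vectors."""
        return np.count_nonzero(a != b) / a.size

    def verify(probe: np.ndarray, enrolled: list, threshold: float = 0.25) -> bool:
        """Accept the claimed identity if the probe is close enough to at
        least one enrolled template (threshold is illustrative only)."""
        return min(hamming_distance(probe, t) for t in enrolled) <= threshold

    # Toy example: a 256-bit vein feature vector with simulated sensor noise.
    rng = np.random.default_rng(0)
    template = rng.integers(0, 2, 256)
    probe = template.copy()
    probe[:20] ^= 1                      # flip 20 bits of acquisition noise
    print(verify(probe, [template]))     # True: distance 20/256 is about 0.08
    ```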

  8. Verification of Advective Bar Elements Implemented in the Aria Thermal Response Code.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mills, Brantley

    2016-01-01

    A verification effort was undertaken to evaluate the implementation of the new advective bar capability in the Aria thermal response code. Several approaches to the verification process were taken: a mesh refinement study to demonstrate solution convergence in the fluid and the solid, visual examination of the mapping of the advective bar element nodes to the surrounding surfaces, and a comparison of solutions produced using the advective bars for simple geometries with solutions from commercial CFD software. The mesh refinement study showed solution convergence for simple pipe flow in both temperature and velocity. Guidelines were provided to achieve appropriate meshes between the advective bar elements and the surrounding volume. Simulations of pipe flow using advective bar elements in Aria were compared to simulations using the commercial CFD software ANSYS Fluent and provided comparable solutions in temperature and velocity, supporting proper implementation of the new capability.
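
    As a companion to the mesh refinement study mentioned above, the sketch below shows how solution convergence is commonly quantified: the observed order of accuracy computed from one scalar result on three systematically refined meshes. The refinement ratio and the sample outlet temperatures are invented for illustration and are not taken from the report.

    ```python
    import math

    def observed_order(f_coarse: float, f_medium: float, f_fine: float,
                       r: float = 2.0) -> float:
        """Observed order of accuracy p from a scalar solution value on three
        meshes related by a constant refinement ratio r."""
        return math.log(abs(f_coarse - f_medium) /
                        abs(f_medium - f_fine)) / math.log(r)

    # Hypothetical pipe-flow outlet temperatures (K) on coarse/medium/fine meshes.
    print(observed_order(352.80, 350.70, 350.175))   # ~2.0 for a 2nd-order scheme
    ```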

  9. SU-F-T-269: Preliminary Experience of Kuwait Cancer Control Center (KCCC) On IMRT Treatment Planning and Pre-Treatment Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Sethuraman, TKR; Sherif, M; Subramanian, N

    Purpose: The complexity of IMRT delivery requires pre-treatment quality assurance and plan verification. KCCC has implemented IMRT clinically in a few sites and will extend it to all sites. Recently, our Varian linear accelerator and Eclipse planning system were upgraded from Millennium 80 to 120 Multileaf Collimator (MLC) and from v8.6 to v11.0, respectively. Our preliminary experience with pre-treatment quality assurance verification is discussed. Methods: Eight breast, three prostate, and one hypopharynx cancer patients were planned with step-and-shoot IMRT. All breast cases were planned before the upgrade, with 60% of cases treated. The ICRU 83 recommendations were followed for the dose prescription and constraints to OAR for all cases. Point dose measurement was done with a CIRS cylindrical phantom and a PTW 0.125 cc ionization chamber. The measured dose was compared with the calculated dose at the point of measurement. A MapCHECK diode array phantom was used for plan verification. Planned and measured doses were compared by applying a gamma index of 3% (dose difference) / 3 mm DTA (distance to agreement). For all cases, a plan is considered successful if more than 95% of the tested diodes pass the gamma test. A prostate case was chosen to compare plan verification before and after the upgrade. Results: Point dose measurements were in agreement with the calculated doses; the maximum deviation observed was 2.3%. The average gamma index passing rate was higher than 97% for the plan verification of all cases. A similar result was observed for plan verification of the chosen prostate case before and after the upgrade. Conclusion: Our preliminary experience from the obtained results validates the accuracy of our QA process and provides confidence to extend IMRT to all sites in Kuwait.
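
    For readers unfamiliar with the 3%/3 mm criterion cited above, the sketch below is a heavily simplified, one-dimensional global gamma analysis (exhaustive search, no interpolation, synthetic profiles); commercial MapCHECK analysis is more sophisticated, so this is only meant to show the structure of the calculation.

    ```python
    import numpy as np

    def gamma_pass_rate(dose_eval, dose_ref, spacing_mm, dd=0.03, dta_mm=3.0):
        """For each evaluated point, find the minimum gamma over all reference
        points, then report the percentage of points with gamma <= 1."""
        x = np.arange(len(dose_ref)) * spacing_mm
        norm = dose_ref.max()                    # global dose normalisation
        passed = 0
        for i, d in enumerate(dose_eval):
            dose_term = (d - dose_ref) / (dd * norm)
            dist_term = (x[i] - x) / dta_mm
            passed += np.sqrt(dose_term**2 + dist_term**2).min() <= 1.0
        return 100.0 * passed / len(dose_eval)

    # Synthetic profiles on a 1 mm grid: measurement slightly scaled and shifted.
    grid = np.arange(100)
    ref = np.exp(-0.5 * ((grid - 50.0) / 12.0) ** 2)
    meas = 1.01 * np.exp(-0.5 * ((grid - 50.5) / 12.0) ** 2)
    print(f"{gamma_pass_rate(meas, ref, 1.0):.1f}% of points pass 3%/3 mm")
    ```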

  10. NASA's Approach to Software Assurance

    NASA Technical Reports Server (NTRS)

    Wetherholt, Martha

    2015-01-01

    NASA defines software assurance as: the planned and systematic set of activities that ensure conformance of software life cycle processes and products to requirements, standards, and procedures via quality, safety, reliability, and independent verification and validation. NASA's implementation of this approach to the quality, safety, reliability, security, and verification and validation of software is brought together in one discipline, software assurance. Organizationally, NASA has software assurance at each NASA center, a Software Assurance Manager at NASA Headquarters, a Software Assurance Technical Fellow (currently the same person as the SA Manager), and an Independent Verification and Validation Organization with its own facility. As an umbrella risk mitigation strategy for the safety and mission success assurance of NASA's software, software assurance covers a wide area and is structured to address the dynamic changes in how software is developed, used, and managed, as well as its increasingly complex functionality. Being flexible, risk based, and prepared for challenges in software at NASA is essential, especially as much of our software is unique for each mission.

  11. Numerical verification of composite rods theory on multi-story buildings analysis

    NASA Astrophysics Data System (ADS)

    El-Din Mansour, Alaa; Filatov, Vladimir; Gandzhuntsev, Michael; Ryasny, Nikita

    2018-03-01

    The article proposes a verification of the theory of composite rods for the structural analysis of skeletons of high-rise buildings. A test design model is formed in which the horizontal elements are represented by a multilayer cantilever beam operating in transverse bending, with slabs connected by moment-non-transferring connections, and the vertical elements are represented by multilayer columns. These connections are sufficient to form a shearing action that can be approximated by a certain shear force function, which significantly reduces the overall degree of static indeterminacy of the structural model. A system of differential equations describes the behavior of the multilayer rods and is solved numerically by the method of successive approximations. The proposed methodology is intended for preliminary calculations, when the rigidity characteristics of the structure need to be determined, and for a qualitative assessment of results obtained by other methods when performing verification calculations.
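
    The article's actual system of differential equations is not reproduced in the abstract, but the numerical scheme it names, the method of successive approximations (Picard iteration), can be illustrated on a scalar test problem. The equation y' = y below is a stand-in chosen only because its exact solution exp(x) makes convergence easy to check.

    ```python
    import numpy as np

    def picard(f, y0, x, iterations=8):
        """Successive approximations for y' = f(x, y), y(x[0]) = y0:
        y_{k+1}(x) = y0 + integral from x[0] to x of f(t, y_k(t)) dt,
        with the integral evaluated by the cumulative trapezoid rule."""
        y = np.full_like(x, y0, dtype=float)
        for _ in range(iterations):
            g = f(x, y)
            y = y0 + np.concatenate(([0.0], np.cumsum(
                0.5 * (g[1:] + g[:-1]) * np.diff(x))))
        return y

    x = np.linspace(0.0, 1.0, 101)
    y = picard(lambda t, y: y, 1.0, x)     # test problem y' = y, y(0) = 1
    print(abs(y[-1] - np.e))               # small residual after 8 iterations
    ```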

  12. Leveraging pattern matching to solve SRAM verification challenges at advanced nodes

    NASA Astrophysics Data System (ADS)

    Kan, Huan; Huang, Lucas; Yang, Legender; Zou, Elaine; Wan, Qijian; Du, Chunshan; Hu, Xinyi; Liu, Zhengfang; Zhu, Yu; Zhang, Recoo; Huang, Elven; Muirhead, Jonathan

    2018-03-01

    Memory is a critical component in today's system-on-chip (SoC) designs. Static random-access memory (SRAM) blocks are assembled by combining intellectual property (IP) blocks that come from SRAM libraries developed and certified by the foundries for both functionality and a specific process node. Customers place these SRAM IP in their designs, adjusting as necessary to achieve DRC-clean results. However, any changes a customer makes to these SRAM IP during implementation, whether intentionally or in error, can impact yield and functionality. Physical verification of SRAM has always been a challenge, because these blocks usually contain smaller feature sizes and spacing constraints compared to traditional logic or other layout structures. At advanced nodes, critical dimension becomes smaller and smaller, until there is almost no opportunity to use optical proximity correction (OPC) and lithography to adjust the manufacturing process to mitigate the effects of any changes. The smaller process geometries, reduced supply voltages, increasing process variation, and manufacturing uncertainty mean accurate SRAM physical verification results are not only reaching new levels of difficulty, but also new levels of criticality for design success. In this paper, we explore the use of pattern matching to create an SRAM verification flow that provides both accurate, comprehensive coverage of the required checks and visual output to enable faster, more accurate error debugging. Our results indicate that pattern matching can enable foundries to improve SRAM manufacturing yield, while allowing designers to benefit from SRAM verification kits that can shorten the time to market.
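
    Production pattern matching operates on polygon layout data inside the sign-off tool, but the core idea, flagging near-matches of a certified cell that indicate unintended edits, can be shown on a toy rasterized layout. Everything below (the golden pattern, the raster representation, the 3-pixel deviation threshold) is an illustrative assumption, not how a commercial flow is implemented.

    ```python
    import numpy as np

    def find_pattern_deviations(layout: np.ndarray, pattern: np.ndarray):
        """Slide a golden cell pattern over a rasterised layout and report
        offsets of windows that almost match but differ in a few pixels --
        candidate unintended edits to certified SRAM IP."""
        ph, pw = pattern.shape
        hits = []
        for i in range(layout.shape[0] - ph + 1):
            for j in range(layout.shape[1] - pw + 1):
                diff = np.count_nonzero(layout[i:i+ph, j:j+pw] != pattern)
                if 0 < diff <= 3:        # near-match: likely a modified cell
                    hits.append((i, j, diff))
        return hits

    golden = np.array([[1, 0, 1],
                       [0, 1, 0],
                       [1, 0, 1]])
    layout = np.tile(golden, (2, 2))     # four certified cell instances
    layout[1, 1] = 0                     # inject a one-pixel edit
    print(find_pattern_deviations(layout, golden))   # [(0, 0, 1)]
    ```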

  13. Development and Verification of Enclosure Radiation Capabilities in the CHarring Ablator Response (CHAR) Code

    NASA Technical Reports Server (NTRS)

    Salazar, Giovanni; Droba, Justin C.; Oliver, Brandon; Amar, Adam J.

    2016-01-01

    With the recent development of multi-dimensional thermal protection system (TPS) material response codes, the capability to account for surface-to-surface radiation exchange in complex geometries is critical. This paper presents recent efforts to implement such capabilities in the CHarring Ablator Response (CHAR) code developed at NASA's Johnson Space Center. This work also describes the different numerical methods implemented in the code to compute geometric view factors for radiation problems involving multiple surfaces. Verification of the code's radiation capabilities and results of a code-to-code comparison are presented. Finally, a demonstration case of a two-dimensional ablating cavity with enclosure radiation accounting for a changing geometry is shown.
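
    The paper lists several numerical methods for computing geometric view factors without detailing them here; one simple representative is Monte Carlo integration of the view-factor double integral. The sketch below estimates the view factor between two coaxial, parallel unit squares, a configuration whose analytic value at unit separation is roughly 0.1998, so the estimate can be checked.

    ```python
    import numpy as np

    def view_factor_parallel_squares(h: float, n: int = 200_000, seed: int = 1):
        """Monte Carlo estimate of F12 = (1/A1) * double integral over A1, A2
        of cos(t1) * cos(t2) / (pi * r^2), for two coaxial parallel unit
        squares separated by distance h (both normals along z)."""
        rng = np.random.default_rng(seed)
        p1 = rng.random((n, 2))          # sample points on the lower square
        p2 = rng.random((n, 2))          # sample points on the upper square
        d = p2 - p1
        r2 = d[:, 0]**2 + d[:, 1]**2 + h**2
        cos1_cos2 = h**2 / r2            # cos(t1) = cos(t2) = h / r here
        area2 = 1.0
        return area2 * np.mean(cos1_cos2 / (np.pi * r2))

    print(view_factor_parallel_squares(1.0))   # close to the analytic 0.1998
    ```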

  14. Analytical torque calculation and experimental verification of synchronous permanent magnet couplings with Halbach arrays

    NASA Astrophysics Data System (ADS)

    Seo, Sung-Won; Kim, Young-Hyun; Lee, Jung-Ho; Choi, Jang-Young

    2018-05-01

    This paper presents analytical torque calculation and experimental verification of synchronous permanent magnet couplings (SPMCs) with Halbach arrays. A Halbach array is composed of various numbers of segments per pole; we calculate and compare the magnetic torques for 2, 3, and 4 segments. Firstly, based on the magnetic vector potential, and using a 2D polar coordinate system, we obtain analytical solutions for the magnetic field. Next, through a series of processes, we perform magnetic torque calculations using the derived solutions and a Maxwell stress tensor. Finally, the analytical results are verified by comparison with the results of 2D and 3D finite element analysis and the results of an experiment.
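
    A minimal sketch of the Maxwell-stress-tensor step named above: the coupling torque follows from integrating the magnetic shear stress B_r*B_theta/mu0 over a cylindrical surface of radius r in the air gap, T = (L*r^2/mu0) * integral of B_r*B_theta dtheta. The field harmonics, radius, and stack length below are invented for illustration and are not the paper's values.

    ```python
    import numpy as np

    MU0 = 4.0e-7 * np.pi   # permeability of free space (H/m)

    def coupling_torque(b_r, b_theta, radius, stack_length):
        """Torque from the Maxwell stress tensor on a cylindrical surface in
        the air gap, using uniform samples of the field over 0..2*pi."""
        dtheta = 2.0 * np.pi / len(b_r)
        return stack_length * radius**2 / MU0 * np.sum(b_r * b_theta) * dtheta

    # Illustrative 4-pole-pair air-gap field harmonics at a 5 degree load angle.
    theta = np.linspace(0.0, 2.0 * np.pi, 3600, endpoint=False)
    b_r = 0.8 * np.cos(4 * theta)                       # radial field (T)
    b_t = 0.1 * np.sin(4 * (theta + np.radians(5.0)))   # tangential field (T)
    print(f"torque = {coupling_torque(b_r, b_t, 0.03, 0.02):.2f} N*m")
    ```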

  15. JWST Telescope Integration and Test Progress

    NASA Technical Reports Server (NTRS)

    Matthews, Gary W.; Whitman, Tony L.; Feinberg, Lee D.; Voyton, Mark F.; Lander, Juli A.; Keski-Kuha, Ritva

    2016-01-01

    The James Webb Space Telescope (JWST) is a 6.5 m, segmented, IR telescope that will explore the first light of the universe after the big bang. The JWST Optical Telescope Element (Telescope) integration and test program is well underway. The telescope was completed in the spring of 2016, and the cryogenic test equipment has been through two optical test programs leading up to the final flight verification program. The details of the telescope mirror integration will be provided along with the current status of the flight observatory. In addition, the results of the two optical ground support equipment cryo tests will be shown, along with how these plans fold into the flight verification program.

  16. 78 FR 72077 - Energy Efficiency Program for Industrial Equipment: Final Determination Classifying UL...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-12-02

    ... Verification Services Inc. as a Nationally Recognized Certification Program for Small Electric Motors AGENCY... FURTHER INFORMATION CONTACT: Mr. Lucas Adin, U.S. Department of Energy, Building Technologies Office, Mail... conservation requirements for, among other things, electric motors and small electric motors, including test...

  17. 75 FR 16428 - Polyethylene Retail Carrier Bags from the Socialist Republic of Vietnam: Final Affirmative...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-04-01

    ... Calvert or Jun Jack Zhao, AD/CVD Operations, Office 6, Import Administration, International Trade... Operations, Office 6, ``Verification of the Questionnaire Responses Submitted by Chin Sheng Company, Ltd... concerning banking in Vietnam. See Memorandum to Barbara E. Tillman, Director, AD/ CVD Operations, Office 6...

  18. 78 FR 50448 - Agency Information Collection Activities; Submission for OMB Review; Comment Request; Income and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-19

    ... (ETA) sponsored information collection request (ICR) titled, ``Income and Eligibility Verification... this request to the Office of Information and Regulatory Affairs, Attn: OMB Desk Officer for DOL-ETA..., the ETA issued a final rule regarding the Confidentiality and Disclosure of State Unemployment...

  19. Final report on a cold climate permeable interlocking concrete pavement test facility at the University of New Hampshire Stormwater Center.

    DOT National Transportation Integrated Search

    2013-05-01

    University of New Hampshire Stormwater Center (UNHSC) completed a two year field verification study of a permeable interlocking concrete pavement (PICP) stormwater management system. The purpose of this study was to evaluate the cold climate function...

  20. UAS Integration in the NAS Project: Part Task 6 V & V Simulation: Primary Results

    NASA Technical Reports Server (NTRS)

    Rorie, Conrad; Fern, Lisa; Shively, Jay; Santiago, Confesor

    2016-01-01

    This is a presentation of the preliminary results of the final V&V (Verification and Validation) activity on RTCA (Radio Technical Commission for Aeronautics) SC (Special Committee)-228 DAA (Detect and Avoid) HMI (Human-Machine Interface) requirements for display alerting and guidance.

  1. Synthesis of calculational methods for design and analysis of radiation shields for nuclear rocket systems

    NASA Technical Reports Server (NTRS)

    Capo, M. A.; Disney, R. K.; Jordan, T. A.; Soltesz, R. G.; Woodsum, H. C.

    1969-01-01

    Eight computer programs make up a nine volume synthesis containing two design methods for nuclear rocket radiation shields. The first design method is appropriate for parametric and preliminary studies, while the second accomplishes the verification of a final nuclear rocket reactor design.

  2. Quantification, Prediction, and the Online Impact of Sentence Truth-Value: Evidence from Event-Related Potentials

    ERIC Educational Resources Information Center

    Nieuwland, Mante S.

    2016-01-01

    Do negative quantifiers like "few" reduce people's ability to rapidly evaluate incoming language with respect to world knowledge? Previous research has addressed this question by examining whether online measures of quantifier comprehension match the "final" interpretation reflected in verification judgments. However, these…

  3. A methodology for producing reliable software, volume 1

    NASA Technical Reports Server (NTRS)

    Stucki, L. G.; Moranda, P. B.; Foshee, G.; Kirchoff, M.; Omre, R.

    1976-01-01

    An investigation into the areas having an impact on producing reliable software, including automated verification tools, software modeling, testing techniques, structured programming, and management techniques, is presented. This final report contains the results of this investigation, an analysis of each technique, and the definition of a methodology for producing reliable software.

  4. 75 FR 31288 - Plant-Verified Drop Shipment (PVDS)-Nonpostal Documentation

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-06-03

    ... POSTAL SERVICE 39 CFR Part 111 Plant-Verified Drop Shipment (PVDS)--Nonpostal Documentation AGENCY... 8125, Plant-Verified Drop Shipment (PVDS) Verification and Clearance, is the sole source of evidence... induction points of plant-verified drop shipment mailings, the Postal Service is adopting this final rule to...

  5. Aircraft electromagnetic compatibility

    NASA Technical Reports Server (NTRS)

    Clarke, Clifton A.; Larsen, William E.

    1987-01-01

    Illustrated are aircraft architecture, electromagnetic interference environments, electromagnetic compatibility protection techniques, program specifications, tasks, and verification and validation procedures. The environment of 400 Hz power, electrical transients, and radio frequency fields are portrayed and related to thresholds of avionics electronics. Five layers of protection for avionics are defined. Recognition is given to some present day electromagnetic compatibility weaknesses and issues which serve to reemphasize the importance of EMC verification of equipment and parts, and their ultimate EMC validation on the aircraft. Proven standards of grounding, bonding, shielding, wiring, and packaging are laid out to help provide a foundation for a comprehensive approach to successful future aircraft design and an understanding of cost effective EMC in an aircraft setting.

  6. Threads of Mission Success

    NASA Technical Reports Server (NTRS)

    Gavin, Thomas R.

    2006-01-01

    This viewgraph presentation reviews the many parts of the JPL mission planning process that the project manager has to work with. Some of them are: NASA & JPL's institutional requirements, the mission systems design requirements, the science interactions, the technical interactions, financial requirements, verification and validation, safety and mission assurance, and independent assessment, review and reporting.

  7. Medical device development.

    PubMed

    Panescu, Dorin

    2009-01-01

    The development of a successful medical product requires not only engineering design efforts, but also clinical, regulatory, marketing and business expertise. This paper reviews items related to the process of designing medical devices. It discusses the steps required to take a medical product idea from concept, through development, verification and validation, regulatory approvals and market release.

  8. Fighting Domestic and International Fraud in the Admissions and Registrar's Offices

    ERIC Educational Resources Information Center

    Koenig, Ann M.; Devlin, Edward

    2012-01-01

    The education sector is no stranger to fraud, unfortunately. This article provides best practice guidance in recognizing and dealing with fraud, with emphasis on domestic and international academic credential fraud. It includes practical approaches to academic document review and verification. Success in fighting fraud requires becoming informed,…

  9. Safety Verification of the Small Aircraft Transportation System Concept of Operations

    NASA Technical Reports Server (NTRS)

    Carreno, Victor; Munoz, Cesar

    2005-01-01

    A critical factor in the adoption of any new aeronautical technology or concept of operation is safety. Traditionally, safety is accomplished through a rigorous process that involves human factors, low and high fidelity simulations, and flight experiments. As this process is usually performed on final products or functional prototypes, concept modifications resulting from it are very expensive to implement. This paper describes an approach to system safety that can take place at early stages of a concept design. It is based on a set of mathematical techniques and tools known as formal methods. In contrast to testing and simulation, formal methods provide the capability of exhaustive state-exploration analysis. We present the safety analysis and verification performed for the Small Aircraft Transportation System (SATS) Concept of Operations (ConOps). The concept of operations is modeled using discrete and hybrid mathematical models, which are then analyzed using formal methods. The objective of the analysis is to show, in a mathematical framework, that the concept of operations complies with a set of safety requirements. It is also shown that the ConOps has some desirable characteristics, such as liveness and absence of deadlock. The analysis and verification are performed in the Prototype Verification System (PVS), a computer-based specification language and theorem-proving assistant.
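
    The SATS analysis itself is carried out in PVS, but the exhaustive state-exploration idea contrasted with testing above can be illustrated with a toy breadth-first reachability check. The two-aircraft zone model and its safety property below are hypothetical and far simpler than the actual ConOps models; the point is only that every reachable state is either visited or shown to violate the invariant.

    ```python
    from collections import deque

    def explore(initial, successors, safe):
        """Exhaustive breadth-first state exploration: returns None if the
        invariant holds in every reachable state, else a counterexample."""
        seen, queue = {initial}, deque([initial])
        while queue:
            state = queue.popleft()
            if not safe(state):
                return state                      # counterexample found
            for nxt in successors(state):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return None                               # invariant proven

    # Toy model: two aircraft in approach zones 0..3; safety = never the
    # same zone. One advances, the other descends, with no coordination.
    def successors(state):
        a, b = state
        return [(min(a + 1, 3), b), (a, max(b - 1, 0))]

    print(explore((0, 3), successors, safe=lambda s: s[0] != s[1]))
    # Prints a conflict state, showing this naive uncoordinated model is unsafe.
    ```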

  10. [Inheritance rights fo the child born from post-mortem fertilization].

    PubMed

    Iniesta Delgado, Juan José

    2008-01-01

    Spanish law allows for the possibility of post-mortem fertilization, recognizing the paternity of the deceased male. The most prominent legal effects of this fact concern succession from the father. The fixing of the child's portion in the forced succession and its protection, the determination of his share in the inheritance, and the necessity of defending his rights until the birth is verified are some of the issues discussed in this article.

  11. Online 3D EPID-based dose verification: Proof of concept

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, Hanno; Rozendaal, Roel, E-mail: r.rozenda

    Purpose: Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. Methods: The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. Results: The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5–10 s irradiation time. Conclusions: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
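
    A minimal sketch of the comparison step described above: compute the mean dose and the near-maximum dose D2 (taken here as the 98th percentile) for a masked region, and flag the fraction when either deviates from the plan by more than a tolerance. The 3% threshold, the uniform mask, and the simulated 10% overdosage are illustrative assumptions, not the system's clinical settings.

    ```python
    import numpy as np

    def dose_metrics(dose: np.ndarray, mask: np.ndarray):
        """Mean dose and near-maximum dose D2 (98th percentile) in a region."""
        d = dose[mask]
        return d.mean(), np.percentile(d, 98.0)

    def within_tolerance(planned, reconstructed, mask, tol=0.03):
        """False (halt-worthy) if the reconstructed mean dose or D2 deviates
        from the planned value by more than tol."""
        pm, p2 = dose_metrics(planned, mask)
        rm, r2 = dose_metrics(reconstructed, mask)
        return abs(rm - pm) / pm <= tol and abs(r2 - p2) / p2 <= tol

    rng = np.random.default_rng(3)
    plan = rng.uniform(1.6, 2.0, (20, 20, 20))    # planned dose grid (Gy)
    recon = 1.10 * plan                           # simulated 10% overdosage
    mask = np.ones(plan.shape, dtype=bool)
    print(within_tolerance(plan, recon, mask))    # False -> halt the linac
    ```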

  12. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, in a very limited time period. The reliability and maintainability of flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with the allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to verify compliance with requirements and to highlight design or performance shortcomings for further decision-making. This case study discusses the subsystem requirements allocation process, describes the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and presents findings and observations from the analysis leading to the Ground Systems Preliminary Design Review milestone.
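
    A minimal sketch of the reliability, availability, and maintainability arithmetic behind such an analysis: steady-state availability per subsystem from MTBF and MTTR, combined in series because every required ground subsystem must be up to support the second launch. The MTBF/MTTR values are invented for illustration, not Constellation data.

    ```python
    def availability(mtbf_hours: float, mttr_hours: float) -> float:
        """Steady-state availability of a single repairable subsystem."""
        return mtbf_hours / (mtbf_hours + mttr_hours)

    def series_availability(subsystems) -> float:
        """System availability when every subsystem is required (series
        logic): the product of the subsystem availabilities."""
        total = 1.0
        for mtbf, mttr in subsystems:
            total *= availability(mtbf, mttr)
        return total

    # Hypothetical (MTBF, MTTR) pairs in hours for three ground subsystems.
    print(f"{series_availability([(2000, 8), (1500, 12), (800, 4)]):.4f}")
    ```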

  13. Use the predictive models to explore the key factors affecting phytoplankton succession in Lake Erhai, China.

    PubMed

    Zhu, Rong; Wang, Huan; Chen, Jun; Shen, Hong; Deng, Xuwei

    2018-01-01

    Increasing algae in Lake Erhai has resulted in frequent blooms that have not only led to degeneration of the water ecosystem but also seriously influenced the quality of the water supply and caused extensive damage to the local people, as the lake is a water resource for Dali City. Exploring the key factors affecting phytoplankton succession and developing predictive models with easily detectable parameters for phytoplankton have proven to be practical ways to improve water quality. To this end, a systematic survey focused on phytoplankton succession was conducted over 2 years in Lake Erhai. The data from the first study year were used to develop predictive models, and the data from the second year were used for model verification. The seasonal succession of phytoplankton in Lake Erhai was obvious: the dominant groups were Cyanobacteria in the summer, Chlorophyta in the autumn, and Bacillariophyta in the winter. The development and verification of the predictive models indicated that, compared to phytoplankton biomass, phytoplankton density is more effective for estimating phytoplankton variation in Lake Erhai. CCA (canonical correlation analysis) indicated that TN (total nitrogen), TP (total phosphorus), DO (dissolved oxygen), SD (Secchi depth), Cond (conductivity), T (water temperature), and ORP (oxidation-reduction potential) had significant influences (p < 0.05) on the phytoplankton community. The CCA of the dominant species found that Microcystis was significantly influenced by T, while the dominant Chlorophyta, Psephonema aenigmaticum and Mougeotia, were significantly influenced by TN. All results indicated that TN and T were the two key factors driving phytoplankton succession in Lake Erhai.

  14. Partial defect verification of spent fuel assemblies by PDET: Principle and field testing in Interim Spent fuel Storage Facility (CLAB) in Sweden

    DOE PAGES

    Ham, Y.; Kerr, P.; Sitaraman, S.; ...

    2016-05-05

    Here, the need for the development of a credible method and instrument for partial defect verification of spent fuel has been emphasized over a few decades in the safeguards communities, as diverted spent fuel pins can be the source of nuclear terrorism or devices. The need is increasingly important, and even urgent, as many countries have started to transfer spent fuel to so-called "difficult-to-access" areas such as dry storage casks, reprocessing, or geological repositories. Partial defect verification is required by the IAEA before spent fuel is placed into "difficult-to-access" areas. Earlier, Lawrence Livermore National Laboratory (LLNL) reported the successful development of a new, credible partial defect verification method for pressurized water reactor (PWR) spent fuel assemblies without the use of operator data, and further reported validation experiments using commercial spent fuel assemblies with some missing fuel pins. The method was found to be robust, as it is relatively invariant to characteristic variations of spent fuel assemblies such as initial fuel enrichment, cooling time, and burn-up. Since then, the PDET system has been designed and prototyped for 17×17 PWR spent fuel assemblies, complete with data acquisition software and acquisition electronics. In this paper, a summary description of the PDET development is presented, followed by results of the first successful field testing using the integrated PDET system and actual spent fuel assemblies, performed at a commercial spent fuel storage site, the Central Interim Spent Fuel Storage Facility (CLAB) in Sweden. In addition to partial defect detection, initial studies have determined that the tool can be used to verify the operator-declared average burnup of the assembly as well as intra-assembly burnup levels.

  15. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lucconi, G; Department of Radiation Oncology, Massachusetts General Hospital, Boston, MA; Bentefour, E

    Purpose: The clinical commissioning of a workflow for pre-treatment range verification/adjustment for the head treatment of pediatric medulloblastoma patients, including dose monitoring during treatment. Methods: An array of Si diodes (DIODES Incorporated) is placed on the patient's skin on the side opposite the beam entrance. A "scout" SOBP beam, with a longer beam range so that the diodes sit in its plateau, is delivered; the measured signal is analyzed, and the extracted water equivalent path lengths (WEPL) are compared to the expected values, revealing whether a range correction is needed. The diodes stay in place during treatment to measure dose. The workflow was tested in solid water and head phantoms and validated against independent WEPL measurements. Both measured WEPL and skin doses were compared to computed values from the TPS (XiO); a Markus chamber was used for reference dose measurements. Results: The WEPL accuracy of the method was verified by comparison with the dose extinction method. For both the solid water and head phantoms, the accuracy was in the sub-millimeter range, with a deviation of less than 1% from the value extracted from the TPS. The accuracy of dose measurements in the fall-off part of the dose profile was validated against the Markus chamber. The entire range verification workflow was successfully tested for the mock treatment of a head phantom with the standard delivery of 90 cGy per field per fraction. The WEPL measurement revealed no need for range correction, and the dose measurements agreed to better than 4% with the prescription dose. The robustness of the method and workflow, including the detector array, hardware set, and software functions, was successfully stress-tested with multiple repetitions. Conclusion: The performance of the in-vivo range verification system and related workflow meets the clinical requirements in terms of the WEPL accuracy needed for pre-treatment range verification with acceptable dose to the patient.

  16. 77 FR 17422 - Notice of Final Determination of Sales at Less Than Fair Value and Affirmative Critical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-03-26

    ... listings to reflect certain verification findings. Also, in February 2012, the petitioner and the... two interior storage compartments accessible through one or more separate external doors or drawers or... external door or drawer is either a refrigerator compartment or convertible compartment, but is not a...

  17. ESTE Project Brief: Environmental and Sustainable Technology Evaluations (ESTE): Verification of Qualitative Spot Test Kits for Lead in Paint

    EPA Science Inventory

    On April 22, 2008, EPA issued the final Lead; Renovation, Repair, and Painting (RRP) Program Rule. The rule addresses lead-based paint hazards created by renovation, repair, and painting activities that disturb lead-based paint in target housing and child-occupied facilities. Und...

  18. Experimental Verification of a Pneumatic Transport System for the Rapid Evacuation of Tunnels, Part II - Test Program

    DOT National Transportation Integrated Search

    1978-12-01

    This study is the final phase of a muck pipeline program begun in 1973. The objective of the study was to evaluate a pneumatic pipeline system for muck haulage from a tunnel excavated by a tunnel boring machine. The system was comprised of a muck pre...

  1. 77 FR 64475 - Notice of Final Determination of Sales at Less Than Fair Value: Circular Welded Carbon-Quality...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-22

    ...) (Preliminary Determination). On June 12, 2012, respondent Universal Tube and Plastic Industries, Ltd. (UTP-JA... Universal Tube and Plastic Industries, Ltd. in the Antidumping Duty Investigation of Circular Welded Carbon... Verification of Universal Tube and Plastic Industries, Ltd. (UTP-JA) and Its Home Market Affiliates...

  2. Aqueous Cleaning and Validation for Space Shuttle Propulsion Hardware at the White Sands Test Facility

    NASA Technical Reports Server (NTRS)

    Hornung, Steven D.; Biesinger, Paul; Kirsch, Mike; Beeson, Harold; Leuders, Kathy

    1999-01-01

    The NASA White Sands Test Facility (WSTF) has developed an entirely aqueous final cleaning and verification process to replace the current chlorofluorocarbon (CFC) 113 based process. This process has been accepted for final cleaning and cleanliness verification of WSTF ground support equipment. The aqueous process relies on ultrapure water at 50 C (323 K) and ultrasonic agitation for the removal of organic compounds and particulate. Cleanliness is verified by determining the total organic carbon (TOC) content and by filtration with particle counting. The effectiveness of the aqueous methods for detecting hydrocarbon contamination and particulate was compared to the accepted CFC 113 sampling procedures. Testing with known contaminants, such as hydraulic fluid and cutting and lubricating oils, was performed to establish a correlation between aqueous TOC and CFC 113 nonvolatile residue (NVR). Particulate sampling was performed on cleaned batches of hardware that were randomly separated and sampled by the two methods. This paper presents the approach and results, discusses the issues in establishing the equivalence of aqueous sampling to CFC 113 sampling, and describes the approach for implementing aqueous techniques on Space Shuttle Propulsion hardware.
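
    A minimal sketch of the TOC-to-NVR correlation step described above: fit a linear relation for one contaminant so an NVR-based cleanliness limit can be restated as a TOC limit for the aqueous method. The calibration pairs below are entirely hypothetical, not WSTF data.

    ```python
    import numpy as np

    # Hypothetical calibration pairs for one contaminant (hydraulic fluid):
    # aqueous TOC readings (mg/L) vs CFC 113 nonvolatile residue (mg).
    toc = np.array([0.5, 1.0, 2.1, 4.0, 7.9])
    nvr = np.array([0.9, 1.8, 3.9, 7.6, 15.2])

    slope, intercept = np.polyfit(toc, nvr, 1)   # least-squares line
    r = np.corrcoef(toc, nvr)[0, 1]
    print(f"NVR = {slope:.2f} * TOC + {intercept:.2f}  (r = {r:.3f})")
    # A specification limit written in NVR terms can then be translated into
    # an equivalent TOC acceptance limit for aqueous verification.
    ```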

  3. Innovative safety valve selection techniques and data.

    PubMed

    Miller, Curt; Bredemyer, Lindsey

    2007-04-11

    The new valve data resources and modeling tools that are available today are instrumental in verifying that safety levels are being met in both current installations and project designs. If the new ISA 84 functional safety practices are followed closely, good industry-validated data used, and a user's maintenance integrity program strictly enforced, plants should feel confident that their design has been quantitatively reinforced. After 2 years of exhaustive reliability studies, there are now techniques and data available to address this safety system component deficiency. Everyone who has gone through the process of safety integrity level (SIL) verification (i.e., reliability math) will appreciate the progress made in this area. The benefits of these advancements are improved safety with lower lifecycle costs, such as lower capital investment and/or longer testing intervals. This discussion starts with a review of the different valve, actuator, and solenoid/positioner combinations that can be used and their associated application constraints. Failure rate reliability studies (i.e., FMEDA) and data associated with the final combinations are then discussed. Finally, the impact of the selections on each safety system's SIL verification is reviewed.
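
    A minimal sketch of the "reliability math" behind SIL verification mentioned above, using the simplest single-channel (1oo1) approximation PFDavg = lambda_DU * TI / 2 together with the IEC 61508 low-demand SIL bands. The dangerous-undetected failure rate and the yearly proof-test interval are illustrative assumptions, not FMEDA results.

    ```python
    def pfd_avg_1oo1(lambda_du_per_hour: float, proof_test_hours: float) -> float:
        """Average probability of failure on demand of a single (1oo1) final
        element: PFDavg ~= lambda_DU * TI / 2."""
        return lambda_du_per_hour * proof_test_hours / 2.0

    def sil_band(pfd: float) -> int:
        """SIL implied by PFDavg per IEC 61508, low-demand mode."""
        for sil, lower in ((4, 1e-5), (3, 1e-4), (2, 1e-3), (1, 1e-2)):
            if lower <= pfd < 10 * lower:
                return sil
        return 0

    # Example: lambda_DU = 2e-6 /h for a valve/actuator/solenoid combination,
    # proof-tested yearly (8760 h) -> PFDavg of about 8.8e-3, the SIL 2 band.
    pfd = pfd_avg_1oo1(2e-6, 8760.0)
    print(f"PFDavg = {pfd:.2e}, SIL {sil_band(pfd)}")
    # Doubling the proof-test interval roughly doubles PFDavg -- the
    # testing-interval trade-off referred to above.
    ```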

  4. eBiometrics: an enhanced multi-biometrics authentication technique for real-time remote applications on mobile devices

    NASA Astrophysics Data System (ADS)

    Kuseler, Torben; Lami, Ihsan; Jassim, Sabah; Sellahewa, Harin

    2010-04-01

    The use of mobile communication devices with advanced sensors is growing rapidly. These sensors enable functions such as image capture, location applications, and biometric authentication, including fingerprint verification and face and handwritten signature recognition. Such ubiquitous devices are essential tools in today's global economic activities, enabling anywhere-anytime financial and business transactions. Cryptographic functions and biometric-based authentication can enhance the security and confidentiality of mobile transactions. Biometric template security techniques are key factors for successful real-time identity verification solutions, but they are vulnerable to determined attacks by both fraudulent software and hardware. The EU-funded SecurePhone project has designed and implemented a multimodal biometric user authentication system on a prototype mobile communication device. However, various implementations of this project have resulted in long verification times or reduced accuracy and/or security. This paper proposes the use of built-in self-test techniques to ensure no tampering has taken place in the verification process prior to performing the actual biometric authentication. These techniques use the user's personal identification number as a seed to generate a unique signature, which is then used to test the integrity of the verification process. This study also proposes the use of a combination of biometric modalities to provide application-specific authentication in a secure environment, thus achieving an optimum security level with effective processing time, i.e., ensuring that the necessary authentication steps and algorithms running on the mobile device's application processor cannot be undermined or modified by an impostor to gain unauthorized access to the secure system.
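
    The paper does not specify its signature algorithm, so the following is only a conceptual sketch of the built-in self-test idea: derive a key from the user's PIN, sign the verification pipeline's artifacts, and run the biometric match only if the signature still checks out. The PBKDF2/HMAC construction, the salt, and the artifact names are assumptions for illustration.

    ```python
    import hashlib
    import hmac

    def process_signature(pin: str, artifacts: list) -> bytes:
        """Derive a key from the PIN, then MAC the verification pipeline's
        artifacts (matcher code, templates, parameters) so that any
        tampering changes the resulting signature."""
        key = hashlib.pbkdf2_hmac("sha256", pin.encode(), b"device-salt", 100_000)
        mac = hmac.new(key, digestmod=hashlib.sha256)
        for blob in artifacts:
            mac.update(blob)
        return mac.digest()

    trusted = process_signature("1234", [b"matcher-v1.0", b"template-db"])
    current = process_signature("1234", [b"matcher-v1.0", b"template-db"])
    # Proceed to the biometric match only when the integrity check passes.
    assert hmac.compare_digest(trusted, current)
    ```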

  5. Derivation of sorting programs

    NASA Technical Reports Server (NTRS)

    Varghese, Joseph; Loganantharaj, Rasiah

    1990-01-01

    Program synthesis for critical applications has become a viable alternative to program verification. Nested resolution and its extension are used to synthesize a set of sorting programs from their first-order logic specifications. A set of sorting programs, such as naive sort, merge sort, and insertion sort, was successfully synthesized starting from the same set of specifications.

  6. FY2017 Final Report: Power of the People: A technical ethical and experimental examination of the use of crowdsourcing to support international nuclear safeguards verification.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gastelum, Zoe Nellie; Sentz, Kari; Swanson, Meili Claire

    Recent advances in information technology have led to an expansion of crowdsourcing activities that utilize the "power of the people," harnessed via online games, communities of interest, and other platforms, to collect, analyze, verify, and provide technological solutions for challenges from a multitude of domains. In response to this surge in popularity, the research team developed a taxonomy of crowdsourcing activities as they relate to international nuclear safeguards, evaluated the potential legal and ethical issues surrounding the use of crowdsourcing to support safeguards, and proposed experimental designs to test the capabilities and prospects for the use of crowdsourcing to support nuclear safeguards verification.

  7. Array automated assembly task, phase 2. Low cost silicon solar array project

    NASA Technical Reports Server (NTRS)

    Rhee, S. S.; Jones, G. T.; Allison, K. T.

    1978-01-01

    Several modifications instituted in the wafer surface preparation process served to significantly reduce the process cost, to 1.55 cents per peak watt in 1975 cents. Performance verification tests of a laser scanning system showed a limited capability to detect hidden cracks or defects, but with potential equipment modifications this cost-effective system could be rendered suitable for such applications. Installation of an electroless nickel plating system was completed, along with optimization of the wafer plating process. The solder coating and flux removal process verification test was completed; an optimum temperature range of 500-550 C was found to produce a uniform solder coating, with the restriction that a modified dipping procedure be utilized. Finally, the construction of the spray-on dopant equipment was completed.

  8. Advanced manufacturing development of a composite empennage component for L-1011 aircraft

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Work on process verification and tooling development continued. The cover process development was completed with the decision to proceed with low resin content prepreg material (34 + or - 3% by weight) in the fabrication of production readiness verification test (PRVT) specimens and the full-scale covers. The structural integrity of the cover/joint design was verified with the successful test of the cover attachment to fuselage ancillary test specimen (H25). Failure occurred, as predicted, in the skin panel away from the fuselage joint at 141 percent of the design ultimate load. With the successful completion of the H25 test, the PRVT cover specimens, which are identical to the H25 ancillary test specimen, were cleared for production. Eight of the twenty cover specimens were fabricated and are in preparation for test. All twenty of the PRVT spar specimens were fabricated and also were prepared for test. The environmental chambers used in the durability test of ten cover and ten spar PRVT specimens were completed and installed in the load reaction frames.

  9. A Randomized Controlled Study of the Use of Video Double-Lumen Endobronchial Tubes Versus Double-Lumen Endobronchial Tubes in Thoracic Surgery.

    PubMed

    Heir, Jagtar Singh; Guo, Shu-Lin; Purugganan, Ronaldo; Jackson, Tim A; Sekhon, Anupamjeet Kaur; Mirza, Kazim; Lasala, Javier; Feng, Lei; Cata, Juan P

    2018-02-01

    To compare the incidence of fiberoptic bronchoscope (FOB) use (1) during verification of initial placement and (2) for reconfirmation of correct placement following repositioning, when either a double-lumen tube (DLT) or video double-lumen tube (VDLT) was used for lung isolation during thoracic surgery. A randomized controlled study at a single-center university teaching hospital. The study comprised 80 patients, 18 years or older, requiring lung isolation for surgery. After institutional review board approval, patients were randomized prior to surgery to either DLT or VDLT use. Attending anesthesiologists placed the Mallinckrodt DLT or VivaSight (ETView Ltd, Misgav, Israel) VDLT with conventional or video laryngoscopy, then verified correct tube position through the view provided by either the VDLT external monitor or the FOB. Data collected included: sex, body mass index, successful intubation and endobronchial placement, intubation time, confirmation time of tube position, FOB use, quality of view, dislodgement of the tube, ability to forewarn of endobronchial cuff dislodgement, and complications. Results included FOB use for verification of final tube position (VDLT 13.2% [5/38] v DLT 100% [42/42], p < 0.0001), need for FOB to correct dislodgement (VDLT 7.7% [1/13] v DLT 100% [14/14], p < 0.0001), dislodgement during positioning (VDLT 61.5% [8/13] v DLT 64.3% [9/14], p = ns), dislodgement during surgery (VDLT 38.5% [5/13] v DLT 21.4% [3/14], p = ns), and ability to forewarn of endobronchial cuff dislodgement (VDLT 18.4% [7/38] v DLT 4.8% [2/42], p = 0.078). This study demonstrated a reduction of 86.8% in FOB use, similar to the reductions found in other published studies.

  10. Expert system verification and validation study

    NASA Technical Reports Server (NTRS)

    French, Scott W.; Hamilton, David

    1992-01-01

    Five workshops on verification and validation (V&V) of expert systems (ES) were taught during this recent period of performance. Two key activities previously performed under this contract supported these recent workshops: (1) a survey of the state of the practice of V&V of ES, and (2) the development of workshop material and the first class. The first activity involved an extensive survey of ES developers to answer several questions regarding the state of the practice in V&V of ES, relating to the amount and type of V&V done and how successful it was. The second key activity involved developing an intensive hands-on workshop in V&V of ES. This activity involved surveying a large number of V&V techniques, conventional as well as ES-specific ones. In addition to explaining the techniques, we showed how each technique could be applied to a sample problem. References were included in the workshop material and cross-referenced to techniques, so that students would know where to find additional information about each technique. In addition to teaching specific techniques, we included an extensive amount of material on V&V concepts and how to develop a V&V plan for an ES project. We felt this material was necessary so that developers would be prepared to take an orderly and structured approach to V&V; that is, they would have a process that supported the use of the specific techniques. Finally, to provide hands-on experience, we developed a set of case study exercises that gave the students an opportunity to apply all the material (concepts, techniques, and planning material) to a realistic problem.

  11. Attention and implicit memory in the category-verification and lexical decision tasks.

    PubMed

    Mulligan, Neil W; Peterson, Daniel

    2008-05-01

    Prior research on implicit memory appeared to support 3 generalizations: Conceptual tests are affected by divided attention, perceptual tasks are affected by certain divided-attention manipulations, and all types of priming are affected by selective attention. These generalizations are challenged in experiments using the implicit tests of category verification and lexical decision. First, both tasks were unaffected by divided-attention tasks known to impact other priming tasks. Second, both tasks were unaffected by a manipulation of selective attention in which colored words were either named or their colors identified. Thus, category verification, unlike other conceptual tasks, appears unaffected by divided attention, and some selective-attention tasks, and lexical decision, unlike other perceptual tasks, appears unaffected by a difficult divided-attention task and some selective-attention tasks. Finally, both tasks were affected by a selective-attention task in which attention was manipulated across objects (rather than within objects), indicating some susceptibility to selective attention. The results contradict an analysis on the basis of the conceptual-perceptual distinction and other more specific hypotheses but are consistent with the distinction between production and identification priming.

  12. Verification and Optimal Control of Context-Sensitive Probabilistic Boolean Networks Using Model Checking and Polynomial Optimization

    PubMed Central

    Hiraishi, Kunihiko

    2014-01-01

    One of the significant topics in systems biology is the development of control theory for gene regulatory networks (GRNs). In typical control of GRNs, the expression of some genes is inhibited (or activated) by manipulating external stimuli and the expression of other genes. Control theory of GRNs is expected to be applied to gene therapy technologies in the future. In this paper, a control method using a Boolean network (BN) is studied. A BN is widely used as a model of GRNs, in which gene expression is expressed by a binary value (ON or OFF). In particular, a context-sensitive probabilistic Boolean network (CS-PBN), one of the extended models of BNs, is used. For CS-PBNs, the verification problem and the optimal control problem are considered. For the verification problem, a solution method using the probabilistic model checker PRISM is proposed. For the optimal control problem, a solution method using polynomial optimization is proposed. Finally, a numerical example on the WNT5A network, which is related to melanoma, is presented. The proposed methods provide useful tools for control theory of GRNs. PMID:24587766
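
    A minimal sketch of the kind of model being verified: a tiny probabilistic Boolean network on two genes with two contexts and random gene perturbation. It is simplified in that a context is re-drawn at every step, whereas a context-sensitive PBN switches context only occasionally; the network, the probabilities, and the simulated long-run estimate are all illustrative. A model checker such as PRISM computes such probabilities exactly on the induced Markov chain rather than by simulation.

    ```python
    import random

    # Two contexts, each a tuple of update functions for genes (x0, x1),
    # selected with the given probabilities at every synchronous step.
    CONTEXTS = [
        ((lambda s: s[1], lambda s: s[0] and s[1]), 0.7),
        ((lambda s: not s[1], lambda s: s[0] or s[1]), 0.3),
    ]

    def step(state, p_perturb=0.01):
        """Pick a context, apply its update functions, then flip each gene
        independently with a small perturbation probability."""
        funcs = random.choices([c for c, _ in CONTEXTS],
                               weights=[w for _, w in CONTEXTS])[0]
        nxt = tuple(bool(f(state)) for f in funcs)
        return tuple(b ^ (random.random() < p_perturb) for b in nxt)

    # Estimate the long-run probability that gene x0 is ON.
    random.seed(0)
    state, on = (True, False), 0
    for _ in range(100_000):
        state = step(state)
        on += state[0]
    print(on / 100_000)
    ```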

  13. VerifEYE: a real-time meat inspection system for the beef processing industry

    NASA Astrophysics Data System (ADS)

    Kocak, Donna M.; Caimi, Frank M.; Flick, Rick L.; Elharti, Abdelmoula

    2003-02-01

    Described is a real-time meat inspection system developed for the beef processing industry by eMerge Interactive. Designed to detect and localize trace amounts of contamination on cattle carcasses in the packing process, the system affords the beef industry an accurate, high-speed, passive optical method of inspection. Using a method patented by the United States Department of Agriculture and Iowa State University, the system takes advantage of fluorescing chlorophyll found in the animal's diet, and therefore in the digestive tract, to detect and image contaminated areas that may harbor potentially dangerous microbial pathogens. Featuring real-time image processing and documentation of performance, the system can be easily integrated into a processing facility's Hazard Analysis and Critical Control Point quality assurance program. This paper describes the VerifEYE carcass inspection and removal verification system. Results indicating the feasibility of the method, as well as field data collected with a prototype system during four university trials conducted in 2001, are presented. Two successful demonstrations using the prototype system were held at a major U.S. meat processing facility in early 2002.

  14. Comprehensive predictions of target proteins based on protein-chemical interaction using virtual screening and experimental verifications.

    PubMed

    Kobayashi, Hiroki; Harada, Hiroko; Nakamura, Masaomi; Futamura, Yushi; Ito, Akihiro; Yoshida, Minoru; Iemura, Shun-Ichiro; Shin-Ya, Kazuo; Doi, Takayuki; Takahashi, Takashi; Natsume, Tohru; Imoto, Masaya; Sakakibara, Yasubumi

    2012-04-05

    Identification of the target proteins of bioactive compounds is critical for elucidating their mode of action; however, target identification has generally been difficult, mostly due to the low sensitivity of detection using affinity chromatography followed by CBB staining and MS/MS analysis. We applied our protocol of predicting target proteins, combining in silico screening and experimental verification, to incednine, which inhibits the anti-apoptotic function of Bcl-xL by an unknown mechanism. One hundred eighty-two target protein candidates were computationally predicted to bind to incednine by the statistical prediction method, and the predictions were verified by in vitro binding of incednine to seven proteins whose expression could be confirmed in our cell system. As a result, the computational predictions achieved 40% accuracy, and we newly identified three incednine-binding proteins. This study demonstrated that our proposed protocol for predicting target proteins, combining in silico screening and experimental verification, is useful and provides new insight into strategies for identifying the target proteins of small molecules.

  15. NEW DEVELOPMENTS AND APPLICATIONS OF SUPERHEATED EMULSIONS: WARHEAD VERIFICATION AND SPECIAL NUCLEAR MATERIAL INTERDICTION.

    PubMed

    d'Errico, F; Chierici, A; Gattas-Sethi, M; Philippe, S; Goldston, R; Glaser, A

    2018-04-25

    In recent years, neutron detection with superheated emulsions has received renewed attention thanks to improved detector manufacturing and read-out techniques, and thanks to successful applications in warhead verification and special nuclear material (SNM) interdiction. Detectors are currently manufactured with methods allowing high uniformity of the drop sizes, which in turn allows the use of optical read-out techniques based on dynamic light scattering. Small detector cartridges arranged in 2D matrices are developed for the verification of a declared warhead without revealing its design. For this application, the enabling features of the emulsions are that bubbles formed at different times cannot be distinguished from each other, while the passive nature of the detectors avoids the susceptibility to electronic snooping and tampering. Large modules of emulsions are developed to detect the presence of shielded special nuclear materials hidden in cargo containers 'interrogated' with high energy X-rays. In this case, the enabling features of the emulsions are photon discrimination, a neutron detection threshold close to 3 MeV and a rate-insensitive read-out.

  16. Finite element simulation and Experimental verification of Incremental Sheet metal Forming

    NASA Astrophysics Data System (ADS)

    Kaushik Yanamundra, Krishna; Karthikeyan, R.; Naranje, Vishal

    2018-04-01

    Incremental sheet metal forming is now a proven manufacturing technique that can be employed to produce application-specific, customized, symmetric or asymmetric shapes required by the automobile or biomedical industries, such as car body parts, dental implants, or knee implants. Finite element simulation of the metal forming process is performed successfully using the explicit dynamics analysis of commercial FE software. The simulation is useful mainly for optimization of the process as well as design of the final product. This paper focuses on simulating the incremental sheet metal forming process in ABAQUS and validating the results experimentally. The test shapes are a trapezoid, a dome, and an ellipse, whose G-codes are written and fed into a CNC milling machine fitted with a forming tool that has a hemispherical tip. The same pre-generated coordinates are used to simulate equivalent machining conditions in ABAQUS, with the tool forces and the stresses and strains in the workpiece obtained as output data. The forces were recorded experimentally using a dynamometer. The experimental and simulated results were then compared and conclusions drawn.

  17. Assessment of Thermal Control and Protective Coatings

    NASA Technical Reports Server (NTRS)

    Mell, Richard J.

    2000-01-01

    This final report covers the tasks performed during the contract period, which included spacecraft coating development, testing, and application. Five marker coatings, along with a bright yellow handrail coating, a protective overcoat for ceramic coatings, and specialized primers for composite (polymer) surfaces, were developed and commercialized by AZ Technology during this program. Most of the coatings have passed space environmental stability requirements via ground tests and/or flight verification. Marker coatings and protective overcoats were successfully flown on the Passive Optical Sample Assembly (POSA) and the Optical Properties Monitor (OPM) experiments aboard the Russian space station Mir. To date, most of the coatings developed and/or modified during this program have been used on the International Space Station (ISS) and other spacecraft. For ISS, AZ Technology manufactured the 'UNITY' emblem now flying on the NASA Unity node (Node 1) docked to the Russian Zarya (FGB), using the colored marker coatings (white, blue, red) developed by AZ Technology. The Unity emblem comprised the U.S. flag, the Unity logo, and the NASA logo on a white background, applied to a Beta cloth substrate.

  18. Modeling interfacial fracture in Sierra.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brown, Arthur A.; Ohashi, Yuki; Lu, Wei-Yang

    2013-09-01

    This report summarizes computational efforts to model interfacial fracture using cohesive zone models in the SIERRA/SolidMechanics (SIERRA/SM) finite element code. Cohesive surface elements were used to model crack initiation and propagation along predefined paths. Mesh convergence was observed with SIERRA/SM for numerous geometries. As the funding for this project came from the Advanced Simulation and Computing Verification and Validation (ASC V&V) focus area, considerable effort was spent performing verification and validation. Code verification was performed to compare code predictions to analytical solutions for simple three-element simulations as well as a higher-fidelity simulation of a double-cantilever beam. Parameter identification was conducted with Dakota using experimental results on asymmetric double-cantilever beam (ADCB) and end-notched-flexure (ENF) experiments conducted under Campaign-6 funding. Discretization convergence studies were also performed with respect to mesh size and time step, and an optimization study was completed for mode II delamination using the ENF geometry. Throughout this verification process, numerous SIERRA/SM bugs were found and reported, all of which have been fixed, leading to over a 10-fold increase in convergence rates. Finally, mixed-mode flexure experiments were performed for validation. One of the unexplained issues encountered was material property variability for ostensibly the same composite material. Since the variability is not fully understood, it is difficult to accurately assess uncertainty when performing predictions.
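
    Cohesive surface elements encode a traction-separation law; the widely used bilinear form rises to a peak strength and then softens to zero, with the enclosed area equal to the fracture energy. Below is a minimal Python sketch of that law (parameter values invented; this is a generic illustration, not SIERRA/SM's implementation):

```python
# Bilinear cohesive law: traction grows linearly to t_max at separation
# d_0, then softens linearly to zero at d_f (complete debonding). The
# area under the curve is the fracture energy G_c = 0.5 * t_max * d_f.
t_max = 30.0e6   # peak traction, Pa (illustrative)
d_0 = 1.0e-6     # separation at peak traction, m
d_f = 10.0e-6    # separation at full failure, m

def traction(d):
    """Traction (Pa) for a monotonically opening separation d (m)."""
    if d <= 0.0 or d >= d_f:
        return 0.0
    if d < d_0:
        return t_max * d / d_0              # elastic loading branch
    return t_max * (d_f - d) / (d_f - d_0)  # linear softening branch

G_c = 0.5 * t_max * d_f  # fracture energy, J/m^2
print(f"G_c = {G_c:.1f} J/m^2, peak traction = {traction(d_0):.3e} Pa")
```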

  19. Verification Challenges at Low Numbers

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will pass through stepping stones at 1000 warheads, hundreds of warheads, and then tens of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper focuses on these three threshold reduction levels: 1000, hundreds, and tens. For each, the issues and challenges are discussed, potential solutions are identified, and the verification technologies and chain-of-custody measures that address these solutions are surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper explores new or novel technologies that could be applied. These technologies draw from research and development ongoing throughout the national laboratory complex, and from technologies used in other areas of industry, for their application to arms control verification.

  20. Thermal System Verification and Model Validation for NASA's Cryogenic Passively Cooled James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Cleveland, Paul E.; Parrish, Keith A.

    2005-01-01

    A thorough and unique thermal verification and model validation plan has been developed for NASA's James Webb Space Telescope. The JWST observatory consists of a large deployed-aperture optical telescope passively cooled to below 50 Kelvin, along with a suite of instruments passively and actively cooled to below 37 Kelvin and 7 Kelvin, respectively. Passive cooling to these extremely low temperatures is made feasible by the use of a large deployed high-efficiency sunshield and an orbit location at the L2 Lagrange point. Another enabling feature is the scale of the observatory, which allows for radiator sizes compatible with the expected power dissipation of the instruments and the large-format Mercury Cadmium Telluride (HgCdTe) detector arrays. This passive cooling concept is simple, reliable, and mission-enabling when compared to the alternatives of mechanical coolers and stored cryogens. However, the same large-scale observatory features that make passive cooling viable also prevent the typical flight-configuration, fully-deployed thermal balance test that is the keystone of most space missions' thermal verification plans. JWST is simply too large in its deployed configuration to be properly thermal-balance tested in existing facilities. This reality, combined with a mission thermal concept having little to no flight heritage, has necessitated a unique and alternative approach to thermal system verification and model validation. This paper describes the thermal verification and model validation plan developed for JWST. The plan relies on judicious use of cryogenic and thermal design margin, a completely independent thermal modeling cross-check using different analysis teams and software packages, and, finally, a comprehensive set of thermal tests at different levels of JWST assembly. After a brief description of the JWST mission and thermal architecture, a detailed description of the three aspects of the thermal verification and model validation plan is presented.

  1. Model Checking Verification and Validation at JPL and the NASA Fairmont IV and V Facility

    NASA Technical Reports Server (NTRS)

    Schneider, Frank; Easterbrook, Steve; Callahan, Jack; Montgomery, Todd

    1999-01-01

    We show how a technology transfer effort was carried out. The successful use of model checking on a pilot JPL flight project demonstrates the usefulness and efficacy of the approach. The pilot project was used to model a complex spacecraft controller. Software design and implementation validation were carried out successfully. To suggest future applications, we also show how the implementation validation step can be automated. The effort was followed by the formal introduction of the modeling technique as part of the JPL Quality Assurance process.

  2. Ares I-X Range Safety Trajectory Analyses Overview and Independent Validation and Verification

    NASA Technical Reports Server (NTRS)

    Tarpley, Ashley F.; Starr, Brett R.; Tartabini, Paul V.; Craig, A. Scott; Merry, Carl M.; Brewer, Joan D.; Davis, Jerel G.; Dulski, Matthew B.; Gimenez, Adrian; Barron, M. Kyle

    2011-01-01

    All Flight Analysis data products were successfully generated and delivered to the 45SW in time to support the launch. The IV&V effort allowed data generators to work through issues early. Data consistency, demonstrated through the IV&V process, provided confidence that the delivered data was of high quality. Flight plan approval was granted for the launch. The test flight was successful and had no safety-related issues. The flight occurred within the predicted flight envelopes, and post-flight reconstruction results verified that the simulations accurately predicted the FTV trajectory.

  3. Large area sheet task: Advanced Dendritic Web Growth Development

    NASA Technical Reports Server (NTRS)

    Duncan, C. S.; Seidensticker, R. G.; Mchugh, J. P.; Hopkins, R. H.; Meier, D.; Schruben, J.

    1981-01-01

    A melt level control system was implemented to provide stepless silicon feed rates from zero to rates exactly matching the silicon consumed during web growth. Bench tests of the unit were successfully completed and the system mounted in a web furnace for operational verification. Tests of long term temperature drift correction techniques were made; web width monitoring seems most appropriate for feedback purposes. A system to program the initiation of the web growth cycle was successfully tested. A low cost temperature controller was tested which functions as well as units four times as expensive.

  4. Skill of Predicting Heavy Rainfall Over India: Improvement in Recent Years Using UKMO Global Model

    NASA Astrophysics Data System (ADS)

    Sharma, Kuldeep; Ashrit, Raghavendra; Bhatla, R.; Mitra, A. K.; Iyengar, G. R.; Rajagopal, E. N.

    2017-11-01

    The quantitative precipitation forecast (QPF) performance for heavy rains is still a challenge, even for the most advanced state-of-the-art high-resolution Numerical Weather Prediction (NWP) modeling systems. This study evaluates the performance of the UK Met Office Unified Model (UKMO) over India for prediction of high rainfall amounts (>2 and >5 cm/day) during the monsoon period (JJAS) from 2007 to 2015 in short-range forecasts up to Day 3. Among the various modeling upgrades and improvements in the parameterizations during this period, the model horizontal resolution improved from 40 km in 2007 to 17 km in 2015. The skill of short-range rainfall forecasts has improved in the UKMO model in recent years, mainly due to increased horizontal and vertical resolution along with improved physics schemes. Categorical verification carried out using four verification metrics, namely probability of detection (POD), false alarm ratio (FAR), frequency bias (Bias), and Critical Success Index (CSI), indicates that QPF has improved by >29% in POD and >24% in FAR. Additionally, verification scores such as EDS (Extreme Dependency Score), EDI (Extremal Dependence Index), and SEDI (Symmetric EDI) are used, with special emphasis on verification of extreme and rare rainfall events. These scores also show an improvement of 60% (EDS) and >34% (EDI and SEDI) during the period of study, suggesting an improved skill in predicting heavy rains.
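
    The four categorical metrics come from a 2x2 contingency table of forecast versus observed threshold exceedances. A small self-contained sketch of their standard definitions (the event counts below are invented):

```python
def categorical_scores(hits, false_alarms, misses):
    """Standard 2x2 contingency-table scores for event forecasts.

    hits         -- forecast yes, observed yes (a)
    false_alarms -- forecast yes, observed no  (b)
    misses       -- forecast no,  observed yes (c)
    """
    pod = hits / (hits + misses)                    # probability of detection
    far = false_alarms / (hits + false_alarms)      # false alarm ratio
    bias = (hits + false_alarms) / (hits + misses)  # frequency bias
    csi = hits / (hits + false_alarms + misses)     # critical success index
    return pod, far, bias, csi

# Illustrative counts for >5 cm/day rainfall events:
pod, far, bias, csi = categorical_scores(hits=42, false_alarms=18, misses=25)
print(f"POD={pod:.2f} FAR={far:.2f} Bias={bias:.2f} CSI={csi:.2f}")
```

    EDS, EDI, and SEDI are built from the same contingency table but are designed to remain informative as the events of interest become rare.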

  5. Analysis of Phenix end-of-life natural convection test with the MARS-LMR code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeong, H. Y.; Ha, K. S.; Lee, K. L.

    The end-of-life test of the Phenix reactor performed by the CEA provided an opportunity to obtain reliable and valuable test data for the validation and verification of an SFR system analysis code. KAERI joined this international program for the analysis of the Phenix end-of-life natural circulation test, coordinated by the IAEA, in 2008. The main objectives of this study were to evaluate the capability of the existing SFR system analysis code MARS-LMR and to identify any limitations of the code. The analysis was performed in three stages: pre-test analysis, blind post-test analysis, and final post-test analysis. In the pre-test analysis, the design conditions provided by the CEA were used to obtain a prediction of the test. The blind post-test analysis was based on the test conditions measured during the tests, but the test results were not provided by the CEA. The final post-test analysis was performed to predict the test results as accurately as possible by improving the previous modeling of the test. Based on the pre-test and blind analyses, the modeling of heat structures in the hot and cold pools, steel structures in the core, heat loss from the roof and vessel, and the flow path at the core outlet were reinforced in the final analysis. The results of the final post-test analysis can be characterized in three phases. In the early phase, MARS-LMR simulated the heat-up process correctly due to the enhanced heat structure modeling. In the mid phase, before the opening of the SG casing, the code successfully reproduced the decrease in core outlet temperature. Finally, in the later phase, the increase in heat removal from the opening of the SG casing was well predicted by the MARS-LMR code. (authors)

  6. Practical Verification & Safeguard Tools for C/C++

    DTIC Science & Technology

    2007-11-01

    735; DRDC Valcartier; November 2007. This document is the final report of a research project conducted in 2005-2006. The surviving table-of-contents fragments cover defects and memory management problems, including use of freed memory and under-allocated memory.

  7. Perpetual Model Validation

    DTIC Science & Technology

    2017-03-01

    The research considered using indirect models of software execution, for example memory access patterns, to check for security intrusions. Additional research was performed to address cases where the system, through deterioration for example, no longer corresponds to the model used at verification time. Finally, the research looked at ways to combine hybrid systems

  8. 76 FR 53918 - Privacy Act of 1974; Department of Homeland Security/Federal Emergency Management Agency-001...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-08-30

    ... their status as a member of law enforcement. During Hurricane Katrina, displaced individuals experienced... in Section 689c of the Post- Katrina Emergency Management Reform Act (PKEMRA) of 2006, Public Law 109... successful verification. Authority for maintenance of the system: Section 689c of the Post-Katrina Emergency...

  9. Avoiding treatment bias of REDD+ monitoring by sampling with partial replacement

    Treesearch

    Michael Kohl; Charles T Scott; Andrew J Lister; Inez Demon; Daniel Plugge

    2015-01-01

    Implementing REDD+ renders the development of a measurement, reporting and verification (MRV) system necessary to monitor carbon stock changes. MRV systems generally apply a combination of remote sensing techniques and in-situ field assessments. In-situ assessments can be based on 1) permanent plots, which are assessed on all successive occasions, 2) temporary plots,...

  10. Constellation Ground Systems Launch Availability Analysis: Enhancing Highly Reliable Launch Systems Design

    NASA Technical Reports Server (NTRS)

    Gernand, Jeffrey L.; Gillespie, Amanda M.; Monaghan, Mark W.; Cummings, Nicholas H.

    2010-01-01

    Success of the Constellation Program's lunar architecture requires successfully launching two vehicles, Ares I/Orion and Ares V/Altair, within a very limited time period. The reliability and maintainability of the flight vehicles and ground systems must deliver a high probability of successfully launching the second vehicle in order to avoid wasting the on-orbit asset launched by the first vehicle. The Ground Operations Project determined which ground subsystems had the potential to affect the probability of the second launch and allocated quantitative availability requirements to these subsystems. The Ground Operations Project also developed a methodology to estimate subsystem reliability, availability, and maintainability to ensure that ground subsystems complied with the allocated launch availability and maintainability requirements. The verification analysis developed quantitative estimates of subsystem availability based on design documentation, testing results, and other information. Where appropriate, actual performance history was used to calculate failure rates for legacy subsystems or comparable components that will support Constellation. The results of the verification analysis will be used to assess compliance with requirements and to highlight design or performance shortcomings for further decision making. This case study discusses the subsystem requirements allocation process, describes the ground systems methodology for completing quantitative reliability, availability, and maintainability analysis, and presents findings and observations based on the analysis leading to the Ground Operations Project Preliminary Design Review milestone.
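
    If the second launch requires every critical ground subsystem to be up, and the subsystems are treated as independent and in series, the composite launch availability is the product of the subsystem availabilities. A toy sketch of such a roll-up (subsystem names and numbers are invented, not the project's actual allocations):

```python
# Series-system availability roll-up. Steady-state availability of one
# subsystem is often estimated as A = MTBF / (MTBF + MTTR); here the
# per-subsystem values are simply assumed for illustration.
subsystem_availability = {
    "propellant_loading": 0.995,
    "environmental_control": 0.998,
    "electrical_power": 0.999,
    "comm_and_telemetry": 0.997,
}

launch_availability = 1.0
for name, a in subsystem_availability.items():
    launch_availability *= a  # all subsystems must be up simultaneously

print(f"Composite launch availability: {launch_availability:.4f}")
```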

  11. Definition of ground test for verification of large space structure control

    NASA Technical Reports Server (NTRS)

    Glaese, John R.

    1994-01-01

    Under this contract, the Large Space Structure Ground Test Verification (LSSGTV) Facility at the George C. Marshall Space Flight Center (MSFC) was developed. Planning in coordination with NASA was finalized and implemented. The contract was modified and extended with several increments of funding to procure additional hardware and to continue support for the LSSGTV facility. Additional tasks were defined for the performance of studies in the dynamics, control and simulation of tethered satellites. When the LSSGTV facility development task was completed, support and enhancement activities were funded through a new competitive contract won by LCD. All work related to LSSGTV performed under NAS8-35835 has been completed and documented. No further discussion of these activities will appear in this report. This report summarizes the tether dynamics and control studies performed.

  12. Design, analysis, and test verification of advanced encapsulation systems

    NASA Technical Reports Server (NTRS)

    Mardesich, N.; Minning, C.

    1982-01-01

    Design sensitivities are established for the development of photovoltaic module criteria and the definition of needed research tasks. The program consists of three phases. In Phase I, analytical models were developed to perform optical, thermal, electrical, and structural analyses on candidate encapsulation systems. From these analyses, several candidate systems will be selected for qualification testing during Phase II. Additionally, during Phase II, test specimens of various types will be constructed and tested to determine the validity of the analysis methodology developed in Phase I. In Phase III, a finalized optimum design based on the knowledge gained in Phases I and II will be developed. All verification testing was completed during this period. Preliminary results and observations are discussed. Descriptions of the thermal, thermal-structural, and structural deflection test setups are included.

  13. Applicability of SREM to the Verification of Management Information System Software Requirements. Volume II.

    DTIC Science & Technology

    1981-04-30

    Final Report: Applicability of SREM to the Verification of Management Information System Software Requirements, prepared for the Army.

  14. Land use/land cover mapping (1:25000) of Taiwan, Republic of China by automated multispectral interpretation of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Sung, Q. C.; Miller, L. D.

    1977-01-01

    Three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought, owing to the difficulty of retrospectively collecting representative ground control data. Computer preprocessing techniques applied to the digital images to improve the final classification results were geometric correction, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was performed based on comparisons between the airphoto estimates and the classification results. The verification provided further support for the selection of MSS bands 5 and 7. It also indicated that the maximum likelihood ratioing technique can achieve classification results in closer agreement with the airphoto estimates than stepwise discriminant analysis.
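
    Maximum likelihood classification models each land-cover class as a multivariate Gaussian estimated from its training set and assigns a pixel to the class with the highest log-likelihood. A minimal NumPy sketch with two spectral bands and synthetic training data (class names and statistics are invented):

```python
import numpy as np

def fit_class(samples):
    """Estimate the Gaussian (mean, covariance) of one training set."""
    return samples.mean(axis=0), np.cov(samples, rowvar=False)

def log_likelihood(x, mu, cov):
    d = x - mu
    _, logdet = np.linalg.slogdet(cov)
    return -0.5 * (logdet + d @ np.linalg.solve(cov, d))

# Synthetic training pixels in two spectral bands (placeholders):
rng = np.random.default_rng(0)
classes = {
    "water":  fit_class(rng.normal([20.0, 10.0], 3.0, size=(100, 2))),
    "forest": fit_class(rng.normal([40.0, 60.0], 5.0, size=(100, 2))),
}

pixel = np.array([38.0, 55.0])
label = max(classes, key=lambda c: log_likelihood(pixel, *classes[c]))
print(label)  # -> forest
```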

  15. Cassini's RTGs undergo mechanical and electrical verification tests in the PHSF

    NASA Technical Reports Server (NTRS)

    1997-01-01

    This radioisotope thermoelectric generator (RTG), at center, is ready for electrical verification testing now that it has been installed on the Cassini spacecraft in the Payload Hazardous Servicing Facility. A handling fixture, at far left, remains attached. This is the third and final RTG to be installed on Cassini for the prelaunch tests. The RTGs will provide electrical power to Cassini on its 6.7-year trip to the Saturnian system and during its four-year mission at Saturn. RTGs use heat from the natural decay of plutonium to generate electric power. The generators enable spacecraft to operate at great distances from the Sun where solar power systems are not feasible. The Cassini mission is targeted for an Oct. 6 launch aboard a Titan IVB/Centaur expendable launch vehicle.

  16. Spot scanning proton therapy plan assessment: design and development of a dose verification application for use in routine clinical practice

    NASA Astrophysics Data System (ADS)

    Augustine, Kurt E.; Walsh, Timothy J.; Beltran, Chris J.; Stoker, Joshua B.; Mundy, Daniel W.; Parry, Mark D.; Bues, Martin; Fatyga, Mirek

    2016-04-01

    The use of radiation therapy for the treatment of cancer has been carried out clinically since the late 1800s. Early on, however, it was discovered that a radiation dose sufficient to destroy cancer cells can also cause severe injury to surrounding healthy tissue. Radiation oncologists continually strive to find the perfect balance between a dose high enough to destroy the cancer and one that avoids damage to healthy organs. Spot scanning or “pencil beam” proton radiotherapy offers another way to improve on this. Unlike traditional photon therapy, proton beams stop in the target tissue, thus better sparing all organs beyond the targeted tumor. In addition, the beams are far narrower and can thus be more precisely “painted” onto the tumor, avoiding exposure of surrounding healthy tissue. To safely treat patients with proton beam radiotherapy, dose verification should be carried out for each plan prior to treatment. Proton dose verification systems are not currently commercially available, so the Department of Radiation Oncology at the Mayo Clinic developed its own, called DOSeCHECK, which offers two distinct dose simulation methods: GPU-based Monte Carlo and CPU-based analytical. The three major components of the system are the web-based user interface, the Linux-based dose verification simulation engines, and the supporting services and components. The architecture integrates multiple applications, libraries, platforms, programming languages, and communication protocols, and was successfully deployed in time for Mayo Clinic's first proton beam therapy patient. Having a simple, efficient application for dose verification greatly reduces staff workload and provides additional quality assurance, ultimately improving patient safety.

  17. Initial verification and validation of RAZORBACK - A research reactor transient analysis code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Talley, Darren G.

    2015-09-01

    This report describes the work and results of the initial verification and validation (V&V) of the beta release of the Razorback code. Razorback is a computer code designed to simulate the operation of a research reactor (such as the Annular Core Research Reactor (ACRR)) by a coupled numerical solution of the point reactor kinetics equations, the energy conservation equation for fuel element heat transfer, and the mass, momentum, and energy conservation equations for the water cooling of the fuel elements. This initial V&V effort was intended to confirm that the code work to date shows good agreement between simulation and actual ACRR operations, indicating that the subsequent V&V effort for the official release of the code will be successful.

  18. Characterization of infrasound from lightning

    NASA Astrophysics Data System (ADS)

    Assink, J. D.; Evers, L. G.; Holleman, I.; Paulssen, H.

    2008-08-01

    During thunderstorm activity in the Netherlands, electromagnetic and infrasonic signals are emitted by the processes of lightning and thunder. It is shown that correlating infrasound detections with results from an electromagnetic lightning detection network is successful up to distances of 50 km from the infrasound array. Infrasound recordings clearly show blastwave characteristics that can be related to cloud-to-ground (CG) discharges, with a dominant frequency between 1-5 Hz. Amplitude measurements of CG discharges can partly be explained by the beam pattern of a line source with a dominant frequency of 3.9 Hz, up to a distance of 20 km. The ability to measure lightning activity with infrasound arrays has both positive and negative implications for CTBT verification purposes. As a scientific application, lightning studies can benefit from the worldwide infrasound verification system.

  19. Structuring an Adult Learning Environment. Part IV: Establishing an Environment for Problem Solving.

    ERIC Educational Resources Information Center

    Frankel, Alan; Brennan, James

    Through the years, many researchers have advanced theories of problem solving. Probably the best definition of problem solving to apply to adult learning programs is Wallas' (1926) four-stage theory. The stages are (1) a preparation, (2) an incubation period, (3) a moment of illumination, and (4) final application or verification of the solution.…

  20. Quasideterminant solutions of the extended noncommutative Kadomtsev-Petviashvili hierarchy

    NASA Astrophysics Data System (ADS)

    Wu, Hongxia; Liu, Jingxin; Li, Chunxia

    2017-07-01

    We construct a nonauto Darboux transformation for the extended noncommutative Kadomtsev-Petviashvili (ncKP) hierarchy and consequently derive its quasi-Wronskian solution. We also obtain the quasi-Wronskian solution of the ncKP equation with self-consistent sources (ncKPESCS) as a by-product. Finally, we use the direct verification method to prove the quasi-Wronskian solution of the ncKPESCS.

  1. 77 FR 64483 - Circular Welded Carbon-Quality Steel Pipe from the Socialist Republic of Vietnam: Notice of Final...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-10-22

    ... Pipe from the Socialist Republic of Vietnam;'' ``Verification of the Sales Response of Midwest Air... Rate table (exporter/producer, percent): Sun Steel Joint Stock Company, 4.57; Huu Lien Asia Corporation, 4.57; Hoa Phat Steel Pipe Co., 4.57; Vietnam-Wide Rate \\13...

  2. Development of a pavement management system for Virginia : final report on phase I : application and verification of a pilot pavement condition inventory for Virginia interstate flexible pavements.

    DOT National Transportation Integrated Search

    1984-01-01

    The study reported here addresses some of the earlier phases in the development of a pavement management system for the state of Virginia. Among the issues discussed are the development of an adequate data base and the implementation of a condition r...

  3. Computer software documentation

    NASA Technical Reports Server (NTRS)

    Comella, P. A.

    1973-01-01

    A tutorial in the documentation of computer software is presented. It presents a methodology for achieving an adequate level of documentation as a natural outgrowth of the total programming effort commencing with the initial problem statement and definition and terminating with the final verification of code. It discusses the content of adequate documentation, the necessity for such documentation and the problems impeding achievement of adequate documentation.

  4. The Iowa new practice model: Advancing technician roles to increase pharmacists' time to provide patient care services.

    PubMed

    Andreski, Michael; Myers, Megan; Gainer, Kate; Pudlo, Anthony

    Determine the effects of an 18-month pilot project using tech-check-tech in 7 community pharmacies on 1) rate of dispensing errors not identified during refill prescription final product verification; 2) pharmacist workday task composition; and 3) amount of patient care services provided and the reimbursement status of those services. Pretest-posttest quasi-experimental study where baseline and study periods were compared. Pharmacists and pharmacy technicians in 7 community pharmacies in Iowa. The outcome measures were 1) percentage of technician-verified refill prescriptions where dispensing errors were not identified on final product verification; 2) percentage of time spent by pharmacists in dispensing, management, patient care, practice development, and other activities; 3) the number of pharmacist patient care services provided per pharmacist hours worked; and 4) percentage of time that technician product verification was used. There was no significant difference in overall errors (0.2729% vs. 0.5124%, P = 0.513), patient safety errors (0.0525% vs. 0.0651%, P = 0.837), or administrative errors (0.2204% vs. 0.4784%, P = 0.411). Pharmacists' time in dispensing significantly decreased (67.3% vs. 49.06%, P = 0.005), and time in direct patient care (19.96% vs. 34.72%, P = 0.003) increased significantly. Time in other activities did not significantly change. Reimbursable services per pharmacist hour (0.11 vs. 0.30, P = 0.129) did not significantly change. Non-reimbursable services increased significantly (2.77 vs. 4.80, P = 0.042). Total services significantly increased (2.88 vs. 5.16, P = 0.044). Pharmacy technician product verification of refill prescriptions preserved dispensing safety while significantly increasing the time spent in delivery of pharmacist-provided patient care services. The total number of pharmacist services provided per hour also increased significantly, driven primarily by a significant increase in the number of non-reimbursed services. This was most likely due to the increased time available to provide patient care. Reimbursed services per hour did not increase significantly, most likely due to a lack of payers. Copyright © 2018 American Pharmacists Association®. Published by Elsevier Inc. All rights reserved.

  5. Measurement, monitoring, and verification: make it work!

    Treesearch

    Coeli M. Hoover

    2011-01-01

    The capacity of forests to absorb and store carbon is certainly, as the authors note, an important tool in the greenhouse gas mitigation toolbox. Our understanding of what elements can make forest carbon offset projects successful has grown a great deal over time, as the global community has come to understand that forest degradation and conversion are the result of a...

  6. Space Qualification Testing of a Shape Memory Alloy Deployable CubeSat Antenna

    DTIC Science & Technology

    2016-09-15

    the SMA deployment in the space environment. The HCT QHA successfully passed all required NASA General Environmental Verification Standards space qualification testing. Figures: SERC and NASA/JPL parabolic deployable antenna design and prototype [28]; stowed configuration [28]; JPL KaPDA antenna [29].

  7. Single Event Effects mitigation with TMRG tool

    NASA Astrophysics Data System (ADS)

    Kulis, S.

    2017-01-01

    Single Event Effects (SEE) are a major concern for integrated circuits exposed to radiation. Several techniques have been proposed to protect circuits against radiation-induced upsets; among them, the Triple Modular Redundancy (TMR) technique is one of the most popular. The purpose of the Triple Modular Redundancy Generator (TMRG) tool is to automate the process of triplicating digital circuits, freeing the designer from introducing the TMR code manually at the implementation stage. It helps ensure that triplicated logic is maintained throughout the design process. Finally, the tool streamlines the process of introducing SEEs into gate-level simulations for final verification.
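
    The building block behind TMR is a bit-wise 2-of-3 majority voter over three copies of each register, so a single event upset in any one copy is masked. TMRG itself generates triplicated HDL; the Python sketch below only illustrates the voting principle:

```python
def majority(a: int, b: int, c: int) -> int:
    """Bit-wise 2-of-3 majority vote over integer-encoded registers."""
    return (a & b) | (a & c) | (b & c)

# Three copies of the same register value.
reg_a = reg_b = reg_c = 0b1011
reg_b ^= 0b0100                     # inject a single-bit upset in one copy
voted = majority(reg_a, reg_b, reg_c)
assert voted == 0b1011              # the upset is masked by the other copies
print(bin(voted))
```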

  8. Supporting Technology for Chain of Custody of Nuclear Weapons and Materials throughout the Dismantlement and Disposition Processes

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bunch, Kyle J.; Jones, Anthony M.; Ramuhalli, Pradeep

    The ratification and ongoing implementation of the New START Treaty have been widely regarded as noteworthy global security achievements for both the Obama Administration and the Putin (formerly Medvedev) regime. But deeper cuts that move beyond the United States and Russia to engage the P-5 and other nuclear weapons possessor states are envisioned under future arms control regimes, and are indeed required for the P-5 in accordance with their Article VI disarmament obligations in the Nuclear Non-Proliferation Treaty. Future verification needs will include monitoring the cessation of production of new fissile material for weapons, monitoring storage of warhead components and fissile materials, and verifying dismantlement of warheads, pits, secondary stages, and other materials. A fundamental challenge to implementing a nuclear disarmament regime is the ability to thwart unauthorized material diversion throughout the dismantlement and disposition process through strong chain of custody implementation. Verifying the declared presence, or absence, of nuclear materials and weapons components throughout the dismantlement and disposition lifecycle is a critical aspect of the disarmament process. From both the diplomatic and technical perspectives, verification under these future arms control regimes will require new solutions. Since any acceptable verification technology must protect sensitive design information and attributes to prevent the release of classified or other proliferation-sensitive information, non-nuclear non-sensitive modalities may provide significant new verification tools that do not require the use of additional information barriers. Alternative verification technologies based upon electromagnetics and acoustics could potentially play an important role in fulfilling the challenging requirements of future verification regimes. For example, researchers at the Pacific Northwest National Laboratory (PNNL) have demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to rapidly confirm the presence of specific components on a yes/no basis without revealing classified information. PNNL researchers have also used ultrasonic measurements to obtain images of material microstructures which may be used as templates or unique identifiers of treaty-limited items. Such alternative technologies are suitable for application in various stages of weapons dismantlement and often include the advantage of an inherent information barrier due to the inability to extract classified weapon design information from the collected data. As a result, these types of technologies complement radiation-based verification methods for arms control. This article presents an overview of several alternative verification technologies that are suitable for supporting a future, broader and more intrusive arms control regime that spans the nuclear weapons disarmament lifecycle. The general capabilities and limitations of each verification modality are discussed and example technologies are presented. Potential applications are defined in the context of the nuclear material and weapons lifecycle. Example applications range from authentication (e.g., tracking and signatures within the chain of custody from downloading through weapons storage, unclassified templates and unique identification) to verification of absence and final material disposition.

  9. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.

  10. Forest Carbon Monitoring and Reporting for REDD+: What Future for Africa?

    PubMed

    Gizachew, Belachew; Duguma, Lalisa A

    2016-11-01

    A climate change mitigation mechanism for emissions reduction from reduced deforestation and forest degradation, plus forest conservation, sustainable management of forests, and enhancement of carbon stocks (REDD+), has received international political support in the climate change negotiations. The mechanism will require, among other things, an unprecedented technical capacity for monitoring, reporting and verification of carbon emissions from the forest sector. Functional monitoring, reporting and verification requires inventories of forest area, carbon stocks and their changes, both for the construction of forest reference emission levels and for compiling reports on actual emissions; these are essentially lacking in developing countries, particularly in Africa. The purpose of this essay is to contribute to a better understanding of the state and prospects of forest monitoring and reporting in the context of REDD+ in Africa. We argue that monitoring and reporting capacities in Africa fall short of the stringent requirements of the methodological guidance for monitoring, reporting and verification for REDD+, and this may weaken the prospects for successfully implementing REDD+ on the continent. We present the challenges and prospects in national forest inventory, remote sensing and reporting infrastructures. North-South and South-South collaboration, as well as governments' own investments in monitoring, reporting and verification systems, could help Africa leapfrog in monitoring and reporting. These could be delivered through negotiations for the transfer of technology, technical capacities, and the experience that exists among developed countries that traditionally compile forest carbon reports in the context of the Kyoto Protocol.

  11. Software Quality Assurance and Verification for the MPACT Library Generation Process

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Liu, Yuxuan; Williams, Mark L.; Wiarda, Dorothea

    This report fulfills the requirements for the Consortium for the Advanced Simulation of Light-Water Reactors (CASL) milestone L2:RTM.P14.02, “SQA and Verification for MPACT Library Generation,” by documenting the current status of the software quality, verification, and acceptance testing of nuclear data libraries for MPACT. It provides a brief overview of the library generation process, from general-purpose evaluated nuclear data files (ENDF/B) to a problem-dependent cross section library for modeling of light-water reactors (LWRs). The software quality assurance (SQA) programs associated with each of the software packages used to generate the nuclear data libraries are discussed; specific tests within the SCALE/AMPX and VERA/XSTools repositories are described. The methods and associated tests to verify the quality of the library during the generation process are described in detail. The library generation process has been automated to a degree, (1) to ensure that it can be run without user intervention and (2) to ensure that the library can be reproduced. Finally, the acceptance testing process that will be performed by representatives from the Radiation Transport Methods (RTM) Focus Area prior to the production library's release is described in detail.

  12. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and alignment, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both the dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single-impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can achieve better accuracy than the multiple-impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283
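
    The EER is the operating point at which the false acceptance rate (FAR) equals the false rejection rate (FRR). A small sketch of estimating it by sweeping a decision threshold over genuine and impostor match scores (the score distributions here are synthetic placeholders):

```python
import numpy as np

# Synthetic match scores: genuine pairs should score higher than impostors.
rng = np.random.default_rng(1)
genuine = rng.normal(0.7, 0.1, 1000)
impostor = rng.normal(0.3, 0.1, 1000)

thresholds = np.linspace(0.0, 1.0, 1001)
far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects

i = int(np.argmin(np.abs(far - frr)))  # threshold where FAR crosses FRR
print(f"EER ~ {(far[i] + frr[i]) / 2:.3f} at threshold {thresholds[i]:.2f}")
```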

  13. Sequence verification of synthetic DNA by assembly of sequencing reads

    PubMed Central

    Wilson, Mandy L.; Cai, Yizhi; Hanlon, Regina; Taylor, Samantha; Chevreux, Bastien; Setubal, João C.; Tyler, Brett M.; Peccoud, Jean

    2013-01-01

    Gene synthesis attempts to assemble user-defined DNA sequences with base-level precision. Verifying the sequences of construction intermediates and the final product of a gene synthesis project is a critical part of the workflow, yet one that has received the least attention. Sequence validation is equally important for other kinds of curated clone collections. Ensuring that the physical sequence of a clone matches its published sequence is a common quality control step performed at least once over the course of a research project. GenoREAD is a web-based application that breaks the sequence verification process into two steps: the assembly of sequencing reads and the alignment of the resulting contig with a reference sequence. GenoREAD can determine whether a clone matches its reference sequence. Its sophisticated reporting features help identify and troubleshoot problems that arise during the sequence verification process. GenoREAD has been experimentally validated on thousands of gene-sized constructs from an ORFeome project, and on longer sequences including whole plasmids and synthetic chromosomes. Comparing GenoREAD results with those from manual analysis of the sequencing data demonstrates that GenoREAD tends to be conservative in its diagnoses. GenoREAD is available at www.genoread.org. PMID:23042248

  14. A new plan-scoring method using normal tissue complication probability for personalized treatment plan decisions in prostate cancer

    NASA Astrophysics Data System (ADS)

    Kim, Kwang Hyeon; Lee, Suk; Shim, Jang Bo; Yang, Dae Sik; Yoon, Won Sup; Park, Young Je; Kim, Chul Yong; Cao, Yuan Jie; Chang, Kyung Hwan

    2018-01-01

    The aim of this study was to derive a new plan-scoring index using normal tissue complication probabilities to verify different plans in the selection of personalized treatment. Plans for 12 patients treated with tomotherapy were used to compare scoring for ranking. Dosimetric and biological indexes were analyzed for the plans for a clearly distinguishable group (n = 7) and a similar group (n = 12), using treatment plan verification software that we developed. The quality factor (QF) of our support software for treatment decisions was consistent with the final treatment plan for the clearly distinguishable group (average QF = 1.202, 100% match rate, n = 7) and the similar group (average QF = 1.058, 33% match rate, n = 12). We therefore propose a normal tissue complication probability (NTCP)-based plan-scoring index for verification of different plans in personalized treatment-plan selection. Scoring using the new QF showed a 100% match rate (average NTCP QF = 1.0420). The NTCP-based QF scoring method was adequate for obtaining biological verification quality and organ-at-risk sparing using the treatment-planning decision-support software we developed for prostate cancer.
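
    NTCP is commonly computed with the Lyman-Kutcher-Burman (LKB) model, in which the complication probability is a normal CDF of the generalized equivalent uniform dose (gEUD). The sketch below follows that common formulation under assumed parameters; the paper's exact NTCP model and values are not specified here:

```python
import math

def geud(doses, volumes, n):
    """Generalized EUD from a differential DVH (volume fractions sum to 1)."""
    return sum(v * d ** (1.0 / n) for d, v in zip(doses, volumes)) ** n

def ntcp_lkb(doses, volumes, td50, m, n):
    """LKB model: NTCP = Phi((gEUD - TD50) / (m * TD50))."""
    t = (geud(doses, volumes, n) - td50) / (m * td50)
    return 0.5 * (1.0 + math.erf(t / math.sqrt(2.0)))  # standard normal CDF

# Toy rectum DVH (dose bins in Gy, volume fractions) and illustrative
# LKB parameters -- all values are placeholders, not the study's data.
doses = [20.0, 40.0, 60.0, 70.0]
volumes = [0.4, 0.3, 0.2, 0.1]
print(f"NTCP = {ntcp_lkb(doses, volumes, td50=76.9, m=0.13, n=0.09):.3f}")
```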

  15. The Roles of Verification, Validation and Uncertainty Quantification in the NASA Standard for Models and Simulations

    NASA Technical Reports Server (NTRS)

    Zang, Thomas A.; Luckring, James M.; Morrison, Joseph H.; Blattnig, Steve R.; Green, Lawrence L.; Tripathi, Ram K.

    2007-01-01

    The National Aeronautics and Space Administration (NASA) recently issued an interim version of the Standard for Models and Simulations (M&S Standard) [1]. The action to develop the M&S Standard was identified in an internal assessment [2] of agency-wide changes needed in the wake of the Columbia Accident [3]. The primary goal of this standard is to ensure that the credibility of M&S results is properly conveyed to those making decisions affecting human safety or mission success criteria. The secondary goal is to assure that the credibility of the results from models and simulations meets the project requirements (for credibility). This presentation explains the motivation and key aspects of the M&S Standard, with a special focus on the requirements for verification, validation and uncertainty quantification. Some pilot applications of this standard to computational fluid dynamics applications will be provided as illustrations. The authors of this paper are the members of the team that developed the initial three drafts of the standard, the last of which benefited from extensive comments from most of the NASA Centers. The current version (number 4) incorporates modifications made by a team representing 9 of the 10 NASA Centers. A permanent version of the M&S Standard is expected by December 2007. The scope of the M&S Standard is confined to those uses of M&S that support program and project decisions that may affect human safety or mission success criteria. Such decisions occur, in decreasing order of importance, in the operations, the test & evaluation, and the design & analysis phases. Requirements are placed on (1) program and project management, (2) models, (3) simulations and analyses, (4) verification, validation and uncertainty quantification (VV&UQ), (5) recommended practices, (6) training, (7) credibility assessment, and (8) reporting results to decision makers. A key component of (7) and (8) is the use of a Credibility Assessment Scale, some of the details of which were developed in consultation with William Oberkampf, David Peercy and Timothy Trucano of Sandia National Laboratories. The focus of most of the requirements, including those for VV&UQ, is on the documentation of what was done and the reporting, using the Credibility Assessment Scale, of the level of rigor that was followed. The aspects of one option for the Credibility Assessment Scale are (1) code verification, (2) solution verification, (3) validation, (4) predictive capability, (5) technical review, (6) process control, and (7) operator and analyst qualification.

  16. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  17. Dosimetry for audit and clinical trials: challenges and requirements

    NASA Astrophysics Data System (ADS)

    Kron, T.; Haworth, A.; Williams, I.

    2013-06-01

    Many important dosimetry audit networks for radiotherapy have their roots in clinical trial quality assurance (QA). In both scenarios it is essential to test two issues: does the treatment plan conform to the clinical requirements, and is the plan a reasonable representation of what is actually delivered to a patient throughout their course of treatment? Part of a sound quality program would be an external audit of these issues, with verification of the equivalence of plan and treatment typically referred to as a dosimetry audit. The increasing complexity of radiotherapy planning and delivery makes audits challenging. While verification of the absolute dose delivered at a reference point was the standard of external dosimetry audits two decades ago, this is often deemed inadequate for verification of treatment approaches such as Intensity Modulated Radiation Therapy (IMRT) and Volumetric Modulated Arc Therapy (VMAT). As such, most dosimetry audit networks have successfully introduced more complex tests of dose delivery using anthropomorphic phantoms that can be imaged, planned and treated as a patient would be. The new challenge is to adapt this approach to ever more diversified radiotherapy procedures, with image guided/adaptive radiotherapy, motion management and brachytherapy being the focus of current research.

  18. Strategies for Validation Testing of Ground Systems

    NASA Technical Reports Server (NTRS)

    Annis, Tammy; Sowards, Stephanie

    2009-01-01

    In order to accomplish the full Vision for Space Exploration announced by former President George W. Bush in 2004, NASA will have to develop a new space transportation system and supporting infrastructure. The main portion of this supporting infrastructure will reside at the Kennedy Space Center (KSC) in Florida and will either be newly developed or a modification of existing vehicle processing and launch facilities, including Ground Support Equipment (GSE). Launch site development on this scale has not been undertaken since the Apollo Program. In order to accomplish it successfully within the limited budget and schedule constraints, a combination of traditional and innovative strategies for Verification and Validation (V&V) has been developed. The core of these strategies is a building-block approach to V&V, starting with component V&V and ending with a comprehensive end-to-end validation test of the complete launch site, called a Ground Element Integration Test (GEIT). This paper will outline these strategies and provide the high-level planning for meeting the challenges of implementing V&V on a large-scale development program. KEY WORDS: Systems, Elements, Subsystem, Integration Test, Ground Systems, Ground Support Equipment, Component, End Item, Test and Verification Requirements (TVR), Verification Requirements (VR)

  19. New generation of universal modeling for centrifugal compressors calculation

    NASA Astrophysics Data System (ADS)

    Galerkin, Y.; Drozdov, A.

    2015-08-01

    The Universal Modeling method has been in constant use since the mid-1990s. The newest, sixth version of the method is presented below. The flow path configuration of 3D impellers is presented in detail. It is possible to optimize the meridian configuration, including hub/shroud curvatures, axial length, leading edge position, etc. The new vaned diffuser model includes a flow non-uniformity coefficient based on CFD calculations. The loss model was built from the results of 37 experiments with compressor stages of different flow rates and loading factors. One common set of empirical coefficients in the loss model yields the efficiency within an accuracy of 0.86% at the design point and 1.22% along the performance curve. For model verification, the performances of four multistage compressors with vaned and vaneless diffusers were calculated. Two of these compressors have quite unusual flow paths; the modeling results were satisfactory in spite of these peculiarities. One sample of the verification calculations is presented in the text. This sixth version of the computer program is already being applied successfully in design practice.

  20. Identifying Rhodamine Dye Plume Sources in Near-Shore Oceanic Environments by Integration of Chemical and Visual Sensors

    PubMed Central

    Tian, Yu; Kang, Xiaodong; Li, Yunyi; Li, Wei; Zhang, Aiqun; Yu, Jiangchen; Li, Yiping

    2013-01-01

    This article presents a strategy for identifying the source location of a chemical plume in near-shore oceanic environments, where the plume develops under the influence of turbulence, tides and waves. This strategy includes two modules, source declaration (or identification) and source verification, embedded in a subsumption architecture. Algorithms for source identification are derived from moth-inspired plume tracing strategies based on a chemical sensor. The in-water test missions, conducted in November 2002 at San Clemente Island (California, USA), in June 2003 at Duck (North Carolina, USA), and in October 2010 at Dalian Bay (China), successfully identified the source locations after autonomous underwater vehicles tracked the rhodamine dye plumes, which meandered significantly, over 100 meters. The objective of the verification module is to verify the declared plume source using a visual sensor. Because images taken in near-shore oceanic environments are very vague and colors in the images are not well-defined, we adopt a fuzzy color extractor to segment the color components and recognize the chemical plume and its source by measuring color similarity. The source verification module is tested with images taken during the CPT missions. PMID:23507823
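
    The paper's fuzzy color extractor is not specified in the abstract, so the sketch below illustrates only the general idea of recognizing a plume by color similarity: a fuzzy membership computed from the distance to a reference dye color in RGB space. The reference color and the width parameter sigma are assumptions for illustration.

    ```python
    import numpy as np

    def plume_membership(image, ref_color, sigma=30.0):
        """Fuzzy membership of each pixel in the 'plume color' class, based on
        Euclidean distance to a reference rhodamine color in RGB space.
        ref_color and sigma are illustrative assumptions, not values from the
        paper's fuzzy color extractor."""
        diff = image.astype(float) - np.asarray(ref_color, dtype=float)
        dist = np.linalg.norm(diff, axis=-1)
        return np.exp(-(dist / sigma) ** 2)  # 1.0 = identical color, -> 0 with distance

    # Toy 2x2 RGB image: one reddish 'dye' pixel among bluish water pixels.
    img = np.array([[[200, 40, 90], [30, 60, 120]],
                    [[40, 70, 130], [35, 65, 125]]], dtype=np.uint8)
    mask = plume_membership(img, ref_color=(210, 50, 100)) > 0.5
    print(mask)  # True only for the dye-colored pixel
    ```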

  1. Efficient model checking of network authentication protocol based on SPIN

    NASA Astrophysics Data System (ADS)

    Tan, Zhi-hua; Zhang, Da-fang; Miao, Li; Zhao, Dan

    2013-03-01

    Model checking is a very useful technique for verifying network authentication protocols. In order to improve the efficiency of modeling and verifying such protocols with model checking technology, this paper first proposes a universal formalized description method for the protocol. Combined with the model checker SPIN, the method can conveniently verify the properties of the protocol. Through several model simplification strategies, several protocols can be modeled efficiently and the state space of the model reduced. Compared with the previous literature, this paper achieves a higher degree of automation and better verification efficiency. Finally, based on the method described in the paper, we model and verify the Privacy and Key Management (PKM) authentication protocol. The experimental results show that the model checking method is effective and is useful for other authentication protocols.
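
    SPIN itself verifies Promela models, but the core idea of explicit-state model checking can be sketched in a few lines of Python: enumerate every reachable state of a model and check a safety property in each. The toy two-party handshake model below is invented for illustration and is not the PKM protocol.

    ```python
    from collections import deque

    def successors(state):
        a, b = state  # phases of initiator A and responder B (toy model)
        if a == "init":
            yield ("sent", b)                       # A sends its challenge
        if a == "sent" and b == "idle":
            yield (a, "replied")                    # B answers the challenge
        if a == "sent" and b == "replied":
            yield ("auth", b)                       # A accepts the reply

    def check(initial, safe):
        # Breadth-first exhaustive exploration of the reachable state space.
        seen, frontier = {initial}, deque([initial])
        while frontier:
            s = frontier.popleft()
            if not safe(s):
                return f"violation in state {s}"
            for t in successors(s):
                if t not in seen:
                    seen.add(t)
                    frontier.append(t)
        return f"property holds over {len(seen)} reachable states"

    # Safety property: A never reaches 'auth' before B has replied.
    print(check(("init", "idle"),
                lambda s: not (s[0] == "auth" and s[1] != "replied")))
    ```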

  2. Seismic verification of nuclear plant equipment anchorage: Volume 1, Development of anchorage guidelines: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czarnecki, R M

    1987-05-01

    Guidelines have been developed to evaluate the seismic adequacy of the anchorage of various classes of electrical and mechanical equipment in nuclear power plants covered by NRC Unresolved Safety Issue A-46. The guidelines consist of screening tables that give the seismic anchorage capacity as a function of key equipment and anchorage fasteners, inspection checklists for field verification of anchorage adequacy, and provisions for outliers that can be used to further investigate anchorages that cannot be verified in the field. The screening tables are based on an analysis of the anchorage forces developed by common equipment types and on strength criteria to quantify the holding power of anchor bolts and welds. The strength criteria for expansion anchor bolts were developed by collecting and analyzing a large quantity of test data.

  3. Seismic verification of nuclear plant equipment anchorage: Volume 2, Anchorage inspection workbook: Final report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Czarnecki, R M

    1987-05-01

    Guidelines have been developed to evaluate the seismic adequacy of the anchorage of various classes of electrical and mechanical equipment in nuclear power plants covered by NRC Unresolved Safety Issue A-46. The guidelines consist of screening tables that give the seismic anchorage capacity as a function of key equipment and anchorage fasteners, inspection checklists for field verification of anchorage adequacy, and provisions for outliers that can be used to further investigate anchorages that cannot be verified in the field. The screening tables are based on an analysis of the anchorage forces developed by common equipment types and on strength criteria to quantify the holding power of anchor bolts and welds. The strength criteria for expansion anchor bolts were developed by collecting and analyzing a large quantity of test data.

  4. Quality dependent fusion of intramodal and multimodal biometric experts

    NASA Astrophysics Data System (ADS)

    Kittler, J.; Poh, N.; Fatukasi, O.; Messer, K.; Kryszczuk, K.; Richiardi, J.; Drygajlo, A.

    2007-04-01

    We address the problem of score level fusion of intramodal and multimodal experts in the context of biometric identity verification. We investigate the merits of confidence based weighting of component experts. In contrast to the conventional approach where confidence values are derived from scores, we use instead raw measures of biometric data quality to control the influence of each expert on the final fused score. We show that quality based fusion gives better performance than quality free fusion. The use of quality weighted scores as features in the definition of the fusion functions leads to further improvements. We demonstrate that the achievable performance gain is also affected by the choice of fusion architecture. The evaluation of the proposed methodology involves 6 face and one speech verification experts. It is carried out on the XM2VTS data base.
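
    A minimal Python sketch of the core idea, quality-based weighting of expert scores, follows. The paper's actual fusion functions are trained, so the fixed weighted average and the sample scores and qualities below are illustrative assumptions only.

    ```python
    import numpy as np

    def quality_weighted_fusion(scores, qualities):
        """Fuse per-expert match scores using raw biometric sample quality as
        weights, rather than confidences derived from the scores themselves.
        A minimal sketch of the idea; the paper's fusion functions are
        trained, not a fixed formula."""
        scores = np.asarray(scores, dtype=float)
        qualities = np.asarray(qualities, dtype=float)
        return float(np.dot(qualities, scores) / qualities.sum())

    # Six face experts and one speech expert (as in the paper's evaluation);
    # the scores and quality measures below are invented for illustration.
    scores    = [0.82, 0.75, 0.90, 0.60, 0.78, 0.85, 0.40]
    qualities = [0.9,  0.8,  0.95, 0.3,  0.7,  0.9,  0.2]  # poor quality counts less
    print(f"fused score: {quality_weighted_fusion(scores, qualities):.3f}")
    ```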

  5. The opto-mechanical design process: from vision to reality

    NASA Astrophysics Data System (ADS)

    Kvamme, E. Todd; Stubbs, David M.; Jacoby, Michael S.

    2017-08-01

    The design process for an opto-mechanical sub-system is discussed from requirements development through test. The process begins with a proper mission understanding and the development of requirements for the system. Preliminary design activities are then discussed with iterative analysis and design work being shared between the design, thermal, and structural engineering personnel. Readiness for preliminary review and the path to a final design review are considered. The value of prototyping and risk mitigation testing is examined with a focus on when it makes sense to execute a prototype test program. System level margin is discussed in general terms, and the practice of trading margin in one area of performance to meet another area is reviewed. Requirements verification and validation is briefly considered. Testing and its relationship to requirements verification concludes the design process.

  6. Cassini's Test Methodology for Flight Software Verification and Operations

    NASA Technical Reports Server (NTRS)

    Wang, Eric; Brown, Jay

    2007-01-01

    The Cassini spacecraft was launched on 15 October 1997 on a Titan IV-B launch vehicle. The spacecraft is comprised of various subsystems, including the Attitude and Articulation Control Subsystem (AACS). The AACS Flight Software (FSW) and its development has been an ongoing effort, from the design, development and finally operations. As planned, major modifications to certain FSW functions were designed, tested, verified and uploaded during the cruise phase of the mission. Each flight software upload involved extensive verification testing. A standardized FSW testing methodology was used to verify the integrity of the flight software. This paper summarizes the flight software testing methodology used for verifying FSW from pre-launch through the prime mission, with an emphasis on flight experience testing during the first 2.5 years of the prime mission (July 2004 through January 2007).

  7. Verification of intensity modulated radiation therapy beams using a tissue equivalent plastic scintillator dosimetry system

    NASA Astrophysics Data System (ADS)

    Petric, Martin Peter

    This thesis describes the development and implementation of a novel method for the dosimetric verification of intensity modulated radiation therapy (IMRT) fields with several advantages over current techniques. Through the use of a tissue equivalent plastic scintillator sheet viewed by a charge-coupled device (CCD) camera, this method provides a truly tissue equivalent dosimetry system capable of efficiently and accurately performing field-by-field verification of IMRT plans. This work was motivated by an initial study comparing two IMRT treatment planning systems. The clinical functionality of BrainLAB's BrainSCAN and Varian's Helios IMRT treatment planning systems were compared in terms of implementation and commissioning, dose optimization, and plan assessment. Implementation and commissioning revealed differences in the beam data required to characterize the beam prior to use with the BrainSCAN system requiring higher resolution data compared to Helios. This difference was found to impact on the ability of the systems to accurately calculate dose for highly modulated fields, with BrainSCAN being more successful than Helios. The dose optimization and plan assessment comparisons revealed that while both systems use considerably different optimization algorithms and user-control interfaces, they are both capable of producing substantially equivalent dose plans. The extensive use of dosimetric verification techniques in the IMRT treatment planning comparison study motivated the development and implementation of a novel IMRT dosimetric verification system. The system consists of a water-filled phantom with a tissue equivalent plastic scintillator sheet built into the top surface. Scintillation light is reflected by a plastic mirror within the phantom towards a viewing window where it is captured using a CCD camera. Optical photon spread is removed using a micro-louvre optical collimator and by deconvolving a glare kernel from the raw images. Characterization of this new dosimetric verification system indicates excellent dose response and spatial linearity, high spatial resolution, and good signal uniformity and reproducibility. Dosimetric results from square fields, dynamic wedged fields, and a 7-field head and neck IMRT treatment plan indicate good agreement with film dosimetry distributions. Efficiency analysis of the system reveals a 50% reduction in time requirements for field-by-field verification of a 7-field IMRT treatment plan compared to film dosimetry.
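
    The glare-removal step described above can be sketched as a Fourier-domain deconvolution. The Gaussian kernel and the regularization constant below are illustrative assumptions; the thesis deconvolves a measured glare kernel from the raw CCD images.

    ```python
    import numpy as np

    def deconvolve_glare(image, kernel, eps=1e-2):
        """Remove glare by Wiener-style deconvolution in the Fourier domain.
        The Gaussian kernel and regularization eps are illustrative
        assumptions, not the thesis's measured glare kernel."""
        K = np.fft.fft2(np.fft.ifftshift(kernel), s=image.shape)
        I = np.fft.fft2(image)
        restored = np.fft.ifft2(I * np.conj(K) / (np.abs(K) ** 2 + eps))
        return np.real(restored)

    # Toy example: blur a point 'dose spot' with a glare kernel, then recover it.
    n = 64
    y, x = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    kernel = np.exp(-(x**2 + y**2) / (2 * 3.0**2))
    kernel /= kernel.sum()
    truth = np.zeros((n, n)); truth[32, 32] = 1.0
    blurred = np.real(np.fft.ifft2(np.fft.fft2(truth)
                                   * np.fft.fft2(np.fft.ifftshift(kernel))))
    peak = np.unravel_index(np.argmax(deconvolve_glare(blurred, kernel)), (n, n))
    print(peak)  # ~ (32, 32): the point source is recovered
    ```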

  8. Design and development of the 2m resolution camera for ROCSAT-2

    NASA Astrophysics Data System (ADS)

    Uguen, Gilbert; Luquet, Philippe; Chassat, François

    2017-11-01

    EADS-Astrium has recently completed the development of a 2m-resolution camera, the so-called RSI (Remote Sensing Instrument), for the small satellite ROCSAT-2, the second component of the long-term space program of the Republic of China. The National Space Program Office of Taiwan selected EADS-Astrium as the Prime Contractor for the development of the spacecraft, including the bus and the main instrument RSI. The main challenges for the RSI development were (1) to introduce innovative technologies in order to meet the high performance requirements while achieving the design simplicity necessary for the mission (low mass, low power), and (2) to have a development and verification approach compatible with the very tight development schedule. This paper describes the instrument design together with the development and verification logic that were implemented to successfully meet these objectives.

  9. Shuttle avionics software development trials: Tribulations and successes, the backup flight system

    NASA Technical Reports Server (NTRS)

    Chevers, E. S.

    1985-01-01

    The development and verification of the Backup Flight System software (BFS) is discussed. The approach taken for the BFS was to develop a very simple and straightforward software program and then test it in every conceivable manner. The result was a program that contained approximately 12,000 full words, including ground checkout and the built-in test program for the computer. To perform verification, a series of tests was defined using the actual flight-type hardware and simulated flight conditions. Then simulated flights were flown and detailed performance analysis was conducted. The intent of most BFS tests was to demonstrate that a stable flightpath could be obtained after engagement from an anomalous initial condition. The extension of the BFS to meet the requirements of the orbital flight test phase is also described.

  10. European Train Control System: A Case Study in Formal Verification

    NASA Astrophysics Data System (ADS)

    Platzer, André; Quesel, Jan-David

    Complex physical systems have several degrees of freedom. They only work correctly when their control parameters obey corresponding constraints. Based on the informal specification of the European Train Control System (ETCS), we design a controller for its cooperation protocol. For its free parameters, we successively identify constraints that are required to ensure collision freedom. We formally prove the parameter constraints to be sharp by characterizing them equivalently in terms of reachability properties of the hybrid system dynamics. Using our deductive verification tool KeYmaera, we formally verify controllability, safety, liveness, and reactivity properties of the ETCS protocol that entail collision freedom. We prove that the ETCS protocol remains correct even in the presence of perturbation by disturbances in the dynamics. We verify that safety is preserved when a PI controlled speed supervision is used.
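
    The flavor of the parameter constraints at issue can be sketched with the physics of braking: a train may keep accelerating only if full service braking can still stop it within its movement authority. The guard below is a simplified assumption in the spirit of the case study, not the verified KeYmaera model; all variable names and numbers are illustrative.

    ```python
    def may_accelerate(z, v, m, b, dt, a_max):
        """Controllability-style check in the spirit of the ETCS case study:
        the train may keep accelerating only if, after one more control cycle
        of length dt at maximum acceleration a_max, full service braking at
        rate b still stops it before the end of its movement authority m.
        A simplified assumption, not the verified hybrid-system model."""
        v_next = v + a_max * dt                     # worst-case speed after the cycle
        z_next = z + v * dt + 0.5 * a_max * dt**2   # worst-case position after the cycle
        # Sufficient braking condition: v^2 <= 2 b (m - z).
        return v_next**2 <= 2 * b * (m - z_next)

    # Train 800 m from the authority end, 160 km/h ~ 44.4 m/s, b = 1.4 m/s^2:
    print(may_accelerate(z=0.0, v=44.4, m=800.0, b=1.4, dt=2.0, a_max=0.7))
    # False: the controller must start braking rather than accelerate further.
    ```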

  11. Failure of Cleaning Verification in Pharmaceutical Industry Due to Uncleanliness of Stainless Steel Surface.

    PubMed

    Haidar Ahmad, Imad A; Blasko, Andrei

    2017-08-11

    The aim of this work is to identify the parameters that affect the recovery of pharmaceutical residues from the surface of stainless steel coupons. A series of factors were assessed, including drug product spike levels, spiking procedure, drug-excipient ratios, analyst-to-analyst variability, intraday variability, and cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of cleaning the surface of the coupons with clean-in-place solutions (CIP) gave high recovery (>90%) and reproducible results (Srel≤4%) regardless of the conditions that were assessed previously. The approach was successfully applied for cleaning verification of small molecules (MW <1,000 Da) as well as large biomolecules (MW up to 50,000 Da).

  12. Development of Sample Verification System for Sample Return Missions

    NASA Technical Reports Server (NTRS)

    Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Trebi-Ollennu, Ashitey; Manohara, Harish

    2011-01-01

    This paper describes the development of a proof-of-concept sample verification system (SVS) for in-situ mass measurement of planetary rock and soil samples in future robotic sample return missions. Our proof-of-concept SVS device contains a 10 cm diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of accumulating planetary sample. The membrane is positioned in proximity to an opposing substrate with a narrow gap. The deformation of the membrane narrows the gap, resulting in increased capacitance between the two nearly parallel plates. Capacitance readout circuitry on a nearby printed circuit board (PCB) transmits data via a low-voltage differential signaling (LVDS) interface. The fabricated SVS proof-of-concept device has successfully demonstrated a capacitance change of approximately 1 pF/gram.
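
    The sensing principle is the parallel-plate relation C = ε0·A/d: as accumulated sample deflects the membrane and narrows the gap d, the capacitance rises. A minimal sketch follows; the gap values are illustrative assumptions, since the abstract quotes the sensitivity (about 1 pF/gram) but not the nominal gap.

    ```python
    import math

    EPS0 = 8.854e-12  # vacuum permittivity, F/m

    def capacitance(area_m2, gap_m):
        # Parallel-plate approximation C = eps0 * A / d, assuming an air gap.
        return EPS0 * area_m2 / gap_m

    # 10 cm diameter membrane, as in the abstract; the nominal gap and its
    # deflection under accumulating sample are illustrative assumptions.
    area = math.pi * 0.05 ** 2
    for gap_um in (100.0, 95.0, 90.0):
        c_pf = capacitance(area, gap_um * 1e-6) * 1e12
        print(f"gap {gap_um:5.1f} um -> C = {c_pf:.3f} pF")  # C rises as gap narrows
    ```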

  13. Quality by design case study 1: Design of 5-fluorouracil loaded lipid nanoparticles by the W/O/W double emulsion - Solvent evaporation method.

    PubMed

    Amasya, Gulin; Badilli, Ulya; Aksu, Buket; Tarimci, Nilufer

    2016-03-10

    Quality by Design (QbD) is a systematic approach in which all production processes are designed and developed to achieve a final product of predetermined quality; work proceeds within a design space determined by the critical formulation and process parameters, so that verification of the quality of the final product is no longer necessary. In the current study, the QbD approach was used in the preparation of lipid nanoparticle formulations to improve skin penetration of 5-Fluorouracil, a widely-used compound for treating non-melanoma skin cancer. 5-Fluorouracil-loaded lipid nanoparticles were prepared by the W/O/W double emulsion - solvent evaporation method. Artificial neural network software was used to evaluate the data obtained from the lipid nanoparticle formulations, to establish the design space, and to optimize the formulations. Two different artificial neural network models were developed. The limit values of the design space of the inputs and outputs obtained by both models were found to be within the knowledge space. The optimal formulations recommended by the models were prepared and the critical quality attributes belonging to those formulations were assigned. The experimental results remained within the design space limit values. Consequently, optimal formulations with the critical quality attributes determined to achieve the Quality Target Product Profile were successfully obtained within the design space by following the QbD steps.

  14. Verification Techniques for Parameter Selection and Bayesian Model Calibration Presented for an HIV Model

    NASA Astrophysics Data System (ADS)

    Wentworth, Mami Tonoe

    Uncertainty quantification plays an important role when making predictive estimates of model responses. In this context, uncertainty quantification is defined as quantifying and reducing uncertainties, and the objective is to quantify uncertainties in parameters, models, and measurements, and to propagate the uncertainties through the model, so that one can make a predictive estimate with quantified uncertainties. Two aspects of uncertainty quantification that must be performed prior to propagating uncertainties are model calibration and parameter selection. There are several efficient techniques for these processes; however, the accuracy of these methods is often not verified. This is the motivation for our work, and in this dissertation, we present and illustrate verification frameworks for model calibration and parameter selection in the context of biological and physical models. First, HIV models, developed and improved by [2, 3, 8], describe the viral infection dynamics of HIV disease. These are also used to make predictive estimates of viral loads and T-cell counts and to construct an optimal control for drug therapy. Estimating input parameters is an essential step prior to uncertainty quantification. However, not all the parameters are identifiable, implying that they cannot be uniquely determined from the observations. These unidentifiable parameters can be partially removed by performing parameter selection, a process in which parameters that have minimal impact on the model response are determined. We provide verification techniques for Bayesian model calibration and parameter selection for an HIV model. As an example of a physical model, we employ a heat model with experimental measurements presented in [10]. A steady-state heat model represents prototypical behavior for the heat conduction and diffusion processes involved in a thermal-hydraulic model, which is part of nuclear reactor models. We employ this simple heat model to illustrate verification techniques for model calibration. For Bayesian model calibration, we employ adaptive Metropolis algorithms to construct densities for input parameters in the heat model and the HIV model. To quantify the uncertainty in the parameters, we employ two MCMC algorithms: Delayed Rejection Adaptive Metropolis (DRAM) [33] and Differential Evolution Adaptive Metropolis (DREAM) [66, 68]. The densities obtained using these methods are compared to those obtained through direct numerical evaluation of Bayes' formula. We also combine uncertainties in input parameters and measurement errors to construct predictive estimates for a model response. A significant emphasis is on the development and illustration of techniques to verify the accuracy of sampling-based Metropolis algorithms. We verify the accuracy of DRAM and DREAM by comparing chains, densities and correlations obtained using DRAM, DREAM and direct evaluation of Bayes' formula. We also perform a similar analysis for credible and prediction intervals for responses. Once the parameters are estimated, we employ the energy statistics test [63, 64] to compare the densities obtained by different methods for the HIV model; the energy statistics test the equality of distributions. We also consider parameter selection and verification techniques for models having one or more parameters that are noninfluential in the sense that they minimally impact model outputs.
We illustrate these techniques for a dynamic HIV model but note that the parameter selection and verification framework is applicable to a wide range of biological and physical models. To accommodate the nonlinear input-to-output relations, which are typical for such models, we focus on global sensitivity analysis techniques, including those based on partial correlations, Sobol indices based on second-order model representations, and Morris indices, as well as a parameter selection technique based on standard errors. A significant objective is to provide verification strategies to assess the accuracy of those techniques, which we illustrate in the context of the HIV model. Finally, we examine active subspace methods as an alternative to parameter subset selection techniques. The objective of active subspace methods is to determine the subspace of inputs that most strongly affects the model response, and to reduce the dimension of the input space. The major difference between active subspace methods and parameter selection techniques is that parameter selection identifies influential parameters, whereas subspace selection identifies a linear combination of parameters that significantly impacts the model responses. We employ the active subspace methods discussed in [22] for the HIV model and present a verification that the active subspace successfully reduces the input dimensions.
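
    The sampling machinery at the heart of the calibration work can be illustrated with a plain random-walk Metropolis sampler, the building block that DRAM and DREAM extend with delayed-rejection/adaptive and differential-evolution proposals. The Gaussian target below is a stand-in for the HIV and heat models, which are not reproduced here.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    def metropolis(log_post, theta0, n_samples=5000, step=0.5):
        """Minimal random-walk Metropolis sampler: propose a Gaussian step and
        accept it with probability min(1, posterior ratio)."""
        theta = np.asarray(theta0, dtype=float)
        lp = log_post(theta)
        chain = []
        for _ in range(n_samples):
            proposal = theta + step * rng.standard_normal(theta.shape)
            lp_prop = log_post(proposal)
            if np.log(rng.random()) < lp_prop - lp:
                theta, lp = proposal, lp_prop
            chain.append(theta.copy())
        return np.array(chain)

    # Toy target: 2-D Gaussian posterior with unit variance and mean (1, -2).
    chain = metropolis(lambda t: -0.5 * np.sum((t - np.array([1.0, -2.0]))**2),
                       theta0=[0.0, 0.0])
    print("posterior mean estimate:", chain[1000:].mean(axis=0))  # ~ [1, -2]
    ```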

  15. Robust Requirements Tracing via Internet Search Technology: Improving an IV and V Technique. Phase 2

    NASA Technical Reports Server (NTRS)

    Hayes, Jane; Dekhtyar, Alex

    2004-01-01

    There are three major objectives to this phase of the work. (1) Improvement of Information Retrieval (IR) methods for Independent Verification and Validation (IV&V) requirements tracing. Information Retrieval methods are typically developed for very large document collections (on the order of millions to tens of millions of documents) and therefore the most successful methods sacrifice some precision and recall in order to achieve efficiency. At the same time, typical IR systems treat all user queries as independent of each other and assume that the relevance of documents to queries is subjective for each user. The IV&V requirements tracing problem has a much smaller data set to operate on, even for large software development projects; the set of queries is predetermined by the high-level specification document, and individual requirements considered as query input to IR methods are not necessarily independent of each other: knowledge about the links for one requirement may help determine the links of another. Finally, while the final decision on the exact form of the traceability matrix still belongs to the IV&V analyst, his/her decisions are much less arbitrary than those of an Internet search engine user. All this suggests that the information available in the framework of the IV&V tracing problem can be leveraged to enhance standard IR techniques, which in turn would lead to increased recall and precision. We developed several new methods during Phase II. (2) IV&V requirements tracing IR toolkit. Based on the methods developed in Phase I and their improvements developed in Phase II, we built a toolkit of IR methods for IV&V requirements tracing. The toolkit has been integrated, at the data level, with SAIC's SuperTracePlus (STP) tool. (3) Toolkit testing. We tested the methods included in the IV&V requirements tracing IR toolkit on a number of projects.
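
    A minimal sketch of the vector-space retrieval underlying IR-based requirements tracing is given below: candidate low-level requirements are ranked against a high-level requirement by TF-IDF-weighted cosine similarity. The two-requirement corpus and the idf smoothing are illustrative assumptions, not the toolkit's actual methods.

    ```python
    import math
    from collections import Counter

    # Candidate low-level requirements ("documents") and a high-level
    # requirement used as the query; both invented for illustration.
    corpus = {
        "LL-1": "the system shall log every failed authentication attempt",
        "LL-2": "the display shall refresh telemetry at one hertz",
    }
    doc_token_sets = [set(t.split()) for t in corpus.values()]

    def tfidf(tokens):
        n = len(doc_token_sets)
        tf = Counter(tokens)
        # Smoothed idf so query terms absent from the corpus do not divide by zero.
        return {t: tf[t] * math.log((n + 1) / (1 + sum(t in d for d in doc_token_sets)))
                for t in tf}

    def cosine(a, b):
        num = sum(w * b[t] for t, w in a.items() if t in b)
        den = (math.sqrt(sum(w * w for w in a.values()))
               * math.sqrt(sum(w * w for w in b.values())))
        return num / den if den else 0.0

    query = tfidf("authentication failures shall be logged".split())
    for rid, text in corpus.items():
        print(rid, round(cosine(query, tfidf(text.split())), 3))  # LL-1 ranks first
    ```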

  16. Using Teamcenter engineering software for a successive punching tool lifecycle management

    NASA Astrophysics Data System (ADS)

    Blaga, F.; Pele, A.-V.; Stǎnǎşel, I.; Buidoş, T.; Hule, V.

    2015-11-01

    The paper presents the results of studies and research on the implementation of Teamcenter (TC) integrated product lifecycle management in a virtual enterprise. The results can also be implemented in a real enterprise. The product considered was a successive punching and cutting tool, designed to produce a sheet metal part. The paper defines the technical documentation flow (flow of information) in the process of computer-aided constructive design of the tool. After the design phase is completed, a list of parts is generated containing standard or manufactured components (BOM, Bill of Materials). The BOM may be exported to MS Excel (.xls) format and transferred to other departments of the company in order to supply the materials and resources necessary to achieve the final product. The paper also describes the procedure for modifying certain dimensions of the sheet metal part obtained by punching. After 3D and 2D design, the digital prototype of the punching tool moves to the following lifecycle phase, the manufacturing process. For each operation of the technological process, the corresponding phases are described in detail. Teamcenter makes it possible to describe the manufacturing company structure, including the workstations that carry out the various operations of the manufacturing process. The paper revealed that the implementation of Teamcenter PDM in a company improves the efficiency of managing product information, eliminating time spent searching, verifying, and correcting documentation, while ensuring the uniqueness and completeness of the product data.

  17. SENSOR++: Simulation of Remote Sensing Systems from Visible to Thermal Infrared

    NASA Astrophysics Data System (ADS)

    Paproth, C.; Schlüßler, E.; Scherbaum, P.; Börner, A.

    2012-07-01

    During the development process of a remote sensing system, the optimization and verification of the sensor system are important tasks. To support these tasks, simulation of the sensor and its output is valuable. This enables developers to test algorithms, estimate errors, and evaluate the capabilities of the whole sensor system before the final remote sensing system is available and produces real data. The presented simulation concept, SENSOR++, consists of three parts. The first part is the geometric simulation, which calculates where the sensor looks using a ray tracing algorithm. This also determines whether the observed part of the scene is shadowed or not. The second part describes the radiometry and results in the spectral at-sensor radiance from the visible spectrum to the thermal infrared, according to the simulated sensor type. In the case of earth remote sensing, it also includes a model of the radiative transfer through the atmosphere. The final part uses the at-sensor radiance to generate digital images by using an optical and an electronic sensor model. Using SENSOR++ for an optimization requires the additional application of task-specific data processing algorithms. The principle of the simulation approach is explained, all relevant concepts of SENSOR++ are discussed, and first examples of its use are given, for example a camera simulation for a moon lander. Finally, the verification of SENSOR++ is demonstrated.
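
    The geometric part, determining where each detector element looks, can be sketched for an idealized pinhole camera as below; a real simulator like SENSOR++ would add the sensor's pose, lens distortion, and the occlusion/shadow tests mentioned above. The image size and field-of-view values are arbitrary examples.

    ```python
    import numpy as np

    def pixel_ray_directions(width, height, fov_x_deg):
        """Unit view-ray directions for every pixel of an idealized pinhole
        camera looking down +z; a minimal stand-in for the geometric part of
        a sensor simulation, which traces where each detector element looks."""
        aspect = height / width
        fx = np.tan(np.radians(fov_x_deg) / 2)
        u = np.linspace(-fx, fx, width)              # horizontal tangent coordinates
        v = np.linspace(-fx * aspect, fx * aspect, height)
        xx, yy = np.meshgrid(u, v)
        rays = np.stack([xx, yy, np.ones_like(xx)], axis=-1)
        return rays / np.linalg.norm(rays, axis=-1, keepdims=True)

    rays = pixel_ray_directions(640, 480, fov_x_deg=40.0)
    print(rays.shape, rays[240, 320])  # center pixel looks (almost) straight ahead
    ```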

  18. Quality Assurance Assessment of the F-35 Lightning II Program

    DTIC Science & Technology

    2013-09-30

    assurance personnel had not verified epoxy primer, urethane topcoat, and abrasion-resistant coating processes. In another case, there was no indication...other for electrical resistance. A review of drawing requirements and discussions with personnel noted that...the operators were not required to perform the electrical resistance verification, even though it was later determined to be required. Finally, the

  19. Reduction of Microbial Contaminants in Drinking Water by Ultraviolet Light Technology: ETS UV MODEL UVL-200-4 (Report and Statement)

    EPA Science Inventory

    Final technical report provides test methods used and verification results to be published on ETV web sites. The ETS UV System Model UVL-200-4 was tested to validate the UV dose delivered by the system using biodosimetry and a set line approach. The set line for 40 mJ/cm2 Red...

  20. Quantitative assessment of the physical potential of proton beam range verification with PET/CT.

    PubMed

    Knopf, A; Parodi, K; Paganetti, H; Cascio, E; Bonab, A; Bortfeld, T

    2008-08-07

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6 degrees to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.

  1. Quantitative assessment of the physical potential of proton beam range verification with PET/CT

    NASA Astrophysics Data System (ADS)

    Knopf, A.; Parodi, K.; Paganetti, H.; Cascio, E.; Bonab, A.; Bortfeld, T.

    2008-08-01

    A recent clinical pilot study demonstrated the feasibility of offline PET/CT range verification for proton therapy treatments. In vivo PET measurements are challenged by blood perfusion, variations of tissue compositions, patient motion and image co-registration uncertainties. Besides these biological and treatment specific factors, the accuracy of the method is constrained by the underlying physical processes. This phantom study distinguishes physical factors from other factors, assessing the reproducibility, consistency and sensitivity of the PET/CT range verification method. A spread-out Bragg-peak (SOBP) proton field was delivered to a phantom consisting of poly-methyl methacrylate (PMMA), lung and bone equivalent material slabs. PET data were acquired in listmode at a commercial PET/CT scanner available within 10 min walking distance from the proton therapy unit. The measured PET activity distributions were compared to simulations of the PET signal based on Geant4 and FLUKA Monte Carlo (MC) codes. To test the reproducibility of the measured PET signal, data from two independent measurements at the same geometrical position in the phantom were compared. Furthermore, activation depth profiles within identical material arrangements but at different positions within the irradiation field were compared to test the consistency of the measured PET signal. Finally, activation depth profiles through air/lung, air/bone and lung/bone interfaces parallel as well as at 6° to the beam direction were studied to investigate the sensitivity of the PET/CT range verification method. The reproducibility and the consistency of the measured PET signal were found to be of the same order of magnitude. They determine the physical accuracy of the PET measurement to be about 1 mm. However, range discrepancies up to 2.6 mm between two measurements and range variations up to 2.6 mm within one measurement were found at the beam edge and at the edge of the field of view (FOV) of the PET scanner. PET/CT range verification was found to be able to detect small range modifications in the presence of complex tissue inhomogeneities. This study indicates the physical potential of the PET/CT verification method to detect the full-range characteristic of the delivered dose in the patient.
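
    The range comparison at the core of the method can be sketched as profile alignment: estimate the shift that best aligns a measured activity depth profile with a simulated one. The synthetic sigmoid profiles below are invented for illustration; the study itself compares measurements against Geant4 and FLUKA simulations.

    ```python
    import numpy as np

    def range_shift_mm(measured, simulated, dz_mm=0.5):
        """Estimate the range difference between a measured and a simulated
        PET activity depth profile as the lag that maximizes their
        cross-correlation. A minimal sketch with invented profiles."""
        m = (measured - measured.mean()) / measured.std()
        s = (simulated - simulated.mean()) / simulated.std()
        corr = np.correlate(m, s, mode="full")
        lag = np.argmax(corr) - (len(s) - 1)
        return lag * dz_mm  # positive = measured falloff lies deeper

    # Synthetic activity profiles: a plateau with a distal falloff, with the
    # 'measured' falloff 2 mm deeper than the simulated one.
    z = np.arange(0, 120, 0.5)                      # depth grid, mm
    sim  = 1.0 / (1.0 + np.exp((z - 80.0) / 2.0))   # falloff at 80 mm
    meas = 1.0 / (1.0 + np.exp((z - 82.0) / 2.0))   # falloff at 82 mm
    print(f"estimated range shift: {range_shift_mm(meas, sim):.1f} mm")  # ~ +2.0
    ```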

  2. Design and Development of the Space Shuttle Tail Service Masts

    NASA Technical Reports Server (NTRS)

    Dandage, S. R.; Herman, N. A.; Godfrey, S. E.; Uda, R. T.

    1977-01-01

    The successful launch of a space shuttle vehicle depends on the proper operation of two tail service masts (TSMs). Reliable TSM operation is assured through a comprehensive design, development, and testing program. The results of the concept verification test (CVT) and their impact on the prototype TSM design are presented. The design criteria are outlined, and the proposed prototype TSM tests are described.

  3. Cohesion: The Vital Ingredient for Successful Army Units

    DTIC Science & Technology

    1982-04-19

    responding in military life as well. A special problem of social cohesion directly related to social background was the integration of minority troops...forces has been a powerful verification of sociological theory concerning social cohesion and organizational effectiveness. Sociological theory does not...prevent the development of groups with social cohesion committed to the military hierarchy. Personality of Unit Members. Among the characteristics

  4. The End-To-End Safety Verification Process Implemented to Ensure Safe Operations of the Columbus Research Module

    NASA Astrophysics Data System (ADS)

    Arndt, J.; Kreimer, J.

    2010-09-01

    The European Space Laboratory COLUMBUS was launched in February 2008 with NASA Space Shuttle Atlantis. Since successful docking and activation, this manned laboratory forms part of the International Space Station (ISS). Depending on the objectives of the Mission Increments, the on-orbit configuration of the COLUMBUS Module varies with each increment. This paper describes the end-to-end verification which has been implemented to ensure safe operations under the condition of a changing on-orbit configuration. That verification process has to cover not only the configuration changes foreseen by the Mission Increment planning but also those configuration changes on short notice which become necessary due to near real-time requests initiated by crew or Flight Control, and, most challenging since unpredictable, changes due to on-orbit anomalies. Subject to the safety verification is, on the one hand, the on-orbit configuration itself, including the hardware and software products, and, on the other hand, the related ground facilities needed for commanding of and communication with the on-orbit system. The operational products, e.g. the procedures prepared for crew and ground control in accordance with increment planning, are also subject to the overall safety verification. In order to analyse the on-orbit configuration for potential hazards and to verify the implementation of the related safety-required hazard controls, a hierarchical approach is applied. The key element of the analytical safety integration of the whole COLUMBUS Payload Complement, including hardware owned by International Partners, is the Integrated Experiment Hazard Assessment (IEHA). The IEHA especially identifies those hazardous scenarios which could potentially arise through physical and operational interaction of experiments. A major challenge is the implementation of a safety process that is rigid enough to provide reliable verification of on-board safety and yet flexible enough for manned space operations with scientific objectives. In the period of COLUMBUS operations since launch, a number of lessons learnt have already been implemented, especially in the IEHA, that improve the flexibility of on-board operations without degradation of safety.

  5. Understanding the contamination of food with mineral oil: the need for a confirmatory analytical and procedural approach.

    PubMed

    Spack, Lionel W; Leszczyk, Gabriela; Varela, Jesus; Simian, Hervé; Gude, Thomas; Stadler, Richard H

    2017-06-01

    The contamination of food by mineral oil hydrocarbons (MOHs) found in packaging is a long-running concern. A main source of MOHs in foods is the migration of mineral oil from recycled board into the packed food products. Consequently, the majority of food manufacturers have taken protective measures, e.g., by using virgin board instead of recycled fibres and, where feasible, introducing functional barriers to mitigate migration. Despite these protective measures, MOHs may still be observed in low amounts in certain food products, albeit due to different entry points across the food supply chain. In this study, we successfully apply gas chromatography coupled to mass spectrometry (GC-MS) to demonstrate, through marker compounds and the profile of the hydrocarbon response, the possible source of contamination, using mainly chocolate and cereals as food matrices. The conventional liquid chromatography-one-dimensional GC coupled to a flame ionisation detector (LC-GC-FID) is a useful screening method, but for positive samples it must be complemented by a confirmatory method such as GC-MS, allowing verification of mineral oil contamination. The procedural approach proposed in this study entails profile analysis, marker identification and interpretation, and final quantification.

  6. Palm Vein Verification Using Multiple Features and Locality Preserving Projections

    PubMed Central

    Bu, Wei; Wu, Xiangqian; Zhao, Qiushi

    2014-01-01

    Biometrics is defined as identifying people by their physiological characteristics, such as iris pattern, fingerprint, and face, or by some aspects of their behavior, such as voice, signature, and gesture. Considerable attention has been drawn to these issues during the last several decades, and many biometric systems for commercial applications have been successfully developed. Recently, the vein pattern biometric has become increasingly attractive for its uniqueness, stability, and noninvasiveness. A vein pattern is the physical distribution structure of the blood vessels underneath a person's skin. The palm vein pattern is highly ramified, showing a huge number of vessels. The palm vein vessels stay in the same location throughout life, and their pattern is unique to each individual. In our work, a matched filter method is proposed for palm vein image enhancement. New palm vein feature extraction methods have been proposed: a global feature based on wavelet coefficients and locality preserving projections (WLPP), and a local feature based on local binary pattern variance and locality preserving projections (LBPV_LPP). Finally, a nearest-neighbour matching method is used to verify the test palm vein images. The experimental results show that the EER of the proposed method is 0.1378%. PMID:24693230

  7. A monogamy-of-entanglement game with applications to device-independent quantum cryptography

    NASA Astrophysics Data System (ADS)

    Tomamichel, Marco; Fehr, Serge; Kaniewski, Jędrzej; Wehner, Stephanie

    2013-10-01

    We consider a game in which two separate laboratories collaborate to prepare a quantum system and are then asked to guess the outcome of a measurement performed by a third party in a random basis on that system. Intuitively, by the uncertainty principle and the monogamy of entanglement, the probability that both players simultaneously succeed in guessing the outcome correctly is bounded. We are interested in the question of how the success probability scales when many such games are performed in parallel. We show that any strategy that maximizes the probability to win every game individually is also optimal for the parallel repetition of the game. Our result implies that the optimal guessing probability can be achieved without the use of entanglement. We explore several applications of this result. Firstly, we show that it implies security for standard BB84 quantum key distribution when the receiving party uses fully untrusted measurement devices, i.e. we show that BB84 is one-sided device independent. Secondly, we show how our result can be used to prove security of a one-round position-verification scheme. Finally, we generalize a well-known uncertainty relation for the guessing probability to quantum side information.

  8. Palm vein verification using multiple features and locality preserving projections.

    PubMed

    Al-Juboori, Ali Mohsin; Bu, Wei; Wu, Xiangqian; Zhao, Qiushi

    2014-01-01

    Biometrics is defined as identifying people by their physiological characteristics, such as iris pattern, fingerprint, and face, or by some aspects of their behavior, such as voice, signature, and gesture. Considerable attention has been drawn to these issues during the last several decades, and many biometric systems for commercial applications have been successfully developed. Recently, the vein pattern biometric has become increasingly attractive for its uniqueness, stability, and noninvasiveness. A vein pattern is the physical distribution structure of the blood vessels underneath a person's skin. The palm vein pattern is highly ramified, showing a huge number of vessels. The palm vein vessels stay in the same location throughout life, and their pattern is unique to each individual. In our work, a matched filter method is proposed for palm vein image enhancement. New palm vein feature extraction methods have been proposed: a global feature based on wavelet coefficients and locality preserving projections (WLPP), and a local feature based on local binary pattern variance and locality preserving projections (LBPV_LPP). Finally, a nearest-neighbour matching method is used to verify the test palm vein images. The experimental results show that the EER of the proposed method is 0.1378%.
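
    The EER quoted above can be computed from genuine and impostor score distributions by sweeping the decision threshold until the false accept and false reject rates meet. A minimal sketch follows; the Gaussian score samples are invented, not the paper's palm vein scores.

    ```python
    import numpy as np

    def equal_error_rate(genuine, impostor):
        """Equal error rate (EER): the operating point where the false accept
        rate equals the false reject rate, found by sweeping the decision
        threshold over all observed scores."""
        thresholds = np.unique(np.concatenate([genuine, impostor]))
        best = (1.0, 0.0)  # (|FAR - FRR|, EER estimate)
        for t in thresholds:
            far = np.mean(impostor >= t)   # impostors wrongly accepted
            frr = np.mean(genuine < t)     # genuine users wrongly rejected
            if abs(far - frr) < best[0]:
                best = (abs(far - frr), (far + frr) / 2)
        return best[1]

    rng = np.random.default_rng(1)
    genuine  = rng.normal(0.80, 0.05, 1000)   # match scores for true claimants
    impostor = rng.normal(0.45, 0.10, 1000)   # match scores for impostors
    print(f"EER ~ {equal_error_rate(genuine, impostor):.4f}")
    ```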

  9. Hierarchical structural health monitoring system combining a fiber optic spinal cord network and distributed nerve cell devices

    NASA Astrophysics Data System (ADS)

    Minakuchi, Shu; Tsukamoto, Haruka; Takeda, Nobuo

    2009-03-01

    This study proposes a novel hierarchical sensing concept for detecting damage in composite structures. In the hierarchical system, numerous three-dimensionally structured sensor devices are distributed throughout the whole structural area and connected with the optical fiber network through transducing mechanisms. The distributed "sensory nerve cell" devices detect the damage, and the fiber optic "spinal cord" network gathers damage signals and transmits the information to a measuring instrument. This study began by discussing the basic concept of the hierarchical sensing system through comparison with existing fiber optic based systems and nerve systems in the animal kingdom. Then, in order to validate the proposed sensing concept, an impact damage detection system for composite structures was proposed. The sensor devices were developed based on the Comparative Vacuum Monitoring (CVM) system, and Brillouin-based distributed strain sensing was utilized to gather the damage signals from the distributed devices. Finally, a verification test was conducted using prototype devices. The occurrence of barely visible impact damage was successfully detected, and the results clearly indicated that the hierarchical system has better repairability, higher robustness, and a wider monitorable area than existing systems utilizing embedded optical fiber sensors.

  10. Hyperspectral Geobotanical Remote Sensing for Monitoring and Verifying CO2 Containment Final Report CRADA No. TC-2036-02

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Pickles, W. L.; Ebrom, D. A.

    This collaborative effort was in support of the CO2 Capture Project (CCP), to develop techniques that integrate overhead images of plant species, plant health, geological formations, soil types, aquatic, and human use spatial patterns for detection and discrimination of any CO2 releases from underground storage formations. The goal of this work was to demonstrate advanced hyperspectral geobotanical remote sensing methods to assess potential leakage of CO2 from underground storage. The timeframes and scales relevant to the long-term storage of CO2 in the subsurface make remote sensing methods attractive. Moreover, it has been shown that individual field measurements of gas composition are subject to variability on extremely small temporal and spatial scales. The ability to verify ultimate reservoir integrity and to place individual surface measurements into context will be crucial to successful long-term monitoring and verification activities. The desired results were to produce a defined and tested procedure that could be easily used for long-term monitoring of possible CO2 leakage from underground CO2 sequestration sites. This testing standard will be utilized on behalf of the oil industry.

  11. State of the art and taxonomy of prognostics approaches, trends of prognostics applications and open issues towards maturity at different technology readiness levels

    NASA Astrophysics Data System (ADS)

    Javed, Kamran; Gouriveau, Rafael; Zerhouni, Noureddine

    2017-09-01

    Integrating prognostics into a real application requires a certain maturity level, and for this reason there is a lack of success stories about the development of a complete Prognostics and Health Management system. In fact, the maturity of prognostics is closely linked to data and to domain-specific entities like modeling. Basically, the prognostics task aims at predicting the degradation of engineering assets. In practice, however, it is not possible to predict an impending failure precisely, which requires a thorough treatment of the different sources of uncertainty that affect prognostics. Therefore, different aspects crucial to the prognostics framework, from monitoring data to the remaining useful life of equipment, need to be addressed. To this aim, the paper contributes a state of the art and taxonomy of prognostics approaches and their application perspectives. In addition, factors for prognostics approach selection are identified, and new case studies from the component to the system level are discussed. Moreover, open challenges toward maturity of prognostics under uncertainty are highlighted, and a scheme for an efficient prognostics approach is presented. Finally, the challenges for verification and validation of prognostics at different technology readiness levels are discussed.

  12. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paul, J. N.; Chin, M. R.; Sjoden, G. E.

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine, using transport theory, the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
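
    The time-gating idea, that the collimator window and vehicle speed set the integration time and hence the achievable decision statistics, can be sketched as below. All rates, the window size, and the k-sigma alarm rule are illustrative assumptions, not values from the MPVS design.

    ```python
    import math

    def drive_by_alarm(source_cps, background_cps, window_m, speed_mph, k=3.0):
        """Decide whether a moving 'drive-by' detector would alarm on a
        container: the collimator viewing window and vehicle speed set the
        dwell (integration) time, and the alarm level is background plus k
        standard deviations assuming Poisson counting. All numbers here are
        illustrative assumptions, not values from the design."""
        speed_ms = speed_mph * 0.44704
        dwell_s = window_m / speed_ms                 # time the source stays in view
        bkg = background_cps * dwell_s                # expected background counts
        sig = source_cps * dwell_s                    # expected source counts
        threshold = bkg + k * math.sqrt(bkg)          # decision level above background
        return sig + bkg > threshold, dwell_s

    alarm, dwell = drive_by_alarm(source_cps=60.0, background_cps=120.0,
                                  window_m=0.5, speed_mph=2.0)
    print(f"dwell time {dwell:.2f} s, alarm: {alarm}")
    ```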

  13. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    NASA Astrophysics Data System (ADS)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine, or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart and tested top-down design flow for the design of a new protocol for control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs, and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After the description of the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performances when adopting the new communication protocol and its interfaces for a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to any new digital protocol development for smart detectors in physics experiments.

  14. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Andrews, Stephen F.; Morgenstern, Wendy M.; Bartholomew, Maureen O.; McComas, David C.; Bauer, Frank H. (Technical Monitor)

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, attitude control, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on previous missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the perceived benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.
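    The synchronize-compare-plot pattern described above can be sketched in a few lines. The signals, sample rates, and pass/fail tolerance below are synthetic placeholders standing in for imported test data; none of this is MAP project code.

```python
# A minimal sketch of automated test verification: take time-tagged output from
# the high-fidelity simulation and from flight software testing, resample onto a
# common time base, compute deviation statistics, and emit a comparison plot.

import numpy as np
import matplotlib.pyplot as plt

# synthetic stand-ins for imported test data: a body rate from sim and from FSW
t_sim = np.linspace(0.0, 60.0, 601)                # HiFi sim time base [s]
y_sim = 0.02 * np.sin(0.2 * t_sim)                 # simulated signal
t_fsw = np.linspace(0.0, 60.0, 241)                # telemetry at a coarser rate
y_fsw = 0.02 * np.sin(0.2 * t_fsw) + np.random.default_rng(0).normal(0, 2e-5, t_fsw.size)

# synchronize: resample flight software telemetry onto the simulation time base
y_fsw_sync = np.interp(t_sim, t_fsw, y_fsw)
residual = y_fsw_sync - y_sim

tolerance = 1e-3   # assumed pass/fail threshold
verdict = "PASS" if np.max(np.abs(residual)) < tolerance else "FAIL"
print(f"max |residual| = {np.max(np.abs(residual)):.2e}  ->  {verdict}")

# automated comparison plot
fig, (ax1, ax2) = plt.subplots(2, 1, sharex=True)
ax1.plot(t_sim, y_sim, label="HiFi simulation")
ax1.plot(t_sim, y_fsw_sync, "--", label="flight software")
ax1.set_ylabel("signal"); ax1.legend()
ax2.plot(t_sim, residual)
ax2.set_ylabel("residual"); ax2.set_xlabel("time [s]")
fig.savefig("comparison.png")
```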

  15. Using Automation to Improve the Flight Software Testing Process

    NASA Technical Reports Server (NTRS)

    ODonnell, James R., Jr.; Morgenstern, Wendy M.; Bartholomew, Maureen O.

    2001-01-01

    One of the critical phases in the development of a spacecraft attitude control system (ACS) is the testing of its flight software. The testing (and test verification) of ACS flight software requires a mix of skills involving software, knowledge of attitude control and attitude control hardware, data manipulation, and analysis. The process of analyzing and verifying flight software test results often creates a bottleneck which dictates the speed at which flight software verification can be conducted. In the development of the Microwave Anisotropy Probe (MAP) spacecraft ACS subsystem, an integrated design environment was used that included a MAP high fidelity (HiFi) simulation, a central database of spacecraft parameters, a script language for numeric and string processing, and plotting capability. In this integrated environment, it was possible to automate many of the steps involved in flight software testing, making the entire process more efficient and thorough than on previous missions. In this paper, we will compare the testing process used on MAP to that used on other missions. The software tools that were developed to automate testing and test verification will be discussed, including the ability to import and process test data, synchronize test data and automatically generate HiFi script files used for test verification, and an automated capability for generating comparison plots. A summary of the benefits of applying these test methods on MAP will be given. Finally, the paper will conclude with a discussion of re-use of the tools and techniques presented, and the ongoing effort to apply them to flight software testing of the Triana spacecraft ACS subsystem.

  16. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimura, Aki

    2017-07-01

    For IC design starts below the 20 nm technology node, the assist features on photomasks shrink well below 60 nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show large deviations from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based mask process correction (MPC), model-based MPC and, eventually, model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU acceleration for geometry processing, and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based MDP verification of masks practical.
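    The difference between XOR-based and simulation-based checking can be illustrated on a one-dimensional cutline: the simulated (blurred) edge, not the drawn edge, is evaluated at the dose threshold. The Gaussian blur here is a crude stand-in for a physical mask model, and every parameter is an assumed illustration, not a TrueModel or CDP value.

```python
# Toy simulation-based edge check: blur an ideal edge, find the printed edge at
# the dose threshold, and report edge placement error (EPE) and a dose-margin flag.

import numpy as np
from scipy.ndimage import gaussian_filter1d

nm_per_px = 1.0
x = np.arange(0, 200, nm_per_px)            # 200 nm cutline
target_edge = 80.0                          # drawn edge position [nm]
pattern = (x >= target_edge).astype(float)  # ideal (drawn) transmission

blur_nm = 12.0                              # assumed process blur
dose = gaussian_filter1d(pattern, sigma=blur_nm / nm_per_px)

threshold = 0.5
printed_edge = np.interp(threshold, dose, x)   # dose is monotonic on this cutline
epe = printed_edge - target_edge
print(f"EPE = {epe:+.2f} nm")

# Dose margin proxy: dose slope at the printed edge; a shallow slope means the
# edge moves a lot per unit dose change, i.e. a hotspot (threshold is assumed).
slope = np.gradient(dose, x)[int(printed_edge / nm_per_px)]
print("hotspot!" if abs(slope) < 0.02 else "dose margin OK")
```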

  17. Monte Carlo verification of radiotherapy treatments with CloudMC.

    PubMed

    Miras, Hector; Jiménez, Rubén; Perales, Álvaro; Terrón, José Antonio; Bertolet, Alejandro; Ortiz, Antonio; Macías, José

    2018-06-27

    A new implementation has been made on CloudMC, a cloud-based platform presented in a previous work, in order to provide services for radiotherapy treatment verification by means of Monte Carlo in a fast, easy and economical way. A description of the architecture of the application and the new developments implemented is presented, together with the results of the tests carried out to validate its performance. CloudMC has been developed on the Microsoft Azure cloud. It is based on a map/reduce implementation for distributing Monte Carlo calculations over a dynamic cluster of virtual machines in order to reduce calculation time. CloudMC has been updated with new methods to read and process the information related to radiotherapy treatment verification: CT image set, treatment plan, structures and dose distribution files in DICOM format. Some tests have been designed in order to determine, for the different tasks, the most suitable type of virtual machine from those available in Azure. Finally, the performance of Monte Carlo verification in CloudMC is studied through three real cases that involve different treatment techniques, linac models and Monte Carlo codes. Considering computational and economic factors, D1_v2 and G1 virtual machines were selected as the default types for the Worker Roles and the Reducer Role, respectively. Calculation times up to 33 min and costs of 16 € were achieved for the verification cases presented when a statistical uncertainty below 2% (2σ) was required. The costs were reduced to 3-6 € when uncertainty requirements were relaxed to 4%. Advantages like high computational power, scalability, easy access and a pay-per-usage model make Monte Carlo cloud-based solutions, like the one presented in this work, an important step toward truly introducing Monte Carlo algorithms into the daily routine of the radiotherapy planning process.
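    The map/reduce pattern described above can be sketched with local worker processes standing in for Azure Worker Roles: each worker tallies partial dose sums and sums of squares, and the reducer combines them into a mean dose and a 2σ statistical uncertainty. The toy transport kernel is a placeholder, not a real Monte Carlo dose engine.

```python
# Map/reduce Monte Carlo sketch: split histories across workers, reduce partial
# tallies into a mean and a statistical uncertainty. Physics is a placeholder.

import numpy as np
from multiprocessing import Pool

N_BINS = 50

def run_batch(args):
    seed, n_histories = args
    rng = np.random.default_rng(seed)
    # placeholder physics: each history deposits energy in a random bin
    bins = rng.integers(0, N_BINS, size=n_histories)
    edep = rng.exponential(1.0, size=n_histories)
    dose_sum = np.bincount(bins, weights=edep, minlength=N_BINS)
    dose_sq = np.bincount(bins, weights=edep**2, minlength=N_BINS)
    return dose_sum, dose_sq, n_histories

if __name__ == "__main__":
    n_workers, total = 4, 400_000
    jobs = [(seed, total // n_workers) for seed in range(n_workers)]
    with Pool(n_workers) as pool:
        results = pool.map(run_batch, jobs)   # "map" phase

    # "reduce" phase: combine partial sums into mean dose and 2-sigma uncertainty
    s1 = sum(r[0] for r in results)
    s2 = sum(r[1] for r in results)
    n = sum(r[2] for r in results)
    mean = s1 / n
    sigma = np.sqrt(np.maximum(s2 / n - mean**2, 0.0) / n)
    print("max 2-sigma relative uncertainty:",
          f"{np.max(2 * sigma / np.where(mean > 0, mean, 1)):.3%}")
```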

  18. MO-G-BRE-04: Automatic Verification of Daily Treatment Deliveries and Generation of Daily Treatment Reports for a MR Image-Guided Treatment Machine

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Yang, D; Li, X; Li, H

    2014-06-15

    Purpose: Two aims of this work were to develop a method to automatically verify treatment delivery accuracy immediately after patient treatment and to develop a comprehensive daily treatment report to provide all required information for daily MR-IGRT review. Methods: After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a novel MR-IGRT treatment machine, we designed a method that uses 1) treatment plan files, 2) delivery log files, and 3) dosimetric calibration information to verify the accuracy and completeness of daily treatment deliveries. The method verifies the correctness of the delivered treatment plans and beams, beam segments, and, for each segment, the beam-on time and MLC leaf positions. Composite primary fluence maps are calculated from the MLC leaf positions and the beam-on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. We also designed the daily treatment delivery report to include all required information for MR-IGRT and physics weekly review: the plan and treatment fraction information, dose verification information, daily patient setup screen captures, and the treatment delivery verification results. Results: The parameters in the log files (e.g. MLC positions) were independently verified and deemed accurate and trustworthy. A computer program was developed to implement the automatic delivery verification and daily report generation. The program was tested and clinically commissioned with sufficient IMRT and 3D treatment delivery data. The final version has been integrated into a commercial MR-IGRT treatment delivery system. Conclusion: A method was developed to automatically verify MR-IGRT treatment deliveries and generate daily treatment reports. Already in clinical use since December 2013, the system is able to facilitate delivery error detection, and to expedite physician daily IGRT review and physicist weekly chart review.
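    The composite-fluence step described in the record can be sketched directly: each segment adds its beam-on time to the pixels its MLC aperture exposes, and verification reduces to statistics on the plan-versus-delivery difference map. The leaf geometry, segment data, and flag threshold below are assumed for illustration.

```python
# Composite primary fluence from MLC segments, plus plan-vs-delivery statistics.

import numpy as np

N_LEAF_PAIRS, N_X = 40, 100   # assumed MLC geometry: one row per leaf pair

def fluence_map(segments):
    """segments: list of (beam_on_time_s, left_edges, right_edges) per segment."""
    f = np.zeros((N_LEAF_PAIRS, N_X))
    for t_on, left, right in segments:
        for row in range(N_LEAF_PAIRS):
            f[row, left[row]:right[row]] += t_on   # open part of this leaf pair
    return f

rng = np.random.default_rng(0)
left = rng.integers(20, 40, N_LEAF_PAIRS)
right = rng.integers(60, 80, N_LEAF_PAIRS)
plan = [(2.0, left, right), (1.5, left + 5, right + 5)]

# delivered: same plan with a small leaf error injected into segment 2
left_err = plan[1][1].copy(); left_err[10] -= 4
delivered = [plan[0], (1.5, left_err, plan[1][2])]

diff = fluence_map(delivered) - fluence_map(plan)
print(f"max |error| = {np.abs(diff).max():.2f} s, RMS = {np.sqrt((diff**2).mean()):.4f} s")
print("flag for review" if np.abs(diff).max() > 0.5 else "delivery verified")
```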

  19. A software tool to automatically assure and report daily treatment deliveries by a cobalt‐60 radiation therapy device

    PubMed Central

    Wooten, H. Omar; Green, Olga; Li, Harold H.; Liu, Shi; Li, Xiaoling; Rodriguez, Vivian; Mutic, Sasa; Kashani, Rojano

    2016-01-01

    The aims of this study were to develop a method for automatic and immediate verification of treatment delivery after each treatment fraction in order to detect and correct errors, and to develop a comprehensive daily report which includes delivery verification results, daily image‐guided radiation therapy (IGRT) review, and information for weekly physics reviews. After systematically analyzing the requirements for treatment delivery verification and understanding the available information from a commercial MRI‐guided radiotherapy treatment machine, we designed a procedure to use 1) treatment plan files, 2) delivery log files, and 3) beam output information to verify the accuracy and completeness of each daily treatment delivery. The procedure verifies the correctness of delivered treatment plan parameters including beams, beam segments and, for each segment, the beam‐on time and MLC leaf positions. For each beam, composite primary fluence maps are calculated from the MLC leaf positions and segment beam‐on time. Error statistics are calculated on the fluence difference maps between the plan and the delivery. A daily treatment delivery report is designed to include all required information for IGRT and weekly physics reviews including the plan and treatment fraction information, daily beam output information, and the treatment delivery verification results. A computer program was developed to implement the proposed procedure of the automatic delivery verification and daily report generation for an MRI guided radiation therapy system. The program was clinically commissioned. Sensitivity was measured with simulated errors. The final version has been integrated into the commercial version of the treatment delivery system. The method automatically verifies the EBRT treatment deliveries and generates the daily treatment reports. Already in clinical use for over one year, it is useful to facilitate delivery error detection, and to expedite physician daily IGRT review and physicist weekly chart review. PACS number(s): 87.55.km PMID:27167269

  20. PET/CT imaging for treatment verification after proton therapy: A study with plastic phantoms and metallic implants

    PubMed Central

    Parodi, Katia; Paganetti, Harald; Cascio, Ethan; Flanz, Jacob B.; Bonab, Ali A.; Alpert, Nathaniel M.; Lohmann, Kevin; Bortfeld, Thomas

    2008-01-01

    The feasibility of off-line positron emission tomography/computed tomography (PET/CT) for routine three-dimensional in vivo treatment verification of proton radiation therapy is currently under investigation at Massachusetts General Hospital in Boston. In preparation for clinical trials, phantom experiments were carried out to investigate the sensitivity and accuracy of the method depending on irradiation and imaging parameters. Furthermore, they addressed the feasibility of PET/CT as a robust verification tool in the presence of metallic implants. These produce x-ray CT artifacts and fluence perturbations which may compromise the accuracy of treatment planning algorithms. Spread-out Bragg peak proton fields were delivered to different phantoms consisting of polymethylmethacrylate (PMMA), PMMA stacked with lung and bone equivalent materials, and PMMA with titanium rods to mimic implants in patients. PET data were acquired in list mode starting within 20 min after irradiation at a commercial lutetium oxyorthosilicate (LSO)-based PET/CT scanner. The amount and spatial distribution of the measured activity could be well reproduced by calculations based on the GEANT4 and FLUKA Monte Carlo codes. This phantom study supports the potential of millimeter accuracy for range monitoring and lateral field position verification even after low therapeutic dose exposures of 2 Gy, despite the delay between irradiation and imaging. It also indicates the value of PET for treatment verification in the presence of metallic implants, demonstrating a higher sensitivity to fluence perturbations in comparison to a commercial analytical treatment planning system. Finally, it addresses the suitability of LSO-based PET detectors for hadron therapy monitoring. This unconventional application of PET involves count rates which are orders of magnitude lower than in diagnostic tracer imaging, i.e., the signal of interest is comparable to the noise originating from the intrinsic radioactivity of the detector itself. In addition to PET alone, PET/CT imaging provides accurate information on the position of the imaged object and may assess possible anatomical changes during fractionated radiotherapy in clinical applications. PMID:17388158

  1. Verification and Implementation of Operations Safety Controls for Flight Missions

    NASA Technical Reports Server (NTRS)

    Smalls, James R.; Jones, Cheryl L.; Carrier, Alicia S.

    2010-01-01

    There are several engineering disciplines involved in flight missions, such as reliability, supportability, quality assurance, human factors, risk management, and safety. Safety is an extremely important engineering specialty within NASA, and a consequence involving loss of crew is considered a catastrophic event. Safety is not difficult to achieve when properly integrated at the beginning of each space systems project and the start of mission planning. The key is to ensure proper handling of safety verification throughout each flight/mission phase. Today, Safety and Mission Assurance (S&MA) operations engineers continue to conduct these flight product reviews across all open flight products. As such, these reviews help ensure that each mission is accomplished with safety requirements and controls heavily embedded in applicable flight products. Most importantly, the S&MA operations engineers are required to look for important design and operations controls so that safety is strictly adhered to as well as reflected in the final flight product.

  2. Coherent Lidar Design and Performance Verification

    NASA Technical Reports Server (NTRS)

    Frehlich, Rod

    1996-01-01

    This final report summarizes the investigative results from the three complete years of funding; the corresponding publications are listed. The first year saw the verification of beam alignment for coherent Doppler lidar in space by using the surface return. The second year saw the analysis and computerized simulation of using heterodyne efficiency as an absolute measure of performance of coherent Doppler lidar. A new method was proposed to determine the estimation error of Doppler lidar wind measurements without the need for an independent wind measurement. Coherent Doppler lidar signal covariance, including wind shear and turbulence, was derived and calculated for typical atmospheric conditions. The effects of wind turbulence defined by Kolmogorov spatial statistics were investigated theoretically and with simulations. The third year saw the performance of coherent Doppler lidar in the weak-signal regime determined by computer simulations using the best velocity estimators. Improved algorithms for extracting the performance of velocity estimators with wind turbulence included were also produced.

  3. Research on the injectors remanufacturing

    NASA Astrophysics Data System (ADS)

    Daraba, D.; Alexandrescu, I. M.; Daraba, C.

    2017-05-01

    During the remanufacturing process, the injector body, after disassembly and cleaning, should be subjected to strict control processes, both visually and with an electron microscope, to evidence any defects that may occur on the sealing surface of the injector body and the atomizer. In this paper we present the path followed by an injector body in the remanufacturing process, exemplifying the verification method for the roughness and hardness of the sealing surfaces, as well as the microscopic analysis of the sealing surface areas around the inlet. These checks indicate which path the injector body has to follow during remanufacturing. The control methodology for the injector body established on the basis of this research helps prevent defective injector bodies from entering the remanufacturing process, thus reducing to a minimum the number of remanufactured injectors declared non-conforming after the final verification process.

  4. A system for EPID-based real-time treatment delivery verification during dynamic IMRT treatment.

    PubMed

    Fuangrod, Todsaporn; Woodruff, Henry C; van Uytven, Eric; McCurdy, Boyd M C; Kuncic, Zdenka; O'Connor, Daryl J; Greer, Peter B

    2013-09-01

    To design and develop a real-time electronic portal imaging device (EPID)-based delivery verification system for dynamic intensity modulated radiation therapy (IMRT) which enables detection of gross treatment delivery errors before delivery of substantial radiation to the patient. The system utilizes a comprehensive physics-based model to generate a series of predicted transit EPID image frames as a reference dataset and compares these to measured EPID frames acquired during treatment. The two datasets are compared using MLC aperture comparison and cumulative signal checking techniques. The system operation in real-time was simulated offline using previously acquired images for 19 IMRT patient deliveries with both frame-by-frame comparison and cumulative frame comparison. Simulated error case studies were used to demonstrate the system sensitivity and performance. The accuracy of the synchronization method was shown to agree within two control points, which corresponds to approximately 1% of the total MU to be delivered for dynamic IMRT. The system achieved mean real-time gamma results for frame-by-frame analysis of 86.6% and 89.0% for 3%, 3 mm and 4%, 4 mm criteria, respectively, and 97.9% and 98.6% for cumulative gamma analysis. The system can detect a 10% MU error using 3%, 3 mm criteria within approximately 10 s. The EPID-based real-time delivery verification system successfully detected simulated gross errors introduced into patient plan deliveries in near real-time (within 0.1 s). A real-time radiation delivery verification system for dynamic IMRT has been demonstrated that is designed to prevent major mistreatments in modern radiation therapy.
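    The cumulative signal check described above can be sketched as a running comparison of accumulated predicted and measured frame sums against a gross-error threshold. The frame rate, image size, injected error, and 10% threshold below are illustrative assumptions, not the published system's parameters.

```python
# Cumulative-signal gross-error check on synthetic EPID frames.

import numpy as np

rng = np.random.default_rng(1)
n_frames, shape = 100, (64, 64)
predicted = [np.abs(rng.normal(100, 5, shape)) for _ in range(n_frames)]

# measured frames: noisy copies, with a gross delivery error from frame 30 onward
measured = [p * rng.normal(1.0, 0.01, shape) for p in predicted]
for k in range(30, n_frames):
    measured[k] = measured[k] * 0.5       # simulated 50% under-delivery

cum_pred = cum_meas = 0.0
threshold = 0.10                           # assumed gross-error criterion
for k in range(n_frames):
    cum_pred += predicted[k].sum()
    cum_meas += measured[k].sum()
    deviation = abs(cum_meas - cum_pred) / cum_pred
    if deviation > threshold:
        print(f"gross error flagged at frame {k} (deviation {deviation:.1%})")
        break
else:
    print("delivery within tolerance")
```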

  5. Verification of mesoscale objective analyses of VAS and rawinsonde data using the March 1982 AVE/VAS special network data

    NASA Technical Reports Server (NTRS)

    Doyle, James D.; Warner, Thomas T.

    1987-01-01

    Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis, were used in successive-correction and variational objective-analysis procedures. Analyses are produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by the existence of cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques was discussed.
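    A single successive-correction pass of the kind used in such analyses can be sketched with Cressman distance weights: the background field is nudged toward observation increments within a shrinking influence radius. The grid, observations, and radii below are invented for illustration.

```python
# One Cressman-style successive-correction analysis on a toy 2-D grid.

import numpy as np

nx = ny = 21
X, Y = np.meshgrid(np.linspace(0, 1000, nx), np.linspace(0, 1000, ny))  # km
background = 280.0 + 0.01 * X                     # assumed first-guess field

obs_x = np.array([200.0, 650.0, 800.0])
obs_y = np.array([300.0, 500.0, 850.0])
obs_v = np.array([284.0, 288.0, 281.0])           # assumed observations

def cressman_pass(field, radius_km):
    # observation increments: obs minus background at the nearest grid point
    ix = np.rint(obs_x / 50.0).astype(int)        # 50 km grid spacing
    iy = np.rint(obs_y / 50.0).astype(int)
    increments = obs_v - field[iy, ix]

    num = np.zeros_like(field)
    den = np.zeros_like(field)
    for dx, dy, inc in zip(obs_x, obs_y, increments):
        r2 = (X - dx) ** 2 + (Y - dy) ** 2
        w = np.where(r2 < radius_km**2, (radius_km**2 - r2) / (radius_km**2 + r2), 0.0)
        num += w * inc
        den += w
    return field + np.where(den > 0, num / den, 0.0)

analysis = background
for radius in (500.0, 300.0, 150.0):              # shrinking influence radii
    analysis = cressman_pass(analysis, radius)
print(f"analysis range: {analysis.min():.1f} .. {analysis.max():.1f} K")
```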

  6. Advanced on-site power plant development technology program

    NASA Technical Reports Server (NTRS)

    Kemp, F. S.

    1985-01-01

    A 30-cell stack was tested for 7200 hours. At 6000 hours the stack was successfully refilled with acid with no loss of performance. A second stack containing the advanced Configuration B cell package was fabricated and assembled for testing in 1985. A 200-kW brassboard inverter was successfully evaluated, verifying the two-bridge ASCR circuit design. A fuel-processing catalyst train was tested for 2000 hours, verifying the catalyst for use in a 200-kW development reformer. The development reformer was fabricated for evaluation in 1985. The initial test plan was prepared for a 200-kW verification test article.

  7. Application of optimal control theory to the design of the NASA/JPL 70-meter antenna servos

    NASA Technical Reports Server (NTRS)

    Alvarez, L. S.; Nickerson, J.

    1989-01-01

    The application of Linear Quadratic Gaussian (LQG) techniques to the design of the 70-m axis servos is described. Linear quadratic optimal control and Kalman filter theory are reviewed, and model development and verification are discussed. Families of optimal controller and Kalman filter gain vectors were generated by varying weight parameters. Performance specifications were used to select final gain vectors.
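    Generating families of gains by sweeping weight parameters, as described above, amounts to repeatedly solving the two algebraic Riccati equations. The sketch below does this for a toy second-order servo model; the plant, noise covariances, and weights are invented stand-ins, not the 70-m antenna model.

```python
# LQG gain-family generation: sweep a state weight, solve the control and filter
# Riccati equations, and tabulate the resulting LQR and Kalman gains.

import numpy as np
from scipy.linalg import solve_continuous_are

# double-integrator-like axis model: state = [angle, rate], input = torque
A = np.array([[0.0, 1.0], [0.0, -0.1]])
B = np.array([[0.0], [1.0]])
C = np.array([[1.0, 0.0]])              # angle measurement only
W = np.diag([1e-6, 1e-4])               # assumed process noise covariance
V = np.array([[1e-6]])                  # assumed measurement noise covariance
R = np.array([[1.0]])                   # control effort weight

for q_angle in (1.0, 10.0, 100.0):      # sweep the position-error weight
    Q = np.diag([q_angle, 0.1])
    P = solve_continuous_are(A, B, Q, R)
    K = np.linalg.solve(R, B.T @ P)     # LQR state-feedback gain
    S = solve_continuous_are(A.T, C.T, W, V)
    L = S @ C.T @ np.linalg.inv(V)      # Kalman filter gain
    print(f"q={q_angle:6.1f}  K={K.ravel().round(3)}  L={L.ravel().round(3)}")
```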

  8. Final Report for the ZERT Project: Basic Science of Retention Issues, Risk Assessment & Measurement, Monitoring and Verification for Geologic Sequestration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spangler, Lee; Cunningham, Alfred; Lageson, David

    2011-03-31

    ZERT has made major contributions to five main areas of sequestration science: improvement of computational tools; measurement and monitoring techniques to verify storage and track migration of CO2; development of a comprehensive performance and risk assessment framework; fundamental geophysical, geochemical and hydrological investigations of CO2 storage; and investigation of innovative, bio-based mitigation strategies.

  9. Comparison of DSMC Reaction Models with QCT Reaction Rates for Nitrogen

    DTIC Science & Technology

    2016-07-17

    Distribution A: approved for public release, distribution unlimited (PA #16299). Recovered fragments from the briefing: comparison with measurements is the final goal; validation ... model verification and parameter adjustment; four chemistry models: total collision energy (TCE), quantum kinetic (QK), vibration-dissociation favoring, ...

  10. Verification procedure for the wavefront quality of the primary mirrors for the MRO interferometer

    NASA Astrophysics Data System (ADS)

    Bakker, Eric J.; Olivares, Andres; Schmell, Reed A.; Schmell, Rodney A.; Gartner, Darren; Jaramillo, Anthony; Romero, Kelly; Rael, Andres; Lewis, Jeff

    2009-08-01

    We present the verification procedure for the 1.4 meter primary mirrors of the Magdalena Ridge Observatory Interferometer (MROI). Six mirrors are in mass production at Optical Surface Technologies (OST) in Albuquerque. The six identical parabolic mirrors will have a radius of curvature of 6300 mm and a final surface wavefront quality of 29 nm rms. The mirrors will be tested in a tower using a computer generated hologram and the Intellium™ H2000 interferometer from Engineering Synthesis Design, Inc. (ESDI). The mirror fabrication activities are currently in the early stage of polishing and have already delivered some promising results with the interferometer. A complex passive whiffle tree has been designed and fabricated by Advanced Mechanical and Optical Systems (AMOS, Belgium) that takes into account the gravity loading for an alt-alt mount. The final testing of the primary mirrors will be completed with the mirror cells that will be used in the telescopes. In addition we report on shear tests performed on the mirror cell pads on the back of the primary mirrors. These pads are glued to the mirror. The shear tests have demonstrated that the glue can withstand at least 4.9 kN, which is within the requirements.

  11. International Space Station Passive Thermal Control System Analysis, Top Ten Lessons-Learned

    NASA Technical Reports Server (NTRS)

    Iovine, John

    2011-01-01

    The International Space Station (ISS) has been on-orbit for over 10 years, and there have been numerous technical challenges along the way from design to assembly to on-orbit anomalies and repairs. The Passive Thermal Control System (PTCS) management team has been a key player in successfully dealing with these challenges. The PTCS team performs thermal analysis in support of design and verification, launch and assembly constraints, integration, sustaining engineering, failure response, and model validation. This analysis is a significant body of work and provides a unique opportunity to compile a wealth of real world engineering and analysis knowledge and the corresponding lessons-learned. The analysis lessons encompass the full life cycle of flight hardware from design to on-orbit performance and sustaining engineering. These lessons can provide significant insight for new projects and programs. Key areas to be presented include thermal model fidelity, verification methods, analysis uncertainty, and operations support.

    TRAC-PF1 code verification with data from the OTIS test facility. [Once-Through Integral System]

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Childerson, M.T.; Fujita, R.K.

    1985-01-01

    A computer code (TRAC-PF1/MOD1) developed for predicting transient thermal and hydraulic integral nuclear steam supply system (NSSS) response was benchmarked. Post-small-break loss-of-coolant accident (LOCA) data from a scaled experimental facility, designated the Once-Through Integral System (OTIS), were obtained for the Babcock and Wilcox NSSS and compared to TRAC predictions. The OTIS tests provided a challenging small-break LOCA data set for TRAC verification. The major phases of a small-break LOCA observed in the OTIS tests included pressurizer draining and loop saturation, intermittent reactor coolant system circulation, boiler-condenser mode, and the initial stages of refill. The TRAC code was successful in predicting OTIS loop conditions (system pressures and temperatures) after modification of the steam generator model. In particular, the code predicted both pool and auxiliary-feedwater-initiated boiler-condenser mode heat transfer.

  13. EVA Design, Verification, and On-Orbit Operations Support Using Worksite Analysis

    NASA Technical Reports Server (NTRS)

    Hagale, Thomas J.; Price, Larry R.

    2000-01-01

    The International Space Station (ISS) design is a very large and complex orbiting structure with thousands of Extravehicular Activity (EVA) worksites. These worksites are used to assemble and maintain the ISS. The challenge facing EVA designers was how to design, verify, and operationally support such a large number of worksites within cost and schedule. This has been solved through the practical use of computer aided design (CAD) graphical techniques that have been developed and used with a high degree of success over the past decade. The EVA design process allows analysts to work concurrently with hardware designers so that EVA equipment can be incorporated and structures configured to allow for EVA access and manipulation. Compliance with EVA requirements is strictly enforced during the design process. These techniques and procedures, coupled with neutral buoyancy underwater testing, have proven most valuable in the development, verification, and on-orbit support of planned or contingency EVA worksites.

  14. Space station software reliability analysis based on failures observed during testing at the multisystem integration facility

    NASA Technical Reports Server (NTRS)

    Tamayo, Tak Chai

    1987-01-01

    Software quality is not only vital to the successful operation of the space station; it is also an important factor in establishing testing requirements, the time needed for software verification and integration, and launch schedules for the space station. Defense of management decisions can be greatly strengthened by combining engineering judgments with statistical analysis. Unlike hardware, software has the characteristics of no wearout and costly redundancy, making traditional statistical analysis unsuitable for evaluating software reliability. A statistical model was developed to provide a representation of the number as well as the types of failures that occur during software testing and verification. From this model, quantitative measures of software reliability based on failure history during testing are derived. Criteria to terminate testing based on reliability objectives and methods to estimate the expected number of fixes required are also presented.
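    One common way to make the abstract's program concrete is a nonhomogeneous Poisson process reliability-growth fit; the Goel-Okumoto form below is a standard stand-in, not necessarily the model developed in the paper, and the failure counts and release threshold are invented.

```python
# Fit m(t) = a(1 - exp(-b t)) to cumulative failures, then estimate expected
# remaining faults and check a simple stop-testing criterion.

import numpy as np
from scipy.optimize import curve_fit

def m(t, a, b):
    """Expected cumulative failures by test time t (Goel-Okumoto NHPP)."""
    return a * (1.0 - np.exp(-b * t))

weeks = np.arange(1, 13, dtype=float)
cum_failures = np.array([8, 14, 19, 23, 26, 28, 30, 31, 32, 33, 33, 34], float)

(a_hat, b_hat), _ = curve_fit(m, weeks, cum_failures, p0=(40.0, 0.2))
remaining = a_hat - m(weeks[-1], a_hat, b_hat)
print(f"estimated total faults a = {a_hat:.1f}, detection rate b = {b_hat:.2f}/week")
print(f"expected remaining faults: {remaining:.1f}")
print("stop testing" if remaining < 1.0 else "continue testing")
```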

  15. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1975-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program Assembly Control, simulator configuration control, erasable-memory load generation, change procedures, and anomaly reporting are discussed. The primary verification tools (the all-digital simulator, the hybrid simulator, and the Iron Bird simulator) are described, as well as the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program, in which 42 flights, totaling 58 hours of flight time, were successfully made without any DFCS in-flight software or hardware failures.

  16. Low cost solar array project silicon materials task. Development of a process for high capacity arc heater production of silicon for solar arrays

    NASA Technical Reports Server (NTRS)

    Fey, M. G.

    1981-01-01

    The experimental verification system for the production of silicon via the arc heater-sodium reduction of SiCl4 was designed, fabricated, installed, and operated. Each of the attendant subsystems was checked out and operated to ensure performance requirements were met. These subsystems included the arc heaters/reactor, cooling water system, gas system, power system, control and instrumentation system, Na injection system, SiCl4 injection system, effluent disposal system, and gas burnoff system. Prior to introducing the reactants (Na and SiCl4) to the arc heater/reactor, a series of gas-only power tests was conducted to establish the operating parameters of the three arc heaters of the system. Following the successful completion of the gas-only power tests and the readiness tests of the sodium and SiCl4 injection systems, a shakedown test of the complete experimental verification system was conducted.

  17. Motivation Matters: Lessons for REDD+ Participatory Measurement, Reporting and Verification from Three Decades of Child Health Participatory Monitoring in Indonesia.

    PubMed

    Ekowati, Dian; Hofstee, Carola; Praputra, Andhika Vega; Sheil, Douglas

    2016-01-01

    Participatory Measurement, Reporting and Verification (PMRV), in the context of reducing emissions from deforestation and forest degradation with its co-benefits (REDD+) requires sustained monitoring and reporting by community members. This requirement appears challenging and has yet to be achieved. Other successful, long established, community self-monitoring and reporting systems may provide valuable lessons. The Indonesian integrated village healthcare program (Posyandu) was initiated in the 1980s and still provides effective and successful participatory measurement and reporting of child health status across the diverse, and often remote, communities of Indonesia. Posyandu activities focus on the growth and development of children under the age of five by recording their height and weight and reporting these monthly to the Ministry of Health. Here we focus on the local Posyandu personnel (kaders) and their motivations and incentives for contributing. While Posyandu and REDD+ measurement and reporting activities differ, there are sufficient commonalities to draw useful lessons. We find that the Posyandu kaders are motivated by their interests in health care, by their belief that it benefits the community, and by encouragement by local leaders. Recognition from the community, status within the system, training opportunities, competition among communities, and small payments provide incentives to sustain participation. We examine these lessons in the context of REDD+.

  18. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem

    PubMed Central

    Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-01-01

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem. PMID:29597286
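    Cross-sensor degradation of the kind reported here is usually quantified with an equal error rate (EER) computed from genuine and impostor match scores. The sketch below does exactly that on synthetic score distributions; the numbers are placeholders, not FingerPass results.

```python
# EER comparison for a same-sensor vs a cross-sensor matching condition.

import numpy as np

def eer(genuine, impostor):
    """Equal error rate from match-score samples (higher score = better match)."""
    thresholds = np.sort(np.concatenate([genuine, impostor]))
    frr = np.array([(genuine < t).mean() for t in thresholds])   # false rejects
    far = np.array([(impostor >= t).mean() for t in thresholds]) # false accepts
    i = np.argmin(np.abs(frr - far))
    return (frr[i] + far[i]) / 2.0

rng = np.random.default_rng(7)
# same-sensor enrollment/verification: well-separated score distributions
same_gen, same_imp = rng.normal(0.8, 0.08, 2000), rng.normal(0.4, 0.08, 2000)
# cross-sensor: genuine scores drop and spread, as the analysis describes
cross_gen, cross_imp = rng.normal(0.62, 0.12, 2000), rng.normal(0.4, 0.08, 2000)

print(f"same-sensor  EER: {eer(same_gen, same_imp):.2%}")
print(f"cross-sensor EER: {eer(cross_gen, cross_imp):.2%}")
```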

  19. Forecasting of cyanobacterial density in Torrão reservoir using artificial neural networks.

    PubMed

    Torres, Rita; Pereira, Elisa; Vasconcelos, Vítor; Teles, Luís Oliva

    2011-06-01

    The ability of general regression neural networks (GRNN) to forecast the density of cyanobacteria in the Torrão reservoir (Tâmega river, Portugal) over a period of 15 days, based on three years of collected physical and chemical data, was assessed. Several models were developed, and 176 were selected based on their correlation values for the verification series. A time lag of 11 was used, equivalent to one sample (periods of 15 days in the summer and 30 days in the winter). Several combinations of the series were used. Input and output data collected from three depths of the reservoir were applied (surface, euphotic zone limit, and bottom). The model with the highest average correlation achieved correlations of 0.991, 0.843, and 0.978 for the training, verification, and test series, respectively. This model had the three series independent in time: first the test series, then the verification series and, finally, the training series. Only six input variables were considered significant to the performance of this model: ammonia, phosphates, dissolved oxygen, water temperature, pH, and water evaporation, physical and chemical parameters referring to the three depths of the reservoir. These variables are common to the next four best models produced and, although these included other input variables, their performance was not better than that of the selected best model.
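    A GRNN is, at its core, a Gaussian-kernel weighted average of training targets (the Nadaraya-Watson form), which makes a minimal sketch short. The inputs, targets, and smoothing parameter below are invented placeholders, not the Torrão reservoir model.

```python
# Minimal GRNN (kernel regression) forecaster on synthetic six-variable inputs.

import numpy as np

def grnn_predict(X_train, y_train, X_query, sigma=0.5):
    """GRNN prediction: Gaussian-kernel weighted average of training targets."""
    preds = []
    for x in X_query:
        d2 = np.sum((X_train - x) ** 2, axis=1)
        w = np.exp(-d2 / (2.0 * sigma**2))
        preds.append(np.dot(w, y_train) / np.maximum(w.sum(), 1e-12))
    return np.array(preds)

rng = np.random.default_rng(3)
# six standardized inputs per sample (cf. ammonia, phosphates, O2, T, pH, evaporation)
X_train = rng.normal(size=(120, 6))
y_train = X_train[:, 3] * 2.0 + np.sin(X_train[:, 0]) + rng.normal(0, 0.1, 120)

X_test = rng.normal(size=(30, 6))
y_test = X_test[:, 3] * 2.0 + np.sin(X_test[:, 0])
y_hat = grnn_predict(X_train, y_train, X_test, sigma=0.8)
print(f"test correlation: {np.corrcoef(y_test, y_hat)[0, 1]:.3f}")
```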

  20. A Large-Scale Study of Fingerprint Matching Systems for Sensor Interoperability Problem.

    PubMed

    AlShehri, Helala; Hussain, Muhammad; AboAlSamh, Hatim; AlZuair, Mansour

    2018-03-28

    The fingerprint is a commonly used biometric modality that is widely employed for authentication by law enforcement agencies and commercial applications. The designs of existing fingerprint matching methods are based on the hypothesis that the same sensor is used to capture fingerprints during enrollment and verification. Advances in fingerprint sensor technology have raised the question about the usability of current methods when different sensors are employed for enrollment and verification; this is a fingerprint sensor interoperability problem. To provide insight into this problem and assess the status of state-of-the-art matching methods to tackle this problem, we first analyze the characteristics of fingerprints captured with different sensors, which makes cross-sensor matching a challenging problem. We demonstrate the importance of fingerprint enhancement methods for cross-sensor matching. Finally, we conduct a comparative study of state-of-the-art fingerprint recognition methods and provide insight into their abilities to address this problem. We performed experiments using a public database (FingerPass) that contains nine datasets captured with different sensors. We analyzed the effects of different sensors and found that cross-sensor matching performance deteriorates when different sensors are used for enrollment and verification. In view of our analysis, we propose future research directions for this problem.

  1. Integrating Formal Methods and Testing 2002

    NASA Technical Reports Server (NTRS)

    Cukic, Bojan

    2002-01-01

    Traditionally, qualitative program verification methodologies and program testing are studied in separate research communities. Neither alone is powerful and practical enough to provide sufficient confidence in ultra-high reliability assessment when used exclusively. Significant advances can be made by accounting not only for formal verification and program testing, but also for the impact of many other standard V&V techniques, in a unified software reliability assessment framework. The first year of this research resulted in a statistical framework that, given the assumptions on the success of the qualitative V&V and QA procedures, significantly reduces the amount of testing needed to confidently assess reliability at so-called high and ultra-high levels (10^-4 or higher). The coming years shall address methodologies to realistically estimate the impacts of various V&V techniques on system reliability and include the impact of operational risk in reliability assessment. The objectives are to: A) combine formal correctness verification, process and product metrics, and other standard qualitative software assurance methods with statistical testing, with the aim of gaining higher confidence in software reliability assessment for high-assurance applications; B) quantify the impact of these methods on software reliability; C) demonstrate that accounting for the effectiveness of these methods reduces the number of tests needed to attain a given confidence level; and D) quantify and justify the reliability estimate for systems developed using various methods.
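    The test-reduction claim at the heart of this framework can be made concrete with a small calculation: classical demonstration testing needs n >= ln(1 - C)/ln(1 - p0) failure-free tests, while encoding qualitative V&V evidence as a Beta prior on the failure probability lowers that count. The prior credit below is an invented placeholder.

```python
# Classical vs Bayesian reliability-demonstration test counts.

import math
from scipy.stats import beta

C, p0 = 0.99, 1e-4          # confidence level and target failure probability

# classical (no-prior) demonstration testing
n_classical = math.ceil(math.log(1.0 - C) / math.log(1.0 - p0))

# Bayesian: Beta(1, b0) prior on the failure probability, where b0 encodes
# prior failure-free evidence; posterior after n failure-free tests is Beta(1, b0 + n)
b0 = 20_000                  # assumed credit from formal verification and QA
n = 0
while beta.cdf(p0, 1, b0 + n) < C:   # P(p <= p0 | n failure-free tests)
    n += 1000
print(f"classical tests needed: {n_classical}")
print(f"with V&V prior:         about {n}")
```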

  2. Cleaning verification: Exploring the effect of the cleanliness of stainless steel surface on sample recovery.

    PubMed

    Haidar Ahmad, Imad A; Tam, James; Li, Xue; Duffield, William; Tarara, Thomas; Blasko, Andrei

    2017-02-05

    The parameters affecting the recovery of pharmaceutical residues from the surface of stainless steel coupons for quantitative cleaning verification method development have been studied, including active pharmaceutical ingredient (API) level, spiking procedure, API/excipient ratio, analyst-to-analyst variability, inter-day variability, and the cleaning procedure of the coupons. The lack of a well-defined procedure that consistently cleaned the coupon surface was identified as the major contributor to low and variable recoveries. Assessment of acid, base, and oxidant washes, as well as the order of treatment, showed that a base-water-acid-water-oxidizer-water wash procedure resulted in consistent, accurate spiked recovery (>90%) and reproducible results (S_rel ≤ 4%). By applying this cleaning procedure to previously used coupons that failed the cleaning acceptance criteria, multiple analysts were able to obtain consistent recoveries from day to day for different APIs and API/excipient ratios at various spike levels. We successfully applied our approach to cleaning verification of small molecules (MW < 1000 Da) as well as large biomolecules (MW up to 50,000 Da). Method robustness was greatly influenced by the sample preparation procedure, especially for analyses using total organic carbon (TOC) determination. Copyright © 2016 Elsevier B.V. All rights reserved.

  3. "Tech-check-tech": a review of the evidence on its safety and benefits.

    PubMed

    Adams, Alex J; Martin, Steven J; Stolpe, Samuel F

    2011-10-01

    The published evidence on state-authorized programs permitting final verification of medication orders by pharmacy technicians, including the programs' impact on pharmacist work hours and clinical activities, is reviewed. Some form of "tech-check-tech" (TCT)--the checking of a technician's order-filling accuracy by another technician rather than a pharmacist--is authorized for use by pharmacies in at least nine states. The results of 11 studies published since 1978 indicate that technicians' accuracy in performing final dispensing checks is very comparable to pharmacists' accuracy (mean ± S.D., 99.6% ± 0.55% versus 99.3% ± 0.68%, respectively). In 6 of those studies, significant differences in accuracy or error detection rates favoring TCT were reported (p < 0.05), although published TCT studies to date have had important limitations. In states with active or pilot TCT programs, pharmacists surveyed have reported that the practice has yielded time savings (estimates range from 10 hours per month to 1 hour per day), enabling them to spend more time providing clinical services. States permitting TCT programs require technicians to complete special training before assuming TCT duties, which are generally limited to restocking automated dispensing machines and filling unit dose batches of refills in hospitals and other institutional settings. The published evidence demonstrates that pharmacy technicians can perform as accurately as pharmacists, perhaps more accurately, in the final verification of unit dose orders in institutional settings. Current TCT programs have fairly consistent elements, including the limitation of TCT to institutional settings, advanced education and training requirements for pharmacy technicians, and ongoing quality assurance.

  4. A Survey of Measurement, Mitigation, and Verification Field Technologies for Carbon Sequestration Geologic Storage

    NASA Astrophysics Data System (ADS)

    Cohen, K. K.; Klara, S. M.; Srivastava, R. D.

    2004-12-01

    The U.S. Department of Energy's (U.S. DOE's) Carbon Sequestration Program is developing state-of-the-science technologies for measurement, mitigation, and verification (MM&V) in field operations of geologic sequestration. MM&V of geologic carbon sequestration operations will play an integral role in the pre-injection, injection, and post-injection phases of carbon capture and storage projects to reduce anthropogenic greenhouse gas emissions. Effective MM&V is critical to the success of CO2 storage projects and will be used by operators, regulators, and stakeholders to ensure safe and permanent storage of CO2. In the U.S. DOE's Program, carbon sequestration MM&V has numerous instrumental roles: measurement of a site's characteristics and capability for sequestration; monitoring of the site to ensure storage integrity; verification that the CO2 is safely stored; and protection of ecosystems. Other drivers for MM&V technology development include cost-effectiveness, measurement precision, and the frequency of measurements required. As sequestration operations are implemented in the future, it is anticipated that measurements over long time periods and at different scales will be required; this will present a significant challenge. MM&V sequestration technologies generally utilize one of the following approaches: below-ground measurements; surface/near-surface measurements; aerial and satellite imagery; and modeling/simulations. Advanced subsurface geophysical technologies will play a primary role for MM&V. It is likely that successful MM&V programs will incorporate multiple technologies, including but not limited to: reservoir modeling and simulations; geophysical techniques (a wide variety of seismic methods, microgravity, electrical, and electromagnetic techniques); subsurface fluid movement monitoring methods such as injection of tracers, borehole and wellhead pressure sensors, and tiltmeters; surface/near-surface methods such as soil gas monitoring and infrared sensors; and aerial and satellite imagery. This abstract will describe results, similarities, and contrasts for funded studies from the U.S. DOE's Carbon Sequestration Program, including examples from the Sleipner North Sea Project, the Canadian Weyburn Field/Dakota Gasification Plant Project, the Frio Formation Texas Project, and the Yolo County Bioreactor Landfill Project. The abstract will also address the following: How are the terms "measurement," "mitigation," and "verification" defined in the Program? What is the U.S. DOE's Carbon Sequestration Program Roadmap and what are the Roadmap goals for MM&V? What is the current status of MM&V technologies?

  5. TET-1- A German Microsatellite for Technology On -Orbit Verification

    NASA Astrophysics Data System (ADS)

    Föckersperger, S.; Lattner, K.; Kaiser, C.; Eckert, S.; Bärwald, W.; Ritzmann, S.; Mühlbauer, P.; Turk, M.; Willemsen, P.

    2008-08-01

    Due to the high safety standards in the space industry, every new product must go through a verification process before qualifying for operation in a space system. Within the verification process the payload undergoes a series of tests which prove that it is in accordance with mission requirements in terms of function, reliability and safety. Important verification components are the qualification for use on the ground as well as the On-Orbit Verification (OOV), i.e. proof that the product is suitable for use under real space conditions (on-orbit). Here it is demonstrated that the product functions under conditions which cannot, or can only partially, be simulated on the ground. The OOV Program of the DLR serves to bridge the gap between a product tested and qualified on the ground and its utilization in space. Due to regular and short-term availability of flight opportunities, industry and research facilities can verify their latest products under space conditions and demonstrate their reliability and marketability. The Technologie-Erprobungs-Träger TET (Technology Experiments Carrier) comprises the core elements of the OOV Program. A programmatic requirement of the OOV Program is that a satellite bus already verified in orbit be used in the first segment of the program. An analysis of suitable satellite buses showed that a realization of the TET satellite bus based on the BIRD satellite bus best fulfilled the programmatic requirements. Kayser-Threde was selected by DLR as prime contractor to perform the project together with its major subcontractors: Astro- und Feinwerktechnik, Berlin, for the platform development, and DLR-GSOC for the ground segment development. TET is now designed to be a modular and flexible micro-satellite for any orbit between 450 and 850 km altitude and any inclination between 53° and SSO. With an overall mass of 120 kg, TET is able to accommodate experiments of up to 50 kg. A multipurpose payload supply system under Kayser-Threde responsibility provides the necessary interfaces to the experiments. The first TET mission is scheduled for mid-2010. TET will be launched as a piggy-back payload on any available launcher worldwide to reduce launch cost and provide maximum flexibility. Finally, TET will provide all services required by the experimenters for a one-year mission operation to perform a successful OOV mission with its technology experiments, leading to efficient access to space for German industry and institutions.

  6. EDITORIAL: International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification

    NASA Astrophysics Data System (ADS)

    Verhaegen, Frank; Seuntjens, Jan

    2008-03-01

    Monte Carlo particle transport techniques offer exciting tools for radiotherapy research, where they play an increasingly important role. Topics of research related to clinical applications range from treatment planning, motion and registration studies, brachytherapy, verification imaging and dosimetry. The International Workshop on Monte Carlo Techniques in Radiotherapy Delivery and Verification took place in a hotel in Montreal in French Canada, from 29 May-1 June 2007, and was the third workshop to be held on a related topic, which now seems to have become a triennial event. About one hundred workers from many different countries participated in the four-day meeting. Seventeen experts in the field were invited to review topics and present their latest work. About half of the audience was made up of young graduate students. In a very full program, 57 papers were presented and 10 posters were on display during most of the meeting. On the evening of the third day a boat trip around the island of Montreal allowed participants to enjoy the city views, and to sample the local cuisine. The topics covered at the workshop included the latest developments in the most popular Monte Carlo transport algorithms, fast Monte Carlo, statistical issues, source modeling, MC treatment planning, modeling of imaging devices for treatment verification, registration and deformation of images, and a sizeable number of contributions on brachytherapy. In this volume you will find 27 short papers resulting from the workshop on a variety of topics, some of them on very new subjects such as graphics processing units for fast computing, PET modeling, dual-energy CT, calculations in dynamic phantoms, and tomotherapy devices. We acknowledge the financial support of the National Cancer Institute of Canada, the Institute of Cancer Research of the Canadian Institutes of Health Research, the Association Québécoise des Physicien(ne)s Médicaux Clinique, the Institute of Physics, and MedicalPhysicsWeb. At McGill we thank the following departments for support: the Cancer Axis of the Research Institute of the McGill University Health Center, the Faculties of Medicine and Science, the Departments of Oncology and Physics and the Medical Physics Unit. The following companies are thanked: TomoTherapy and Standard Imaging. The American Association of Physicists in Medicine and the International Atomic Energy Agency are gratefully acknowledged for endorsing the meeting. A final word of thanks goes out to all of those who contributed to the successful Workshop: first of all our administrative assistant Ms Margery Knewstubb, the website developer Dr François DeBlois, the two heads of the logistics team, Ms Emily Poon and Ms Emily Heath, our local medical physics students and staff, the IOP staff and the authors who shared their new and exciting work with us. Editors: Frank Verhaegen and Jan Seuntjens (McGill University) Associate editors: Luc Beaulieu, Iwan Kawrakow, Tony Popescu and David Rogers

  7. Southern California Edison Grid Integration Evaluation: Cooperative Research and Development Final Report, CRADA Number CRD-10-376

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mather, Barry

    2015-07-09

    The objective of this project is to use field verification to improve DOE’s ability to model and understand the impacts of, as well as develop solutions for, high penetration PV deployments in electrical utility distribution systems. The Participant will work with NREL to assess the existing distribution system at SCE facilities and assess adding additional PV systems into the electric power system.

  8. Summary electrophoretic data base on human embryonic kidney cell strain 8514

    NASA Technical Reports Server (NTRS)

    Plank, L. D.; Kunze, M. E.; Arquiza, M. V.; Morrison, D. R.; Todd, P. W.

    1985-01-01

    To properly plan the electrophoresis equipment verification test (EEVT) and continuous flow electrophoresis system (CFES) experiments with human embryonic kidney cells, first a candidate cell lot had to be chosen on the basis of electrophoretic heterogeneity, growth potential, cytogenetics, and urokinase production. Cell lot 8514 from MA Bioproducts, Inc. was chosen for this purpose, and several essential analytical electrophoresis experiments were performed to test its final suitability for these experiments.

  9. GnoSys: Raising the Level of Discourse in Programming

    DTIC Science & Technology

    2016-03-01

    Slotting these results into their proper place in the unified whole is still ongoing as of the date of this final report. ... static restrictions we place on type systems; they can be any computable property. This is in keeping with the general design philosophy of GnoSys ... different languages. Because contracts are frequently placed on module boundaries, they ease the burden placed on our analysis and verification tools.

  10. Pitting of Space Shuttle's Inconel Honeycomb Conical Seal Panel

    NASA Technical Reports Server (NTRS)

    Zimmerman, Frank; Gentz, Steven J.; Miller, James B.

    2006-01-01

    This paper describes the approach, findings, conclusions and recommendations associated with the investigation of the conical seal pitting. It documents the cause and contributing factors of the pitting, the means used to isolate each contributor, and the supporting evidence for the primary cause of the pitting. Finally, the selection, development and verification of the repair procedure used to restore the conical seal panel are described, with supporting process and metallurgical rationale for the selection.

  11. Supporting the President's Arms Control and Nonproliferation Agenda: Transparency and Verification for Nuclear Arms Reductions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Doyle, James E; Meek, Elizabeth

    2009-01-01

    The President's arms control and nonproliferation agenda is still evolving and the details of initiatives supporting it remain undefined. This means that DOE, NNSA, NA-20, NA-24 and the national laboratories can help define the agenda, and the policies and the initiatives to support it. This will require effective internal and interagency coordination. The arms control and nonproliferation agenda is broad and includes the path-breaking goal of creating conditions for the elimination of nuclear weapons. Responsibility for various elements of the agenda will be widely scattered across the interagency. Therefore an interagency mapping exercise should be performed to identify the key points of engagement within NNSA and other agencies for creating effective policy coordination mechanisms. These can include informal networks, working groups, coordinating committees, interagency task forces, etc. It will be important for NA-20 and NA-24 to get a seat at the table and a functional role in many of these coordinating bodies. The arms control and nonproliferation agenda comprises both mature and developing policy initiatives. The more mature elements, such as CTBT ratification and a follow-on strategic nuclear arms treaty with Russia, have defined milestones. However, recent press reports indicate that even the START follow-on strategic arms pact that is planned to be completed by the end of 2009 may take significantly longer and be more expansive in scope. The Russians have called for proposals to count non-deployed as well as deployed warheads. Other elements of the agenda such as FMCT, future bilateral nuclear arms reductions following a START follow-on treaty, nuclear posture changes, preparations for an international nuclear security summit, strengthened international safeguards and multilateral verification are in much earlier stages of development. For this reason any survey of arms control capabilities within the USG should be structured to address potential needs across the near-term (1-4 year) and longer-term (5-10 year) planning horizons. Some final observations include acknowledging the enduring nature of several key objectives on the Obama Administration's arms control and nonproliferation agenda. The CTBT, FMCT, bilateral nuclear arms reductions and strengthening the NPT have been sought by successive U.S. Administrations for nearly thirty years. Efforts towards negotiated arms control, although de-emphasized by the G.W. Bush Administration, have remained a pillar of U.S. national security strategy for decades and are likely to be of enduring if not increasing importance for decades to come. Therefore revitalization and expansion of USG capabilities in this area can be a positive legacy no matter what near-term arms control goals are achieved over the next four years. This is why it is important to reconstruct integrated bureaucratic, legislative, budgetary and diplomatic strategies to sustain the arms control and nonproliferation agenda. In this endeavor some past lessons must be taken to heart to avoid bureaucratic overkill and keep interagency policy-making and implementation structures lean and effective. On the technical side, a serious, sustained multilateral program to develop, down-select and performance-test nuclear weapons dismantlement verification technologies and procedures should be immediately initiated.
In order to make this happen the United States and Russia should join with the UK and other interested states in creating a sustained, full-scale research and development program for verification at their respective nuclear weapons and defense establishments. The goals include development of effective technologies and procedures for: (1) Attribute measurement systems to certify nuclear warheads and military fissile materials; (2) Chain-of-custody methods to track items after they are authenticated and enter accountability; (3) Transportation monitoring; (4) Storage monitoring; and (5) Fissile materials conversion verification. The remainder of this paper focuses on transparency and verification for nuclear arms and fissile material reductions.

  12. Documentation of a Gulf sturgeon spawning site on the Yellow River, Alabama, USA

    USGS Publications Warehouse

    Kreiser, Brian R.; Berg, J.; Randall, M.; Parauka, F.; Floyd, S.; Young, B.; Sulak, Kenneth J.

    2008-01-01

    Parauka and Giorgianni (2002) reported that potential Gulf sturgeon spawning habitat is present in the Yellow River; however, efforts to document spawning by the collection of eggs or larvae have been unsuccessful in the past. Herein, we report on the first successful collection of eggs from a potential spawning site on the Yellow River and the verification of their identity as Gulf sturgeon by using molecular methods.

  13. [Protocols for the diagnostic verification of lymph node toxoplasmosis].

    PubMed

    Carosi, G; Ghezzi, L G; Filice, G; Maccabruni, A; Parisi, A; Carnevale, G

    1983-05-31

    The protocol we have established for the diagnosis of lymph node toxoplasmosis comprises a precise sequence of tests: (1) repeated specific serological tests (I.H.A.T., I.F.A.T., IgM-I.F.A.T. on whole serum and on the purified IgM fraction); (2) lymph node biopsy for histological examination and biological testing (isolation in mice). We evaluated the effectiveness of this protocol in 20 cases observed during 1980.

  14. Nuclear Nonproliferation Ontology Assessment Team Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Strasburg, Jana D.; Hohimer, Ryan E.

    Final Report for the NA22 Simulations, Algorithm and Modeling (SAM) Ontology Assessment Team's efforts from FY09-FY11. The Ontology Assessment Team began in May 2009 and concluded in September 2011. During this two-year time frame, the Ontology Assessment Team had two objectives: (1) assessing the utility of knowledge representation and semantic technologies for addressing nuclear nonproliferation challenges; and (2) developing ontological support tools that would provide a framework for integrating across the Simulation, Algorithm and Modeling (SAM) program. The SAM Program was going through a large assessment and strategic planning effort during this time and, as a result, the relative importance of these two objectives changed, altering the focus of the Ontology Assessment Team. In the end, the team conducted an assessment of the state of the art, created an annotated bibliography, and developed a series of ontological support tools, demonstrations and presentations. A total of more than 35 individuals from 12 different research institutions participated in the Ontology Assessment Team. These included subject matter experts in several nuclear nonproliferation-related domains as well as experts in semantic technologies. Despite the diverse backgrounds and perspectives, the Ontology Assessment Team functioned very well together, and aspects of its operation could serve as a model for future inter-laboratory collaborations and working groups. While the team encountered several challenges and learned many lessons along the way, the Ontology Assessment effort was ultimately a success that led to several multi-lab research projects and opened up a new area of scientific exploration within the Office of Nuclear Nonproliferation and Verification.

  15. Imaging and Quantitation of a Succession of Transient Intermediates Reveal the Reversible Self-Assembly Pathway of a Simple Icosahedral Virus Capsid.

    PubMed

    Medrano, María; Fuertes, Miguel Ángel; Valbuena, Alejandro; Carrillo, Pablo J P; Rodríguez-Huete, Alicia; Mateu, Mauricio G

    2016-11-30

    Understanding the fundamental principles underlying supramolecular self-assembly may facilitate many developments, from novel antivirals to self-organized nanodevices. Icosahedral virus particles constitute paradigms to study self-assembly using a combination of theory and experiment. Unfortunately, assembly pathways of the structurally simplest virus capsids, those more accessible to detailed theoretical studies, have been difficult to study experimentally. We have enabled the in vitro self-assembly under close to physiological conditions of one of the simplest virus particles known, the minute virus of mice (MVM) capsid, and experimentally analyzed its pathways of assembly and disassembly. A combination of electron microscopy and high-resolution atomic force microscopy was used to structurally characterize and quantify a succession of transient assembly and disassembly intermediates. The results provided an experiment-based model for the reversible self-assembly pathway of a most simple (T = 1) icosahedral protein shell. During assembly, trimeric capsid building blocks are sequentially added to the growing capsid, with pentamers of building blocks and incomplete capsids missing one building block as conspicuous intermediates. This study provided experimental verification of many features of self-assembly of a simple T = 1 capsid predicted by molecular dynamics simulations. It also demonstrated atomic force microscopy imaging and automated analysis, in combination with electron microscopy, as a powerful single-particle approach to characterize at high resolution and quantify transient intermediates during supramolecular self-assembly/disassembly reactions. Finally, the efficient in vitro self-assembly achieved for the oncotropic, cell nucleus-targeted MVM capsid may facilitate its development as a drug-encapsidating nanoparticle for anticancer targeted drug delivery.

  16. Introduction

    NASA Astrophysics Data System (ADS)

    de Graauw, T.

    2010-01-01

    First of all, I would like to wish all of you a happy New Year, which I sincerely hope will bring you success, happiness and interesting new opportunities. For us in ALMA, the end of 2009 and the beginning of 2010 have been very exciting, and this is once more a special moment in the development of our observatory. After transporting our third antenna to the high-altitude Chajnantor plateau, at 5000 meters above sea level, our team successfully combined the outputs of these antennas using "phase closure", a standard method in interferometry. This achievement marks one more milestone along the way to the beginning of Commissioning and Science Verification, CSV, which, once completed, will mark the beginning of Early Science for ALMA. There was an official announcement about this milestone at the AAS meeting in early January and we also wanted to share this good news with you through this newsletter, which contains the content of the announcement. In another area, this newsletter reports the progress on site and presents the Atacama Compact Array (ACA). This is the second part of a two-part series on antennas, a continuation of the article in the last newsletter. The ACA plays a crucial part in the imaging of extended sources with ALMA. Without the ACA, the ability to produce accurate images would be very restricted. Finally, as you know, we like to show the human face of this great endeavour we are building, and this time we decided to highlight the Department of Technical Services, another fundamental piece working actively to make ALMA the most powerful radio observatory ever built.

  17. Isolation of novel microsatellite markers and their application for genetic diversity and parentage analyses in sika deer.

    PubMed

    Yang, Wanyun; Zheng, Junjun; Jia, Boyin; Wei, Haijun; Wang, Guiwu; Yang, Fuhe

    2018-02-15

    Every part of the sika deer (Cervus nippon) is valued in traditional Chinese medicine, and the sika deer is the most important semi-domesticated medicinal animal, widely bred in Jilin province in northeast China. Few studies, however, have characterized microsatellite markers in sika deer. We first used Illumina HiSeq™ 2500 sequencing technology to obtain 125 Mbp of sika deer genomic data. Using the microsatellite identification tool (MISA), 22,479 microsatellites were identified. From these data, 100 candidate primer pairs were selected for polymorphism validation; 76 primer pairs amplified successfully, and 29 were clearly polymorphic across 8 different individuals. Using these polymorphic microsatellite markers, we analyzed the genetic diversity of the Jilin sika deer population. Based on genotyping of blood DNA from 96 Jilin sika deer, the mean number of alleles across the 29 loci was 9.31, and the mean expected heterozygosity and polymorphic information content (PIC) were 0.72 and 0.68, respectively; 26 loci were highly polymorphic (PIC>0.50). Based on the electrophoretic results and PIC values of these 29 loci, 10 loci with a combined paternity exclusion probability >99.99% were selected for parentage verification in 16 sika deer. All offspring of a family could be successfully assigned to their biological fathers. The microsatellite markers generated in this study should greatly facilitate future studies of molecular breeding in sika deer. Copyright © 2017 Elsevier B.V. All rights reserved.
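
    To make the reported statistics concrete, polymorphic information content can be computed from allele frequencies with the standard Botstein et al. (1980) formula. The abstract does not show the calculation, so the sketch below, using a hypothetical four-allele locus, is an illustration rather than the authors' code.

    ```python
    # Minimal sketch (assumption: the Botstein et al. 1980 definition of PIC)
    # computing expected heterozygosity (He) and polymorphic information
    # content (PIC) for one locus from its allele frequencies.
    from itertools import combinations

    def he_and_pic(freqs):
        """freqs: allele frequencies at one locus, summing to 1."""
        assert abs(sum(freqs) - 1.0) < 1e-9
        he = 1.0 - sum(p * p for p in freqs)
        pic = he - sum(2.0 * (p * q) ** 2 for p, q in combinations(freqs, 2))
        return he, pic

    # Example: a hypothetical locus with four alleles.
    he, pic = he_and_pic([0.4, 0.3, 0.2, 0.1])
    print(f"He = {he:.3f}, PIC = {pic:.3f}")  # 'highly polymorphic' if PIC > 0.50
    ```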

  18. Viability Study for an Unattended UF6 Cylinder Verification Station: Phase I Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Leon E.; Miller, Karen A.; Garner, James R.

    In recent years, the International Atomic Energy Agency (IAEA) has pursued innovative techniques and an integrated suite of safeguards measures to address the verification challenges posed by the front end of the nuclear fuel cycle. Among the unattended instruments currently being explored by the IAEA is an Unattended Cylinder Verification Station (UCVS) that could provide automated, independent verification of the declared relative enrichment, 235U mass, total uranium mass and identification for all declared UF6 cylinders in a facility (e.g., uranium enrichment plants and fuel fabrication plants). Under the auspices of the United States and European Commission Support Programs to the IAEA, a project was undertaken to assess the technical and practical viability of the UCVS concept. The US Support Program team consisted of Pacific Northwest National Laboratory (PNNL, lead), Los Alamos National Laboratory (LANL), Oak Ridge National Laboratory (ORNL) and Savannah River National Laboratory (SRNL). At the core of the viability study is a long-term field trial of a prototype UCVS system at a Westinghouse fuel fabrication facility. A key outcome of the study is a quantitative performance evaluation of two nondestructive assay (NDA) methods being considered for inclusion in a UCVS: Hybrid Enrichment Verification Array (HEVA) and Passive Neutron Enrichment Meter (PNEM). This report provides context for the UCVS concept and the field trial: potential UCVS implementation concepts at an enrichment facility; an overview of the UCVS prototype design; and field trial objectives and activities. Field trial results and interpretation are presented, with a focus on the performance of PNEM and HEVA for the assay of over 200 “typical” Type 30B cylinders, and the viability of an “NDA Fingerprint” concept as a high-fidelity means to periodically verify that the contents of a given cylinder are consistent with previous scans. A modeling study, combined with field-measured instrument uncertainties, provides an assessment of the partial-defect sensitivity of HEVA and PNEM for both one-time assay and (repeated) NDA Fingerprint verification scenarios. The findings presented in this report represent a significant step forward in the community’s understanding of the strengths and limitations of the PNEM and HEVA NDA methods, and the viability of the UCVS concept in front-end fuel cycle facilities. This experience will inform Phase II of the UCVS viability study, should the IAEA pursue it.

  19. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE PAGES

    Vaccaro, S.; Gauld, I. C.; Hu, J.; ...

    2018-01-31

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false-positive alarms. Finally, the results are summarized, the sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.
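
    The go/no-go criteria the study develops amount to testing whether a Fork measurement is consistent with its ORIGEN-based prediction within combined uncertainties. A minimal sketch of such a consistency test follows; the threshold, count rates, and uncertainty values are hypothetical, not the module's actual interface.

    ```python
    import math

    def go_no_go(measured, predicted, sigma_meas, sigma_pred, z_crit=3.0):
        """Flag a measurement that deviates from the declaration-based
        prediction by more than z_crit combined standard deviations."""
        z = abs(measured - predicted) / math.hypot(sigma_meas, sigma_pred)
        return ("go" if z <= z_crit else "no-go"), z

    # Hypothetical neutron count rates (counts/s) for one assembly.
    verdict, z = go_no_go(measured=412.0, predicted=398.0,
                          sigma_meas=12.0, sigma_pred=10.0)
    print(verdict, round(z, 2))
    ```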

  20. Advancing the Fork detector for quantitative spent nuclear fuel verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Vaccaro, S.; Gauld, I. C.; Hu, J.

    The Fork detector is widely used by the safeguards inspectorate of the European Atomic Energy Community (EURATOM) and the International Atomic Energy Agency (IAEA) to verify spent nuclear fuel. Fork measurements are routinely performed for safeguards prior to dry storage cask loading. Additionally, spent fuel verification will be required at the facilities where encapsulation is performed for acceptance in the final repositories planned in Sweden and Finland. The use of the Fork detector as a quantitative instrument has not been prevalent due to the complexity of correlating the measured neutron and gamma ray signals with fuel inventories and operator declarations. A spent fuel data analysis module based on the ORIGEN burnup code was recently implemented to provide automated real-time analysis of Fork detector data. This module allows quantitative predictions of expected neutron count rates and gamma units as measured by the Fork detectors using safeguards declarations and available reactor operating data. This study describes field testing of the Fork data analysis module using data acquired from 339 assemblies measured during routine dry cask loading inspection campaigns in Europe. Assemblies include both uranium oxide and mixed-oxide fuel assemblies. More recent measurements of 50 spent fuel assemblies at the Swedish Central Interim Storage Facility for Spent Nuclear Fuel are also analyzed. An evaluation of uncertainties in the Fork measurement data is performed to quantify the ability of the data analysis module to verify operator declarations and to develop quantitative go/no-go criteria for safeguards verification measurements during cask loading or encapsulation operations. The goal of this approach is to provide safeguards inspectors with reliable real-time data analysis tools to rapidly identify discrepancies in operator declarations and to detect potential partial defects in spent fuel assemblies with improved reliability and minimal false-positive alarms. Finally, the results are summarized, the sources and magnitudes of uncertainties are identified, and the impact of analysis uncertainties on the ability to confirm operator declarations is quantified.

  1. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...

  2. Aerospace Nickel-cadmium Cell Verification

    NASA Technical Reports Server (NTRS)

    Manzo, Michelle A.; Strawn, D. Michael; Hall, Stephen W.

    2001-01-01

    During the early years of satellites, NASA successfully flew "NASA-Standard" nickel-cadmium (Ni-Cd) cells manufactured by GE/Gates/SAFT on a variety of spacecraft. In 1992 a NASA Battery Review Board determined that the strategy of a NASA Standard Cell and Battery Specification and the accompanying NASA control of a standard manufacturing control document (MCD) for Ni-Cd cells and batteries was unwarranted. As a result of that determination, standards were abandoned and the use of cells other than the NASA Standard was required. In order to gain insight into the performance and characteristics of the various aerospace Ni-Cd products available, tasks were initiated within the NASA Aerospace Flight Battery Systems Program that involved the procurement and testing of representative aerospace Ni-Cd cell designs. A standard set of test conditions was established in order to provide comparable information about the products from various vendors. The objective of this testing was to provide independent verification of representative commercial flight cells available in the marketplace today. This paper provides a summary of the verification tests run on cells from various manufacturers: Sanyo 35 Ampere-hour (Ah) standard and 35 Ah advanced Ni-Cd cells, SAFT 50 Ah Ni-Cd cells, and Eagle-Picher 21 Ah Magnum and 21 Ah Super Ni-Cd™ cells were put through a full evaluation. A limited number of 18 and 55 Ah cells from Acme Electric were also tested to provide an initial evaluation of the Acme aerospace cell designs. Additionally, 35 Ah aerospace-design Ni-MH cells from Sanyo were evaluated under the standard conditions established for this program. The test program is essentially complete. The cell design parameters, the verification test plan and the details of the test results are discussed.

  3. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verburg, J; Bortfeld, T

    Purpose: We present a new system to perform prompt gamma-ray spectroscopy during proton pencil-beam scanning treatments, which enables in vivo verification of the proton range. This system will be used for the first clinical studies of this technology. Methods: After successful pre-clinical testing of prompt gamma-ray spectroscopy, a full-scale system for clinical studies is now being assembled. Prompt gamma-rays will be detected during patient treatment using an array of 8 detector modules arranged behind a tungsten collimator. Each detector module consists of a lanthanum(III) bromide scintillator, a photomultiplier tube, and custom electronics for stable high-voltage supply and signal amplification. A new real-time data acquisition and control system samples the signals from the detectors with analog-to-digital converters, analyzes events of interest, and communicates with the beam delivery systems. The timing of the detected events was synchronized to the cyclotron radiofrequency and the pencil-beam delivery. Range verification is performed by matching measured energy- and time-resolved gamma-ray spectra to nuclear reaction models based on the clinical treatment plan. Experiments in phantoms were performed using clinical beams in order to assess the performance of the systems. Results: The experiments showed reliable real-time analysis of more than 10 million detector events per second. The individual detector modules acquired accurate energy- and time-resolved gamma-ray measurements at a rate of 1 million events per second, which is typical for beams delivered with a clinical dose rate. The data acquisition system successfully tracked the delivery of the scanned pencil-beams to determine the location of range deviations within the treatment field. Conclusion: A clinical system for proton range verification using prompt gamma-ray spectroscopy has been designed and is being prepared for use during patient treatments. We anticipate starting a first clinical study in the near future. This work was supported by the Federal Share of program income earned by Massachusetts General Hospital on C06-CA059267, Proton Therapy Research and Treatment Center.

  4. NASA's Space Launch System: Systems Engineering Approach for Affordability and Mission Success

    NASA Technical Reports Server (NTRS)

    Hutt, John J.; Whitehead, Josh; Hanson, John

    2017-01-01

    NASA is working toward the first launch of the Space Launch System, a new, unmatched capability for deep space exploration with launch readiness planned for 2019. Since program start in 2011, SLS has passed several major formal design milestones, and every major element of the vehicle has produced test and flight hardware. The SLS approach to systems engineering has been key to the program's success. Key aspects of the SLS SE&I approach include: 1) minimizing the number of requirements, 2) elimination of explicit verification requirements, 3) use of certified models of subsystem capability in lieu of requirements when appropriate and 4) certification of capability beyond minimum required capability.

  5. Verification of mesoscale objective analyses of VAS and rawinsonde data using the March 1982 AVE/VAS special network data. [Atmospheric Variability Experiment/Visible-infrared spin-scan radiometer Atmospheric Sounder

    NASA Technical Reports Server (NTRS)

    Doyle, James D.; Warner, Thomas T.

    1988-01-01

    Various combinations of VAS (Visible and Infrared Spin Scan Radiometer Atmospheric Sounder) data, conventional rawinsonde data, and gridded data from the National Weather Service's (NWS) global analysis were used in successive-correction and variational objective-analysis procedures. Analyses were produced for 0000 GMT 7 March 1982, when the VAS sounding distribution was not greatly limited by the existence of cloud cover. The successive-correction (SC) procedure was used with VAS data alone, rawinsonde data alone, and both VAS and rawinsonde data. Variational techniques were applied in three ways. Each of these techniques is discussed.
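
    The successive-correction procedure iteratively nudges a gridded first guess toward observations, typically with a shrinking radius of influence. A minimal one-dimensional sketch follows, using Cressman-type weights (an assumption; the paper does not specify its weighting function, and all grid and observation values are hypothetical).

    ```python
    import numpy as np

    def sc_pass(grid_x, field, obs_x, obs_val, radius):
        """One successive-correction pass: update each grid point with a
        distance-weighted mean of observation increments within 'radius'."""
        # Observation increments relative to the current analysis,
        # interpolated to the observation locations.
        incr = obs_val - np.interp(obs_x, grid_x, field)
        new = field.copy()
        for i, x in enumerate(grid_x):
            d2 = (obs_x - x) ** 2
            w = np.where(d2 < radius**2,
                         (radius**2 - d2) / (radius**2 + d2), 0.0)
            if w.sum() > 0:
                new[i] += (w * incr).sum() / w.sum()
        return new

    grid_x = np.linspace(0.0, 10.0, 51)
    field = np.zeros_like(grid_x)              # first-guess field
    obs_x = np.array([2.0, 5.5, 8.0])          # observation locations
    obs_val = np.array([1.0, -0.5, 0.7])       # observed values
    for r in (4.0, 2.0, 1.0):                  # shrinking radius of influence
        field = sc_pass(grid_x, field, obs_x, obs_val, r)
    ```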

  6. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  7. Test Guide for ADS-33E-PRF

    DTIC Science & Technology

    2008-07-01

    8501A (Reference 2), and from the V/STOL specification MIL-F-83300 (Reference 3). ADS-33E-PRF contains intermeshed requirements on not only short- and...While final verification will in most cases require flight testing, initial checks can be performed through analysis and on ground-based simulators...they are difficult to test, or for some reason are deficient in one or more areas. In such cases one or more alternate criteria are presented where

  8. Satellite Power Systems (SPS) concept definition study. Volume 6: SPS technology requirements and verification

    NASA Technical Reports Server (NTRS)

    Hanley, G.

    1978-01-01

    Volume 6 of the SPS Concept Definition Study is presented and also incorporates results of NASA/MSFC in-house effort. This volume includes a supporting research and technology summary. Other volumes of the final report that provide additional detail are as follows: (1) Executive Summary; (2) SPS System Requirements; (3) SPS Concept Evolution; (4) SPS Point Design Definition; (5) Transportation and Operations Analysis; and Volume 7, SPS Program Plan and Economic Analysis.

  9. Satellite power systems (SPS) concept definition study. Volume 7: SPS program plan and economic analysis, appendixes

    NASA Technical Reports Server (NTRS)

    Hanley, G.

    1978-01-01

    Three appendixes in support of Volume 7 are contained in this document. The three appendixes are: (1) Satellite Power System Work Breakdown Structure Dictionary; (2) SPS cost Estimating Relationships; and (3) Financial and Operational Concept. Other volumes of the final report that provide additional detail are: Executive Summary; SPS Systems Requirements; SPS Concept Evolution; SPS Point Design Definition; Transportation and Operations Analysis; and SPS Technology Requirements and Verification.

  10. Particle Tracking Model Transport Process Verification: Diffusion Algorithm

    DTIC Science & Technology

    2015-07-01

    sediment densities in space and time along with final particle fates (Demirbilek et al. 2004; Davies et al. 2005; McDonald et al. 2006; Lackey and McDonald 2007). Although a versatile model currently utilized in various coastal, estuarine, and riverine applications, PTM is specifically designed to...
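
    Particle-tracking diffusion is commonly implemented, and verified, as an unbiased random walk whose per-step variance matches 2·D·Δt in each dimension. PTM's actual algorithm is not given in this record, so the sketch below is a generic verification check with hypothetical values: the sample variance of the particle cloud should approach the analytical value 2·D·t.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    D, dt, steps, n = 0.5, 0.1, 200, 100_000   # diffusivity, time step, counts

    x = np.zeros(n)                            # particles released at x = 0
    for _ in range(steps):
        x += rng.normal(0.0, np.sqrt(2.0 * D * dt), size=n)

    t = steps * dt
    print(x.var(), 2.0 * D * t)  # sample variance vs. analytical 2*D*t
    ```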

  11. Wetland delineation with IKONOS high-resolution satellite imagery, Fort Custer Training Center, Battle Creek, Michigan, 2005

    USGS Publications Warehouse

    Fuller, L.M.; Morgan, T.R.; Aichele, Stephen S.

    2006-01-01

    The Michigan Army National Guard’s Fort Custer Training Center (FCTC) in Battle Creek, Mich., has the responsibility to protect wetland resources on the training grounds while providing training opportunities and planning for future development at the facility. The National Wetlands Inventory (NWI) data have been the primary wetland-boundary resource, but a check on the scale and accuracy of the wetland boundary information for the Fort Custer Training Center was needed. In cooperation with the FCTC, the U.S. Geological Survey (USGS) used an early-spring IKONOS pan-sharpened satellite image to delineate the wetlands and create a more accurate wetland map for the FCTC. The USGS tested automated approaches (supervised and unsupervised classifications) to identify the wetland areas from the IKONOS satellite image, but the automated approaches alone did not yield accurate results. To ensure accurate wetland boundaries, the final wetland map was manually digitized on the basis of the automated supervised and unsupervised classifications, in combination with NWI data, field verification, and visual interpretation of the IKONOS satellite image. The final wetland areas digitized from the IKONOS satellite imagery were similar to those in the NWI; however, the wetland boundaries differed in some areas, a few wetlands mapped in the NWI were determined from the IKONOS image and field verification not to be wetlands, and additional, previously unmapped wetlands not recognized by the NWI were identified from the IKONOS image.
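
    For the unsupervised classification the USGS tested, k-means clustering of multispectral pixel values is the textbook approach. A minimal sketch with scikit-learn follows; the band count, cluster count, and the random array standing in for real IKONOS data are all hypothetical.

    ```python
    import numpy as np
    from sklearn.cluster import KMeans

    # Hypothetical 4-band image (e.g., pan-sharpened IKONOS: B, G, R, NIR),
    # shaped (rows, cols, bands).
    image = np.random.rand(200, 300, 4)

    pixels = image.reshape(-1, 4)                  # one sample per pixel
    labels = KMeans(n_clusters=6, n_init=10,
                    random_state=0).fit_predict(pixels)
    class_map = labels.reshape(image.shape[:2])    # unsupervised class map

    # An analyst would then assign clusters to wetland/non-wetland classes
    # using NWI data and field verification, as the study describes.
    ```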

  12. Autonomy Software: V&V Challenges and Characteristics

    NASA Technical Reports Server (NTRS)

    Schumann, Johann; Visser, Willem

    2006-01-01

    The successful operation of unmanned air vehicles requires software with a high degree of autonomy. Only if high-level functions can be carried out without human control and intervention can complex missions in a changing and potentially unknown environment be carried out successfully. Autonomy software is highly mission- and safety-critical: failures caused by flaws in the software can not only jeopardize the mission, but could also endanger human life (e.g., a crash of a UAV in a densely populated area). Due to its large size, high complexity, and use of specialized algorithms (planner, constraint-solver, etc.), autonomy software poses specific challenges for its verification, validation, and certification. We have carried out a survey among researchers and scientists at NASA to study these issues. In this paper, we present major results of this study, discussing the broad spectrum of notions and characteristics of autonomy software and its challenges for design and development. A main focus of this survey was to evaluate verification and validation (V&V) issues and challenges, compared to the development of "traditional" safety-critical software. We discuss important issues in V&V of autonomous software and advanced V&V tools which can help to mitigate software risks. Results of this survey will help to identify and understand safety concerns in autonomy software and will lead to improved strategies for mitigation of these risks.

  13. Motivation Matters: Lessons for REDD+ Participatory Measurement, Reporting and Verification from Three Decades of Child Health Participatory Monitoring in Indonesia

    PubMed Central

    Ekowati, Dian; Hofstee, Carola; Praputra, Andhika Vega; Sheil, Douglas

    2016-01-01

    Participatory Measurement, Reporting and Verification (PMRV), in the context of reducing emissions from deforestation and forest degradation with its co-benefits (REDD+) requires sustained monitoring and reporting by community members. This requirement appears challenging and has yet to be achieved. Other successful, long established, community self-monitoring and reporting systems may provide valuable lessons. The Indonesian integrated village healthcare program (Posyandu) was initiated in the 1980s and still provides effective and successful participatory measurement and reporting of child health status across the diverse, and often remote, communities of Indonesia. Posyandu activities focus on the growth and development of children under the age of five by recording their height and weight and reporting these monthly to the Ministry of Health. Here we focus on the local Posyandu personnel (kaders) and their motivations and incentives for contributing. While Posyandu and REDD+ measurement and reporting activities differ, there are sufficient commonalities to draw useful lessons. We find that the Posyandu kaders are motivated by their interests in health care, by their belief that it benefits the community, and by encouragement by local leaders. Recognition from the community, status within the system, training opportunities, competition among communities, and small payments provide incentives to sustain participation. We examine these lessons in the context of REDD+. PMID:27806053

  14. Numerical implementation, verification and validation of two-phase flow four-equation drift flux model with Jacobian-free Newton–Krylov method

    DOE PAGES

    Zou, Ling; Zhao, Haihua; Zhang, Hongbin

    2016-08-24

    This study presents a numerical investigation of using the Jacobian-free Newton–Krylov (JFNK) method to solve the two-phase flow four-equation drift flux model with realistic constitutive correlations (‘closure models’). The drift flux model is based on the work of Ishii and his collaborators. Additional constitutive correlations for vertical channel flow, such as two-phase flow pressure drop, flow regime map, wall boiling and interfacial heat transfer models, were taken from the RELAP5-3D Code Manual and included to complete the model. The staggered-grid finite volume method and the fully implicit backward Euler method were used as the spatial discretization and time integration schemes, respectively. The Jacobian-free Newton–Krylov method shows no difficulty in solving the two-phase flow drift flux model with a discrete flow regime map. In addition to the Jacobian-free approach, the preconditioning matrix is obtained by using the default finite-differencing method provided in the PETSc package, and consequently the labor-intensive implementation of a complex analytical Jacobian matrix is avoided. Extensive and successful numerical verification and validation have been performed to prove the correct implementation of the models and methods. Code-to-code comparison with RELAP5-3D has further demonstrated the successful implementation of the drift flux model.
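
    SciPy ships a matrix-free Newton–Krylov solver that illustrates the JFNK idea at toy scale: the Jacobian is never formed, and Jacobian-vector products are approximated by finite differences, exactly as the method prescribes. The two-equation backward-Euler residual below is a stand-in for illustration, not the drift flux model.

    ```python
    import numpy as np
    from scipy.optimize import newton_krylov

    def residual(u):
        """Backward-Euler residual of a stiff two-equation toy system."""
        u_old = np.array([1.0, 0.0])
        dt = 0.1
        f = np.array([-50.0 * u[0] + u[1],
                      u[0] - 2.0 * u[1] ** 3])
        return u - u_old - dt * f

    # Jacobian-free: newton_krylov never forms J, only J*v products
    # approximated by finite differences inside the Krylov iteration.
    u = newton_krylov(residual, np.array([1.0, 0.0]), f_tol=1e-10)
    print(u, np.linalg.norm(residual(u)))
    ```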

  15. Is it possible to eliminate patient identification errors in medical imaging?

    PubMed

    Danaher, Luke A; Howells, Joan; Holmes, Penny; Scally, Peter

    2011-08-01

    The aim of this article is to review a system that validates and documents the process of ensuring the correct patient, correct site and side, and correct procedure (commonly referred to as the 3 C's) within medical imaging. A 4-step patient identification and procedure matching process was developed using health care and aviation models. The process was established in medical imaging departments after a successful interventional radiology pilot program. The success of the project was evaluated using compliance audit data, incident reporting data before and after the implementation of the process, and a staff satisfaction survey. There was 95% to 100% verification of site and side and 100% verification of correct patient, procedure, and consent. Correct patient data and side markers were present in 82% to 95% of cases. The number of incidents before and after the implementation of the 3 C's was difficult to assess because of a change in reporting systems and incident underreporting. More incidents are being reported, particularly "near misses." All near misses were related to incorrect patient identification stickers being placed on request forms. The majority of staff members surveyed found the process easy (55.8%), quick (47.7%), relevant (51.7%), and useful (60.9%). Although identification error is difficult to eliminate, practical initiatives can engender significant systems improvement in complex health care environments. Crown Copyright © 2011. Published by Elsevier Inc. All rights reserved.

  16. System engineering of the Atacama Large Millimeter/submillimeter Array

    NASA Astrophysics Data System (ADS)

    Bhatia, Ravinder; Marti, Javier; Sugimoto, Masahiro; Sramek, Richard; Miccolis, Maurizio; Morita, Koh-Ichiro; Arancibia, Demián.; Araya, Andrea; Asayama, Shin'ichiro; Barkats, Denis; Brito, Rodrigo; Brundage, William; Grammer, Wes; Haupt, Christoph; Kurlandczyk, Herve; Mizuno, Norikazu; Napier, Peter; Pizarro, Eduardo; Saini, Kamaljeet; Stahlman, Gretchen; Verzichelli, Gianluca; Whyborn, Nick; Yagoubov, Pavel

    2012-09-01

    The Atacama Large Millimeter/submillimeter Array (ALMA) will be composed of 66 high-precision antennas located at 5000 meters altitude in northern Chile. This paper presents the methodology, tools and processes adopted to system-engineer a project of high technical complexity, carried out by system engineering teams that are remotely located and culturally diverse, on a demanding schedule and within tight financial constraints. The technical and organizational complexity of ALMA requires a disciplined approach to the definition, implementation and verification of the ALMA requirements. During the development phase, System Engineering chairs all technical reviews and facilitates the resolution of technical conflicts. We have developed analysis tools to assess the system performance, incorporating key parameters that contribute to the ultimate performance and are modeled using best estimates and/or measured values obtained during test campaigns. Strict tracking and control of the technical budgets ensures that the different parts of the system can operate together as a whole within ALMA boundary conditions. System Engineering is responsible for acceptance of the thousands of hardware items delivered to Chile, and also supports the software acceptance process. In addition, System Engineering leads the troubleshooting efforts during the testing phases of the construction project. Finally, the team is conducting system-level verification and diagnostics activities to assess the overall performance of the observatory. This paper also shares lessons learned from these system engineering and verification approaches.

  17. Verification assessment of piston boundary conditions for Lagrangian simulation of compressible flow similarity solutions

    DOE PAGES

    Ramsey, Scott D.; Ivancic, Philip R.; Lilieholm, Jennifer F.

    2015-12-10

    This work is concerned with the use of similarity solutions of the compressible flow equations as benchmarks or verification test problems for finite-volume compressible flow simulation software. In practice, this effort can be complicated by the infinite spatial/temporal extent of many candidate solutions or “test problems.” Methods can be devised with the intention of ameliorating this inconsistency with the finite nature of computational simulation; the exact strategy will depend on the code and problem archetypes under investigation. For example, self-similar shock wave propagation can be represented in Lagrangian compressible flow simulations as rigid boundary-driven flow, even if no such “piston” is present in the counterpart mathematical similarity solution. The purpose of this work is to investigate in detail the methodology of representing self-similar shock wave propagation as a piston-driven flow in the context of various test problems featuring simple closed-form solutions of infinite spatial/temporal extent. The closed-form solutions allow for the derivation of similarly closed-form piston boundary conditions (BCs) for use in Lagrangian compressible flow solvers. Finally, the consequences of utilizing these BCs (as opposed to directly initializing the self-similar solution in a computational spatial grid) are investigated in terms of common code verification analysis metrics (e.g., shock strength/position errors and global convergence rates).
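
    One of the verification metrics named, the global convergence rate, follows from errors measured against the exact similarity solution on two grid resolutions, since e ≈ C·h^p. A minimal sketch, with hypothetical error values:

    ```python
    import math

    def observed_order(e_coarse, e_fine, refinement=2.0):
        """Observed convergence order p from errors on two grids that
        differ by the given refinement ratio, assuming e ~ C * h**p."""
        return math.log(e_coarse / e_fine) / math.log(refinement)

    # Hypothetical L1 errors vs. the closed-form similarity solution.
    print(observed_order(4.0e-3, 1.1e-3))   # ~1.86: near second order
    ```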

  18. Methodologies for Verification and Validation of Space Launch System (SLS) Structural Dynamic Models

    NASA Technical Reports Server (NTRS)

    Coppolino, Robert N.

    2018-01-01

    Responses to challenges associated with verification and validation (V&V) of Space Launch System (SLS) structural dynamics models are presented in this paper. Four methodologies addressing specific requirements for V&V are discussed. (1) Residual Mode Augmentation (RMA), which has gained acceptance by various principals in the NASA community, defines efficient and accurate FEM modal sensitivity models that are useful in test-analysis correlation and reconciliation and in parametric uncertainty studies. (2) Modified Guyan Reduction (MGR) and Harmonic Reduction (HR, introduced in 1976), developed to remedy difficulties encountered with the widely used Classical Guyan Reduction (CGR) method, are presented. MGR and HR are particularly relevant for estimation of "body-dominant" target modes of shell-type SLS assemblies that have numerous "body", "breathing" and local component constituents. Realities associated with configuration features and "imperfections" cause "body" and "breathing" mode characteristics to mix, resulting in a lack of clarity in the understanding and correlation of FEM- and test-derived modal data. (3) Mode Consolidation (MC) is a newly introduced procedure designed to effectively "de-feature" FEM and experimental modes of detailed structural shell assemblies for unambiguous estimation of "body-dominant" target modes. Finally, (4) Experimental Mode Verification (EMV) is a procedure that addresses ambiguities associated with experimental modal analysis of complex structural systems. Specifically, EMV directly separates well-defined modal data from spurious and poorly excited modal data employing newly introduced graphical and coherence metrics.
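
    Classical Guyan Reduction, the baseline that MGR and HR refine, condenses the stiffness and mass matrices onto a set of retained (master) DOFs through the static transformation T = [I; -Kss⁻¹Ksm]. A minimal sketch on a hypothetical 3-DOF spring-mass chain:

    ```python
    import numpy as np

    def guyan_reduce(K, M, masters):
        """Classical Guyan reduction of stiffness K and mass M onto the
        retained (master) DOFs via T = [I; -Kss^{-1} Ksm]."""
        n = K.shape[0]
        slaves = [i for i in range(n) if i not in masters]
        cols = list(range(len(masters)))
        T = np.zeros((n, len(masters)))
        T[masters, cols] = 1.0
        Kss = K[np.ix_(slaves, slaves)]
        Ksm = K[np.ix_(slaves, masters)]
        T[np.ix_(slaves, cols)] = -np.linalg.solve(Kss, Ksm)
        return T.T @ K @ T, T.T @ M @ T

    # Hypothetical 3-DOF spring-mass chain; retain the end DOFs 0 and 2.
    K = np.array([[ 2.0, -1.0,  0.0],
                  [-1.0,  2.0, -1.0],
                  [ 0.0, -1.0,  1.0]])
    M = np.eye(3)
    Kr, Mr = guyan_reduce(K, M, masters=[0, 2])
    print(Kr)
    ```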

  19. Method for fabrication and verification of conjugated nanoparticle-antibody tuning elements for multiplexed electrochemical biosensors.

    PubMed

    La Belle, Jeffrey T; Fairchild, Aaron; Demirok, Ugur K; Verma, Aman

    2013-05-15

    There is a critical need for more accurate, highly sensitive and specific assays for disease diagnosis and management. A novel, multiplexed, single-sensor approach using a rapid and label-free electrochemical impedance spectroscopy tuning method has been developed. The key challenge when monitoring multiple targets is frequency overlap. Here we describe methods to circumvent the overlap by tuning with nanoparticles (NPs), and discuss the fabrication and characterization methods used to develop this technique. First, sensors were fabricated using printed circuit board (PCB) technology, and nickel and gold layers were electrodeposited onto the PCB sensors. An off-chip conjugation of gold NPs to molecular recognition elements (with a verification technique) is described as well. A standard covalent immobilization of the molecular recognition elements is also discussed, with quality-control techniques. Finally, verification of sensitivity and specificity is presented. By use of gold NPs of various sizes, we have demonstrated the possibility of tuning and shown little loss of sensitivity and specificity in the molecular recognition of inflammatory markers as "model" targets for our tuning system. By selection of other sized NPs or NPs of various materials, the tuning effect can be further exploited. The novel platform technology developed could be utilized in critical care, clinical management and at-home health and disease management. Copyright © 2013 Elsevier Inc. All rights reserved.

  20. Verification assessment of piston boundary conditions for Lagrangian simulation of compressible flow similarity solutions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ramsey, Scott D.; Ivancic, Philip R.; Lilieholm, Jennifer F.

    This work is concerned with the use of similarity solutions of the compressible flow equations as benchmarks or verification test problems for finite-volume compressible flow simulation software. In practice, this effort can be complicated by the infinite spatial/temporal extent of many candidate solutions or “test problems.” Methods can be devised with the intention of ameliorating this inconsistency with the finite nature of computational simulation; the exact strategy will depend on the code and problem archetypes under investigation. For example, self-similar shock wave propagation can be represented in Lagrangian compressible flow simulations as rigid boundary-driven flow, even if no such “piston” is present in the counterpart mathematical similarity solution. The purpose of this work is to investigate in detail the methodology of representing self-similar shock wave propagation as a piston-driven flow in the context of various test problems featuring simple closed-form solutions of infinite spatial/temporal extent. The closed-form solutions allow for the derivation of similarly closed-form piston boundary conditions (BCs) for use in Lagrangian compressible flow solvers. Finally, the consequences of utilizing these BCs (as opposed to directly initializing the self-similar solution in a computational spatial grid) are investigated in terms of common code verification analysis metrics (e.g., shock strength/position errors and global convergence rates).

  1. Cleanliness verification process at Martin Marietta Astronautics

    NASA Astrophysics Data System (ADS)

    King, Elizabeth A.; Giordano, Thomas J.

    1994-06-01

    The Montreal Protocol and the 1990 Clean Air Act Amendments mandate that production of CFC-113, other chlorofluorocarbons (CFCs) and 1,1,1-trichloroethane (TCA) be banned after December 31, 1995. In response to increasing pressures, the Air Force has formulated policy that prohibits purchase of these solvents for Air Force use after April 1, 1994. In response to the Air Force policy, Martin Marietta Astronautics is in the process of eliminating all CFCs and TCA from use at the Engineering Propulsion Laboratory (EPL), located on Air Force property PJKS. Gross and precision cleaning operations are currently performed on spacecraft components at EPL. The final step of the operation is a rinse with a solvent, typically CFC-113. This solvent is then analyzed for nonvolatile residue (NVR), particle count and total filterable solids (TFS) to determine the cleanliness of the parts. The CFC-113 used in this process must be replaced in response to the above policies. Martin Marietta Astronautics, under contract to the Air Force, is currently evaluating and testing alternatives for a cleanliness verification solvent. Completion of testing is scheduled for May 1994. Evaluation of the alternative solvents follows a three-step approach. The first step is initial testing of solvents selected from literature searches and analysis. The second step is detailed testing of the top candidates from the initial test phase. The final step is implementation and validation of the chosen alternative(s). Testing will include contaminant removal, nonvolatile residue, material compatibility and propellant compatibility. Typical materials and contaminants will be tested with a wide range of solvents. Final results of the three steps will be presented, as well as the implementation plan for solvent replacement.

  2. Trends in age verification among U.S. adolescents attempting to buy cigarettes at retail stores, 2000-2009.

    PubMed

    Filippidis, Filippos T; Agaku, Israel T; Connolly, Gregory N; Vardavas, Constantine I

    2014-04-01

    This study assessed trends in age verification prior to cigarette sales to U.S. middle and high school students, and refusal to sell cigarettes to students aged <18 years, during 2000-2009. Data were obtained from the 2000-2009 National Youth Tobacco Survey. Trends during 2000-2009 were assessed using binary logistic regression (p<0.05). The proportion of all students who reported being asked to show proof of age prior to a cigarette purchase in the past 30 days did not change significantly between 2000 (46.9%) and 2009 (44.9%) (p=0.529 for linear trend). No significant trend in the proportion of students aged <18 years who were refused a sale when attempting to buy cigarettes was observed between 2000 (39.8%) and 2009 (36.7%) (p=0.283 for linear trend). Refusal of a cigarette sale was significantly higher among underage boys than among underage girls (adjusted odds ratio=1.48; 95% confidence interval: 1.28-1.70). About half of U.S. middle and high school students who reported making a cigarette purchase were not asked for proof of age, and about three in five underage buyers successfully made a cigarette purchase in 2009. Intensified implementation and enforcement of policies requiring age verification among youths is warranted to reduce access to and use of tobacco products. Copyright © 2014 Elsevier Inc. All rights reserved.
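
    The linear-trend tests reported are binary logistic regressions of each outcome on survey year. A minimal sketch with statsmodels follows; the synthetic data only illustrate the test, not the NYTS results.

    ```python
    import numpy as np
    import statsmodels.api as sm

    # Hypothetical per-respondent data: survey year and whether the
    # under-18 buyer was refused a sale (1) or not (0).
    rng = np.random.default_rng(0)
    year = rng.choice(np.arange(2000, 2010), size=2000)
    refused = rng.binomial(1, 0.38, size=2000)

    X = sm.add_constant(year - 2000)        # intercept plus linear year term
    fit = sm.Logit(refused, X).fit(disp=0)
    print(fit.params[1], fit.pvalues[1])    # slope and p-value for the trend
    ```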

  3. Performance verification and system parameter identification of spacecraft tape recorder control servo

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1979-01-01

    Design adequacy of the lead-lag compensator of the frequency loop, accuracy checking of the analytical expression for the electrical motor transfer function, and performance evaluation of the speed control servo of the digital tape recorder used on-board the 1976 Viking Mars Orbiters and Voyager 1977 Jupiter-Saturn flyby spacecraft are analyzed. The transfer functions of the most important parts of a simplified frequency loop used for test simulation are described and ten simulation cases are reported. The first four of these cases illustrate the method of selecting the most suitable transfer function for the hysteresis synchronous motor, while the rest verify and determine the servo performance parameters and alternative servo compensation schemes. It is concluded that the linear methods provide a starting point for the final verification/refinement of servo design by nonlinear time response simulation and that the variation of the parameters of the static/dynamic Coulomb friction is as expected in a long-life space mission environment.
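
    The lead-lag compensator whose design adequacy is assessed has the classic form C(s) = K(s + z)/(s + p), with z < p giving phase lead near the loop crossover. A minimal frequency-response sketch with SciPy; the gain, zero, and pole values are hypothetical, not the recorder's actual design.

    ```python
    import numpy as np
    from scipy import signal

    # Lead compensator C(s) = K*(s + z)/(s + p); z < p gives phase lead.
    K, z, p = 10.0, 2.0, 20.0
    comp = signal.TransferFunction([K, K * z], [1.0, p])

    w = np.logspace(-1, 3, 200)
    w, mag, phase = signal.bode(comp, w)
    print(f"peak phase lead: {phase.max():.1f} deg "
          f"at {w[np.argmax(phase)]:.2f} rad/s")  # near sqrt(z*p)
    ```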

  4. Wide-Field Lensing Mass Maps from Dark Energy Survey Science Verification Data

    DOE PAGES

    Chang, C.

    2015-07-29

    We present a mass map reconstructed from weak gravitational lensing shear measurements over 139 deg² from the Dark Energy Survey science verification data. The mass map probes both luminous and dark matter, thus providing a tool for studying cosmology. We also find good agreement between the mass map and the distribution of massive galaxy clusters identified using a red-sequence cluster finder. Potential candidates for superclusters and voids are identified using these maps. We measure the cross-correlation between the mass map and a magnitude-limited foreground galaxy sample and find a detection at the 6.8σ level with 20 arcmin smoothing. These measurements are consistent with simulated galaxy catalogs based on N-body simulations from a cold dark matter model with a cosmological constant. This suggests low systematic uncertainties in the map. Finally, we summarize our key findings in this Letter; the detailed methodology and tests for systematics are presented in a companion paper.
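
    The record does not spell out the reconstruction algorithm, but the standard shear-to-convergence map-maker for such data is flat-sky Kaiser-Squires inversion. A minimal sketch on a periodic grid; the random shear fields stand in for real catalogs.

    ```python
    import numpy as np

    def kaiser_squires(g1, g2):
        """Flat-sky Kaiser-Squires inversion: convergence kappa from shear."""
        ny, nx = g1.shape
        l1 = np.fft.fftfreq(nx)[None, :]
        l2 = np.fft.fftfreq(ny)[:, None]
        l2d = l1**2 + l2**2
        l2d[0, 0] = 1.0                    # avoid division by zero at l = 0
        g1h, g2h = np.fft.fft2(g1), np.fft.fft2(g2)
        kh = ((l1**2 - l2**2) * g1h + 2.0 * l1 * l2 * g2h) / l2d
        kh[0, 0] = 0.0                     # mean convergence is unconstrained
        return np.real(np.fft.ifft2(kh))

    # Hypothetical noisy shear maps on a 64x64 grid.
    kappa = kaiser_squires(np.random.randn(64, 64) * 0.03,
                           np.random.randn(64, 64) * 0.03)
    ```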

  5. Seismic behavior of a low-rise horizontal cylindrical tank

    NASA Astrophysics Data System (ADS)

    Fiore, Alessandra; Rago, Carlo; Vanzi, Ivo; Greco, Rita; Briseghella, Bruno

    2018-05-01

    Cylindrical storage tanks are widely used for various types of liquids, including hazardous contents, and thus require suitable and careful design for seismic actions. The study presented herein deals with the dynamic analysis of a ground-based horizontal cylindrical tank containing butane and with its safety verification. The analyses are based on a detailed finite element (FE) model; a simplified one-degree-of-freedom idealization is also set up and used for verification of the FE results. Particular attention is paid to sloshing and asynchronous seismic input effects. Sloshing effects are investigated according to the current state of the art in the literature. An efficient methodology based on an "impulsive-convective" decomposition of the container-fluid motion is adopted for the calculation of the seismic force. The effects of asynchronous ground motion are studied by suitable pseudo-static analyses. Comparison of the seismic action effects obtained with and without consideration of sloshing and asynchronous seismic input shows that these conditions have a significant influence on the final results.
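
    The impulsive-convective decomposition yields two liquid masses responding at different spectral accelerations; combining their base-shear contributions by SRSS, as is common in tank-design codes, gives a simple estimate. This is an illustration with hypothetical values and is not necessarily the paper's combination rule.

    ```python
    import math

    def base_shear(m_imp, m_conv, sa_imp, sa_conv):
        """Seismic base shear from impulsive and convective liquid masses
        (kg) and their spectral accelerations (m/s^2), combined by SRSS."""
        v_imp = m_imp * sa_imp
        v_conv = m_conv * sa_conv
        return math.hypot(v_imp, v_conv)

    # Hypothetical horizontal-tank values: most liquid moves impulsively.
    v = base_shear(m_imp=40e3, m_conv=12e3, sa_imp=5.0, sa_conv=1.2)
    print(v / 1e3, "kN")
    ```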

  6. Verification and Validation of the General Mission Analysis Tool (GMAT)

    NASA Technical Reports Server (NTRS)

    Hughes, Steven P.; Qureshi, Rizwan H.; Cooley, D. Steven; Parker, Joel J. K.; Grubb, Thomas G.

    2014-01-01

    This paper describes the processes and results of Verification and Validation (V&V) efforts for the General Mission Analysis Tool (GMAT). We describe the test program and environments, the tools used for independent test data, and comparison results. The V&V effort produced approximately 13,000 test scripts that are run as part of the nightly build-test process. In addition, we created approximately 3,000 automated GUI tests that are run every two weeks. Presenting all test results is beyond the scope of a single paper. Here we present high-level test results in most areas, and detailed test results for key areas. The final product of the V&V effort presented in this paper was GMAT version R2013a, the first Gold release of the software, with completely updated documentation and greatly improved quality. Release R2013a was the staging release for flight qualification performed at Goddard Space Flight Center (GSFC), ultimately resulting in GMAT version R2013b.

  7. Spitzer Space Telescope in-orbit checkout and science verification operations

    NASA Technical Reports Server (NTRS)

    Linick, Sue H.; Miles, John W.; Gilbert, John B.; Boyles, Carol A.

    2004-01-01

    Spitzer Space Telescope, the fourth and final of NASA's great observatories, and the first mission in NASA's Origins Program, was launched 25 August 2003 into an Earth-trailing solar orbit. The observatory was designed to probe and explore the universe in the infrared. Before science data could be acquired, however, the observatory had to be initialized, characterized, calibrated, and commissioned. A two-phased operations approach was defined to complete this work. These phases were identified as In-Orbit Checkout (IOC) and Science Verification (SV). Because the observatory lifetime is cryogen-limited, these operations had to be highly efficient. The IOC/SV operations design accommodated a pre-defined distributed organizational structure and a complex, cryogenic flight system. Many checkout activities were inter-dependent, and therefore the operations concept and ground data system had to provide the flexibility required for a 'short turn-around' environment. This paper describes the adaptive operations system design and evolution, implementation, and lessons learned from the completion of IOC/SV.

  8. Simulation verification techniques study: Simulation self test hardware design and techniques report

    NASA Technical Reports Server (NTRS)

    1974-01-01

    The final results of the hardware verification task are presented. The basic objectives of the various subtasks are reviewed along with the ground rules under which the overall task was conducted and which impacted the approach taken in deriving techniques for hardware self test. The results of the first subtask and the definition of simulation hardware are presented. The hardware definition is based primarily on a brief review of the simulator configurations anticipated for the shuttle training program. The results of the survey of current self test techniques are presented. The data sources that were considered in the search for current techniques are reviewed, and results of the survey are presented in terms of the specific types of tests that are of interest for training simulator applications. Specifically, these types of tests are readiness tests, fault isolation tests, and incipient fault detection techniques. The most applicable techniques were structured into software flows that are then referenced in discussions of techniques for specific subsystems.

  9. SCA security verification on wireless sensor network node

    NASA Astrophysics Data System (ADS)

    He, Wei; Pizarro, Carlos; de la Torre, Eduardo; Portilla, Jorge; Riesgo, Teresa

    2011-05-01

    Side Channel Attack (SCA) differs from traditional mathematical attacks. It bypasses exhaustive mathematical calculation and instead targets specific points in the cryptographic algorithm to reveal confidential information from running crypto-devices. Since the introduction of SCA by Paul Kocher et al. [1], it has been considered one of the most critical threats to resource-restricted but security-demanding applications, such as wireless sensor networks. In this paper, we focus on SCA-oriented security verification of WSN (wireless sensor network) nodes. A detailed setup of the platform and an analysis of the results of DPA (power attack) and EMA (electromagnetic attack) are presented. The setup follows a low-cost approach to mounting effective SCAs, while surveying the weaknesses of WSNs in resisting SCA, especially the EM attack. Finally, SCA-prevention suggestions based on a Differential Security Strategy for the FPGA hardware implementation in WSN are given, helping to achieve an improved compromise between security and cost.
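
    The DPA results referenced here rest on correlating measured traces with a key-dependent leakage model. A minimal correlation-power-analysis sketch (the Hamming-weight model on p XOR k, with the S-box omitted, is a simplification for exposition, not the authors' setup):

    ```python
    import numpy as np

    def hw(x):
        """Hamming weight of a byte."""
        return bin(x).count("1")

    def cpa_key_byte(traces, plaintext_bytes):
        """Correlation power analysis for one key byte.

        traces          : (N, T) array of measured power/EM traces
        plaintext_bytes : (N,) array of the targeted plaintext byte
        Returns the key guess whose leakage model correlates best
        with the traces at any time sample.
        """
        t = traces - traces.mean(axis=0)
        best_k, best_r = 0, 0.0
        for k in range(256):
            model = np.array([hw(p ^ k) for p in plaintext_bytes], float)
            m = model - model.mean()
            # correlation of the model with every time sample
            r = (m @ t) / (np.linalg.norm(m) * np.linalg.norm(t, axis=0) + 1e-12)
            peak = np.abs(r).max()
            if peak > best_r:
                best_k, best_r = k, peak
        return best_k, best_r
    ```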

  10. SOFIA pointing history

    NASA Astrophysics Data System (ADS)

    Kärcher, Hans J.; Kunz, Nans; Temi, Pasquale; Krabbe, Alfred; Wagner, Jörg; Süß, Martin

    2014-07-01

    The original pointing accuracy requirement of the Stratospheric Observatory for Infrared Astronomy SOFIA was defined at the beginning of the program in the late 1980s as a very challenging 0.2 arcsec rms. The early science flights of the observatory started in December 2010, and the observatory has in the meantime reached nearly 0.7 arcsec rms, which is sufficient for most of the SOFIA science instruments. NASA and DLR, the owners of SOFIA, are now planning a four-year program to bring the pointing down to the ultimate 0.2 arcsec rms. This may be the right time to recall the history of the pointing requirement and its verification: first via early computer models and wind tunnel tests, later via computer-aided end-to-end simulations, up to the first commissioning flights some years ago. The paper recollects the tools used in the different project phases for the verification of the pointing performance, explains the achievements, and may give hints for the planning of the upcoming final pointing improvement phase.

  11. Finger vein verification system based on sparse representation.

    PubMed

    Xin, Yang; Liu, Zhi; Zhang, Haixia; Zhang, Hong

    2012-09-01

    Finger vein verification is a promising biometric pattern for personal identification in terms of security and convenience. The recognition performance of this technology heavily relies on the quality of finger vein images and on the recognition algorithm. To achieve efficient recognition performance, a special finger vein imaging device is developed, and a finger vein recognition method based on sparse representation is proposed. The motivation for the proposed method is that finger vein images exhibit a sparse property. In the proposed system, the regions of interest (ROIs) in the finger vein images are segmented and enhanced. Sparse representation and sparsity preserving projection on ROIs are performed to obtain the features. Finally, the features are measured for recognition. An equal error rate of 0.017% was achieved based on the finger vein image database, which contains images that were captured by using the near-IR imaging device that was developed in this study. The experimental results demonstrate that the proposed method is faster and more robust than previous methods.
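
    The sparse-representation step at the heart of such a system follows the classic SRC recipe: represent the test feature vector as a sparse combination of training vectors and classify by per-class reconstruction residual. A minimal sketch (ROI segmentation and the paper's actual feature extraction and solver are not reproduced; the l1 solver choice below is an assumption):

    ```python
    import numpy as np
    from sklearn.linear_model import Lasso

    def src_classify(D, labels, y, alpha=0.01):
        """Sparse-representation classification (SRC) sketch.

        D      : (d, n) dictionary whose columns are training feature vectors
        labels : (n,) class label of each column
        y      : (d,) test feature vector
        Solves an l1-regularized reconstruction and assigns the class
        whose atoms yield the smallest residual.
        """
        lasso = Lasso(alpha=alpha, fit_intercept=False, max_iter=10000)
        lasso.fit(D, y)
        x = lasso.coef_
        residuals = {}
        for c in np.unique(labels):
            xc = np.where(labels == c, x, 0.0)  # keep only class-c coefficients
            residuals[c] = np.linalg.norm(y - D @ xc)
        return min(residuals, key=residuals.get)
    ```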

  12. A Process Algebraic Approach to Software Architecture Design

    NASA Astrophysics Data System (ADS)

    Aldini, Alessandro; Bernardo, Marco; Corradini, Flavio

    Process algebra is a formal tool for the specification and the verification of concurrent and distributed systems. It supports compositional modeling through a set of operators able to express concepts like sequential composition, alternative composition, and parallel composition of action-based descriptions. It also supports mathematical reasoning via a two-level semantics, which formalizes the behavior of a description by means of an abstract machine obtained from the application of structural operational rules and then introduces behavioral equivalences able to relate descriptions that are syntactically different. In this chapter, we present the typical behavioral operators and operational semantic rules for a process calculus in which no notion of time, probability, or priority is associated with actions. Then, we discuss the three most studied approaches to the definition of behavioral equivalences - bisimulation, testing, and trace - and we illustrate their congruence properties, sound and complete axiomatizations, modal logic characterizations, and verification algorithms. Finally, we show how these behavioral equivalences and some of their variants are related to each other on the basis of their discriminating power.
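
    Among the equivalences discussed, strong bisimulation admits a compact verification algorithm by partition refinement: repeatedly split blocks of states until every state in a block can reach the same blocks under the same actions. A naive sketch on a finite labeled transition system (illustrative only; real tools use the more efficient Paige-Tarjan algorithm):

    ```python
    def bisimulation_classes(states, transitions):
        """Partition refinement for strong bisimulation on a finite LTS.

        states      : iterable of state names
        transitions : set of (source, action, target) triples
        Returns the coarsest partition of states into bisimulation classes.
        """
        partition = [set(states)]
        changed = True
        while changed:
            changed = False

            # signature of a state: for each action, the blocks it can reach
            def signature(s):
                sig = set()
                for (src, a, dst) in transitions:
                    if src == s:
                        block = next(i for i, b in enumerate(partition) if dst in b)
                        sig.add((a, block))
                return frozenset(sig)

            new_partition = []
            for block in partition:
                groups = {}
                for s in block:
                    groups.setdefault(signature(s), set()).add(s)
                new_partition.extend(groups.values())
            if len(new_partition) != len(partition):
                changed = True
            partition = new_partition
        return partition
    ```

    Two states are strongly bisimilar exactly when they end up in the same block of the returned partition.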

  13. Final report on Weeks Island Monitoring Phase : 1999 through 2004.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ehgartner, Brian L.; Munson, Darrell Eugene

    2005-05-01

    This Final Report on the Monitoring Phase of the former Weeks Island Strategic Petroleum Reserve crude oil storage facility details the results of five years of monitoring of various surface-accessible quantities at the decommissioned facility. The Weeks Island mine was authorized by the State of Louisiana as a Strategic Petroleum Reserve oil storage facility from 1979 until decommissioning of the facility in 1999. Discovery of a sinkhole over the facility in 1992, with freshwater inflow to the facility, threatened the integrity of the oil storage and led to the decision to remove the oil, fill the chambers with brine, and decommission the facility. Thereafter, a monitoring phase, by agreement between the Department of Energy and the State, addressed facility stability and environmental concerns. Monitoring of the surface ground water and the brine of the underground chambers from the East Fill Hole produced no evidence of hydrocarbon contamination, which suggests that any unrecovered oil remaining in the underground chambers has been contained. The ever-diminishing progression with time of the initial major sinkhole, and of a subsequent minor sinkhole, verified the response of the sinkholes to filling of the facility with brine. Brine filling of the facility ostensibly eliminates any further growth or new formation from freshwater inflow. Continued monitoring of sinkhole response, together with continued surface surveillance for environmental problems, confirmed the intended results of brine pressurization. Surface subsidence measurements over the mine continued throughout the monitoring phase. Finally, the outward flow of brine was monitored as a measure of the creep closure of the mine chambers. Results of each of these monitoring activities are presented, with their correlation toward assuring the stability and environmental security of the decommissioned facility. The results suggest that the decommissioning was successful and no contamination of the surface environment by crude oil has been found.

  14. How to Find a Bug in Ten Thousand Lines Transport Solver? Outline of Experiences from AN Advection-Diffusion Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F.

    2011-12-01

    Almost all natural phenomena on Earth are highly nonlinear, and even simplified equations describing nature usually end up being nonlinear partial differential equations. The transport (advection-diffusion-reaction, ADR) equation is pivotal in atmospheric sciences and water quality. It must be solved numerically for practical purposes, so academics and engineers rely heavily on numerical codes; such codes therefore require verification before they are used across applications in science and engineering. Code verification is a mathematical procedure whereby a numerical code is checked to assure that the governing equation is solved as described in the design document. CFD verification is not a straightforward, well-defined course: only a complete test suite can uncover all limitations and bugs, and results must be assessed to distinguish bug-induced defects from innate limitations of a numerical scheme. As Roache (2009) noted, numerical verification is a state-of-the-art procedure, and sometimes novel tricks work out. This study conveys a synopsis of the experience we gained during a comprehensive verification of a transport solver. A test suite was designed, including unit tests and algorithmic tests, layered in complexity along several dimensions from simple to complex. Acceptance criteria were defined for the desired capabilities of the transport code, such as order of accuracy, mass conservation, handling of stiff source terms, spurious oscillation, and initial shape preservation. At the beginning, a mesh convergence study, which is the main craft of verification, was performed. To that end, analytical solutions of the ADR equation were gathered, and a new solution was derived. In more general cases, the lack of an analytical solution can be overcome through Richardson extrapolation and manufactured solutions. Two bugs that had remained concealed during the mesh convergence study were then uncovered by the method of false injection and by visualization of the results. Symmetry played a dual role: one bug was hidden by the symmetric nature of a test (and was detected afterward using artificial false injection), while self-symmetry was used to design a new test in a case where the analytical solution of the ADR equation was unknown. Assisting subroutines were designed to check and post-process mass conservation and oscillatory behavior. Finally, the capability of the solver was also checked for stiff reaction source terms. The test suite was not only a decent tool for error detection but also provided thorough feedback on the ADR solver's limitations. Such information is the crux of rigorous numerical modeling for anyone dealing with surface/subsurface pollution transport.
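
    The workhorse of the mesh convergence study is the observed order of accuracy: on grids of spacing h and h/r, the error of a p-th-order scheme shrinks by a factor of r^p, so a measured order below the formal one flags a defect, and Richardson extrapolation supplies a surrogate exact solution where no analytical one exists. A minimal sketch of both calculations:

    ```python
    import numpy as np

    def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
        """Observed order of accuracy from errors on two grids.

        For a scheme of order p, error ~ C * h**p, so
        p = log(e_coarse / e_fine) / log(r) for refinement ratio r.
        """
        return np.log(err_coarse / err_fine) / np.log(refinement_ratio)

    def richardson_extrapolate(u_coarse, u_fine, p, refinement_ratio=2.0):
        """Estimate the grid-converged value from two solutions of order p."""
        r = refinement_ratio
        return u_fine + (u_fine - u_coarse) / (r**p - 1.0)
    ```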

  15. Verification of the new detection method for irradiated spices based on microbial survival by collaborative blind trial

    NASA Astrophysics Data System (ADS)

    Miyahara, M.; Furuta, M.; Takekawa, T.; Oda, S.; Koshikawa, T.; Akiba, T.; Mori, T.; Mimura, T.; Sawada, C.; Yamaguchi, T.; Nishioka, S.; Tada, M.

    2009-07-01

    An irradiation detection method using the difference in radiation sensitivity of heat-treated microorganisms was developed as one of the microbiological detection methods for irradiated foods. The detection method is based on the difference in viable cell count before and after heat treatment (70 °C, 10 min). Verification of this method by collaborative blind trial was carried out by nine inspecting agencies in Japan. The samples used for the trial were five kinds of spices: non-irradiated, 5 kGy-irradiated, and 7 kGy-irradiated black pepper, allspice, oregano, sage, and paprika. As a result of this collaboration, a high percentage (80%) of correct answers was obtained for irradiated black pepper and allspice. However, the method was less successful for irradiated oregano, sage, and paprika. It may be possible to use this detection method for preliminary screening of irradiated foods, but further work is necessary to confirm these findings.
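
    The decision logic of such a screen can be pictured as a simple comparison of viable counts before and after the heat treatment. The sketch below is purely illustrative: the decision direction and threshold are assumptions for exposition, not the values standardized in the trial.

    ```python
    import numpy as np

    def screen_sample(count_before, count_after, log_drop_threshold=2.0):
        """Preliminary irradiation screen from microbial survival counts.

        count_before : viable cell count before heat treatment (CFU/g)
        count_after  : viable cell count after 70 degC / 10 min (CFU/g)
        Compares the log10 reduction across the heat step against an
        assumed threshold; both the direction of the test and the
        threshold value are hypothetical placeholders.
        """
        log_drop = np.log10(max(count_before, 1)) - np.log10(max(count_after, 1))
        return "suspect irradiated" if log_drop < log_drop_threshold else "no indication"
    ```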

  16. Detection of hail signatures from single-polarization C-band radar reflectivity

    NASA Astrophysics Data System (ADS)

    Kunz, Michael; Kugel, Petra I. S.

    2015-02-01

    Five different criteria that estimate hail signatures from single-polarization radar data are statistically evaluated over a 15-year period by categorical verification against loss data provided by a building insurance company. The criteria consider different levels or thresholds of radar reflectivity, some of them complemented by estimates of the 0 °C level or cloud top temperature. Applied to reflectivity data from a single C-band radar in southwest Germany, it is found that all criteria are able to reproduce most of the past damage-causing hail events. However, the criteria substantially overestimate hail occurrence by up to 80%, mainly due to the verification process using damage data. Best results in terms of the highest Heidke Skill Score (HSS) or Critical Success Index (CSI) are obtained for the Hail Detection Algorithm (HDA) and the Probability of Severe Hail (POSH). Radar-derived hail probability shows a high spatial variability with a maximum on the lee side of the Black Forest mountains and a minimum in the broad Rhine valley.
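
    Both scores derive from the 2x2 contingency table of predicted versus damage-verified hail events. A minimal sketch using the standard definitions:

    ```python
    def categorical_scores(hits, false_alarms, misses, correct_negatives):
        """Categorical verification scores from a 2x2 contingency table."""
        a, b, c, d = hits, false_alarms, misses, correct_negatives
        # Critical Success Index: hits over all events predicted or observed
        csi = a / (a + b + c)
        # Heidke Skill Score: accuracy relative to random chance
        hss = 2.0 * (a * d - b * c) / ((a + c) * (c + d) + (a + b) * (b + d))
        return csi, hss
    ```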

  17. Execution of the Spitzer In-orbit Checkout and Science Verification Plan

    NASA Technical Reports Server (NTRS)

    Miles, John W.; Linick, Susan H.; Long, Stacia; Gilbert, John; Garcia, Mark; Boyles, Carole; Werner, Michael; Wilson, Robert K.

    2004-01-01

    The Spitzer Space Telescope is an 85-cm telescope with three cryogenically cooled instruments. Following launch, the observatory was initialized and commissioned for science operations during the in-orbit checkout (IOC) and science verification (SV) phases, carried out over a total of 98.3 days. The execution of the IOC/SV mission plan progressively established Spitzer capabilities taking into consideration thermal, cryogenic, optical, pointing, communications, and operational designs and constraints. The plan was carried out with high efficiency, making effective use of cryogen-limited flight time. One key component to the success of the plan was the pre-launch allocation of schedule reserve in the timeline of IOC/SV activities, and how it was used in flight both to cover activity redesign and growth due to continually improving spacecraft and instrument knowledge, and to recover from anomalies. This paper describes the adaptive system design and evolution, implementation, and lessons learned from IOC/SV operations. It is hoped that this information will provide guidance to future missions with similar engineering challenges.

  18. Generating Models of Infinite-State Communication Protocols Using Regular Inference with Abstraction

    NASA Astrophysics Data System (ADS)

    Aarts, Fides; Jonsson, Bengt; Uijen, Johan

    In order to facilitate model-based verification and validation, effort is underway to develop techniques for generating models of communication system components from observations of their external behavior. Most previous such work has employed regular inference techniques which generate modest-size finite-state models. They typically suppress parameters of messages, although these have a significant impact on control flow in many communication protocols. We present a framework, which adapts regular inference to include data parameters in messages and states for generating components with large or infinite message alphabets. A main idea is to adapt the framework of predicate abstraction, successfully used in formal verification. Since we are in a black-box setting, the abstraction must be supplied externally, using information about how the component manages data parameters. We have implemented our techniques by connecting the LearnLib tool for regular inference with the protocol simulator ns-2, and generated a model of the SIP component as implemented in ns-2.

  19. Design and Verification of a Distributed Communication Protocol

    NASA Technical Reports Server (NTRS)

    Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.

  20. Hierarchical specification of the SIFT fault tolerant flight control system

    NASA Technical Reports Server (NTRS)

    Melliar-Smith, P. M.; Schwartz, R. L.

    1981-01-01

    The specification and mechanical verification of the Software Implemented Fault Tolerance (SIFT) flight control system is described. The methodology employed in the verification effort is discussed, and a description of the hierarchical models of the SIFT system is given. To meet the objective of NASA for the reliability of safety critical flight control systems, the SIFT computer must achieve a reliability well beyond the levels at which reliability can be actually measured. The methodology employed to demonstrate rigorously that the SIFT computer meets its reliability requirements is described. The hierarchy of design specifications from very abstract descriptions of system function down to the actual implementation is explained. The most abstract design specifications can be used to verify that the system functions correctly and with the desired reliability, since almost all details of the realization were abstracted out. A succession of lower level models refine these specifications to the level of the actual implementation, and can be used to demonstrate that the implementation has the properties claimed of the abstract design specifications.

  1. Review of Large Spacecraft Deployable Membrane Antenna Structures

    NASA Astrophysics Data System (ADS)

    Liu, Zhi-Quan; Qiu, Hui; Li, Xiao; Yang, Shu-Li

    2017-11-01

    The demand for large antennas in future space missions has increasingly stimulated the development of deployable membrane antenna structures owing to their light weight and small stowage volume. However, there is little literature providing a comprehensive review and comparison of different membrane antenna structures. Space-borne membrane antenna structures are mainly classified as either parabolic or planar membrane antenna structures. For parabolic membrane antenna structures, there are five deploying and forming methods, including inflation, inflation-rigidization, elastic-rib-driven, Shape Memory Polymer (SMP)-inflation, and electrostatic forming. The development and detailed comparison of these five methods are presented. Then, properties of membrane materials (including polyester film and polyimide film) for parabolic membrane antennas are compared. Additionally, for planar membrane antenna structures, frame shapes have changed from circular to rectangular, and different tensioning systems have emerged successively, including single Miura-Natori, double, and multi-layer tensioning systems. Recent advances in structural configurations, tensioning system design, and dynamic analysis for planar membrane antenna structures are investigated. Finally, future trends for large space membrane antenna structures are pointed out and open technical problems are identified, including design and analysis of membrane structures, materials and processes, membrane packing, surface accuracy stability, and test and verification technology. Through this review of large deployable membrane antenna structures, guidance for space membrane-antenna research and applications is provided.

  2. Particle Release Experiment (PRex) Final Report

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Keillor, Martin E.; Arrigo, Leah M.; Detwiler, Rebecca S.

    2014-09-30

    An experiment to release radioactive particles representative of small-scale venting from an underground nuclear test was conducted to gather data in support of treaty verification and monitoring activities. For this experiment, a CO2-driven "air cannon" was used to release La-140 at ambient temperatures. Lanthanum-140 was chosen to represent the fission fragments because of its short half-life and prominent gamma-ray emissions; the choice was also influenced by the successful production and use of La-140 with low levels of radioactive contaminants in a Defence Research and Development Canada field trial. The source was created through activation of high-purity natural lanthanum oxide at the reactor of Washington State University, Pullman, Washington. Multiple varieties of witness plates and air samplers were laid in an irregular grid covering the area over which the plume was modeled to deposit. Aerial survey, a NaI(Tl) mobile spectrometer, and handheld and backpack instruments ranging from polyvinyl toluene to high-purity germanium were used to survey the plume. Additionally, three varieties of soil sampling were investigated. The relative sensitivity and utility of the sampling and survey methods are discussed in the context of On-Site Inspection. The measurements and samples show a high degree of correlation and form a valuable set of test data.

  3. All fiber optics circular-state swept source polarization-sensitive optical coherence tomography.

    PubMed

    Lin, Hermann; Kao, Meng-Chun; Lai, Chih-Ming; Huang, Jyun-Cin; Kuo, Wen-Chuan

    2014-02-01

    A swept source (SS)-based circular-state (CS) polarization-sensitive optical coherence tomography (PS-OCT) system constructed entirely with polarization-maintaining fiber optics components is proposed and experimentally verified. By means of the proposed calibration scheme, bulk quarter-wave plates can be replaced by fiber optics polarization controllers to realize an all-fiber-optics CS SS PS-OCT. We also present a numerical dispersion compensation method, which not only enhances the axial resolution but also improves the signal-to-noise ratio of the images. We demonstrate that this compact and portable CS SS PS-OCT system, with an accuracy comparable to bulk optics systems, requires less stringent lens alignment and can possibly serve as a technology for realizing PS-OCT instruments for clinical applications (e.g., endoscopy). The largest deviations in phase retardation (PR) and fast-axis (FA) angle due to the sample probe, under linear scanning and rotation angles smaller than 65 deg, were of the same order as those in stationary probe setups. The influence of fiber bending on the measured PR and FA is also investigated. The largest deviation of the PR was 3.5 deg, and the measured FA changed by ~12 to 21 deg. Finally, in vivo imaging of the human fingertip and nail was successfully demonstrated with a linear scanning probe.

  4. James Webb Space Telescope (JWST) Integrated Science Instruments Module (ISIM) Cryo-Vacuum (CV) Test Campaign Summary

    NASA Technical Reports Server (NTRS)

    Yew, Calinda; Lui, Yan; Whitehouse, Paul; Banks, Kimberly

    2016-01-01

    JWST Integrated Science Instruments Module (ISIM) completed its system-level space simulation testing program at the NASA Goddard Space Flight Center (GSFC). In March 2016, ISIM was successfully delivered to the next level of integration with the Optical Telescope Element (OTE), to form OTIS (OTE + ISIM), after concluding a series of three cryo-vacuum (CV) tests. During these tests, the complexity of the mission has generated challenging requirements that demand highly reliable system performance and capabilities from the Space Environment Simulator (SES) vacuum chamber. The first test served as a risk reduction test; the second test provided the initial verification of the fully-integrated flight instruments; and the third test verified the system in its final flight configuration following mechanical environmental tests (vibration and acoustics). From one test to the next, shortcomings of the facility were uncovered and associated improvements in operational capabilities and reliability of the facility were required to enable the project to verify system-level requirements. This paper: (1) provides an overview of the integrated mechanical and thermal facility systems required to achieve the objectives of JWST ISIM testing, (2) compares the overall facility performance and instrumentation results from the three ISIM CV tests, and (3) summarizes lessons learned from the ISIM testing campaign.

  5. Robot-Assisted Fracture Surgery: Surgical Requirements and System Design.

    PubMed

    Georgilas, Ioannis; Dagnino, Giulio; Tarassoli, Payam; Atkins, Roger; Dogramadzi, Sanja

    2018-03-09

    The design of medical devices is a complex and crucial process to ensure patient safety. It has been shown that improperly designed devices lead to errors and associated accidents and costs. A key element for a successful design is incorporating the views of the primary and secondary stakeholders early in the development process. They provide insights into current practice and point out specific issues with the current processes and equipment in use. This work presents how information from a user-study conducted in the early stages of the RAFS (Robot Assisted Fracture Surgery) project informed the subsequent development and testing of the system. The user needs were captured using qualitative methods and converted to operational, functional, and non-functional requirements based on the methods derived from product design and development. This work presents how the requirements inform a new workflow for intra-articular joint fracture reduction using a robotic system. It is also shown how the various elements of the system are developed to explicitly address one or more of the requirements identified, and how intermediate verification tests are conducted to ensure conformity. Finally, a validation test in the form of a cadaveric trial confirms the ability of the designed system to satisfy the aims set by the original research question and the needs of the users.

  6. Induced polarization for characterizing and monitoring soil stabilization processes

    NASA Astrophysics Data System (ADS)

    Saneiyan, S.; Ntarlagiannis, D.; Werkema, D. D., Jr.

    2017-12-01

    Soil stabilization is critical in addressing engineering problems related to building foundation support, road construction, and soil erosion, among others. To increase soil strength, the stiffness of the soil is enhanced through injection/precipitation of chemical agents or minerals. Methods such as cement injection and microbially induced carbonate precipitation (MICP) are commonly applied. Verification of a successful soil stabilization project is often challenging, as treatment areas are spatially extensive and invasive sampling is expensive, time consuming, and limited to sporadic points at discrete times. The geophysical method of complex conductivity (CC) is sensitive to mineral surface properties and hence a promising method for monitoring soil stabilization projects. Previous laboratory work has established the sensitivity of CC to MICP processes. We performed an MICP soil stabilization project and collected CC data for the duration of the treatment (15 days). Subsurface images show small but very clear changes in the area of MICP treatment; the changes observed fully agree with the bio-geochemical monitoring and previous laboratory experiments. Our results strongly suggest that CC is sensitive to field MICP treatments. Finally, our results show that good quality data alone are not adequate for the correct interpretation of field CC data, at least when the signals are low. Informed data processing routines and inverse modeling parameters are required to produce optimal results.

  7. An adaptable, low cost test-bed for unmanned vehicle systems research

    NASA Astrophysics Data System (ADS)

    Goppert, James M.

    2011-12-01

    An unmanned vehicle systems test-bed has been developed. The test-bed has been designed to accommodate hardware changes and various vehicle types and algorithms. The creation of this test-bed allows research teams to focus on algorithm development and employ a common, well-tested experimental framework. The ArduPilotOne autopilot was developed to provide the necessary level of abstraction for multiple vehicle types. The autopilot was also designed to be highly integrated with the Mavlink protocol for Micro Air Vehicle (MAV) communication. Mavlink is the native protocol for QGroundControl, a MAV ground control program. Features were added to QGroundControl to accommodate outdoor usage. Next, the Mavsim toolbox was developed for Scicoslab to allow hardware-in-the-loop testing, control design and analysis, and estimation algorithm testing and verification. In order to obtain linear models of aircraft dynamics, the JSBSim flight dynamics engine was extended to use a probabilistic Nelder-Mead simplex method. The JSBSim aircraft dynamics were compared with collected wind-tunnel data. Finally, a structured methodology for successive loop closure control design is proposed. This methodology is demonstrated along with the rest of the test-bed tools on a quadrotor, a fixed wing RC plane, and a ground vehicle. Test results for the ground vehicle are presented.

  8. Identification of DEP domain-containing proteins by a machine learning method and experimental analysis of their expression in human HCC tissues

    NASA Astrophysics Data System (ADS)

    Liao, Zhijun; Wang, Xinrui; Zeng, Yeting; Zou, Quan

    2016-12-01

    The Dishevelled/EGL-10/Pleckstrin (DEP) domain-containing (DEPDC) proteins have seven members. However, whether this superfamily can be distinguished from other proteins based only on amino acid sequences remains unknown. Here, we describe a computational method to segregate DEPDCs and non-DEPDCs. First, we examined the Pfam numbers of the known DEPDCs and used the longest sequences for each Pfam to construct a phylogenetic tree. Subsequently, we extracted 188-dimensional (188D) and 20D features of DEPDCs and non-DEPDCs and classified them with a random forest classifier. We also mined the motifs of human DEPDCs to find related domains. Finally, we designed experimental methods to verify human DEPDC expression at the mRNA level in hepatocellular carcinoma (HCC) and adjacent normal tissues. The phylogenetic analysis showed that the DEPDC superfamily can be divided into three clusters. Moreover, the 188D and 20D features can both be used to effectively distinguish the two protein types. Motif analysis revealed that the DEP and RhoGAP domains were common in human DEPDCs; both human HCC and the adjacent tissues widely expressed DEPDCs, although their regulation was not identical. In conclusion, we successfully constructed a binary classifier for DEPDCs and experimentally verified their expression in human HCC tissues.
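
    The classification step is a standard supervised-learning recipe: extract fixed-length feature vectors (188D or 20D here) and train a random forest to separate the two classes. A minimal sketch with placeholder random data standing in for the extracted protein features:

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_score

    # X: feature matrix (e.g., 188-dimensional composition/physicochemical
    # features per sequence); y: 1 for DEPDC, 0 for non-DEPDC.
    # Random placeholders below stand in for the paper's extracted features.
    rng = np.random.default_rng(0)
    X = rng.random((200, 188))
    y = rng.integers(0, 2, 200)

    clf = RandomForestClassifier(n_estimators=500, random_state=0)
    scores = cross_val_score(clf, X, y, cv=5)  # 5-fold cross-validated accuracy
    print(f"mean CV accuracy: {scores.mean():.3f}")
    ```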

  9. Spaced-based Cosmic Ray Astrophysics

    NASA Astrophysics Data System (ADS)

    Seo, Eun-Suk

    2016-03-01

    The bulk of cosmic ray data has been obtained with great success by balloon-borne instruments, particularly with NASA's long duration flights over Antarctica. More recently, PAMELA on a Russian satellite and AMS-02 on the International Space Station (ISS) started providing exciting measurements of particles and anti-particles with unprecedented precision up to TeV energies. In order to address open questions in cosmic ray astrophysics, future missions require spaceflight exposures for rare species, such as isotopes, ultra-heavy elements, and high (the "knee" and above) energies. Isotopic composition measurements up to about 10 GeV/nucleon, which are critical for understanding interstellar propagation and the origin of the elements, are still to be accomplished. The cosmic ray composition in the knee (PeV) region holds a key to understanding the origin of cosmic rays. Just last year, the JAXA-led CALET ISS mission and the DAMPE Chinese satellite were launched. NASA's ISS-CREAM completed its final verification at GSFC and was delivered to KSC to await launch on SpaceX. In addition, a EUSO-like mission for ultrahigh energy cosmic rays and an HNX-like mission for ultraheavy nuclei could accomplish a vision for a cosmic ray observatory in space. Strong support of NASA's Explorer Program category of payloads would be needed for completion of these missions over the next decade.

  10. Five biomedical experiments flown in an Earth orbiting laboratory: Lessons learned from developing these experiments on the first international microgravity mission from concept to landing

    NASA Technical Reports Server (NTRS)

    Winget, C. M.; Lashbrook, J. J.; Callahan, P. X.; Schaefer, R. L.

    1993-01-01

    There are numerous problems associated with accommodating complex biological systems in microgravity in the flexible laboratory systems installed in the Orbiter cargo bay. This presentation will focus upon some of the lessons learned along the way from the University laboratory to the IML-1 Microgravity Laboratory. The First International Microgravity Laboratory (IML-1) mission contained a large number of specimens, including: 72 million nematodes, US-1; 3 billion yeast cells, US-2; 32 million mouse limb-bud cells, US-3; and 540 oat seeds (96 planted), FOTRAN. All five of the experiments had to undergo significant redevelopment effort in order to allow the investigator's ideas and objectives to be accommodated within the constraints of the IML-1 mission. Each of these experiments were proposed as unique entities rather than part of the mission, and many procedures had to be modified from the laboratory practice to meet IML-1 constraints. After a proposal is accepted by NASA for definition, an interactive process is begun between the Principal Investigator and the developer to ensure a maximum science return. The success of the five SLSPO-managed experiments was the result of successful completion of all preflight biological testing and hardware verification finalized at the KSC Life Sciences Support Facility housed in Hangar L. The ESTEC Biorack facility housed three U.S. experiments (US-1, US-2, and US-3). The U.S. Gravitational Plant Physiology Facility housed GTHRES and FOTRAN. The IML-1 mission (launched from KSC on 22 Jan. 1992, and landed at Dryden Flight Research Facility on 30 Jan. 1992) was an outstanding success--close to 100 percent of the prelaunch anticipated science return was achieved and, in some cases, greater than 100 percent was achieved (because of an extra mission day).

  11. Intermediate experimental vehicle, ESA program aerodynamics-aerothermodynamics key technologies for spacecraft design and successful flight

    NASA Astrophysics Data System (ADS)

    Dutheil, Sylvain; Pibarot, Julien; Tran, Dac; Vallee, Jean-Jacques; Tribot, Jean-Pierre

    2016-07-01

    With the aim of placing Europe among the world's space players in the strategic area of atmospheric re-entry, several studies on experimental vehicle concepts and improvements of critical re-entry technologies have paved the way for the flight of an experimental spacecraft. The successful flight of the Intermediate eXperimental Vehicle (IXV), under ESA's Future Launchers Preparatory Programme (FLPP), is definitively a significant step forward from the Atmospheric Reentry Demonstrator flight (1998), establishing Europe as a key player in this field. The IXV project objectives were the design, development, manufacture, and ground and flight verification of an autonomous European lifting and aerodynamically controlled re-entry system, which is highly flexible and maneuverable. The paper presents the role of aerodynamics and aerothermodynamics as key technologies for designing an atmospheric re-entry spacecraft and securing a successful flight.

  12. Cassini's Maneuver Automation Software (MAS) Process: How to Successfully Command 200 Navigation Maneuvers

    NASA Technical Reports Server (NTRS)

    Yang, Genevie Velarde; Mohr, David; Kirby, Charles E.

    2008-01-01

    To keep Cassini on its complex trajectory, more than 200 orbit trim maneuvers (OTMs) have been planned from July 2004 to July 2010. With only a few days between many of these OTMs, the operations process of planning and executing the necessary commands had to be automated. The resulting Maneuver Automation Software (MAS) process minimizes the workforce required for, and maximizes the efficiency of, the maneuver design and uplink activities. The MAS process is a well-organized and logically constructed interface between Cassini's Navigation (NAV), Spacecraft Operations (SCO), and Ground Software teams. Upon delivery of an orbit determination (OD) from NAV, the MAS process can generate a maneuver design and all related uplink and verification products within 30 minutes. To date, all 112 OTMs executed by the Cassini spacecraft have been successful. MAS was even used to successfully design and execute a maneuver while the spacecraft was in safe mode.

  13. The Seismic Aftershock Monitoring System (SAMS) for OSI - Experiences from IFE14

    NASA Astrophysics Data System (ADS)

    Gestermann, Nicolai; Sick, Benjamin; Häge, Martin; Blake, Thomas; Labak, Peter; Joswig, Manfred

    2016-04-01

    An on-site inspection (OSI) is the third of four elements of the verification regime of the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The sole purpose of an OSI is to confirm whether a nuclear weapon test explosion or any other nuclear explosion has been carried out in violation of the treaty, and to gather any facts which might assist in identifying any possible violator. It thus constitutes the final verification measure under the CTBT if all other available measures are not able to confirm the nature of a suspicious event. The Provisional Technical Secretariat (PTS) carried out the Integrated Field Exercise 2014 (IFE14) in the Dead Sea area of Jordan from 3 November to 9 December 2014. It was a fictitious OSI whose aim was to test the inspection capabilities in an integrated manner. The technologies allowed during an OSI are listed in the Treaty. The aim of the Seismic Aftershock Monitoring System (SAMS) is to detect and localize low-magnitude aftershocks of the triggering event, or collapses of underground cavities. The locations of these events are expected in the vicinity of a possible previous explosion and help to narrow down the search area within an inspection area (IA) of an OSI. The success of SAMS depends on its main elements: hardware, software, deployment strategy, the search logic and, not least, the effective use of personnel. All elements of SAMS were tested and improved during the Build-Up Exercises (BUE), which took place in Austria and Hungary. IFE14 provided more realistic climatic and hazardous terrain conditions with limited resources. Significant variations in the topography of the IA of IFE14, in the mountainous Dead Sea area of Jordan, led to considerable challenges which were not expected from the experiences encountered during the BUE. SAMS uses mini-arrays with an aperture of about 100 meters and a total of four elements. The station network deployed during IFE14 and results of the data analysis will be presented. Possible aftershocks of the triggering event are expected in a very low magnitude range; therefore the detection threshold of the network is one of the key parameters of SAMS and crucial for the success of the monitoring. One of the objectives was to record magnitude values down to -2.0 ML. The threshold values have been compared with historical seismicity in the region and those monitored during IFE14. Results of the threshold detection estimation and experiences from the exercise will be presented.

  14. Combination of an electrolytic pretreatment unit with secondary water reclamation processes

    NASA Technical Reports Server (NTRS)

    Wells, G. W.; Bonura, M. S.

    1973-01-01

    The design and fabrication of a flight concept prototype electrolytic pretreatment unit (EPU) and of a contractor-furnished air evaporation unit (AEU) are described. The integrated EPU and AEU potable water recovery system is referred to as the Electrovap and is capable of processing the urine and flush water of a six-man crew. Results of a five-day performance verification test of the Electrovap system are presented and plans are included for the extended testing of the Electrovap to produce data applicable to the combination of electrolytic pretreatment with most final potable water recovery systems. Plans are also presented for a program to define the design requirements for combining the electrolytic pretreatment unit with a reverse osmosis final processing unit.

  15. A Limited Flight Test Investigation of Pilot-Induced Oscillation Due to Elevator Rate Limiting (HAVE LIMITS)

    DTIC Science & Technology

    1997-06-01

    [Report documentation page garbled by OCR; little is recoverable beyond the report type and date (Final report, June 1997) and that the report presents the results and analyses of the flight test.]

  16. Lay out, test verification and in orbit performance of HELIOS a temperature control system

    NASA Technical Reports Server (NTRS)

    Brungs, W.

    1975-01-01

    The HELIOS temperature control system is described. The main design features and the impact of interactions between experiment, spacecraft system, and temperature control system requirements on the design are discussed. The major limitations of the thermal design regarding a closer sun approach are given and related to test experience and performance data obtained in orbit. Finally, the validity of the test results achieved with the prototype and flight spacecraft is evaluated by comparison between test data, orbit temperature predictions, and flight data.

  17. Development and analysis of closed cycle circulator elements. Final report 31 Jul 1978-31 May 1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shih, C.C.; Karr, G.R.; Perkins, J.F.

    1980-05-01

    A series of experiments with various flow rates of laser gas and coolants under several levels of energy inputs has been conducted on the Army Closed Cycle Circulator for pulsed EDL to collect sufficient data for flow calibration and coefficient determination. Verification of the theoretical models depicting the functions of the heat exchangers in maintaining the thermal balance in the flow through the steady and transient states is made through comparison with results of the experimental analysis.

  18. Signature extension for spectral variation in soils, volume 4

    NASA Technical Reports Server (NTRS)

    Berry, J. K.; Smith, J. A.; Jonranson, K.

    1976-01-01

    The reduced 1975-1976 field data at Garden City, Kansas are presented. These data are being used to evaluate the SRVC model predictions, to compare the ERIM-SUITS model with both the SRVC results and field data, and finally, to provide a data base for reviewing multitemporal trajectories. In particular, the applicability of the tasselled cap transformation is reviewed. The first detailed verification of this approach utilizing actual field measured data from the LACIE field measurement program, rather than LANDSAT data, is given.

  19. Development of a Multifuel Individual/Squad Stove

    DTIC Science & Technology

    1990-02-01

    [Report documentation page garbled by OCR. Recoverable fragments: Technical Report NATICK/TR-90/020, Development of a Multifuel Individual/Squad Stove; references a Final Letter Report, Fix Verification Test of the MISS, U.S. Army CRTC, April 1989, and a Health Hazard Assessment, 4 April 1989.]

  20. Design of a modular digital computer system DRL 4 and 5. [design of airborne/spaceborne computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions for the register level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and design verification plan, were formalized to insure a soundly integrated design of the digital computer system.

  1. NAS Parallel Benchmarks. 2.4

    NASA Technical Reports Server (NTRS)

    VanderWijngaart, Rob; Biegel, Bryan A. (Technical Monitor)

    2002-01-01

    We describe a new problem size, called Class D, for the NAS Parallel Benchmarks (NPB), whose MPI source code implementation is being released as NPB 2.4. A brief rationale is given for how the new class is derived. We also describe the modifications made to the MPI (Message Passing Interface) implementation to allow the new class to be run on systems with 32-bit integers, and with moderate amounts of memory. Finally, we give the verification values for the new problem size.
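
    The verification values are consumed by a simple acceptance test: a run passes if its computed result matches the published reference within a small relative tolerance. A minimal sketch (the tolerance shown is an illustrative assumption, not the one hard-coded in NPB):

    ```python
    def verify_benchmark(computed, reference, rel_tol=1e-8):
        """Check a benchmark's computed value against its verification value.

        A run is accepted if the relative error against the published
        reference value is below a small tolerance.
        """
        rel_err = abs(computed - reference) / abs(reference)
        return rel_err <= rel_tol
    ```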

  2. VARTM Model Development and Verification

    NASA Technical Reports Server (NTRS)

    Cano, Roberto J. (Technical Monitor); Dowling, Norman E.

    2004-01-01

    In this investigation, a comprehensive Vacuum Assisted Resin Transfer Molding (VARTM) process simulation model was developed and verified. The model incorporates resin flow through the preform, compaction and relaxation of the preform, and viscosity and cure kinetics of the resin. The computer model can be used to analyze the resin flow details, track the thickness change of the preform, predict the total infiltration time and final fiber volume fraction of the parts, and determine whether the resin could completely infiltrate and uniformly wet out the preform.
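
    The infiltration-time prediction can be sanity-checked against the classical one-dimensional Darcy solution for constant-pressure injection, the textbook limit of the full model. A minimal sketch:

    ```python
    def fill_time_1d(length, porosity, viscosity, permeability, delta_p):
        """1D Darcy estimate of resin infiltration time at constant pressure.

        From Darcy's law the flow front obeys L(t) = sqrt(2*K*dP*t/(phi*mu)),
        so the time to infiltrate a preform of given length is
        t = phi * mu * L**2 / (2 * K * dP).
        Units: SI throughout (m, Pa*s, m^2, Pa).
        """
        return porosity * viscosity * length**2 / (2.0 * permeability * delta_p)
    ```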

  3. Reputation-based collaborative network biology.

    PubMed

    Binder, Jean; Boue, Stephanie; Di Fabio, Anselmo; Fields, R Brett; Hayes, William; Hoeng, Julia; Park, Jennifer S; Peitsch, Manuel C

    2015-01-01

    A pilot reputation-based collaborative network biology platform, Bionet, was developed for use in the sbv IMPROVER Network Verification Challenge to verify and enhance previously developed networks describing key aspects of lung biology. Bionet was successful in capturing a more comprehensive view of the biology associated with each network using the collective intelligence and knowledge of the crowd. One key learning point from the pilot was that using a standardized biological knowledge representation language such as BEL is critical to the success of a collaborative network biology platform. Overall, Bionet demonstrated that this approach to collaborative network biology is highly viable. Improving this platform for de novo creation of biological networks and network curation with the suggested enhancements for scalability will serve both academic and industry systems biology communities.

  4. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
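
    Concretely, the Horn-clause verification conditions for a simple loop reduce to three implications: initiation, consecution, and safety. A toy sketch discharged with the z3 SMT solver (illustrative only; this is not SeaHorn's encoding or API):

    ```python
    # Prove that Inv(x) = (x >= 0) is an inductive invariant for the
    # program "x := 0; while *: x := x + 1" and implies safety x != -1.
    # Requires the z3 SMT solver (pip install z3-solver).
    from z3 import Int, Implies, And, Not, Solver, unsat

    x, x1 = Int("x"), Int("x1")
    inv = x >= 0

    def valid(formula):
        """A formula is valid iff its negation is unsatisfiable."""
        s = Solver()
        s.add(Not(formula))
        return s.check() == unsat

    print(valid(Implies(x == 0, inv)))                     # initiation
    print(valid(Implies(And(inv, x1 == x + 1), x1 >= 0)))  # consecution
    print(valid(Implies(inv, x != -1)))                    # safety
    ```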

  5. Validation of a Quality Management Metric

    DTIC Science & Technology

    2000-09-01

    quality management metric (QMM) was used to measure the performance of ten software managers on Department of Defense (DoD) software development programs. Informal verification and validation of the metric compared the QMM score to an overall program success score for the entire program and yielded positive correlation. The results of applying the QMM can be used to characterize the quality of software management and can serve as a template to improve software management performance. Future work includes further refining the QMM, applying the QMM scores to provide feedback

  6. Component-Oriented Behavior Extraction for Autonomic System Design

    NASA Technical Reports Server (NTRS)

    Bakera, Marco; Wagner, Christian; Margaria,Tiziana; Hinchey, Mike; Vassev, Emil; Steffen, Bernhard

    2009-01-01

    Rich and multifaceted domain-specific specification languages like the Autonomic System Specification Language (ASSL) help to design reliable systems with self-healing capabilities. The GEAR game-based model checker has been used successfully to investigate properties of the ESA ExoMars Rover in depth. We show here how to enable GEAR's game-based verification techniques for ASSL via systematic model extraction from a behavioral subset of the language, and illustrate it on a description of the Voyager II space mission.

  7. Development of laser interferometric high-precision geometry monitor for JASMINE

    NASA Astrophysics Data System (ADS)

    Niwa, Yoshito; Arai, Koji; Ueda, Akitoshi; Sakagami, Masaaki; Gouda, Naoteru; Kobayashi, Yukiyasu; Yamada, Yoshiyuki; Yano, Taihei

    2008-07-01

    The telescope geometry of JASMINE should be stabilized and monitored with an accuracy of about 10 to 100 picometers, or 10 to 100 picoradians, in root mean square over about 10 hours. For this purpose, a high-precision interferometric laser metrology system is employed. One useful technique for measuring displacements on extremely minute scales is the heterodyne interferometric method. An experiment to verify multi-degree-of-freedom measurement was performed, and mirror motions were successfully monitored in three degrees of freedom.
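
    In a heterodyne interferometer, the displacement appears as the phase difference between the measurement and reference beat notes, with x = (λ/4π)·Δφ for a double-pass reflective geometry. A minimal sketch of the phase extraction (the I/Q demodulation via the analytic signal is an illustrative assumption, not JASMINE's actual signal chain):

    ```python
    import numpy as np
    from scipy.signal import hilbert

    def displacement_from_beat(beat_signal, ref_signal, wavelength):
        """Recover displacement from heterodyne interferometer beat notes.

        beat_signal, ref_signal : sampled measurement and reference beats
                                  sharing the heterodyne frequency
        wavelength              : laser wavelength [m]
        The optical path change is the unwrapped phase difference between
        the two channels, scaled by lambda/(4*pi) for a double pass.
        """
        analytic_meas = hilbert(beat_signal)
        analytic_ref = hilbert(ref_signal)
        dphi = np.unwrap(np.angle(analytic_meas * np.conj(analytic_ref)))
        return wavelength / (4.0 * np.pi) * dphi
    ```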

  8. Functions of Tenascin-C and Integrin alpha9beta1 in Mediating Prostate Cancer Bone Metastasis

    DTIC Science & Technology

    2017-10-01

    additional engineered cell lines for verification and we plan to also generate stable knockout cell lines using CRISPR/Cas9 gene editing technology...addition to the proposed study, we plan to also produce VCaP cells that are null (knockout) for alpha 9 integrin using CRISPR/Cas9 gene editing protocols...We are experienced with CRISPR-Cas knockdown and have successfully engineered cells previously. We do not expect any particular difficulty in

  9. AJ26 rocket engine test

    NASA Image and Video Library

    2010-11-10

    Fire and steam signal a successful test firing of Orbital Sciences Corporation's Aerojet AJ26 rocket engine at John C. Stennis Space Center. AJ26 engines will be used to power Orbital's Taurus II space vehicle on commercial cargo flights to the International Space Station. On Nov. 10, operators at Stennis' E-1 Test Stand conducted a 10-second test fire of the engine, the first of a series of three verification tests. Orbital has partnered with NASA to provide eight missions to the ISS by 2015.

  10. Combining real-time monitoring and knowledge-based analysis in MARVEL

    NASA Technical Reports Server (NTRS)

    Schwuttke, Ursula M.; Quan, A. G.; Angelino, R.; Veregge, J. R.

    1993-01-01

    Real-time artificial intelligence is gaining increasing attention for applications in which conventional software methods are unable to meet technology needs. One such application area is the monitoring and analysis of complex systems. MARVEL, a distributed monitoring and analysis tool with multiple expert systems, was developed and successfully applied to the automation of interplanetary spacecraft operations at NASA's Jet Propulsion Laboratory. MARVEL implementation and verification approaches, the MARVEL architecture, and the specific benefits that were realized by using MARVEL in operations are described.

  11. GPM Solar Array Gravity Negated Deployment Testing

    NASA Technical Reports Server (NTRS)

    Penn, Jonathan; Johnson, Chris; Lewis, Jesse; Dear, Trevin; Stewart, Alphonso

    2014-01-01

    NASA Goddard Space Flight Center (GSFC) successfully developed a g-negation support system for use on the solar arrays of the Global Precipitation Measurement (GPM) Satellite. This system provides full deployment capability at the subsystem and observatory levels. In addition, the system provides capability for deployed configuration first mode frequency verification testing. The system consists of air pads, a support structure, an air supply, and support tables. The g-negation support system was used to support all deployment activities for flight solar array deployment testing.

  12. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  13. Ghost publications in the pediatric surgery match.

    PubMed

    Gasior, Alessandra C; Knott, E Marty; Fike, Frankie B; Moratello, Vincent E; St Peter, Shawn D; Ostlie, Daniel J; Snyder, Charles L

    2013-09-01

    Pediatric surgery fellowship is considered one of the most competitive subspecialties in medicine. With fierce competition raising the stakes, publications and first authorship are paramount to success in the match. We analyzed Electronic Residency Application Service applications for verification of authorship to determine the rate of misrepresentation. After institutional review board approval, the bibliographies of fellowship applications from 2007-2009 were reviewed to allow time for publication. Only peer-reviewed journal articles were evaluated. A Medline search was conducted for each article, by author or by title. If the article could not be found, the other authors and the journal were used as search parameters. If the article was still not found, the website for the journal was searched for the abstract or manuscript. Finally, an experienced medical sciences librarian was consulted for remaining unidentified articles. Differences between misrepresented and accurate applications were analyzed, including: age, gender, medical and undergraduate school parameters, advanced degrees, other fellowships, number of publications, first author publications, American Board of Surgery In-Training Examination scores, and match success. There were 147 applications reviewed. Evidence of misrepresentation was found in 17.6% of the applicants (24/136), with 34 instances in 785 manuscripts (4.3%). Manuscripts classified as published were verified 96.7% of the time, were not found in 1.4%, and had incorrect authors or journal in less than 1% each. "In press" manuscripts were verified 88.3% of the time, 6.4% could not be found, and 4.3% had an incorrect journal listing. Number of publications (P = 0.026) and first author publications (P = 0.037) correlated with misrepresentation. None of the remaining variables was significant. The pediatric surgical pool has a very low incidence of suspicious citations; however, authorship claims should be verified.
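
    The headline rates above follow directly from the reported counts; a quick check in Python (numbers taken from the abstract):

    ```python
    applicants_with_misrepresentation = 24
    applicants_checked = 136
    suspect_instances = 34
    manuscripts_reviewed = 785

    print(f"applicant rate:  {applicants_with_misrepresentation / applicants_checked:.1%}")  # 17.6%
    print(f"manuscript rate: {suspect_instances / manuscripts_reviewed:.1%}")                # 4.3%
    ```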

  14. In-flight demonstration of the Space Station Freedom Health Maintenance Facility fluid therapy system (E300/E05)

    NASA Technical Reports Server (NTRS)

    Lloyd, Charles W.

    1993-01-01

    The Space Station Freedom (SSF) Health Maintenance Facility (HMF) will provide medical care for crew members for up to 10 days. An integral part of the required medical care consists of providing intravenous infusion of fluids, electrolyte solutions, and nutrients to sustain an ill or injured crew member. In terrestrial health care facilities, intravenous solutions are normally stored in large quantities. However, due to the station's weight and volume constraints, an adequate supply of the required solutions cannot be carried onboard SSF. By formulating medical fluids onboard from concentrates and station water as needed, the Fluid Therapy System (FTS) eliminates weight and volume concerns regarding intravenous fluids. The first full-system demonstration of FTS in continuous microgravity will be conducted in Spacelab-Japan (SL-J). The FTS evaluation consists of two functional objectives and an in-flight demonstration of intravenous administration of fluids. The first objective is to make and store sterile water and IV solutions onboard the spacecraft. If intravenous fluids are to be produced in SSF, successful sterilization of water and reconstitution of IV solutions must be achieved. The second objective is to repeat the verification of the FTS infusion pump, which had been performed in Spacelab Life Sciences 1 (SLS-1). During SLS-1, the FTS IV pump was operated in continuous microgravity for the first time. The pump functioned successfully, and valuable knowledge on its performance in continuous microgravity was obtained. Finally, the technique of starting an IV in microgravity will be demonstrated. The IV technique requires modifications in microgravity, such as the use of restraints for equipment and the crew members involved.

  15. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    18 Conservation of Power and Water Resources 1 2011-04-01 false Data Verification Committee. § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of...

  16. 18 CFR 281.213 - Data Verification Committee.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    18 Conservation of Power and Water Resources 1 2010-04-01 false Data Verification Committee. § 281.213 Data Verification Committee. (a) Each interstate pipeline shall establish a Data Verification... preparation. (e) The Data Verification Committee shall prepare a report concerning the proposed index of...

  17. The politics of verification and the control of nuclear tests, 1945-1980

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gallagher, N.W.

    1990-01-01

    This dissertation addresses two questions: (1) why has agreement been reached on verification regimes to support some arms control accords but not others; and (2) what determines the extent to which verification arrangements promote stable cooperation. This study develops an alternative framework for analysis by examining the politics of verification at two levels. The logical politics of verification are shaped by the structure of the problem of evaluating cooperation under semi-anarchical conditions. The practical politics of verification are driven by players' attempts to use verification arguments to promote their desired security outcome. The historical material shows that agreements on verification regimes are reached when key domestic and international players desire an arms control accord and believe that workable verification will not have intolerable costs. A clearer understanding of how verification is itself a political problem, and of how players manipulate it to promote other goals, is necessary if the politics of verification are to support rather than undermine the development of stable cooperation.

  18. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, together with computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered include: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly-language-level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  19. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    § 1065.920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the... verifications and analysis. It may also be necessary to limit the range of conditions under which the PEMS can... additional information or analysis to support your conclusions. (b) Overall verification. This paragraph (b)...

  20. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  1. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    30 Mineral Resources 2 2011-07-01 false When must I resubmit Platform Verification... OUTER CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication...

  2. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    30 Mineral Resources 2 2011-07-01 false What is the Platform Verification Program? ...Platforms and Structures Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms...

  3. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  4. AdaBoost-based on-line signature verifier

    NASA Astrophysics Data System (ADS)

    Hongo, Yasunori; Muramatsu, Daigo; Matsumoto, Takashi

    2005-03-01

    Authentication of individuals is rapidly becoming an important issue. The authors previously proposed a pen-input online signature verification algorithm. The algorithm considers a writer's signature as a trajectory of pen position, pen pressure, pen azimuth, and pen altitude that evolves over time, so that it is dynamic and biometric. Many algorithms have been proposed and reported to achieve accuracy for online signature verification, but setting the threshold value for these algorithms is a problem. In this paper, we introduce a user-generic model generated by AdaBoost, which resolves this problem. When user-specific models (one model for each user) are used for signature verification problems, we need to generate the models using only genuine signatures. Forged signatures are not available because imposters do not provide forged signatures for training in advance. However, by introducing a user-generic model, we can make use of others' forged signatures in addition to the genuine signatures for learning. AdaBoost is a well-known classification algorithm that makes its final decision depending on the sign of the output value; therefore, it is not necessary to set a threshold value. A preliminary experiment was performed on a database consisting of data from 50 individuals. This set consists of western-alphabet-based signatures provided by a European research group. In this experiment, our algorithm gives an FRR of 1.88% and an FAR of 1.60%. Since no fine-tuning was done, this preliminary result looks very promising.
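
    The threshold-free decision rule described above reduces to taking the sign of the boosted ensemble's weighted vote; here is a minimal sketch with hypothetical weak learners and feature names (not the authors' trained model):

    ```python
    # Each weak learner is an (alpha, h) pair where h maps features to +1/-1
    # and alpha is its AdaBoost weight; the verifier accepts on a positive sum,
    # so no per-user threshold has to be tuned.

    def ensemble_score(features, weak_learners):
        return sum(alpha * h(features) for alpha, h in weak_learners)

    def verify(features, weak_learners):
        return "genuine" if ensemble_score(features, weak_learners) > 0 else "forgery"

    # Hypothetical stumps on distance-to-template features:
    stumps = [
        (0.8, lambda f: 1 if f["shape_dist"] < 0.3 else -1),
        (0.5, lambda f: 1 if f["pressure_dist"] < 0.4 else -1),
        (0.3, lambda f: 1 if f["azimuth_dist"] < 0.5 else -1),
    ]
    print(verify({"shape_dist": 0.2, "pressure_dist": 0.6, "azimuth_dist": 0.1}, stumps))
    ```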

  5. Pandemic preparedness in Hawaii: a multicenter verification of real-time RT-PCR for the direct detection of influenza virus types A and B.

    PubMed

    Whelen, A Christian; Bankowski, Matthew J; Furuya, Glenn; Honda, Stacey; Ueki, Robert; Chan, Amelia; Higa, Karen; Kumashiro, Diane; Moore, Nathaniel; Lee, Roland; Koyamatsu, Terrie; Effler, Paul V

    2010-01-01

    We integrated multicenter, real-time (RTi) reverse transcription polymerase chain reaction (RT-PCR) screening into a statewide laboratory algorithm for influenza surveillance and response. Each of three sites developed its own testing strategy and was challenged with one randomized and blinded panel of 50 specimens previously tested for respiratory viruses. Following testing, each participating laboratory reported its results to the Hawaii State Department of Health, State Laboratories Division for evaluation and possible discrepant analysis. Two of three laboratories reported a 100% sensitivity and specificity, resulting in a 100% positive predictive value and a 100% negative predictive value (NPV) for influenza type A. The third laboratory showed a 71% sensitivity for influenza type A (83% NPV) with 100% specificity. All three laboratories were 100% sensitive and specific for the detection of influenza type B. Discrepant analysis indicated that the lack of sensitivity experienced by the third laboratory may have been due to the analyte-specific reagent probe used by that laboratory. Use of a newer version of the product with a secondary panel of 20 specimens resulted in a sensitivity and specificity of 100%. All three laboratories successfully verified their ability to conduct clinical testing for influenza using diverse nucleic acid extraction and RTi RT-PCR platforms. Successful completion of the verification by all collaborating laboratories paved the way for the integration of those facilities into a statewide laboratory algorithm for influenza surveillance and response.
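
    For reference, the panel statistics quoted above come from a standard 2x2 confusion matrix; a short sketch follows, with hypothetical counts chosen to be consistent with the third laboratory's influenza A figures (71% sensitivity, 100% specificity, 83% NPV):

    ```python
    # Compute the four standard verification-panel metrics from tp/fp/fn/tn.
    def panel_metrics(tp, fp, fn, tn):
        return {
            "sensitivity": tp / (tp + fn),
            "specificity": tn / (tn + fp),
            "ppv": tp / (tp + fp),
            "npv": tn / (tn + fn),
        }

    # Illustrative counts only: 5 of 7 true positives detected, no false positives.
    print(panel_metrics(tp=5, fp=0, fn=2, tn=10))
    # -> sensitivity 0.714, specificity 1.0, ppv 1.0, npv 0.833
    ```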

  6. A quantitative comparison of precipitation forecasts between the storm-scale numerical weather prediction model and auto-nowcast system in Jiangsu, China

    NASA Astrophysics Data System (ADS)

    Wang, Gaili; Yang, Ji; Wang, Dan; Liu, Liping

    2016-11-01

    Extrapolation techniques and storm-scale Numerical Weather Prediction (NWP) models are two primary approaches for short-term precipitation forecasts. The primary objective of this study is to verify precipitation forecasts and compare the performances of two nowcasting schemes: the Beijing Auto-Nowcast system (BJ-ANC), based on extrapolation techniques, and a storm-scale NWP model called the Advanced Regional Prediction System (ARPS). The verification and comparison take into account six heavy precipitation events that occurred in the summers of 2014 and 2015 in Jiangsu, China. The forecast performances of the two schemes were evaluated for the next 6 h at 1-h intervals using gridpoint-based measures (critical success index, bias, index of agreement, and root-mean-square error) and an object-based verification method called the Structure-Amplitude-Location (SAL) score. Regarding the gridpoint-based measures, BJ-ANC outperforms ARPS at first, but its forecast accuracy decreases rapidly with lead time and falls below that of ARPS after 4-5 h. Regarding the object-based verification method, most forecasts produced by BJ-ANC cluster near the center of the SAL diagram at the 1-h lead time, indicating high-quality forecasts. As the lead time increases, BJ-ANC overestimates the precipitation amount and produces overly widespread precipitation, especially at the 6-h lead time. The ARPS model overestimates precipitation at all lead times, particularly at first.
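
    Two of the gridpoint scores named above are simple functions of a rain/no-rain contingency table; a sketch with illustrative counts (not the study's data):

    ```python
    # hits: forecast rain, observed rain; misses: no forecast, observed rain;
    # false_alarms: forecast rain, none observed.
    def csi(hits, misses, false_alarms):
        # Critical Success Index: fraction of correct rain points among all
        # points where rain was forecast or observed.
        return hits / (hits + misses + false_alarms)

    def frequency_bias(hits, misses, false_alarms):
        # >1 means the scheme forecasts rain too often, <1 too rarely.
        return (hits + false_alarms) / (hits + misses)

    print(csi(hits=120, misses=40, false_alarms=60))             # 0.545
    print(frequency_bias(hits=120, misses=40, false_alarms=60))  # 1.125
    ```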

  7. A web-based system for supporting global land cover data production

    NASA Astrophysics Data System (ADS)

    Han, Gang; Chen, Jun; He, Chaoying; Li, Songnian; Wu, Hao; Liao, Anping; Peng, Shu

    2015-05-01

    Global land cover (GLC) data production and verification is a complicated, time-consuming, and labor-intensive process, requiring huge amounts of imagery and ancillary data and involving many people, often in different geographic locations. The efficient integration of various kinds of ancillary data and effective collaborative classification in large-area land cover mapping require advanced supporting tools. This paper presents the design and development of a web-based system for supporting 30-m resolution GLC data production by combining geo-spatial web services and Computer-Supported Collaborative Work (CSCW) technology. Based on an analysis of the functional and non-functional requirements of GLC mapping, a three-tier system model is proposed with four major parts, i.e., multisource data resources, data and function services, interactive mapping, and production management. The prototyping and implementation of the system were realised with a combination of Open Source Software (OSS) and commercially available off-the-shelf systems. This web-based system not only facilitates the integration of the heterogeneous data and services required by GLC data production, but also provides online access, visualization, and analysis of the images, ancillary data, and interim 30-m global land-cover maps. The system further supports online collaborative quality-check and verification workflows. It has been successfully applied to China's 30-m resolution GLC mapping project and has significantly improved the efficiency of GLC data production and verification. The concepts developed through this study should also benefit other global or regional land-cover data production efforts.

  8. Two barriers to realizing the benefits of biometrics: a chain perspective on biometrics and identity fraud as biometrics' real challenge

    NASA Astrophysics Data System (ADS)

    Grijpink, Jan

    2004-06-01

    Biometric systems can vary along at least twelve dimensions. We need to exploit this variety to manoeuvre biometrics into place and realise its social potential. Two perspectives on biometrics are then proposed, revealing that biometrics will probably be ineffective in combating identity fraud, organised crime and terrorism: (1) the value chain perspective explains the first barrier: our strong preference for large-scale biometric systems for general compulsory use. These biometric systems cause successful infringements to spread unnoticed. A biometric system will only function adequately if biometrics is indispensable for solving the dominant chain problem; multi-chain use of biometrics takes it beyond the boundaries of good manageability. (2) the identity fraud perspective exposes the second barrier: our traditional approach to identity verification. We focus on identity documents, neglecting the person and the situation involved. Moreover, western legal cultures have made identity verification procedures known, transparent, uniform and predictable. Thus, we have developed a blind spot to identity fraud. Biometrics offers good potential for better checks on persons, but will probably be used merely to enhance identity documents. Biometrics will only pay off if it confronts the identity fraudster with less predictable verification processes and a greater risk that his identity fraud will be spotted. Standardised large-scale applications of biometrics for general compulsory use without countervailing measures will probably produce the reverse. This contribution tentatively presents a few headlines for an overall biometrics strategy that could better resist identity fraud.

  9. Design and Testing of a Prototype Lunar or Planetary Surface Landing Research Vehicle (LPSLRV)

    NASA Technical Reports Server (NTRS)

    Murphy, Gloria A.

    2010-01-01

    This handbook describes a two-semester senior design course sponsored by the NASA Office of Education, the Exploration Systems Mission Directorate (ESMD), and the NASA Space Grant Consortium. The course was developed and implemented by the Mechanical and Aerospace Engineering Department (MAE) at Utah State University. The course's final outcome is a packaged senior design course that can be readily incorporated into the instructional curriculum at universities across the country. The course materials adhere to the standards of the Accreditation Board for Engineering and Technology (ABET) and are constructed to be relevant to key research areas identified by ESMD. The design project challenged students to apply systems engineering concepts to define research and training requirements for a terrestrial-based lunar landing simulator. This project developed a flying prototype for a Lunar or Planetary Surface Landing Research Vehicle (LPSLRV). Per NASA specifications, the concept accounts for reduced lunar gravity and allows the terminal stage of lunar descent to be flown either by remote pilot or autonomously. This free-flying platform was designed to be sufficiently flexible to allow both sensor evaluation and pilot training. This handbook outlines the course materials and describes the systems engineering processes developed to facilitate design, fabrication, integration, and testing. This handbook presents sufficient details of the final design configuration to allow an independent group to reproduce the design. The design evolution and details regarding the verification testing used to characterize the system are presented in a separate project final design report. Details of the experimental apparatus used for system characterization may be found in Appendices F, G, and I of that report. A brief summary of the ground testing and systems verification is also included in Appendix A of this report. Details of the flight tests will be documented in a separate flight test report, which serves as a complement to the course handbook presented here. This project was extremely ambitious, and achieving all of the design and test objectives was a daunting task. The schedule ran slightly longer than a single academic year, with complete design closure not occurring until early April. Integration and verification testing spilled over into late May, and the first flight did not occur until mid to late June. The academic year at Utah State University ended on May 8, 2010. Following the end of the academic year, testing and integration were performed by the faculty advisor, paid research assistants, and volunteer student help.

  10. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    30 Mineral Resources 2 2010-07-01 false What is the Platform Verification Program? ...Platform Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  11. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The complexity of vision chip verification is also related to the fact that, in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area; design functional verification is not explicitly considered at the earlier stages, at which the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve the verification effort by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.

  12. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, in which a master data-verification program calls multiple special-purpose subroutines and consults a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
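
    A minimal sketch of the master-program-plus-screen-file concept described above, with hypothetical station criteria and records; range and rate-of-change checks are typical examples of such criteria:

    ```python
    # Hypothetical "screen file": per-station verification criteria.
    SCREEN_FILE = {
        "station_0042": {"min_stage_ft": 0.0, "max_stage_ft": 25.0, "max_jump_ft": 2.0},
    }

    def screen(station, values, criteria=SCREEN_FILE):
        """Master routine: apply the station's criteria and flag suspect values."""
        c = criteria[station]
        flags = []
        for i, v in enumerate(values):
            if not (c["min_stage_ft"] <= v <= c["max_stage_ft"]):
                flags.append((i, v, "out of range"))
            elif i > 0 and abs(v - values[i - 1]) > c["max_jump_ft"]:
                flags.append((i, v, "implausible jump"))
        return flags

    print(screen("station_0042", [3.1, 3.2, 9.8, 3.3, -1.0]))
    ```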

  13. TH-B-204-03: TG-199: Implanted Markers for Radiation Treatment Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, Z.

    Implanted markers used as target surrogates have been widely adopted for treatment verification, as they provide safe and reliable monitoring of inter- and intra-fractional target motion. The rapid advancement of the technology requires a critical review of, and recommendations for, the use of implanted surrogates in the current field. The symposium, which also reports an update of AAPM TG-199 (Implanted Target Surrogates for Radiation Treatment Verification), will focus on all clinical aspects of using implanted target surrogates for treatment verification and related issues. A wide variety of markers available on the market will first be reviewed, including radiopaque markers, MRI-compatible markers, non-migrating coils, surgical clips, and electromagnetic transponders. The pros and cons of each kind will be discussed. The clinical applications of implanted surrogates will be presented by anatomical site. For the lung, we will discuss gated treatments and 2D or 3D real-time fiducial tracking techniques. For the prostate, we will focus on 2D-3D and 3D-3D matching and electromagnetic-transponder-based localization techniques. For the liver, we will review techniques for patients under gating, shallow-breathing, or free-breathing conditions. We will also review techniques for treating challenging breast cancers, where deformation may occur. Finally, we will summarize potential issues related to the use of implanted target surrogates, with TG-199 recommendations. A review of fiducial migration and fiducial-derived target rotation in different disease sites will be provided. The issue of target deformation, especially near the diaphragm, and related suggestions will also be presented and discussed. Learning Objectives: Knowledge of a wide variety of markers; knowledge of their application for different disease sites; understanding of issues related to these applications. Disclosures: Z. Wang: research funding support from Brainlab AG. Q. Xu: consultant for Accuray planning service.

  14. SU-D-213-05: Design, Evaluation and First Applications of an Off-Site State-of-the-Art 3D Dosimetry System

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Malcolm, J; Mein, S; McNiven, A

    2015-06-15

    Purpose: To design, construct, and commission a prototype in-house three-dimensional (3D) dose verification system for stereotactic body radiotherapy (SBRT) verification at an off-site partner institution, and to investigate the potential of this system to achieve sufficient performance (1 mm resolution, 3% noise, within 3% of the true dose reading) for SBRT verification. Methods: The system was designed around a parallel-ray geometry provided by precision telecentric lenses and a 630 nm LED light source. Using a radiochromic dosimeter, a 3D dosimetric comparison with our gold-standard system and treatment planning software (Eclipse) was done for a four-field box treatment, under gamma passing criteria of 3%/3mm with a 10% dose threshold. After off-site installation, deviations in the system's dose readout performance were assessed by rescanning the four-field-box-irradiated dosimeter and using line profiles to compare on-site and off-site mean and noise levels in four distinct dose regions. As a final step, an end-to-end test of the system was completed at the off-site location, including CT simulation, irradiation of the dosimeter, and a 3D dosimetric comparison of the planned (Pinnacle³) to delivered dose for a spinal SBRT treatment (12 Gy per fraction). Results: The noise level in the high- and medium-dose regions of the four-field box treatment was approximately 5% both pre- and post-installation, reflecting the reduction in positional uncertainty achieved by the new design. At 1 mm dose voxels, the gamma pass rates (3%/3mm) for our in-house gold-standard system and the off-site system were comparable, at 95.8% and 93.2% respectively. Conclusion: This work describes the end-to-end process and results of designing, installing, and commissioning a state-of-the-art 3D dosimetry system created for verification of advanced radiation treatments, including spinal radiosurgery.
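
    A simplified 1-D version of the gamma analysis cited above (3%/3mm with a 10% dose threshold); real tools search a 3-D neighborhood, and the profiles here are synthetic:

    ```python
    import numpy as np

    def gamma_pass_rate(measured, reference, spacing_mm, dd=0.03, dta_mm=3.0,
                        threshold=0.10):
        ref_max = reference.max()
        x = np.arange(len(reference)) * spacing_mm
        passes = []
        for xm, dm in zip(x, measured):
            if dm < threshold * ref_max:      # skip points below the 10% threshold
                continue
            # gamma = minimum over reference points of the combined
            # dose-difference / distance-to-agreement metric
            g = np.min(np.sqrt(((x - xm) / dta_mm) ** 2
                               + ((reference - dm) / (dd * ref_max)) ** 2))
            passes.append(g <= 1.0)
        return float(np.mean(passes))

    ref = np.exp(-np.linspace(-3, 3, 121) ** 2)   # synthetic reference profile
    meas = 1.02 * ref                             # uniform +2% dose error
    print(gamma_pass_rate(meas, ref, spacing_mm=1.0))  # 1.0: within 3%/3mm
    ```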

  15. Ultrasound functional imaging in an ex vivo beating porcine heart platform

    NASA Astrophysics Data System (ADS)

    Petterson, Niels J.; Fixsen, Louis S.; Rutten, Marcel C. M.; Pijls, Nico H. J.; van de Vosse, Frans N.; Lopata, Richard G. P.

    2017-12-01

    In recent years, novel ultrasound functional imaging (UFI) techniques have been introduced to assess cardiac function by measuring, e.g., cardiac output (CO) and/or myocardial strain. Verification and reproducibility assessment in a realistic setting remain major issues. Simulations and phantoms are often unrealistic, whereas in vivo measurements often lack crucial hemodynamic parameters or ground truth data, or suffer from the large physiological and clinical variation between patients when attempting clinical validation. Controlled validation in certain pathologies is cumbersome and often requires the use of lab animals. In this study, an isolated beating pig heart setup was adapted and used for performance assessment of UFI techniques such as volume assessment and ultrasound strain imaging. The potential of performing verification and reproducibility studies was demonstrated. For proof-of-principle, validation of UFI in pathological hearts was examined. Ex vivo porcine hearts (n = 6, slaughterhouse waste) were resuscitated and attached to a mock circulatory system. Radio frequency ultrasound data of the left ventricle were acquired in five short-axis views and one long-axis view. Based on these slices, the CO was measured, with verification performed using flow sensor measurements in the aorta. Strain imaging was performed, providing radial, circumferential, and longitudinal strain to assess reproducibility and inter-subject variability under steady conditions. Finally, strains in healthy hearts were compared to a heart with an implanted left ventricular assist device, simulating a failing, supported heart. Good agreement between ultrasound- and flow-sensor-based CO measurements was found. Strains were highly reproducible (intraclass correlation coefficients > 0.8). Differences were found due to biological variation and the condition of the hearts. Strain magnitudes and patterns in the assisted heart were available for different pump actions, revealing large changes compared to the normal condition. The setup provides a valuable benchmarking platform for UFI techniques. Future studies will include work on different pathologies and other means of measurement verification.

  16. Outcomes of the JNT 1955 Phase I Viability Study of Gamma Emission Tomography for Spent Fuel Verification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jacobsson-Svard, Staffan; Smith, Leon E.; White, Timothy

    The potential for gamma emission tomography (GET) to detect partial defects within a spent nuclear fuel assembly has been assessed within the IAEA Support Program project JNT 1955, phase I, which was completed and reported to the IAEA in October 2016. Two safeguards verification objectives were identified in the project: (1) independent determination of the number of active pins that are present in a measured assembly, in the absence of a priori information about the assembly; and (2) quantitative assessment of pin-by-pin properties, for example the activity of key isotopes or pin attributes such as cooling time and relative burnup, under the assumption that basic fuel parameters (e.g., assembly type and nominal fuel composition) are known. The efficacy of GET to meet these two verification objectives was evaluated across a range of fuel types, burnups and cooling times, while targeting a total interrogation time of less than 60 minutes. The evaluations were founded on a modelling and analysis framework applied to existing and emerging GET instrument designs. Monte Carlo models of different fuel types were used to produce simulated tomographer responses to large populations of “virtual” fuel assemblies. The simulated instrument response data were then processed using a variety of tomographic-reconstruction and image-processing methods, and scoring metrics were defined and used to evaluate the performance of the methods. This paper describes the analysis framework and metrics used to predict tomographer performance. It also presents the design of a “universal” GET (UGET) instrument intended to support the full range of verification scenarios envisioned by the IAEA. Finally, it gives examples of the expected partial-defect detection capabilities for some fuels and diversion scenarios, and it provides a comparison of predicted performance for the notional UGET design and an optimized variant of an existing IAEA instrument.
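
    For orientation, the simplest member of the tomographic-reconstruction family evaluated in such studies is unfiltered back-projection; a toy sketch follows (a real GET analysis would use filtered or iterative reconstruction on measured gamma-ray sinograms, and the data here are placeholders):

    ```python
    import numpy as np
    from scipy.ndimage import rotate

    def backproject(sinogram, angles_deg):
        # Unfiltered back-projection of a parallel-beam sinogram:
        # smear each 1-D projection across the image and rotate it into place.
        n = sinogram.shape[1]
        recon = np.zeros((n, n))
        for proj, ang in zip(sinogram, angles_deg):
            smear = np.tile(proj, (n, 1))
            recon += rotate(smear, ang, reshape=False, order=1)
        return recon / len(angles_deg)

    # Usage with placeholder data: 36 projection angles, 64 detector bins.
    angles = np.linspace(0, 180, 36, endpoint=False)
    sino = np.random.rand(36, 64)
    image = backproject(sino, angles)
    ```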

  17. Weak lensing in the Dark Energy Survey

    NASA Astrophysics Data System (ADS)

    Troxel, Michael

    2016-03-01

    I will present the current status of weak lensing results from the Dark Energy Survey (DES). DES will survey 5000 square degrees in five photometric bands (grizY), and has already provided a competitive weak lensing catalog from Science Verification data covering just 3% of the final survey footprint. I will summarize the status of shear catalog production using observations from the first year of the survey and discuss recent weak lensing science results from DES. Finally, I will report on the outlook for future cosmological analyses in DES including the two-point cosmic shear correlation function and discuss challenges that DES and future surveys will face in achieving a control of systematics that allows us to take full advantage of the available statistical power of our shear catalogs.

  18. New Developments in Magnetostatic Cleanliness Modeling

    NASA Astrophysics Data System (ADS)

    Mehlem, K.; Wiegand, A.; Weickert, S.

    2012-05-01

    The paper describes improvements and extensions of the multiple magnetic dipole modeling (MDM) method for cleanliness verification, which was introduced by the author in 1977 and has since been applied over three decades to numerous international projects. Solutions to specific modeling problems that had so far been left unsolved are described in the present paper. Special attention is given to the ambiguities of MDM solutions caused by the limited data coverage available. Constraint handling by a constraint-free NLP solver, optimal MDM sizing, and multiple-point far-field compensation techniques are presented. The recent extension of the MDM method to field-gradient data is formulated and demonstrated with an example. A complex MDM application (Ulysses) is then presented. Finally, a short description of the MDM software GAMAG, recently introduced by the author, is given.

  19. TU-C-BRE-11: 3D EPID-Based in Vivo Dosimetry: A Major Step Forward Towards Optimal Quality and Safety in Radiation Oncology Practice

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mijnheer, B; Mans, A; Olaciregui-Ruiz, I

    Purpose: To develop a 3D in vivo dosimetry method that is able to substitute for pre-treatment verification in an efficient way, and to terminate treatment delivery if the online measured 3D dose distribution deviates too much from the predicted dose distribution. Methods: A back-projection algorithm has been further developed and implemented to enable automatic 3D in vivo dose verification of IMRT/VMAT treatments using a-Si EPIDs. New software tools were clinically introduced to allow automated image acquisition, to periodically inspect the record-and-verify database, and to automatically run the EPID dosimetry software. The comparison of the EPID-reconstructed and planned dose distributions is done offline to automatically raise alerts and to schedule actions when deviations are detected. Furthermore, a software package for online dose reconstruction was also developed. The RMS of the difference between the cumulative planned and reconstructed 3D dose distributions was used to trigger a halt of the linac. Results: The implementation of fully automated 3D EPID-based in vivo dosimetry was able to replace pre-treatment verification for more than 90% of the patient treatments. The process has been fully automated and integrated in our clinical workflow, where over 3,500 IMRT/VMAT treatments are verified each year. By optimizing the dose reconstruction algorithm and the I/O performance, the delivered 3D dose distribution is verified in less than 200 ms per portal image, which includes the comparison between the reconstructed and planned dose distributions. In this way it was possible to generate a trigger that can stop the irradiation at less than 20 cGy after introducing large delivery errors. Conclusion: The automatic offline solution facilitated the large-scale clinical implementation of 3D EPID-based in vivo dose verification of IMRT/VMAT treatments; the online approach has been successfully tested for various severe delivery errors.

  20. Quality Assurance in Environmental Technology Verification (ETV): Analysis and Impact on the EU ETV Pilot Programme Performance

    NASA Astrophysics Data System (ADS)

    Molenda, Michał; Ratman-Kłosińska, Izabela

    2018-03-01

    Many innovative environmental technologies never reach the market because they are new and cannot demonstrate a successful track record of previous applications. This fact is a serious obstacle on their way to the market. A lack of credible data on the performance of a technology causes mistrust of innovations among investors, especially in the public sector, who seek effective solutions without compromising on the technical and financial risks associated with their implementation. Environmental technology verification (ETV) offers a credible, robust and transparent process that results in a third-party confirmation of the claims made by providers about the performance of novel environmental technologies. Verifications of performance are supported by high-quality, independent test data. In that way, ETV helps establish vendor credibility and buyer confidence. Several countries across the world have implemented ETV in the form of national or regional programmes. ETV in the European Union was implemented as a voluntary scheme in the form of a pilot programme. The European Commission launched the Environmental Technology Verification Pilot Programme of the European Union (EU ETV) in 2011. The paper describes the European model of ETV as set up and put into operation under this pilot programme. The goal, objectives, technological scope, and entities involved are presented. An attempt has been made to summarise the performance results of the EU ETV scheme for the period from 2012, when the programme became fully operational, until the first half of 2016. The study was aimed at analysing the overall organisation and efficiency of the EU ETV Pilot Programme. It was based on an analysis of the documents governing the operation of the EU ETV system. For this purpose, a statistical analysis of the data on the performance of the EU ETV system provided by the European Commission was carried out.

  1. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼ 2.5 fps). After a portal image is acquired, the software searches for “hot spots” in the reconstructed 3D dose distribution. A hot spot is defined in this study as a 4 cm³ cube in which the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated, resulting in a linac halt. The software was tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf-position error or a large monitor-unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot-spot detection, takes about 220 ms on a dual hexa-core Intel Xeon X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven effective for the timely detection of these errors. Current work is focused on hot-spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
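
    A sketch of the hot-spot criterion described above: flag any roughly 4 cm³ cube whose mean reconstructed dose exceeds the mean planned dose by both 20% and 50 cGy. The voxel size, grid, and dose values are assumptions for illustration:

    ```python
    import numpy as np
    from scipy.ndimage import uniform_filter

    def find_hot_spots(recon_cgy, planned_cgy, voxel_mm=5.0, cube_cm3=4.0):
        # Edge of the averaging cube in voxels (~4 cm^3 -> ~1.6 cm edge).
        edge_vox = max(1, int(round((cube_cm3 ** (1 / 3)) * 10.0 / voxel_mm)))
        mean_recon = uniform_filter(recon_cgy, size=edge_vox)
        mean_plan = uniform_filter(planned_cgy, size=edge_vox)
        # Both conditions must hold: +20% relative AND +50 cGy absolute.
        hot = (mean_recon > 1.20 * mean_plan) & (mean_recon > mean_plan + 50.0)
        return np.argwhere(hot)

    plan = np.full((40, 40, 40), 200.0)    # placeholder 2 Gy planned dose grid
    recon = plan.copy()
    recon[18:24, 18:24, 18:24] = 300.0     # injected overdose region
    print(len(find_hot_spots(recon, plan)) > 0)  # True -> would halt the linac
    ```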

  2. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toltz, A; Seuntjens, J; Hoesl, M

    Purpose: With the aim of reducing acute esophageal radiation toxicity in pediatric patients receiving craniospinal irradiation (CSI), we investigated the implementation of an in-vivo, adaptive proton therapy range verification methodology. Simulation experiments and in-phantom measurements were conducted to validate the range verification technique for this clinical application. Methods: A silicon diode array system has been developed and experimentally tested in phantom for passively scattered proton beam range verification for a prostate treatment case by correlating properties of the detector signal to the water equivalent path length (WEPL). We propose to extend the methodology to verify range distal to the vertebral body for pediatric CSI cases by placing this small-volume dosimeter in the esophagus of the anesthetized patient immediately prior to treatment. A set of calibration measurements was performed to establish a time-signal-to-WEPL fit for a “scout” beam in a solid water phantom. Measurements are compared against Monte Carlo simulation in GEANT4 using the Tool for Particle Simulation (TOPAS). Results: Measurements with the diode array in a spread-out Bragg peak of 14 cm modulation width and 15 cm range (177 MeV passively scattered beam) in solid water were successfully validated against proton fluence rate simulations in TOPAS. The resulting calibration curve allows for a sensitivity analysis of detector system response with dose rate in simulation and with individual diode position through simulation on patient CT data. Conclusion: Feasibility has been shown for the application of this range verification methodology to pediatric CSI. An in-vivo measurement to determine the WEPL to the inner surface of the esophagus will allow for personalized adjustment of the treatment plan to ensure sparing of the esophagus while confirming target coverage. A. Toltz acknowledges partial support by the CREATE Medical Physics Research Training Network grant of the Natural Sciences and Engineering Research Council (Grant number: 432290).
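
    The calibration step described above amounts to fitting a curve from a time-signal property to known WEPL and then inverting it for patient measurements; a sketch with invented data points:

    ```python
    import numpy as np

    # Calibration data: known WEPL values from solid-water measurements paired
    # with a measured time-signal property. All numbers are invented for
    # illustration; the actual signal metric and fit are the authors' choice.
    known_wepl_mm = np.array([100.0, 120.0, 140.0, 160.0, 180.0])
    signal_metric = np.array([0.92, 0.78, 0.61, 0.43, 0.27])

    coeffs = np.polyfit(signal_metric, known_wepl_mm, deg=2)  # calibration curve

    def wepl_from_signal(metric):
        # Invert the fitted curve: detector reading -> estimated WEPL.
        return np.polyval(coeffs, metric)

    print(wepl_from_signal(0.55))  # interpolated WEPL in mm
    ```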

  3. Field evaluations of the VD max approach for substantiation of a 25 kGy sterilization dose and its application to other preselected doses

    NASA Astrophysics Data System (ADS)

    Kowalski, John B.; Herring, Craig; Baryschpolec, Lisa; Reger, John; Patel, Jay; Feeney, Mary; Tallentire, Alan

    2002-08-01

    The International and European standards for radiation sterilization require evidence of the effectiveness of a minimum sterilization dose of 25 kGy but do not provide detailed guidance on how this evidence can be generated. An approach, designated VD max, has recently been described and computer evaluated to provide safe and unambiguous substantiation of a 25 kGy sterilization dose. The approach has been further developed into a practical method, which has been subjected to field evaluations at three manufacturing facilities which produce different types of medical devices. The three facilities each used a different overall evaluation strategy: Facility A used VD max for quarterly dose audits; Facility B compared VD max and Method 1 in side-by-side parallel experiments; and Facility C, a new facility at start-up, used VD max for initial substantiation of 25 kGy and subsequent quarterly dose audits. A common element at all three facilities was the use of 10 product units for irradiation in the verification dose experiment. The field evaluations of the VD max method were successful at all three facilities; they included many different types of medical devices/product families with a wide range of average bioburden and sample item portion values used in the verification dose experiments. Overall, around 500 verification dose experiments were performed and no failures were observed. In the side-by-side parallel experiments, the outcomes of the VD max experiments were consistent with the outcomes observed with Method 1. The VD max approach has been extended to sterilization doses >25 and <25 kGy; verification doses have been derived for sterilization doses of 15, 20, 30, and 35 kGy. Widespread application of the VD max method for doses other than 25 kGy must await controlled field evaluations and the development of appropriate specifications/standards.

  4. SU-G-BRB-15: Verifications of Absolute and Relative Dosimetry of a Novel Stereotactic Breast Device: GammaPodTM

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Becker, S; Mossahebi, S; Yi, B

    Purpose: A dedicated stereotactic breast radiotherapy device, GammaPod, was developed to treat early-stage breast cancer. The first clinical unit was installed and commissioned at the University of Maryland. We report our methodology of absolute dosimetry under multiple calibration conditions and dosimetric verifications of treatment plans produced by the system. Methods: The GammaPod unit comprises a rotating hemispherical source carrier containing 36 Co-60 sources and a concentric tungsten collimator providing beams of 15 and 25 mm. An absolute dose calibration formalism was developed with modifications to AAPM protocols for the unique geometry and different calibration media (acrylic, polyethylene, or liquid water). Breast-cup-size-specific and collimator output factors were measured and verified against Monte Carlo simulations for single-isocenter plans. Multiple-isocenter plans were generated for various target sizes, locations, and cup sizes in phantoms and in images of 20 breast cancer patients. A stereotactic mini-Farmer chamber, OSL and TLD detectors, and radiochromic films were used for dosimetric measurements. Results: At the time of calibration (1/14/2016), the absolute dose rate of the GammaPod was established to be 2.10 Gy/min in acrylic for the 25 mm collimator, for sources installed in March 2011. The output factor for the 15 mm collimator was measured to be 0.950. The absolute dose calibration was independently verified by IROC-Houston with a TLD/institution ratio of 0.99. Cup-size-specific output measurements in liquid water for single isocenters were found to be within 3.0% of MC simulations. Point-dose measurements of multiple-isocenter treatment plans were within −1.0 ± 1.2% of the treatment planning system, while 2-dimensional gamma analysis yielded a pass rate of 97.9 ± 2.2% using gamma criteria of 3% and 2 mm. Conclusion: The first GammaPod treatment unit for breast stereotactic radiotherapy was successfully installed, calibrated, and commissioned for patient treatments. Absolute dosimetry and dosimetric verification protocols were successfully created.

  5. International validation of quality indicators for evaluating priority setting in low income countries: process and key lessons.

    PubMed

    Kapiriri, Lydia

    2017-06-19

    While there have been efforts to develop frameworks to guide healthcare priority setting, there has been limited focus on evaluation frameworks. Moreover, while the few existing frameworks identify quality indicators for successful priority setting, they do not provide users with strategies to verify these indicators. Kapiriri and Martin (Health Care Anal 18:129-147, 2010) developed a framework for evaluating priority setting in low and middle income countries. This framework provides both parameters for successful priority setting and proposed means of their verification. Ahead of its use in real-life contexts, this paper presents results from a validation process of the framework. The framework validation involved 53 policy makers and priority setting researchers at the global, national and sub-national levels (in Uganda). They were requested to indicate the relative importance of the proposed parameters as well as the feasibility of obtaining the related information. We also pilot tested the proposed means of verification. Almost all the respondents evaluated all the parameters, including the contextual factors, as 'very important'. However, some respondents at the global level considered 'presence of incentives to comply', 'reduced disagreements', 'increased public understanding', 'improved institutional accountability' and 'meeting the ministry of health objectives' less important, which could be a reflection of their level of decision making. All the proposed means of verification were assessed as feasible, with the exception of meeting observations, which would require an insider. These findings were consistent with those obtained from the pilot testing. They are relevant to policy makers and researchers involved in priority setting in low and middle income countries. To the best of our knowledge, this is one of the few initiatives that has involved potential users of a framework (at the global level and in a low income country) in its validation. The favorable validation of all the parameters at the national and sub-national levels implies that the framework is potentially useful at those levels as is. The parameters that were disputed at the global level necessitate further discussion when using the framework at that level. The next step is to use the validated framework to evaluate actual priority setting at the different levels.

  6. Medicare program; prospective payment system and consolidated billing for skilled nursing facilities for FY 2010; minimum data set, version 3.0 for skilled nursing facilities and Medicaid nursing facilities. Final rule.

    PubMed

    2009-08-11

    This final rule updates the payment rates used under the prospective payment system (PPS) for skilled nursing facilities (SNFs), for fiscal year (FY) 2010. In addition, it recalibrates the case-mix indexes so that they more accurately reflect parity in expenditures related to the implementation of case-mix refinements in January 2006. It also discusses the results of our ongoing analysis of nursing home staff time measurement data collected in the Staff Time and Resource Intensity Verification project, as well as a new Resource Utilization Groups, version 4 case-mix classification model for FY 2011 that will use the updated Minimum Data Set 3.0 resident assessment for case-mix classification. In addition, this final rule discusses the public comments that we have received on these and other issues, including a possible requirement for the quarterly reporting of nursing home staffing data, as well as on applying the quality monitoring mechanism in place for all other SNF PPS facilities to rural swing-bed hospitals. Finally, this final rule revises the regulations to incorporate certain technical corrections.

  7. Evaluation of MPLM Design and Mission 6A Coupled Loads Analyses

    NASA Technical Reports Server (NTRS)

    Bookout, Paul S.; Ricks, Ed

    1999-01-01

    During the development of a space shuttle payload, several coupled loads analyses (CLAs) are usually performed: preliminary design, critical design, final design, and verification loads analysis (VLA). A final design CLA is the last analysis conducted prior to model delivery to the shuttle program for the VLA. The finite element models used in the final design CLA and the VLA are test-verified dynamic math models. Mission 6A is the first of many flights of the Multi-Purpose Logistics Module (MPLM). The MPLM was developed by Alenia Spazio S.p.A. (an Italian aerospace company) and houses the International Standard Payload Racks (ISPRs) for transportation to the space station in the shuttle. Marshall Space Flight Center (MSFC), the payload integrator of the MPLM for Mission 6A, performed the final design CLA using the M6.OZC shuttle data for liftoff and landing conditions with the proper shuttle cargo manifest. Alenia performed the preliminary and critical design CLAs for the development of the MPLM; however, these CLAs did not use the current Mission 6A cargo manifest. An evaluation of the preliminary and critical design CLAs performed by Alenia and the final design CLA performed by MSFC is presented.

  8. Mars Science Laboratory Propulsive Maneuver Design and Execution

    NASA Technical Reports Server (NTRS)

    Wong, Mau C.; Kangas, Julie A.; Ballard, Christopher G.; Gustafson, Eric D.; Martin-Mur, Tomas J.

    2012-01-01

    The NASA Mars Science Laboratory (MSL) rover, Curiosity, was launched on November 26, 2011 and successfully landed in Gale Crater on Mars. For the 8-month interplanetary trajectory from Earth to Mars, five nominal and two contingency trajectory correction maneuvers (TCMs) were planned. The goal of these TCMs was to deliver the spacecraft accurately to the desired atmospheric entry aimpoint in the Martian atmosphere so as to ensure a high probability of successful landing on the Mars surface. The primary mission requirements on maneuver performance were the total mission propellant usage and the entry flight path angle (EFPA) delivery accuracy; both were comfortably met in this mission. In this paper we describe the spacecraft propulsion system, TCM constraints and requirements, TCM design processes, and their implementation and verification.

  9. Scores on the 16 Personality Factor Test and Success in College Calculus 1.

    ERIC Educational Resources Information Center

    Shaughnessy, Michael F.; And Others

    This study explored personality variables measured by the 16 Personality Factor (16PF) test and their relevance to success, defined as the final course grade, in college calculus courses with 94 students. Two personality variables were significant predictors of success. A Statistical Analysis System…

  10. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means of formal property definition to serve as a target for verification planning. This article addresses several of these shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed, based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
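
    The classification-tree idea partitions each input into equivalence classes and draws a test case as one class per classification; a minimal sketch of that combination step, with a hypothetical tree, is below (real CTM tools prune the combinations via dependency rules).

        # Minimal classification-tree combination step: choosing one class
        # per classification yields one abstract test case. Tree is hypothetical.
        from itertools import product

        classifications = {
            "vehicle_speed": ["zero", "low", "high"],
            "brake_pedal":   ["released", "pressed"],
            "road_surface":  ["dry", "wet"],
        }

        # Full combinatorial coverage over the leaf classes.
        test_cases = [dict(zip(classifications, combo))
                      for combo in product(*classifications.values())]
        for i, tc in enumerate(test_cases, 1):
            print(f"TC{i:02d}: {tc}")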

  11. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level Boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, are one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
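
    The low-level equivalence checking the paper contrasts itself with can be pictured as comparing two combinational functions over all inputs; the brute-force stand-in below (with hypothetical spec and implementation) does exhaustively what a BDD-based checker does symbolically.

        # Brute-force combinational equivalence check; BDDs perform this
        # comparison symbolically, enumeration suffices for a sketch.
        from itertools import product

        def spec(a, b, c):     # abstract specification
            return (a and b) or c

        def impl(a, b, c):     # gate-level implementation under test
            return not ((not (a and b)) and (not c))   # De Morgan form

        equivalent = all(spec(*v) == impl(*v)
                         for v in product([False, True], repeat=3))
        print("implementation matches specification:", equivalent)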

  12. Critical Software for Human Spaceflight

    NASA Technical Reports Server (NTRS)

    Preden, Antonio; Kaschner, Jens; Rettig, Felix; Rodriggs, Michael

    2017-01-01

    The NASA Orion vehicle that will fly to the Moon in the coming years is propelled along its mission by the European Service Module (ESM), developed by ESA and its prime contractor Airbus Defense and Space. This paper describes the development of the Propulsion Drive Electronics (PDE) software that provides the interface between the propulsion hardware of the European Service Module and the Orion flight computers, and highlights the challenges faced during the development. In particular, it presents the aspects specific to human spaceflight in an international cooperation, such as compliance with both European and US standards and the classification of the software at the highest criticality category, A. An innovative aspect of the PDE software is its Time-Triggered Ethernet interface with the Orion flight computers, which has never before flown on a European spacecraft. Finally, the verification aspects are presented, applying the most stringent quality requirements defined in the European Cooperation for Space Standardization (ECSS) standards, such as structural coverage analysis of the object code and an independent software verification and validation activity carried out in parallel by a separate team.

  13. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks.

    PubMed

    Zhu, Hongfei; Tan, Yu-An; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang

    2018-05-22

    With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives through systems such as e-payment and e-voting. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a Number Theory Research Unit (NTRU) lattice; the scheme uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, the scheme does not depend on a complex public key infrastructure and can resist quantum computer attacks. We then design an e-payment protocol using the proposed scheme. Furthermore, we prove the scheme is secure in the random oracle model and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms existing traditional identity-based blind signature schemes in signing and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size.
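
    The rejection-sampling idea (restart until the candidate signature component is distributed independently of the secret, in the style of Fiat-Shamir with aborts) can be sketched as below; the bound, ranges, and parameter names are hypothetical simplifications, not the paper's actual scheme.

        # Simplified rejection sampling: resample until z = y + s*c lands
        # in a secret-independent range. Toy parameters, not the paper's.
        import random

        B, S_MAX = 2**10, 31    # acceptance bound, max |secret * challenge|

        def sign_attempt(secret, challenge):
            y = random.randint(-(B + S_MAX), B + S_MAX)  # masking value
            z = y + secret * challenge
            # Accept only |z| <= B: conditioned on acceptance, z is uniform
            # on [-B, B] regardless of the secret, so z leaks nothing.
            return z if abs(z) <= B else None

        secret = random.randint(-S_MAX, S_MAX)
        challenge = random.choice([-1, 0, 1])
        z = None
        while z is None:        # abort-and-retry loop
            z = sign_attempt(secret, challenge)
        print("accepted signature component z =", z)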

  14. Failure analysis of bolts for diesel engine intercooler

    NASA Astrophysics Data System (ADS)

    Ren, Ping; Li, Zongquan; Wu, Jiangfei; Guo, Yibin; Li, Wanyou

    2017-05-01

    In diesel generating sets, a cracked intercooler bolt that cannot be repaired promptly leads to severe operating conditions. This paper addresses the failure of the bolts of a diesel generator intercooler and presents analyses of their static strength and fatigue strength. The static strength is checked considering bolt preload and thermal stress. To obtain the thermal stress in the bolt, a thermodynamic analysis of the intercooler is performed based on measured temperatures. Based on the measured vibration response and the finite element model, the equivalent excitation force of the unit is solved using a dynamic load identification technique. To obtain the force on the bolt, this excitation force is applied to the finite element model. Treating the thermal stress and preload as the mean stress and the mechanical stress as the alternating stress, a fatigue strength analysis is carried out. A diagnosis procedure is proposed. Finally, the strength verification confirms the fatigue failure. Further studies are needed to verify the results of the strength analysis, and improvement suggestions are put forward.
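
    The mean/alternating-stress split described above feeds a standard fatigue criterion; a minimal sketch using the Goodman relation, with hypothetical bolt stresses and material properties (the paper does not state its numbers), is:

        # Goodman fatigue check: preload + thermal stress as mean stress,
        # vibration-induced stress as alternating stress. Values hypothetical.
        sigma_preload = 400.0e6   # Pa, bolt preload stress
        sigma_thermal = 120.0e6   # Pa, thermal stress
        sigma_alt     = 60.0e6    # Pa, alternating stress from vibration

        sigma_mean = sigma_preload + sigma_thermal
        S_e  = 150.0e6            # Pa, endurance limit (hypothetical)
        S_ut = 800.0e6            # Pa, ultimate tensile strength

        # Goodman: sigma_a/S_e + sigma_m/S_ut = 1/n  ->  safety factor n
        n = 1.0 / (sigma_alt / S_e + sigma_mean / S_ut)
        print(f"fatigue safety factor n = {n:.2f} "
              "(n < 1 predicts fatigue failure)")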

  15. Hafnium transistor design for neural interfacing.

    PubMed

    Parent, David W; Basham, Eric J

    2008-01-01

    A design methodology is presented that uses the EKV model and the gm/ID biasing technique to design hafnium oxide field effect transistors that are suitable for neural recording circuitry. The DC gain of a common source amplifier is correlated to the structural properties of a field effect transistor (FET) and a metal-insulator-semiconductor (MIS) capacitor. This approach allows a transistor designer to use a design flow that starts with simple, intuitive 1-D equations for gain that can be verified in 1-D MIS capacitor TCAD simulations before final TCAD process verification of the transistor properties. The DC gain of a common source amplifier is optimized using fast 1-D simulations, reserving the slower, more complex 2-D simulations for verification. The 1-D equations are used to show that the increased dielectric constant of hafnium oxide allows a higher DC gain for a given oxide thickness. An additional benefit is that the MIS capacitor can be employed to test further performance parameters important to an open-gate transistor, such as dielectric stability and ionic penetration.
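
    The gain benefit of the high-k dielectric follows from the larger oxide capacitance at a fixed thickness, since C_ox = eps0 * k / t_ox and the transconductance scales with C_ox at a given bias; a sketch with textbook permittivities and a hypothetical thickness:

        # Oxide capacitance per unit area at equal thickness: HfO2's higher
        # dielectric constant raises C_ox, hence g_m and achievable DC gain.
        EPS0 = 8.854e-12    # F/m, vacuum permittivity
        T_OX = 10e-9        # m, hypothetical dielectric thickness

        for name, k in [("SiO2", 3.9), ("HfO2", 25.0)]:  # nominal k values
            c_ox = EPS0 * k / T_OX          # F/m^2
            print(f"{name}: C_ox = {c_ox * 1e5:.0f} nF/cm^2")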

  16. Development of Independent-type Optical CT

    NASA Astrophysics Data System (ADS)

    Yamaguchi, Tatsushi; Shiozawa, Daigoro; Rokunohe, Toshiaki; Kida, Junzo; Zhang, Wei

    Optical current transformers (optical CTs) can be made much smaller and lighter than conventional electromagnetic induction transformers owing to their simple structure, and their excellent surge resistance contributes to improved equipment reliability. The authors regard optical CTs as next-generation transformers and are conducting research and development aimed at applying them to measurement and protection in electric power systems. Specifically, we developed an independent-type optical CT by drawing on accumulated basic data on optical CT large-current characteristics, temperature characteristics, vibration resistance, and so on. In performance verification, type tests complying with IEC standards, including short-time current tests, insulation tests, and accuracy tests, showed good results. This report describes the basic principle and configuration of optical CTs. It then presents, as basic characteristics, the conditions and results of verification tests for the dielectric breakdown characteristics of sensor fibers, large-current characteristics, temperature characteristics, and vibration resistance. Finally, it outlines the development of the independent-type optical CT aimed at all-digital substations and the results of its type tests.
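
    An optical CT infers current from Faraday rotation: for a sensing fiber making N closed loops around the conductor, the polarization rotation is theta = V_eff * N * I, with V_eff an effective Verdet term. A toy forward model and inversion, with hypothetical values:

        # Faraday-effect current sensing sketch: rotation theta = V * N * I
        # for a fiber looped N times around the conductor. Values hypothetical.
        V_EFF = 2.6e-6    # rad/A, hypothetical effective Verdet term
        N_TURNS = 10      # fiber loops around the conductor

        def measured_rotation(i_amps):
            return V_EFF * N_TURNS * i_amps        # ideal sensor model

        def recover_current(theta_rad):
            return theta_rad / (V_EFF * N_TURNS)   # inversion step

        theta = measured_rotation(2000.0)          # 2 kA primary current
        print(f"rotation = {theta:.4f} rad, "
              f"recovered I = {recover_current(theta):.1f} A")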

  17. Practical Formal Verification of Diagnosability of Large Models via Symbolic Model Checking

    NASA Technical Reports Server (NTRS)

    Cavada, Roberto; Pecheur, Charles

    2003-01-01

    This document reports on the activities carried out during a four-week visit of Roberto Cavada at the NASA Ames Research Center. The main goal was to test the practical applicability of the proposed framework, in which a diagnosability problem is reduced to a symbolic model checking problem. Section 2 contains a brief explanation of the major techniques currently used in symbolic model checking and how these techniques can be tuned to obtain good performance from model checking tools. Diagnosability is performed on large, structured models of real plants. Section 3 describes how these plants are modeled and how the models can be simplified to improve the performance of symbolic model checkers. Section 4 reports scalability results: three test cases are briefly presented, several parameters and techniques are applied to them to produce comparison tables, and a comparison between several model checkers is reported. Section 5 summarizes the application of diagnosability verification to a real application, with several properties tested and the results highlighted. Finally, Section 6 draws some conclusions and outlines future lines of research.
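
    The reduction being tested can be pictured with the classic twin-plant construction: run two copies of the model that must agree on observable events and search for a faulty run and a nominal run that remain observation-equivalent forever; finding one refutes diagnosability. A small explicit-state sketch over a hypothetical automaton follows (the report's approach is symbolic, not explicit-state).

        # Twin-plant sketch: BFS over pairs of runs that share observables.
        # A "critical pair" (fault in one copy only) that can repeat an
        # observable step forever refutes diagnosability. Toy model.
        from collections import deque

        TRANS = {"s0": [("f", "s1"), ("u", "s2")],
                 "s1": [("a", "s1")],   # after fault: observable 'a' loop
                 "s2": [("a", "s2")]}   # nominal: the same observable loop
        UNOBS = {"f", "u"}              # 'f' is the unobservable fault

        def diagnosable():
            seen = set()
            queue = deque([("s0", False, "s0", False)])  # p, pf, q, qf
            while queue:
                state = queue.popleft()
                if state in seen:
                    continue
                seen.add(state)
                p, pf, q, qf = state
                if pf and not qf:
                    # shared observable self-loop keeps the fault ambiguous
                    loops = (
                        {e for e, n in TRANS[p] if e not in UNOBS and n == p}
                        & {e for e, n in TRANS[q] if e not in UNOBS and n == q})
                    if loops:
                        return False
                for e, n in TRANS[p]:                 # silent moves, copy 1
                    if e in UNOBS:
                        queue.append((n, pf or e == "f", q, qf))
                for e, n in TRANS[q]:                 # silent moves, copy 2
                    if e in UNOBS:
                        queue.append((p, pf, n, qf or e == "f"))
                for e1, n1 in TRANS[p]:               # synchronized observables
                    for e2, n2 in TRANS[q]:
                        if e1 == e2 and e1 not in UNOBS:
                            queue.append((n1, pf, n2, qf))
            return True

        print("toy model diagnosable:", diagnosable())  # False here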

  18. An Identity-Based Anti-Quantum Privacy-Preserving Blind Authentication in Wireless Sensor Networks

    PubMed Central

    Zhu, Hongfei; Tan, Yu-an; Zhu, Liehuang; Wang, Xianmin; Zhang, Quanxin; Li, Yuanzhang

    2018-01-01

    With the development of wireless sensor networks, IoT devices are crucial for the Smart City; these devices change people's lives through systems such as e-payment and e-voting. However, in these two systems, the state-of-the-art authentication protocols based on traditional number theory cannot defeat a quantum computer attack. In order to protect user privacy and guarantee the trustworthiness of big data, we propose a new identity-based blind signature scheme based on a Number Theory Research Unit (NTRU) lattice; the scheme uses a rejection sampling theorem instead of constructing a trapdoor. Meanwhile, the scheme does not depend on a complex public key infrastructure and can resist quantum computer attacks. We then design an e-payment protocol using the proposed scheme. Furthermore, we prove the scheme is secure in the random oracle model and satisfies confidentiality, integrity, and non-repudiation. Finally, we demonstrate that the proposed scheme outperforms existing traditional identity-based blind signature schemes in signing and verification speed, and outperforms other lattice-based blind signatures in signing speed, verification speed, and signing secret key size. PMID:29789475

  19. Real-Time Reliability Verification for UAV Flight Control System Supporting Airworthiness Certification.

    PubMed

    Xu, Haiyang; Wang, Ping

    2016-01-01

    In order to verify the real-time reliability of an unmanned aerial vehicle (UAV) flight control system and comply with the airworthiness certification standard, we propose a model-based integration framework for modeling and verifying time properties. Drawing on the advantages of MARTE, this framework uses class diagrams to create the static model of the software system and state charts to create the dynamic model. According to the defined transformation rules, the MARTE model can be transformed into a formal integrated model, and the different parts of the model can be verified using existing formal tools. For the real-time specifications of the software system, we also propose an algorithm for generating temporal logic formulas, which automatically extracts real-time properties from time-sensitive live sequence charts (TLSC). Finally, we modeled a simplified UAV flight control system to check its real-time properties. The results showed that the framework can be used to create the system model and to precisely analyze and verify the real-time reliability of the UAV flight control system.
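
    The formula-generation step can be pictured as turning each time-annotated trigger/response pair from a chart into a bounded-response property; the sketch below is a hypothetical simplification, not the paper's actual algorithm, and the event names and deadlines are invented.

        # Hypothetical sketch: derive bounded-response (MTL-style) formulas
        # from time-annotated (trigger, response, deadline_ms) chart pairs.
        chart_pairs = [
            ("cmd_received", "actuator_set", 20),
            ("sensor_fault", "failsafe_on",  50),
        ]

        def bounded_response(trigger, response, deadline_ms):
            # G (trigger -> F[0,d] response): every trigger is answered
            # within the deadline.
            return f"G ({trigger} -> F[0,{deadline_ms}] {response})"

        for t, r, d in chart_pairs:
            print(bounded_response(t, r, d))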

  20. Measurement of self-evaluative motives: a shopping scenario.

    PubMed

    Wajda, Theresa A; Kolbe, Richard; Hu, Michael Y; Cui, Annie Peng

    2008-08-01

    To develop measures of consumers' self-evaluative motives of Self-verification, Self-enhancement, and Self-improvement within the context of a mall shopping environment, an initial set of 49 items was generated through three focus-group sessions. These items were subsequently converted into shopping-dependent motive statements. 250 undergraduate college students responded on a 7-point scale to each statement as it related to recently acquired personal shopping goods. An exploratory factor analysis yielded five factors, accounting for 57.7% of the variance, three of which corresponded to the Self-verification motive (five items), the Self-enhancement motive (three items), and the Self-improvement motive (six items). These 14 items, along with 9 reconstructed items, yielded 23 items that were retained and subjected to additional testing. In a final round of data collection, 169 college students provided data for exploratory factor analysis, and 11 items were carried into confirmatory factor analysis. The analysis indicated that the 11-item scale adequately captured the three self-evaluative motives; however, further data reduction produced a 9-item scale with a marked improvement in statistical fit over the 11-item scale.
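
    The exploratory-factor-analysis step can be sketched with scikit-learn's FactorAnalysis on item-response data of the same shape as the final round; the data below is a random stand-in, not the study's, and the retention cutoff is a common convention rather than the authors' criterion.

        # EFA sketch on stand-in data: 169 respondents x 23 items, three
        # factors as in the self-evaluative-motives scale. Data is random.
        import numpy as np
        from sklearn.decomposition import FactorAnalysis

        rng = np.random.default_rng(0)
        responses = rng.integers(1, 8, size=(169, 23)).astype(float)  # 7-pt scale

        fa = FactorAnalysis(n_components=3, random_state=0)
        fa.fit(responses)
        loadings = fa.components_.T    # items x factors loading matrix
        print("loadings shape:", loadings.shape)
        # Items loading strongly on a single factor (e.g., |loading| > 0.4)
        # would be retained; weak or cross-loading items dropped.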
