Sample records for digital design verification

  1. Mutation Testing for Effective Verification of Digital Components of Physical Systems

    NASA Astrophysics Data System (ADS)

    Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.

    2015-12-01

    Digital components of modern physical systems are often designed using circuitry based on field-programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical-system components based on mutation testing is proposed. The reference description of a digital component's behavior in a hardware description language (HDL) is mutated by introducing the most probable errors, and, unlike for mutants in high-level programming languages, a corresponding test case can be derived effectively by comparing special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
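    As a hedged illustration of the core idea (a toy Boolean model in Python, not the authors' HDL tooling), the sketch below mutates a reference function with plausible operator-swap errors and searches for an input vector that distinguishes each mutant from the specification; that vector is the derived test case.

```python
# Hedged sketch (not the authors' tool): mutation testing for a toy
# combinational design. A mutant is "killed" by any input on which its
# output differs from the specification's.
from itertools import product

def spec(a, b, c):
    """Reference behavior: a 2-input AND ORed with an enable term."""
    return (a & b) | (~c & a) & 1

# Plausible "most probable error" mutations: operator swaps, lost inverter.
mutants = [
    lambda a, b, c: (a | b) | (~c & a) & 1,   # AND -> OR
    lambda a, b, c: (a & b) & (~c & a) & 1,   # OR -> AND
    lambda a, b, c: (a & b) | (c & a) & 1,    # dropped inverter
]

def derive_test(mutant):
    """Return the first input vector on which mutant and spec disagree."""
    for a, b, c in product((0, 1), repeat=3):
        if spec(a, b, c) != mutant(a, b, c):
            return (a, b, c)
    return None  # mutant is functionally equivalent to the spec

for i, m in enumerate(mutants):
    print(f"mutant {i}: killing test vector = {derive_test(m)}")
```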

  2. A UVM simulation environment for the study, optimization and verification of HL-LHC digital pixel readout chips

    NASA Astrophysics Data System (ADS)

    Marconi, S.; Conti, E.; Christiansen, J.; Placidi, P.

    2018-05-01

    The operating conditions of the High Luminosity upgrade of the Large Hadron Collider are very demanding for the design of next generation hybrid pixel readout chips in terms of particle rate, radiation level and data bandwidth. To this end, the RD53 Collaboration has developed for the ATLAS and CMS experiments a dedicated simulation and verification environment using industry-consolidated tools and methodologies, such as SystemVerilog and the Universal Verification Methodology (UVM). This paper presents how the so-called VEPIX53 environment first guided the design of digital architectures optimized for processing and buffering very high particle rates, and how it was then reused for the functional verification of the first large-scale demonstrator chip designed by the collaboration, which has recently been submitted.
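    UVM itself is a SystemVerilog methodology; the Python sketch below only mirrors its testbench structure (stimulus generation, a golden reference model, and a scoreboard comparing DUT output against predictions). The buffer DUT and all names are illustrative assumptions, not the VEPIX53 environment.

```python
# Illustrative analogue of a UVM-style test loop: drive constrained-random
# "hits" into a stand-in DUT, predict behavior with a reference model, and
# compare in a scoreboard.
import random

class DutBufferModel:
    """Stand-in DUT: a bounded hit buffer that drops on overflow."""
    def __init__(self, depth):
        self.depth, self.fifo = depth, []
    def push(self, hit):
        if len(self.fifo) < self.depth:
            self.fifo.append(hit)
            return True
        return False  # hit lost: buffer overflow

def reference(hits, depth):
    """Golden model: how many hits a depth-limited buffer should accept."""
    return min(len(hits), depth)

def run_test(n_hits=20, depth=8, seed=1):
    random.seed(seed)
    dut = DutBufferModel(depth)
    hits = [random.randrange(256) for _ in range(n_hits)]
    accepted = sum(dut.push(h) for h in hits)
    expected = reference(hits, depth)
    assert accepted == expected, f"scoreboard mismatch: {accepted} != {expected}"
    print(f"scoreboard OK: {accepted}/{n_hits} hits accepted (depth={depth})")

run_test()
```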

  3. Analysis of potential errors in real-time streamflow data and methods of data verification by digital computer

    USGS Publications Warehouse

    Lystrom, David J.

    1972-01-01

    Various methods of verifying real-time streamflow data are outlined in part II. Relatively large errors (those greater than 20-30 percent) can be detected readily by use of well-designed verification programs for a digital computer, and smaller errors can be detected only by discharge measurements and field observations. The capability to substitute a simulated discharge value for missing or erroneous data is incorporated in some of the verification routines described. The routines represent concepts ranging from basic statistical comparisons to complex watershed modeling and provide a selection from which real-time data users can choose a suitable level of verification.
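    A minimal sketch of this style of screening, with assumed thresholds and a simple persistence substitute (the actual USGS routines are not reproduced here): flag values outside a plausible range or with implausible jumps, and substitute an estimate for flagged readings.

```python
# Basic statistical-comparison verification of a streamflow series:
# range check, rate-of-change check, and substitution of a simulated
# (here: last-good-value) estimate for flagged points.
def verify_streamflow(series, lo=0.0, hi=50_000.0, max_step=0.30):
    """Return (cleaned, flags); flagged points get a persistence estimate."""
    cleaned, flags = [], []
    for i, q in enumerate(series):
        bad = not (lo <= q <= hi)
        if not bad and i > 0 and cleaned[-1] > 0:
            bad = abs(q - cleaned[-1]) / cleaned[-1] > max_step
        if bad:
            flags.append(i)
            q = cleaned[-1] if cleaned else lo  # simulated substitute value
        cleaned.append(q)
    return cleaned, flags

data = [120.0, 118.0, 119.5, 480.0, 121.0, -5.0, 122.5]
cleaned, flags = verify_streamflow(data)
print("flagged indices:", flags)    # the 480.0 spike and the negative value
print("cleaned series:", cleaned)
```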

  4. Software development for airborne radar

    NASA Astrophysics Data System (ADS)

    Sundstrom, Ingvar G.

    Some aspects of the development of software for a modern multimode airborne nose radar are described. First, an overview of where software is used in the radar units is presented. The development phases (system design, functional design, detailed design, function verification, and system verification) are then used as the starting point for the discussion. Methods, tools, and the most important documents are described. The importance of video flight recording in the early stages, and of digital signal generators for performance verification, is emphasized. Some future trends are discussed.

  5. A digital flight control system verification laboratory

    NASA Technical Reports Server (NTRS)

    De Feo, P.; Saib, S.

    1982-01-01

    A NASA/FAA program has been established for the verification and validation of digital flight control systems (DFCS), with the primary objective being the development and analysis of automated verification tools. In order to enhance the capabilities, effectiveness, and ease of use of the test environment, software verification tools can be applied. Tool design includes a static analyzer, an assertion generator, a symbolic executor, a dynamic analysis instrument, and an automated documentation generator. Static and dynamic tools are integrated with error detection capabilities, resulting in a facility which analyzes a representative testbed of DFCS software. Future investigations will focus on expanding the set of software test tools and on assessing their cost effectiveness.
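    As a small illustration of one tool in this family, the sketch below shows a dynamic assertion monitor: generated pre- and postconditions are checked while a toy control-law routine executes. The control law and assertions are hypothetical, not part of the NASA/FAA toolset.

```python
# Dynamic analysis via runtime assertions: a decorator wraps a routine with
# entry/exit checks, the kind of instrumentation an assertion generator
# could emit automatically.
def checked(pre, post):
    """Wrap a routine with a precondition and a postcondition."""
    def wrap(fn):
        def inner(*args):
            assert pre(*args), f"precondition failed for {fn.__name__}{args}"
            out = fn(*args)
            assert post(out), f"postcondition failed: {fn.__name__} -> {out}"
            return out
        return inner
    return wrap

@checked(pre=lambda err, dt: dt > 0, post=lambda u: -30.0 <= u <= 30.0)
def elevator_command(pitch_error, dt):
    """Toy proportional control law with actuator saturation limits."""
    return max(-30.0, min(30.0, 4.0 * pitch_error))

print(elevator_command(2.5, 0.02))    # passes both assertions
try:
    elevator_command(2.5, 0.0)        # dt = 0 trips the precondition
except AssertionError as e:
    print("caught:", e)
```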

  6. Design exploration and verification platform, based on high-level modeling and FPGA prototyping, for fast and flexible digital communication in physics experiments

    NASA Astrophysics Data System (ADS)

    Magazzù, G.; Borgese, G.; Costantino, N.; Fanucci, L.; Incandela, J.; Saponara, S.

    2013-02-01

    In many research fields, such as high energy physics (HEP), astrophysics, nuclear medicine, or space engineering with harsh operating conditions, the use of fast and flexible digital communication protocols is becoming more and more important. The possibility of having a smart, tested, top-down design flow for designing a new protocol for the control/readout of front-end electronics is very useful. To this aim, and to reduce development time, costs, and risks, this paper describes an innovative design/verification flow applied, as an example case study, to a new communication protocol called FF-LYNX. After describing the main FF-LYNX features, the paper presents: the definition of a parametric SystemC-based Integrated Simulation Environment (ISE) for high-level protocol definition and validation; the setup of figures of merit to drive the design space exploration; the use of the ISE for early analysis of the achievable performance when adopting the new communication protocol and its interfaces in a new (or upgraded) physics experiment; the design of VHDL IP cores for the TX and RX protocol interfaces; their implementation on an FPGA-based emulator for functional verification; and finally the modification of the FPGA-based emulator for testing the ASIC chipset which implements the rad-tolerant protocol interfaces. For every step, significant results are shown to underline the usefulness of this design and verification approach, which can be applied to the development of any new digital protocol for smart detectors in physics experiments.
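    The design-space-exploration step can be illustrated with a toy Python model (the real ISE is SystemC-based, and the load model and weights here are assumptions): sweep a buffer depth under a bursty hit load and score each candidate with a figure of merit trading loss rate against an area proxy.

```python
# Parameter sweep with a figure of merit: simulate hit loss for several
# buffer depths in a slotted-time queue model, then pick the best trade-off.
import random

def simulate_loss(depth, p=0.3, sources=3, slots=10_000, seed=7):
    """Fraction of hits lost to buffer overflow in a toy slotted model."""
    rng = random.Random(seed)
    queue, lost, total = 0, 0, 0
    for _ in range(slots):
        arrivals = sum(rng.random() < p for _ in range(sources))
        total += arrivals
        for _ in range(arrivals):
            if queue < depth:
                queue += 1
            else:
                lost += 1
        queue = max(0, queue - 1)       # one hit read out per slot
    return lost / max(total, 1)

best = None
for depth in (2, 4, 8, 16, 32):
    loss = simulate_loss(depth)
    fom = loss + 0.002 * depth          # weight loss rate vs. area proxy
    print(f"depth={depth:2d}  loss={loss:.4f}  FoM={fom:.4f}")
    if best is None or fom < best[1]:
        best = (depth, fom)
print("selected buffer depth:", best[0])
```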

  7. Microprocessor Based Temperature Control of Liquid Delivery with Flow Disturbances.

    ERIC Educational Resources Information Center

    Kaya, Azmi

    1982-01-01

    Discusses the analytical design and experimental verification of a PID control valve for a temperature-controlled liquid delivery system, demonstrating that the analytical design techniques can be experimentally verified by using digital controls as a tool. Digital control instrumentation and implementation are also demonstrated and documented for…

  8. Digital-flight-control-system software written in automated-engineering-design language: A user's guide of verification and validation tools

    NASA Technical Reports Server (NTRS)

    Saito, Jim

    1987-01-01

    The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run using the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools since they were not updated and are not currently active. Additionally, current descriptions of the AED V&V tools are included, augmenting the information in NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100, and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.

  9. Computer program user's manual for FIREFINDER digital topographic data verification library dubbing system

    NASA Astrophysics Data System (ADS)

    Ceres, M.; Heselton, L. R., III

    1981-11-01

    This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS), and will assist in the maintenance of these programs. The manual contains detailed flow diagrams and associated descriptions for each computer program routine and subroutine. Complete computer program listings are also included. This information should be used when changes are made in the computer programs. The operating system has been designed to minimize operator intervention.

  10. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Punnoose, Ratish J.; Armstrong, Robert C.; Wong, Matthew H.

    Formal methods have come into wide use because of their effectiveness in verifying "safety and security" requirements of digital systems; a set of requirements for which testing is mostly ineffective. Formal methods are routinely used in the design and verification of high-consequence digital systems in industry. This report outlines our work in assessing the capabilities of commercial and open source formal tools and the ways in which they can be leveraged in digital design workflows.

  11. Digital avionics systems - Overview of FAA/NASA/industry-wide briefing

    NASA Technical Reports Server (NTRS)

    Larsen, William E.; Carro, Anthony

    1986-01-01

    The effects of incorporating digital technology into the design of aircraft on the airworthiness criteria and certification procedures for aircraft are investigated. FAA research programs aimed at providing data for the functional assessment of aircraft which use digital systems for avionics and flight control functions are discussed. The need to establish testing, assurance assessment, and configuration management technologies to insure the reliability of digital systems is discussed; consideration is given to design verification, system performance/robustness, and validation technology.

  12. Formal design and verification of a reliable computing platform for real-time control. Phase 2: Results

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.

    1992-01-01

    The design and formal verification of the Reliable Computing Platform (RCP), a fault tolerant computing system for digital flight control applications, is presented. The RCP uses N-Multiply Redundant (NMR) style redundancy to mask faults and internal majority voting to flush the effects of transient faults. The system is formally specified and verified using the Ehdm verification system. A major goal of this work is to provide the system with significant capability to withstand the effects of High Intensity Radiated Fields (HIRF).
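    A minimal executable sketch of the two mechanisms named above (an assumption-laden toy, not the verified Ehdm model): majority voting over replica outputs masks a fault, and an internal vote over replica state flushes a transient.

```python
# NMR-style masking and transient flushing in miniature.
from collections import Counter

def majority(values):
    """Return the majority value among replica outputs."""
    values = list(values)
    value, count = Counter(values).most_common(1)[0]
    assert count > len(values) // 2, "no majority: too many faults"
    return value

# Three replicas hold the same control state; replica 1 suffers a transient.
state = [100, 100, 100]
state[1] ^= 0x40                                 # transient bit flip
actuator_cmd = majority(s + 1 for s in state)    # output vote masks the fault
state = [majority(state)] * 3                    # internal vote flushes it
print("actuator command:", actuator_cmd, "| recovered state:", state)
```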

  13. Design of a modular digital computer system DRL 4 and 5. [design of airborne/spaceborne computer system

    NASA Technical Reports Server (NTRS)

    1973-01-01

    Design and development efforts for a spaceborne modular computer system are reported. An initial baseline description is followed by an interface design that includes definition of the overall system response to all classes of failure. Final versions of the register-level designs for all module types were completed. Packaging, support and control executive software, including memory utilization estimates and a design verification plan, were formalized to ensure a soundly integrated design of the digital computer system.

  14. Formal development of a clock synchronization circuit

    NASA Technical Reports Server (NTRS)

    Miner, Paul S.

    1995-01-01

    This talk presents the latest stage in the formal development of a fault-tolerant clock synchronization circuit. The development spans from a high level specification of the required properties to a circuit realizing the core function of the system. An abstract description of an algorithm has been verified to satisfy the high-level properties using the mechanical verification system EHDM. This abstract description is recast as a behavioral specification input to the Digital Design Derivation system (DDD) developed at Indiana University. DDD provides a formal design algebra for developing correct digital hardware. Using DDD as the principal design environment, a core circuit implementing the clock synchronization algorithm was developed. The design process consisted of standard DDD transformations augmented with an ad hoc refinement justified using the Prototype Verification System (PVS) from SRI International. Subsequent to the above development, Wilfredo Torres-Pomales discovered an area-efficient realization of the same function. Establishing correctness of this optimization requires reasoning in arithmetic, so a general verification is outside the domain of both DDD transformations and model-checking techniques. DDD represents digital hardware by systems of mutually recursive stream equations. A collection of PVS theories was developed to aid in reasoning about DDD-style streams. These theories include a combinator for defining streams that satisfy stream equations, and a means for proving stream equivalence by exhibiting a stream bisimulation. DDD was used to isolate the sub-system involved in Torres-Pomales' optimization. The equivalence between the original design and the optimized one was verified in PVS by exhibiting a suitable bisimulation. The verification depended upon type constraints on the input streams and made extensive use of the PVS type system. The dependent types in PVS provided a useful mechanism for defining an appropriate bisimulation.
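    As a bounded, executable stand-in for the bisimulation argument (the PVS proof covers infinite streams; the circuit pair below is invented for illustration), two DDD-style stream definitions can be represented as Python generators and checked for agreement on a finite prefix.

```python
# Two stream definitions of the "same" function, checked on a prefix.
from itertools import islice

def original(xs):
    """y(n) = x(n) + x(n), as a two-adder stream equation."""
    for x in xs:
        yield x + x

def optimized(xs):
    """Area-efficient variant: one shift replaces the adder pair."""
    for x in xs:
        yield x << 1

def agree(n, make_a, make_b, inputs):
    """Check the two stream definitions agree on an n-step prefix."""
    a = islice(make_a(iter(inputs)), n)
    b = islice(make_b(iter(inputs)), n)
    return all(x == y for x, y in zip(a, b))

# x + x == x << 1 relies on the inputs being integers -- an echo of the
# type constraints on input streams that the PVS proof depends on.
inputs = list(range(64))
print("streams agree on a 64-step prefix:",
      agree(64, original, optimized, inputs))
```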

  15. Verification of Triple Modular Redundancy Insertion for Reliable and Trusted Systems

    NASA Technical Reports Server (NTRS)

    Berg, Melanie; LaBel, Kenneth

    2016-01-01

    If a system is required to be protected using triple modular redundancy (TMR), improper insertion can jeopardize the reliability and security of the system. Due to the complexity of the verification process and the complexity of digital designs, there are currently no available techniques that can provide complete and reliable confirmation of TMR insertion. We propose a method for TMR insertion verification that satisfies the process for reliable and trusted systems.
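    One structural check such a verifier might perform can be sketched as follows (the abstract does not disclose the proposed method; the netlist encoding and naming convention here are assumptions): confirm that every state register exists in three copies and that the copies feed a majority voter.

```python
# Toy netlist check for TMR insertion defects.
netlist = {
    # cell name    : (cell type, fan-in list)
    "reg_a_tmr0": ("DFF", ["din"]),
    "reg_a_tmr1": ("DFF", ["din"]),
    "reg_a_tmr2": ("DFF", ["din"]),
    "vote_a":     ("MAJ3", ["reg_a_tmr0", "reg_a_tmr1", "reg_a_tmr2"]),
    "reg_b_tmr0": ("DFF", ["vote_a"]),   # missing two copies: a TMR defect
}

def check_tmr(netlist):
    errors, groups = [], {}
    for name, (ctype, _) in netlist.items():
        if ctype == "DFF":
            groups.setdefault(name.rsplit("_tmr", 1)[0], []).append(name)
    voted = {frozenset(fanin)
             for _, (t, fanin) in netlist.items() if t == "MAJ3"}
    for base, copies in groups.items():
        if len(copies) != 3:
            errors.append(f"{base}: {len(copies)} copies, expected 3")
        elif frozenset(copies) not in voted:
            errors.append(f"{base}: triplicated but never voted")
    return errors

for e in check_tmr(netlist):
    print("TMR violation:", e)
```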

  16. Property-Based Monitoring of Analog and Mixed-Signal Systems

    NASA Astrophysics Data System (ADS)

    Havlicek, John; Little, Scott; Maler, Oded; Nickovic, Dejan

    In the recent past, there has been a steady growth of the market for consumer embedded devices such as cell phones, GPS and portable multimedia systems. In embedded systems, digital, analog and software components are combined on a single chip, resulting in increasingly complex designs that introduce richer functionality on smaller devices. As a consequence, the potential for inserting errors into a design becomes higher, yielding an increasing need for automated analog and mixed-signal validation tools. In the purely digital setting, formal verification based on properties expressed in industrial specification languages such as PSL and SVA is nowadays successfully integrated in the design flow. On the other hand, the validation of analog and mixed-signal systems still largely depends on simulation-based, ad-hoc methods. In this tutorial, we consider some ingredients of the standard verification methodology that can be successfully exported from the digital to the analog and mixed-signal setting, in particular property-based monitoring techniques. Property-based monitoring is a lighter approach to formal verification, where the system is seen as a "black box" that generates sets of traces whose correctness is checked against a property, that is, its high-level specification. Although incomplete, monitoring is effectively used to catch faults in systems, without guaranteeing their full correctness.
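    A minimal monitor in this spirit (the bounded-response property, thresholds, and trace below are illustrative, not drawn from PSL/SVA tooling): treat the system as a black box emitting a sampled signal and check "whenever v exceeds 1.0, it falls below 0.5 within 5 samples".

```python
# Offline property-based monitoring of a sampled (mixed-signal) trace.
def monitor(trace, high=1.0, low=0.5, deadline=5):
    """Flag samples where v > high but v never drops below low within the
    next `deadline` samples (windows truncated by the end of the trace are
    treated, conservatively, as violations)."""
    violations = []
    for i, v in enumerate(trace):
        if v > high:
            window = trace[i + 1 : i + 1 + deadline]
            if not any(w < low for w in window):
                violations.append(i)
    return violations

trace = [0.1, 0.2, 1.3, 0.9, 0.4, 0.2, 1.6, 1.5, 1.4, 1.2, 1.1, 1.0]
print("property violated at samples:", monitor(trace))   # [6, 7, 8, 9, 10]
```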

  17. A clocking discipline for two-phase digital integrated circuits

    NASA Astrophysics Data System (ADS)

    Noice, D. C.

    1983-09-01

    Sooner or later a designer of digital circuits must face the problem of timing verification in order to avoid errors caused by clock skew, critical races, and hazards. Unlike previous verification methods, such as timing simulation and timing analysis, the approach presented here guarantees correct operation despite uncertainty about delays in the circuit. The result is a clocking discipline that deals with timing abstractions only. It is not based on delay calculations; it is concerned only with correct, synchronous operation at some clock rate. Accordingly, it may be used earlier in the design cycle, which is particularly important for integrated circuit designs. The clocking discipline consists of a notation of clocking types and composition rules for using the types. Together, the notation and rules define a formal theory of two-phase clocking. The notation defines the names and exact characteristics of the different signals used in a two-phase digital system, and it makes it possible to develop rules for propagating the clocking types through particular circuits.
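    A rough Python reconstruction of the flavor of such a discipline (Noice's actual notation and rules are richer than this): signals carry a clocking type, combinational gates require all inputs to share one phase, and latches convert one phase to the other.

```python
# Propagating clocking types through a two-phase design and catching
# composition-rule violations.
VALID_TYPES = {"stable_phi1", "stable_phi2"}

def gate(*input_types):
    """Combinational logic: all inputs must share one clocking type,
    which propagates unchanged to the output."""
    types = set(input_types)
    if len(types) != 1 or not types <= VALID_TYPES:
        raise TypeError(f"clocking rule violated: mixed types {types}")
    return types.pop()

def latch(input_type, phase):
    """A latch clocked on `phase` accepts data stable in the opposite
    phase and emits data stable in `phase`."""
    expected = "stable_phi2" if phase == "phi1" else "stable_phi1"
    if input_type != expected:
        raise TypeError(f"latch on {phase} needs {expected}, got {input_type}")
    return f"stable_{phase}"

a = b = "stable_phi1"
c = gate(a, b)              # OK: both operands stable in phase 1
d = latch(c, "phi2")        # OK: phi2 latch accepts phi1-stable data
print("pipeline clocking types check out:", d)
try:
    gate(a, d)              # mixes phi1 and phi2 data in one gate
except TypeError as err:
    print("caught violation:", err)
```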

  18. An Educational Laboratory for Digital Control and Rapid Prototyping of Power Electronic Circuits

    ERIC Educational Resources Information Center

    Choi, Sanghun; Saeedifard, M.

    2012-01-01

    This paper describes a new educational power electronics laboratory that was developed primarily to reinforce experimentally the fundamental concepts presented in a power electronics course. The developed laboratory combines theoretical design, simulation studies, digital control, fabrication, and verification of power-electronic circuits based on…

  19. Resolution verification targets for airborne and spaceborne imaging systems at the Stennis Space Center

    NASA Astrophysics Data System (ADS)

    McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye

    1997-06-01

    The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.

  20. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., ``Verification, Validation, Reviews, and Audits for Digital Computer Software used in Safety Systems of Nuclear... NRC regulations promoting the development of, and compliance with, software verification and...

  1. Technical experiences of implementing a wireless tracking and facial biometric verification system for a clinical environment

    NASA Astrophysics Data System (ADS)

    Liu, Brent; Lee, Jasper; Documet, Jorge; Guo, Bing; King, Nelson; Huang, H. K.

    2006-03-01

    By implementing a tracking and verification system, clinical facilities can effectively monitor workflow and heighten information security amid today's growing demand for digital imaging informatics. This paper presents the technical design and implementation experiences encountered during the development of a Location Tracking and Verification System (LTVS) for a clinical environment. LTVS integrates facial biometrics with wireless tracking so that administrators can manage and monitor patients and staff through a web-based application. Implementation challenges fall into three main areas: 1) Development and Integration, 2) Calibration and Optimization of the Wi-Fi Tracking System, and 3) Clinical Implementation. An initial prototype LTVS has been implemented within USC's Healthcare Consultation Center II Outpatient Facility, which currently has a fully digital imaging department environment with integrated HIS/RIS/PACS/VR (Voice Recognition).

  2. Selecting a software development methodology. [of digital flight control systems

    NASA Technical Reports Server (NTRS)

    Jones, R. E.

    1981-01-01

    State-of-the-art analytical techniques for the development and verification of digital flight control software are studied, and a practical, designer-oriented development and verification methodology is produced. The effectiveness of the analytic techniques chosen for the development and verification methodology is assessed both technically and financially. Technical assessments analyze the error-preventing and error-detecting capabilities of the chosen technique in all of the pertinent software development phases. Financial assessments describe the cost impact of using the techniques, specifically the cost of implementing and applying them as well as the realizable cost savings. Both the technical and financial assessments are quantitative where possible. In the case of techniques which cannot be quantitatively assessed, qualitative judgments are expressed about the effectiveness and cost of the techniques. The reasons why quantitative assessments are not possible will be documented.

  3. Qualification of the flight-critical AFTI/F-16 digital flight control system. [Advanced Fighter Technology Integration

    NASA Technical Reports Server (NTRS)

    Mackall, D. A.; Ishmael, S. D.; Regenie, V. A.

    1983-01-01

    Qualification considerations for assuring the safety of a life-critical digital flight control system include four major areas: systems interactions, verification, validation, and configuration control. The AFTI/F-16 design, development, and qualification illustrate these considerations. In this paper, qualification concepts, procedures, and methodologies are discussed and illustrated through specific examples.

  4. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with the formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  5. Formal design and verification of a reliable computing platform for real-time control (phase 3 results)

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Divito, Ben L.; Holloway, C. Michael

    1994-01-01

    In this paper the design and formal verification of the lower levels of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, are presented. The RCP uses NMR-style redundancy to mask faults and internal majority voting to flush the effects of transient faults. Two new layers of the RCP hierarchy are introduced: the Minimal Voting refinement (DA_minv) of the Distributed Asynchronous (DA) model and the Local Executive (LE) Model. Both the DA_minv model and the LE model are specified formally and have been verified using the Ehdm verification system. All specifications and proofs are available electronically via the Internet using anonymous FTP or World Wide Web (WWW) access.

  6. Formal Techniques for Synchronized Fault-Tolerant Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present the formal verification of synchronizing aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization is based on an extended state machine model incorporating snapshots of the local processors' clocks.

  7. Electronic simulation of a barometric pressure sensor for the meteorological monitor assembly

    NASA Technical Reports Server (NTRS)

    Guiar, C. N.; Duff, L. W.

    1982-01-01

    An analysis of the electronic simulation of barometric pressure used to self-test the counter electronics of the digital barometer is presented. The barometer is part of the Meteorological Monitor Assembly that supports navigation in deep space communication. The theory of operation of the digital barometer, the design details, and the verification procedure used with the barometric pressure simulator are presented.

  8. Provable Transient Recovery for Frame-Based, Fault-Tolerant Computing Systems

    NASA Technical Reports Server (NTRS)

    DiVito, Ben L.; Butler, Ricky W.

    1992-01-01

    We present a formal verification of the transient fault recovery aspects of the Reliable Computing Platform (RCP), a fault-tolerant computing system architecture for digital flight control applications. The RCP uses NMR-style redundancy to mask faults and internal majority voting to purge the effects of transient faults. The system design has been formally specified and verified using the EHDM verification system. Our formalization accommodates a wide variety of voting schemes for purging the effects of transients.

  9. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    NASA Astrophysics Data System (ADS)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene is of vital importance. On one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving of digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions for this second problem. However, to our knowledge, no previous work has proposed a systematic model with a holistic view addressing all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as latest technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
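    The capture-then-verify workflow can be sketched with the standard library alone (PKIDEV proper uses public-key signatures and secure time-stamping; the HMAC below is a self-contained stand-in for the signature, and the key is a placeholder):

```python
# Integrity chain for digital evidence: digest at capture time, bind the
# digest and timestamp with a keyed tag, re-digest and check at verification
# time. An HMAC stands in for a real PKI signature here.
import hashlib, hmac, time

COLLECTOR_KEY = b"demo-key-not-a-real-credential"

def capture(evidence: bytes):
    """At the crime scene: digest the evidence and 'sign' digest + time."""
    digest = hashlib.sha256(evidence).hexdigest()
    stamp = str(int(time.time()))
    tag = hmac.new(COLLECTOR_KEY, f"{digest}|{stamp}".encode(),
                   "sha256").hexdigest()
    return {"digest": digest, "stamp": stamp, "tag": tag}

def verify(evidence: bytes, record) -> bool:
    """In the lab: recompute the digest and check the tag still binds it."""
    digest = hashlib.sha256(evidence).hexdigest()
    msg = f"{record['digest']}|{record['stamp']}".encode()
    ok_tag = hmac.compare_digest(
        record["tag"], hmac.new(COLLECTOR_KEY, msg, "sha256").hexdigest())
    return ok_tag and digest == record["digest"]

image = b"\x00disk image bytes\xff"
rec = capture(image)
print("untampered:", verify(image, rec))           # True
print("tampered:  ", verify(image + b"!", rec))    # False
```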

  10. Computer Program User’s Manual for FIREFINDER Digital Topographic Data Verification Library Dubbing System. Volume II. Dubbing.

    DTIC Science & Technology

    1982-01-29

    Computer Program User's Manual for FIREFINDER Digital Topographic Data Verification Library Dubbing System, Volume II, Dubbing, 29 January 1982. This manual describes the computer…

  11. Optical/digital identification/verification system based on digital watermarking technology

    NASA Astrophysics Data System (ADS)

    Herrigel, Alexander; Voloshynovskiy, Sviatoslav V.; Hrytskiv, Zenon D.

    2000-06-01

    This paper presents a new approach for the secure integrity verification of driver licenses, passports, or other analogue identification documents. The system embeds (detects) the reference number of the identification document with DCT watermark technology in (from) the photo of the identification document holder. During verification, the reference number is extracted and compared with the reference number printed in the identification document. The approach combines optical and digital image processing techniques. The detection system must be able to scan an analogue driver license or passport, convert the image of this document into a digital representation, and then apply the watermark verification algorithm to check the payload of the embedded watermark. If the payload of the watermark is identical to the printed visual reference number of the issuer, the verification is successful and the passport or driver license has not been modified. This approach constitutes a new class of application for watermark technology, which was originally targeted at the copyright protection of digital multimedia data. The presented approach substantially increases the security of the analogue identification documents used in many European countries.

  12. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.
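    The second approach, state machine analysis, is the easiest to show in miniature: enumerate every reachable state of a small circuit model by breadth-first search and check a safety property in each. The 2-bit counter below is an illustrative stand-in, not an example from the paper.

```python
# Explicit-state reachability analysis of a tiny finite state machine.
from collections import deque

def step(state, inp):
    """Next-state function of a 2-bit saturating counter with clear."""
    return 0 if inp == "clr" else min(state + 1, 3)

def reachable(initial=0, inputs=("inc", "clr")):
    """Breadth-first enumeration of every state the machine can reach."""
    seen, frontier = {initial}, deque([initial])
    while frontier:
        s = frontier.popleft()
        for i in inputs:
            t = step(s, i)
            if t not in seen:
                seen.add(t)
                frontier.append(t)
    return seen

states = reachable()
assert all(0 <= s <= 3 for s in states), "safety property violated"
print("reachable states:", sorted(states), "- safety property holds in all")
```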

  13. Digital autopilots: Design considerations and simulator evaluations

    NASA Technical Reports Server (NTRS)

    Osder, S.; Neuman, F.; Foster, J.

    1971-01-01

    The development of a digital autopilot program for a transport aircraft and the evaluation of that system's performance on a transport aircraft simulator are discussed. The digital autopilot includes three-axis attitude stabilization, automatic throttle control, and flight path guidance functions, with emphasis on the mode progression from descent into the terminal area through automatic landing. The study involved a sequence of tasks, starting with the definition of detailed system block diagrams of the control laws, followed by a flow-charting and programming phase, and concluding with performance verification using the transport aircraft simulation. The autopilot control laws were programmed in FORTRAN 4 in order to isolate the design process from requirements peculiar to an individual computer.

  14. NASA/BLM APT, phase 2. Volume 2: Technology demonstration. [Arizona

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Techniques described include: (1) steps in the preprocessing of LANDSAT data; (2) the training of a classifier; (3) maximum likelihood classification and precision; (4) geometric correction; (5) class description; (6) digitizing; (7) digital terrain data; (8) an overview of sample design; (9) allocation and selection of primary sample units; (10) interpretation of secondary sample units; (11) data collection from ground plots; (12) data reduction; (13) analysis for productivity estimation and map verification; (14) cost analysis; and (15) LANDSAT digital products. The evaluation of the pre-inventory planning for P.J. is included.

  15. Space shuttle main engine controller assembly, phase C-D. [with lagging system design and analysis

    NASA Technical Reports Server (NTRS)

    1973-01-01

    System design and system analysis and simulation are slightly behind schedule, while design verification testing has improved. Input/output circuit design has improved, but digital computer unit (DCU) and mechanical design continue to lag. Part procurement was impacted by delays in printed circuit board and assembly drawing releases. These delays are the result of problems in generating suitable printed circuit artwork for the very complex, high-density multilayer boards.

  16. The Effects of Sentence Imitation and Picture Verification on the Recall of Subsequent Digits. Lektos: Interdisciplinary Working Papers in Language Sciences, Vol. 3, No. 1.

    ERIC Educational Resources Information Center

    Scholes, Robert J.; And Others

    The effects of sentence imitation and picture verification on the recall of subsequent digits were studied. Stimuli consisted of 20 sentences, each sentence followed by a string of five digit names, and five structural types of sentences were presented. Subjects were instructed to listen to the sentence and digit string and then either immediately…

  17. Digital system upset. The effects of simulated lightning-induced transients on a general-purpose microprocessor

    NASA Technical Reports Server (NTRS)

    Belcastro, C. M.

    1983-01-01

    Flight-critical, computer-based control systems designed for advanced aircraft must exhibit ultrareliable performance in lightning-charged environments. Digital system upset can occur as a result of lightning-induced electrical transients, and a methodology was developed to test specific digital systems for upset susceptibility. Initial upset data indicate that there are several distinct upset modes and that the occurrence of upset is related to the relative synchronization of the transient input with the processing state of the digital system. A large upset test data base will aid in the formulation and verification of analytical upset reliability modeling techniques which are being developed.
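    The synchronization effect can be illustrated with a toy fault-injection loop (the NASA methodology instrumented real hardware; this model and its outcome classification are assumptions): a transient landing in a register that is rewritten on the next cycle is masked, while one landing on the final cycle corrupts the output.

```python
# Transient injection into a toy processor model: the upset outcome depends
# on where the flip lands relative to the processing cycle.
import random

def run_program(flip_at=None, steps=8):
    """Toy loop whose working register is rewritten every cycle."""
    if flip_at is None:
        return "no transient injected, result correct"
    acc = 0
    for step in range(steps):
        acc = (step * 3) & 0xFF              # register reloaded each cycle
        if step == flip_at:
            acc ^= 1 << random.randrange(8)  # lightning-induced bit flip
    golden = ((steps - 1) * 3) & 0xFF
    return ("upset: corrupted output" if acc != golden
            else "no upset: flip overwritten on the next cycle")

random.seed(2)
for t in (0, 4, 7):
    print(f"transient at cycle {t}: {run_program(flip_at=t)}")
```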

  18. Partially filled electrodes for digital microfluidic devices

    NASA Astrophysics Data System (ADS)

    Pyne, D. G.; Salman, W. M.; Abdelgawad, M.; Sun, Y.

    2013-07-01

    As digital microfluidics technology evolves, the need to integrate additional elements (e.g., sensing/detection and heating elements) on the electrode increases. Consequently, the electrode area available for droplet actuation is reduced to create space for these additional elements, which undesirably reduces force generation. Electrodes cannot simply be scaled larger to compensate for this loss of force, as this would also increase droplet volume and thereby compromise the advantages sought through miniaturization. Here, we present a study that evaluates, numerically and with preliminary experimental verification, different partially filled electrode designs, and we suggest designs that combine high actuation force with a large reduction in electrode area.

  19. Verification Failures: What to Do When Things Go Wrong

    NASA Astrophysics Data System (ADS)

    Bertacco, Valeria

    Every integrated circuit is released with latent bugs. The damage and risk implied by an escaped bug range from almost imperceptible to potentially tragic; unfortunately, it is impossible to discern where in this range a bug falls before it has been exposed and analyzed. While the past few decades have witnessed significant efforts to improve verification methodology for hardware systems, these efforts have been far outstripped by the massive complexity of modern digital designs, leading to product releases in which an ever-smaller fraction of the system's states has been verified. The news of escaped bugs in large-market designs and/or safety-critical domains is alarming because of the safety and cost implications (due to replacements, lawsuits, etc.).

  20. 37 CFR 382.7 - Verification of royalty payments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... payments. 382.7 Section 382.7 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.7 Verification of royalty...

  1. 37 CFR 382.6 - Verification of royalty payments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... payments. 382.6 Section 382.6 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.6 Verification of royalty...

  2. 37 CFR 382.5 - Verification of statements of account.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... account. 382.5 Section 382.5 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.5 Verification of statements...

  3. 37 CFR 382.7 - Verification of royalty payments.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... payments. 382.7 Section 382.7 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.7 Verification of royalty...

  4. 37 CFR 382.6 - Verification of statements of account.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... account. 382.6 Section 382.6 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.6 Verification of statements...

  5. 37 CFR 382.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... payments. 382.6 Section 382.6 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.6 Verification of royalty...

  6. 37 CFR 382.5 - Verification of statements of account.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... account. 382.5 Section 382.5 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.5 Verification of statements...

  7. 37 CFR 382.5 - Verification of statements of account.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... account. 382.5 Section 382.5 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.5 Verification of statements...

  8. 37 CFR 382.6 - Verification of statements of account.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... account. 382.6 Section 382.6 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.6 Verification of statements...

  9. 37 CFR 382.6 - Verification of royalty payments.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... payments. 382.6 Section 382.6 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Subscription Services § 382.6 Verification of royalty...

  10. Computer Program User’s Manual for FIREFINDER Digital Topographic Data Verification Library Dubbing System,

    DTIC Science & Technology

    1981-11-30

    Computer Program User's Manual for FIREFINDER Digital Topographic Data Verification Library Dubbing System, 30 November 1981, by Marie Ceres and Leslie R. Heselton, III. This manual describes the computer programs for the FIREFINDER Digital Topographic Data Verification-Library-Dubbing System (FFDTDVLDS)…

  11. Formal specification and verification of a fault-masking and transient-recovery model for digital flight-control systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1991-01-01

    The formal specification and mechanically checked verification for a model of fault-masking and transient-recovery among the replicated computers of digital flight-control systems are presented. The verification establishes, subject to certain carefully stated assumptions, that faults among the component computers are masked so that commands sent to the actuators are the same as those that would be sent by a single computer that suffers no failures.

  12. Integrated Short Range, Low Bandwidth, Wearable Communications Networking Technologies

    DTIC Science & Technology

    2012-04-30

    Table of contents fragment: Introduction; Research Discussions; Specifications; SAN Radio; R.F. Design Improvements; LNA Characterization and Verification Testing; Digital Design Improvements; Improve Processor Access to Memory Resources. …integrated and tested. A hybrid architecture of the automatic gain control (AGC) was designed to…

  13. An engineering methodology for implementing and testing VLSI (Very Large Scale Integrated) circuits

    NASA Astrophysics Data System (ADS)

    Corliss, Walter F., II

    1989-03-01

    The engineering methodology for producing a fully tested VLSI chip from a design layout is presented. A 16-bit correlator, NPS CORN88, that was previously designed was used as a vehicle to demonstrate this methodology. The study of the design and simulation tools, MAGIC and MOSSIM II, was the focus of the design and validation process. The design was then implemented, and the chip was fabricated by MOSIS. This fabricated chip was then used to develop a testing methodology for using the digital test facilities at NPS. NPS CORN88 was the first full-custom VLSI chip designed at NPS to be tested with the NPS digital analysis system, a Tektronix DAS 9100 series tester. The capabilities and limitations of these test facilities are examined. NPS CORN88 test results are included to demonstrate the capabilities of the digital test system. A translator, MOS2DAS, was developed to convert the MOSSIM II simulation program to the input files required by the DAS 9100 device verification software, 91DVS. Finally, a tutorial for using the digital test facilities, including the DAS 9100 and associated support equipment, is included as an appendix.

  14. A REVIEW OF HUMAN-SYSTEM INTERFACE DESIGN ISSUES OBSERVED DURING ANALOG-TO-DIGITAL AND DIGITAL-TO-DIGITAL MIGRATIONS IN U.S. NUCLEAR POWER PLANTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, C.; Joe, J.

    The United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is developing a scientific basis through targeted research and development (R&D) to support the U.S. nuclear power plant (NPP) fleet in extending their existing licensing period and ensuring their long-term reliability, productivity, safety, and security. Over the last several years, human factors engineering (HFE) professionals at the Idaho National Laboratory (INL) have supported the LWRS Advanced Instrumentation, Information, and Control (II&C) System Technologies pathway across several U.S. commercial NPPs in analog-to-digital migrations (i.e., turbine control systems) and digital-to-digital migrations (i.e., Safety Parameter Display System). These efforts have included in-depth human factors evaluation of proposed human-system interface (HSI) design concepts against established U.S. Nuclear Regulatory Commission (NRC) design guidelines from NUREG-0700, Rev. 2 to inform subsequent HSI design prior to transitioning into Verification and Validation. This paper discusses some of the overarching design issues observed from these past HFE evaluations. In addition, this work presents some observed challenges, such as common tradeoffs utilities are likely to face when introducing new HSI technologies into NPP hybrid control rooms. The primary purpose of this work is to distill these observed design issues into general HSI design guidance that industry can use in early stages of HSI design.

  15. Digital data storage systems, computers, and data verification methods

    DOEpatents

    Groeneveld, Bennett J.; Austad, Wayne E.; Walsh, Stuart C.; Herring, Catherine A.

    2005-12-27

    Digital data storage systems, computers, and data verification methods are provided. According to a first aspect of the invention, a computer includes an interface adapted to couple with a dynamic database; and processing circuitry configured to provide a first hash from digital data stored within a portion of the dynamic database at an initial moment in time, to provide a second hash from digital data stored within the portion of the dynamic database at a subsequent moment in time, and to compare the first hash and the second hash.
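    The claimed mechanism is straightforward to sketch (simplified: the patent's dynamic database is modeled as a plain dict, and the hashing choices are illustrative):

```python
# Hash a portion of a database at an initial time, hash the same portion at
# a subsequent time, and compare the two hashes to detect modification.
import hashlib, json

def portion_hash(db: dict, keys) -> str:
    """Deterministic digest of a selected portion of the database."""
    view = {k: db[k] for k in sorted(keys)}
    return hashlib.sha256(json.dumps(view, sort_keys=True).encode()).hexdigest()

db = {"rec1": "alpha", "rec2": "bravo", "rec3": "charlie"}
watched = ("rec1", "rec2")

first = portion_hash(db, watched)    # initial moment in time
db["rec3"] = "delta"                 # change outside the watched portion
second = portion_hash(db, watched)   # subsequent moment in time
print("watched portion intact:", first == second)                    # True

db["rec1"] = "tampered"
print("after modification:   ", first == portion_hash(db, watched))  # False
```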

  16. An integrated user-oriented laboratory for verification of digital flight control systems: Features and capabilities

    NASA Technical Reports Server (NTRS)

    Defeo, P.; Doane, D.; Saito, J.

    1982-01-01

    A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.

  17. 37 CFR 201.29 - Access to, and confidentiality of, Statements of Account, Verification Auditor's Reports, and...

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... filed in the Copyright Office for digital audio recording devices or media. 201.29 Section 201.29 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT OFFICE AND PROCEDURES...'s Reports, and other verification information filed in the Copyright Office for digital audio...

  18. 37 CFR 201.29 - Access to, and confidentiality of, Statements of Account, Verification Auditor's Reports, and...

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... filed in the Copyright Office for digital audio recording devices or media. 201.29 Section 201.29 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT OFFICE AND PROCEDURES...'s Reports, and other verification information filed in the Copyright Office for digital audio...

  19. 37 CFR 201.29 - Access to, and confidentiality of, Statements of Account, Verification Auditor's Reports, and...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... filed in the Copyright Office for digital audio recording devices or media. 201.29 Section 201.29 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT OFFICE AND PROCEDURES...'s Reports, and other verification information filed in the Copyright Office for digital audio...

  20. 37 CFR 201.29 - Access to, and confidentiality of, Statements of Account, Verification Auditor's Reports, and...

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... filed in the Copyright Office for digital audio recording devices or media. 201.29 Section 201.29 Patents, Trademarks, and Copyrights U.S. COPYRIGHT OFFICE, LIBRARY OF CONGRESS COPYRIGHT OFFICE AND... Auditor's Reports, and other verification information filed in the Copyright Office for digital audio...

  1. DRS: Derivational Reasoning System

    NASA Technical Reports Server (NTRS)

    Bose, Bhaskar

    1995-01-01

    The high reliability requirements for airborne systems require fault-tolerant architectures to address failures in the presence of physical faults, and the elimination of design flaws during the specification and validation phase of the design cycle. Although much progress has been made in developing methods to address physical faults, design flaws remain a serious problem. Formal methods provide a mathematical basis for removing design flaws from digital systems. DRS (Derivational Reasoning System) is a formal design tool based on advanced research in mathematical modeling and formal synthesis. The system implements a basic design algebra for synthesizing digital circuit descriptions from high-level functional specifications. DRS incorporates an executable specification language, a set of correctness-preserving transformations, a verification interface, and a logic synthesis interface, making it a powerful tool for realizing hardware from abstract specifications. DRS integrates recent advances in transformational reasoning, automated theorem proving, and high-level CAD synthesis systems in order to provide enhanced reliability in designs with reduced time and cost.

  2. Design of lightning protection for a full-authority digital engine control

    NASA Technical Reports Server (NTRS)

    Dargi, M.; Rupke, E.; Wiles, K.

    1991-01-01

    The steps and procedures are described which are necessary to achieve a successful lightning-protection design for a state-of-the-art Full-Authority Digital Engine Control (FADEC) system. The engine and control systems used as examples are fictional, but the design and verification methods are real. Topics discussed include: applicable airworthiness regulations; selection of equipment transient design and control levels for the engine/airframe and intra-engine segments of the system; the use of cable shields, terminal-protection devices, and filter circuits in hardware protection design; and software approaches to minimize upset potential. Shield terminations, grounding, and bonding are also discussed, as are the important elements of certification and test plans and the role of tests and analyses. Also included are examples of multiple-stroke and multiple-burst testing. A review of design pitfalls and challenges and the status of applicable test standards, such as RTCA DO-160, Section 22, are presented.

  3. Synesthesia affects verification of simple arithmetic equations.

    PubMed

    Ghirardelli, Thomas G; Mills, Carol Bergfeld; Zilioli, Monica K C; Bailey, Leah P; Kretschmar, Paige K

    2010-01-01

    To investigate the effects of color-digit synesthesia on numerical representation, we presented a synesthete (referred to here as SE) and controls with mathematical equations for verification. In Experiment 1, SE verified addition equations made up of digits that either matched or mismatched her color-digit photisms or were in black. In Experiment 2A, the addends were presented in the different color conditions and the solution was presented in black, whereas in Experiment 2B the addends were presented in black and the solutions were presented in the different color conditions. In Experiment 3, multiplication and division equations were presented in the same color conditions as in Experiment 1. SE responded significantly faster to equations that matched her photisms than to those that did not; controls did not show this effect. These results suggest that photisms influence the processing of digits in arithmetic verification, replicating and extending previous findings.

  4. Formal methods and digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1993-01-01

    This report has been prepared to supplement a forthcoming chapter on formal methods in the FAA Digital Systems Validation Handbook. Its purpose is as follows: to outline the technical basis for formal methods in computer science; to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used on board aircraft; and to suggest factors for consideration when formal methods are offered in support of certification. These latter factors assume the context for software development and assurance described in RTCA document DO-178B, 'Software Considerations in Airborne Systems and Equipment Certification,' Dec. 1992.

  5. Formal verification of an avionics microprocessor

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam, K.; Miller, Steven P.

    1995-01-01

    Formal specification combined with mechanical verification is a promising approach for achieving the extremely high levels of assurance required of safety-critical digital systems. However, many questions remain regarding its use in practice: Can these techniques scale up to industrial systems, where are they likely to be useful, and how should industry go about incorporating them into practice? This report discusses a project undertaken to answer some of these questions: the formal verification of the AAMP5 microprocessor. This project consisted of formally specifying, in the PVS language, a Rockwell proprietary microprocessor at both the instruction-set and register-transfer levels, and of using the PVS theorem prover to show that the microcode correctly implemented the instruction-level specification for a representative subset of instructions. Notable aspects of this project include the use of a formal specification language by practicing hardware and software engineers, the integration of traditional inspections with formal specifications, and the use of a mechanical theorem prover to verify a portion of a commercial, pipelined microprocessor that was not explicitly designed for formal verification.

  6. Investigation of an advanced fault tolerant integrated avionics system

    NASA Technical Reports Server (NTRS)

    Dunn, W. R.; Cottrell, D.; Flanders, J.; Javornik, A.; Rusovick, M.

    1986-01-01

    Presented is an advanced, fault-tolerant multiprocessor avionics architecture as could be employed in an advanced rotorcraft such as the LHX. The processor structure is designed to interface with existing digital avionics systems and concepts, including the Army Digital Avionics System (ADAS) cockpit/display system, navaid and communications suites, an integrated sensing suite, and the Advanced Digital Optical Control System (ADOCS). The report defines mission, maintenance, and safety-of-flight reliability goals as might be expected for an operational LHX aircraft. Based on the use of a modular, compact (16-bit) microprocessor card family, results of a preliminary study examining simplex, dual, and standby-sparing architectures are presented. Given the stated constraints, it is shown that the dual architecture is best suited to meet reliability goals with minimum hardware and software overhead. The report presents hardware and software design considerations for realizing the architecture, including redundancy management requirements and techniques as well as verification and validation needs and methods.

  7. Design of the front end electronics for the infrared camera of JEM-EUSO, and manufacturing and verification of the prototype model

    NASA Astrophysics Data System (ADS)

    Maroto, Oscar; Diez-Merino, Laura; Carbonell, Jordi; Tomàs, Albert; Reyes, Marcos; Joven-Alvarez, Enrique; Martín, Yolanda; Morales de los Ríos, J. A.; del Peral, Luis; Rodríguez-Frías, M. D.

    2014-07-01

    The Japanese Experiment Module (JEM) Extreme Universe Space Observatory (EUSO) will be launched and attached to the Japanese module of the International Space Station (ISS). Its aim is to observe UV photon tracks produced by ultra-high energy cosmic rays developing in the atmosphere and producing extensive air showers. The key element of the instrument is a very wide-field, very fast, large-lens telescope that can detect extreme energy particles with energy above 10^19 eV. The Atmospheric Monitoring System (AMS), comprising, among others, the Infrared Camera (IRCAM), which is the Spanish contribution, plays a fundamental role in the understanding of the atmospheric conditions in the Field of View (FoV) of the telescope. It is used to detect the temperature of clouds and to obtain the cloud coverage and cloud top altitude during the observation period of the JEM-EUSO main instrument. SENER is responsible for the preliminary design of the Front End Electronics (FEE) of the Infrared Camera, based on an uncooled microbolometer, and for the manufacturing and verification of the prototype model. This paper describes the flight design drivers and key factors to achieve the target features, namely: detector biasing with electrical noise better than 100 μV from 1 Hz to 10 MHz; temperature control of the microbolometer, from 10 °C to 40 °C, with stability better than 10 mK over 4.8 hours; low-noise, high-bandwidth amplifier adaptation of the microbolometer output to a differential input before analog-to-digital conversion; housekeeping generation; microbolometer control; and image accumulation for noise reduction. It also shows the modifications implemented in the FEE prototype design to perform a trade-off of different technologies, such as the convenience of using linear or switched regulation for the temperature control, the possibility to check the camera performance when both the microbolometer and the analog electronics are moved further away from the power and digital electronics, and the addition of switching regulators to demonstrate that the design is immune to the electrical noise the switching converters introduce. Finally, the results obtained during the verification phase are presented: FEE limitations, verification results (including FEE noise for each channel, its equivalent NETD, and the microbolometer temperature stability achieved), technology trade-offs, lessons learnt, and design improvements to implement in future project phases.
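
    As a rough, purely illustrative feel for the temperature-stability requirement, the sketch below closes a PI loop around a first-order thermal plant and reports the residual ripple; every parameter (time constant, gains, noise level) is an invented assumption, not a value from the FEE design.

      # Toy PI temperature loop for a microbolometer-like thermal mass.
      import random

      DT = 0.1            # control period, s (assumed)
      TAU = 30.0          # assumed plant thermal time constant, s
      GAIN = 0.5          # assumed heater gain, K per unit drive
      SETPOINT = 25.0     # regulated temperature, deg C
      KP, KI = 4.0, 0.8   # assumed PI gains

      temp, integ, history = 20.0, 0.0, []
      for step in range(20000):
          error = SETPOINT - temp
          integ += error * DT
          drive = KP * error + KI * integ
          disturbance = random.gauss(0.0, 0.0005)   # assumed thermal noise, K
          # First-order plant: relaxes toward 20 C ambient, driven by heater.
          temp += DT * ((20.0 - temp) / TAU + GAIN * drive / TAU) + disturbance
          if step > 10000:                          # keep settled samples only
              history.append(temp)

      ripple = max(history) - min(history)
      print(f"steady-state ripple: {ripple * 1000:.2f} mK about {SETPOINT} C")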

  8. Use of maxillofacial laboratory materials to construct a tissue-equivalent head phantom with removable titanium implantable devices for use in verification of the dose of intensity-modulated radiotherapy.

    PubMed

    Morris, K

    2017-06-01

    The dose of radiotherapy is often verified by measuring the dose of radiation at specific points within a phantom. The presence of high-density implant materials such as titanium, however, may cause complications both during calculation and delivery of the dose. Numerous studies have reported photon/electron backscatter and alteration of the dose by high-density implants, but we know of no evidence of a dosimetry phantom that incorporates high density implants or fixtures. The aim of the study was to design and manufacture a tissue-equivalent head phantom for use in verification of the dose in radiotherapy using a combination of traditional laboratory materials and techniques and 3-dimensional technology that can incorporate titanium maxillofacial devices. Digital designs were used together with Mimics® 18.0 (Materialise NV) and FreeForm® software. DICOM data were downloaded and manipulated into the final pieces of the phantom mould. Three-dimensional digital objects were converted into STL files and exported for additional stereolithography. Phantoms were constructed in four stages: material testing and selection, design of a 3-dimensional mould, manufacture of implants, and final fabrication of the phantom using traditional laboratory techniques. Three tissue-equivalent materials were found and used to successfully manufacture a suitable phantom with interchangeable sections that contained three versions of titanium maxillofacial implants. Maxillofacial and other materials can be used to successfully construct a head phantom with interchangeable titanium implant sections for use in verification of doses of radiotherapy.

  9. Multi-Mission System Architecture Platform: Design and Verification of the Remote Engineering Unit

    NASA Technical Reports Server (NTRS)

    Sartori, John

    2005-01-01

    The Multi-Mission System Architecture Platform (MSAP) represents an effort to bolster efficiency in the spacecraft design process. By incorporating essential spacecraft functionality into a modular, expandable system, the MSAP provides a foundation on which future spacecraft missions can be developed. Once completed, the MSAP will provide support for missions with varying objectives, while maintaining a level of standardization that will minimize redesign of general system components. One subsystem of the MSAP, the Remote Engineering Unit (REU), functions by gathering engineering telemetry from strategic points on the spacecraft and providing these measurements to the spacecraft's Command and Data Handling (C&DH) subsystem. Before the MSAP Project reaches completion, all hardware, including the REU, must be verified. However, the speed and complexity of the REU circuitry rules out the possibility of physical prototyping. Instead, the MSAP hardware is designed and verified using the Verilog Hardware Description Language (HDL). An increasingly popular means of digital design, HDL programming provides a level of abstraction that allows the designer to focus on functionality while logic synthesis tools take care of gate-level design and optimization. As verification of the REU proceeds, errors are quickly remedied, preventing costly changes during hardware validation. After undergoing the careful, iterative processes of verification and validation, the REU and MSAP will prove their readiness for use in a multitude of spacecraft missions.

  10. Digital video system for on-line portal verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott

    1990-07-01

    A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress noise. Some elements of automated radiotherapy treatment verification have been introduced.
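
    Two typical elements of such on-line processing are noise suppression and contrast enhancement. Below is a minimal sketch using generic operations (3x3 median filtering followed by a percentile window/level stretch); the system's actual algorithms are not specified here.

      import numpy as np

      def median3x3(img):
          """3x3 median filter built from stacked shifts (edge padding)."""
          padded = np.pad(img, 1, mode="edge")
          shifts = [padded[i:i + img.shape[0], j:j + img.shape[1]]
                    for i in range(3) for j in range(3)]
          return np.median(np.stack(shifts), axis=0)

      def contrast_stretch(img, lo_pct=2, hi_pct=98):
          """Linear window/level stretch between the given percentiles."""
          lo, hi = np.percentile(img, [lo_pct, hi_pct])
          return np.clip((img - lo) / max(hi - lo, 1e-9), 0.0, 1.0)

      # Synthetic low-contrast "portal image": a field edge plus noise.
      rng = np.random.default_rng(0)
      truth = np.zeros((64, 64))
      truth[16:48, 16:48] = 0.3
      noisy = truth + rng.normal(0.0, 0.05, truth.shape)

      enhanced = contrast_stretch(median3x3(noisy))
      print("output range:", enhanced.min(), enhanced.max())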

  11. Clinical evaluation of a mobile digital specimen radiography system for intraoperative specimen verification.

    PubMed

    Wang, Yingbing; Ebuoma, Lilian; Saksena, Mansi; Liu, Bob; Specht, Michelle; Rafferty, Elizabeth

    2014-08-01

    Use of mobile digital specimen radiography systems expedites intraoperative verification of excised breast specimens. The purpose of this study was to evaluate the performance of such a system for verifying targets. A retrospective review included 100 consecutive pairs of breast specimen radiographs. Specimens were imaged in the operating room with a mobile digital specimen radiography system and then with a conventional digital mammography system in the radiology department. Two expert reviewers independently scored each image for image quality on a 3-point scale and confidence in target visualization on a 5-point scale. A target was considered confidently verified only if both reviewers declared the target to be confidently detected. The 100 specimens contained a total of 174 targets, including 85 clips (49%), 53 calcifications (30%), 35 masses (20%), and one architectural distortion (1%). Although a significantly higher percentage of mobile digital specimen radiographs were considered poor quality by at least one reviewer (25%) compared with conventional digital mammograms (1%), 169 targets (97%) were confidently verified with mobile specimen radiography; 172 targets (98%) were verified with conventional digital mammography. Three faint masses were not confidently verified with mobile specimen radiography, and conventional digital mammography was needed for confirmation. One faint mass and one architectural distortion were not confidently verified with either method. Mobile digital specimen radiography allows high diagnostic confidence for verification of target excision in breast specimens across target types, despite lower image quality. Substituting this modality for conventional digital mammography can eliminate delays associated with specimen transport, potentially decreasing surgical duration and increasing operating room throughput.

  12. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... time of verification request; (6) The name of a contact person at the seller's company, including... digital image of the prescription), that was presented to the seller by the patient or prescriber. (2) For...

  13. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... time of verification request; (6) The name of a contact person at the seller's company, including... digital image of the prescription), that was presented to the seller by the patient or prescriber. (2) For...

  14. Two high accuracy digital integrators for Rogowski current transducers.

    PubMed

    Luo, Pan-dian; Li, Hong-bin; Li, Zhen-hua

    2014-01-01

    Rogowski current transducers have been widely used in AC current measurement, but their accuracy is mainly limited by their analog integrators, which suffer from poor long-term stability and susceptibility to environmental conditions. Digital integrators are an alternative, but a conventional digital integrator cannot deliver a stable, accurate output because any DC component in the original signal accumulates and makes the output drift; unknown initial conditions likewise produce a DC offset at the integrator output. This paper proposes two improved digital integrators for use in Rogowski current transducers, in place of traditional analog integrators, for high measuring accuracy. A proportional-integral-derivative (PID) feedback controller and an attenuation coefficient are applied to improve the Al-Alaoui integrator, changing its DC response to obtain an ideal frequency response. Owing to this dedicated digital signal processing design, the improved digital integrators outperform analog integrators. Simulation models are built for verification and comparison. The experiments prove that the designed integrators achieve higher accuracy than analog integrators in steady-state response, transient-state response, and under changing temperature conditions.
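
    The drift mechanism and the attenuation-coefficient remedy are easy to demonstrate. The Al-Alaoui integrator is H(z) = (T/8)(7 + z^-1)/(1 - z^-1); replacing the pole at z = 1 with an attenuation coefficient alpha < 1 caps the DC gain at T/(1 - alpha), so an input offset no longer accumulates without bound. The sketch below uses assumed values for alpha and the signal, and does not reproduce the paper's PID feedback correction.

      import math

      T = 1e-4       # assumed sampling period, s (10 kHz rate)
      ALPHA = 0.999  # assumed attenuation coefficient (pole inside |z| = 1)

      def alalaoui(x, alpha):
          """H(z) = (T/8)(7 + z^-1)/(1 - alpha z^-1), i.e.
          y[n] = alpha*y[n-1] + (T/8)*(7*x[n] + x[n-1])."""
          y, y_prev, x_prev = [], 0.0, 0.0
          for xn in x:
              y_prev = alpha * y_prev + (T / 8.0) * (7.0 * xn + x_prev)
              x_prev = xn
              y.append(y_prev)
          return y

      # A Rogowski coil outputs di/dt; integrating should recover a 50 Hz,
      # unit-amplitude current. A small offset (0.01) models sensor bias.
      w = 2 * math.pi * 50
      x = [w * math.cos(w * k * T) + 0.01 for k in range(200000)]  # 20 s

      drifting = alalaoui(x, alpha=1.0)   # ideal integrator: bias accumulates
      stable = alalaoui(x, alpha=ALPHA)   # leaky: bias gain = T/(1 - alpha)
      print(f"ideal integrator after 20 s: {drifting[-1]:+.3f} (0.2 of drift)")
      print(f"leaky integrator after 20 s: {stable[-1]:+.3f} (bounded)")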

  15. Two high accuracy digital integrators for Rogowski current transducers

    NASA Astrophysics Data System (ADS)

    Luo, Pan-dian; Li, Hong-bin; Li, Zhen-hua

    2014-01-01

    Rogowski current transducers have been widely used in AC current measurement, but their accuracy is mainly limited by their analog integrators, which suffer from poor long-term stability and susceptibility to environmental conditions. Digital integrators are an alternative, but a conventional digital integrator cannot deliver a stable, accurate output because any DC component in the original signal accumulates and makes the output drift; unknown initial conditions likewise produce a DC offset at the integrator output. This paper proposes two improved digital integrators for use in Rogowski current transducers, in place of traditional analog integrators, for high measuring accuracy. A proportional-integral-derivative (PID) feedback controller and an attenuation coefficient are applied to improve the Al-Alaoui integrator, changing its DC response to obtain an ideal frequency response. Owing to this dedicated digital signal processing design, the improved digital integrators outperform analog integrators. Simulation models are built for verification and comparison. The experiments prove that the designed integrators achieve higher accuracy than analog integrators in steady-state response, transient-state response, and under changing temperature conditions.

  16. Signature Verification Using N-tuple Learning Machine.

    PubMed

    Maneechot, Thanin; Kitjaidure, Yuttana

    2005-01-01

    This research presents a new algorithm for signature verification using an N-tuple learning machine. The features are taken from handwritten signatures captured on a digital tablet (on-line). The recognition algorithm uses four extracted features, namely horizontal and vertical pen-tip position (x-y position), pen-tip pressure, and pen altitude angle. Verification uses the N-tuple technique with Gaussian thresholding.
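
    A minimal sketch of the N-tuple idea follows: binarized feature vectors are indexed by randomly chosen tuples of bit positions, and a test signature is scored by how many of its tuple patterns were seen during training. The tuple sizes and toy data below are illustrative assumptions, and the Gaussian-thresholding step is not reproduced.

      import random

      random.seed(1)
      N_TUPLES, TUPLE_SIZE, BITS = 50, 8, 256
      TUPLES = [random.sample(range(BITS), TUPLE_SIZE) for _ in range(N_TUPLES)]

      def pattern(bits, tup):
          return tuple(bits[i] for i in tup)

      def train(genuine_signatures):
          """Per tuple, remember every bit pattern seen in genuine samples."""
          memory = [set() for _ in TUPLES]
          for bits in genuine_signatures:
              for mem, tup in zip(memory, TUPLES):
                  mem.add(pattern(bits, tup))
          return memory

      def score(memory, bits):
          """Fraction of tuples whose pattern was seen during training."""
          hits = sum(pattern(bits, t) in m for m, t in zip(memory, TUPLES))
          return hits / N_TUPLES

      # Toy data: a signer's binarized feature vector plus bit noise.
      template = [random.randint(0, 1) for _ in range(BITS)]
      noisy = lambda p: [b ^ (random.random() < p) for b in template]

      memory = train([noisy(0.05) for _ in range(10)])
      print("genuine score:", score(memory, noisy(0.05)))  # high: accept
      print("forgery score:", score(memory, noisy(0.50)))  # low: reject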

  17. The BOEING 777 - concurrent engineering and digital pre-assembly

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Abarbanel, B.

    The processes created on the 777 for checking designs were called "digital pre-assembly". Using FlyThru(tm), a spin-off of a Boeing advanced computing research project, engineers were able to view up to 1500 models (15000 solids) in 3D, traversing that data at high speed. FlyThru(tm) was rapidly deployed in 1991 to meet the needs of the 777 for large-scale product visualization and verification. The digital pre-assembly process has had fantastic results. The 777 has had far fewer assembly and systems problems compared to previous airplane programs. Today, FlyThru(tm) is installed on hundreds of workstations on almost every airplane program, and is being used on Space Station, F22, AWACS, and other defense projects. Its applications have gone far beyond design review. In many ways, FlyThru is a data warehouse supported by advanced tools for analysis. It is today being integrated with knowledge-based engineering geometry generation tools.

  18. A real-time simulator of a turbofan engine

    NASA Technical Reports Server (NTRS)

    Litt, Jonathan S.; Delaat, John C.; Merrill, Walter C.

    1989-01-01

    A real-time digital simulator of a Pratt and Whitney F100 engine has been developed for real-time code verification and for actuator diagnosis during full-scale engine testing. This self-contained unit can operate in an open-loop stand-alone mode or as part of a closed-loop control system. It can also be used for control system design and development. Tests conducted in conjunction with the NASA Advanced Detection, Isolation, and Accommodation program show that the simulator is a valuable tool for real-time code verification and for actuator fault diagnosis. Although currently a small-perturbation model, advances in microprocessor hardware should allow the simulator to evolve into a real-time, full-envelope, full engine simulation.

  19. 78 FR 47804 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ..., ``Configuration Management Plans for Digital Computer Software used in Safety Systems of Nuclear Power Plants... Digital Computer Software Used in Safety Systems of Nuclear Power Plants AGENCY: Nuclear Regulatory..., Reviews, and Audits for Digital Computer Software Used in Safety Systems of Nuclear Power Plants.'' This...

  20. Sampled-Data Techniques Applied to a Digital Controller for an Altitude Autopilot

    NASA Technical Reports Server (NTRS)

    Schmidt, Stanley F.; Harper, Eleanor V.

    1959-01-01

    Sampled-data theory, using the Z transformation, is applied to the design of a digital controller for an aircraft-altitude autopilot. Particular attention is focused on the sensitivity of the design to parameter variations and the abruptness of the response, that is, the normal acceleration required to carry out a transient maneuver. Consideration of these two characteristics of the system has shown that the finite settling time design method produces an unacceptable system, primarily because of the high sensitivity of the response to parameter variations, although abruptness can be controlled by increasing the sampling period. Also demonstrated is the importance of having well-damped poles or zeros if cancellation is attempted in the design methods. A different method of smoothing the response and obtaining a design which is not excessively sensitive is proposed, and examples are carried through to demonstrate the validity of the procedure. This method is based on design concepts of continuous systems, and it is shown that if no pole-zero cancellations are allowed in the design, one can obtain a response which is not too abrupt, is relatively insensitive to parameter variations, and is not sensitive to practical limits on control-surface rate. This particular design also has the simplest possible pulse transfer function for the digital controller. Simulation techniques and root loci are used for the verification of the design philosophy.
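
    The sensitivity finding reproduces in a toy setting. For the sampled first-order plant G(z) = b/(z - a), the controller C(z) = (z - a_nom)/(b(z - 1)) yields the finite-settling-time closed loop T(z) = 1/z, but only when the controller zero exactly cancels the plant pole. The sketch below (assumed plant values) shows how a 5% pole mismatch turns one-sample settling into a lingering transient.

      def step_response(a_true, a_nom=0.8, b=0.5, n_steps=12):
          """Unit-step response of the deadbeat loop when the controller was
          designed for pole a_nom but the plant pole is actually a_true."""
          y_prev = u_prev = e_prev = 0.0
          out = []
          for _ in range(n_steps):
              y = a_true * y_prev + b * u_prev       # plant G(z) = b/(z - a)
              e = 1.0 - y                            # unit-step reference
              u = u_prev + (e - a_nom * e_prev) / b  # C(z) = (z-a_nom)/(b(z-1))
              y_prev, u_prev, e_prev = y, u, e
              out.append(y)
          return out

      # Exact cancellation settles in one sample; a mismatched pole does not.
      print("nominal :", [round(v, 3) for v in step_response(0.80)])
      print("+5% pole:", [round(v, 3) for v in step_response(0.84)])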

  1. A HUMAN FACTORS ENGINEERING PROCESS TO SUPPORT HUMAN-SYSTEM INTERFACE DESIGN IN CONTROL ROOM MODERNIZATION

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kovesdi, C.; Joe, J.; Boring, R.

    The primary objective of the United States (U.S.) Department of Energy (DOE) Light Water Reactor Sustainability (LWRS) program is to sustain operation of the existing commercial nuclear power plants (NPPs) through a multi-pathway approach to research and development (R&D). The Advanced Instrumentation, Information, and Control (II&C) System Technologies pathway conducts targeted R&D to address aging and reliability concerns with legacy instrumentation and control (I&C) and other information systems in existing U.S. NPPs. Control room modernization is an important part of this pathway, and human factors experts at Idaho National Laboratory (INL) have been involved in conducting R&D to support migration from legacy analog and legacy digital I&C to new digital main control room (MCR) technologies. This paper describes a human factors engineering (HFE) process that supports human-system interface (HSI) design in MCR modernization activities, particularly the migration from old digital to new digital I&C. The process described in this work is an expansion of the LWRS Report INL/EXT-16-38576, and is a requirements-driven approach that aligns with NUREG-0711 requirements. The work described builds upon the existing literature by adding more detail around key tasks and decisions to make when transitioning from HSI design into Verification and Validation (V&V). The overall objective of this process is to inform HSI design and elicit specific, measurable, and achievable human factors criteria for new digital technologies. By following this process, utilities should have greater confidence when transitioning from HSI design into V&V.

  2. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  3. 37 CFR 382.15 - Verification of royalty payments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... payments. 382.15 Section 382.15 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.15...

  4. 37 CFR 382.15 - Verification of royalty payments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... payments. 382.15 Section 382.15 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.15...

  5. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  6. 37 CFR 382.15 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... payments. 382.15 Section 382.15 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.15...

  7. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  8. 37 CFR 382.15 - Verification of royalty payments.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... payments. 382.15 Section 382.15 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.15...

  9. 37 CFR 382.15 - Verification of royalty payments.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... payments. 382.15 Section 382.15 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.15...

  10. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  11. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... distributions. 382.16 Section 382.16 Patents, Trademarks, and Copyrights COPYRIGHT ROYALTY BOARD, LIBRARY OF CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... SATELLITE DIGITAL AUDIO RADIO SERVICES Preexisting Satellite Digital Audio Radio Services § 382.16...

  12. Formal methods and their role in digital systems validation for airborne systems

    NASA Technical Reports Server (NTRS)

    Rushby, John

    1995-01-01

    This report is based on one prepared as a chapter for the FAA Digital Systems Validation Handbook (a guide to assist FAA certification specialists with advanced technology issues). Its purpose is to explain the use of formal methods in the specification and verification of software and hardware requirements, designs, and implementations; to identify the benefits, weaknesses, and difficulties in applying these methods to digital systems used in critical applications; and to suggest factors for consideration when formal methods are offered in support of certification. The presentation concentrates on the rationale for formal methods and on their contribution to assurance for critical applications within a context such as that provided by DO-178B (the guidelines for software used on board civil aircraft); it is intended as an introduction for those to whom these topics are new.

  13. Digital Video of Live-Scan Fingerprint Data

    National Institute of Standards and Technology Data Gateway

    NIST Digital Video of Live-Scan Fingerprint Data (PC database for purchase)   NIST Special Database 24 contains MPEG-2 (Moving Picture Experts Group) compressed digital video of live-scan fingerprint data. The database is being distributed for use in developing and testing of fingerprint verification systems.

  14. Dynamic analysis methods for detecting anomalies in asynchronously interacting systems

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kumar, Akshat; Solis, John Hector; Matschke, Benjamin

    2014-01-01

    Detecting modifications to digital system designs, whether malicious or benign, is problematic due to the complexity of the systems being analyzed. Moreover, static analysis techniques and tools can only be used during the initial design and implementation phases to verify safety and liveness properties. It is computationally intractable to guarantee that any previously verified properties still hold after a system, or even a single component, has been produced by a third-party manufacturer. In this paper we explore new approaches for creating a robust system design by investigating highly-structured computational models that simplify verification and analysis. Our approach avoids the need to fully reconstruct the implemented system by incorporating a small verification component that dynamically detects deviations from the design specification at run time. The first approach encodes information extracted from the original system design algebraically into a verification component. During run time this component randomly queries the implementation for trace information and verifies that no design-level properties have been violated. If any deviation is detected, a pre-specified fail-safe or notification behavior is triggered. Our second approach utilizes a partitioning methodology to view liveness and safety properties as a distributed decision task and the implementation as a proposed protocol that solves this task. Thus the problem of verifying safety and liveness properties is translated to that of verifying that the implementation solves the associated decision task. We build upon results from distributed systems and algebraic topology to construct a learning mechanism for verifying safety and liveness properties from samples of run-time executions.
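
    The first approach, a small component that randomly samples run-time trace windows and checks a design-level invariant, can be sketched as follows; the monitored system, the property, and the fail-safe action are all invented for illustration.

      import random

      class Mutex:
          """Hypothetical implemented component exposing a queryable trace."""
          def __init__(self):
              self.trace = []            # run-time event log: (op, thread_id)
          def acquire(self, tid):
              self.trace.append(("acquire", tid))
          def release(self, tid):
              self.trace.append(("release", tid))

      def safety_property(window):
          """Design-level invariant: no two acquires without a release."""
          held = None
          for op, tid in window:
              if op == "acquire":
                  if held is not None:
                      return False       # double acquire: property violated
                  held = tid
              elif op == "release":
                  held = None
          return True

      def monitor(component, window=16,
                  fail_safe=lambda: print("FAIL-SAFE triggered")):
          """Randomly sample a trace window and verify the invariant."""
          t = component.trace
          if len(t) < window:
              return
          start = random.randrange(len(t) - window + 1)
          if not safety_property(t[start:start + window]):
              fail_safe()

      m = Mutex()
      m.acquire(1); m.release(1); m.acquire(2); m.acquire(3)  # injected bug
      monitor(m, window=4)    # detects the double acquire, triggers fail-safe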

  15. Noise Source Visualization Using a Digital Voice Recorder and Low-Cost Sensors

    PubMed Central

    Cho, Yong Thung

    2018-01-01

    Accurate sound visualization of noise sources is required for optimal noise control. Typically, noise measurement systems require microphones, an analog-digital converter, cables, a data acquisition system, etc., which may not be affordable for potential users. Also, many such systems are not highly portable and may not be convenient for travel. Handheld personal electronic devices such as smartphones and digital voice recorders with relatively lower costs and higher performance have become widely available recently. Even though such devices are highly portable, directly implementing them for noise measurement may lead to erroneous results since such equipment was originally designed for voice recording. In this study, external microphones were connected to a digital voice recorder to conduct measurements and the input received was processed for noise visualization. In this way, a low cost, compact sound visualization system was designed and introduced to visualize two actual noise sources for verification with different characteristics: an enclosed loud speaker and a small air compressor. Reasonable accuracy of noise visualization for these two sources was shown over a relatively wide frequency range. This very affordable and compact sound visualization system can be used for many actual noise visualization applications in addition to educational purposes. PMID:29614038

  16. Optimization and Verification of Droplet Digital PCR Event-Specific Methods for the Quantification of GM Maize DAS1507 and NK603.

    PubMed

    Grelewska-Nowotko, Katarzyna; Żurawska-Zajfert, Magdalena; Żmijewska, Ewelina; Sowa, Sławomir

    2018-05-01

    In recent years, digital polymerase chain reaction (dPCR), a new molecular biology technique, has been gaining in popularity. Among many other applications, this technique can also be used for the detection and quantification of genetically modified organisms (GMOs) in food and feed. It might replace the currently widely used real-time PCR method (qPCR) by overcoming problems related to PCR inhibition and the requirement for certified reference materials as calibrants. In theory, validated qPCR methods can be easily transferred to the dPCR platform. However, optimization of the PCR conditions might be necessary. In this study, we report the transfer of two validated qPCR methods for quantification of maize DAS1507 and NK603 events to the droplet dPCR (ddPCR) platform. After some optimization, both methods were verified according to the guidance of the European Network of GMO Laboratories (ENGL) on analytical method verification (ENGL Working Group on Method Verification (2011), Verification of Analytical Methods for GMO Testing When Implementing Interlaboratory Validated Methods). The digital PCR methods performed as well as or better than the qPCR methods. The optimized ddPCR methods confirmed their suitability for GMO determination in food and feed.
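
    The step that distinguishes dPCR from qPCR is absolute quantification by Poisson correction: if k of n partitions are positive, the mean copies per partition is lambda = -ln(1 - k/n), and the concentration follows by dividing by the partition volume. A minimal sketch (the droplet volume and counts are illustrative assumptions):

      import math

      def ddpcr_concentration(positives, partitions, droplet_volume_ul=0.85e-3):
          """Poisson-corrected target concentration, copies per microliter.

          With p = positives/partitions, the mean copies per droplet is
          lambda = -ln(1 - p), since an empty droplet has probability
          exp(-lambda). Assumes a nominal droplet volume of 0.85 nL.
          """
          p = positives / partitions
          if not 0.0 < p < 1.0:
              raise ValueError("need at least one positive and one negative")
          lam = -math.log(1.0 - p)
          return lam / droplet_volume_ul

      # Illustrative run: 4500 positive droplets out of 18000 accepted.
      conc = ddpcr_concentration(4500, 18000)
      print(f"{conc:.1f} copies/uL")   # about 338 copies/uL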

  17. Single Event Effects mitigation with TMRG tool

    NASA Astrophysics Data System (ADS)

    Kulis, S.

    2017-01-01

    Single Event Effects (SEE) are a major concern for integrated circuits exposed to radiation. Several techniques have been proposed to protect circuits against radiation-induced upsets; among them, the Triple Modular Redundancy (TMR) technique is one of the most popular. The purpose of the Triple Modular Redundancy Generator (TMRG) tool is to automate the process of triplicating digital circuits, freeing the designer from introducing the TMR code manually at the implementation stage. It helps to ensure that triplicated logic is maintained through the design process. Finally, the tool streamlines the process of introducing SEEs in gate-level simulations for final verification.
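
    The principle TMRG automates is simple to state: each register is instantiated three times and a majority vote masks any single upset. Below is a behavioral sketch (a Python stand-in for illustration; TMRG itself emits triplicated HDL, which is not reproduced here).

      import random

      def vote(a, b, c):
          """Bitwise majority of three redundant copies."""
          return (a & b) | (a & c) | (b & c)

      class TMRRegister:
          def __init__(self, width=8):
              self.copies = [0, 0, 0]
              self.mask = (1 << width) - 1
          def clock(self, d):
              self.copies = [d & self.mask] * 3   # all three flops load D
          def upset(self, copy, bit):
              self.copies[copy] ^= 1 << bit       # simulate an SEU bit-flip
          def q(self):
              return vote(*self.copies)           # voted output

      reg = TMRRegister()
      reg.clock(0b10110101)
      reg.upset(copy=random.randrange(3), bit=random.randrange(8))
      assert reg.q() == 0b10110101   # any single upset is masked
      print("single event upset masked:", bin(reg.q()))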

  18. 37 CFR 260.6 - Verification of royalty payments.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Verification of royalty payments. 260.6 Section 260.6 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.6...

  19. 37 CFR 260.6 - Verification of royalty payments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty payments. 260.6 Section 260.6 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.6...

  20. 37 CFR 260.6 - Verification of royalty payments.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false Verification of royalty payments. 260.6 Section 260.6 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.6...

  1. 37 CFR 260.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty payments. 260.6 Section 260.6 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.6...

  2. Performance verification and system parameter identification of spacecraft tape recorder control servo

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1979-01-01

    Design adequacy of the lead-lag compensator of the frequency loop, accuracy checking of the analytical expression for the electrical motor transfer function, and performance evaluation of the speed control servo of the digital tape recorder used on-board the 1976 Viking Mars Orbiters and Voyager 1977 Jupiter-Saturn flyby spacecraft are analyzed. The transfer functions of the most important parts of a simplified frequency loop used for test simulation are described and ten simulation cases are reported. The first four of these cases illustrate the method of selecting the most suitable transfer function for the hysteresis synchronous motor, while the rest verify and determine the servo performance parameters and alternative servo compensation schemes. It is concluded that the linear methods provide a starting point for the final verification/refinement of servo design by nonlinear time response simulation and that the variation of the parameters of the static/dynamic Coulomb friction is as expected in a long-life space mission environment.

  3. Software control and system configuration management - A process that works

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1983-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  4. Software control and system configuration management: A systems-wide approach

    NASA Technical Reports Server (NTRS)

    Petersen, K. L.; Flores, C., Jr.

    1984-01-01

    A comprehensive software control and system configuration management process for flight-crucial digital control systems of advanced aircraft has been developed and refined to ensure efficient flight system development and safe flight operations. Because of the highly complex interactions among the hardware, software, and system elements of state-of-the-art digital flight control system designs, a systems-wide approach to configuration control and management has been used. Specific procedures are implemented to govern discrepancy reporting and reconciliation, software and hardware change control, systems verification and validation testing, and formal documentation requirements. An active and knowledgeable configuration control board reviews and approves all flight system configuration modifications and revalidation tests. This flexible process has proved effective during the development and flight testing of several research aircraft and remotely piloted research vehicles with digital flight control systems that ranged from relatively simple to highly complex, integrated mechanizations.

  5. 37 CFR 260.6 - Verification of royalty payments.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Verification of royalty payments. 260.6 Section 260.6 Patents, Trademarks, and Copyrights U.S. COPYRIGHT OFFICE, LIBRARY OF... SUBSCRIPTION SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.6...

  6. 37 CFR 260.5 - Verification of statements of account.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false Verification of statements of account. 260.5 Section 260.5 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.5...

  7. 37 CFR 260.5 - Verification of statements of account.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of statements of account. 260.5 Section 260.5 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.5...

  8. 37 CFR 260.5 - Verification of statements of account.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of statements of account. 260.5 Section 260.5 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.5...

  9. 37 CFR 260.5 - Verification of statements of account.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Verification of statements of account. 260.5 Section 260.5 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.5...

  10. Thermal acoustic oscillations, volume 2. [cryogenic fluid storage

    NASA Technical Reports Server (NTRS)

    Spradley, L. W.; Sims, W. H.; Fan, C.

    1975-01-01

    A number of thermal acoustic oscillation phenomena and their effects on cryogenic systems were studied. The conditions which cause or suppress oscillations, the frequency, amplitude and intensity of oscillations when they exist, and the heat loss they induce are discussed. Methods of numerical analysis utilizing the digital computer were developed for use in cryogenic systems design. In addition, an experimental verification program was conducted to study oscillation wave characteristics and boiloff rate. The data were then reduced and compared with the analytical predictions.

  11. 37 CFR 260.5 - Verification of statements of account.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2014-07-01 2014-07-01 false Verification of statements of account. 260.5 Section 260.5 Patents, Trademarks, and Copyrights U.S. COPYRIGHT OFFICE, LIBRARY OF... SUBSCRIPTION SERVICES' DIGITAL TRANSMISSIONS OF SOUND RECORDINGS AND MAKING OF EPHEMERAL PHONORECORDS § 260.5...

  12. Certification of lightning protection for a full-authority digital engine control

    NASA Technical Reports Server (NTRS)

    Dargi, M.; Rupke, E.; Wiles, K.

    1991-01-01

    FADEC systems present many challenges to the lightning protection engineer. Verification of the protection-design adequacy for certification purposes presents additional challenges. The basic requirement of the certification plan for a FADEC is to demonstrate compliance with Federal Airworthiness Regulations (FAR) 25.1309 and 25.581. These FARs are intended for transport aircraft, but there are equivalent sections for general aviation aircraft and for normal and transport rotorcraft. Military aircraft may have additional requirements. The criteria for demonstration of adequate lightning protection for a FADEC system include the procedures outlined in FAA Advisory Circular (AC) 20-136, Protection of Aircraft Electrical/Electronic Systems Against the Indirect Effects of Lightning. As FADEC systems, including the interconnecting wiring, are generally not susceptible to direct attachment of lightning currents, the verification of protection against indirect effects is primarily described.

  13. An Integrated Unix-based CAD System for the Design and Testing of Custom VLSI Chips

    NASA Technical Reports Server (NTRS)

    Deutsch, L. J.

    1985-01-01

    A computer aided design (CAD) system that is being used at the Jet Propulsion Laboratory for the design of custom and semicustom very large scale integrated (VLSI) chips is described. The system consists of a Digital Equipment Corporation VAX computer with the UNIX operating system and a collection of software tools for the layout, simulation, and verification of microcircuits. Most of these tools were written by the academic community and are, therefore, available to JPL at little or no cost. Some small pieces of software have been written in-house in order to make all the tools interact with each other with a minimal amount of effort on the part of the designer.

  14. Watermarking 3D Objects for Verification

    DTIC Science & Technology

    1999-01-01

    Digital watermarking is relatively new to the signal (audio/image/video) processing and steganography fields, and even newer to the computer graphics community. The field of digital watermarking is relatively new, and many of its terms have not been well defined. Among the different media types, watermarking of 2D still images is comparatively better studied; digital watermarking of 3D objects remains comparatively new.

  15. A Fixed Point VHDL Component Library for a High Efficiency Reconfigurable Radio Design Methodology

    NASA Technical Reports Server (NTRS)

    Hoy, Scott D.; Figueiredo, Marco A.

    2006-01-01

    Advances in Field Programmable Gate Array (FPGA) technologies enable the implementation of reconfigurable radio systems for both ground and space applications. The development of such systems challenges the current design paradigms and requires more robust design techniques to meet the increased system complexity. Among these techniques is the development of component libraries to reduce design cycle time and to improve design verification, consequently increasing the overall efficiency of the project development process while increasing design success rates and reducing engineering costs. This paper describes the reconfigurable radio component library developed at the Software Defined Radio Applications Research Center (SARC) at Goddard Space Flight Center (GSFC) Microwave and Communications Branch (Code 567). The library is a set of fixed-point VHDL components that link the Digital Signal Processing (DSP) simulation environment with the FPGA design tools. This provides a direct synthesis path based on the latest developments of the VHDL tools as proposed in the IEEE VHDL 2004 effort, which allow for the simulation and synthesis of fixed-point math operations while maintaining bit and cycle accuracy. The VHDL fixed-point reconfigurable radio component library does not require the use of FPGA vendor-specific automatic component generators and provides a generic path from high-level DSP simulations implemented in MathWorks Simulink to any FPGA device. Access to the components' synthesizable source code provides full design verification capability.
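
    Bit accuracy means every operation must round and saturate exactly as the synthesized hardware will. Below is a small stand-in model of a signed fixed-point multiply of the kind such a library standardizes (the Q-format and word widths are illustrative; this is not the library's API):

      def to_fixed(x, frac_bits):
          """Quantize a real value to a signed fixed-point integer."""
          return int(round(x * (1 << frac_bits)))

      def fixed_mul(a, b, frac_bits, word_bits):
          """Bit-accurate signed multiply: full-precision product, then
          rounding of the low bits and symmetric saturation to word_bits."""
          full = a * b                                # double-width product
          rounded = (full + (1 << (frac_bits - 1))) >> frac_bits
          hi = (1 << (word_bits - 1)) - 1             # e.g. +32767 for 16 bits
          lo = -(1 << (word_bits - 1))
          return max(lo, min(hi, rounded))

      # Q1.15 example: 0.75 * 0.5 = 0.375, bit-exact at every step.
      a, b = to_fixed(0.75, 15), to_fixed(0.5, 15)
      y = fixed_mul(a, b, frac_bits=15, word_bits=16)
      print(y / (1 << 15))   # 0.375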

  16. 37 CFR 201.30 - Verification of Statements of Account.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2013-07-01 2013-07-01 false Verification of Statements of Account. 201.30 Section 201.30 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... manufacturer or importer of digital devices or media who is required by 17 U.S.C. 1003 to file with the...

  17. 37 CFR 201.30 - Verification of Statements of Account.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2012-07-01 2012-07-01 false Verification of Statements of Account. 201.30 Section 201.30 Patents, Trademarks, and Copyrights COPYRIGHT OFFICE, LIBRARY OF CONGRESS... manufacturer or importer of digital devices or media who is required by 17 U.S.C. 1003 to file with the...

  18. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty in ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, while those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.
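
    The essence of such a translator is mapping a structural netlist into logic definitions whose composition can be checked against a behavioral specification. Below is a toy version (the netlist format is hypothetical; the HOL term language is emulated by plain predicates, and a theorem prover would replace the exhaustive check):

      from itertools import product

      # Toy structural "HDL": gates given as (output, function, inputs).
      NETLIST = [
          ("n1", "and", ("a", "b")),
          ("n2", "not", ("c",)),
          ("out", "or", ("n1", "n2")),
      ]

      GATE_DEFS = {          # logic-level definition of each primitive
          "and": lambda x, y: x and y,
          "or":  lambda x, y: x or y,
          "not": lambda x: not x,
      }

      def translate(netlist):
          """Turn the structural description into a predicate: inputs -> out."""
          def circuit(env):
              env = dict(env)
              for out, gate, ins in netlist:   # assumes topological order
                  env[out] = GATE_DEFS[gate](*(env[i] for i in ins))
              return env["out"]
          return circuit

      # Behavioral specification the structure is supposed to implement.
      spec = lambda a, b, c: (a and b) or (not c)

      impl = translate(NETLIST)
      assert all(impl({"a": a, "b": b, "c": c}) == spec(a, b, c)
                 for a, b, c in product([False, True], repeat=3))
      print("structure verified against behavioral spec")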

  19. Verification and Validation of Digitally Upgraded Control Rooms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Boring, Ronald; Lau, Nathan

    2015-09-01

    As nuclear power plants undertake main control room modernization, a challenge is the lack of a clearly defined human factors process to follow. Verification and validation (V&V) as applied in the nuclear power community has tended to involve efforts such as integrated system validation, which comes at the tail end of the design stage. To fill in guidance gaps and create a step-by-step process for control room modernization, we have developed the Guideline for Operational Nuclear Usability and Knowledge Elicitation (GONUKE). This approach builds on best practices in the software industry, which prescribe an iterative user-centered approach featuring multiple cycles of design and evaluation. Nuclear regulatory guidance for control room design emphasizes summative evaluation—which occurs after the design is complete. In the GONUKE approach, evaluation is also performed at the formative stage of design—early in the design cycle using mockups and prototypes for evaluation. The evaluation may involve expert review (e.g., software heuristic evaluation at the formative stage and design verification against human factors standards like NUREG-0700 at the summative stage). The evaluation may also involve user testing (e.g., usability testing at the formative stage and integrated system validation at the summative stage). An additional, often overlooked component of evaluation is knowledge elicitation, which captures operator insights into the system. In this report we outline these evaluation types across design phases that support the overall modernization process. The objective is to provide industry-suitable guidance for steps to be taken in support of the design and evaluation of a new human-machine interface (HMI) in the control room. We suggest the value of early-stage V&V and highlight how this early-stage V&V can help improve the design process for control room modernization. We argue that there is a need to overcome two shortcomings of V&V in current practice—the propensity for late-stage V&V and the use of increasingly complex psychological assessment measures for V&V.

  20. Man-rated flight software for the F-8 DFBW program

    NASA Technical Reports Server (NTRS)

    Bairnsfather, R. R.

    1975-01-01

    The design, implementation, and verification of the flight control software used in the F-8 DFBW program are discussed. Since the DFBW utilizes an Apollo computer and hardware, the procedures, controls, and basic management techniques employed are based on those developed for the Apollo software system. Program Assembly Control, simulator configuration control, erasable-memory load generation, change procedures and anomaly reporting are discussed. The primary verification tools--the all-digital simulator, the hybrid simulator, and the Iron Bird simulator--are described, as well as the program test plans and their implementation on the various simulators. Failure-effects analysis and the creation of special failure-generating software for testing purposes are described. The quality of the end product is evidenced by the F-8 DFBW flight test program, in which 42 flights, totaling 58 hours of flight time, were successfully made without any DFCS in-flight software or hardware failures.

  1. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 46 Shipping 2 2011-10-01 2011-10-01 false Design verification testing. 61.40-3 Section 61.40-3... INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  2. Efficient design of clinical trials and epidemiological research: is it possible?

    PubMed

    Lauer, Michael S; Gordon, David; Wei, Gina; Pearson, Gail

    2017-08-01

    Randomized clinical trials and large-scale, cohort studies continue to have a critical role in generating evidence in cardiovascular medicine; however, the increasing concern is that ballooning costs threaten the clinical trial enterprise. In this Perspectives article, we discuss the changing landscape of clinical research, and clinical trials in particular, focusing on reasons for the increasing costs and inefficiencies. These reasons include excessively complex design, overly restrictive inclusion and exclusion criteria, burdensome regulations, excessive source-data verification, and concerns about the effect of clinical research conduct on workflow. Thought leaders have called on the clinical research community to consider alternative, transformative business models, including those models that focus on simplicity and leveraging of digital resources. We present some examples of innovative approaches by which some investigators have successfully conducted large-scale, clinical trials at relatively low cost. These examples include randomized registry trials, cluster-randomized trials, adaptive trials, and trials that are fully embedded within digital clinical care or administrative platforms.

  3. A review of existing and emerging digital technologies to combat the global trade in fake medicines.

    PubMed

    Mackey, Tim K; Nayyar, Gaurvika

    2017-05-01

    The globalization of the pharmaceutical supply chain has introduced new challenges, chief among them, fighting the international criminal trade in fake medicines. As the manufacture, supply, and distribution of drugs becomes more complex, so does the need for innovative technology-based solutions to protect patients globally. Areas covered: We conducted a multidisciplinary review of the science/health, information technology, computer science, and general academic literature with the aim of identifying cutting-edge existing and emerging 'digital' solutions to combat fake medicines. Our review identified five distinct categories of technology including mobile, radio frequency identification, advanced computational methods, online verification, and blockchain technology. Expert opinion: Digital fake medicine solutions are unifying platforms that integrate different types of anti-counterfeiting technologies as complementary solutions, improve information sharing and data collection, and are designed to overcome existing barriers of adoption and implementation. Investment in this next generation technology is essential to ensure the future security and integrity of the global drug supply chain.

  4. Digital conversion of INEL archeological data using ARC/INFO and Oracle

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, R.D.; Brizzee, J.; White, L.

    1993-11-04

    This report documents the procedures used to convert archaeological data for the INEL to digital format, lists the equipment used, and explains the verification and validation steps taken to check data entry. It also details the production of an engineered interface between ARC/INFO and Oracle.

  5. Precision pointing and control of flexible spacecraft

    NASA Technical Reports Server (NTRS)

    Bantell, M. H., Jr.

    1987-01-01

    The problem and long term objectives for the precision pointing and control of flexible spacecraft are given. The four basic objectives are stated in terms of two principle tasks. Under Task 1, robust low order controllers, improved structural modeling methods for control applications and identification methods for structural dynamics are being developed. Under Task 2, a lab test experiment for verification of control laws and system identification algorithms is being developed. For Task 1, work has focused on robust low order controller design and some initial considerations for structural modeling in control applications. For Task 2, work has focused on experiment design and fabrication, along with sensor selection and initial digital controller implementation. Conclusions are given.

  6. Firefly: an optical lithographic system for the fabrication of holographic security labels

    NASA Astrophysics Data System (ADS)

    Calderón, Jorge; Rincón, Oscar; Amézquita, Ricardo; Pulido, Iván.; Amézquita, Sebastián.; Bernal, Andrés.; Romero, Luis; Agudelo, Viviana

    2016-03-01

    This paper introduces Firefly, an optical lithography origination system that has been developed to produce holographic masters of high quality. This mask-less lithography system has a resolution of 418 nm half-pitch, and generates holographic masters with the optical characteristics required for security applications of level 1 (visual verification), level 2 (pocket reader verification) and level 3 (forensic verification). The holographic master constitutes the main core of the manufacturing process of security holographic labels used for the authentication of products and documents worldwide. Additionally, the Firefly is equipped with a software tool that allows for the hologram design from graphic formats stored in bitmaps. The software is capable of generating and configuring basic optical effects such as animation and color, as well as effects of high complexity such as Fresnel lenses, engraves and encrypted images, among others. The Firefly technology gathers together optical lithography, digital image processing and the most advanced control systems, making possible a competitive equipment that challenges the best technologies in the industry of holographic generation around the world. In this paper, a general description of the origination system is provided as well as some examples of its capabilities.

  7. Digital model of a vacuum circuit breaker for the analysis of switching waveforms in electrical circuits

    NASA Astrophysics Data System (ADS)

    Budzisz, Joanna; Wróblewski, Zbigniew

    2016-03-01

    The article presents a method of modelling a vacuum circuit breaker in the ATP/EMTP package. It reports the verification of the correctness of the developed digital circuit breaker model and of its practical usefulness for the analysis of overvoltages and overcurrents occurring in commutated capacitive electrical circuits, together with examples of digital simulations of overvoltages and overcurrents in selected electrical circuits.

  8. Energy- and time-resolved detection of prompt gamma-rays for proton range verification.

    PubMed

    Verburg, Joost M; Riley, Kent; Bortfeld, Thomas; Seco, Joao

    2013-10-21

    In this work, we present experimental results of a novel prompt gamma-ray detector for proton beam range verification. The detection system features an actively shielded cerium-doped lanthanum(III) bromide scintillator, coupled to a digital data acquisition system. The acquisition was synchronized to the cyclotron radio frequency to separate the prompt gamma-ray signals from the later-arriving neutron-induced background. We designed the detector to provide a high energy resolution and an effective reduction of background events, enabling discrete proton-induced prompt gamma lines to be resolved. Measuring discrete prompt gamma lines has several benefits for range verification. As the discrete energies correspond to specific nuclear transitions, the magnitudes of the different gamma lines have unique correlations with the proton energy and can be directly related to nuclear reaction cross sections. The quantification of discrete gamma lines also enables elemental analysis of tissue in the beam path, providing a better prediction of prompt gamma-ray yields. We present the results of experiments in which a water phantom was irradiated with proton pencil-beams in a clinical proton therapy gantry. A slit collimator was used to collimate the prompt gamma-rays, and measurements were performed at 27 positions along the path of proton beams with ranges of 9, 16 and 23 g cm⁻² in water. The magnitudes of discrete gamma lines at 4.44, 5.2 and 6.13 MeV were quantified. The prompt gamma lines were found to be clearly resolved in dimensions of energy and time, and had a reproducible correlation with the proton depth-dose curve. We conclude that the measurement of discrete prompt gamma-rays for in vivo range verification of clinical proton beams is feasible, and plan to further study methods and detector designs for clinical use.

  9. Circuit design tool. User's manual, revision 2

    NASA Technical Reports Server (NTRS)

    Miyake, Keith M.; Smith, Donald E.

    1992-01-01

    The CAM chip design was produced in a UNIX software environment using a design tool that supports definition of digital electronic modules, composition of these modules into higher level circuits, and event-driven simulation of these circuits. Our design tool provides an interface whose goals include straightforward but flexible primitive module definition and circuit composition, efficient simulation, and a debugging environment that facilitates design verification and alteration. The tool provides a set of primitive modules which can be composed into higher level circuits. Each module is a C-language subroutine that uses a set of interface protocols understood by the design tool. Primitives can be altered simply by recoding their C-code image; in addition new primitives can be added allowing higher level circuits to be described in C-code rather than as a composition of primitive modules--this feature can greatly enhance the speed of simulation.
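
    A rough sketch of the event-driven, module-as-subroutine simulation style described above, written in Python rather than the tool's actual C interface; all module and net names here are invented for the illustration:

    ```python
    import heapq

    # Minimal event-driven simulator: primitive modules are plain functions
    # that read input nets and schedule new values on their output nets.
    class Simulator:
        def __init__(self):
            self.time, self.queue, self.nets, self.watchers = 0, [], {}, {}

        def connect(self, net, module):
            self.watchers.setdefault(net, []).append(module)

        def schedule(self, delay, net, value):
            heapq.heappush(self.queue, (self.time + delay, net, value))

        def run(self):
            while self.queue:
                self.time, net, value = heapq.heappop(self.queue)
                if self.nets.get(net) != value:      # propagate real changes only
                    self.nets[net] = value
                    for module in self.watchers.get(net, []):
                        module(self)

    # A primitive NAND module with a propagation delay of 2 time units.
    def nand_gate(sim):
        a, b = sim.nets.get("a", 0), sim.nets.get("b", 0)
        sim.schedule(2, "y", 0 if (a and b) else 1)

    sim = Simulator()
    for net in ("a", "b"):
        sim.connect(net, nand_gate)
    sim.schedule(0, "a", 1)
    sim.schedule(1, "b", 1)
    sim.run()
    print(sim.nets)  # {'a': 1, 'b': 1, 'y': 0}
    ```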

  10. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification approach considers verification from the application development perspective, in which system architecture is designed explicitly according to the application's key properties. The D4V-hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state space explosion.

  11. Verification, Validation and Accreditation

    DTIC Science & Technology

    2011-05-03

  12. 21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...
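
    A minimal sketch of the proof-of-possession step the rule describes, using RSA via the Python `cryptography` package; the actual CSOS request format is not shown in this excerpt, so the payload below is illustrative:

    ```python
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    # Applicant: sign the certificate request with the private key whose
    # public half is enclosed in the request.
    private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    request = b"CSOS certificate request (illustrative payload)"
    signature = private_key.sign(request, padding.PKCS1v15(), hashes.SHA256())

    # Certification Authority: verifying with the enclosed public key serves
    # as proof of possession of the corresponding private key.
    try:
        private_key.public_key().verify(
            signature, request, padding.PKCS1v15(), hashes.SHA256())
        print("proof of possession verified")
    except InvalidSignature:
        print("request rejected")
    ```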

  13. 21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...

  14. 21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...

  15. Space shuttle engineering and operations support. Avionics system engineering

    NASA Technical Reports Server (NTRS)

    Broome, P. A.; Neubaur, R. J.; Welsh, R. T.

    1976-01-01

    The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.

  16. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties which require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  17. Two-dimensional optoelectronic interconnect-processor and its operational bit error rate

    NASA Astrophysics Data System (ADS)

    Liu, J. Jiang; Gollsneider, Brian; Chang, Wayne H.; Carhart, Gary W.; Vorontsov, Mikhail A.; Simonis, George J.; Shoop, Barry L.

    2004-10-01

    A two-dimensional (2-D) multi-channel 8x8 optical interconnect and processor system was designed and developed using complementary metal-oxide-semiconductor (CMOS) driven 850-nm vertical-cavity surface-emitting laser (VCSEL) arrays and photodetector (PD) arrays with corresponding wavelengths. We performed operation and bit-error-rate (BER) analysis on this free-space integrated 8x8 VCSEL optical interconnect driven by silicon-on-sapphire (SOS) circuits. A pseudo-random bit stream (PRBS) data sequence was used in operation of the interconnect. Eye diagrams were measured from individual channels and analyzed using a digital oscilloscope at data rates from 155 Mb/s to 1.5 Gb/s. Using a statistical model of Gaussian distribution for the random noise in the transmission, we developed a method to compute the BER instantaneously from the digital eye diagrams. Direct measurements on this interconnect were also taken on a standard BER tester for verification. We found that the results of the two methods were of the same order and within 50% accuracy. The integrated interconnect was investigated in an optoelectronic processing architecture of a digital halftoning image processor. Error diffusion networks implemented by the inherently parallel nature of photonics promise to provide high-quality digital halftoned images.
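
    The Gaussian-noise BER estimate can be illustrated with the standard Q-factor formulation; whether the authors used exactly this form is not stated in the abstract, and the rail statistics below are invented for the example:

    ```python
    from math import erfc, sqrt

    def ber_from_eye(mu1, sigma1, mu0, sigma0):
        """BER from eye-diagram rail statistics under a Gaussian noise model:
        Q = (mu1 - mu0) / (sigma1 + sigma0), BER = 0.5 * erfc(Q / sqrt(2))."""
        q = (mu1 - mu0) / (sigma1 + sigma0)
        return 0.5 * erfc(q / sqrt(2))

    # Illustrative one/zero rail means and noise sigmas (volts) read off a
    # digital-oscilloscope eye diagram.
    print(ber_from_eye(mu1=0.8, sigma1=0.05, mu0=0.1, sigma0=0.04))  # ~4e-15
    ```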

  18. A high precision dual feedback discrete control system designed for satellite trajectory simulator

    NASA Astrophysics Data System (ADS)

    Liu, Ximin; Liu, Liren; Sun, Jianfeng; Xu, Nan

    2005-08-01

    Cooperating with free-space laser communication terminals, the satellite trajectory simulator is used to test the acquisition, pointing, tracking and communicating performances of the terminals, so the simulator plays an important role in terminal ground test and verification. Using a double prism, Sun et al. in our group designed a satellite trajectory simulator. In this paper, a high-precision dual feedback discrete control system designed for the simulator is given and a digital simulation of the simulator is made correspondingly. In the dual feedback discrete control system, a Proportional-Integral (PI) controller is used in the velocity feedback loop and a Proportional-Integral-Derivative (PID) controller is used in the position feedback loop. In the controller design, the simplex method is introduced and an improvement to the method is made. According to the transfer function of the control system in the Z domain, the digital simulation of the simulator is given when it is exposed to mechanism error and moment disturbance. Typically, when the mechanism error is 100 µrad, the residual standard errors of pitching angle, azimuth angle, x-coordinate position and y-coordinate position are 0.49 µrad, 6.12 µrad, 4.56 µrad and 4.09 µrad respectively. When the moment disturbance is 0.1 rad, the residual standard errors of pitching angle, azimuth angle, x-coordinate position and y-coordinate position are 0.26 µrad, 0.22 µrad, 0.16 µrad and 0.15 µrad respectively. The simulation results demonstrate that the dual feedback discrete control system designed for the simulator can achieve the anticipated high-precision performance.
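
    A minimal sketch of the cascaded structure described above - a PI controller in the velocity loop inside a PID controller in the position loop; the first-order plant and all gains are illustrative assumptions, not the simulator's actual dynamics:

    ```python
    # Cascaded discrete control: an outer PID position loop commands an
    # inner PI velocity loop, mirroring the dual feedback structure above.
    dt = 0.001
    kp_v, ki_v = 50.0, 200.0            # velocity-loop PI gains (illustrative)
    kp_p, ki_p, kd_p = 5.0, 1.0, 0.1    # position-loop PID gains (illustrative)

    pos, vel = 0.0, 0.0                 # plant state
    target = 1.0                        # position setpoint (e.g. radians)
    i_v = i_p = 0.0
    prev_e_p = target - pos             # avoids a derivative kick at step 0

    for _ in range(5000):               # 5 s of simulated time
        # Outer PID position loop -> velocity command.
        e_p = target - pos
        i_p += e_p * dt
        d_p = (e_p - prev_e_p) / dt
        prev_e_p = e_p
        vel_cmd = kp_p * e_p + ki_p * i_p + kd_p * d_p

        # Inner PI velocity loop -> actuator effort.
        e_v = vel_cmd - vel
        i_v += e_v * dt
        u = kp_v * e_v + ki_v * i_v

        # Crude plant: acceleration proportional to effort, with damping.
        vel += (u - 2.0 * vel) * dt
        pos += vel * dt

    print(f"final position: {pos:.4f}")  # settles near the 1.0 setpoint
    ```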

  19. A design of optical modulation system with pixel-level modulation accuracy

    NASA Astrophysics Data System (ADS)

    Zheng, Shiwei; Qu, Xinghua; Feng, Wei; Liang, Baoqiu

    2018-01-01

    Vision measurement has been widely used in the field of dimensional measurement and surface metrology. However, traditional methods of vision measurement have many limits such as low dynamic range and poor reconfigurability. The optical modulation system before image formation has the advantage of high dynamic range, high accuracy and more flexibility, and the modulation accuracy is the key parameter which determines the accuracy and effectiveness of optical modulation system. In this paper, an optical modulation system with pixel level accuracy is designed and built based on multi-points reflective imaging theory and digital micromirror device (DMD). The system consisted of digital micromirror device, CCD camera and lens. Firstly we achieved accurate pixel-to-pixel correspondence between the DMD mirrors and the CCD pixels by moire fringe and an image processing of sampling and interpolation. Then we built three coordinate systems and calculated the mathematic relationship between the coordinate of digital micro-mirror and CCD pixels using a checkerboard pattern. A verification experiment proves that the correspondence error is less than 0.5 pixel. The results show that the modulation accuracy of system meets the requirements of modulation. Furthermore, the high reflecting edge of a metal circular piece can be detected using the system, which proves the effectiveness of the optical modulation system.

  20. SSME Alternate Turbopump Development Program: Design verification specification for high-pressure fuel turbopump

    NASA Technical Reports Server (NTRS)

    1989-01-01

    The design and verification requirements are defined which are appropriate to hardware at the detail, subassembly, component, and engine levels, and these requirements are correlated to the development demonstrations which provide verification that design objectives are achieved. The high-pressure fuel turbopump requirements verification matrix provides correlation between design requirements and the tests required to verify that the requirements have been met.

  1. 12 CFR 7.5005 - National bank acting as digital certification authority.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... customer as of the current or a previous date, such as account balance as of a particular date, lines of credit as of a particular date, past financial performance of the customer, and verification of customer relationship with the bank as of a particular date. (c) When a national bank issues a digital certificate...

  2. Design and prototyping of a wristband-type wireless photoplethysmographic device for heart rate variability signal analysis.

    PubMed

    Ghamari, M; Soltanpur, C; Cabrera, S; Romero, R; Martinek, R; Nazeran, H

    2016-08-01

    Heart Rate Variability (HRV) signal analysis provides a quantitative marker of Autonomic Nervous System (ANS) function. A wristband-type wireless photoplethysmographic (PPG) device was custom-designed to collect and analyze the arterial pulse at the wrist. The proposed device comprises an optical sensor to monitor the arterial pulse, a signal conditioning unit to filter and amplify the analog PPG signal, a microcontroller to digitize the analog PPG signal, and a Bluetooth module to transfer the data to a smart device. This paper proposes a novel model that represents the PPG signal as the summation of two Gaussian functions. The paper concludes with a verification procedure for HRV signal analysis during sedentary activities.
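
    The two-Gaussian pulse model can be sketched as below; the synthetic data, parameter values, and use of scipy's curve_fit are illustrative assumptions rather than the paper's actual fitting procedure:

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # One PPG pulse as the sum of two Gaussians, e.g. a systolic peak plus
    # a smaller, delayed component.
    def ppg_model(t, a1, mu1, s1, a2, mu2, s2):
        return (a1 * np.exp(-(t - mu1) ** 2 / (2 * s1 ** 2))
                + a2 * np.exp(-(t - mu2) ** 2 / (2 * s2 ** 2)))

    t = np.linspace(0, 1, 200)                  # one cardiac cycle (s)
    true = (1.0, 0.30, 0.08, 0.45, 0.60, 0.12)  # illustrative parameters
    rng = np.random.default_rng(0)
    pulse = ppg_model(t, *true) + 0.02 * rng.standard_normal(t.size)

    p0 = (1.0, 0.25, 0.10, 0.40, 0.55, 0.10)    # rough initial guess
    params, _ = curve_fit(ppg_model, t, pulse, p0=p0)
    print(np.round(params, 3))                  # recovers values close to `true`
    ```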

  3. SWARM: A 32 GHz Correlator and VLBI Beamformer for the Submillimeter Array

    NASA Astrophysics Data System (ADS)

    Primiani, Rurik A.; Young, Kenneth H.; Young, André; Patel, Nimesh; Wilson, Robert W.; Vertatschitsch, Laura; Chitwood, Billie B.; Srinivasan, Ranjani; MacMahon, David; Weintroub, Jonathan

    2016-03-01

    A 32 GHz bandwidth VLBI-capable correlator and phased array has been designed and deployed at the Smithsonian Astrophysical Observatory’s Submillimeter Array (SMA). The SMA Wideband Astronomical ROACH2 Machine (SWARM) integrates two instruments: a correlator with 140 kHz spectral resolution across its full 32 GHz band, used for connected interferometric observations, and a phased array summer used when the SMA participates as a station in the Event Horizon Telescope (EHT) very long baseline interferometry (VLBI) array. For each SWARM quadrant, Reconfigurable Open Architecture Computing Hardware (ROACH2) units, shared as open-source designs by the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER), are equipped with a pair of ultra-fast analog-to-digital converters (ADCs), a field programmable gate array (FPGA) processor, and eight 10 Gigabit Ethernet (GbE) ports. A VLBI data recorder interface designated the SWARM digital back end, or SDBE, is implemented with a ninth ROACH2 per quadrant, feeding four Mark6 VLBI recorders with an aggregate recording rate of 64 Gbps. This paper describes the design and implementation of SWARM, as well as its deployment at the SMA, with reference to verification and science data.
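
    A toy FX correlator illustrating the F-engine/X-engine split that SWARM implements in FPGA logic; a plain FFT stands in for the polyphase filterbank, and all sizes and signals are scaled-down inventions:

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    n_ant, n_chan, n_spectra = 4, 1024, 64
    n_samp = n_chan * n_spectra

    # Simulated time streams: a common "sky" signal, delayed differently at
    # each antenna, plus independent receiver noise.
    common = rng.standard_normal(n_samp + n_ant)
    streams = np.stack([
        common[k:k + n_samp] + 0.5 * rng.standard_normal(n_samp)
        for k in range(n_ant)
    ])

    # F-engine: channelize each antenna's stream.
    spectra = np.fft.rfft(streams.reshape(n_ant, n_spectra, n_chan), axis=2)

    # X-engine: cross-multiply every baseline and accumulate over spectra.
    visibilities = {}
    for i in range(n_ant):
        for j in range(i, n_ant):
            visibilities[(i, j)] = (spectra[i] * np.conj(spectra[j])).mean(axis=0)

    print(visibilities[(0, 1)].shape)  # one complex spectrum per baseline
    ```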

  4. Study of techniques for redundancy verification without disrupting systems, phases 1-3

    NASA Technical Reports Server (NTRS)

    1970-01-01

    The problem of verifying the operational integrity of redundant equipment and the impact of a requirement for verification on such equipment are considered. Redundant circuits are examined and the characteristics which determine adaptability to verification are identified. Mutually exclusive and exhaustive categories for verification approaches are established. The range of applicability of these techniques is defined in terms of signal characteristics and redundancy features. Verification approaches are discussed and a methodology for the design of redundancy verification is developed. A case study is presented which involves the design of a verification system for a hypothetical communications system. Design criteria for redundant equipment are presented. Recommendations for the development of technological areas pertinent to the goal of increased verification capabilities are given.

  5. Property-driven functional verification technique for high-speed vision system-on-chip processor

    NASA Astrophysics Data System (ADS)

    Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian

    2017-04-01

    The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. This vision chip verification complexity is also related to the fact that in most vision chip design cycles, extensive effort is focused on optimizing chip metrics such as performance, power, and area, while functional verification is not explicitly considered at the earlier stages, where the soundest decisions are made. In this paper, we propose a semi-automatic property-driven verification technique in which the implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can reduce the verification effort by up to 20% for a complex vision chip design while also reducing the simulation and debugging overheads.

  6. Transfer function verification and block diagram simplification of a very high-order distributed pole closed-loop servo by means of non-linear time-response simulation

    NASA Technical Reports Server (NTRS)

    Mukhopadhyay, A. K.

    1975-01-01

    Linear frequency domain methods are inadequate in analyzing the 1975 Viking Orbiter (VO75) digital tape recorder servo due to dominant nonlinear effects such as servo signal limiting, unidirectional servo control, and static/dynamic Coulomb friction. The frequency loop (speed control) servo of the VO75 tape recorder is used to illustrate the analytical tools and methodology of system redundancy elimination and high order transfer function verification. The paper compares time-domain performance parameters derived from a series of nonlinear time responses with the available experimental data in order to select the best possible analytical transfer function representation of the tape transport (mechanical segment of the tape recorder) from several possible candidates. The study also shows how an analytical time-response simulation taking into account most system nonlinearities can pinpoint system redundancy and overdesign stemming from a strictly empirical design approach. System order reduction is achieved through truncation of individual transfer functions and elimination of redundant blocks.

  7. Authentication of digital video evidence

    NASA Astrophysics Data System (ADS)

    Beser, Nicholas D.; Duerr, Thomas E.; Staisiunas, Gregory P.

    2003-11-01

    In response to a requirement from the United States Postal Inspection Service, the Technical Support Working Group tasked The Johns Hopkins University Applied Physics Laboratory (JHU/APL) to develop a technique that will ensure the authenticity, or integrity, of digital video (DV). Verifiable integrity is needed if DV evidence is to withstand a challenge to its admissibility in court on the grounds that it can be easily edited. Specifically, the verification technique must detect additions, deletions, or modifications to DV and satisfy the two-part criteria pertaining to scientific evidence as articulated in Daubert et al. v. Merrell Dow Pharmaceuticals Inc., 43 F3d (9th Circuit, 1995). JHU/APL has developed a prototype digital video authenticator (DVA) that generates digital signatures based on public key cryptography at the frame level of the DV. Signature generation and recording are accomplished at the same time the DV is recorded by the camcorder. Throughput supports the consumer-grade camcorder data rate of 25 Mbps. The DVA software is implemented on a commercial laptop computer, which is connected to a commercial digital camcorder via the IEEE-1394 serial interface. A security token provides agent identification and the interface to the public key infrastructure (PKI) that is needed for management of the public keys central to DV integrity verification.
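
    Frame-level signature generation can be sketched as below; Ed25519 is chosen only for brevity, since the abstract does not specify the DVA's algorithm, token interface, or frame format:

    ```python
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    signing_key = Ed25519PrivateKey.generate()  # stands in for the security token
    public_key = signing_key.public_key()

    frames = [b"DV frame 0 ...", b"DV frame 1 ...", b"DV frame 2 ..."]

    # Sign every frame as it is recorded; binding the frame index into the
    # signed message makes deletions and reorderings detectable, not just edits.
    signatures = [signing_key.sign(bytes([i]) + f) for i, f in enumerate(frames)]

    # Later, an examiner re-verifies each frame independently.
    for i, (frame, sig) in enumerate(zip(frames, signatures)):
        try:
            public_key.verify(sig, bytes([i]) + frame)
        except InvalidSignature:
            print(f"frame {i} fails integrity check")
    ```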

  8. 46 CFR 61.40-3 - Design verification testing.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... Shipping COAST GUARD, DEPARTMENT OF HOMELAND SECURITY (CONTINUED) MARINE ENGINEERING PERIODIC TESTS AND INSPECTIONS Design Verification and Periodic Testing of Vital System Automation § 61.40-3 Design verification testing. (a) Tests must verify that automated vital systems are designed, constructed, and operate in...

  9. The Multidimensional Integrated Intelligent Imaging project (MI-3)

    NASA Astrophysics Data System (ADS)

    Allinson, N.; Anaxagoras, T.; Aveyard, J.; Arvanitis, C.; Bates, R.; Blue, A.; Bohndiek, S.; Cabello, J.; Chen, L.; Chen, S.; Clark, A.; Clayton, C.; Cook, E.; Cossins, A.; Crooks, J.; El-Gomati, M.; Evans, P. M.; Faruqi, W.; French, M.; Gow, J.; Greenshaw, T.; Greig, T.; Guerrini, N.; Harris, E. J.; Henderson, R.; Holland, A.; Jeyasundra, G.; Karadaglic, D.; Konstantinidis, A.; Liang, H. X.; Maini, K. M. S.; McMullen, G.; Olivo, A.; O'Shea, V.; Osmond, J.; Ott, R. J.; Prydderch, M.; Qiang, L.; Riley, G.; Royle, G.; Segneri, G.; Speller, R.; Symonds-Tayler, J. R. N.; Triger, S.; Turchetta, R.; Venanzi, C.; Wells, K.; Zha, X.; Zin, H.

    2009-06-01

    MI-3 is a consortium of 11 universities and research laboratories whose mission is to develop complementary metal-oxide semiconductor (CMOS) active pixel sensors (APS) and to apply these sensors to a range of imaging challenges. A range of sensors has been developed: On-Pixel Intelligent CMOS (OPIC)—designed for in-pixel intelligence; FPN—designed to develop novel techniques for reducing fixed pattern noise; HDR—designed to develop novel techniques for increasing dynamic range; Vanilla/PEAPS—with digital and analogue modes and regions of interest, which has also been back-thinned; Large Area Sensor (LAS)—a novel, stitched LAS; and eLeNA—which develops a range of low noise pixels. Applications being developed include autoradiography, a gamma camera system, radiotherapy verification, tissue diffraction imaging, X-ray phase-contrast imaging, DNA sequencing and electron microscopy.

  10. A system for automatic evaluation of simulation software

    NASA Technical Reports Server (NTRS)

    Ryan, J. P.; Hodges, B. C.

    1976-01-01

    Within the field of computer software, simulation and verification are complementary processes. Simulation methods can be used to verify software by performing variable range analysis. More general verification procedures, such as those described in this paper, can be implicitly viewed as attempts at modeling the end-product software. From software requirement methodology, each component of the verification system has some element of simulation to it. Conversely, general verification procedures can be used to analyze simulation software. A dynamic analyzer is described which can be used to obtain properly scaled variables for an analog simulation, which is first digitally simulated. In a similar way, it is thought that the other system components, and indeed the whole system itself, have the potential of being effectively used in a simulation environment.

  11. Verification of the Chesapeake Bay Model.

    DTIC Science & Technology

    1981-12-01

    points on the model. Each inflow control unit consists of a pressure regulator, a digital flow control valve, and a flowmeter (Figure 8). A mechanical spring-type pressure regulator ensures constant pressure to the digital flow control valve. Each digital valve contains eight solenoid valve actuators. [Tabular data omitted.]

  12. 40 CFR 1065.308 - Continuous gas analyzer system-response and updating-recording verification-for gas analyzers not...

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... meets a minimum response time. You may use the results of this test to determine transformation time, t... you use any analog or real-time digital filters during emission testing, you must operate those... the rise time and fall time as needed. You may also configure analog or digital filters before...

  13. Fingerprint verification on medical image reporting system.

    PubMed

    Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah

    2008-03-01

    The healthcare industry is going through extensive changes through the adoption of robust, interoperable healthcare information technology in the form of electronic medical records (EMR). However, a major concern with EMR is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the possibility of patient data interception. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still providing confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates the techniques of fingerprint verification, DICOM objects, digital signatures and digital envelopes to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is restricted to certified parties.
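
    A digital envelope is conventionally built by encrypting the payload with a fresh symmetric key and sealing that key with the recipient's public key; the sketch below follows that convention (the paper's exact construction is not given in the abstract):

    ```python
    from cryptography.fernet import Fernet
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import padding, rsa

    recipient_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)

    # Seal: encrypt the DICOM payload with a one-time symmetric key, then
    # encrypt that key for the recipient -- the "digital envelope".
    dicom_bytes = b"...DICOM object bytes (illustrative)..."
    session_key = Fernet.generate_key()
    ciphertext = Fernet(session_key).encrypt(dicom_bytes)
    oaep = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                        algorithm=hashes.SHA256(), label=None)
    sealed_key = recipient_key.public_key().encrypt(session_key, oaep)

    # Open: only the holder of the private key recovers the session key.
    recovered = Fernet(recipient_key.decrypt(sealed_key, oaep)).decrypt(ciphertext)
    assert recovered == dicom_bytes
    ```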

  14. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.
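
    The low-level boolean equivalence checking contrasted above can be shown by brute force over all input assignments; BDD and model-checking tools perform the same comparison symbolically. Both circuits are invented for the example:

    ```python
    from itertools import product

    # Two candidate implementations of the same 3-input function: a
    # "specification" and an optimized gate-level version.
    def spec(a, b, c):
        return (a and b) or (a and c)

    def impl(a, b, c):
        return a and (b or c)      # factored form; should be equivalent

    # Exhaustive equivalence check over all 2**3 input assignments.
    mismatches = [bits for bits in product([False, True], repeat=3)
                  if spec(*bits) != impl(*bits)]
    print("equivalent" if not mismatches else f"counterexamples: {mismatches}")
    ```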

  15. Enhanced optical security by using information carrier digital screening

    NASA Astrophysics Data System (ADS)

    Koltai, Ferenc; Adam, Bence

    2004-06-01

    Jura has developed different security features based on Information Carrier Digital Screening. The substance of such features is that a non-visible secondary image is encoded in a visible primary image; the encoded image becomes visible only through a decoding device. One such development, Jura's Invisible Personal Information (IPI), is widely used in high-security documents, where personal data of the document holder are encoded in the screen of the document holder's photograph and can be decoded using an optical decoding device. In order to make document verification fully automated, enhance security and eliminate human factors, a digital version of IPI, D-IPI, was developed. A special 2D-barcode structure was designed, which contains a sufficient quantity of encoded digital information and can be embedded into the photo. The other part of Digital IPI is the reading software, which is able to retrieve the encoded information with high reliability. The reading software, developed around this specific 2D structure, provides the possibility of forensic analysis. Such analysis will discover all kinds of manipulation -- globally, if the photograph was simply replaced, and selectively, if only part of the photograph was manipulated. Digital IPI is a good example of how the benefits of digital technology can be exploited by optical security and how technology for optical security can be converted into digital technology. The D-IPI process is compatible with all current personalization printers and materials (polycarbonate, PVC, security papers, Teslin foils, etc.) and can provide any document with enhanced security and tamper resistance.
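
    Jura's IPI screening algorithm is proprietary, so as a generic stand-in for hiding machine-readable data inside an image, here is a least-significant-bit sketch (explicitly not Jura's method; the cover image and payload are invented):

    ```python
    import numpy as np

    def embed(cover, payload_bits):
        """Hide bits in the least significant bit of each pixel (generic
        steganography stand-in; IPI itself encodes data in the print screen)."""
        stego = cover.copy().ravel()
        n = payload_bits.size
        stego[:n] = (stego[:n] & 0xFE) | payload_bits
        return stego.reshape(cover.shape)

    def extract(stego, n_bits):
        return stego.ravel()[:n_bits] & 1

    cover = np.random.default_rng(2).integers(0, 256, (64, 64), dtype=np.uint8)
    payload = np.unpackbits(np.frombuffer(b"ID:12345", dtype=np.uint8))
    decoded = extract(embed(cover, payload), payload.size)
    print(bytes(np.packbits(decoded)))  # b'ID:12345'
    ```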

  16. Development of Standard Station Interface for Comprehensive Nuclear Test Ban Treaty Organisation Monitoring Networks

    NASA Astrophysics Data System (ADS)

    Dricker, I. G.; Friberg, P.; Hellman, S.

    2001-12-01

    Under contract with the CTBTO, Instrumental Software Technologies, Inc. (ISTI) has designed and developed a Standard Station Interface (SSI) - a set of executable programs and application programming interface libraries for the acquisition, authentication, archiving and telemetry of seismic and infrasound data for stations of the CTBTO nuclear monitoring network. SSI (written in C) is fully supported under both the Solaris and Linux operating systems and will be shipped with fully documented source code. SSI consists of several interconnected modules. The Digitizer Interface Module maintains a near-real-time data flow between multiple digitizers and the SSI. The Disk Buffer Module is responsible for local data archival. The Station Key Management Module is a low-level tool for data authentication and verification of incoming signatures. The Data Transmission Module supports packetized near-real-time data transmission from the primary CTBTO stations to the designated Data Center. The AutoDRM module allows transport of signed seismic and infrasound data via electronic mail (auxiliary station mode). The Command Interface Module is used to pass remote commands to the digitizers and other modules of SSI. A station operator has access to state-of-health information and waveforms via the Operator Interface Module. The modular design of SSI will allow painless extension of the software system within and outside the boundaries of CTBTO station requirements. Currently, an alpha version of SSI is undergoing extensive tests in the lab and on site.

  17. Design and Verification of a Digital Controller for a 2-Piece Hemispherical Resonator Gyroscope.

    PubMed

    Lee, Jungshin; Yun, Sung Wook; Rhim, Jaewook

    2016-04-20

    A Hemispherical Resonator Gyro (HRG) is a Coriolis Vibratory Gyro (CVG) that measures rotation angle or angular velocity using the Coriolis force acting on the vibrating mass. An HRG can be used as a rate gyro or an integrating gyro without structural modification by simply changing the control scheme. In this paper, differential control algorithms are designed for a 2-piece HRG. To design a precision controller, the electromechanical modelling and signal processing must first be performed accurately. Therefore, the equations of motion for the HRG resonator with switched harmonic excitations are derived with the Duhamel integral method. Electromechanical modeling of the resonator, electric module and charge amplifier is performed by considering the mode shape of a thin hemispherical shell. Further, signal processing and control algorithms are designed. The multi-flexing scheme of sensing and driving cycles and x-, y-axis switching cycles is appropriate for high-precision, low-maneuverability systems. On the basis of these studies, the differential control scheme can easily reject the common-mode errors of the x-, y-axis signals and switch to the rate-integrating mode. In the rate gyro mode the controller is composed of a Phase-Locked Loop (PLL) and amplitude, quadrature and rate control loops. All controllers are designed on the basis of a digital PI controller. The signal processing and control algorithms are verified through Matlab/Simulink simulations. Finally, an FPGA and DSP board implementing these algorithms is verified through experiments.
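
    The PLL at the heart of the rate-gyro controller can be sketched as a phase detector feeding a PI loop filter that steers a numerically controlled oscillator; all rates and gains below are illustrative and not taken from the HRG electronics:

    ```python
    import numpy as np

    fs = 50_000.0                        # sample rate (Hz), illustrative
    f_true, f_guess = 4_000.0, 3_900.0   # pickoff tone vs. initial NCO frequency
    kp, ki = 0.05, 5e-4                  # PI loop-filter gains (illustrative)

    t = np.arange(20_000) / fs
    x = np.sin(2 * np.pi * f_true * t + 0.5)   # signal to be tracked

    phase = 0.0
    freq = 2 * np.pi * f_guess / fs            # NCO phase step, rad/sample
    integ = 0.0
    for sample in x:
        # Phase detector: mixing with the NCO cosine leaves ~ sin(phase error).
        err = sample * np.cos(phase)
        integ += ki * err                      # integral path holds the offset
        phase += freq + kp * err + integ       # NCO update through the PI filter

    print(f"locked frequency ~ {(freq + integ) * fs / (2 * np.pi):.0f} Hz")  # ~4000
    ```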

  18. Spacelab, Spacehab, and Space Station Freedom payload interface projects

    NASA Technical Reports Server (NTRS)

    Smith, Dean Lance

    1992-01-01

    Contributions were made to several projects. Howard Nguyen was assisted in developing the Space Station RPS (Rack Power Supply). The RPS is a computer controlled power supply that helps test equipment used for experiments before the equipment is installed on Space Station Freedom. Ron Bennett of General Electric Government Services was assisted in the design and analysis of the Standard Interface Rack Controller hardware and software. An analysis was made of the GPIB (General Purpose Interface Bus), looking for any potential problems while transmitting data across the bus, such as the interaction of the bus controller with a data talker and its listeners. An analysis was made of GPIB bus communications in general, including any negative impact the bus may have on transmitting data back to Earth. A study was made of transmitting digital data back to Earth over a video channel. A report was written about the study and a revised version of the report will be submitted for publication. Work was started on the design of a PC/AT compatible circuit board that will combine digital data with a video signal. Another PC/AT compatible circuit board is being designed to recover the digital data from the video signal. A proposal was submitted to support the continued development of the interface boards after the author returns to Memphis State University in the fall. A study was also made of storing circuit board design software and data on the hard disk server of a LAN (Local Area Network) that connects several IBM style PCs. A report was written that makes several recommendations. A preliminary design review was started of the AIVS (Automatic Interface Verification System). The summer was over before any significant contribution could be made to this project.

  19. Digital I and C system upgrade integration technique

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huang, H. W.; Shih, C.; Wang, J. R.

    2012-07-01

    This work developed an integration technique for digital I and C system upgrades, by which a utility can replace its I and C systems step by step in a systematic way. The Institute of Nuclear Energy Research (INER) developed a digital Instrumentation and Control (I and C) replacement integration technique on the basis of the requirements of the three existing nuclear power plants (NPPs) in Taiwan - Chin-Shan (CS) NPP, Kuo-Sheng (KS) NPP, and Maanshan (MS) NPP - and also developed the related Critical Digital Review (CDR) procedure. The digital I and C replacement integration technique includes: (1) establishment of a Nuclear Power Plant Digital Replacement Integration Guideline, (2) preliminary investigation of I and C system digitalization, (3) evaluation of I and C system digitalization, and (4) establishment of I and C system digitalization architectures. These works can be a reference for performing I and C system digital replacement integration at the three existing NPPs of Taiwan Power Company (TPC). A CDR is the review for a critical system digital I and C replacement. The major reference for this procedure is EPRI TR-1011710 (2005), 'Handbook for Evaluating Critical Digital Equipment and Systems', published by the Electric Power Research Institute (EPRI). With this document, INER developed a TPC-specific CDR procedure. Currently, the CDR is one of the policies for digital I and C replacement in TPC. The contents of this CDR procedure include scope, responsibility, operation procedure, operation flow chart, and CDR review items. The CDR review items include comparison of the design change, Software Verification and Validation (SV and V), Failure Mode and Effects Analysis (FMEA), evaluation of Diversity and Defense-in-depth (D3), evaluation of the watchdog timer, evaluation of Electromagnetic Compatibility (EMC), evaluation of grounding for system/component, seismic evaluation, witness and inspection, and lessons learnt from digital I and C failure events. A solid review can assure the quality of the digital I and C system replacement. (authors)

  20. Multichannel Baseband Processor for Wideband CDMA

    NASA Astrophysics Data System (ADS)

    Jalloul, Louay M. A.; Lin, Jim

    2005-12-01

    The system architecture of the cellular base station modem engine (CBME) is described. The CBME is a single-chip multichannel transceiver capable of processing and demodulating signals from multiple users simultaneously. It is optimized to process different classes of code-division multiple-access (CDMA) signals. The paper will show that through key functional system partitioning, tightly coupled small digital signal processing cores, and time-sliced reuse architecture, CBME is able to achieve a high degree of algorithmic flexibility while maintaining efficiency. The paper will also highlight the implementation and verification aspects of the CBME chip design. In this paper, wideband CDMA is used as an example to demonstrate the architecture concept.

  1. Optical detection of random features for high security applications

    NASA Astrophysics Data System (ADS)

    Haist, T.; Tiziani, H. J.

    1998-02-01

    Optical detection of random features in combination with digital signatures based on public key codes in order to recognize counterfeit objects is discussed. Objects are protected against counterfeiting without applying expensive production techniques. Verification is done off-line by optical means, without a central authority. The method is applied to protecting banknotes, and experimental results for this application are presented. The method is also applicable to identity verification of a credit- or chip-card holder.

  2. New lumbar disc endoprosthesis applied to the patient's anatomic features.

    PubMed

    Mróz, Adrian; Skalski, Konstanty; Walczyk, Wojciech

    2015-01-01

    The paper describes the process of designing, manufacturing and design verification of a new structure of intervertebral lumbar disc endoprosthesis, INOP/LSP.1101. Modern, noninvasive medical imaging techniques make it possible to record test results in digital form, which creates opportunities for further processing. Mimics Innovation Suite software generates three-dimensional virtual models reflecting the real shape and dimensions of the components of the L4-L5 spinal motion segment. With the use of a 3D printing technique, physical models of the bone structures of the mobile segment of the spine as well as the INOP/LSP.1101 endoprosthesis model were generated. A simplified FEA analysis of stresses in the endoprosthesis was performed to evaluate the designed geometries and materials of the new structure. The endoprosthesis prototype was made of Co28Cr6Mo alloy using selective laser melting (SLM) technology. The prototypes were subjected to tribological verification with the use of the SBT-03.1 spine simulator. The structure of the endoprosthesis ensures full reproduction of its kinematics, a full range of mobility of the motion segment in all anatomical planes, and restoration of the normal height of the intervertebral space and the curvature of the lordosis. The results of the tribological tests confirmed that SLM technology has the potential for production of human bone and joint endoprostheses.

  3. Design of a 32-Channel EEG System for Brain Control Interface Applications

    PubMed Central

    Wang, Ching-Sung

    2012-01-01

    This study integrates hardware circuit design and supporting software development to achieve a 32-channel EEG system for BCI applications. Since EEG signals of the human body are generally very weak, the design must prevent noise interference while also avoiding waveform distortion and waveform offset; the design of a preamplifier with a high common-mode rejection ratio and a high signal-to-noise ratio is therefore very important. Moreover, the friction between the electrode pads and the skin, as well as the dual power supply design, generates a DC bias that affects the measured signals. For this reason, this study designs an improved single-power AC-coupled circuit, which effectively reduces the DC bias and the error caused by component tolerances. At the same time, adjustable amplification and filtering are implemented digitally, so the design can target different EEG frequency bands. The analog filtering circuit extracts a frequency band, and the digital filter then adjusts the extracted band to the target band; MATLAB is used to build a man-machine interface for displaying the brain waves. Finally, the measured signals are compared to those of a traditional 32-channel EEG system. In addition to meeting the IFCN standards, the system design was verified by measurements in a standard EEG isolation room in order to demonstrate its accuracy and reliability. PMID:22778545
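
    The digital band-selection idea can be sketched with a Butterworth bandpass retuned per EEG band (here the 8-12 Hz alpha band); the sample rate, signal amplitudes, and filter order are illustrative assumptions:

    ```python
    import numpy as np
    from scipy.signal import butter, filtfilt

    fs = 250.0                          # EEG sample rate (Hz), illustrative
    t = np.arange(0, 4, 1 / fs)
    # Synthetic channel: 10 Hz alpha activity plus slow drift and line noise.
    eeg = (20e-6 * np.sin(2 * np.pi * 10 * t)
           + 50e-6 * np.sin(2 * np.pi * 0.3 * t)
           + 10e-6 * np.sin(2 * np.pi * 50 * t))

    # Adjustable digital band selection: retune [lo, hi] for other EEG bands.
    lo, hi = 8.0, 12.0                  # alpha band (Hz)
    b, a = butter(4, [lo, hi], btype="bandpass", fs=fs)
    alpha = filtfilt(b, a, eeg)         # zero-phase filtering

    print(f"output RMS ~ {np.sqrt(np.mean(alpha ** 2)) * 1e6:.1f} uV")  # ~14 uV
    ```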

  4. Design of a 32-channel EEG system for brain control interface applications.

    PubMed

    Wang, Ching-Sung

    2012-01-01

    This study integrates hardware circuit design and supporting software development to achieve a 32-channel EEG system for BCI applications. Since EEG signals of the human body are generally very weak, the design must prevent noise interference while also avoiding waveform distortion and waveform offset; the design of a preamplifier with a high common-mode rejection ratio and a high signal-to-noise ratio is therefore very important. Moreover, the friction between the electrode pads and the skin, as well as the dual power supply design, generates a DC bias that affects the measured signals. For this reason, this study designs an improved single-power AC-coupled circuit, which effectively reduces the DC bias and the error caused by component tolerances. At the same time, adjustable amplification and filtering are implemented digitally, so the design can target different EEG frequency bands. The analog filtering circuit extracts a frequency band, and the digital filter then adjusts the extracted band to the target band; MATLAB is used to build a man-machine interface for displaying the brain waves. Finally, the measured signals are compared to those of a traditional 32-channel EEG system. In addition to meeting the IFCN standards, the system design was verified by measurements in a standard EEG isolation room in order to demonstrate its accuracy and reliability.

  5. A Practical Approach to Identity on Digital Ecosystems Using Claim Verification and Trust

    NASA Astrophysics Data System (ADS)

    McLaughlin, Mark; Malone, Paul

    Central to the ethos of digital ecosystems (DEs) is that DEs should be distributed and have no central points of failure or control. This essentially mandates a decentralised system, which poses significant challenges for identity. Identity in decentralised environments must be treated very differently to identity in traditional environments, where centralised naming, authentication and authorisation can be assumed, and where identifiers can be considered global and absolute. In the absence of such guarantees we have expanded on the OPAALS identity model to produce a general implementation for the OPAALS DE that uses a combination of identity claim verification protocols and trust to give assurances in place of centralised servers. We outline how the components of this implementation function and give an illustrated workflow of how identity issues are solved on the OPAALS DE in practice.

  6. Spacecraft attitude calibration/verification baseline study

    NASA Technical Reports Server (NTRS)

    Chen, L. C.

    1981-01-01

    A baseline study for a generalized spacecraft attitude calibration/verification system is presented. It can be used to define software specifications for three major functions required by a mission: the pre-launch parameter observability and data collection strategy study; the in-flight sensor calibration; and the post-calibration attitude accuracy verification. Analytical considerations are given for both single-axis and three-axis spacecraft. The three-axis attitudes considered include inertial-pointing attitudes, reference-pointing attitudes, and attitudes undergoing specific maneuvers. The attitude sensors and hardware considered include Earth horizon sensors, plane-field Sun sensors, coarse and fine two-axis digital Sun sensors, three-axis magnetometers, fixed-head star trackers, and inertial reference gyros.

  7. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication verification, or installation verification...

  8. Towards the formal verification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report, 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', which includes the general-purpose HOL theories and definitions that support the PIU verification as well as the tactics used in the proofs.

  9. U. S. GEOLOGICAL SURVEY LAND REMOTE SENSING ACTIVITIES.

    USGS Publications Warehouse

    Frederick, Doyle G.

    1983-01-01

    USGS uses all types of remotely sensed data, in combination with other sources of data, to support geologic analyses, hydrologic assessments, land cover mapping, image mapping, and applications research. Survey scientists use all types of remotely sensed data with ground verifications and digital topographic and cartographic data. A considerable amount of research is being done by Survey scientists on developing automated geographic information systems that can handle a wide variety of digital data. The Survey is also investigating the use of microprocessor computer systems for accessing, displaying, and analyzing digital data.

  10. Digitized forensics: retaining a link between physical and digital crime scene traces using QR-codes

    NASA Astrophysics Data System (ADS)

    Hildebrandt, Mario; Kiltz, Stefan; Dittmann, Jana

    2013-03-01

    The digitization of physical traces from crime scenes in forensic investigations in effect creates a digital chain-of-custody and entails the challenge of creating a link between two or more representations of the same trace. In order to be forensically sound, the two security aspects of integrity and authenticity especially need to be maintained at all times. Adherence to authenticity by technical means proves to be a particular challenge at the boundary between the physical object and its digital representations. In this article we propose a new method of linking physical objects with their digital counterparts using two-dimensional bar codes and additional meta-data accompanying the acquired data, for integration into the conventional documentation of the collection of items of evidence (the bagging and tagging process). Using the QR-code as a particular bar code implementation and a model of the forensic process, we also supply a means to integrate our suggested approach into forensically sound proceedings as described by Holder et al. [1] We use the example of digital dactyloscopy as a forensic discipline where progress is currently being made by digitizing some of the processing steps. We show an exemplary demonstrator of the suggested approach using a smartphone as a mobile device for the verification of the physical trace, extending the chain-of-custody from the physical to the digital domain. Our evaluation of the demonstrator addresses the readability of the bar code and the verification of its contents. We can read the bar code with various devices despite its limited size of 42 x 42 mm and the rather large amount of embedded data. Furthermore, the QR-code's error correction features help to recover the contents of damaged codes. Finally, our appended digital signature allows malicious manipulations of the embedded data to be detected.
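
    A sketch of a signed trace label in the spirit of the demonstrator, assuming the third-party `qrcode` package (with Pillow) is installed; the metadata fields, byte strings, and choice of Ed25519 are illustrative, not the authors' exact format:

    ```python
    import hashlib, json
    import qrcode  # third-party package, assumed installed
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

    # Metadata for one physical trace plus a hash of its digital acquisition.
    scan = b"(bytes of the acquired trace image, illustrative)"
    record = {
        "case": "2013-0042",                    # invented identifiers
        "item": "latent print, door handle",
        "sha256": hashlib.sha256(scan).hexdigest(),
    }

    # Sign the record so later manipulation of the embedded data is detectable.
    key = Ed25519PrivateKey.generate()
    payload = json.dumps(record, sort_keys=True).encode()
    record["sig"] = key.sign(payload).hex()

    # Encode the signed record as a QR label for the evidence bag.
    qrcode.make(json.dumps(record)).save("evidence_label.png")
    ```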

  11. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory.

    PubMed

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used in the Internet and also in the support of education. Last year, a special pilot audiovisual lecture room was built as a part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project solution. It contains all the elements of the modern lecture room determined for the implementation of presentation recordings based on the rich-media technologies and their publication online or on-demand featuring the access of all its elements in the automated mode including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of the Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of the properties. These original PPPA are significantly generalized for the newly introduced class of the SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent nets theory were significantly applied at the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality.
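
    For readers unfamiliar with the formalism, a minimal place/transition Petri net with the standard enabling and firing rule is sketched below; this illustrates plain Petri nets only, not the SNT process/agent nets or the PPPA operators introduced in the paper:

    ```python
    # Minimal place/transition Petri net with the standard firing rule.
    class PetriNet:
        def __init__(self, marking):
            self.marking = dict(marking)   # place -> token count
            self.transitions = {}          # name -> (input arcs, output arcs)

        def add_transition(self, name, inputs, outputs):
            self.transitions[name] = (inputs, outputs)

        def enabled(self, name):
            inputs, _ = self.transitions[name]
            return all(self.marking.get(p, 0) >= w for p, w in inputs.items())

        def fire(self, name):
            if not self.enabled(name):
                raise RuntimeError(f"{name} is not enabled")
            inputs, outputs = self.transitions[name]
            for p, w in inputs.items():
                self.marking[p] -= w
            for p, w in outputs.items():
                self.marking[p] = self.marking.get(p, 0) + w

    # A recording-session token moves from "ready" to "published".
    net = PetriNet({"ready": 1})
    net.add_transition("record_and_edit", {"ready": 1}, {"published": 1})
    net.fire("record_and_edit")
    print(net.marking)  # {'ready': 0, 'published': 1}
    ```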

  12. Automation of Presentation Record Production Based on Rich-Media Technology Using SNT Petri Nets Theory

    PubMed Central

    Martiník, Ivo

    2015-01-01

    Rich-media describes a broad range of digital interactive media that is increasingly used in the Internet and also in the support of education. Last year, a special pilot audiovisual lecture room was built as a part of the MERLINGO (MEdia-rich Repository of LearnING Objects) project solution. It contains all the elements of the modern lecture room determined for the implementation of presentation recordings based on the rich-media technologies and their publication online or on-demand featuring the access of all its elements in the automated mode including automatic editing. Property-preserving Petri net process algebras (PPPA) were designed for the specification and verification of the Petri net processes. PPPA does not need to verify the composition of the Petri net processes because all their algebraic operators preserve the specified set of the properties. These original PPPA are significantly generalized for the newly introduced class of the SNT Petri process and agent nets in this paper. The PLACE-SUBST and ASYNC-PROC algebraic operators are defined for this class of Petri nets and their chosen properties are proved. The SNT Petri process and agent nets theory were significantly applied at the design, verification, and implementation of the programming system ensuring the pilot audiovisual lecture room functionality. PMID:26258164

  13. A study of applications scribe frame data verifications using design rule check

    NASA Astrophysics Data System (ADS)

    Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki

    2013-06-01

    In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables for scanner alignment, wafer inspection and customer-specified marks, and we check at the end that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in the COT (customer-owned tooling) business and in new technology development, there has been no effective verification method for scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which are used in device verification. We show the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching against the mark library. As a result, our experiments demonstrated that, by use of pattern matching and DRC verification, our new method can yield speed improvements of more than 12 percent compared to conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
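
    A DRC-style mark check can be illustrated with a single minimum-spacing rule between rectangles; the marks, coordinates, and rule value are invented, and a production DRC deck is far richer:

    ```python
    from itertools import combinations

    # Each mark is an axis-aligned rectangle: (name, x0, y0, x1, y1), in um.
    marks = [
        ("align_mark_A", 0.0, 0.0, 40.0, 40.0),
        ("inspect_mark_B", 45.0, 0.0, 85.0, 40.0),
        ("custom_mark_C", 86.0, 0.0, 120.0, 40.0),
    ]
    MIN_SPACE = 5.0  # assumed minimum-spacing rule (um)

    def spacing(r1, r2):
        """Edge-to-edge distance between two rectangles (0 if they overlap)."""
        dx = max(r2[1] - r1[3], r1[1] - r2[3], 0.0)
        dy = max(r2[2] - r1[4], r1[2] - r2[4], 0.0)
        return (dx * dx + dy * dy) ** 0.5

    for a, b in combinations(marks, 2):
        if spacing(a, b) < MIN_SPACE:
            print(f"DRC violation: {a[0]} vs {b[0]}, spacing {spacing(a, b):.2f} um")
    ```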

  14. CD volume design and verification

    NASA Technical Reports Server (NTRS)

    Li, Y. P.; Hughes, J. S.

    1993-01-01

    In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own models of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.
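
    A minimal sketch of model-driven, rule-based verification in Python; the volume fields and rules below are hypothetical stand-ins, not the prototype's actual model or rule language.

      # The volume model is plain data; each rule is a named predicate over it.
      volume_model = {
          "volume_id": "PDS_0042",
          "block_size": 2048,
          "files": [{"path": "INDEX/INDEX.TAB", "bytes": 51200},
                    {"path": "DOCUMENT/VOLINFO.TXT", "bytes": 812}],
      }

      rules = [
          ("block size is 2048", lambda v: v["block_size"] == 2048),
          ("volume id is set",   lambda v: bool(v["volume_id"])),
          ("index file exists",  lambda v: any(f["path"].startswith("INDEX/")
                                               for f in v["files"])),
      ]

      def verify(volume, rules):
          """Return the names of all rules the volume model violates."""
          return [name for name, rule in rules if not rule(volume)]

      failures = verify(volume_model, rules)
      print("OK" if not failures else "failed: " + ", ".join(failures))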

  15. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  16. Design and Realization of Controllable Ultrasonic Fault Detector Automatic Verification System

    NASA Astrophysics Data System (ADS)

    Sun, Jing-Feng; Liu, Hui-Ying; Guo, Hui-Juan; Shu, Rong; Wei, Kai-Li

    Ultrasonic flaw-detection equipment with a remote-control interface is studied, and an automatic verification system for it is developed. By using extensible markup language (XML) to build the protocol instruction set and the data-analysis method database in the system software, the design becomes configurable and accommodates the diversity of undisclosed device interfaces and protocols. By cascading a signal generator with a fixed attenuator, a dynamic error compensation method is proposed that accomplishes what the fixed attenuator does in traditional verification and improves the accuracy of the verification results. Operation of the automatic verification system confirms the feasibility of the system's hardware and software architecture and the correctness of the analysis method, while replacing the cumbersome manual operations of the traditional verification process and reducing the labor intensity for test personnel.

  17. SWARM: A Compact High Resolution Correlator and Wideband VLBI Phased Array Upgrade for SMA

    NASA Astrophysics Data System (ADS)

    Weintroub, Jonathan

    2014-06-01

    A new digital back end (DBE) is being commissioned on Mauna Kea. The “SMA Wideband Astronomical ROACH2 Machine”, or SWARM, processes a 4 GHz usable band in single-polarization mode and is flexibly reconfigurable for 2 GHz full-Stokes dual polarization. The hardware is based on the open-source Reconfigurable Open Architecture Computing Hardware 2 (ROACH2) platform from the Collaboration for Astronomy Signal Processing and Electronics Research (CASPER). A 5 GSps quad-core analog-to-digital converter board uses a commercial chip from e2v installed on a CASPER-standard printed circuit board designed by Homin Jiang’s group at ASIAA. Two ADC channels are provided per ROACH2, each sampling a 2.3 GHz Nyquist band generated by a custom wideband block downconverter (BDC). The ROACH2 logic includes a 16k-channel polyphase filterbank (F-engine) per input, followed by a 10 GbE switch-based corner turn that feeds correlator-accumulator logic (X-engines) co-located with the F-engines. This arrangement makes very effective use of a small amount of digital hardware (just 8 ROACH2s in 1U rack-mount enclosures). The primary challenge now is to meet timing at full speed for a large and very complex FPGA bit code. Design of the VLBI phased-sum and recorder interface logic is also in progress. Our poster will describe the instrument design, with a focus on the particular challenges of ultra-wideband signal processing. Early connected commissioning and science verification data will be presented.
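
    The F-engine/X-engine split described above can be sketched in a few lines of numpy. This toy FX correlator uses a plain FFT in place of SWARM's 16k-channel polyphase filterbank, and all sizes and signals are illustrative.

      # Toy FX correlator: channelize two inputs (F-engines), then cross-multiply
      # and accumulate per channel (X-engine).
      import numpy as np

      def fx_correlate(x, y, nchan=1024, nspectra=64):
          acc = np.zeros(nchan, dtype=complex)
          for k in range(nspectra):
              seg = slice(k * nchan, (k + 1) * nchan)
              X = np.fft.fft(x[seg])        # F-engine for antenna 1
              Y = np.fft.fft(y[seg])        # F-engine for antenna 2
              acc += X * np.conj(Y)         # X-engine: cross-power, accumulated
          return acc / nspectra

      rng = np.random.default_rng(0)
      common = rng.standard_normal(1024 * 64)                   # correlated sky signal
      x = common + 0.1 * rng.standard_normal(common.size)
      y = np.roll(common, 3) + 0.1 * rng.standard_normal(common.size)   # 3-sample delay
      spec = fx_correlate(x, y)
      # A pure delay appears as a linear phase slope across frequency channels.
      print(np.angle(spec[:5]))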

  18. Cluster man/system design requirements and verification. [for Skylab program

    NASA Technical Reports Server (NTRS)

    Watters, H. H.

    1974-01-01

    Discussion of the procedures employed for determining the man/system requirements that guided Skylab design, and review of the techniques used for implementing man/system design verification. The foremost lesson learned from the requirement-anticipation and design-verification experience is the necessity of allowing for human capabilities of in-flight maintenance and repair. It is now known that the entire program was salvaged by a series of unplanned maintenance and repair events which were implemented in spite of poor design provisions for maintenance.

  19. Parallel processing for digital picture comparison

    NASA Technical Reports Server (NTRS)

    Cheng, H. D.; Kou, L. T.

    1987-01-01

    In picture processing an important problem is to identify two digital pictures of the same scene taken under different lighting conditions. This kind of problem can be found in remote sensing, satellite signal processing, and related areas. The identification can be done by transforming the gray levels so that the gray-level histograms of the two pictures are closely matched. The transformation problem can be solved by using the packing method. The researchers propose a VLSI architecture consisting of m x n processing elements with extensive parallel and pipelining computation capabilities to speed up the transformation, with time complexity O(max(m,n)), where m and n are the numbers of gray levels of the input picture and the reference picture, respectively. Using a uniprocessor and a dynamic programming algorithm, the time complexity would be O(m^3 x n). The algorithm partition problem, an important issue in VLSI design, is discussed. Verification of the proposed architecture is also given.
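
    The underlying gray-level transformation can be illustrated with the classic serial histogram-matching computation, sketched below in numpy; the paper's contribution is a parallel VLSI array for the equivalent packing computation, which this sketch does not model.

      # Gray-level histogram matching via cumulative-distribution mapping.
      import numpy as np

      def match_histogram(img, ref):
          """Map gray levels of `img` so its histogram approximates `ref`'s."""
          levels = 256
          h_img = np.bincount(img.ravel(), minlength=levels)
          h_ref = np.bincount(ref.ravel(), minlength=levels)
          cdf_img = np.cumsum(h_img) / img.size
          cdf_ref = np.cumsum(h_ref) / ref.size
          # For each input level, pick the reference level with the nearest CDF.
          mapping = np.searchsorted(cdf_ref, cdf_img).clip(0, levels - 1)
          return mapping[img].astype(np.uint8)

      rng = np.random.default_rng(1)
      dark = (rng.random((64, 64)) ** 2 * 255).astype(np.uint8)   # "underexposed" scene
      ref  = (rng.random((64, 64)) * 255).astype(np.uint8)        # well-exposed reference
      matched = match_histogram(dark, ref)
      print(dark.mean(), matched.mean())   # mean gray level shifts toward the reference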

  20. Flight test experience and controlled impact of a large, four-engine, remotely piloted airplane

    NASA Technical Reports Server (NTRS)

    Kempel, R. W.; Horton, T. W.

    1985-01-01

    A controlled impact demonstration (CID) program using a large, four-engine, remotely piloted transport airplane was conducted. Closed-loop primary flight control was performed from a ground-based cockpit and digital computer in conjunction with an up/down telemetry link. Uplink commands were received aboard the airplane and transferred through uplink interface systems to a highly modified Bendix PB-20D autopilot. Both proportional and discrete commands were generated by the ground pilot. Prior to the flight tests, extensive simulation was conducted during the development of the ground-based digital control laws. The control laws included primary control, secondary control, and racetrack and final-approach guidance. Extensive ground checks were performed on all remotely piloted systems; however, manned flight tests were the primary method of verification and validation of the control law concepts developed from simulation. The design, development, and flight testing of the control laws and the systems required to accomplish the remotely piloted mission are discussed.

  1. From Verified Models to Verifiable Code

    NASA Technical Reports Server (NTRS)

    Lensink, Leonard; Munoz, Cesar A.; Goodloe, Alwyn E.

    2009-01-01

    Declarative specifications of digital systems often contain parts that can be automatically translated into executable code. Automated code generation may reduce or eliminate the kinds of errors typically introduced through manual code writing. For this approach to be effective, the generated code should be reasonably efficient and, more importantly, verifiable. This paper presents a prototype code generator for the Prototype Verification System (PVS) that translates a subset of PVS functional specifications into an intermediate language and subsequently to multiple target programming languages. Several case studies are presented to illustrate the tool's functionality. The generated code can be analyzed by software verification tools such as verification condition generators, static analyzers, and software model-checkers to increase the confidence that the generated code is correct.
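
    A toy spec-to-code translator conveys the idea, assuming a made-up s-expression specification format; PVS's actual generator, intermediate language, and target languages are far more elaborate than this.

      # A tiny functional "specification" translated to executable Python source.
      SPEC = ("sum_to", "n",                 # sum_to(n) = IF n = 0 THEN 0
              ("if", ("eq", "n", 0),         #             ELSE n + sum_to(n - 1)
               0,
               ("add", "n", ("call", "sum_to", ("sub", "n", 1)))))

      def emit(expr):
          if isinstance(expr, (int, str)):
              return str(expr)
          op = expr[0]
          if op == "if":
              return "(" + emit(expr[2]) + " if " + emit(expr[1]) + " else " + emit(expr[3]) + ")"
          if op == "call":
              return expr[1] + "(" + emit(expr[2]) + ")"
          sym = {"eq": "==", "add": "+", "sub": "-"}[op]
          return "(" + emit(expr[1]) + " " + sym + " " + emit(expr[2]) + ")"

      name, arg, body = SPEC
      source = "def " + name + "(" + arg + "):\n    return " + emit(body) + "\n"
      namespace = {}
      exec(source, namespace)               # the generated code is ordinary Python
      assert namespace[name](10) == 55      # behaves as the specification requires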

  2. Light Water Reactor Sustainability Program Operator Performance Metrics for Control Room Modernization: A Practical Guide for Early Design Evaluation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald Boring; Roger Lew; Thomas Ulrich

    2014-03-01

    As control rooms are modernized with new digital systems at nuclear power plants, it is necessary to evaluate operator performance using these systems as part of a verification and validation process. There are no standard, predefined metrics available for assessing what constitutes satisfactory operator interaction with new systems, especially during the early design stages of a new system. This report identifies the process and metrics for evaluating human system interfaces as part of control room modernization. The report includes background information on design and evaluation, a thorough discussion of human performance measures, and a practical example of how the process and metrics have been used as part of a turbine control system upgrade during the formative stages of design. The process and metrics are geared toward generalizability to other applications and serve as a template for utilities undertaking their own control room modernization activities.

  3. DFM flow by using combination between design based metrology system and model based verification at sub-50nm memory device

    NASA Astrophysics Data System (ADS)

    Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong

    2007-03-01

    As the minimum transistor length gets smaller, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group involved, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the database and wafer SEM images, and they can extract information on whole-chip CD variation. According to the results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches are pursued by EDA companies, such as model-based OPC verification. Model-based verification is performed for the full chip area using a well-calibrated model; its object is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve a robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and a matched model-based verification system to find the optimum combination of the two. In our study, a huge amount of data from wafer results is classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed that uses the combination of design-based metrology and model-based verification tools.

  4. Design and Verification of a Digital Controller for a 2-Piece Hemispherical Resonator Gyroscope

    PubMed Central

    Lee, Jungshin; Yun, Sung Wook; Rhim, Jaewook

    2016-01-01

    A Hemispherical Resonator Gyro (HRG) is a Coriolis Vibratory Gyro (CVG) that measures rotation angle or angular velocity using the Coriolis force acting on the vibrating mass. An HRG can be used as a rate gyro or an integrating gyro without structural modification by simply changing the control scheme. In this paper, differential control algorithms are designed for a 2-piece HRG. To design a precision controller, the electromechanical modeling and signal processing must first be performed accurately. Therefore, the equations of motion for the HRG resonator with switched harmonic excitations are derived with the Duhamel integral method. Electromechanical modeling of the resonator, electric module, and charge amplifier is performed by considering the mode shape of a thin hemispherical shell, and the signal processing and control algorithms are then designed. The multiplexing scheme of sensing and driving cycles and x-, y-axis switching cycles is appropriate for high-precision, low-maneuverability systems. On the basis of these studies, the differential control scheme readily rejects the common-mode errors of the x-, y-axis signals and switches to the rate-integrating mode. In rate gyro mode the controller is composed of a Phase-Locked Loop (PLL) and amplitude, quadrature, and rate control loops, all designed on the basis of a digital PI controller. The signal processing and control algorithms are verified through Matlab/Simulink simulations. Finally, an FPGA and DSP board implementing these algorithms is verified through experiments. PMID:27104539
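
    A minimal discrete PI controller of the kind each of these loops would build on; the gains, sample time, output limits, and the crude first-order plant are illustrative assumptions, not the paper's design values.

      # Discrete PI controller with a simple anti-windup clamp.
      class DigitalPI:
          def __init__(self, kp, ki, ts, u_min=-1.0, u_max=1.0):
              self.kp, self.ki, self.ts = kp, ki, ts
              self.u_min, self.u_max = u_min, u_max
              self.integral = 0.0

          def update(self, setpoint, measurement):
              error = setpoint - measurement
              self.integral += self.ki * self.ts * error
              # Anti-windup: keep the integral within the actuator limits.
              self.integral = min(max(self.integral, self.u_min), self.u_max)
              u = self.kp * error + self.integral
              return min(max(u, self.u_min), self.u_max)

      # Toy amplitude loop: drive a first-order plant to an amplitude of 1.0.
      pi, amp = DigitalPI(kp=0.8, ki=5.0, ts=1e-3), 0.0
      for _ in range(5000):
          amp += 1e-3 * (20.0 * pi.update(1.0, amp) - amp)   # crude plant model
      print(round(amp, 3))   # settles near the 1.0 setpoint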

  5. Analysis of accuracy of digital elevation models created from captured data by digital photogrammetry method

    NASA Astrophysics Data System (ADS)

    Hudec, P.

    2011-12-01

    A digital elevation model (DEM) is an important part of many geoinformatic applications. For the creation of a DEM, spatial data collected by geodetic measurements in the field, photogrammetric processing of aerial survey photographs, laser scanning, and secondary sources (analogue maps) are used. From a user's point of view it is very important to know the vertical accuracy of a DEM. The article describes the verification, based on geodetic measurements in the field, of the vertical accuracy of a DEM for the region of Medzibodrožie that was created using digital photogrammetry for the purposes of water resources management and the modeling and resolution of flood cases.

  6. 37 CFR 262.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a Performer may conduct a single audit of the Designated Agent upon reasonable notice and... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR CERTAIN ELIGIBLE...

  7. Programmable Pulser

    NASA Technical Reports Server (NTRS)

    Baumann, Eric; Merolla, Anthony

    1988-01-01

    User controls the number of clock pulses to prevent burnout. New digital programmable pulser circuit operates in three formats: free-running, counted, and single pulse. Operates at frequencies up to 5 MHz, with no special consideration given to layout of components or to terminations. Pulser is based on a sequential circuit with four states and a binary counter with appropriate decoding logic. Number of programmable pulses can be increased beyond 127 by addition of another counter and decoding logic. For very large pulse counts and/or very high frequencies, synchronous counters avoid errors caused by propagation delays. Invaluable tool for initial verification or diagnosis of digital or digitally controlled circuitry.

  8. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 1: A case study in theorem prover-based verification

    NASA Technical Reports Server (NTRS)

    Srivas, Mandayam; Bickford, Mark

    1991-01-01

    The design and formal verification of a hardware system for a task that is an important component of a fault-tolerant computer architecture for flight control systems is presented. The hardware system implements an algorithm for obtaining interactive consistency (Byzantine agreement) among four microprocessors as a special instruction on the processors. The property verified ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. An assumption is made that the processors execute synchronously. For verification, the authors used a computer-aided hardware design verification tool, Spectool, and the theorem prover Clio. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.
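
    The interactive-consistency instruction implements the kind of exchange shown in this textbook OM(1) sketch for four processors. The simulated fault model (a sender that inverts values) is an assumption for illustration only; the verified hardware's algorithm and fault model follow the paper.

      # One round of oral-messages interactive consistency, tolerating one fault.
      from collections import Counter

      def om1(commander_value, faulty=None):
          """Three lieutenants agree on the commander's value despite one fault."""
          lie = lambda who, v: (not v) if who == faulty else v
          # Round 1: the commander (node 0) sends its value to lieutenants 1..3.
          received = {i: lie(0, commander_value) for i in (1, 2, 3)}
          # Round 2: every lieutenant relays what it received to the other two.
          relayed = {(i, j): lie(i, received[i])
                     for i in (1, 2, 3) for j in (1, 2, 3) if i != j}
          # Each lieutenant majority-votes over its own copy plus the two relays.
          decision = {}
          for j in (1, 2, 3):
              votes = [received[j]] + [relayed[(i, j)] for i in (1, 2, 3) if i != j]
              decision[j] = Counter(votes).most_common(1)[0][0]
          return decision

      print(om1(True))             # no faults: everyone decides True
      print(om1(True, faulty=2))   # faulty lieutenant: the loyal ones still agree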

  9. 37 CFR 201.30 - Verification of Statements of Account.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... manufacturer or importer of digital devices or media who is required by 17 U.S.C. 1003 to file with the... applicable generally to attest engagements (collectively, the “AICPA Code”); and (ii) He or she is...

  10. Simulation environment based on the Universal Verification Methodology

    NASA Astrophysics Data System (ADS)

    Fiergolski, A.

    2017-01-01

    Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.

  11. The Electronic View Box: a software tool for radiation therapy treatment verification.

    PubMed

    Bosch, W R; Low, D A; Gerber, R L; Michalski, J M; Graham, M V; Perez, C A; Harms, W B; Purdy, J A

    1995-01-01

    We have developed a software tool for interactively verifying treatment plan implementation. The Electronic View Box (EVB) tool copies the paradigm of current practice but does so electronically. A portal image (online portal image or digitized port film) is displayed side by side with a prescription image (digitized simulator film or digitally reconstructed radiograph). The user can measure distances between features in prescription and portal images and "write" on the display, either to approve the image or to indicate required corrective actions. The EVB tool also provides several features not available in conventional verification practice using a light box. The EVB tool has been written in ANSI C using the X window system. The tool makes use of the Virtual Machine Platform and Foundation Library specifications of the NCI-sponsored Radiation Therapy Planning Tools Collaborative Working Group for portability into an arbitrary treatment planning system that conforms to these specifications. The present EVB tool is based on an earlier Verification Image Review tool, but with a substantial redesign of the user interface. A graphical user interface prototyping system was used in iteratively refining the tool layout to allow rapid modifications of the interface in response to user comments. Features of the EVB tool include 1) hierarchical selection of digital portal images based on physician name, patient name, and field identifier; 2) side-by-side presentation of prescription and portal images at equal magnification and orientation, and with independent grayscale controls; 3) "trace" facility for outlining anatomical structures; 4) "ruler" facility for measuring distances; 5) zoomed display of corresponding regions in both images; 6) image contrast enhancement; and 7) communication of portal image evaluation results (approval, block modification, repeat image acquisition, etc.). The EVB tool facilitates the rapid comparison of prescription and portal images and permits electronic communication of corrections in port shape and positioning.

  12. On verifying a high-level design. [cost and error analysis

    NASA Technical Reports Server (NTRS)

    Mathew, Ben; Wehbeh, Jalal A.; Saab, Daniel G.

    1993-01-01

    An overview of design verification techniques is presented, and some of the current research in high-level design verification is described. Formal hardware description languages capable of adequately expressing design specifications have been developed, but some time will be required before they have the expressive power needed for use in real applications. Simulation-based approaches are more useful in finding errors in designs than in proving the correctness of a design. Hybrid approaches that combine simulation with other formal design verification techniques are argued to be the most promising over the short term.

  13. 30 CFR 250.913 - When must I resubmit Platform Verification Program plans?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... CONTINENTAL SHELF Platforms and Structures Platform Verification Program § 250.913 When must I resubmit Platform Verification Program plans? (a) You must resubmit any design verification, fabrication... 30 Mineral Resources 2 2011-07-01 2011-07-01 false When must I resubmit Platform Verification...

  14. Nodal network generator for CAVE3

    NASA Technical Reports Server (NTRS)

    Palmieri, J. V.; Rathjen, K. A.

    1982-01-01

    A new extension of the CAVE3 code was developed that automates the creation of a finite-difference math model in digital form ready for input to CAVE3. The new software, the Nodal Network Generator, is broken into two segments: one segment generates the model geometry using a Tektronix tablet digitizer, and the other generates the actual finite-difference model and allows for graphic verification using a Tektronix 4014 graphic scope. Use of the Nodal Network Generator is described.

  15. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    PubMed Central

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues have largely been addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient for offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
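
    A generic zero-watermarking flavor can be sketched with a keyed hash: the cover text is left untouched, and a fingerprint derived from its content is registered for later verification. The normalization step, key handling, and HMAC choice below are assumptions for illustration, not the paper's algorithm.

      # Keyed content fingerprint: generated without modifying the cover text,
      # verified later by recomputation.
      import hmac, hashlib, unicodedata

      def zero_watermark(text, key):
          # Normalize so presentation-only changes do not alter the mark.
          canonical = unicodedata.normalize("NFC", " ".join(text.split()))
          return hmac.new(key, canonical.encode("utf-8"), hashlib.sha256).hexdigest()

      key = b"registration-authority-secret"     # held by a trusted third party
      original = "Sensitive contract text, clause 4.2: payment within 30 days."
      mark = zero_watermark(original, key)       # deposited; text left untouched

      tampered = original.replace("30 days", "90 days")
      print(zero_watermark(original, key) == mark)   # True  -> authentic
      print(zero_watermark(tampered, key) == mark)   # False -> tampering detected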

  16. Debris control design achievements of the booster separation motors

    NASA Technical Reports Server (NTRS)

    Smith, G. W.; Chase, C. A.

    1985-01-01

    The stringent debris control requirements imposed on the design of the Space Shuttle booster separation motor are described along with the verification program implemented to ensure compliance with debris control objectives. The principal areas emphasized in the design and development of the Booster Separation Motor (BSM) relative to debris control were the propellant formulation and nozzle closures which protect the motors from aerodynamic heating and moisture. A description of the motor design requirements, the propellant formulation and verification program, and the nozzle closures design and verification are presented.

  17. 30 CFR 285.705 - When must I use a Certified Verification Agent (CVA)?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... INTERIOR OFFSHORE RENEWABLE ENERGY ALTERNATE USES OF EXISTING FACILITIES ON THE OUTER CONTINENTAL SHELF Facility Design, Fabrication, and Installation Certified Verification Agent § 285.705 When must I use a Certified Verification Agent (CVA)? You must use a CVA to review and certify the Facility Design Report, the...

  18. Design and Experimental Verification of a 0.19 V 53 μW 65 nm CMOS Integrated Supply-Sensing Sensor With a Supply-Insensitive Temperature Sensor and an Inductive-Coupling Transmitter for a Self-Powered Bio-sensing System Using a Biofuel Cell.

    PubMed

    Kobayashi, Atsuki; Ikeda, Kei; Ogawa, Yudai; Kai, Hiroyuki; Nishizawa, Matsuhiko; Nakazato, Kazuo; Niitsu, Kiichi

    2017-12-01

    In this paper, we present a self-powered bio-sensing system with the capability of proximity inductive-coupling communication for supply sensing and temperature monitoring. The proposed bio-sensing system includes a biofuel cell as a power source and a sensing frontend that is associated with the CMOS integrated supply-sensing sensor. The sensor consists of a digital-based gate leakage timer, a supply-insensitive time-domain temperature sensor, and a current-driven inductive-coupling transmitter and achieves low-voltage operation. The timer converts the output voltage from a biofuel cell to frequency. The temperature sensor provides a pulse width modulation (PWM) output that is not dependent on the supply voltage, and the associated inductive-coupling transmitter enables proximity communication. A test chip was fabricated in 65 nm CMOS technology and consumed 53 μW with a supply voltage of 190 mV. The low-voltage-friendly design satisfied the performance targets of each integrated sensor without any trimming. The chips allowed us to successfully demonstrate proximity communication with an asynchronous receiver, and the measurement results show the potential for self-powered operation using biofuel cells. The analysis and experimental verification of the system confirmed their robustness.

  19. 30 CFR 250.909 - What is the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 30 Mineral Resources 2 2010-07-01 2010-07-01 false What is the Platform Verification Program? 250... Verification Program § 250.909 What is the Platform Verification Program? The Platform Verification Program is the MMS approval process for ensuring that floating platforms; platforms of a new or unique design...

  20. Block 2 SRM conceptual design studies. Volume 1, Book 2: Preliminary development and verification plan

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Activities that will be conducted in support of the development and verification of the Block 2 Solid Rocket Motor (SRM) are described. Development includes design, fabrication, processing, and testing activities in which the results are fed back into the project. Verification includes analytical and test activities which demonstrate SRM component/subassembly/assembly capability to perform its intended function. The management organization responsible for formulating and implementing the verification program is introduced. It also identifies the controls which will monitor and track the verification program. Integral with the design and certification of the SRM are other pieces of equipment used in transportation, handling, and testing which influence the reliability and maintainability of the SRM configuration. The certification of this equipment is also discussed.

  1. Software safety - A user's practical perspective

    NASA Technical Reports Server (NTRS)

    Dunn, William R.; Corliss, Lloyd D.

    1990-01-01

    Software safety assurance philosophy and practices at NASA Ames are discussed. It is shown that, to be safe, software must be error-free. Software developments on two digital flight control systems and two ground facility systems are examined, including the overall system and software organization and function, the software-safety issues, and their resolution. The effectiveness of safety assurance methods is discussed, including conventional life-cycle practices, verification and validation testing, software safety analysis, and formal design methods. It is concluded (1) that a practical software safety technology does not yet exist, (2) that it is unlikely that a set of general-purpose analytical techniques can be developed for proving that software is safe, and (3) that successful software safety-assurance practices will have to take into account the detailed design processes employed and show that the software will execute correctly under all possible conditions.

  2. Note: Ultrasonic gas flowmeter based on optimized time-of-flight algorithms

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wang, X. F.; Tang, Z. A.

    2011-04-15

    A new digital-signal-processor-based single-path ultrasonic gas flowmeter is designed, constructed, and experimentally tested. To achieve high-accuracy measurements, an optimized ultrasound drive method that combines amplitude modulation and phase modulation of the transmit-receive technique is used to excite the transmitter. Based on regularities among the received envelope zero-crossings, different signal-to-noise-ratio situations of the received signal are discriminated, and appropriate time-of-flight algorithms are applied to compute the flow rate. Experimental results from the dry calibration indicate that the designed flowmeter prototype can meet the zero-flow verification test requirements of the American Gas Association Report No. 9. Furthermore, results from the flow calibration show that the proposed flowmeter prototype measures flow rate accurately in practical experiments, and the nominal accuracies after FWME adjustment are below 0.8% throughout the calibration range.
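
    The standard single-path transit-time relations behind such a flowmeter are easy to state in code; the path length, path angle, and pipe area below are invented values used only for a synthetic round-trip check.

      # Transit-time flow computation: v = L * (t_up - t_down) / (2 cos(theta) t_up t_down).
      import math

      def flow_rate(t_up, t_down, L=0.15, theta_deg=45.0, area=0.005):
          """Axial velocity and volume flow from up/downstream transit times."""
          cos_t = math.cos(math.radians(theta_deg))
          v = L * (t_up - t_down) / (2.0 * cos_t * t_up * t_down)
          return v, v * area

      # Synthetic check: sound speed c = 343 m/s, true velocity 2 m/s, 45-degree path.
      c, v_true = 343.0, 2.0
      cos_t = math.cos(math.radians(45.0))
      t_down = 0.15 / (c + v_true * cos_t)     # pulse travelling with the flow
      t_up   = 0.15 / (c - v_true * cos_t)     # pulse travelling against the flow
      v_est, q = flow_rate(t_up, t_down)
      print(round(v_est, 3), round(q, 5))      # recovers ~2.0 m/s and 0.01 m^3/s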

  3. Survey of Verification and Validation Techniques for Small Satellite Software Development

    NASA Technical Reports Server (NTRS)

    Jacklin, Stephen A.

    2015-01-01

    The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
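
    Run-time monitoring of the kind surveyed can be reduced to a tiny wrapper that checks a property on every output and substitutes a safe fallback on violation; the property and the controller here are illustrative inventions.

      # Minimal run-time monitor: enforce an output property with a safe fallback.
      def monitored(func, check, fallback):
          def wrapper(*args):
              result = func(*args)
              if not check(result):
                  # Flight software would also log the event and raise a fault flag.
                  return fallback
              return result
          return wrapper

      # Property: a commanded wheel torque must stay within +/- 0.1 N*m.
      raw_controller = lambda err: 5.0 * err          # may saturate on large errors
      safe_controller = monitored(raw_controller,
                                  check=lambda u: abs(u) <= 0.1,
                                  fallback=0.0)
      print(safe_controller(0.01))   # 0.05 -> passes the monitor
      print(safe_controller(1.0))    # 0.0  -> violation, safe fallback used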

  4. NASA/BLM Applications Pilot Test (APT), phase 2. Volume 1: Executive summary. [vegetation mapping and production estimation in northwestern Arizona

    NASA Technical Reports Server (NTRS)

    1981-01-01

    Data from LANDSAT, low-altitude color aerial photography, and ground visits were combined and used to produce vegetation cover maps and to estimate productivity of range, woodland, and forest resources in northwestern Arizona. A planning session, two workshops, and four status reviews were held to assist technology transfer from NASA. Computer-aided digital classification of LANDSAT data was selected as a major source of input data. An overview is presented of the data processing, data collection, productivity estimation, and map verification techniques used. Cost analysis and digital LANDSAT products are also considered.

  5. Open data and digital morphology

    PubMed Central

    Davies, Thomas G.; Cunningham, John A.; Asher, Robert J.; Bates, Karl T.; Bengtson, Stefan; Benson, Roger B. J.; Boyer, Doug M.; Braga, José; Dong, Xi-Ping; Evans, Alistair R.; Friedman, Matt; Garwood, Russell J.; Goswami, Anjali; Hutchinson, John R.; Jeffery, Nathan S.; Lebrun, Renaud; Martínez-Pérez, Carlos; O'Higgins, Paul M.; Orliac, Maëva; Rowe, Timothy B.; Sánchez-Villagra, Marcelo R.; Shubin, Neil H.; Starck, J. Matthias; Stringer, Chris; Summers, Adam P.; Sutton, Mark D.; Walsh, Stig A.; Weisbecker, Vera; Witmer, Lawrence M.; Wroe, Stephen; Yin, Zongjun

    2017-01-01

    Over the past two decades, the development of methods for visualizing and analysing specimens digitally, in three and even four dimensions, has transformed the study of living and fossil organisms. However, the initial promise that the widespread application of such methods would facilitate access to the underlying digital data has not been fully achieved. The underlying datasets for many published studies are not readily or freely available, introducing a barrier to verification and reproducibility, and the reuse of data. There is no current agreement or policy on the amount and type of data that should be made available alongside studies that use, and in some cases are wholly reliant on, digital morphology. Here, we propose a set of recommendations for minimum standards and additional best practice for three-dimensional digital data publication, and review the issues around data storage, management and accessibility. PMID:28404779

  6. Open data and digital morphology.

    PubMed

    Davies, Thomas G; Rahman, Imran A; Lautenschlager, Stephan; Cunningham, John A; Asher, Robert J; Barrett, Paul M; Bates, Karl T; Bengtson, Stefan; Benson, Roger B J; Boyer, Doug M; Braga, José; Bright, Jen A; Claessens, Leon P A M; Cox, Philip G; Dong, Xi-Ping; Evans, Alistair R; Falkingham, Peter L; Friedman, Matt; Garwood, Russell J; Goswami, Anjali; Hutchinson, John R; Jeffery, Nathan S; Johanson, Zerina; Lebrun, Renaud; Martínez-Pérez, Carlos; Marugán-Lobón, Jesús; O'Higgins, Paul M; Metscher, Brian; Orliac, Maëva; Rowe, Timothy B; Rücklin, Martin; Sánchez-Villagra, Marcelo R; Shubin, Neil H; Smith, Selena Y; Starck, J Matthias; Stringer, Chris; Summers, Adam P; Sutton, Mark D; Walsh, Stig A; Weisbecker, Vera; Witmer, Lawrence M; Wroe, Stephen; Yin, Zongjun; Rayfield, Emily J; Donoghue, Philip C J

    2017-04-12

    Over the past two decades, the development of methods for visualizing and analysing specimens digitally, in three and even four dimensions, has transformed the study of living and fossil organisms. However, the initial promise that the widespread application of such methods would facilitate access to the underlying digital data has not been fully achieved. The underlying datasets for many published studies are not readily or freely available, introducing a barrier to verification and reproducibility, and the reuse of data. There is no current agreement or policy on the amount and type of data that should be made available alongside studies that use, and in some cases are wholly reliant on, digital morphology. Here, we propose a set of recommendations for minimum standards and additional best practice for three-dimensional digital data publication, and review the issues around data storage, management and accessibility. © 2017 The Authors.

  7. Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers

    NASA Technical Reports Server (NTRS)

    Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.

    1983-01-01

    A number of methodologies for verifying systems, and computer-based tools that assist users in verifying their systems, were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered include: the STP theorem prover; design verification of SIFT; high-level language code verification; assembly-language-level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.

  8. Towards the formal verification of the requirements and design of a processor interface unit: HOL listings

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

    This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section two of this report contains general-purpose HOL theories and definitions that support the PIU verification, including arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section three contains the HOL listings for the completed PIU design verification, and section four contains the HOL listings for the partial requirements verification of the P-Port.

  9. Rapid Prototyping of a Smart Device-based Wireless Reflectance Photoplethysmograph

    PubMed Central

    Ghamari, M.; Aguilar, C.; Soltanpur, C.; Nazeran, H.

    2017-01-01

    This paper presents the design, fabrication, and testing of a wireless heart rate (HR) monitoring device based on photoplethysmography (PPG) and smart devices. PPG sensors use infrared (IR) light to obtain vital information to assess cardiac health and other physiologic conditions. The PPG data that are transferred to a computer undergo further processing to derive the Heart Rate Variability (HRV) signal, which is analyzed to generate quantitative markers of the Autonomic Nervous System (ANS). The HRV signal has numerous monitoring and diagnostic applications. To this end, wireless connectivity plays an important role in such biomedical instruments. The photoplethysmograph consists of an optical sensor to detect the changes in the light intensity reflected from the illuminated tissue, a signal conditioning unit to prepare the reflected light for further signal conditioning through amplification and filtering, a low-power microcontroller to control and digitize the analog PPG signal, and a Bluetooth module to transmit the digital data to a Bluetooth-based smart device such as a tablet. An Android app is then used to enable the smart device to acquire and digitally display the received analog PPG signal in real-time on the smart device. This article is concluded with the prototyping of the wireless PPG followed by the verification procedures of the PPG and HRV signals acquired in a laboratory environment. PMID:28959119
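
    The HR/HRV derivation described above can be sketched with SciPy peak detection on a synthetic PPG trace; the sampling rate, beat frequency, noise level, and peak-spacing threshold are all illustrative assumptions.

      # Heart rate and a simple HRV index (SDNN) from inter-beat intervals.
      import numpy as np
      from scipy.signal import find_peaks

      fs = 100.0                                    # sample rate in Hz (illustrative)
      t = np.arange(0.0, 30.0, 1.0 / fs)
      rng = np.random.default_rng(2)
      ppg = np.sin(2 * np.pi * 1.2 * t) + 0.05 * rng.standard_normal(t.size)

      peaks, _ = find_peaks(ppg, distance=int(fs * 0.4))   # at most one beat per 0.4 s
      ibi = np.diff(peaks) / fs                            # inter-beat intervals in s
      print("HR (bpm):", round(60.0 / ibi.mean(), 1))      # ~72 for a 1.2 Hz signal
      print("SDNN (ms):", round(float(ibi.std()) * 1000.0, 1))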

  10. Rapid Prototyping of a Smart Device-based Wireless Reflectance Photoplethysmograph.

    PubMed

    Ghamari, M; Aguilar, C; Soltanpur, C; Nazeran, H

    2016-03-01

    This paper presents the design, fabrication, and testing of a wireless heart rate (HR) monitoring device based on photoplethysmography (PPG) and smart devices. PPG sensors use infrared (IR) light to obtain vital information to assess cardiac health and other physiologic conditions. The PPG data that are transferred to a computer undergo further processing to derive the Heart Rate Variability (HRV) signal, which is analyzed to generate quantitative markers of the Autonomic Nervous System (ANS). The HRV signal has numerous monitoring and diagnostic applications. To this end, wireless connectivity plays an important role in such biomedical instruments. The photoplethysmograph consists of an optical sensor to detect the changes in the light intensity reflected from the illuminated tissue, a signal conditioning unit to prepare the reflected light for further signal conditioning through amplification and filtering, a low-power microcontroller to control and digitize the analog PPG signal, and a Bluetooth module to transmit the digital data to a Bluetooth-based smart device such as a tablet. An Android app is then used to enable the smart device to acquire and digitally display the received analog PPG signal in real-time on the smart device. This article is concluded with the prototyping of the wireless PPG followed by the verification procedures of the PPG and HRV signals acquired in a laboratory environment.

  11. Satellite detection of oil on the marine surface

    NASA Technical Reports Server (NTRS)

    Wilson, M. J.; Oneill, P. E.; Estes, J. E.

    1981-01-01

    The ability of two widely dissimilar spaceborne imaging sensors to detect surface oil accumulations in the marine environment has been evaluated using broadly different techniques. Digital Landsat multispectral scanner (MSS) data consisting of two visible and two near infrared channels has been processed to enhance contrast between areas of known oil coverage and background clean surface water. These enhanced images have then been compared to surface verification data gathered by aerial reconnaissance during the October 15, 1975, Landsat overpass. A similar evaluation of oil slick imaging potential has been made for digitally enhanced Seasat-A synthetic aperture radar (SAR) data from July 18, 1979. Due to the premature failure of this satellite, however, no concurrent surface verification data were collected. As a substitute, oil slick configuration information has been generated for the comparison using meteorological and oceanographic data. The test site utilized in both studies was the extensive area of natural seepage located off Coal Oil Point, adjacent to the University of California, Santa Barbara.

  12. Analysis and simulation tools for solar array power systems

    NASA Astrophysics Data System (ADS)

    Pongratananukul, Nattorn

    This dissertation presents simulation tools developed specifically for the design of solar array power systems. Contributions are made in several aspects of the system design phases, including solar source modeling, system simulation, and controller verification. A tool to automate the study of solar array configurations using general purpose circuit simulators has been developed based on the modeling of individual solar cells. Hierarchical structure of solar cell elements, including semiconductor properties, allows simulation of electrical properties as well as the evaluation of the impact of environmental conditions. A second developed tool provides a co-simulation platform with the capability to verify the performance of an actual digital controller implemented in programmable hardware such as a DSP processor, while the entire solar array including the DC-DC power converter is modeled in software algorithms running on a computer. This "virtual plant" allows developing and debugging code for the digital controller, and also to improve the control algorithm. One important task in solar arrays is to track the maximum power point on the array in order to maximize the power that can be delivered. Digital controllers implemented with programmable processors are particularly attractive for this task because sophisticated tracking algorithms can be implemented and revised when needed to optimize their performance. The proposed co-simulation tools are thus very valuable in developing and optimizing the control algorithm, before the system is built. Examples that demonstrate the effectiveness of the proposed methodologies are presented. The proposed simulation tools are also valuable in the design of multi-channel arrays. In the specific system that we have designed and tested, the control algorithm is implemented on a single digital signal processor. In each of the channels the maximum power point is tracked individually. In the prototype we built, off-the-shelf commercial DC-DC converters were utilized. At the end, the overall performance of the entire system was evaluated using solar array simulators capable of simulating various I-V characteristics, and also by using an electronic load. Experimental results are presented.
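
    A perturb-and-observe maximum power point tracker, one common choice for the tracking task described above, fits in a few lines; the I-V characteristic below is a toy model, not a calibrated solar-cell model, and the step size and iteration count are illustrative.

      # Perturb-and-observe MPPT: nudge the operating voltage, keep the direction
      # while power rises, reverse it when power falls.
      def pv_power(v):
          i = max(0.0, 5.0 * (1.0 - (v / 20.0) ** 8))   # crude I-V characteristic
          return v * i

      def perturb_and_observe(v=10.0, step=0.2, iterations=200):
          p_prev = pv_power(v)
          direction = 1.0
          for _ in range(iterations):
              v += direction * step
              p = pv_power(v)
              if p < p_prev:            # power dropped: reverse the perturbation
                  direction = -direction
              p_prev = p
          return v, p_prev

      v_mpp, p_mpp = perturb_and_observe()
      print(round(v_mpp, 2), round(p_mpp, 1))   # hovers near the knee of the curve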

  13. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  14. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2010-04-01 2010-04-01 false Import certificate/delivery verification...

  15. 22 CFR 123.14 - Import certificate/delivery verification procedure.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... REGULATIONS LICENSES FOR THE EXPORT OF DEFENSE ARTICLES § 123.14 Import certificate/delivery verification procedure. (a) The Import Certificate/Delivery Verification Procedure is designed to assure that a commodity... 22 Foreign Relations 1 2011-04-01 2011-04-01 false Import certificate/delivery verification...

  16. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  17. 36 CFR § 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  18. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  19. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  20. 36 CFR 1237.28 - What special concerns apply to digital photographs?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... defects, evaluate the accuracy of finding aids, and verify file header information and file name integrity... sampling methods or more comprehensive verification systems (e.g., checksum programs), to evaluate image.... For permanent or unscheduled images descriptive elements must include: (1) An identification number...

  1. Inter-laboratory verification of European pharmacopoeia monograph on derivative spectrophotometry method and its application for chitosan hydrochloride.

    PubMed

    Marković, Bojan; Ignjatović, Janko; Vujadinović, Mirjana; Savić, Vedrana; Vladimirov, Sote; Karljiković-Rajić, Katarina

    2015-01-01

    Inter-laboratory verification of the European Pharmacopoeia (EP) monograph on the derivative spectrophotometry (DS) method and its application to chitosan hydrochloride was carried out on two generations of instruments (the earlier GBC Cintra 20 and the current-technology TS Evolution 300). The instruments operate with different versions of the Savitzky-Golay algorithm and different modes of generating digital derivative spectra. For the resolution power parameter, defined as the amplitude ratio A/B in the DS method EP monograph, comparable results were obtained only with the algorithm parameters of 7 smoothing points (SP) and a 2nd-degree polynomial; these gave corresponding data with the other two modes on the TS Evolution 300, Medium digital indirect and Medium digital direct. Using the quoted algorithm parameters, the differences in percentages between the amplitude ratio A/B averages were within the accepted criteria (±3%) for assay of a drug product for method transfer. The deviation of 1.76% for the assessment of the degree of deacetylation of chitosan hydrochloride, determined on the two instruments (amplitude ¹D202; 2nd-degree polynomial and SP 9 in the Savitzky-Golay algorithm), was acceptable, since it was within the allowed criteria (±2%) for assay deviation of a drug substance for method transfer in pharmaceutical analyses. Copyright © 2015 Elsevier B.V. All rights reserved.
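
    The monograph's derivative settings map directly onto SciPy's Savitzky-Golay filter, as this sketch with a synthetic absorption band shows: 7 smoothing points with a 2nd-degree polynomial correspond to window_length=7, polyorder=2. The band shape and wavelength grid are invented for illustration.

      # First-derivative spectrum via a Savitzky-Golay filter.
      import numpy as np
      from scipy.signal import savgol_filter

      wavelength = np.arange(190.0, 230.0, 0.5)                  # nm
      absorbance = np.exp(-((wavelength - 205.0) / 6.0) ** 2)    # toy absorption band
      step = wavelength[1] - wavelength[0]

      first_derivative = savgol_filter(absorbance, window_length=7,
                                       polyorder=2, deriv=1, delta=step)
      # Read the derivative amplitude at 202 nm, as in the 1D202 assessment.
      print(first_derivative[np.argmin(np.abs(wavelength - 202.0))])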

  2. The 1991 3rd NASA Symposium on VLSI Design

    NASA Technical Reports Server (NTRS)

    Maki, Gary K.

    1991-01-01

    Papers from the symposium are presented from the following sessions: (1) featured presentations 1; (2) very large scale integration (VLSI) circuit design; (3) VLSI architecture 1; (4) featured presentations 2; (5) neural networks; (6) VLSI architectures 2; (7) featured presentations 3; (8) verification 1; (9) analog design; (10) verification 2; (11) design innovations 1; (12) asynchronous design; and (13) design innovations 2.

  3. Seismic design verification of LMFBR structures

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Not Available

    1977-07-01

    The report provides an assessment of the seismic design verification procedures currently used for nuclear power plant structures, a comparison of available dynamic test methods, and conclusions and recommendations for future LMFBR structures.

  4. MARATHON Verification (MARV)

    DTIC Science & Technology

    2017-08-01

    comparable with MARATHON 1 in terms of output. Rather, the MARATHON 2 verification cases were designed to ensure correct implementation of the new algorithms... for employment against demands. This study is a comparative verification of the functionality of MARATHON 4 (our newest implementation of MARATHON

  5. Study of space shuttle orbiter system management computer function. Volume 2: Automated performance verification concepts

    NASA Technical Reports Server (NTRS)

    1975-01-01

    Findings are presented from investigations of concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.

  6. Travel Vaccines Enter the Digital Age: Creating a Virtual Immunization Record

    PubMed Central

    Wilson, Kumanan; Atkinson, Katherine M.; Bell, Cameron P.

    2016-01-01

    At present, proof of immunization against diseases such as yellow fever is required at some international borders in concordance with the International Health Regulations. The current standard, the International Certificate of Vaccination or Prophylaxis (ICVP), has limitations as a paper record including the possibility of being illegible, misplaced, or damaged. We believe that a complementary, digital record would offer advantages to public health and travelers alike. These include enhanced availability and reliability, potential to include lot specific information, and integration with immunization information systems. Challenges exist in implementation, particularly pertaining to verification at border crossings. We describe a potential course for the development and implementation of a digital ICVP record. PMID:26711516

  7. Environmental Technology Verification: Pesticide Spray Drift Reduction Technologies for Row and Field Crops

    EPA Pesticide Factsheets

    The Environmental Technology Verification Program, established by the EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance.

  8. A comparison of QuantStudio™ 3D Digital PCR and ARMS-PCR for measuring plasma EGFR T790M mutations of NSCLC patients.

    PubMed

    Feng, Qin; Gai, Fei; Sang, Yaxiong; Zhang, Jie; Wang, Ping; Wang, Yue; Liu, Bing; Lin, Dongmei; Yu, Yang; Fang, Jian

    2018-01-01

    The AURA3 clinical trial has shown that advanced non-small cell lung cancer (NSCLC) patients with EGFR T790M mutations in circulating tumor DNA (ctDNA) could benefit from osimertinib. The aim of this study was to assess the usefulness of the QuantStudio™ 3D Digital PCR System platform for the detection of plasma EGFR T790M mutations in NSCLC patients, and to compare the performances of 3D Digital PCR and ARMS-PCR. A total of 119 Chinese patients were enrolled in this study. The mutant allele frequency of plasma EGFR T790M was detected by 3D Digital PCR; 25 selected samples were then verified by ARMS-PCR and four of them were verified by next generation sequencing (NGS). In total, 52.94% (69/119) had EGFR T790M mutations detected by 3D Digital PCR. In the 69 positive samples, the median mutant allele frequency (AF) was 1.09% and three cases presented low concentration (AF <0.1%). Limited by the amount of plasma DNA, 17 samples (AF <2.5%) and eight samples (T790M-) were selected for verification by ARMS-PCR. Four of those samples were verified by NGS as a third verification method. Among the selected 17 positive cases, ten samples presented mutant allele frequency <0.5%, and seven samples presented intermediate mutant allele frequency (0.5% ≤ AF ≤ 2.5%). However, only three samples (3/17) were identified as positive by ARMS-PCR, namely, P6 (AF = 1.09%), P7 (AF = 2.09%), and P8 (AF = 2.21%). It is worth mentioning that sample P9 (AF = 2.05%, analyzed by 3D Digital PCR) was identified as T790M- by ARMS-PCR. Four samples were identified as T790M+ by both NGS and 3D Digital PCR, and notably three of these samples (3/4) presented a low ratio (AF <0.5%). Our study demonstrated that 3D Digital PCR is a novel method with high sensitivity and specificity to detect EGFR T790M mutation in plasma.
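
    Digital PCR quantification rests on standard Poisson statistics, sketched below with invented partition counts; this is the generic math relating positive partitions to allele frequency, not the QuantStudio software's exact pipeline.

      # Poisson-corrected digital-PCR quantification and mutant allele frequency.
      import math

      def copies_per_partition(positives, total):
          """Mean template copies per partition from the positive fraction."""
          return -math.log(1.0 - positives / total)

      def allele_frequency(mut_pos, wt_pos, total):
          lam_mut = copies_per_partition(mut_pos, total)
          lam_wt = copies_per_partition(wt_pos, total)
          return lam_mut / (lam_mut + lam_wt)

      # Illustrative chip: 20,000 partitions, 150 mutant-positive, 13,500 WT-positive.
      af = allele_frequency(150, 13500, 20000)
      print(round(af * 100, 2), "%")   # ~0.67% mutant allele frequency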

  9. A comparison of QuantStudio™ 3D Digital PCR and ARMS-PCR for measuring plasma EGFR T790M mutations of NSCLC patients

    PubMed Central

    Sang, Yaxiong; Zhang, Jie; Wang, Ping; Wang, Yue; Liu, Bing; Lin, Dongmei; Yu, Yang; Fang, Jian

    2018-01-01

    Background The AURA3 clinical trial has shown that advanced non-small cell lung cancer (NSCLC) patients with EGFR T790M mutations in circulating tumor DNA (ctDNA) could benefit from osimertinib. Purpose The aim of this study was to assess the usefulness of the QuantStudio™ 3D Digital PCR System platform for the detection of plasma EGFR T790M mutations in NSCLC patients, and to compare the performances of 3D Digital PCR and ARMS-PCR. Patients and methods A total of 119 Chinese patients were enrolled in this study. Mutant allele frequency of plasma EGFR T790M was detected by 3D Digital PCR; 25 selected samples were then verified by ARMS-PCR, and four of them were verified by next generation sequencing (NGS). Results In total, 52.94% (69/119) had EGFR T790M mutations detected by 3D Digital PCR. In the 69 positive samples, the median mutant allele frequency (AF) was 1.09%, and three cases presented low concentration (AF < 0.1%). Limited by the amount of plasma DNA, 17 samples (AF < 2.5%) and eight samples (T790M-) were selected for verification by ARMS-PCR. Four of those samples were verified by NGS as a third verification method. Among the selected 17 positive cases, ten samples presented mutant allele frequency < 0.5%, and seven samples presented intermediate mutant allele frequency (0.5% ≤ AF < 2.5%). However, only three samples (3/17) were identified as positive by ARMS-PCR, namely P6 (AF = 1.09%), P7 (AF = 2.09%), and P8 (AF = 2.21%). It is worth mentioning that sample P9 (AF = 2.05%, analyzed by 3D Digital PCR) was identified as T790M- by ARMS-PCR. Four samples were identified as T790M+ by both NGS and 3D Digital PCR, and notably three of them (3/4) presented a low ratio (AF < 0.5%). Conclusion Our study demonstrated that 3D Digital PCR is a novel method with high sensitivity and specificity for detecting the EGFR T790M mutation in plasma. PMID:29403309

  10. Nonlinear research of an image motion stabilization system embedded in a space land-survey telescope

    NASA Astrophysics Data System (ADS)

    Somov, Yevgeny; Butyrin, Sergey; Siguerdidjane, Houria

    2017-01-01

    We consider an image motion stabilization system embedded into a space telescope for scanning optoelectronic observation of terrestrial targets. A model of this system is presented that takes into account the physical hysteresis of the piezo-ceramic driver and the time delay in forming the digital control. We present elaborated algorithms for discrete filtering and digital control, results of an analysis of the image motion velocity oscillations in the telescope focal plane, and methods for terrestrial and in-flight verification of the system.

  11. Design and Verification of Critical Pressurised Windows for Manned Spaceflight

    NASA Astrophysics Data System (ADS)

    Lamoure, Richard; Busto, Lara; Novo, Francisco; Sinnema, Gerben; Leal, Mendes M.

    2014-06-01

    The Window Design for Manned Spaceflight (WDMS) project was tasked with establishing the state of the art and exploring possible improvements to the current structural integrity verification and fracture control methodologies for manned spacecraft windows. A critical review of the state of the art in spacecraft window design, materials and verification practice was conducted. Shortcomings of the methodology in terms of analysis, inspection and testing were identified. Schemes for improving verification practices and reducing conservatism whilst maintaining the required safety levels were then proposed. An experimental materials characterisation programme was defined and carried out with the support of the 'Glass and Façade Technology Research Group' at the University of Cambridge. Results of the sample testing campaign were analysed, post-processed and subsequently applied to the design of a breadboard window demonstrator. Two fused silica glass window panes were procured and subjected to dedicated analyses, inspection and testing comprising both qualification and acceptance programmes specifically tailored to the objectives of the activity. Finally, the main outcomes have been compiled into a Structural Verification Guide for Pressurised Windows in manned spacecraft, incorporating best practices and lessons learned throughout this project.

  12. Very fast road database verification using textured 3D city models obtained from airborne imagery

    NASA Astrophysics Data System (ADS)

    Bulatov, Dimitri; Ziems, Marcel; Rottensteiner, Franz; Pohl, Melanie

    2014-10-01

    Road databases are known to be an important part of any geodata infrastructure, e.g. as the basis for urban planning or emergency services. Updating road databases for crisis events must be performed quickly and with the highest possible degree of automation. We present a semi-automatic algorithm for road verification using textured 3D city models, starting from aerial or even UAV images. This algorithm contains two processes, which exchange input and output but basically run independently from each other: textured urban terrain reconstruction and road verification. The first process performs a dense photogrammetric reconstruction of the 3D geometry of the scene using depth maps. The second process is our core procedure, since it contains various methods for road verification. Each method represents a unique road model and a specific strategy, and thus is able to deal with a specific type of road. Each method is designed to provide two probability distributions, where the first describes the state of a road object (correct, incorrect) and the second describes the state of its underlying road model (applicable, not applicable). Based on the Dempster-Shafer theory, both distributions are mapped to a single distribution that refers to three states: correct, incorrect, and unknown. With respect to the interaction of both processes, the normalized elevation map and the digital orthophoto generated during 3D reconstruction are the necessary input - together with initial road database entries - for the road verification process. If the entries of the database are too obsolete or not available at all, sensor data evaluation enables classification of the road pixels of the elevation map, followed by road map extraction by means of vectorization and filtering of the geometrically and topologically inconsistent objects. Depending on time constraints and the availability of a geo-database for buildings, the urban terrain reconstruction procedure outputs semantic models of buildings, trees, and ground. Buildings and ground are textured by means of the available images. This facilitates orientation in the model and the interactive verification of the road objects that were initially classified as unknown. The three main modules of the texturing algorithm are pose estimation (if the videos are not geo-referenced), occlusion analysis, and texture synthesis.
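
    A minimal sketch of the final mapping step, assuming the simplest Dempster-Shafer-style discounting interpretation of the two distributions described above (the authors' exact combination rule may differ):

    ```python
    # Hypothetical illustration: the applicability of a road model discounts its
    # state evidence, and the non-applicable mass is assigned to "unknown".

    def combine(p_correct, p_applicable):
        """Map (state, model) probabilities to belief masses over
        (correct, incorrect, unknown)."""
        m_correct = p_correct * p_applicable
        m_incorrect = (1.0 - p_correct) * p_applicable
        m_unknown = 1.0 - p_applicable        # mass the method cannot commit
        return m_correct, m_incorrect, m_unknown

    # A road judged 0.9 correct by a model that is 0.7 applicable:
    print(combine(0.9, 0.7))                  # ~(0.63, 0.07, 0.30)
    ```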

  13. Verification, Validation and Credibility Assessment of a Computational Model of the Advanced Resistive Exercise Device (ARED)

    NASA Technical Reports Server (NTRS)

    Werner, C. R.; Humphreys, B. T.; Mulugeta, L.

    2014-01-01

    The Advanced Resistive Exercise Device (ARED) is the resistive exercise device used by astronauts on the International Space Station (ISS) to mitigate bone loss and muscle atrophy due to extended exposure to microgravity (micro g). The Digital Astronaut Project (DAP) has developed a multi-body dynamics model of the ARED and associated biomechanics models for use in spaceflight exercise physiology research and operations. In an effort to advance the maturity and credibility of the ARED model, the DAP performed a verification, validation and credibility (VV&C) assessment of the model and its analyses in accordance with NASA-STD-7009, 'Standards for Models and Simulations'.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION

    EPA Science Inventory

    The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...

  15. Hardware acceleration and verification of systems designed with hardware description languages (HDL)

    NASA Astrophysics Data System (ADS)

    Wisniewski, Remigiusz; Wegrzyn, Marek

    2005-02-01

    Hardware description languages (HDLs) allow the creation of ever bigger designs; the size of prototyped systems nowadays very often exceeds a million gates, so the verification process for such designs takes several hours or even days. This problem can be addressed by hardware acceleration of the simulation.

  16. A Design Rationale Capture Tool to Support Design Verification and Re-use

    NASA Technical Reports Server (NTRS)

    Hooey, Becky Lee; Da Silva, Jonny C.; Foyle, David C.

    2012-01-01

    A design rationale tool (DR tool) was developed to capture design knowledge to support design verification and design knowledge re-use. The design rationale tool captures design drivers and requirements, and documents the design solution including: intent (why it is included in the overall design); features (why it is designed the way it is); information about how the design components support design drivers and requirements; and, design alternatives considered but rejected. For design verification purposes, the tool identifies how specific design requirements were met and instantiated within the final design, and which requirements have not been met. To support design re-use, the tool identifies which design decisions are affected when design drivers and requirements are modified. To validate the design tool, the design knowledge from the Taxiway Navigation and Situation Awareness (T-NASA; Foyle et al., 1996) system was captured and the DR tool was exercised to demonstrate its utility for validation and re-use.

  17. An all-digital receiver for satellite audio broadcasting signals using trellis coded quasi-orthogonal code-division multiplexing

    NASA Astrophysics Data System (ADS)

    Braun, Walter; Eglin, Peter; Abello, Ricard

    1993-02-01

    Spread-spectrum code division multiplex (CDM) is an attractive scheme for the transmission of multiple signals over a satellite transponder. By using orthogonal or quasi-orthogonal spreading codes, the interference between the users can be virtually eliminated. However, the acquisition and tracking of the spreading code phase cannot take advantage of the code orthogonality, since sequential acquisition and delay-locked loop tracking depend on correlation with code phases other than the optimal despreading phase. Hence, synchronization is a critical issue in such a system. Demonstration hardware for the verification of the orthogonal CDM synchronization and data transmission concept is being designed and implemented. The system concept, the synchronization scheme, and the implementation are described. The performance of the system is discussed based on computer simulations.
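
    To make the synchronization remark concrete, the sketch below (an illustration, not the system's actual code set) builds Walsh-Hadamard orthogonal codes and shows that orthogonality holds only when the codes are chip-aligned:

    ```python
    import numpy as np

    # Sylvester construction of a Walsh-Hadamard code family.
    def hadamard(n):
        h = np.array([[1]])
        while h.shape[0] < n:
            h = np.block([[h, h], [h, -h]])
        return h

    codes = hadamard(8)
    c2, c3 = codes[2], codes[3]
    print(np.dot(c2, c3))               # 0: no interference when chip-aligned
    print(np.dot(c2, np.roll(c3, 1)))   # 8: a one-chip offset gives full correlation
    ```

    Because a one-chip misalignment can destroy orthogonality entirely, the despreader must already be synchronized before the orthogonality can be exploited, which is exactly the difficulty the abstract points out.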

  18. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification

    PubMed Central

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Background Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. Methods At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF with the data in the digital images of the original records. Results 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Conclusion Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints. PMID:27355447

  19. The Cooking and Pneumonia Study (CAPS) in Malawi: Implementation of Remote Source Data Verification.

    PubMed

    Weston, William; Smedley, James; Bennett, Andrew; Mortimer, Kevin

    2016-01-01

    Source data verification (SDV) is a data monitoring procedure which compares the original records with the Case Report Form (CRF). Traditionally, on-site SDV relies on monitors making multiple visits to study sites, requiring extensive resources. The Cooking And Pneumonia Study (CAPS) is a 24-month village-level cluster randomized controlled trial assessing the effectiveness of an advanced cook-stove intervention in preventing pneumonia in children under five in rural Malawi (www.capstudy.org). CAPS used smartphones to capture digital images of the original records on an electronic CRF (eCRF). In the present study, descriptive statistics are used to report the experience of electronic data capture with remote SDV in a challenging research setting in rural Malawi. At three-monthly intervals, fieldworkers, who were employed by CAPS, captured pneumonia data from the original records onto the eCRF. Fieldworkers also captured digital images of the original records. Once Internet connectivity was available, the data captured on the eCRF and the digital images of the original records were uploaded to a web-based SDV application. This enabled SDV to be conducted remotely from the UK. We conducted SDV of the pneumonia data (occurrence, severity, and clinical indicators) recorded in the eCRF with the data in the digital images of the original records. 664 episodes of pneumonia were recorded after 6 months of follow-up. Of these 664 episodes, 611 (92%) had a finding of pneumonia in the original records. All digital images of the original records were clear and legible. Electronic data capture using eCRFs on mobile technology is feasible in rural Malawi. Capturing digital images of the original records in the field allows remote SDV to be conducted efficiently and securely without requiring additional field visits. We recommend these approaches in similar settings, especially those with health endpoints.

  20. 76 FR 44051 - Submission for Review: Verification of Who Is Getting Payments, RI 38-107 and RI 38-147

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-22

    OFFICE OF PERSONNEL MANAGEMENT. Submission for Review: Verification of Who Is Getting Payments, RI ... currently approved information collection request (ICR) 3206-0197, Verification of Who is Getting Payments ... SUPPLEMENTARY INFORMATION: RI 38-107, Verification of Who is Getting Payments, is designed for use by the ...

  1. Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) Independent Analysis

    NASA Technical Reports Server (NTRS)

    Davis, Mitchell L.; Aguilar, Michael L.; Mora, Victor D.; Regenie, Victoria A.; Ritz, William F.

    2009-01-01

    Two approaches were compared to the Crew Exploration Vehicle (CEV) Avionics Integration Laboratory (CAIL) approach: the Flat-Sat and the Shuttle Avionics Integration Laboratory (SAIL). The Flat-Sat and CAIL/SAIL approaches are two different tools designed to mitigate different risks. The Flat-Sat approach is designed to develop a mission concept into a flight avionics system and associated ground controller; the SAIL approach is designed to aid in the flight readiness verification of the flight avionics system. The approaches are complementary in addressing both the system development risks and the mission verification risks. The following NESC team findings were identified: the CAIL assumption is that the flight subsystems will be matured for the system-level verification; the Flat-Sat and SAIL approaches are two different tools designed to mitigate different risks. The following NESC team recommendation was provided: define, document, and manage a detailed interface between the design and development laboratories (EDL and other integration labs) and the verification laboratory (CAIL).

  2. Ada(R) Test and Verification System (ATVS)

    NASA Technical Reports Server (NTRS)

    Strelich, Tom

    1986-01-01

    The Ada Test and Verification System (ATVS) functional description and high level design are completed and summarized. The ATVS will provide a comprehensive set of test and verification capabilities specifically addressing the features of the Ada language, support for embedded system development, distributed environments, and advanced user interface capabilities. Its design emphasis was on effective software development environment integration and flexibility to ensure its long-term use in the Ada software development community.

  3. WRAP-RIB antenna technology development

    NASA Technical Reports Server (NTRS)

    Freeland, R. E.; Garcia, N. F.; Iwamoto, H.

    1985-01-01

    The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large in size so that they address the same basic problems of design, fabrication, assembly and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept was considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance. In fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.

  4. Digital Health Services and Digital Identity in Alberta.

    PubMed

    McEachern, Aiden; Cholewa, David

    2017-01-01

    The Government of Alberta continues to improve delivery of healthcare by allowing Albertans to access their health information online. Alberta is the only province in Canada with provincial electronic health records for all its citizens. These records are currently made available to medical practitioners, but Alberta Health believes that providing Albertans access to their health records will transform the delivery of healthcare in Alberta. It is important to have a high level of assurance that the health records are provided to the correct Albertan. Alberta Health requires a way for Albertans to obtain a digital identity with a high level of identity assurance prior to releasing health records via the Personal Health Portal. Service Alberta developed the MyAlberta Digital ID program to provide a digital identity verification service. The Ministry of Health is leveraging MyAlberta Digital ID to enable Albertans to access their personal health records through the Personal Health Portal. The Government of Alberta is advancing its vision of patient-centred healthcare by enabling Albertans to access a trusted source for health information and their electronic health records using a secure digital identity.

  5. Integrating Model-Based Verification into Software Design Education

    ERIC Educational Resources Information Center

    Yilmaz, Levent; Wang, Shuo

    2005-01-01

    Proper design analysis is indispensable to assure quality and reduce emergent costs due to faulty software. Teaching proper design verification skills early during pedagogical development is crucial, as such analysis is the only tractable way of resolving software problems early when they are easy to fix. The premise of the presented strategy is…

  6. Assume-Guarantee Verification of Source Code with Design-Level Assumptions

    NASA Technical Reports Server (NTRS)

    Giannakopoulou, Dimitra; Pasareanu, Corina S.; Cobleigh, Jamieson M.

    2004-01-01

    Model checking is an automated technique that can be used to determine whether a system satisfies certain required properties. To address the 'state explosion' problem associated with this technique, we propose to integrate assume-guarantee verification at different phases of system development. During design, developers build abstract behavioral models of the system components and use them to establish key properties of the system. To increase the scalability of model checking at this level, we have developed techniques that automatically decompose the verification task by generating component assumptions for the properties to hold. The design-level artifacts are subsequently used to guide the implementation of the system, but also to enable more efficient reasoning at the source-code level. In particular, we propose to use design-level assumptions to similarly decompose the verification of the actual system implementation. We demonstrate our approach on a significant NASA application, where design-level models were used to identify and correct a safety property violation, and design-level assumptions allowed us to check successfully that the property was preserved by the implementation.
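
    For reference, the compositional rule this style of reasoning instantiates is the classical non-circular assume-guarantee rule; the textbook formulation (not quoted from the paper) reads:

    ```latex
    % If M1 satisfies P under assumption A, and M2 unconditionally satisfies A,
    % then the parallel composition M1 || M2 satisfies P.
    \frac{\langle A \rangle\, M_1\, \langle P \rangle
          \qquad
          \langle \mathit{true} \rangle\, M_2\, \langle A \rangle}
         {\langle \mathit{true} \rangle\, M_1 \parallel M_2\, \langle P \rangle}
    ```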

  7. First Image Products from EcoSAR - Osa Peninsula, Costa Rica

    NASA Technical Reports Server (NTRS)

    Osmanoglu, Batuhan; Lee, SeungKuk; Rincon, Rafael; Fatuyinbo, Lola; Bollian, Tobias; Ranson, Jon

    2016-01-01

    Designed especially for forest ecosystem studies, EcoSAR employs state-of-the-art digital beamforming technology to generate wide-swath, high-resolution imagery. EcoSAR's dual-antenna single-pass imaging capability eliminates temporal decorrelation from polarimetric and interferometric analysis, increasing the signal strength and simplifying the models used to invert forest structure parameters. The antennae are physically separated by 25 meters, providing single-pass interferometry. In this mode the radar is most sensitive to topography. With 32 active transmit and receive channels, EcoSAR's digital beamforming is an order of magnitude more versatile than the digital beamforming employed on the upcoming NISAR mission. EcoSAR's long-wavelength (P-band, 435 MHz, 69 cm) measurements can be used to simulate data products for ESA's future BIOMASS mission, allowing scientists to develop algorithms before the launch of the satellite. EcoSAR can also be deployed to collect much-needed data where the BIOMASS satellite won't be allowed to collect data (North America, Europe and the Arctic), filling in the gaps to keep a watchful eye on the global carbon cycle. EcoSAR can play a vital role in the monitoring, reporting and verification schemes of international programs such as UN-REDD (United Nations Reducing Emissions from Deforestation and Degradation), benefiting global society. EcoSAR was developed and flown with support from the NASA Earth Science Technology Office's Instrument Incubator Program.

  8. Decomposed Photo Response Non-Uniformity for Digital Forensic Analysis

    NASA Astrophysics Data System (ADS)

    Li, Yue; Li, Chang-Tsun

    The last few years have seen the application of Photo Response Non-Uniformity noise (PRNU) - a unique stochastic fingerprint of image sensors - to various types of digital forensic investigation, such as source device identification and integrity verification. In this work we propose a new way of extracting the PRNU noise pattern, called Decomposed PRNU (DPRNU), by exploiting the difference between the physical and artificial color components of photos taken by digital cameras that use a Color Filter Array for interpolating artificial components from physical ones. Experimental results presented in this work show the superiority of the proposed DPRNU over the commonly used version. We also propose a new performance metric, the Corrected Positive Rate (CPR), to evaluate the performance of the common PRNU and the proposed DPRNU.
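
    For context, a minimal numpy sketch of the conventional PRNU pipeline that DPRNU refines (a Gaussian filter stands in for the wavelet denoiser usually used; all names are ours):

    ```python
    import numpy as np
    from scipy.ndimage import gaussian_filter

    def residual(img, sigma=1.0):
        """Noise residual: the image minus its denoised version."""
        return img - gaussian_filter(img, sigma)

    def fingerprint(images):
        """Camera fingerprint: average residual over images from one camera."""
        return np.mean([residual(im) for im in images], axis=0)

    def ncc(a, b):
        """Normalized cross-correlation used as the matching score."""
        a, b = a - a.mean(), b - b.mean()
        return float((a * b).sum() / (np.linalg.norm(a) * np.linalg.norm(b)))

    # match_score = ncc(fingerprint(training_images), residual(query_image))
    ```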

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT--BAGHOUSE FILTRATION PRODUCTS, DONALDSON COMPANY, INC., 6282 FILTRATION MEDIA

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, established by the U.S. EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance. The Air Pollution Control Technology...

  10. Sparse distributed memory prototype: Principles of operation

    NASA Technical Reports Server (NTRS)

    Flynn, Michael J.; Kanerva, Pentti; Ahanin, Bahram; Bhadkamkar, Neal; Flaherty, Paul; Hickey, Philip

    1988-01-01

    Sparse distributed memory is a generalized random access memory (RAM) for long binary words. Such words can be written into and read from the memory, and they can be used to address the memory. The main attribute of the memory is sensitivity to similarity, meaning that a word can be read back not only by giving the original right address but also by giving one close to it as measured by the Hamming distance between addresses. Large memories of this kind are expected to have wide use in speech and scene analysis, in signal detection and verification, and in adaptive control of automated equipment. The memory can be realized as a simple, massively parallel computer. Digital technology has reached a point where building large memories is becoming practical. The research is aimed at resolving major design issues that have to be faced in building the memories. The design of a prototype memory with 256-bit addresses and from 8K to 128K locations for 256-bit words is described. A key aspect of the design is extensive use of dynamic RAM and other standard components.
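
    A toy implementation of the read/write mechanics described above, assuming the standard counter-based formulation with parameters far smaller than the prototype's:

    ```python
    import numpy as np

    rng = np.random.default_rng(0)
    N, M, RADIUS = 64, 2000, 24             # word length, hard locations, Hamming radius

    addresses = rng.integers(0, 2, (M, N))  # fixed random hard-location addresses
    counters = np.zeros((M, N), dtype=int)  # one up/down counter per bit per location

    def active(addr):
        """Locations within RADIUS Hamming distance of the address."""
        return np.count_nonzero(addresses != addr, axis=1) <= RADIUS

    def write(addr, word):
        counters[active(addr)] += 2 * word - 1   # +1 for a 1 bit, -1 for a 0 bit

    def read(addr):
        return (counters[active(addr)].sum(axis=0) > 0).astype(int)

    word = rng.integers(0, 2, N)
    write(word, word)                        # store autoassociatively
    noisy = word.copy(); noisy[:5] ^= 1      # corrupt 5 of 64 address bits
    print(np.count_nonzero(read(noisy) != word))   # typically 0: exact recall
    ```

    This illustrates the "sensitivity to similarity" the abstract describes: the noisy address still activates enough of the locations that were written to, so the stored word is recovered.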

  11. [Design of magneto-acoustic-electrical detection system and verification of its linear sweep theory].

    PubMed

    Dai, Ming; Chen, Siping; Li, Fangfang; Chen, Mian; Lin, Haoming; Chen, Xin

    2018-02-01

    Clinical studies have demonstrated that early diagnosis of lesions can significantly reduce the risk of cancer. Magneto-acoustic-electrical tomography (MAET) is expected to become a new detection method due to its advantages of high resolution and high contrast. Following a modular design philosophy, a low-cost, digital magneto-acoustic conductivity detection system was designed and implemented in this study. The theory of MAET using chirp continuous-wave excitation is introduced. The results of a homogeneous phantom experiment with 0.5% NaCl clearly showed that the conductivity curve of the homogeneous phantom was highly consistent with the actual physical size, which indicates that the chirp excitation theory in our proposed system is correct and feasible. Besides, the resolution obtained with a 1 000 μs sweep time was better than that obtained with 500 μs and 1 500 μs, which means that sweep time is an important factor affecting the detection resolution of the conductivity. The same result was obtained in experiments carried out on homogeneous phantoms with different concentrations of NaCl, which demonstrated the repeatability of our proposed MAET system.
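
    A sketch of the linear-sweep excitation being compared (the 500/1 000/1 500 μs sweep times come from the abstract; the band edges and sample rate are assumptions):

    ```python
    import numpy as np
    from scipy.signal import chirp

    fs = 50e6                                # sample rate, Hz (assumed)
    for sweep_us in (500, 1000, 1500):
        t = np.arange(0, sweep_us * 1e-6, 1 / fs)
        # Linear frequency sweep from 0.5 MHz to 2.5 MHz (assumed band).
        x = chirp(t, f0=0.5e6, t1=t[-1], f1=2.5e6, method="linear")
        print(sweep_us, "us sweep:", x.size, "samples")
    ```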

  12. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, Michael A.

    1997-01-01

    A system and method for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded "D" character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the "D" interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available.
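
    A toy simulation of the digit-timed anti-collision scheme (tag count and serial length are illustrative; the patent's RF details are omitted):

    ```python
    import random
    from collections import defaultdict

    tags = random.sample(range(10**6), 50)    # unique 6-digit serial numbers

    def slot(serial, digits):
        """Reply time slot derived from the trailing serial digits."""
        return serial % (10 ** digits)

    pending, digits, rounds = list(tags), 1, 0
    while pending:                            # some tags still colliding
        rounds += 1
        by_slot = defaultdict(list)
        for s in pending:
            by_slot[slot(s, digits)].append(s)
        # Only tags sharing a slot remain unresolved; they extend the timing
        # window with one more serial digit in the next round.
        pending = [s for grp in by_slot.values() if len(grp) > 1 for s in grp]
        digits += 1
    print("all 50 tags resolved in", rounds, "rounds")
    ```

    Because the serial numbers are unique, the loop is guaranteed to terminate once all digits are in play, mirroring the patent's worst-case argument.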

  13. System and method for simultaneously collecting serial number information from numerous identity tags

    DOEpatents

    Doty, M.A.

    1997-01-07

    A system and method are disclosed for simultaneously collecting serial number information reports from numerous colliding coded-radio-frequency identity tags. Each tag has a unique multi-digit serial number that is stored in non-volatile RAM. A reader transmits an ASCII coded ``D`` character on a carrier of about 900 MHz and a power illumination field having a frequency of about 1.6 GHz. A one MHz tone is modulated on the 1.6 GHz carrier as a timing clock for a microprocessor in each of the identity tags. Over a thousand such tags may be in the vicinity, and each is powered-up and clocked by the 1.6 GHz power illumination field. Each identity tag looks for the ``D`` interrogator modulated on the 900 MHz carrier, and each uses a digit of its serial number to time a response. Clear responses received by the reader are repeated for verification. If no verification or a wrong number is received by any identity tag, it uses a second digit together with the first to time out a more extended period for response. Ultimately, the entire serial number will be used in the worst-case collision environments; and since the serial numbers are defined as being unique, the final possibility will be successful because a clear time-slot channel will be available. 5 figs.

  14. Tunable Multifunctional Thermal Metamaterials: Manipulation of Local Heat Flux via Assembly of Unit-Cell Thermal Shifters

    PubMed Central

    Park, Gwanwoo; Kang, Sunggu; Lee, Howon; Choi, Wonjoon

    2017-01-01

    Thermal metamaterials, designed by transformation thermodynamics, are artificial structures that can actively control heat flux at a continuum scale. However, their fabrication is very challenging because it requires a continuous change of thermal properties in materials for one specific function. Herein, we introduce tunable thermal metamaterials that use the assembly of unit-cell thermal shifters for a remarkable enhancement in multifunctionality as well as manufacturability. Similar to the digitization of a two-dimensional image, thermal metamaterials designed by transformation thermodynamics are disassembled into unit-cell thermal shifters over tiny areas, representing discretized heat flux lines in local spots. The programmed reassembly of thermal shifters, inspired by LEGO, enables the four significant functions of thermal metamaterials—shield, concentrator, diffuser, and rotator—in both simulation and experimental verification using the finite element method and fabricated structures made from copper and PDMS. This work paves the way for overcoming the structural and functional limitations of thermal metamaterials. PMID:28106156

  15. Tunable Multifunctional Thermal Metamaterials: Manipulation of Local Heat Flux via Assembly of Unit-Cell Thermal Shifters

    NASA Astrophysics Data System (ADS)

    Park, Gwanwoo; Kang, Sunggu; Lee, Howon; Choi, Wonjoon

    2017-01-01

    Thermal metamaterials, designed by transformation thermodynamics, are artificial structures that can actively control heat flux at a continuum scale. However, their fabrication is very challenging because it requires a continuous change of thermal properties in materials for one specific function. Herein, we introduce tunable thermal metamaterials that use the assembly of unit-cell thermal shifters for a remarkable enhancement in multifunctionality as well as manufacturability. Similar to the digitization of a two-dimensional image, thermal metamaterials designed by transformation thermodynamics are disassembled into unit-cell thermal shifters over tiny areas, representing discretized heat flux lines in local spots. The programmed reassembly of thermal shifters, inspired by LEGO, enables the four significant functions of thermal metamaterials—shield, concentrator, diffuser, and rotator—in both simulation and experimental verification using the finite element method and fabricated structures made from copper and PDMS. This work paves the way for overcoming the structural and functional limitations of thermal metamaterials.

  16. Mechanical Systems

    NASA Technical Reports Server (NTRS)

    Davis, Robert E.

    2002-01-01

    The presentation provides an overview of requirement and interpretation letters, mechanical systems safety interpretation letter, design and verification provisions, and mechanical systems verification plan.

  17. Design Authority in the Test Programme Definition: The Alenia Spazio Experience

    NASA Astrophysics Data System (ADS)

    Messidoro, P.; Sacchi, E.; Beruto, E.; Fleming, P.; Marucchi Chierro, P.-P.

    2004-08-01

    Since the verification and test programme is a significant part of the spacecraft development life cycle in terms of cost and time, the discussion very often aims to optimize the verification campaign through the possible deletion or limitation of some testing activities. The increased market pressure to reduce project schedule and cost is originating a dialectic process inside project teams, involving programme management and design authorities, in order to optimize the verification and test programme. The paper introduces the Alenia Spazio experience in this context, coming from real project life on different products and missions (science, TLC, EO, manned, transportation, military, commercial, recurrent and one-of-a-kind). Usually the applicable verification and testing standards (e.g. ECSS-E-10 part 2 "Verification" and ECSS-E-10 part 3 "Testing" [1]) are tailored to the specific project on the basis of its peculiar mission constraints. The model philosophy and the associated verification and test programme are defined following an iterative process which suitably combines several aspects, including for example test requirements and facilities (Fig. 1: model philosophy, verification and test programme definition, from ECSS-E-10). The considered cases are mainly oriented to thermal and mechanical verification, where the benefits of possible test programme optimizations are more significant. Considering thermal qualification and acceptance testing (i.e. Thermal Balance and Thermal Vacuum), the lessons learned from the development of several satellites are presented together with the corresponding recommended approaches. In particular, cases are indicated in which a proper Thermal Balance test is mandatory, and others, in the presence of a more recurrent design, where qualification by analysis could be envisaged. The importance of a proper Thermal Vacuum exposure for workmanship verification is also highlighted. Similar considerations are summarized for mechanical testing, with particular emphasis on the importance of Modal Survey, Static and Sine Vibration tests in the qualification stage, in combination with the effectiveness of the Vibro-Acoustic test in acceptance. The apparent relative importance of the Sine Vibration test for workmanship verification in specific circumstances is also highlighted. The verification of the project requirements is planned through a combination of suitable verification methods (in particular analysis and test) at the different verification levels (from system down to equipment), in the proper verification stages (e.g. qualification and acceptance).

  18. Palmprint and face score level fusion: hardware implementation of a contactless small sample biometric system

    NASA Astrophysics Data System (ADS)

    Poinsot, Audrey; Yang, Fan; Brost, Vincent

    2011-02-01

    Including multiple sources of information in personal identity recognition and verification gives the opportunity to greatly improve performance. We propose a contactless biometric system that combines two modalities: palmprint and face. Hardware implementations are proposed on Texas Instruments Digital Signal Processor and Xilinx Field-Programmable Gate Array (FPGA) platforms. The algorithmic chain consists of preprocessing (which includes palm extraction from hand images), Gabor feature extraction, comparison by Hamming distance, and score fusion. Fusion possibilities are discussed and tested, first using a bimodal database of 130 subjects that we designed (the uB database), and then two common public biometric databases (AR for face and PolyU for palmprint). High performance has been obtained for recognition and verification purposes: a recognition rate of 97.49% with the AR-PolyU database and an equal error rate of 1.10% on the uB database using only two training samples per subject. Hardware results demonstrate that preprocessing can easily be performed during the acquisition phase, and multimodal biometric recognition can be treated almost instantly (0.4 ms on FPGA). We show the feasibility of a robust and efficient multimodal hardware biometric system that offers several advantages, such as user-friendliness and flexibility.
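
    A sketch of the matching and fusion stage (normalized Hamming distance per modality and a weighted-sum score fusion; the paper's exact fusion rule and weights are not reproduced here):

    ```python
    import numpy as np

    def hamming(a, b):
        """Normalized Hamming distance between binary feature maps."""
        return np.count_nonzero(a != b) / a.size

    def fused_score(palm_probe, palm_ref, face_probe, face_ref, w_palm=0.5):
        s_palm = 1.0 - hamming(palm_probe, palm_ref)   # similarity in [0, 1]
        s_face = 1.0 - hamming(face_probe, face_ref)
        return w_palm * s_palm + (1.0 - w_palm) * s_face

    # accept = fused_score(...) >= threshold   # threshold tuned to the target EER
    ```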

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: BAGHOUSE FILTRATION PRODUCTS--DONALDSON COMPANY, INC., TETRATEC #6255 FILTRATION MEDIA

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program, established by the U.S. EPA, is designed to accelerate the development and commercialization of new or improved technologies through third-party verification and reporting of performance. The Air Pollution Control Technolog...

  20. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  1. Coherent inductive communications link for biomedical applications

    NASA Technical Reports Server (NTRS)

    Hogrefe, Arthur F. (Inventor); Radford, Wade E. (Inventor)

    1985-01-01

    A two-way coherent inductive communications link between an external transceiver and an internal transceiver located in a biologically implanted programmable medical device. Digitally formatted command data and programming data are transmitted to the implanted medical device by frequency-shift keying the inductive communications link. The internal transceiver is powered by the inductive field between the internal and external transceivers. Digitally formatted data are transmitted to the external transceiver by the internal transceiver amplitude-modulating the inductive field. Immediate verification of the establishment of a reliable communications link is provided by determining the existence of frequency lock and bit phase lock between the internal and external transceivers.
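
    A sketch of the binary FSK downlink modulation described above (bit rate, carrier frequencies and sample rate are illustrative, not from the patent):

    ```python
    import numpy as np

    fs, bit_rate = 1_000_000, 4_000        # Hz, bits/s (assumed)
    f0, f1 = 100_000, 120_000              # space/mark frequencies, Hz (assumed)

    def fsk(bits):
        """Phase-discontinuous binary FSK: one tone burst per bit."""
        n = fs // bit_rate
        t = np.arange(n) / fs
        return np.concatenate(
            [np.sin(2 * np.pi * (f1 if b else f0) * t) for b in bits]
        )

    waveform = fsk([1, 0, 1, 1, 0])        # a command word fragment
    ```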

  2. A TREETOPS simulation of the Hubble Space Telescope-High Gain Antenna interaction

    NASA Technical Reports Server (NTRS)

    Sharkey, John P.

    1987-01-01

    Virtually any project dealing with the control of a Large Space Structure (LSS) will involve some level of verification by digital computer simulation. While the Hubble Space Telescope might not normally be included in a discussion of LSS, it is presented to highlight a recently developed simulation and analysis program named TREETOPS. TREETOPS provides digital simulation, linearization, and control system interaction of flexible, multibody spacecraft which admit to a point-connected tree topology. The HST application of TREETOPS is intended to familiarize the LSS community with TREETOPS by presenting a user perspective of its key features.

  3. Systematic Model-in-the-Loop Test of Embedded Control Systems

    NASA Astrophysics Data System (ADS)

    Krupp, Alexander; Müller, Wolfgang

    Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural-language formalization, there is no standardized and accepted means for formal property definition as a target for verification planning. This article addresses several of these shortcomings of embedded-system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
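
    To illustrate the classification-tree idea behind CTM/ES (the classifications below are invented for illustration; CTM/ES additionally attaches timing and continuous-signal information to each test step):

    ```python
    from itertools import product

    classifications = {
        "vehicle_speed": ["zero", "low", "high"],
        "brake_pedal":   ["released", "pressed"],
        "road_surface":  ["dry", "wet"],
    }

    # Abstract test cases are drawn from the cross product of the classes.
    test_cases = list(product(*classifications.values()))
    print(len(test_cases), "combinations, e.g.", test_cases[0])   # 12 combinations
    ```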

  4. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the ... to be used in the verification and validation process, consistent with appendix C to this part. ...

  5. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the ... to be used in the verification and validation process, consistent with appendix C to this part. ...

  6. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the ... to be used in the verification and validation process, consistent with appendix C to this part. ...

  7. 49 CFR 236.905 - Railroad Safety Program Plan (RSPP).

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...; and (iv) The identification of the safety assessment process. (2) Design for verification and validation. The RSPP must require the identification of verification and validation methods for the ... to be used in the verification and validation process, consistent with appendix C to this part. ...

  8. What is the Final Verification of Engineering Requirements?

    NASA Technical Reports Server (NTRS)

    Poole, Eric

    2010-01-01

    This slide presentation reviews the process of development through the final verification of engineering requirements. The definition of the requirements is driven by basic needs and should be reviewed by both the supplier and the customer. All involved need to agree upon a formal requirements document, including changes to the original requirements document. After the requirements have been developed, the engineering team begins to design the system. The final design is reviewed by other organizations. The final operational system must satisfy the original requirements, though many verifications should be performed during the process. The verification methods that are used are test, inspection, analysis and demonstration. The plan for verification should be created once the system requirements are documented. The plan should include assurances that every requirement is formally verified, that the methods and the responsible organizations are specified, and that the plan is reviewed by all parties. The options of having the engineering team involved in all phases of development, as opposed to having some other organization continue the process once the design is complete, are discussed.

  9. Designation and verification of road markings detection and guidance method

    NASA Astrophysics Data System (ADS)

    Wang, Runze; Jian, Yabin; Li, Xiyuan; Shang, Yonghong; Wang, Jing; Zhang, JingChuan

    2018-01-01

    With the rapid development of China's space industry, digitization and intelligence are the trend of the future. This report presents foundational research on a guidance system based on the HSV color space, which will support the design of an automatic navigation and parking system for the frock transport car and the infrared lamp homogeneity intelligent test equipment. The drive mode, steering mode and navigation method were selected. In consideration of practicability, a front-wheel-steering chassis was chosen. The steering mechanism is controlled by stepping motors and guided by machine vision. The steering mechanism was optimized and calibrated: a mathematical model was built and objective functions were constructed for it. The extraction method for the steering line was studied, and the motion controller was designed and optimized. The theory of the HSV and RGB color spaces and an analysis of the test results are discussed. Camera calibration was performed using the OpenCV function library on Linux, and the guidance algorithm was designed based on the HSV color space.
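
    A minimal sketch of the HSV-based marking extraction such a system performs (the HSV bounds are hypothetical and must be calibrated for the actual markings and lighting):

    ```python
    import cv2
    import numpy as np

    def extract_guide_line(frame_bgr):
        """Threshold the marking colour in HSV and fit a steering line."""
        hsv = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, np.array([20, 80, 80]), np.array([35, 255, 255]))
        ys, xs = np.nonzero(mask)
        if xs.size < 2:
            return None                      # no marking visible
        vx, vy, x0, y0 = cv2.fitLine(
            np.column_stack([xs, ys]).astype(np.float32),
            cv2.DIST_L2, 0, 0.01, 0.01,
        ).ravel()
        return (x0, y0), (vx, vy)            # point and direction of the line
    ```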

  10. Improving Face Verification in Photo Albums by Combining Facial Recognition and Metadata With Cross-Matching

    DTIC Science & Technology

    2017-12-01

    ... satisfactory performance. We do not use statistical models, and we do not create patterns that require supervised learning. Our methodology is intended for use in personal digital image ...

  11. Verification, Validation and Accreditation using AADL

    DTIC Science & Technology

    2011-05-03

  12. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems

    DTIC Science & Technology

    1994-07-29

    The goals of Phase 1 are to design in detail a toolkit environment based on formal methods for the specification and verification of distributed real-time systems and to evaluate the design. The evaluation of the design includes investigation of both the capability and potential usefulness of the toolkit environment and the feasibility of its implementation.

  13. Verified compilation of Concurrent Managed Languages

    DTIC Science & Technology

    2017-11-01

    ... designs for compiler intermediate representations that facilitate mechanized proofs and verification; and (d) a realistic case study that combines these ideas to prove the correctness of a state-of-the-art concurrent garbage collector. Even though concurrency is a pervasive part of modern software and hardware systems, it has often been ignored in safety-critical system designs.

  14. Design and Verification of an Inexpensive Ultrasonic Water Depth Sensor Using Arduino

    NASA Astrophysics Data System (ADS)

    Mihevc, T. M.; Rajagopal, S.

    2012-12-01

    A system that combines the Arduino microcontroller, a Parallax PING ultrasonic distance sensor and a secure digital card to log the data was developed to help monitor water table depths in multiple settings. Traditional methods of monitoring water table depths involve the use of a pressure transducer and expensive data loggers that cost upward of $1,000. The present system is built for less than $100, with the caveat that the accuracy of the measurements is 1 cm. In this laboratory study, we first build the Arduino-based system to monitor water table depths in a piezometer and compare these measurements to those made by a pressure transducer. Initial results show that the depth measurements are accurate in comparison to actual tape measurements. Results from this benchmarking experiment will be presented at the meeting.
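
    For reference, the depth arithmetic such a sensor performs is simple; the sketch below converts the PING sensor's round-trip echo time to distance, with an assumed linear temperature model for the speed of sound:

    ```python
    def surface_distance_cm(echo_us, temp_c=20.0):
        """One-way distance to the water surface from a round-trip echo time."""
        v = 331.3 + 0.606 * temp_c              # speed of sound in air, m/s
        return (echo_us * 1e-6) * v / 2 * 100   # seconds * (m/s) / 2, in cm

    print(round(surface_distance_cm(5000), 1))  # ~85.9 cm for a 5 ms echo
    ```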

  15. Small Projects Rapid Integration and Test Environment (SPRITE): Application for Increasing Robustness

    NASA Technical Reports Server (NTRS)

    Rakoczy, John; Heater, Daniel; Lee, Ashley

    2013-01-01

    Marshall Space Flight Center's (MSFC) Small Projects Rapid Integration and Test Environment (SPRITE) is a Hardware-In-The-Loop (HWIL) facility that provides rapid development, integration, and testing capabilities for small projects (CubeSats, payloads, spacecraft, and launch vehicles). This facility environment focuses on efficient processes and modular design to support rapid prototyping, integration, testing and verification of small projects at an affordable cost, especially compared to larger type HWIL facilities. SPRITE (Figure 1) consists of a "core" capability or "plant" simulation platform utilizing a graphical programming environment capable of being rapidly re-configured for any potential test article's space environments, as well as a standard set of interfaces (i.e. Mil-Std 1553, Serial, Analog, Digital, etc.). SPRITE also allows this level of interface testing of components and subsystems very early in a program, thereby reducing program risk.

  16. Software Tools for Formal Specification and Verification of Distributed Real-Time Systems.

    DTIC Science & Technology

    1997-09-30

    ... set of software tools for specification and verification of distributed real-time systems using formal methods. The task of this SBIR Phase II effort ... to be used by designers of real-time systems for early detection of errors. The mathematical complexity of formal specification and verification has ...

  17. FORMED: Bringing Formal Methods to the Engineering Desktop

    DTIC Science & Technology

    2016-02-01

    ... integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling ... input-output contract satisfaction and absence of null pointer dereferences. Domain-specific languages (DSLs) drive both implementation and formal verification ...

  18. Joint ETV/NOWATECH test plan for the Sorbisense GSW40 passive sampler

    EPA Science Inventory

    The joint test plan is the implementation of a test design developed for verification of the performance of an environmental technology following the NOWATECH ETV method. The verification is a joint verification with the US EPA ETV scheme and the Advanced Monitoring Systems Cent...

  19. JPL control/structure interaction test bed real-time control computer architecture

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.

    1989-01-01

    The Control/Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts - such as active structure - and new tools - such as combined structure and control optimization algorithm - and their verification in ground and possibly flight test. A focus mission spacecraft was designed based upon a space interferometer and is the basis for design of the ground test article. The ground test bed objectives include verification of the spacecraft design concepts, the active structure elements and certain design tools such as the new combined structures and controls optimization tool. In anticipation of CSI technology flight experiments, the test bed control electronics must emulate the computation capacity and control architectures of space qualifiable systems as well as the command and control networks that will be used to connect investigators with the flight experiment hardware. The Test Bed facility electronics were functionally partitioned into three units: a laboratory data acquisition system for structural parameter identification and performance verification; an experiment supervisory computer to oversee the experiment, monitor the environmental parameters and perform data logging; and a multilevel real-time control computing system. The design of the Test Bed electronics is presented along with hardware and software component descriptions. The system should break new ground in experimental control electronics and is of interest to anyone working in the verification of control concepts for large structures.

  20. Accuracy of Digital vs Conventional Implant Impression Approach: A Three-Dimensional Comparative In Vitro Analysis.

    PubMed

    Basaki, Kinga; Alkumru, Hasan; De Souza, Grace; Finer, Yoav

    To assess the three-dimensional (3D) accuracy and clinical acceptability of implant definitive casts fabricated using a digital impression approach and to compare the results with those of a conventional impression method in a partially edentulous condition. A mandibular reference model was fabricated with implants in the first premolar and molar positions to simulate a patient with bilateral posterior edentulism. Ten implant-level impressions per method were made using either an intraoral scanner with scanning abutments for the digital approach or an open-tray technique and polyvinylsiloxane material for the conventional approach. 3D analysis and comparison of implant location on resultant definitive casts were performed using laser scanner and quality control software. The inter-implant distances and interimplant angulations for each implant pair were measured for the reference model and for each definitive cast (n = 20 per group); these measurements were compared to calculate the magnitude of error in 3D for each definitive cast. The influence of implant angulation on definitive cast accuracy was evaluated for both digital and conventional approaches. Statistical analysis was performed using t test (α = .05) for implant position and angulation. Clinical qualitative assessment of accuracy was done via the assessment of the passivity of a master verification stent for each implant pair, and significance was analyzed using chi-square test (α = .05). A 3D error of implant positioning was observed for the two impression techniques vs the reference model, with mean ± standard deviation (SD) error of 116 ± 94 μm and 56 ± 29 μm for the digital and conventional approaches, respectively (P = .01). In contrast, the inter-implant angulation errors were not significantly different between the two techniques (P = .83). Implant angulation did not have a significant influence on definitive cast accuracy within either technique (P = .64). The verification stent demonstrated acceptable passive fit for 11 out of 20 casts and 18 out of 20 casts for the digital and conventional methods, respectively (P = .01). Definitive casts fabricated using the digital impression approach were less accurate than those fabricated from the conventional impression approach for this simulated clinical scenario. A significant number of definitive casts generated by the digital technique did not meet clinically acceptable accuracy for the fabrication of a multiple implant-supported restoration.
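
    A sketch of the 3D error computation the comparison rests on (the inter-implant vector of each definitive cast compared against the reference model; the arrays stand in for the laser-scanner output):

    ```python
    import numpy as np
    from scipy import stats

    def inter_implant_error_um(ref_a, ref_b, cast_a, cast_b):
        """Euclidean deviation of the cast's inter-implant vector, in micrometres
        (implant coordinates assumed to be in millimetres)."""
        return np.linalg.norm((cast_b - cast_a) - (ref_b - ref_a)) * 1000.0

    # errors_digital, errors_conventional = ...   # 20 values per group, as above
    # t, p = stats.ttest_ind(errors_digital, errors_conventional)   # alpha = .05
    ```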

  1. RTL validation methodology on high complexity wireless microcontroller using OVM technique for fast time to market

    NASA Astrophysics Data System (ADS)

    Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, A. B.

    2017-11-01

    Increased demand for internet of things (IoT) applications has inadvertently forced the move towards higher-complexity integrated circuits supporting SoC designs. Such a spontaneous increase in complexity poses unequivocally complicated validation strategies. Hence, the complexity has driven researchers to devise various methodologies to overcome this problem, which in essence has brought about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the SoC verification process in order to reduce time consumption and achieve fast time to market for the system. In this paper we therefore focus on a verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. On top of that, the Open Verification Methodology (OVM) offers an easier way to perform RTL validation, not as a replacement for the traditional method but as an effort towards fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, to avoid bottlenecks in the validation platform.
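
    OVM itself is a SystemVerilog class library, so the fragment below is only a language-neutral Python sketch of the scoreboard idea at the core of OVM/UVM-style dynamic verification: randomized stimulus is applied to a device under test, and its responses are checked against a golden reference model. All names and the toy 8-bit adder are illustrative, not the paper's design.

    ```python
    import random

    def reference_model(a, b):
        # Golden model: what the RTL is supposed to compute (toy 8-bit adder).
        return (a + b) & 0xFF

    def dut(a, b):
        # Stand-in for the simulated RTL; a real flow would talk to an HDL simulator.
        return (a + b) & 0xFF

    def run_testbench(num_txns=1000, seed=1):
        rng = random.Random(seed)
        mismatches = 0
        for _ in range(num_txns):
            a, b = rng.randrange(256), rng.randrange(256)   # randomized stimulus
            expected, actual = reference_model(a, b), dut(a, b)
            if expected != actual:                          # scoreboard check
                mismatches += 1
                print(f"MISMATCH a={a} b={b}: expected {expected}, got {actual}")
        print(f"{num_txns} transactions, {mismatches} mismatches")

    run_testbench()
    ```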

  2. Digital video timing analyzer for the evaluation of PC-based real-time simulation systems

    NASA Astrophysics Data System (ADS)

    Jones, Shawn R.; Crosby, Jay L.; Terry, John E., Jr.

    2009-05-01

    Due to the rapid acceleration in technology and the drop in costs, the use of commercial off-the-shelf (COTS) PC-based hardware and software components for digital and hardware-in-the-loop (HWIL) simulations has increased. However, the increase in PC-based components creates new challenges for HWIL test facilities such as cost-effective hardware and software selection, system configuration and integration, performance testing, and simulation verification/validation. This paper will discuss how the Digital Video Timing Analyzer (DiViTA) installed in the Aviation and Missile Research, Development and Engineering Center (AMRDEC) provides quantitative characterization data for PC-based real-time scene generation systems. An overview of the DiViTA is provided followed by details on measurement techniques, applications, and real-world examples of system benefits.

  3. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  4. Experimental verification of a model of a two-link flexible, lightweight manipulator. M.S. Thesis

    NASA Technical Reports Server (NTRS)

    Huggins, James David

    1988-01-01

    Experimental verification is presented for an assumed modes model of a large, two link, flexible manipulator designed and constructed in the School of Mechanical Engineering at the Georgia Institute of Technology. The structure was designed to have typical characteristics of a lightweight manipulator.

  5. TEST DESIGN FOR ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) OF ADD-ON NOX CONTROL UTILIZING OZONE INJECTION

    EPA Science Inventory

    The paper discusses the test design for environmental technology verification (ETV) of add-on nitrogen oxides (NOx) control utilizing ozone injection. (NOTE: ETV is an EPA-established program to enhance domestic and international market acceptance of new or improved commercially...

  6. Requirement Specifications for a Design and Verification Unit.

    ERIC Educational Resources Information Center

    Pelton, Warren G.; And Others

    A research and development activity to introduce new and improved education and training technology into Bureau of Medicine and Surgery training is recommended. The activity, called a design and verification unit, would be administered by the Education and Training Sciences Department. Initial research and development are centered on the…

  7. 78 FR 6849 - Agency Information Collection (Verification of VA Benefits) Activity Under OMB Review

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-01-31

    ... (Verification of VA Benefits) Activity Under OMB Review AGENCY: Veterans Benefits Administration, Department of... "OMB Control No. 2900-0406." SUPPLEMENTARY INFORMATION: Title: Verification of VA Benefits, VA Form 26... eliminate unlimited versions of lender-designed forms. The form also informs the lender whether or not the...

  8. Design and Verification Guidelines for Vibroacoustic and Transient Environments

    NASA Technical Reports Server (NTRS)

    1986-01-01

    Design and verification guidelines for vibroacoustic and transient environments contain many basic methods that are common throughout the aerospace industry. However, there are some significant differences in methodology between NASA/MSFC and others - both government agencies and contractors. The purpose of this document is to provide the general guidelines used by the Component Analysis Branch, ED23, at MSFC, for the application of vibroacoustic and transient technology to all launch vehicle and payload components and experiments managed by NASA/MSFC. This document is intended as a tool to be utilized by MSFC program management and their contractors as a guide for the design and verification of flight hardware.

  9. Integrated testing and verification system for research flight software design document

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.; Merilatt, R. L.; Osterweil, L. J.

    1979-01-01

    The NASA Langley Research Center is developing the MUST (Multipurpose User-oriented Software Technology) program to cut the cost of producing research flight software through a system of software support tools. The HAL/S language is the primary subject of the design. Boeing Computer Services Company (BCS) has designed an integrated verification and testing capability as part of MUST. Documentation, verification and test options are provided, with special attention to real-time, multiprocessing issues. The needs of the entire software production cycle have been considered, with effective management and reduced life-cycle costs as foremost goals. Capabilities have been included in the design for static detection of data-flow anomalies involving communicating concurrent processes. Some types of ill-formed process synchronization and deadlock are also detected statically.

  10. Automated biowaste sampling system urine subsystem operating model, part 1

    NASA Technical Reports Server (NTRS)

    Fogal, G. L.; Mangialardi, J. K.; Rosen, F.

    1973-01-01

    The urine subsystem automatically provides for the collection, volume sensing, and sampling of urine from six subjects during space flight. Verification of the subsystem design was a primary objective of the current effort, which was accomplished through the detail design, fabrication, and verification testing of an operating model of the subsystem.

  11. Integrated Verification Experiment data collected as part of the Los Alamos National Laboratory's Source Region Program. Appendix B: Surface ground motion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Weaver, T.A.; Baker, D.F.; Edwards, C.L.

    1993-10-01

    Surface ground motion was recorded for many of the Integrated Verification Experiments using standard 10-, 25- and 100-g accelerometers, force-balanced accelerometers and, for some events, using golf balls and 0.39-cm steel balls as surface inertial gauges (SIGs). This report contains the semi-processed acceleration, velocity, and displacement data for the accelerometers fielded and the individual observations for the SIG experiments. Most acceleration, velocity, and displacement records have had calibrations applied and have been deramped, offset corrected, and deglitched but are otherwise unfiltered or processed from their original records. Digital data for all of these records are stored at Los Alamos National Laboratory.

  12. Human Factors Analysis and Layout Guideline Development for the Canadian Surface Combatant (CSC) Project

    DTIC Science & Technology

    2013-04-01

    The purpose of this project was to provide the Royal Canadian Navy (RCN) with a set of guidelines on the analysis, design, and verification processes that should be used in the development of effective room layouts for RCN ships. The guidelines were developed primarily for the Canadian Surface Combatant (CSC); however, they could be applied to the design of any multiple-operator space in any RCN vessel.

  13. Making statistical inferences about software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1988-01-01

    Failure times of software undergoing random debugging can be modelled as order statistics of independent but nonidentically distributed exponential random variables. Using this model, inferences can be made about current reliability and, if debugging continues, future reliability. The model also shows the difficulty inherent in the statistical verification of very highly reliable software, such as that used by digital avionics in commercial aircraft.
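
    A minimal simulation of the model just described, assuming arbitrary per-bug detection rates: each latent bug's detection time is an independent exponential with its own rate, and the program's observed failure times are the corresponding order statistics.

    ```python
    import random

    # Simulate the debugging model sketched above: N latent bugs, where bug i has
    # its own exponential detection rate; the observed failure times are the
    # sorted detection times. Rates here are arbitrary illustrative values.

    def simulate_failure_times(rates, seed=42):
        rng = random.Random(seed)
        times = [rng.expovariate(lam) for lam in rates]
        return sorted(times)  # order statistics of nonidentical exponentials

    rates = [1.0 / (i + 1) for i in range(10)]  # bug i detected at rate 1/(i+1)
    failures = simulate_failure_times(rates)
    for k, t in enumerate(failures, 1):
        print(f"failure {k:2d} at t = {t:7.2f}")
    # The program's failure rate after k fixes is the sum of the remaining bugs'
    # rates, which shrinks as debugging proceeds -- the basis for inference about
    # current and future reliability.
    ```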

  14. Development of Product Relatedness and Distance Effects in Typical Achievers and in Children with Mathematics Learning Disabilities

    ERIC Educational Resources Information Center

    Rotem, Avital; Henik, Avishai

    2015-01-01

    The current study examined the development of two effects that have been found in single-digit multiplication errors: relatedness and distance. Typically achieving (TA) second, fourth, and sixth graders and adults, and sixth and eighth graders with a mathematics learning disability (MLD) performed a verification task. Relatedness was defined by a…

  15. Intelligent content fitting for digital publishing

    NASA Astrophysics Data System (ADS)

    Lin, Xiaofan

    2006-02-01

    One recurring problem in Variable Data Printing (VDP) is that existing content cannot satisfy the VDP task as-is, so there is a strong need for content-fitting technologies to support high-value digital publishing applications, in which text and images are the two major content types. This paper presents the meta-Autocrop framework for image fitting and the TextFlex technology for text fitting. The meta-Autocrop framework supports multiple modes: fixed aspect-ratio mode, advice mode, and verification mode. The TextFlex technology supports non-rectangular text wrapping and paragraph-based line breaking. We also demonstrate how these content-fitting technologies are utilized in the overall automated composition and layout system.

  16. On-Ground Processing of Yaogan-24 Remote Sensing Satellite Attitude Data and Verification Using Geometric Field Calibration

    PubMed Central

    Wang, Mi; Fan, Chengcheng; Yang, Bo; Jin, Shuying; Pan, Jun

    2016-01-01

    Satellite attitude accuracy is an important factor affecting the geometric processing accuracy of high-resolution optical satellite imagery. To address the problem that the accuracy of the Yaogan-24 remote sensing satellite's on-board attitude data processing is not high enough to meet its image geometry processing requirements, we developed an approach involving on-ground attitude data processing and verification against the digital orthophoto (DOM) and digital elevation model (DEM) of a geometric calibration field. The approach focuses on three modules: on-ground processing based on a bidirectional filter; overall weighted smoothing and fitting; and evaluation in the geometric calibration field. Our experimental results demonstrate that the proposed on-ground processing method is both robust and feasible, which ensures the reliability of the observation data quality and the convergence and stability of the parameter estimation model. In addition, both Euler angles and quaternions can be used to build a mathematical fitting model, while the orthogonal polynomial fitting model is more suitable for modeling the attitude parameters. Furthermore, compared to image geometric processing based on the on-board attitude data, the accuracy of uncontrolled and relative geometric positioning can be increased by about 50%. PMID:27483287
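
    As a hedged sketch of the fitting step only (not the authors' actual pipeline), the fragment below fits a synthetic attitude-angle series with an orthogonal (Legendre) polynomial using numpy; the time span, noise level, and polynomial degree are invented for illustration.

    ```python
    import numpy as np
    from numpy.polynomial import Legendre

    # Synthetic "attitude measurements": a slow drift plus sinusoidal term and noise.
    t = np.linspace(0.0, 60.0, 600)                      # time (s)
    roll = 0.01 * t + 0.002 * np.sin(0.5 * t)            # true roll angle (deg)
    meas = roll + np.random.default_rng(0).normal(0.0, 0.001, t.size)

    # Orthogonal-polynomial fitting model, as favored in the abstract above.
    fit = Legendre.fit(t, meas, deg=6)
    residual = meas - fit(t)
    print(f"RMS residual: {residual.std():.2e} deg")

    # Evaluate the fitted model at an arbitrary epoch for downstream geometric processing.
    print(f"roll(30.0 s) = {fit(30.0):.5f} deg")
    ```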

  17. Design, Implementation, and Verification of the Reliable Multicast Protocol. Thesis

    NASA Technical Reports Server (NTRS)

    Montgomery, Todd L.

    1995-01-01

    This document describes the Reliable Multicast Protocol (RMP) design, first implementation, and formal verification. RMP provides a totally ordered, reliable, atomic multicast service on top of an unreliable multicast datagram service. RMP is fully and symmetrically distributed so that no site bears an undue portion of the communications load. RMP provides a wide range of guarantees, from unreliable delivery to totally ordered delivery, to K-resilient, majority-resilient, and totally resilient atomic delivery. These guarantees are selectable on a per-message basis. RMP provides many communication options, including virtual synchrony, a publisher/subscriber model of message delivery, a client/server model of delivery, mutually exclusive handlers for messages, and mutually exclusive locks. It has been commonly believed that total ordering of messages can only be achieved at great performance expense. RMP discounts this belief. The first implementation of RMP has been shown to provide high throughput performance on Local Area Networks (LANs). For two or more destinations on a single LAN, RMP provides higher throughput than any other protocol that does not use multicast or broadcast technology. The design, implementation, and verification activities of RMP have occurred concurrently, which has allowed the verification to maintain high fidelity between the design model, the implementation model, and the verification model. The restrictions of implementation have influenced the design earlier than in normal sequential approaches, and the protocol as a whole has matured more smoothly through the inclusion of several different perspectives in product development.

  18. Fourth NASA Langley Formal Methods Workshop

    NASA Technical Reports Server (NTRS)

    Holloway, C. Michael (Compiler); Hayhurst, Kelly J. (Compiler)

    1997-01-01

    This publication consists of papers presented at NASA Langley Research Center's fourth workshop on the application of formal methods to the design and verification of life-critical systems. Topics considered include: proving properties of accidents; modeling and validating SAFER in VDM-SL; requirements analysis of real-time control systems using PVS; a tabular language for system design; and automated deductive verification of parallel systems. Also included is a fundamental hardware design in PVS.

  19. Highly efficient simulation environment for HDTV video decoder in VLSI design

    NASA Astrophysics Data System (ADS)

    Mao, Xun; Wang, Wei; Gong, Huimin; He, Yan L.; Lou, Jian; Yu, Lu; Yao, Qingdong; Pirsch, Peter

    2002-01-01

    With the increasing complexity of VLSI designs, such as the SoC (System on Chip) of an MPEG-2 video decoder with HDTV scalability in particular, simulation and verification of the full design, even at the behavioral level in HDL, often prove to be very slow and costly, and it is difficult to perform full verification until late in the design process. These tasks therefore become the bottleneck of the HDTV video decoder design procedure and strongly influence its time-to-market. In this paper, the architecture of the hardware/software interface of an HDTV video decoder is studied, and a Hardware-Software Mixed Simulation (HSMS) platform is proposed to check and correct errors in the early design stage, based on the MPEG-2 video decoding algorithm. The application of HSMS to the target system is achieved by employing several introduced approaches, which speed up the simulation and verification task without decreasing performance.

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: JOINT (NSF-EPA) VERIFICATION STATEMENT AND REPORT HYDRO COMPLIANCE MANAGEMENT, INC. HYDRO-KLEEN FILTRATION SYSTEM, 03/07/WQPC-SWP, SEPTEMBER 2003

    EPA Science Inventory

    Verification testing of the Hydro-Kleen(TM) Filtration System, a catch-basin filter designed to reduce hydrocarbon, sediment, and metals contamination from surface water flows, was conducted at NSF International in Ann Arbor, Michigan. A Hydro-Kleen(TM) system was fitted into a ...

  1. A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems

    DTIC Science & Technology

    2016-03-01

    ...insights gleaned to DoD. The autonomy community has identified significant challenges associated with the test, evaluation, verification and validation of adaptive autonomous systems; evidence-based licensure is examined as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional approaches require... the translation of natural-language requirements into testable (preferably machine-testable) specifications, and the design of architectures that treat development and verification of...

  2. Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.

    PubMed

    Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L

    1995-01-01

    This work presents the verification of an on-line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification was made in the form of an intercomparison with two independent thermoluminescent dosimetry (TLD) systems under TBI conditions.

  3. Continuous monitoring of large civil structures using a digital fiber optic motion sensor system

    NASA Astrophysics Data System (ADS)

    Hodge, Malcolm H.; Kausel, Theodore C., Jr.

    1998-03-01

    There is no single attribute which can always predict structural deterioration. Accordingly, we have developed a scheme for monitoring a wide range of incipient deterioration parameters, all based on a single motion sensor concept. In this presentation, we describe how an intrinsically low-power-consumption fiber optic harness can be permanently deployed to poll an array of optical sensors. The function and design of these simple, durable, and naturally digital sensors are described, along with the manner in which they have been configured to collect information on changes in the most important structural aspects. The SIMS system, directed primarily towards bridges, is designed to interrogate each sensor up to five thousand times per second for the life of the structure and to report sensor data back to a remote computer base for current and long-term analysis. By suitably modifying the actuation of this very precise motion sensor, SIMS is able to track bridge deck deflection and vibration, expansion joint travel, concrete and rebar corrosion, pothole development, and pier scour and tilt. Other sensors will track bolt clamp load, cable tension, and metal fatigue. All of these data are received within microseconds, which means that appropriate computer algorithms can correlate one sensor with other sensors in real time. This internal verification feature automatically enhances confidence in the system's predictive ability and alerts the user to any anomalous behavior.

  4. INDEPENDENT VERIFICATION SURVEY REPORT FOR ZONE 1 OF THE EAST TENNESSEE TECHNOLOGY PARK IN OAK RIDGE, TENNESSEE

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    King, David A.

    2012-08-16

    Oak Ridge Associated Universities (ORAU) conducted in-process inspections and independent verification (IV) surveys in support of DOE's remedial efforts in Zone 1 of East Tennessee Technology Park (ETTP) in Oak Ridge, Tennessee. Inspections concluded that the remediation contractor's soil removal and survey objectives were satisfied and the dynamic verification strategy (DVS) was implemented as designed. Independent verification (IV) activities included gamma walkover surveys and soil sample collection/analysis over multiple exposure units (EUs).

  5. Characterisation

    DTIC Science & Technology

    2007-03-01

    Characterisation. In Nanotechnology Aerospace Applications – 2006 (pp. 4-1 – 4-8). Educational Notes RTO-EN-AVT-129bis, Paper 4. Neuilly-sur-Seine, France: RTO.

  6. Space Station Furnace Facility. Volume 2: Appendix 1: Contract End Item specification (CEI), part 1

    NASA Technical Reports Server (NTRS)

    Seabrook, Craig

    1992-01-01

    This specification establishes the performance, design, development, and verification requirements for the Space Station Furnace Facility (SSFF) Core. The SSFF Core and its interfaces are defined, requirements for SSFF Core performance, design, and construction are specified, and the verification requirements are established.

  7. Integrated Formal Analysis of Timed-Triggered Ethernet

    NASA Technical Reports Server (NTRS)

    Dutertre, Bruno; Shankar, Natarajan; Owre, Sam

    2012-01-01

    We present new results related to the verification of the Timed-Triggered Ethernet (TTE) clock synchronization protocol. This work extends previous verification of TTE based on model checking. We identify a suboptimal design choice in a compression function used in clock synchronization, and propose an improvement. We compare the original design and the improved definition using the SAL model checker.
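
    The actual TTE compression function is defined by the protocol specification and is not reproduced here; the fragment below is only a generic sketch of the fault-tolerant averaging idea underlying such functions, in which the k smallest and k largest clock readings are discarded before averaging so that up to k faulty clocks cannot drag the correction arbitrarily far. All values are illustrative.

    ```python
    # Hedged sketch of a fault-tolerant clock "compression" step: discard the k
    # largest and k smallest observed deviations, then average the rest.

    def fault_tolerant_midpoint(deviations, k=1):
        s = sorted(deviations)
        trimmed = s[k:len(s) - k] if len(s) > 2 * k else s
        return sum(trimmed) / len(trimmed)

    # Clock deviations (us) observed from five synchronization masters; one is faulty.
    readings = [-1.2, -0.9, -1.0, -1.1, +40.0]
    print(f"correction = {fault_tolerant_midpoint(readings, k=1):.2f} us")
    # The +40.0 outlier is discarded, so the faulty clock cannot corrupt the result.
    ```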

  8. Verification of ICESat-2/ATLAS Science Receiver Algorithm Onboard Databases

    NASA Astrophysics Data System (ADS)

    Carabajal, C. C.; Saba, J. L.; Leigh, H. W.; Magruder, L. A.; Urban, T. J.; Mcgarry, J.; Schutz, B. E.

    2013-12-01

    NASA's ICESat-2 mission will fly the Advanced Topographic Laser Altimetry System (ATLAS) instrument on a 3-year mission scheduled to launch in 2016. ATLAS is a single-photon detection system transmitting at 532 nm with a laser repetition rate of 10 kHz and a 6-spot pattern on the Earth's surface. A set of onboard Receiver Algorithms will perform signal processing to reduce the data rate and data volume to acceptable levels. These algorithms distinguish surface echoes from the background noise, limit the daily data volume, and allow the instrument to telemeter only a small vertical region about the signal. For this purpose, three onboard databases are used: a Surface Reference Map (SRM), a Digital Elevation Model (DEM), and Digital Relief Maps (DRMs). The DEM provides minimum and maximum heights that limit the signal search region of the onboard algorithms, including a margin for errors in the source databases and onboard geolocation. Since the surface echoes will be correlated while noise will be randomly distributed, the signal location is found by histogramming the received event times and identifying the histogram bins with statistically significant counts. Once the signal location has been established, the DRMs are used to determine the vertical width of the telemetry band about the signal. The University of Texas Center for Space Research (UT-CSR) is developing the ICESat-2 onboard databases, which are currently being tested using preliminary versions and equivalent representations of elevation ranges and relief more recently developed at Goddard Space Flight Center (GSFC). Global and regional elevation models have been assessed in terms of their accuracy using ICESat geodetic control and have been used to develop equivalent representations of the onboard databases for testing against the UT-CSR databases, with special emphasis on the ice sheet regions. A series of verification checks has been implemented, including comparisons against ICESat altimetry for selected regions with tall vegetation and high relief. The extensive verification effort by the Receiver Algorithm team at GSFC is aimed at assuring that the onboard databases are sufficiently accurate. We will present the results of those assessments and verification tests, along with measures taken to implement modifications to the databases to optimize their use by the receiver algorithms. Companion presentations by McGarry et al. and Leigh et al. describe the details of the ATLAS onboard Receiver Algorithms and databases development, respectively.
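
    A toy sketch of the histogramming step described above: photon event times are binned, and bins whose counts are improbable under a Poisson background are flagged as candidate surface returns. The numbers below are illustrative, not ATLAS flight parameters.

    ```python
    import numpy as np

    # Synthetic photon events: uniform background noise plus a correlated surface
    # return clustered near 42 us (all values invented for illustration).
    rng = np.random.default_rng(7)
    background = rng.uniform(0.0, 100.0, 2000)          # noise photons (us)
    surface = rng.normal(42.0, 0.3, 300)                # correlated surface echoes
    events = np.concatenate([background, surface])

    counts, edges = np.histogram(events, bins=200, range=(0.0, 100.0))
    mean_bg = np.median(counts)                         # robust background estimate
    threshold = mean_bg + 5.0 * np.sqrt(mean_bg)        # ~5-sigma Poisson criterion
    for c, lo, hi in zip(counts, edges[:-1], edges[1:]):
        if c > threshold:
            print(f"signal bin {lo:5.1f}-{hi:5.1f} us: {c} events")
    ```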

  9. An investigation of the effects of relevant samples and a comparison of verification versus discovery based lab design

    NASA Astrophysics Data System (ADS)

    Rieben, James C., Jr.

    This study focuses on the effects of relevance and lab design on student learning within the chemistry laboratory environment. A general chemistry conductivity of solutions experiment and an upper level organic chemistry cellulose regeneration experiment were employed. In the conductivity experiment, the two main variables studied were the effect of relevant (or "real world") samples on student learning and a verification-based lab design versus a discovery-based lab design. With the cellulose regeneration experiment, the effect of a discovery-based lab design vs. a verification-based lab design was the sole focus. Evaluation surveys consisting of six questions were used at three different times to assess student knowledge of experimental concepts. In the general chemistry laboratory portion of this study, four experimental variants were employed to investigate the effect of relevance and lab design on student learning. These variants consisted of a traditional (or verification) lab design, a traditional lab design using "real world" samples, a new lab design employing real world samples/situations using unknown samples, and the new lab design using real world samples/situations that were known to the student. Data used in this analysis were collected during the Fall 08, Winter 09, and Fall 09 terms. For the second part of this study a cellulose regeneration experiment was employed to investigate the effects of lab design. A demonstration creating regenerated cellulose "rayon" was modified and converted to an efficient and low-waste experiment. In the first variant students tested their products and verified a list of physical properties. In the second variant, students filled in a blank physical property chart with their own experimental results for the physical properties. Results from the conductivity experiment show significant student learning of the effects of concentration on conductivity and how to use conductivity to differentiate solution types with the use of real world samples. In the organic chemistry experiment, results suggest that the discovery-based design improved student retention of the chain length differentiation by physical properties relative to the verification-based design.

  10. A software engineering approach to expert system design and verification

    NASA Technical Reports Server (NTRS)

    Bochsler, Daniel C.; Goodwin, Mary Ann

    1988-01-01

    Software engineering design and verification methods for developing expert systems are not yet well defined. Integration of expert system technology into software production environments will require effective software engineering methodologies to support the entire life cycle of expert systems. The software engineering methods used to design and verify an expert system, RENEX, are discussed. RENEX demonstrates autonomous rendezvous and proximity operations, including replanning trajectory events and subsystem fault detection, onboard a space vehicle during flight. The RENEX designers utilized a number of software engineering methodologies to deal with the complex problems inherent in this system. An overview is presented of the methods utilized. Details of the verification process receive special emphasis. The benefits and weaknesses of the methods for supporting the development life cycle of expert systems are evaluated, and recommendations are made based on the overall experiences with the methods.

  11. Verification of the FtCayuga fault-tolerant microprocessor system. Volume 2: Formal specification and correctness theorems

    NASA Technical Reports Server (NTRS)

    Bickford, Mark; Srivas, Mandayam

    1991-01-01

    Presented here is a formal specification and verification of a property of a quadruply redundant fault-tolerant microprocessor system design. A complete listing of the formal specification of the system and the correctness theorems that were proved is given. The system performs the task of obtaining interactive consistency among the processors using a special instruction on the processors. The design is based on an algorithm proposed by Pease, Shostak, and Lamport. The property was verified using a computer-aided design verification tool, Spectool, and the theorem prover, Clio; it ensures that an execution of the special instruction by the processors correctly accomplishes interactive consistency, provided certain preconditions hold. A major contribution of the work is the demonstration of a significant fault-tolerant hardware design that is mechanically verified by a theorem prover.

  12. Watermarking and copyright labeling of printed images

    NASA Astrophysics Data System (ADS)

    Hel-Or, Hagit Z.

    2001-07-01

    Digital watermarking is a labeling technique for digital images which embeds a code into the digital data so that the data are marked. Previously developed watermarking techniques deal with online digital data and have been designed to withstand digital attacks such as image processing, image compression and geometric transformations. However, one must also consider the readily available attack of printing and scanning, under which the available watermarking techniques are not reliable. In fact, one must consider the availability of watermarks for printed images as well as for digital images. An important issue is to intercept and prevent forgery in printed material such as currency notes and bank checks, and to track and validate sensitive and secret printed material. Watermarking in such printed material can be used not only for verification of ownership but as an indicator of the date and type of transaction or the date and source of the printed data. In this work we propose a method of embedding watermarks in printed images by inherently taking advantage of the printing process. The method is visually unobtrusive to the printed image, and the watermark is easily extracted and is robust under reconstruction errors. The decoding algorithm is automatic given the watermarked image.
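
    The paper's mark is embedded through the printing process itself, which the abstract does not detail; as a generic illustration of watermark embedding and extraction, the sketch below uses a plain least-significant-bit scheme - which, notably, would not survive print-and-scan, the very attack the paper addresses. All arrays and bit strings are invented.

    ```python
    import numpy as np

    # Generic illustration of embedding/extracting a bit string in an image's
    # least-significant bits. NOTE: plain LSB marks do NOT survive printing and
    # scanning; the paper embeds the mark in the printing process instead.

    def embed(image, bits):
        flat = image.flatten()
        flat[:len(bits)] = (flat[:len(bits)] & 0xFE) | bits  # overwrite LSBs
        return flat.reshape(image.shape)

    def extract(image, n):
        return image.flatten()[:n] & 1

    img = np.random.default_rng(0).integers(0, 256, (64, 64), dtype=np.uint8)
    mark = np.array([1, 0, 1, 1, 0, 0, 1, 0], dtype=np.uint8)
    marked = embed(img.copy(), mark)
    assert np.array_equal(extract(marked, mark.size), mark)
    print("watermark recovered:", extract(marked, mark.size))
    ```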

  13. Research on key technology of the verification system of steel rule based on vision measurement

    NASA Astrophysics Data System (ADS)

    Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun

    2018-01-01

    The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, yields low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG1-1999, the Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measuring results strongly prove that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
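
    A minimal sketch of the pixel-equivalent idea: a certified reference length imaged by the camera yields a millimeters-per-pixel factor, which then converts pixel spans measured on the rule into lengths. The numbers are invented, and the paper's calibration method differs in detail.

    ```python
    # Pixel-equivalent calibration: image a certified reference length, derive
    # mm-per-pixel, then convert measured pixel distances on the steel rule.
    # All values below are illustrative.

    ref_length_mm = 10.0        # certified reference interval
    ref_length_px = 2012.4      # measured span of that interval in the image
    pixel_equiv = ref_length_mm / ref_length_px      # mm per pixel

    graduation_span_px = 201.6  # measured span between two rule graduations
    span_mm = graduation_span_px * pixel_equiv
    error_um = (span_mm - 1.0) * 1000                # nominal spacing is 1 mm
    print(f"pixel equivalent: {pixel_equiv:.6f} mm/px")
    print(f"graduation spacing: {span_mm:.4f} mm (error {error_um:+.1f} um)")
    ```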

  14. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  15. Regional agriculture surveys using ERTS-1 data

    NASA Technical Reports Server (NTRS)

    Draeger, W. C.; Nichols, J. D.; Benson, A. S.; Larrabee, D. G.; Jenkus, W. M.; Hay, C. M.

    1974-01-01

    The Center for Remote Sensing Research has conducted studies designed to evaluate the potential application of ERTS data in performing agricultural inventories and to develop efficient methods of data handling and analysis useful in the operational context for performing large-area surveys. This work has resulted in the development of an integrated system utilizing both human and computer analysis of ground, aerial, and space imagery, which has been shown to be very efficient for regional crop acreage inventories. The technique involves: (1) the delineation of ERTS images into relatively homogeneous strata by human interpreters; (2) the point-by-point classification of the area within each stratum on the basis of crop type using a human/machine interactive digital image processing system; and (3) a multistage sampling procedure for the collection of supporting aerial and ground data used in the adjustment and verification of the classification results.

  16. Transonic flow visualization using holographic interferometry

    NASA Technical Reports Server (NTRS)

    Bryanston-Cross, Peter J.

    1987-01-01

    An account is given of some applications of holographic interferometry to the visualization of transonic flows. In the case of compressor shock visualization, the method is used regularly and has moved from being a research department invention to a design test tool. With the implementation of automatic processing and simple digitization systems, holographic vibrational analysis has also moved into routine nondestructive testing. The code verification interferograms were instructive, but the main turbomachinery interest is now in three-dimensional flows. A major data interpretation effort will be required to compute tomographically the three-dimensional flow around the leading or trailing edges of a rotating blade row. The bolt-on approach shows the potential application to current unsteady flows of interest, in particular the rotor-passing and vortex-interaction effects experienced by the new generation of unducted fans. The turbocharger tests present a new area for the application of holography.

  17. Note: An improved calibration system with phase correction for electronic transformers with digital output.

    PubMed

    Cheng, Han-miao; Li, Hong-bin

    2015-08-01

    The existing electronic transformer calibration systems employing data acquisition cards cannot satisfy some practical applications, because the calibration systems exhibit phase measurement errors when working in the mode of receiving external synchronization signals. This paper proposes an improved calibration system scheme with phase correction to improve the phase measurement accuracy. We employ an NI PCI-4474 card to design a calibration system, and the system has the potential to receive external synchronization signals and reach extremely high accuracy classes. Accuracy verification has been carried out at the China Electric Power Research Institute, and the results demonstrate that the system surpasses accuracy class 0.05. Furthermore, this system has been used to test the harmonics measurement accuracy of all-fiber optical current transformers. In the same process, we used an existing calibration system, and a comparison of the test results is presented. The improved system is suitable for the intended applications.

  18. Using Adaptive Turnaround Documents to Electronically Acquire Structured Data in Clinical Settings

    PubMed Central

    Biondich, Paul G.; Anand, Vibha; Downs, Stephen M.; McDonald, Clement J.

    2003-01-01

    We developed adaptive turnaround documents (ATDs) to address longstanding challenges inherent in acquiring structured data at the point of care. These computer-generated paper forms both request and receive patient-tailored information specifically for electronic storage. In our pilot, we evaluated the usability, accuracy, and user acceptance of an ATD designed to enrich a pediatric preventative care decision support system. The system had an overall digit recognition rate of 98.6% (95% CI: 98.3 to 98.9) and a mark-sense accuracy of 99.2% (95% CI: 99.1 to 99.3). More importantly, the system reliably extracted all data from 56.6% (95% CI: 53.3 to 59.9) of our pilot forms without the need for a verification step. These results translate to a minimal workflow burden for end users. This suggests that ATDs can serve as an inexpensive, workflow-sensitive means of structured data acquisition in the clinical setting. PMID:14728139

  19. Miniaturized star tracker for micro spacecraft with high angular rate

    NASA Astrophysics Data System (ADS)

    Li, Jianhua; Li, Zhifeng; Niu, Zhenhong; Liu, Jiaqi

    2017-10-01

    There is a clear need for miniaturized, lightweight, accurate and inexpensive star trackers for spacecraft with large angular rates. To meet these new constraints, the Beijing Institute of Space Long March Vehicle has designed, built and flown a low-cost miniaturized star tracker that provides autonomous ("lost in space") inertial attitude determination, 2 Hz 3-axis star tracking, and digital imaging with embedded compression. A detector with high sensitivity is adopted to meet the dynamic and miniaturization requirements. A Sun and Moon avoidance method based on the calculation of the Sun's and Moon's vectors by astronomical theory is proposed. The produced prototype weighs 0.84 kg and can be used on a spacecraft with a 6°/s angular rate. The average angle measurement error is less than 43 arcseconds. Ground verification and application of the star tracker during the pick-up flight test showed that the capability of the product meets the requirements.

  20. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
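
    A hedged sketch of the proposed PCA-plus-ANN pipeline using scikit-learn, with random arrays standing in for the DRR training images and cine EPID frames; the dimensions, component counts, and network size are illustrative, not the paper's settings.

    ```python
    import numpy as np
    from sklearn.decomposition import PCA
    from sklearn.neural_network import MLPClassifier
    from sklearn.pipeline import make_pipeline

    # Synthetic stand-ins for training images (class 0: tumor inside the beam
    # aperture, class 1: outside), flattened to pixel vectors.
    rng = np.random.default_rng(0)
    n, pixels = 400, 32 * 32
    X = rng.normal(size=(n, pixels))
    y = rng.integers(0, 2, n)
    X[y == 1, :50] += 1.0        # give class 1 a detectable pattern

    # PCA reduces dimensionality before the ANN classifies, as in the abstract.
    clf = make_pipeline(PCA(n_components=20),
                        MLPClassifier(hidden_layer_sizes=(16,),
                                      max_iter=500, random_state=0))
    clf.fit(X[:300], y[:300])    # "DRR" training samples
    print("held-out accuracy:", clf.score(X[300:], y[300:]))  # "cine EPID" frames
    ```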

  1. Post-OPC verification using a full-chip pattern-based simulation verification method

    NASA Astrophysics Data System (ADS)

    Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary

    2005-11-01

    In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e., 0.13 um, 0.11 um and 90 nm, were used in the investigation. Although it has been proven that in most cases our OPC technology is robust in general, given the variety of tape-outs with complicated design styles and technologies it is difficult to develop a "complete or bullet-proof" OPC algorithm that would cover every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors could cost significantly in manufacturing - reticle, wafer process, and, more importantly, production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinch or bridge), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss the differentiation of the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: (1) accuracy: superior inspection algorithms, down to 1 nm accuracy with the new pattern-based approach; (2) high-speed performance: pattern-centric algorithms to give the best full-chip inspection efficiency; (3) powerful analysis capability: flexible error distribution, grouping, interactive viewing and hierarchical pattern extraction to narrow down to unique patterns/cells.

  2. Loads and Structural Dynamics Requirements for Spaceflight Hardware

    NASA Technical Reports Server (NTRS)

    Schultz, Kenneth P.

    2011-01-01

    The purpose of this document is to establish requirements relating to the loads and structural dynamics technical discipline for NASA and commercial spaceflight launch vehicle and spacecraft hardware. Requirements are defined for the development of structural design loads and recommendations regarding methodologies and practices for the conduct of load analyses are provided. As such, this document represents an implementation of NASA STD-5002. Requirements are also defined for structural mathematical model development and verification to ensure sufficient accuracy of predicted responses. Finally, requirements for model/data delivery and exchange are specified to facilitate interactions between Launch Vehicle Providers (LVPs), Spacecraft Providers (SCPs), and the NASA Technical Authority (TA) providing insight/oversight and serving in the Independent Verification and Validation role. In addition to the analysis-related requirements described above, a set of requirements are established concerning coupling phenomena or other interaction between structural dynamics and aerodynamic environments or control or propulsion system elements. Such requirements may reasonably be considered structure or control system design criteria, since good engineering practice dictates consideration of and/or elimination of the identified conditions in the development of those subsystems. The requirements are included here, however, to ensure that such considerations are captured in the design space for launch vehicles (LV), spacecraft (SC) and the Launch Abort Vehicle (LAV). The requirements in this document are focused on analyses to be performed to develop data needed to support structural verification. As described in JSC 65828, Structural Design Requirements and Factors of Safety for Spaceflight Hardware, implementation of the structural verification requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item for the applicable requirements. The requirement for and expected contents of the SVP are defined in JSC 65828. The SVP may also document unique verifications that meet or exceed these requirements with Technical Authority approval.

  3. Sparse distributed memory: Principles and operation

    NASA Technical Reports Server (NTRS)

    Flynn, M. J.; Kanerva, P.; Bhadkamkar, N.

    1989-01-01

    Sparse distributed memory is a generalized random access memory (RAM) for long (1000 bit) binary words. Such words can be written into and read from the memory, and they can also be used to address the memory. The main attribute of the memory is sensitivity to similarity, meaning that a word can be read back not only by giving the original write address but also by giving one close to it as measured by the Hamming distance between addresses. Large memories of this kind are expected to have wide use in speech recognition and scene analysis, in signal detection and verification, and in the adaptive control of automated equipment - in general, in dealing with real-world information in real time. The memory can be realized as a simple, massively parallel computer. Digital technology has reached a point where building large memories is becoming practical. Major design issues faced in building such memories were resolved. The design of a prototype memory with 256-bit addresses and from 8K to 128K locations for 256-bit words is described. A key aspect of the design is the extensive use of dynamic RAM and other standard components.
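
    A toy autoassociative sketch of the mechanism just described, assuming invented sizes: hard locations have fixed random addresses, a write increments or decrements counters at every location within a Hamming radius of the write address, and a read sums and thresholds the counters of the locations activated by the (possibly noisy) read address.

    ```python
    import numpy as np

    rng = np.random.default_rng(1)
    N_LOC, N_BITS, RADIUS = 2000, 256, 112   # toy sizes, not the prototype's

    loc_addr = rng.integers(0, 2, (N_LOC, N_BITS), dtype=np.int8)  # hard locations
    counters = np.zeros((N_LOC, N_BITS), dtype=np.int32)

    def activated(addr):
        # Locations within Hamming radius RADIUS of the given address.
        return np.count_nonzero(loc_addr != addr, axis=1) <= RADIUS

    def write(addr, word):
        counters[activated(addr)] += 2 * word - 1   # +1 for 1-bits, -1 for 0-bits

    def read(addr):
        return (counters[activated(addr)].sum(axis=0) > 0).astype(np.int8)

    word = rng.integers(0, 2, N_BITS, dtype=np.int8)
    write(word, word)                                # autoassociative store
    noisy = word.copy()
    flip = rng.choice(N_BITS, 20, replace=False)
    noisy[flip] ^= 1                                 # probe with 20 flipped bits
    print("bits recovered:", np.count_nonzero(read(noisy) == word), "of", N_BITS)
    ```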

  4. Design of verification platform for wireless vision sensor networks

    NASA Astrophysics Data System (ADS)

    Ye, Juanjuan; Shang, Fei; Yu, Chuang

    2017-08-01

    At present, the majority of research on wireless vision sensor networks (WVSNs) still remains at the software simulation stage, and very few verification platforms for WVSNs are available for use. This situation seriously restricts the transformation of WVSNs from theoretical research to practical application. Therefore, it is necessary to study the construction of a verification platform for WVSNs. This paper combines a wireless transceiver module, a visual information acquisition module and a power acquisition module, designs a high-performance wireless vision sensor node whose core is an ARM11 microprocessor, and selects AODV as the routing protocol to set up a verification platform called AdvanWorks for WVSNs. Experiments show that AdvanWorks can successfully achieve image acquisition, coding, and wireless transmission, and can obtain the effective distance parameters between nodes, which lays a good foundation for the follow-up application of WVSNs.

  5. Deploying Crowd-Sourced Formal Verification Systems in a DoD Network

    DTIC Science & Technology

    2013-09-01

    In 2014, cyber attacks on critical infrastructure are expected to increase... CSFV systems on the Internet, possibly using cloud infrastructure (Dean, 2013). By using Amazon Compute Cloud (EC2) systems, DARPA will use ordinary... through standard access methods. Those clients could be mobile phones, laptops, netbooks, tablet computers or personal digital assistants (PDAs) (Smoot...

  6. Military applications of automatic speech recognition and future requirements

    NASA Technical Reports Server (NTRS)

    Beek, Bruno; Cupples, Edward J.

    1977-01-01

    An updated summary of the state of the art of automatic speech recognition and its relevance to military applications is provided. A number of potential systems for military applications are under development, including: (1) digital narrowband communication systems; (2) automatic speech verification; (3) an on-line cartographic processing unit; (4) word recognition for militarized tactical data systems; and (5) voice recognition and synthesis for aircraft cockpits.

  7. Serious simulation game development for energy transition education using integrated framework game design

    NASA Astrophysics Data System (ADS)

    Destyanto, A. R.; Putri, O. A.; Hidayatno, A.

    2017-11-01

    Due to the advantages that serious simulation games offer, many areas of study, including energy, have used serious simulation games as instruments. However, serious simulation games in the field of energy transition have received little attention. In this study, a serious simulation game is developed and tested as a public education activity about energy transition, namely Indonesia's program of conversion from oil to natural gas. The aim of the game development is to create understanding and awareness of the importance of energy transition in society and thereby accelerate the energy transition process in Indonesia; since 1987, the energy transition program has not achieved its conversion target, owing to the lack of public education about energy transition. Developed as a digital serious simulation game following the framework of integrated game design, the Transergy game has been tested with 15 users and then analysed. The results of the verification and validation of the game show that Transergy helps users understand, and triggers the perceived need for, oil to natural gas conversion.

  8. An Embedded Sensor Node Microcontroller with Crypto-Processors.

    PubMed

    Panić, Goran; Stecklina, Oliver; Stamenković, Zoran

    2016-04-27

    Wireless sensor network applications range from industrial automation and control, agricultural and environmental protection, to surveillance and medicine. In most applications, data are highly sensitive and must be protected from any type of attack and abuse. Security challenges in wireless sensor networks are mainly defined by the power and computing resources of sensor devices, memory size, quality of radio channels and susceptibility to physical capture. In this article, an embedded sensor node microcontroller designed to support sensor network applications with severe security demands is presented. It features a low-power 16-bit processor core supported by a number of hardware accelerators designed to perform complex operations required by advanced crypto algorithms. The microcontroller integrates an embedded Flash and an 8-channel 12-bit analog-to-digital converter, making it a good solution for low-power sensor nodes. The article discusses the most important security topics in wireless sensor networks and presents the architecture of the proposed hardware solution. Furthermore, it gives details on the chip implementation, verification and hardware evaluation. Finally, the chip power dissipation and performance figures are estimated and analyzed.

  9. An Embedded Sensor Node Microcontroller with Crypto-Processors

    PubMed Central

    Panić, Goran; Stecklina, Oliver; Stamenković, Zoran

    2016-01-01

    Wireless sensor network applications range from industrial automation and control, agricultural and environmental protection, to surveillance and medicine. In most applications, data are highly sensitive and must be protected from any type of attack and abuse. Security challenges in wireless sensor networks are mainly defined by the power and computing resources of sensor devices, memory size, quality of radio channels and susceptibility to physical capture. In this article, an embedded sensor node microcontroller designed to support sensor network applications with severe security demands is presented. It features a low-power 16-bit processor core supported by a number of hardware accelerators designed to perform complex operations required by advanced crypto algorithms. The microcontroller integrates an embedded Flash and an 8-channel 12-bit analog-to-digital converter, making it a good solution for low-power sensor nodes. The article discusses the most important security topics in wireless sensor networks and presents the architecture of the proposed hardware solution. Furthermore, it gives details on the chip implementation, verification and hardware evaluation. Finally, the chip power dissipation and performance figures are estimated and analyzed. PMID:27128925

  10. The scheme machine: A case study in progress in design derivation at system levels

    NASA Technical Reports Server (NTRS)

    Johnson, Steven D.

    1995-01-01

    The Scheme Machine is one of several design projects of the Digital Design Derivation group at Indiana University. It differs from the other projects in its focus on issues of system design and its connection to surrounding research in programming language semantics, compiler construction, and programming methodology underway at Indiana and elsewhere. The genesis of the project dates to the early 1980s, when digital design derivation research branched from the surrounding research effort in programming languages. Both branches have continued to develop in parallel, with this particular project serving as a bridge. However, by 1990 there remained little real interaction between the branches, and recently we have undertaken to reintegrate them. On the software side, researchers have refined a mathematically rigorous (but not mechanized) treatment starting with the fully abstract semantic definition of Scheme and resulting in an efficient implementation consisting of a compiler and virtual machine model, the latter typically realized with a general purpose microprocessor. The derivation includes a number of sophisticated factorizations and representations and is also a deep example of the underlying engineering methodology. The hardware research has created a mechanized algebra supporting the tedious and massive transformations often seen at lower levels of design. This work has progressed to the point that large-scale devices, such as processors, can be derived from first-order finite state machine specifications. This is roughly where the language-oriented research stops; thus, together, the two efforts establish a thread from the highest levels of abstract specification to detailed digital implementation. The Scheme Machine project challenges hardware derivation research in several ways, although the individual components of the system are of a similar scale to those we have worked with before. The machine has a custom dual-ported memory to support garbage collection. It consists of four tightly coupled processes--processor, collector, allocator, memory--with a very non-trivial synchronization relationship. Finally, there are deep issues of representation for the run-time objects of a symbolic processing language. The research centers on verification through integrated formal reasoning systems, but is also involved with modeling and prototyping environments. Since the derivation algebra is based on an executable modeling language, there is an opportunity to incorporate design animation in the design process. We are looking for ways to move smoothly and incrementally from executable specifications into hardware realization. For example, we can run the garbage collector specification, a Scheme program, directly against the physical memory prototype, and similarly, the instruction processor model against the heap implementation.

  11. New Physical Optics Method for Curvilinear Refractive Surfaces and its Verification in the Design and Testing of W-band Dual-Aspheric Lenses

    DTIC Science & Technology

    2013-10-01

    Altintas, A.; Yurchenko, V. (EEE Department, Bilkent University, Ankara). [Only fragments of the title page and reference list survive in this record.]

  12. The Design and Evaluation of Class Exercises as Active Learning Tools in Software Verification and Validation

    ERIC Educational Resources Information Center

    Wu, Peter Y.; Manohar, Priyadarshan A.; Acharya, Sushil

    2016-01-01

    It is well known that interesting questions can stimulate thinking and invite participation. Class exercises are designed to make use of questions to engage students in active learning. In a project toward building a community skilled in software verification and validation (SV&V), we critically review and further develop course materials in…

  13. Land use/land cover mapping (1:25000) of Taiwan, Republic of China by automated multispectral interpretation of LANDSAT imagery

    NASA Technical Reports Server (NTRS)

    Sung, Q. C.; Miller, L. D.

    1977-01-01

    Three methods were tested for collecting the training sets needed to establish the spectral signatures of the land uses/land covers sought, owing to the difficulty of retrospectively collecting representative ground control data. Computer preprocessing techniques applied to the digital images to improve the final classification results were geometric corrections, spectral band or image ratioing, and statistical cleaning of the representative training sets. A minimal level of statistical verification was made based upon comparisons between the airphoto estimates and the classification results. The verifications provided further support for the selection of MSS bands 5 and 7. They also indicated that the maximum likelihood ratioing technique achieves classification results more consistent with the airphoto estimates than does stepwise discriminant analysis.

  14. The Maximal Oxygen Uptake Verification Phase: a Light at the End of the Tunnel?

    PubMed

    Schaun, Gustavo Z

    2017-12-08

    Commonly performed during an incremental test to exhaustion, maximal oxygen uptake (V̇O2max) assessment has become a recurring practice in clinical and experimental settings. To validate the test, several criteria were proposed. In this context, the plateau in oxygen uptake (V̇O2) is inconsistent in its frequency, reducing its usefulness as a robust method to determine "true" V̇O2max. Moreover, secondary criteria previously suggested, such as expiratory exchange ratios or percentages of maximal heart rate, are highly dependent on protocol design and often are achieved at V̇O2 percentages well below V̇O2max. Thus, an alternative method termed the verification phase was proposed. Currently, it is clear that the verification phase can be a practical and sensitive method to confirm V̇O2max; however, procedures to conduct it are not standardized across the literature and no previous research has tried to summarize how it has been employed. Therefore, this review updates the knowledge on the verification phase and provides suggestions on how it can be performed (e.g. intensity, duration, recovery) according to population and protocol design. Future studies should focus on identifying a verification protocol feasible for different populations and on comparing square-wave and multistage verification phases. Additionally, studies assessing verification phases in different patient populations are still warranted.
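
    As the review notes, verification-phase procedures are not standardized; the underlying decision rule is nonetheless simple. The sketch below assumes a hypothetical 3% tolerance, a value chosen for illustration rather than taken from the review.

      # Toy decision rule for a verification phase: confirm VO2max when the
      # verification-bout peak VO2 does not exceed the incremental-test value
      # by more than a tolerance. The 3% tolerance is an assumption.

      def confirms_vo2max(vo2_incremental, vo2_verification, tol=0.03):
          """True if the verification VO2 is within `tol` of the incremental value."""
          return vo2_verification <= vo2_incremental * (1.0 + tol)

      print(confirms_vo2max(52.0, 52.8))  # True: 52.8 <= 52.0 * 1.03
      print(confirms_vo2max(52.0, 55.0))  # False: verification exceeded by >3%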

  15. Multi-canister overpack project -- verification and validation, MCNP 4A

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Goldmann, L.H.

    This supporting document contains the software verification and validation (V and V) package used for Phase 2 design of the Spent Nuclear Fuel Multi-Canister Overpack. V and V packages for both ANSYS and MCNP are included. Description of Verification Run(s): This software requires that it be compiled specifically for the machine it is to be used on. Therefore, to facilitate the verification process, the software automatically runs 25 sample problems to ensure proper installation and compilation. Once the runs are completed, the software checks for verification by performing a file comparison on the new output file and the old output file. Any differences between any of the files will cause a verification error. Due to the manner in which the verification is completed, a verification error does not necessarily indicate a problem; it indicates that a closer look at the output files is needed to determine the cause of the error.
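
    A minimal sketch of this style of installation check, assuming hypothetical file names and plain-text outputs (the record does not specify MCNP's actual file layout):

      # Run the sample problems, then diff each new output against a stored
      # reference. Real MCNP outputs contain run-date and timing lines, so a
      # raw diff can flag benign differences -- which is why the record says a
      # verification error calls for manual inspection rather than a hard fail.
      import difflib, pathlib

      def compare_outputs(new_path, old_path):
          new = pathlib.Path(new_path).read_text().splitlines()
          old = pathlib.Path(old_path).read_text().splitlines()
          diffs = [d for d in difflib.unified_diff(old, new, lineterm='')
                   if d.startswith(('+', '-')) and not d.startswith(('+++', '---'))]
          return diffs  # empty list -> files agree line for line

      for i in range(1, 26):                      # the 25 sample problems
          d = compare_outputs(f'out{i:02d}.new', f'out{i:02d}.ref')
          if d:
              print(f'sample {i}: {len(d)} differing lines -- inspect manually')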

  16. A DVE Time Management Simulation and Verification Platform Based on Causality Consistency Middleware

    NASA Astrophysics Data System (ADS)

    Zhou, Hangjun; Zhang, Wei; Peng, Yuxing; Li, Sikun

    When designing a time management algorithm for DVEs, researchers are often made inefficient by the distraction of implementing the trivial but fundamental details of simulation and verification. A platform that already realizes these details is therefore desirable, but to our knowledge this has not been achieved in any published work. In this paper, we are the first to design and realize a DVE time management simulation and verification platform providing exactly the same interfaces as those defined by the HLA Interface Specification. Moreover, our platform is built on a newly designed causality consistency middleware and can offer a comparison of three kinds of time management services: CO, RO and TSO. The experimental results show that the implementation of the platform incurs only a small overhead, and that its efficiency lets researchers focus solely on improving their algorithms.
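
    A minimal illustration of two of the delivery disciplines compared on the platform, receive order (RO) versus timestamp order (TSO); causal order (CO) is omitted since it requires dependency metadata. The event data are invented.

      # RO delivers events in arrival order; TSO buffers them in a priority
      # queue and delivers in timestamp order.
      import heapq

      arrivals = [(12, 'fire'), (7, 'move'), (9, 'turn')]  # (timestamp, event)

      # RO: deliver exactly as messages arrived.
      ro = [e for _, e in arrivals]

      # TSO: buffer in a heap, deliver by timestamp.
      heap = []
      for ts, e in arrivals:
          heapq.heappush(heap, (ts, e))
      tso = [heapq.heappop(heap)[1] for _ in range(len(heap))]

      print(ro)   # ['fire', 'move', 'turn']
      print(tso)  # ['move', 'turn', 'fire']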

  17. Operational verification of a 40-MHz annular array transducer

    PubMed Central

    Ketterling, Jeffrey A.; Ramachandran, Sarayu; Aristizábal, Orlando

    2006-01-01

    An experimental system to take advantage of the imaging capabilities of a 5-ring polyvinylidene fluoride (PVDF) based annular array is presented. The array has a 6 mm total aperture and a 12 mm geometric focus. The experimental system is designed to pulse a single element of the array and then digitize the received data of all array channels simultaneously. All transmit/receive pairs are digitized and then the data are post-processed with a synthetic focusing technique to achieve an enhanced depth of field (DOF). The performance of the array is experimentally tested with a wire phantom consisting of 25-μm diameter wires diagonally spaced at 1 mm by 1 mm intervals. The phantom permitted the efficacy of the synthetic focusing algorithm to be tested and was also used for two-way beam characterization. Experimental results are compared to a spatial impulse response method beam simulation. After synthetic focusing, the two-way echo amplitude was enhanced over the range of 8 to 19 mm and the 6-dB DOF spanned from 9 to 15 mm. For a wire at a fixed axial depth, the relative time delays between transmit/receive ring pairs agreed with theoretical predictions to within ± 2 ns. To further test the system, B-mode images of an excised bovine eye are rendered. PMID:16555771
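
    The relative transmit/receive delays mentioned above follow from simple geometry. The sketch below assumes a flat-annulus model and invented ring radii, so it illustrates only the calculation, not the paper's focused-aperture values.

      # For a wire on the beam axis at depth z, each propagation path is the
      # hypotenuse from a ring of radius r to the wire; the two-way delay for
      # a transmit/receive ring pair is the sum of the two paths over c.
      import math

      C = 1540.0                                  # speed of sound, m/s
      RINGS_M = [0.0007, 0.0014, 0.0020, 0.0025, 0.0029]  # assumed radii, m

      def two_way_delay(r_tx, r_rx, z):
          """Propagation time (s): tx ring -> on-axis wire -> rx ring."""
          return (math.hypot(r_tx, z) + math.hypot(r_rx, z)) / C

      z = 0.012                                   # wire at 12 mm depth
      ref = two_way_delay(RINGS_M[0], RINGS_M[0], z)
      for i, r_tx in enumerate(RINGS_M):          # receive on ring 0
          dt_ns = (two_way_delay(r_tx, RINGS_M[0], z) - ref) * 1e9
          print(f'tx ring {i}, rx ring 0: relative delay {dt_ns:+.1f} ns')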

  18. Bimodal Biometric Verification Using the Fusion of Palmprint and Infrared Palm-Dorsum Vein Images

    PubMed Central

    Lin, Chih-Lung; Wang, Shih-Hung; Cheng, Hsu-Yung; Fan, Kuo-Chin; Hsu, Wei-Lieh; Lai, Chin-Rong

    2015-01-01

    In this paper, we present a reliable and robust biometric verification method based on bimodal physiological characteristics of palms, including the palmprint and palm-dorsum vein patterns. The proposed method consists of five steps: (1) automatically aligning and cropping the same region of interest from different palm or palm-dorsum images; (2) applying the digital wavelet transform and inverse wavelet transform to fuse palmprint and vein pattern images; (3) extracting the line-like features (LLFs) from the fused image; (4) obtaining multiresolution representations of the LLFs by using a multiresolution filter; and (5) using a support vector machine to verify the multiresolution representations of the LLFs. The proposed method possesses four advantages: first, both modal images are captured in peg-free scenarios to improve the user-friendliness of the verification device. Second, palmprint and vein pattern images are captured using a low-resolution digital scanner and infrared (IR) camera. The use of low-resolution images results in a smaller database. In addition, the vein pattern images are captured through the invisible IR spectrum, which improves antispoofing. Third, since the physiological characteristics of palmprint and vein pattern images are different, a hybrid fusing rule can be introduced to fuse the decomposition coefficients of different bands. The proposed method fuses decomposition coefficients at different decomposed levels, with different image sizes, captured from different sensor devices. Finally, the proposed method operates automatically and hence no parameters need to be set manually. Three thousand palmprint images and 3000 vein pattern images were collected from 100 volunteers to verify the validity of the proposed method. The results show a false rejection rate of 1.20% and a false acceptance rate of 1.56%, demonstrating the validity and excellent performance of the proposed method compared with other methods. PMID:26703596
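
    Step (2) of the pipeline, wavelet-domain fusion, can be sketched as follows. The sketch uses the pywt library and a simple max-magnitude rule in place of the paper's hybrid band-dependent rule, with random arrays standing in for real palmprint and vein images.

      # Decompose both images with a 2-D DWT, merge coefficients band by
      # band, and reconstruct the fused image with the inverse transform.
      import numpy as np
      import pywt

      def fuse(img_a, img_b, wavelet='haar'):
          ca, (ha, va, da) = pywt.dwt2(img_a, wavelet)
          cb, (hb, vb, db) = pywt.dwt2(img_b, wavelet)
          pick = lambda x, y: np.where(np.abs(x) >= np.abs(y), x, y)
          approx = (ca + cb) / 2.0              # average the approximation band
          details = tuple(pick(x, y) for x, y in ((ha, hb), (va, vb), (da, db)))
          return pywt.idwt2((approx, details), wavelet)

      palm = np.random.rand(128, 128)           # placeholder palmprint image
      vein = np.random.rand(128, 128)           # placeholder vein-pattern image
      fused = fuse(palm, vein)
      print(fused.shape)                        # (128, 128)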

  20. Optimized Temporal Monitors for SystemC

    NASA Technical Reports Server (NTRS)

    Tabakov, Deian; Rozier, Kristin Y.; Vardi, Moshe Y.

    2012-01-01

    SystemC is a modeling language built as an extension of C++. Its growing popularity and the increasing complexity of designs have motivated research efforts aimed at the verification of SystemC models using assertion-based verification (ABV), where the designer asserts properties that capture the design intent in a formal language such as PSL or SVA. The model can then be verified against the properties using runtime or formal verification techniques. In this paper we focus on automated generation of runtime monitors from temporal properties. Our focus is on minimizing runtime overhead, rather than monitor size or monitor-generation time. We identify four issues in monitor generation: state minimization, alphabet representation, alphabet minimization, and monitor encoding. We conduct extensive experimentation and identify a combination of settings that offers the best performance in terms of runtime overhead.
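
    The kind of monitor generated from a temporal assertion can be illustrated with a hand-written two-state example for a toy property; the encoding below is our own illustration, not the paper's generated code.

      # Toy runtime monitor for the safety property "whenever req is asserted,
      # grant must be asserted on the next sample". Encoded so that each
      # sample costs a constant amount of work, in line with the paper's goal
      # of minimizing runtime overhead.

      class NextGrantMonitor:
          def __init__(self):
              self.pending = False          # True: a grant is owed this sample
              self.failed = False
          def sample(self, req, grant):
              if self.pending and not grant:
                  self.failed = True        # obligation violated
              self.pending = bool(req)      # new obligation if req is high
              return not self.failed

      m = NextGrantMonitor()
      trace = [(1, 0), (0, 1), (1, 0), (0, 0)]   # (req, grant) per cycle
      for req, grant in trace:
          ok = m.sample(req, grant)
      print(ok)   # False: the second req was never granted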

  1. Hand Grasping Synergies As Biometrics.

    PubMed

    Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana

    2017-01-01

    Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. This includes iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement that integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies: postural synergies. In this proof of concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security.
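
    A minimal sketch of the synergy-extraction step via PCA, with randomly generated stand-ins for the joint angular velocity profiles (all dimensions invented for illustration):

      # Stack per-grasp joint angular velocity profiles into a matrix, centre
      # the features, and take the leading right singular vectors as
      # spatiotemporal synergies; the projections serve as per-grasp weights.
      import numpy as np

      rng = np.random.default_rng(0)
      X = rng.standard_normal((50, 10 * 100))     # 50 grasps x (10 joints * 100 samples)

      Xc = X - X.mean(axis=0)                     # centre each feature
      U, S, Vt = np.linalg.svd(Xc, full_matrices=False)
      k = 5
      synergies = Vt[:k]                          # k spatiotemporal synergies
      scores = Xc @ synergies.T                   # per-grasp synergy weights

      var_explained = (S[:k] ** 2).sum() / (S ** 2).sum()
      print(synergies.shape, scores.shape, round(var_explained, 3))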

  2. Efficiency and Flexibility of Fingerprint Scheme Using Partial Encryption and Discrete Wavelet Transform to Verify User in Cloud Computing.

    PubMed

    Yassin, Ali A

    2014-01-01

    The security of digital images is considered increasingly essential, and the fingerprint plays a main role in the world of imaging. Fingerprint recognition is a biometric verification scheme that applies pattern recognition techniques to an individual's fingerprint image. In the cloud environment, an adversary has the ability to intercept information, which must therefore be secured from eavesdroppers. Unfortunately, encryption and decryption functions are slow and often computationally demanding. Fingerprint techniques require extra hardware and software and can be defeated by artificial gummy fingers (spoof attacks). Additionally, when a large number of users are being verified at the same time, the mechanism becomes slow. In this paper, we combine partial encryption of the user's fingerprint with the discrete wavelet transform to obtain a new fingerprint verification scheme. The proposed scheme overcomes these problems: it adds no extra cost, reduces the computational requirements for large volumes of fingerprint images, and resists well-known attacks. In addition, experimental results illustrate that the proposed scheme performs well in verifying users' fingerprints.
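
    The core idea, encrypting only part of the wavelet decomposition, can be sketched as follows. The XOR keystream is a toy stand-in for a real cipher, and the quantization step is our own simplification; none of this reproduces the paper's exact scheme.

      # Transform the fingerprint image, encrypt only the low-frequency
      # approximation band, and leave the detail bands in the clear --
      # cutting the volume of data run through the cipher to about a quarter.
      import numpy as np
      import pywt

      rng = np.random.default_rng(42)
      fingerprint = rng.random((64, 64))           # placeholder image

      ca, details = pywt.dwt2(fingerprint, 'haar')
      key = rng.integers(0, 256, size=ca.shape, dtype=np.uint8)

      ca_q = (ca * 64).astype(np.uint8)            # quantize approximation band
      ca_enc = ca_q ^ key                          # "encrypt" ~25% of the data
      ca_dec = ca_enc ^ key                        # receiver reverses with key

      assert np.array_equal(ca_q, ca_dec)
      print('encrypted coefficients:', ca_enc.size, 'of', fingerprint.size)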

  4. Leveraging pattern matching to solve SRAM verification challenges at advanced nodes

    NASA Astrophysics Data System (ADS)

    Kan, Huan; Huang, Lucas; Yang, Legender; Zou, Elaine; Wan, Qijian; Du, Chunshan; Hu, Xinyi; Liu, Zhengfang; Zhu, Yu; Zhang, Recoo; Huang, Elven; Muirhead, Jonathan

    2018-03-01

    Memory is a critical component in today's system-on-chip (SoC) designs. Static random-access memory (SRAM) blocks are assembled by combining intellectual property (IP) blocks that come from SRAM libraries developed and certified by the foundries for both functionality and a specific process node. Customers place these SRAM IP in their designs, adjusting as necessary to achieve DRC-clean results. However, any changes a customer makes to these SRAM IP during implementation, whether intentionally or in error, can impact yield and functionality. Physical verification of SRAM has always been a challenge, because these blocks usually contain smaller feature sizes and spacing constraints compared to traditional logic or other layout structures. At advanced nodes, critical dimension becomes smaller and smaller, until there is almost no opportunity to use optical proximity correction (OPC) and lithography to adjust the manufacturing process to mitigate the effects of any changes. The smaller process geometries, reduced supply voltages, increasing process variation, and manufacturing uncertainty mean accurate SRAM physical verification results are not only reaching new levels of difficulty, but also new levels of criticality for design success. In this paper, we explore the use of pattern matching to create an SRAM verification flow that provides both accurate, comprehensive coverage of the required checks and visual output to enable faster, more accurate error debugging. Our results indicate that pattern matching can enable foundries to improve SRAM manufacturing yield, while allowing designers to benefit from SRAM verification kits that can shorten the time to market.
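
    Conceptually, pattern matching here means comparing layout content against a certified reference cell. The binary-grid sketch below is a drastic simplification of polygon-level matching engines, with invented patterns, but it illustrates the idea of flagging placements that deviate from the reference.

      # Rasterize a known-good cell pattern and check every placement in the
      # layout grid against it; an unintended edit breaks the match.
      import numpy as np
      from numpy.lib.stride_tricks import sliding_window_view

      reference = np.array([[1, 0, 1],
                            [0, 1, 0],
                            [1, 0, 1]])            # certified cell pattern

      layout = np.tile(reference, (2, 2))           # a 2x2 array of cells
      layout[4, 2] = 1                              # an unintended edit

      windows = sliding_window_view(layout, reference.shape)
      matches = (windows == reference).all(axis=(2, 3))
      print(np.argwhere(matches))                   # where the pattern survives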

  5. Measurement Sets and Sites Commonly Used for High Spatial Resolution Image Product Characterization

    NASA Technical Reports Server (NTRS)

    Pagnutti, Mary

    2006-01-01

    Scientists within NASA's Applied Sciences Directorate have developed a well-characterized remote sensing Verification & Validation (V&V) site at the John C. Stennis Space Center (SSC). This site has enabled the in-flight characterization of satellite high spatial resolution remote sensing system products from Space Imaging IKONOS, Digital Globe QuickBird, and ORBIMAGE OrbView, as well as advanced multispectral airborne digital camera products. SSC utilizes engineered geodetic targets, edge targets, radiometric tarps, atmospheric monitoring equipment and their Instrument Validation Laboratory to characterize high spatial resolution remote sensing data products. This presentation describes the SSC characterization capabilities and techniques in the visible through near infrared spectrum and presents examples of calibration results.

  6. An integrated service digital network (ISDN)-based international telecommunication between Samsung Medical Center and Hokkaido University using telecommunication helped radiotherapy planning and information system (THERAPIS).

    PubMed

    Huh, S J; Shirato, H; Hashimoto, S; Shimizu, S; Kim, D Y; Ahn, Y C; Choi, D; Miyasaka, K; Mizuno, J

    2000-07-01

    This study introduces the integrated service digital network (ISDN)-based international teleradiotherapy system (THERAPIS) in radiation oncology between hospitals in Seoul, South Korea and in Sapporo, Japan. THERAPIS has the following functions: (1) exchange of patient's image data, (2) real-time teleconference, and (3) communication of the treatment planning, dose calculation and distribution, and of portal verification images between the remote hospitals. Our preliminary results of applications on eight patients demonstrated that the international telecommunication using THERAPIS was clinically useful and satisfactory with sufficient bandwidth for the transfer of patient data for clinical use in radiation oncology.

  7. LH2 on-orbit storage tank support trunnion design and verification

    NASA Technical Reports Server (NTRS)

    Bailey, W. J.; Fester, D. A.; Toth, J. M., Jr.

    1985-01-01

    A detailed fatigue analysis was conducted to provide verification of the trunnion design in the reusable Cryogenic Fluid Management Facility for Shuttle flights and to assess the performance capability of the trunnion E-glass/S-glass epoxy composite material. Basic material property data at ambient and liquid hydrogen temperatures support the adequacy of the epoxy composite for the seven-mission requirement. Testing of trunnions fabricated to the flight design has verified adequate strength and fatigue properties of the design to meet the requirements of seven Shuttle flights.

  8. Public-key quantum digital signature scheme with one-time pad private-key

    NASA Astrophysics Data System (ADS)

    Chen, Feng-Lin; Liu, Wan-Fang; Chen, Su-Gen; Wang, Zhi-Hua

    2018-01-01

    A quantum digital signature scheme based on a public-key quantum cryptosystem is proposed. In the scheme, the verification public key is derived from the signer's identity information (such as e-mail) following identity-based encryption, and the signature private key is generated by a one-time pad (OTP) protocol. The public-key and private-key pair consists of classical bits, but the signature ciphertext consists of quantum qubits. After the signer announces the public key and generates the final quantum signature, each verifier can publicly verify whether the signature is valid using the public key and the quantum digital digest. Analysis shows that the proposed scheme satisfies non-repudiation and unforgeability. Information-theoretic security of the scheme is ensured by quantum indistinguishability and the OTP protocol. Being based on a public-key cryptosystem, the proposed scheme is easier to realize than other quantum signature schemes under current technical conditions.

  9. Detecting agricultural to urban land use change from multi-temporal MSS digital data. [Salt Lake County, Utah

    NASA Technical Reports Server (NTRS)

    Ridd, M. K.; Merola, J. A.; Jaynes, R. A.

    1983-01-01

    Conversion of agricultural land to a variety of urban uses is a major problem along the Wasatch Front, Utah. Although LANDSAT MSS data is a relatively coarse tool for discriminating categories of change in urban-size plots, its availability prompts a thorough test of its power to detect change. The procedures being applied to a test area in Salt Lake County, Utah, where the land conversion problem is acute are presented. The identity of land uses before and after conversion was determined and digital procedures for doing so were compared. Several algorithms were compared, utilizing both raw data and preprocessed data. Verification of results involved high quality color infrared photography and field observation. Two data sets were digitally registered, specific change categories internally identified in the software, results tabulated by computer, and change maps printed at 1:24,000 scale.

  10. Digital Pharmacovigilance and Disease Surveillance: Combining Traditional and Big-Data Systems for Better Public Health

    PubMed Central

    Salathé, Marcel

    2016-01-01

    The digital revolution has contributed to very large data sets (ie, big data) relevant for public health. The two major data sources are electronic health records from traditional health systems and patient-generated data. As the two data sources have complementary strengths—high veracity in the data from traditional sources and high velocity and variety in patient-generated data—they can be combined to build more-robust public health systems. However, they also have unique challenges. Patient-generated data in particular are often completely unstructured and highly context dependent, posing essentially a machine-learning challenge. Some recent examples from infectious disease surveillance and adverse drug event monitoring demonstrate that the technical challenges can be solved. Despite these advances, the problem of verification remains, and unless traditional and digital epidemiologic approaches are combined, these data sources will be constrained by their intrinsic limits. PMID:28830106

  11. Algorithms and methodology used in constructing high-resolution terrain databases

    NASA Astrophysics Data System (ADS)

    Williams, Bryan L.; Wilkosz, Aaron

    1998-07-01

    This paper presents a top-level description of methods used to generate high-resolution 3D IR digital terrain databases using soft photogrammetry. The 3D IR database is derived from aerial photography and is made up of digital ground plane elevation map, vegetation height elevation map, material classification map, object data (tanks, buildings, etc.), and temperature radiance map. Steps required to generate some of these elements are outlined. The use of metric photogrammetry is discussed in the context of elevation map development; and methods employed to generate the material classification maps are given. The developed databases are used by the US Army Aviation and Missile Command to evaluate the performance of various missile systems. A discussion is also presented on database certification which consists of validation, verification, and accreditation procedures followed to certify that the developed databases give a true representation of the area of interest, and are fully compatible with the targeted digital simulators.

  12. An Analytic Creativity Assessment Scale for Digital Game Story Design: Construct Validity, Internal Consistency and Interrater Reliability

    ERIC Educational Resources Information Center

    Chuang, Tsung-Yen; Huang, Yun-Hsuan

    2015-01-01

    Mobile technology has rapidly made digital games a popular entertainment to this digital generation, and thus digital game design received considerable attention in both the game industry and design education. Digital game design involves diverse dimensions in which digital game story design (DGSD) particularly attracts our interest, as the…

  13. The Role of Integrated Modeling in the Design and Verification of the James Webb Space Telescope

    NASA Technical Reports Server (NTRS)

    Mosier, Gary E.; Howard, Joseph M.; Johnston, John D.; Parrish, Keith A.; Hyde, T. Tupper; McGinnis, Mark A.; Bluth, Marcel; Kim, Kevin; Ha, Kong Q.

    2004-01-01

    The James Webb Space Telescope (JWST) is a large, infrared-optimized space telescope scheduled for launch in 2011. System-level verification of critical optical performance requirements will rely on integrated modeling to a considerable degree. In turn, requirements for accuracy of the models are significant. The size of the lightweight observatory structure, coupled with the need to test at cryogenic temperatures, effectively precludes validation of the models and verification of optical performance with a single test in 1-g. Rather, a complex series of steps is planned by which the components of the end-to-end models are validated at various levels of subassembly, and the ultimate verification of optical performance is by analysis using the assembled models. This paper describes the critical optical performance requirements driving the integrated modeling activity, shows how the error budget is used to allocate and track contributions to total performance, and presents examples of integrated modeling methods and results that support the preliminary observatory design. Finally, the concepts for model validation and the role of integrated modeling in the ultimate verification of the observatory are described.
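
    The error-budget bookkeeping described above reduces to allocating terms and rolling them up against the requirement, commonly as a root-sum-square for independent contributors. The terms and numbers below are invented for illustration and are not JWST allocations.

      # Roll up hypothetical wavefront-error contributors and compare the
      # root-sum-square total against a hypothetical top-level requirement.
      import math

      requirement_nm = 150.0                 # assumed total WFE budget, nm RMS
      contributors_nm = {
          'mirror figure':      90.0,
          'alignment residual': 70.0,
          'thermal distortion': 60.0,
          'dynamics/jitter':    40.0,
      }

      rss = math.sqrt(sum(v ** 2 for v in contributors_nm.values()))
      margin = requirement_nm - rss
      print(f'RSS = {rss:.1f} nm, margin = {margin:.1f} nm')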

  14. Structural Design Requirements and Factors of Safety for Spaceflight Hardware: For Human Spaceflight. Revision A

    NASA Technical Reports Server (NTRS)

    Bernstein, Karen S.; Kujala, Rod; Fogt, Vince; Romine, Paul

    2011-01-01

    This document establishes the structural requirements for human-rated spaceflight hardware including launch vehicles, spacecraft and payloads. These requirements are applicable to Government Furnished Equipment activities as well as all related contractor, subcontractor and commercial efforts. These requirements are not imposed on systems other than human-rated spacecraft, such as ground test articles, but may be tailored for use in specific cases where it is prudent to do so such as for personnel safety or when assets are at risk. The requirements in this document are focused on design rather than verification. Implementation of the requirements is expected to be described in a Structural Verification Plan (SVP), which should describe the verification of each structural item for the applicable requirements. The SVP may also document unique verifications that meet or exceed these requirements with NASA Technical Authority approval.

  15. Engineering of the LISA Pathfinder mission—making the experiment a practical reality

    NASA Astrophysics Data System (ADS)

    Warren, Carl; Dunbar, Neil; Backler, Mike

    2009-05-01

    LISA Pathfinder represents a unique challenge in the development of scientific spacecraft—not only is the LISA Test Package (LTP) payload a complex integrated development, placing stringent requirements on its developers and the spacecraft, but the payload also acts as the core sensor and actuator for the spacecraft, making the tasks of control design, software development and system verification unusually difficult. The micro-propulsion system which provides the remaining actuation also presents substantial development and verification challenges. As the mission approaches the system critical design review, flight hardware is completing verification and the process of verification using software and hardware simulators and test benches is underway. Preparation for operations has started, but critical milestones for LTP and field effect electric propulsion (FEEP) lie ahead. This paper summarizes the status of the present development and outlines the key challenges that must be overcome on the way to launch.

  16. Large-scale Rectangular Ruler Automated Verification Device

    NASA Astrophysics Data System (ADS)

    Chen, Hao; Chang, Luping; Xing, Minjian; Xie, Xie

    2018-03-01

    This paper introduces an automated verification device for large-scale rectangular rulers, consisting of a photoelectric autocollimator, a self-designed mechanical drive carriage and an automatic data acquisition system. The mechanical design covers the optical axis, the drive, the fixture and the wheels. The control system comprises hardware and software: the hardware is based on a single-chip microcontroller, while the software manages the photoelectric autocollimator and the automatic data acquisition process. The device acquires verticality measurement data automatically. The reliability of the device is verified by experimental comparison, and the results meet the requirements of the right-angle test procedure.

  17. Space station prototype Sabatier reactor design verification testing

    NASA Technical Reports Server (NTRS)

    Cusick, R. J.

    1974-01-01

    A six-man, flight prototype carbon dioxide reduction subsystem for the SSP ETC/LSS (Space Station Prototype Environmental/Thermal Control and Life Support System) was developed and fabricated for the NASA-Johnson Space Center between February 1971 and October 1973. Component design verification testing was conducted on the Sabatier reactor covering design and off-design conditions as part of this development program. The reactor was designed to convert a minimum of 98 per cent hydrogen to water and methane for both six-man and two-man reactant flow conditions. Important design features of the reactor and test conditions are described. Reactor test results are presented that show design goals were achieved and off-design performance was stable.

  18. A Design Support Framework through Dynamic Deployment of Hypothesis and Verification in the Design Process

    NASA Astrophysics Data System (ADS)

    Nomaguchi, Yutaka; Fujita, Kikuo

    This paper proposes a design support framework, named DRIFT (Design Rationale Integration Framework of Three layers), which dynamically captures and manages hypothesis and verification in the design process. The core of DRIFT is a three-layered design process model of action, model operation and argumentation. This model integrates various design support tools and captures design operations performed on them. The action level captures the sequence of design operations. The model operation level captures the transition of design states, recording a design snapshot across design tools. The argumentation level captures the process of setting problems and alternatives. Linking the three levels enables automatic and efficient capture and management of iterative hypothesis and verification processes through design operations over design tools. In DRIFT, such linkage is extracted through templates of design operations, which are derived from the patterns embedded in design tools such as Design-For-X (DFX) approaches, and design tools are integrated through an ontology-based representation of design concepts. An argumentation model, gIBIS (graphical Issue-Based Information System), is used for representing dependencies among problems and alternatives. A mechanism of TMS (Truth Maintenance System) is used for managing multiple hypothetical design stages. This paper also demonstrates a prototype implementation of DRIFT and its application to a simple design problem. It concludes with a discussion of future issues.

  19. TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) development

    NASA Technical Reports Server (NTRS)

    Shimamoto, Mike S.

    1993-01-01

    The development of an anthropomorphic, undersea manipulator system, the TeleOperator/telePresence System (TOPS) Concept Verification Model (CVM) is described. The TOPS system's design philosophy, which results from NRaD's experience in undersea vehicles and manipulator systems development and operations, is presented. The TOPS design approach, task teams, manipulator, and vision system development and results, conclusions, and recommendations are presented.

  20. An automated system for pulmonary function testing

    NASA Technical Reports Server (NTRS)

    Mauldin, D. G.

    1974-01-01

    An experiment to quantitate pulmonary function was accepted for the space shuttle concept verification test. The single breath maneuver and the nitrogen washout are combined to reduce the test time. Parameters are defined from the forced vital capacity maneuvers. A spirometer measures the breath volume and a magnetic sector mass spectrometer provides definition of gas composition. Mass spectrometer and spirometer data are analyzed by a PDP-8/I digital computer.

  1. A pathologist-designed imaging system for anatomic pathology signout, teaching, and research.

    PubMed

    Schubert, E; Gross, W; Siderits, R H; Deckenbaugh, L; He, F; Becich, M J

    1994-11-01

    Pathology images are derived from gross surgical specimens, light microscopy, immunofluorescence, electron microscopy, molecular diagnostic gels, flow cytometry, image analysis data, and clinical laboratory data in graphic form. We have implemented a network of desktop personal computers (PCs) that allow us to easily capture, store, and retrieve gross and microscopic, anatomic, and research pathology images. System architecture involves multiple image acquisition and retrieval sites and a central file server for storage. The digitized images are conveyed via a local area network to and from image capture or display stations. Acquisition sites consist of a high-resolution camera connected to a frame grabber card in a 486-type personal computer, equipped with 16 MB RAM, a 1.05-gigabyte hard drive, and a 32-bit ethernet card for access to our anatomic pathology reporting system. We have designed a push-button workstation for acquiring and indexing images that does not significantly interfere with surgical pathology sign-out. Advantages of the system include the following: (1) Improving patient care: the availability of gross images at time of microscopic sign-out, verification of recurrence of malignancy from archived images, monitoring of bone marrow engraftment and immunosuppressive intervention after bone marrow/solid organ transplantation on repeat biopsies, and ability to seek instantaneous consultation with any pathologist on the network; (2) enhancing the teaching environment: building a digital surgical pathology atlas, improving the availability of images for conference support, and sharing cases across the network; (3) enhancing research: case study compilation, metastudy analysis, and availability of digitized images for quantitative analysis and permanent/reusable image records for archival study; and (4) other practical and economic considerations: storing case requisition images and hand-drawn diagrams deters the spread of gross room contaminants and results in considerable cost savings in photographic media for conferences, improved quality assurance by porting control stains across the network, and a multiplicity of other advantages that enhance image and information management in pathology.

  2. The International Remote Monitoring Project: Results of the Swedish Nuclear Power Facility field trial

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Johnson, C.S.; af Ekenstam, G.; Sallstrom, M.

    1995-07-01

    The Swedish Nuclear Power Inspectorate (SKI) and the US Department of Energy (DOE) sponsored work on a Remote Monitoring System (RMS) that was installed in August 1994 at the Barseback Works north of Malmo, Sweden. The RMS was designed to test the front end detection concept that would be used for unattended remote monitoring activities. Front end detection reduces the number of video images recorded and provides additional sensor verification of facility operations. The function of any safeguards Containment and Surveillance (C/S) system is to collect information, primarily images, that verifies the operations at a nuclear facility. Barseback is ideal for testing the concept of front end detection since the main activity of safeguards interest, the movement of spent fuel, occurs once a year. The RMS at Barseback uses a network of nodes to collect data from microwave motion detectors placed to detect the entrance and exit of spent fuel casks through a hatch. A video system using digital compression collects digital images and stores them on a hard drive and a digital optical disk. Data and images from the storage area are remotely monitored via telephone from Stockholm, Sweden and Albuquerque, NM, USA. These remote monitoring stations, operated by SKI and SNL respectively, can retrieve data and images from the RMS computer at the Barseback Facility. The data and images are encrypted before transmission. This paper presents details of the RMS and test results of this approach to front end detection of safeguard activities.

  3. Control/structure interaction design methodology

    NASA Technical Reports Server (NTRS)

    Briggs, Hugh C.; Layman, William E.

    1989-01-01

    The Control Structure Interaction Program is a technology development program for spacecraft that exhibit interactions between the control system and structural dynamics. The program objectives include development and verification of new design concepts (such as active structure) and new tools (such as a combined structure and control optimization algorithm) and their verification in ground and possibly flight test. The new CSI design methodology is centered around interdisciplinary engineers using new tools that closely integrate structures and controls. Verification is an important CSI theme and analysts will be closely integrated to the CSI Test Bed laboratory. Components, concepts, tools and algorithms will be developed and tested in the lab and in future Shuttle-based flight experiments. The design methodology is summarized in block diagrams depicting the evolution of a spacecraft design and descriptions of analytical capabilities used in the process. The multiyear JPL CSI implementation plan is described along with the essentials of several new tools. A distributed network of computation servers and workstations was designed that will provide a state-of-the-art development base for the CSI technologies.

  4. Design for Verification: Using Design Patterns to Build Reliable Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Koga, Dennis (Technical Monitor)

    2003-01-01

    Components so far have been mainly used in commercial software development to reduce time to market. While some effort has been spent on formal aspects of components, most of this was done in the context of programming language or operating system framework integration. As a consequence, increased reliability of composed systems is mainly regarded as a side effect of a more rigid testing of pre-fabricated components. In contrast to this, Design for Verification (D4V) puts the focus on component specific property guarantees, which are used to design systems with high reliability requirements. D4V components are domain specific design pattern instances with well-defined property guarantees and usage rules, which are suitable for automatic verification. The guaranteed properties are explicitly used to select components according to key system requirements. The D4V hypothesis is that the same general architecture and design principles leading to good modularity, extensibility and complexity/functionality ratio can be adapted to overcome some of the limitations of conventional reliability assurance measures, such as too large a state space or too many execution paths.

  5. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope is described which is an Ada-verification environment. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  6. Verification bias: an underrecognized source of error in assessing the efficacy of medical imaging.

    PubMed

    Petscavage, Jonelle M; Richardson, Michael L; Carr, Robert B

    2011-03-01

    Diagnostic tests are validated by comparison against a "gold standard" reference test. When the reference test is invasive or expensive, it may not be applied to all patients. This can result in biased estimates of the sensitivity and specificity of the diagnostic test. This type of bias is called "verification bias," and is a common problem in imaging research. The purpose of our study is to estimate the prevalence of verification bias in the recent radiology literature. All issues of the American Journal of Roentgenology (AJR), Academic Radiology, Radiology, and European Journal of Radiology (EJR) between November 2006 and October 2009 were reviewed for original research articles mentioning sensitivity or specificity as endpoints. Articles were read to determine whether verification bias was present and searched for author recognition of verification bias in the design. During 3 years, these journals published 2969 original research articles. A total of 776 articles used sensitivity or specificity as an outcome. Of these, 211 articles demonstrated potential verification bias. The fraction of articles with potential bias was respectively 36.4%, 23.4%, 29.5%, and 13.4% for AJR, Academic Radiology, Radiology, and EJR. The total fraction of papers with potential bias in which the authors acknowledged this bias was 17.1%. Verification bias is a common and frequently unacknowledged source of error in efficacy studies of diagnostic imaging. Bias can often be eliminated by proper study design. When it cannot be eliminated, it should be estimated and acknowledged.
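
    The mechanism of the bias is easy to demonstrate numerically: if the reference standard is applied only to test-positive patients, false negatives never surface and observed sensitivity is inflated. The simulation below uses invented prevalence and accuracy values.

      # Compare sensitivity under full verification with the estimate obtained
      # when only test-positives receive the reference standard and unverified
      # patients are (wrongly) treated as disease-free.
      import numpy as np

      rng = np.random.default_rng(1)
      n = 100_000
      disease = rng.random(n) < 0.10                  # 10% prevalence
      true_sens, true_spec = 0.80, 0.90
      test_pos = np.where(disease,
                          rng.random(n) < true_sens,  # diseased: 80% positive
                          rng.random(n) > true_spec)  # healthy: 10% positive

      # Full verification: everyone gets the reference standard.
      full_sens = test_pos[disease].mean()

      # Partial verification: disease is only ever observed in test-positives,
      # so every observed case is, by construction, a true positive.
      observed_disease = disease & test_pos
      biased_sens = test_pos[observed_disease].mean()

      print(f'true sensitivity  ~ {full_sens:.3f}')   # ~0.80
      print(f'biased estimate   = {biased_sens:.3f}') # 1.000 by construction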

  7. Multipurpose image watermarking algorithm based on multistage vector quantization.

    PubMed

    Lu, Zhe-Ming; Xu, Dian-Guo; Sun, Sheng-He

    2005-06-01

    The rapid growth of digital multimedia and Internet technologies has made copyright protection, copy protection, and integrity verification three important issues in the digital world. To solve these problems, the digital watermarking technique has been presented and widely researched. Traditional watermarking algorithms are mostly based on discrete transform domains, such as the discrete cosine transform, discrete Fourier transform (DFT), and discrete wavelet transform (DWT). Most of these algorithms are good for only one purpose. Recently, some multipurpose digital watermarking methods have been presented, which can achieve the goal of content authentication and copyright protection simultaneously. However, they are based on DWT or DFT. Lately, several robust watermarking schemes based on vector quantization (VQ) have been presented, but they can only be used for copyright protection. In this paper, we present a novel multipurpose digital image watermarking method based on the multistage vector quantizer structure, which can be applied to image authentication and copyright protection. In the proposed method, the semi-fragile watermark and the robust watermark are embedded in different VQ stages using different techniques, and both of them can be extracted without the original image. Simulation results demonstrate the effectiveness of our algorithm in terms of robustness and fragility.

  8. A demonstration of ERTS-1 analog and digital techniques applied to strip mining in Maryland and West Virginia

    NASA Technical Reports Server (NTRS)

    Anderson, A. T.; Schubert, J.

    1974-01-01

    The largest contour strip mining operations in western Maryland and West Virginia are located within the Georges Creek and the Upper Potomac Basins. These two coal basins lie within the Georges Creek (Wellersburg) syncline. The disturbed strip mine areas were delineated with the surrounding geological and vegetation features using ERTS-1 data in both analog (imagery) and digital form. The two digital systems used were: (1) the ERTS-Analysis system, a point-by-point digital analysis of spectral signatures based on known spectral values, and (2) the LARS Automatic Data Processing System. The digital techniques being developed will later be incorporated into a data base for land use planning. These two systems aided in efforts to determine the extent and state of strip mining in this region. Aircraft data, ground verification information, and geological field studies also aided in the application of ERTS-1 imagery to perform an integrated analysis that assessed the adverse effects of strip mining. The results indicated that ERTS can both monitor and map the extent of strip mining to determine immediately the acreage affected and indicate where future reclamation and revegetation may be necessary.

  9. Suite of Benchmark Tests to Conduct Mesh-Convergence Analysis of Nonlinear and Non-constant Coefficient Transport Codes

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2014-12-01

    Verification of geophysics codes is imperative to avoid serious academic as well as practical consequences. When access to a given source code is not possible, the Method of Manufactured Solutions (MMS) cannot be employed in code verification. In contrast, employing the Method of Exact Solutions (MES) has several practical advantages. In this research, we first provide four new one-dimensional analytical solutions designed for code verification; these solutions are able to uncover particular imperfections of advection-diffusion-reaction solvers, such as nonlinear advection, diffusion or source terms, as well as non-constant coefficient equations. We then provide a solution of Burgers' equation in a novel setup. The proposed solutions satisfy continuity of mass for the ambient flow, which is a crucial factor for coupled hydrodynamics-transport solvers. We then use the derived analytical solutions for code verification. To clarify gray-literature issues in the verification of transport codes, we designed a comprehensive test suite that uncovers imperfections in transport solvers via a hierarchical increase in the level of test complexity. The test suite includes hundreds of unit tests and system tests that check individual portions of the code. The tests start from a simple case of unidirectional advection, proceed to bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity and reactions. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh-convergence study and appropriate remedies are also discussed. For cases in which appropriate benchmarks for a mesh-convergence study are not available, we utilize symmetry. Auxiliary subroutines for automation of the test suite and report generation are designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the capabilities of ADR solvers. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport. We also convey our experience in finding several errors that were not detectable with routine verification techniques.
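
    The arithmetic at the heart of any mesh-convergence study against an exact solution is short enough to show directly; the grid spacings and error norms below are invented for illustration.

      # Given error norms against the exact (MES) solution on two grids, the
      # observed order of accuracy is p = log(e1/e2) / log(h1/h2), to be
      # compared with the scheme's formal order.
      import math

      h1, e1 = 0.02, 4.0e-3      # coarse grid spacing and its error norm
      h2, e2 = 0.01, 1.1e-3      # refined grid spacing and its error norm

      p_observed = math.log(e1 / e2) / math.log(h1 / h2)
      print(f'observed order of accuracy: {p_observed:.2f}')  # ~1.86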

  10. Illumination-tolerant face verification of low-bit-rate JPEG2000 wavelet images with advanced correlation filters for handheld devices

    NASA Astrophysics Data System (ADS)

    Wijaya, Surya Li; Savvides, Marios; Vijaya Kumar, B. V. K.

    2005-02-01

    Face recognition on mobile devices, such as personal digital assistants and cell phones, is a big challenge owing to the limited computational resources available to run verifications on the devices themselves. One approach is to transmit the captured face images by use of the cell-phone connection and to run the verification on a remote station. However, owing to limitations in communication bandwidth, it may be necessary to transmit a compressed version of the image. We propose using the image compression standard JPEG2000, which is a wavelet-based compression engine used to compress the face images to low bit rates suitable for transmission over low-bandwidth communication channels. At the receiver end, the face images are reconstructed with a JPEG2000 decoder and are fed into the verification engine. We explore how advanced correlation filters, such as the minimum average correlation energy filter [Appl. Opt. 26, 3633 (1987)] and its variants, perform by using face images captured under different illumination conditions and encoded with different bit rates under the JPEG2000 wavelet-encoding standard. We evaluate the performance of these filters by using illumination variations from the Carnegie Mellon University's Pose, Illumination, and Expression (PIE) face database. We also demonstrate the tolerance of these filters to noisy versions of images with illumination variations.
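
    A minimal frequency-domain construction of the MACE filter cited above, h = D^-1 X (X^H D^-1 X)^-1 u, using random arrays as stand-ins for training face images:

      # X holds the vectorized 2-D FFTs of the training images (one column
      # each), D is the diagonal average power spectrum, and u pins each
      # training image's correlation value at the origin.
      import numpy as np

      rng = np.random.default_rng(0)
      imgs = rng.random((3, 32, 32))                 # 3 training "faces"

      X = np.stack([np.fft.fft2(im).ravel() for im in imgs], axis=1)  # d x N
      D = (np.abs(X) ** 2).mean(axis=1)              # average power spectrum
      Dinv_X = X / D[:, None]
      u = np.ones(X.shape[1])
      h = Dinv_X @ np.linalg.solve(X.conj().T @ Dinv_X, u)   # MACE filter

      # Check the constraint: the correlation output of a training image,
      # scaled by the pixel count, is ~1 at the origin.
      corr = np.fft.ifft2(np.fft.fft2(imgs[0]).conj() * h.reshape(32, 32))
      print(np.abs(corr[0, 0]) * corr.size)          # ~1.0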

  11. Projected Impact of Compositional Verification on Current and Future Aviation Safety Risk

    NASA Technical Reports Server (NTRS)

    Reveley, Mary S.; Withrow, Colleen A.; Leone, Karen M.; Jones, Sharon M.

    2014-01-01

    The projected impact of compositional verification research conducted by the National Aeronautic and Space Administration System-Wide Safety and Assurance Technologies on aviation safety risk was assessed. Software and compositional verification was described. Traditional verification techniques have two major problems: testing at the prototype stage where error discovery can be quite costly and the inability to test for all potential interactions leaving some errors undetected until used by the end user. Increasingly complex and nondeterministic aviation systems are becoming too large for these tools to check and verify. Compositional verification is a "divide and conquer" solution to addressing increasingly larger and more complex systems. A review of compositional verification research being conducted by academia, industry, and Government agencies is provided. Forty-four aviation safety risks in the Biennial NextGen Safety Issues Survey were identified that could be impacted by compositional verification and grouped into five categories: automation design; system complexity; software, flight control, or equipment failure or malfunction; new technology or operations; and verification and validation. One capability, 1 research action, 5 operational improvements, and 13 enablers within the Federal Aviation Administration Joint Planning and Development Office Integrated Work Plan that could be addressed by compositional verification were identified.

  12. 3D Digitization of a Heritage Masterpiece - A Critical Analysis on Quality Assessment

    NASA Astrophysics Data System (ADS)

    Menna, F.; Nocerino, E.; Remondino, F.; Dellepiane, M.; Callieri, M.; Scopigno, R.

    2016-06-01

    Despite being perceived as interchangeable when properly applied, close-range photogrammetry and range imaging both have pros and limitations that can be overcome using suitable procedures. Even if the two techniques have been frequently cross-compared, critical analyses discussing all sub-phases of a complex digitization project are quite rare. Comparisons taking into account the digitization of a cultural masterpiece, such as the Etruscan Sarcophagus of the Spouses discussed in this paper, are even less common. The final 3D model of the Sarcophagus shows impressive spatial and texture resolution, in the order of tenths of a millimetre for both digitization techniques, making it a large 3D digital model even though the physical size of the artwork is quite limited. The paper presents the survey of the Sarcophagus, a late 6th century BC Etruscan anthropoid sarcophagus. Photogrammetry and laser scanning were used for its 3D digitization at two different times only a few days apart. The very short time available was a crucial constraint on the surveying operations (due to conditions imposed on us by the museum curators). Although very high-resolution and detailed 3D models have been produced, a metric comparison between the two models shows intrinsic limitations of each technique that should be overcome through suitable on-site metric verification procedures as well as a proper processing workflow.

  13. Integrated testing and verification system for research flight software

    NASA Technical Reports Server (NTRS)

    Taylor, R. N.

    1979-01-01

    The MUST (Multipurpose User-oriented Software Technology) program is being developed to cut the cost of producing research flight software through a system of software support tools. An integrated verification and testing capability was designed as part of MUST. Documentation, verification, and test options are provided, with special attention to real-time multiprocessing issues. The needs of the entire software production cycle were considered, with effective management and reduced lifecycle costs as foremost goals.

  14. Crewed Space Vehicle Battery Safety Requirements

    NASA Technical Reports Server (NTRS)

    Jeevarajan, Judith A.; Darcy, Eric C.

    2014-01-01

    This requirements document is applicable to all batteries on crewed spacecraft, including vehicle, payload, and crew equipment batteries. It defines the specific provisions required to design a battery that is safe for ground personnel and crew members to handle and/or operate during all applicable phases of crewed missions, safe for use in the enclosed environment of a crewed space vehicle, and safe for use in launch vehicles, as well as in unpressurized spaces adjacent to the habitable portion of a space vehicle. The required provisions encompass hazard controls, design evaluation, and verification. The extent of the hazard controls and verification required depends on the applicability and credibility of the hazard to the specific battery design and applicable missions under review. Evaluation of the design and verification program results shall be completed prior to certification for flight and ground operations. This requirements document is geared toward the designers of battery systems to be used in crewed vehicles, crew equipment, crew suits, or crewed vehicle systems and payloads (or experiments). It also applies to ground handling and testing of flight batteries. Specific design and verification requirements for a battery depend upon the battery chemistry, capacity, complexity, charging, environment, and application. The variety of battery chemistries available, combined with the variety of battery-powered applications, means that each application has its own specific, unique requirements. However, there are basic requirements common to all battery designs and applications, which are listed in section 4. Section 5 describes hazards and controls and also includes requirements.

  15. The redMaPPer Galaxy Cluster Catalog From DES Science Verification Data

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rykoff, E. S.

    We describe updates to the redMaPPer algorithm, a photometric red-sequence cluster finder specifically designed for large photometric surveys. The updated algorithm is applied to $150\,\mathrm{deg}^2$ of Science Verification (SV) data from the Dark Energy Survey (DES), and to the Sloan Digital Sky Survey (SDSS) DR8 photometric data set. The DES SV catalog is locally volume limited, and contains 786 clusters with richness $\lambda > 20$ (roughly equivalent to $M_{500c} \gtrsim 10^{14}\,h_{70}^{-1}\,M_{\odot}$) and $0.2 < z < 0.9$. The DR8 catalog consists of 26311 clusters with $0.08 < z < 0.6$, with a sharply increasing richness threshold as a function of redshift for $z \gtrsim 0.35$. The photometric redshift performance of both catalogs is shown to be excellent, with photometric redshift uncertainties controlled at the $\sigma_z/(1+z) \sim 0.01$ level for $z \lesssim 0.7$, rising to $\sim 0.02$ at $z \sim 0.9$ in DES SV. We make use of Chandra and XMM X-ray and South Pole Telescope Sunyaev-Zeldovich data to show that the centering performance and mass-richness scatter are consistent with expectations based on prior runs of redMaPPer on SDSS data. We also show how the redMaPPer photo-$z$ and richness estimates are relatively insensitive to imperfect star/galaxy separation and small-scale star masks.

  16. 37 CFR 261.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... may conduct a single audit of a Designated Agent upon reasonable notice and during reasonable business... COPYRIGHT ARBITRATION ROYALTY PANEL RULES AND PROCEDURES RATES AND TERMS FOR ELIGIBLE NONSUBSCRIPTION.... This section prescribes general rules pertaining to the verification by any Copyright Owner or...

  17. 20 CFR 632.77 - Participant eligibility determination.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... NATIVE AMERICAN EMPLOYMENT AND TRAINING PROGRAMS Program Design and Management § 632.77 Participant... maintaining a system which reasonably ensures an accurate determination and subsequent verification of... information is subject to verification and that falsification of the application shall be grounds for the...

  18. 20 CFR 632.77 - Participant eligibility determination.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... NATIVE AMERICAN EMPLOYMENT AND TRAINING PROGRAMS Program Design and Management § 632.77 Participant... maintaining a system which reasonably ensures an accurate determination and subsequent verification of... information is subject to verification and that falsification of the application shall be grounds for the...

  19. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Verburg, J; Bortfeld, T

    Purpose: We present a new system to perform prompt gamma-ray spectroscopy during proton pencil-beam scanning treatments, which enables in vivo verification of the proton range. This system will be used for the first clinical studies of this technology. Methods: After successful pre-clinical testing of prompt gamma-ray spectroscopy, a full-scale system for clinical studies is now being assembled. Prompt gamma-rays will be detected during patient treatment using an array of 8 detector modules arranged behind a tungsten collimator. Each detector module consists of a lanthanum(III) bromide scintillator, a photomultiplier tube, and custom electronics for stable high-voltage supply and signal amplification. A new real-time data acquisition and control system samples the signals from the detectors with analog-to-digital converters, analyses events of interest, and communicates with the beam delivery systems. The timing of the detected events was synchronized to the cyclotron radiofrequency and the pencil-beam delivery. Range verification is performed by matching measured energy- and time-resolved gamma-ray spectra to nuclear reaction models based on the clinical treatment plan. Experiments in phantoms were performed using clinical beams in order to assess the performance of the system. Results: The experiments showed reliable real-time analysis of more than 10 million detector events per second. The individual detector modules acquired accurate energy- and time-resolved gamma-ray measurements at a rate of 1 million events per second, which is typical for beams delivered at a clinical dose rate. The data acquisition system successfully tracked the delivery of the scanned pencil-beams to determine the location of range deviations within the treatment field. Conclusion: A clinical system for proton range verification using prompt gamma-ray spectroscopy has been designed and is being prepared for use during patient treatments. We anticipate starting a first clinical study in the near future. This work was supported by the Federal Share of program income earned by Massachusetts General Hospital on C06-CA059267, Proton Therapy Research and Treatment Center.
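
    The matching step described under Methods reduces, in its simplest form, to picking the range shift whose modeled spectrum best explains the measurement. A toy numpy sketch under that assumption follows; the template spectra, binning, and Poisson-weighted chi-square are illustrative choices, not details of the clinical system.

        import numpy as np

        def best_range_shift(measured, model_bank, shifts_mm):
            """Pick the range shift whose modeled gamma-ray spectrum best fits the data.

            measured:   1-D array of counts per (energy, time) bin, flattened.
            model_bank: 2-D array, one modeled spectrum per candidate shift, scaled
                        to the planned number of delivered protons.
            shifts_mm:  candidate range shifts aligned with the rows of model_bank."""
            chi2 = [np.sum((measured - m) ** 2 / np.maximum(m, 1.0)) for m in model_bank]
            return shifts_mm[int(np.argmin(chi2))]

        # Illustrative use: Gaussian-like templates whose shape drifts with range shift.
        shifts = np.arange(-10, 11)                        # mm
        e = np.linspace(1, 7, 60)                          # MeV bin centers
        bank = np.stack([1e3 * np.exp(-((e - 4 - 0.05 * s) ** 2)) + 50 for s in shifts])
        measured = np.random.default_rng(1).poisson(bank[shifts.tolist().index(3)])
        print(best_range_shift(measured, bank, shifts), "mm")   # typically recovers ~3 mm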

  20. Model-based engineering for medical-device software.

    PubMed

    Ray, Arnab; Jetley, Raoul; Jones, Paul L; Zhang, Yi

    2010-01-01

    This paper demonstrates the benefits of adopting model-based design techniques for engineering medical device software. Using a patient-controlled analgesic (PCA) infusion pump as a candidate medical device, the authors show how using models to capture design information allows for (i) fast and efficient construction of executable device prototypes, (ii) creation of a standard, reusable baseline software architecture for a particular device family, (iii) formal verification of the design against safety requirements, and (iv) creation of a safety framework that reduces verification costs for future versions of the device software.

  1. Navy DDG-51 and DDG-1000 Destroyer Programs: Background and Issues for Congress

    DTIC Science & Technology

    2013-10-22

    two technologies previously identified as the most challenging — digital-beam-forming and transmit-receive modules—have been demonstrated in a...job of coming up with an affordable solution to a leap-ahead capability for the fleet.”31 In his presentation, Vandroff showed a slide comparing the...foreign ballistic missile data in support of international treaty verification. CJR represents an integrated mission solution : ship, radar suite, and

  2. Employment of adaptive learning techniques for the discrimination of acoustic emissions

    NASA Astrophysics Data System (ADS)

    Erkes, J. W.; McDonald, J. F.; Scarton, H. A.; Tam, K. C.; Kraft, R. P.

    1983-11-01

    The following aspects of this study on the discrimination of acoustic emissions (AE) were examined: (1) The analytical development and assessment of digital signal processing techniques for AE signal dereverberation, noise reduction, and source characterization; (2) The modeling and verification of some aspects of key selected techniques through a computer-based simulation; and (3) The study of signal propagation physics and their effect on received signal characteristics for relevant physical situations.

  3. Geographic Information System Data Analysis

    NASA Technical Reports Server (NTRS)

    Billings, Chad; Casad, Christopher; Floriano, Luis G.; Hill, Tracie; Johnson, Rashida K.; Locklear, J. Mark; Penn, Stephen; Rhoulac, Tori; Shay, Adam H.; Taylor, Antone; hide

    1995-01-01

    Data was collected in order to further NASA Langley Research Center's Geographic Information System (GIS). Information on LaRC's communication, electrical, and facility configurations was collected. Existing data was corrected through verification, resulting in more accurate databases. In addition, Global Positioning System (GPS) points were used in order to accurately impose buildings on digitized images. Overall, this project will help the Imaging and CADD Technology Team (ICTT) prove GIS to be a valuable resource for LaRC.

  4. Design Language for Digital Systems

    NASA Technical Reports Server (NTRS)

    Shiva, S. G.

    1985-01-01

    Digital Systems Design Language (DDL) is a convenient hardware description language for developing and testing digital designs and for inputting design details into a design automation system. It describes digital systems at the gate, register transfer, and combinational block levels. DDL-based programs are written in FORTRAN IV for batch execution.

  5. Design and verification of distributed logic controllers with application of Petri nets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał

    2015-12-31

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.
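
    To make the verification step concrete, here is a minimal Python sketch of the kind of state-space check such tools automate: exhaustive reachability over the markings of a small place/transition net, confirming a mutual-exclusion property. The net and property are illustrative toys; the paper itself uses temporal logic and a model checker rather than this hand-rolled search.

        from collections import deque

        # Two processes sharing a lock; each transition is (preconditions, postconditions).
        transitions = {
            "enter1": ({"idle1": 1, "lock": 1}, {"crit1": 1}),
            "leave1": ({"crit1": 1}, {"idle1": 1, "lock": 1}),
            "enter2": ({"idle2": 1, "lock": 1}, {"crit2": 1}),
            "leave2": ({"crit2": 1}, {"idle2": 1, "lock": 1}),
        }

        def fire(marking, pre, post):
            m = dict(marking)
            for p, n in pre.items():
                m[p] -= n
            for p, n in post.items():
                m[p] = m.get(p, 0) + n
            return {p: n for p, n in m.items() if n}   # drop empty places: canonical form

        def find_violation(initial, bad):
            """Breadth-first search of reachable markings; None means the property holds."""
            seen, queue = set(), deque([initial])
            while queue:
                m = queue.popleft()
                key = frozenset(m.items())
                if key in seen:
                    continue
                seen.add(key)
                if bad(m):
                    return m
                for pre, post in transitions.values():
                    if all(m.get(p, 0) >= n for p, n in pre.items()):
                        queue.append(fire(m, pre, post))
            return None

        initial = {"idle1": 1, "idle2": 1, "lock": 1}
        bad = lambda m: m.get("crit1", 0) > 0 and m.get("crit2", 0) > 0
        print(find_violation(initial, bad))   # None: never both in the critical section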

  6. Design verification test matrix development for the STME thrust chamber assembly

    NASA Technical Reports Server (NTRS)

    Dexter, Carol E.; Elam, Sandra K.; Sparks, David L.

    1993-01-01

    This report presents the results of the test matrix development for design verification at the component level for the National Launch System (NLS) space transportation main engine (STME) thrust chamber assembly (TCA) components: the injector, combustion chamber, and nozzle. A systematic approach was used in developing the minimum recommended TCA matrix, resulting in a minimum number of hardware units and a minimum number of hot-fire tests.

  7. Design and verification of distributed logic controllers with application of Petri nets

    NASA Astrophysics Data System (ADS)

    Wiśniewski, Remigiusz; Grobelna, Iwona; Grobelny, Michał; Wiśniewska, Monika

    2015-12-01

    The paper deals with the design and verification of distributed logic controllers. The control system is initially modelled with Petri nets and formally verified against structural and behavioral properties with the application of temporal logic and the model checking technique. After that, it is decomposed into separate sequential automata that work concurrently. Each of them is re-verified, and if the validation is successful, the system can finally be implemented.

  8. Formal Verification of Complex Systems based on SysML Functional Requirements

    DTIC Science & Technology

    2014-12-23

    ...requirements for design of complex engineered systems. The proposed approach combines a SysML modeling approach to document and structure safety requirements...methods and tools to support the integration of safety into the design solution.

  9. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high-voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions, including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype-model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high-reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  10. Formal System Verification for Trustworthy Embedded Systems

    DTIC Science & Technology

    2011-04-19

    microkernel basis. We had previously achieved code-level formal verification of the seL4 microkernel [3]. In the present project, over 12 months with 0.6 FTE...project, we designed and implemented a secure network access device (SAC) on top of the verified seL4 microkernel. The device allows a trusted front...Engelhardt, Rafal Kolanski, Michael Norrish, Thomas Sewell, Harvey Tuch, and Simon Winwood. seL4: Formal verification of an OS kernel. CACM, 53(6):107

  11. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting.

    PubMed

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-11-01

    Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. Published by the BMJ Publishing Group Limited. For permission to use (where not already granted under a licence) please go to http://group.bmj.com/group/rights-licensing/permissions.
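
    The comparisons quoted above are straightforward to reproduce with scipy; a sketch for the syringe-volume task (16/18 errors pre-intervention vs. 11/19 post) follows. Whether the output matches the quoted p=0.038 exactly depends on the test variant and sidedness the authors used, so treat this as the analysis pattern rather than a replication.

        from scipy.stats import fisher_exact

        # 2x2 table: rows = pre/post intervention, columns = erred / did not err.
        table = [[16, 18 - 16],
                 [11, 19 - 11]]
        odds_ratio, p = fisher_exact(table, alternative="two-sided")
        print(f"odds ratio {odds_ratio:.2f}, two-sided p = {p:.3f}")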

  12. Mitigating errors caused by interruptions during medication verification and administration: interventions in a simulated ambulatory chemotherapy setting

    PubMed Central

    Prakash, Varuna; Koczmara, Christine; Savage, Pamela; Trip, Katherine; Stewart, Janice; McCurdie, Tara; Cafazzo, Joseph A; Trbovich, Patricia

    2014-01-01

    Background Nurses are frequently interrupted during medication verification and administration; however, few interventions exist to mitigate resulting errors, and the impact of these interventions on medication safety is poorly understood. Objective The study objectives were to (A) assess the effects of interruptions on medication verification and administration errors, and (B) design and test the effectiveness of targeted interventions at reducing these errors. Methods The study focused on medication verification and administration in an ambulatory chemotherapy setting. A simulation laboratory experiment was conducted to determine interruption-related error rates during specific medication verification and administration tasks. Interventions to reduce these errors were developed through a participatory design process, and their error reduction effectiveness was assessed through a postintervention experiment. Results Significantly more nurses committed medication errors when interrupted than when uninterrupted. With use of interventions when interrupted, significantly fewer nurses made errors in verifying medication volumes contained in syringes (16/18; 89% preintervention error rate vs 11/19; 58% postintervention error rate; p=0.038; Fisher's exact test) and programmed in ambulatory pumps (17/18; 94% preintervention vs 11/19; 58% postintervention; p=0.012). The rate of error commission significantly decreased with use of interventions when interrupted during intravenous push (16/18; 89% preintervention vs 6/19; 32% postintervention; p=0.017) and pump programming (7/18; 39% preintervention vs 1/19; 5% postintervention; p=0.017). No statistically significant differences were observed for other medication verification tasks. Conclusions Interruptions can lead to medication verification and administration errors. Interventions were highly effective at reducing unanticipated errors of commission in medication administration tasks, but showed mixed effectiveness at reducing predictable errors of detection in medication verification tasks. These findings can be generalised and adapted to mitigate interruption-related errors in other settings where medication verification and administration are required. PMID:24906806

  13. Definition of ground test for Large Space Structure (LSS) control verification

    NASA Technical Reports Server (NTRS)

    Waites, H. B.; Doane, G. B., III; Tollison, D. K.

    1984-01-01

    An overview for the definition of a ground test for the verification of Large Space Structure (LSS) control is given. The definition contains information on the description of the LSS ground verification experiment, the project management scheme, the design, development, fabrication and checkout of the subsystems, the systems engineering and integration, the hardware subsystems, the software, and a summary which includes future LSS ground test plans. Upon completion of these items, NASA/Marshall Space Flight Center will have an LSS ground test facility which will provide sufficient data on dynamics and control verification of LSS so that LSS flight system operations can be reasonably ensured.

  14. The Multiple Doppler Radar Workshop, November 1979.

    NASA Astrophysics Data System (ADS)

    Carbone, R. E.; Harris, F. I.; Hildebrand, P. H.; Kropfli, R. A.; Miller, L. J.; Moninger, W.; Strauch, R. G.; Doviak, R. J.; Johnson, K. W.; Nelson, S. P.; Ray, P. S.; Gilet, M.

    1980-10-01

    The findings of the Multiple Doppler Radar Workshop are summarized by a series of six papers. Part I of this series briefly reviews the history of multiple Doppler experimentation, fundamental concepts of Doppler signal theory, and organization and objectives of the Workshop. Invited presentations by dynamicists and cloud physicists are also summarized. Experimental design and procedures (Part II) are shown to be of critical importance. Well-defined and limited experimental objectives are necessary in view of technological limitations. Specified radar scanning procedures that balance temporal and spatial resolution considerations are discussed in detail. Improved siting for suppression of ground clutter as well as scanning procedures to minimize errors at echo boundaries are discussed. The need for accelerated research using numerically simulated proxy data sets is emphasized. New technology to eliminate various sampling limitations is cited as an eventual solution to many current problems in Part III. Ground clutter contamination may be curtailed by means of full spectral processing, digital filters in real time, and/or variable pulse repetition frequency. Range and velocity ambiguities also may be minimized by various pulsing options as well as random phase transmission. Sidelobe contamination can be reduced through improvements in radomes, illumination patterns, and antenna feed types. Radar volume-scan time can be sharply reduced by means of wideband transmission, phased array antennas, multiple beam antennas, and frequency agility. Part IV deals with synthesis of data from several radars in the context of scientific requirements in cumulus clouds, widespread precipitation, and severe convective storms. The important temporal and spatial scales are examined together with the accuracy required for vertical air motion in each phenomenon. Factors that introduce errors in the vertical velocity field are identified and synthesis techniques are discussed separately for the dual Doppler and multiple Doppler cases. Various filters and techniques, including statistical and variational approaches, are mentioned. Emphasis is placed on the importance of experiment design and procedures, technological improvements, incorporation of all information from supporting sensors, and analysis priority for physically simple cases. Integrated reliability is proposed as an objective tool for radar siting. Verification of multiple Doppler-derived vertical velocity is discussed in Part V. Three categories of verification are defined as direct, deductive, and theoretical/numerical. Direct verification consists of zenith-pointing radar measurements (from either airborne or ground-based systems), air motion sensing aircraft, instrumented towers, and tracking of radar chaff. Deductive sources include mesonetworks, aircraft (thermodynamic and microphysical) measurements, satellite observations, radar reflectivity, multiple Doppler consistency, and atmospheric soundings. Theoretical/numerical sources of verification include proxy data simulation, momentum checking, and numerical cloud models. New technology, principally in the form of wide bandwidth radars, is seen as a development that may reduce the need for extensive verification of multiple Doppler-derived vertical air motions. Airborne Doppler radar is perceived as the single most important source of verification within the bounds of existing technology. Nine stages of data processing and display are identified in Part VI. The stages are identified as field checks, archival, selection, editing, coordinate transformation, synthesis of Cartesian fields, filtering, display, and physical analysis. Display of data is considered to be a problem critical to assimilation of data at all stages. Interactive computing systems and software are concluded to be very important, particularly for the editing stage. Three- and 4-dimensional displays are considered essential for data assimilation, particularly at the physical analysis stage. The concept of common data tape formats is approved both for data in radar spherical space as well as for synthesized Cartesian output.

  15. The design and implementation of a windowing interface pinch force measurement system

    NASA Astrophysics Data System (ADS)

    Ho, Tze-Yee; Chen, Yuanu-Joan; Chung, Chin-Teng; Hsiao, Ming-Heng

    2010-02-01

    This paper presents a novel windowing interface pinch force measurement system based on a USB (Universal Serial Bus) microcontroller, which mainly processes the sensing data from the force sensing resistance (FSR) sensors mounted on the five digits. It possesses several friendly functions, such as displaying the value and curve trace of the force applied by a hand-injured patient in real time on a monitoring screen; consequently, not only can the physician easily evaluate the effect of hand injury rehabilitation, but the patients also become more engaged during hand physical therapy by interacting with the pinch force measurement screen. To facilitate the pinch force measurement system and make it user friendly, the detailed hardware design and software programming flowchart are described in this paper. Through a series of careful and detailed experimental tests, the relationship between the applied force and the FSR sensor response is first measured and verified. The different types of pinch force measurements are then verified with an oscilloscope and compared with the corresponding values and waveform traces in the windowing interface display panel to confirm consistency. Finally, a windowing interface pinch force measurement system based on the USB microcontroller is implemented and demonstrated. The experimental results show the feasibility of the designed system.
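
    A sketch of the sensing arithmetic such a system typically performs is given below: a raw ADC reading from an FSR voltage divider is converted to resistance and then to an estimated force through a power-law calibration. The supply voltage, divider resistor, and calibration constants are all hypothetical placeholders; a real system fits them per sensor against a reference load.

        def adc_to_force(adc, vcc=5.0, adc_max=1023, r_fixed=10_000.0, k=2.0e6, n=1.3):
            """Estimate force (N) from an ADC reading of an FSR voltage divider.

            Divider: Vout = Vcc * R_fixed / (R_fsr + R_fixed), with the FSR on the
            high side. The calibration R_fsr = k * F**(-n) is a hypothetical
            power-law fit, not a datasheet value."""
            v = adc * vcc / adc_max
            if v <= 0.0 or v >= vcc:
                return 0.0                      # no contact, or a saturated reading
            r_fsr = r_fixed * (vcc - v) / v     # solve the divider for the FSR resistance
            return (k / r_fsr) ** (1.0 / n)     # invert the power-law calibration

        print(f"{adc_to_force(512):.1f} N")     # mid-scale reading under the assumed fit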

  16. CMOS-TDI detector technology for reconnaissance application

    NASA Astrophysics Data System (ADS)

    Eckardt, Andreas; Reulke, Ralf; Jung, Melanie; Sengebusch, Karsten

    2014-10-01

    The Institute of Optical Sensor Systems (OS) at the Robotics and Mechatronics Center of the German Aerospace Center (DLR) has more than 30 years of experience with high-resolution imaging technology. This paper presents the institute's scientific results on the leading-edge detector design: CMOS in a TDI (Time Delay and Integration) architecture. This project includes the technological design of future high- or multi-spectral resolution spaceborne instruments and the possibility of higher integration. DLR OS and the Fraunhofer Institute for Microelectronic Circuits and Systems (IMS) in Duisburg have been driving the technology of new detectors and the FPA design for future projects, with new manufacturing accuracy and on-chip processing capability, in order to keep pace with the ambitious scientific and user requirements. In combination with the engineering research, the current generation of spaceborne sensor systems is focusing on VIS/NIR high spectral resolution to meet the requirements of earth and planetary observation systems. The combination of large swath and high spectral resolution with intelligent synchronization control, fast-readout ADC (analog-to-digital converter) chains, and new focal-plane concepts opens the door to new remote-sensing and smart deep-space instruments. The paper gives an overview of the detector development status and verification program at DLR, as well as of new control possibilities for CMOS-TDI detectors in synchronization control mode.

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, GROUNDWATER SAMPLING TECHNOLOGIES, GEOPROBE INC., PNEUMATIC BLADDER PUMP GW 1400 SERIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental techn... and to design efficient processes for conducting performance tests of innovative technologies.

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, GROUNDWATER SAMPLING TECHNOLOGIES, GEOPROBE INC, MECHANICAL BLADDER PUMP MODEL MP470

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental techn... and to design efficient processes for conducting performance tests of innovative technologies.

  19. Validation (not just verification) of Deep Space Missions

    NASA Technical Reports Server (NTRS)

    Duren, Riley M.

    2006-01-01

    Verification & Validation (V&V) is a widely recognized and critical systems engineering function. However, the often-used definition 'Verification proves the design is right; validation proves it is the right design' is rather vague. And while Verification is a reasonably well standardized systems engineering process, Validation is a far more abstract concept, and the rigor and scope applied to it vary widely between organizations and individuals. This is reflected in the findings in recent Mishap Reports for several NASA missions, in which shortfalls in Validation (not just Verification) were cited as root or contributing factors in catastrophic mission loss. Furthermore, although there is strong agreement in the community that Test is the preferred method for V&V, many people equate 'V&V' with 'Test', such that Analysis and Modeling aren't given comparable attention. Another strong motivator is the realization that the rapid growth in complexity of deep-space missions (particularly Planetary Landers and Space Observatories, given their inherent unknowns) is placing greater demands on systems engineers to 'get it right' with Validation.

  20. Is it Code Imperfection or 'Garbage In, Garbage Out'? Outline of Experiences from a Comprehensive ADR Code Verification

    NASA Astrophysics Data System (ADS)

    Zamani, K.; Bombardelli, F. A.

    2013-12-01

    The ADR (advection-diffusion-reaction) equation describes many physical phenomena of interest in the field of water quality in natural streams and groundwater. In many cases, such as density-driven flow, multiphase reactive transport, and sediment transport, one or more terms in the ADR equation may become nonlinear. For that reason, numerical tools are the only practical choice to solve these PDEs. All numerical solvers developed for the transport equation need to undergo a code verification procedure before they are put into practice. Code verification is a mathematical activity to uncover failures and check for rigorous discretization of PDEs and implementation of initial/boundary conditions. In the computational PDE context, verification is not a well-defined procedure with a clear path. Thus, verification tests should be designed and implemented with in-depth knowledge of the numerical algorithms and the physics of the phenomena, as well as the mathematical behavior of the solution. Even test results need to be mathematically analyzed to distinguish between an inherent limitation of an algorithm and a coding error. Therefore, it is well known that code verification is a state-of-the-art exercise, in which innovative methods and case-based tricks are very common. This study presents full verification of a general transport code. To that end, a complete test suite is designed to probe the ADR solver comprehensively and discover all possible imperfections. In this study we convey our experiences in finding several errors which were not detectable with routine verification techniques. We developed a test suite including hundreds of unit tests and system tests. The test package increases gradually in complexity, such that tests start simple and build up to the most sophisticated level. Appropriate verification metrics are defined for the required capabilities of the solver as follows: mass conservation, convergence order, capability in handling stiff problems, nonnegative concentration, shape preservation, and absence of spurious wiggles. Thereby, we provide objective, quantitative values as opposed to subjective qualitative descriptions such as 'weak' or 'satisfactory' agreement with those metrics. We start testing from a simple case of unidirectional advection, then bidirectional advection and tidal flow, and build up to nonlinear cases. We design tests to check nonlinearity in velocity, dispersivity, and reactions. For all of the mentioned cases we conduct mesh convergence tests. These tests compare the results' observed order of accuracy against the formal order of accuracy of the discretization. The concealing effect of scales (Peclet and Damkohler numbers) on the mesh convergence study and appropriate remedies are also discussed. For the cases in which appropriate benchmarks for a mesh convergence study are not available, we utilize symmetry, complete Richardson extrapolation, and the method of false injection to uncover bugs. Detailed discussions of the capabilities of the mentioned code verification techniques are given. Auxiliary subroutines for automation of the test suite and report generation were designed. All in all, the test package is not only a robust tool for code verification but also provides comprehensive insight into the ADR solver's capabilities. Such information is essential for any rigorous computational modeling of the ADR equation for surface/subsurface pollution transport.
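
    The mesh-convergence arithmetic this record describes is compact enough to sketch; below, observed order of accuracy and Richardson extrapolation are computed from error norms on successively refined grids, with manufactured numbers standing in for solver output.

        import numpy as np

        def observed_order(error_norms, r=2.0):
            """Observed order of accuracy from errors on grids refined by factor r.

            p = log(e_coarse / e_fine) / log(r); for a correct implementation this
            should approach the formal order of the discretization."""
            e = np.asarray(error_norms, dtype=float)
            return np.log(e[:-1] / e[1:]) / np.log(r)

        def richardson(f_coarse, f_fine, p, r=2.0):
            """Extrapolated 'benchmark' value when no exact solution is available."""
            return f_fine + (f_fine - f_coarse) / (r ** p - 1.0)

        # A second-order scheme should show p -> 2 under refinement:
        print(observed_order([1.0e-2, 2.6e-3, 6.5e-4]))   # approx. [1.94, 2.00]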

  1. A reuse-based framework for the design of analog and mixed-signal ICs

    NASA Astrophysics Data System (ADS)

    Castro-Lopez, Rafael; Fernandez, Francisco V.; Rodriguez Vazquez, Angel

    2005-06-01

    Despite the spectacular breakthroughs of the semiconductor industry, the ability to design integrated circuits (ICs) under stringent time-to-market (TTM) requirements is lagging behind integration capacity, which so far keeps pace with the still-valid Moore's Law. The resulting gap threatens to slow down this phenomenal growth. The design community believes that it is only by means of powerful CAD tools and design methodologies - and, possibly, a design paradigm shift - that this design gap can be bridged. In this sense, reuse-based design is seen as a promising solution, and concepts such as IP Block, Virtual Component, and Design Reuse have become commonplace thanks to the significant advances in the digital arena. Unfortunately, the very nature of analog and mixed-signal (AMS) design has hindered a similar level of consensus and development. This paper presents a framework for the reuse-based design of AMS circuits. The framework is founded on three key elements: (1) a CAD-supported hierarchical design flow that facilitates the incorporation of AMS reusable blocks, reduces the overall design time, and expedites the management of increasing AMS design complexity; (2) a complete, clear definition of the AMS reusable block, structured into three separate facets or views: the behavioral, structural, and layout facets, the first two for top-down electrical synthesis and bottom-up verification, the latter used during bottom-up physical synthesis; (3) a design-for-reusability set of tools, methods, and guidelines that, relying on intensive parameterization as well as on design knowledge capture and encapsulation, allows fully reusable AMS blocks to be produced. A case study and a functional silicon prototype demonstrate the validity of the paper's proposals.

  2. Hand Grasping Synergies As Biometrics

    PubMed Central

    Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K.; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana

    2017-01-01

    Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. This includes iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement that integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies—postural synergies. In this proof of concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security. PMID:28512630
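
    The two computational steps named in this record are easy to outline: synergy extraction by principal component analysis over joint-angular-velocity profiles, and verification scoring by equal error rate. The array shapes and the threshold sweep below are assumptions for illustration, not the authors' exact pipeline.

        import numpy as np
        from sklearn.decomposition import PCA

        def extract_synergies(trials, n_synergies=4):
            """trials: (n_trials, n_samples * n_joints) flattened velocity profiles.
            Returns (synergies, per-trial recruitment weights)."""
            pca = PCA(n_components=n_synergies)
            weights = pca.fit_transform(trials)
            return pca.components_, weights

        def equal_error_rate(genuine, impostor):
            """EER from match scores, where higher means more similar."""
            thresholds = np.sort(np.concatenate([genuine, impostor]))
            far = np.array([(impostor >= t).mean() for t in thresholds])  # false accepts
            frr = np.array([(genuine < t).mean() for t in thresholds])    # false rejects
            i = int(np.argmin(np.abs(far - frr)))
            return (far[i] + frr[i]) / 2.0

        rng = np.random.default_rng(0)
        print(equal_error_rate(rng.normal(1.0, 0.5, 200), rng.normal(0.0, 0.5, 2000)))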

  3. Validation of a SysML based design for wireless sensor networks

    NASA Astrophysics Data System (ADS)

    Berrachedi, Amel; Rahim, Messaoud; Ioualalen, Malika; Hammad, Ahmed

    2017-07-01

    When developing complex systems, verifying the system design is one of the main challenges. Wireless Sensor Networks (WSNs) are examples of such systems. We address the problem of how WSNs must be designed to fulfil the system requirements. Using the SysML language, we propose a Model-Based System Engineering (MBSE) specification and verification methodology for designing WSNs. This methodology uses SysML to describe the WSN requirements, structure, and behaviour. Then, it translates the SysML elements into an analytic model, specifically a Deterministic and Stochastic Petri Net. The proposed approach allows designing WSNs and studying their behaviour and energy performance.

  4. Design of the software development and verification system (SWDVS) for shuttle NASA study task 35

    NASA Technical Reports Server (NTRS)

    Drane, L. W.; Mccoy, B. J.; Silver, L. W.

    1973-01-01

    An overview of the Software Development and Verification System (SWDVS) for the space shuttle is presented. The design considerations, goals, assumptions, and major features of the design are examined. A scenario that shows three persons involved in flight software development using the SWDVS in response to a program change request is developed. The SWDVS is described from the standpoint of different groups of people with different responsibilities in the shuttle program to show the functional requirements that influenced the SWDVS design. The software elements of the SWDVS that satisfy the requirements of the different groups are identified.

  5. Design Principles for the Information Architecture of a SMET Education Digital Library.

    ERIC Educational Resources Information Center

    Dong, Andy; Agogino, Alice M.

    This implementation paper introduces principles for the information architecture of an educational digital library, principles that address the distinction between designing digital libraries for education and designing digital libraries for information retrieval in general. Design is a key element of any successful product. Good designers and…

  6. Feasibility of biochemical verification in a web-based smoking cessation study.

    PubMed

    Cha, Sarah; Ganz, Ollie; Cohn, Amy M; Ehlke, Sarah J; Graham, Amanda L

    2017-10-01

    Cogent arguments have been made against the need for biochemical verification in population-based studies with low-demand characteristics. Despite this fact, studies involving digital interventions (low-demand) are often required in peer review to report biochemically verified abstinence. To address this discrepancy, we examined the feasibility and costs of biochemical verification in a web-based study conducted with a national sample. Participants were 600 U.S. adult current smokers who registered on a web-based smoking cessation program and completed surveys at baseline and 3 months. Saliva sampling kits were sent to participants who reported 7-day abstinence at 3 months, and analyzed for cotinine. The response rate at 3 months was 41.2% (n=247): 93 participants reported 7-day abstinence (38%) and were mailed a saliva kit (71% returned). The discordance rate was 36.4%. Participants with discordant responses were more likely to report 3-month use of nicotine replacement therapy or e-cigarettes than those with concordant responses (79.2% vs. 45.2%, p=0.007). The total cost of saliva sampling was $8280 ($125/sample). Biochemical verification was both time- and cost-intensive, and yielded a relatively small number of samples due to low response rates and use of other nicotine products during the follow-up period. There was a high rate of discordance between self-reported abstinence and saliva testing. Costs for data collection may be prohibitive for studies with large sample sizes or limited budgets. Our findings echo previous statements that biochemical verification is not necessary in population-based studies, and add evidence specific to technology-based studies. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. FPGA wavelet processor design using language for instruction-set architectures (LISA)

    NASA Astrophysics Data System (ADS)

    Meyer-Bäse, Uwe; Vera, Alonzo; Rao, Suhasini; Lenk, Karl; Pattichis, Marios

    2007-04-01

    The design of a microprocessor is a long, tedious, and error-prone task, typically consisting of four design phases: architecture exploration, software design (assembler, linker, loader, profiler), architecture implementation (RTL generation for FPGA or cell-based ASIC), and verification. The Language for Instruction-Set Architectures (LISA) allows one to model a microprocessor not only from the instruction-set description but also from the architecture description, including pipelining behavior, which provides design and development tool consistency over all levels of the design. To explore the capability of the LISA processor design platform, a.k.a. CoWare Processor Designer, we present in this paper three microprocessor designs that implement an 8/8 wavelet transform processor of the kind typically used in today's FBI fingerprint compression scheme. We have designed a 3-stage pipelined 16-bit RISC processor (NanoBlaze). Although RISC μPs are usually considered "fast" processors due to design concepts like constant instruction word size, deep pipelines, and many general-purpose registers, it turns out that DSP operations consume substantial processing time in a RISC processor. In a second step we used design principles from programmable digital signal processors (PDSPs) to improve the throughput of the DWT processor. A multiply-accumulate operation, along with indirect addressing, was the key to achieving higher throughput. A further improvement is possible with today's FPGA technology. Today's FPGAs offer a large number of embedded array multipliers, and it is now feasible to design a "true" vector processor (TVP). A multiplication of two vectors can be done in just one clock cycle with our TVP, and a complete scalar product in two clock cycles. Code profiling and Xilinx FPGA ISE synthesis results are provided that demonstrate the substantial improvement a TVP offers over traditional RISC or PDSP designs.
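
    To make the workload concrete, here is one analysis level of a two-channel filter-bank DWT written as the explicit multiply-accumulate loops that dominate a RISC core's instruction count, and that a single-cycle MAC (PDSP) or a vector dot product (TVP) collapses. The Haar filter pair is used for brevity; the paper's 8/8 filter coefficients are not given in the abstract.

        import numpy as np

        def dwt_level(x, h_lo, h_hi):
            """One analysis level: filter and downsample by 2, with periodic extension.

            Every output sample is an inner product, i.e. a chain of
            multiply-accumulate operations."""
            n, taps = len(x), len(h_lo)
            lo = np.empty(n // 2)
            hi = np.empty(n // 2)
            for k in range(n // 2):
                acc_lo = acc_hi = 0.0
                for j in range(taps):               # the MAC inner loop
                    s = x[(2 * k + j) % n]
                    acc_lo += h_lo[j] * s
                    acc_hi += h_hi[j] * s
                lo[k], hi[k] = acc_lo, acc_hi
            return lo, hi

        c = 1.0 / np.sqrt(2.0)
        lo, hi = dwt_level(np.arange(8.0), [c, c], [c, -c])
        print(lo, hi)   # smooth averages and details of the ramp signal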

  8. Precision segmented reflector, figure verification sensor

    NASA Technical Reports Server (NTRS)

    Manhart, Paul K.; Macenka, Steve A.

    1989-01-01

    The Precision Segmented Reflector (PSR) program currently under way at the Jet Propulsion Laboratory is a test bed and technology demonstration program designed to develop and study the structural and material technologies required for lightweight, precision segmented reflectors. A Figure Verification Sensor (FVS), designed to monitor the active control system of the segments, is described; a best-fit surface is defined; and the image or wavefront quality of the assembled array of reflecting panels is assessed.

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: IN-DRAIN TREATMENT DEVICE. HYDRO INTERNATIONAL UP-FLO™ FILTER

    EPA Science Inventory

    Verification testing of the Hydro International Up-Flo™ Filter with one filter module and CPZ Mix™ filter media was conducted at the Penn State Harrisburg Environmental Engineering Laboratory in Middletown, Pennsylvania. The Up-Flo™ Filter is designed as a passive, modular filtr...

  10. Statistical Design for Biospecimen Cohort Size in Proteomics-based Biomarker Discovery and Verification Studies

    PubMed Central

    Skates, Steven J.; Gillette, Michael A.; LaBaer, Joshua; Carr, Steven A.; Anderson, N. Leigh; Liebler, Daniel C.; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L.; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R.; Rodriguez, Henry; Boja, Emily S.

    2014-01-01

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC), with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance, and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step towards building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research. PMID:24063748

  11. Statistical design for biospecimen cohort size in proteomics-based biomarker discovery and verification studies.

    PubMed

    Skates, Steven J; Gillette, Michael A; LaBaer, Joshua; Carr, Steven A; Anderson, Leigh; Liebler, Daniel C; Ransohoff, David; Rifai, Nader; Kondratovich, Marina; Težak, Živana; Mansfield, Elizabeth; Oberg, Ann L; Wright, Ian; Barnes, Grady; Gail, Mitchell; Mesri, Mehdi; Kinsinger, Christopher R; Rodriguez, Henry; Boja, Emily S

    2013-12-06

    Protein biomarkers are needed to deepen our understanding of cancer biology and to improve our ability to diagnose, monitor, and treat cancers. Important analytical and clinical hurdles must be overcome to allow the most promising protein biomarker candidates to advance into clinical validation studies. Although contemporary proteomics technologies support the measurement of large numbers of proteins in individual clinical specimens, sample throughput remains comparatively low. This problem is amplified in typical clinical proteomics research studies, which routinely suffer from a lack of proper experimental design, resulting in analysis of too few biospecimens to achieve adequate statistical power at each stage of a biomarker pipeline. To address this critical shortcoming, a joint workshop was held by the National Cancer Institute (NCI), National Heart, Lung, and Blood Institute (NHLBI), and American Association for Clinical Chemistry (AACC) with participation from the U.S. Food and Drug Administration (FDA). An important output from the workshop was a statistical framework for the design of biomarker discovery and verification studies. Herein, we describe the use of quantitative clinical judgments to set statistical criteria for clinical relevance and the development of an approach to calculate biospecimen sample size for proteomic studies in discovery and verification stages prior to clinical validation stage. This represents a first step toward building a consensus on quantitative criteria for statistical design of proteomics biomarker discovery and verification research.
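
    As a flavor of the sample-size arithmetic such a framework builds on, below is the classic normal-approximation calculation for the number of biospecimens per group needed to detect a standardized mean difference; the workshop's framework layers pipeline-stage-specific criteria (discovery vs. verification) on top of calculations of this kind.

        import math
        from scipy.stats import norm

        def n_per_group(effect_size, alpha=0.05, power=0.80):
            """Specimens per group for a two-sided two-sample comparison.

            effect_size: Cohen's d (mean difference / common SD).
            n = 2 * (z_{1-alpha/2} + z_{power})^2 / d^2"""
            z_a = norm.ppf(1 - alpha / 2)
            z_b = norm.ppf(power)
            return math.ceil(2 * (z_a + z_b) ** 2 / effect_size ** 2)

        print(n_per_group(0.5))   # 63 under this approximation (a t-test correction gives ~64)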

  12. Practicing universal design to actual hand tool design process.

    PubMed

    Lin, Kai-Chieh; Wu, Chih-Fu

    2015-09-01

    UD evaluation principles are difficult to implement in product design. This study proposes a methodology for implementing UD in the design process through user participation. The original UD principles and user experience are used to develop the evaluation items. Differences between product types were considered. Factor analysis and Quantification Theory Type I were used to eliminate evaluation items considered inappropriate and to examine the relationship between evaluation items and product design factors. Product design specifications were established for verification. The results showed that converting user evaluations into crucial design verification factors through a generalized evaluation scale based on product attributes, and applying those design factors in product design, can improve users' UD evaluation. The design process of this study is expected to contribute to user-centered UD application. Copyright © 2015 Elsevier Ltd and The Ergonomics Society. All rights reserved.

  13. Shuttle payload interface verification equipment study. Volume 2: Technical document, part 1

    NASA Technical Reports Server (NTRS)

    1976-01-01

    This report documents the technical analysis performed during the shuttle payload interface verification equipment study. It describes: (1) the background and intent of the study; (2) the study approach and philosophy, covering all facets of shuttle payload/cargo integration; (3) shuttle payload integration requirements; (4) the preliminary design of the horizontal IVE; (5) the vertical IVE concept; and (6) IVE program development plans, schedule, and cost. Also included is a payload integration analysis task to identify potential uses in addition to payload interface verification.

  14. Formal Methods for Life-Critical Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Johnson, Sally C.

    1993-01-01

    The use of computer software in life-critical applications, such as for civil air transports, demands the use of rigorous formal mathematical verification procedures. This paper demonstrates how to apply formal methods to the development and verification of software by leading the reader step-by-step through requirements analysis, design, implementation, and verification of an electronic phone book application. The current maturity and limitations of formal methods tools and techniques are then discussed, and a number of examples of the successful use of formal methods by industry are cited.

  15. Applying Monte Carlo Simulation to Launch Vehicle Design and Requirements Verification

    NASA Technical Reports Server (NTRS)

    Hanson, John M.; Beard, Bernard B.

    2010-01-01

    This paper is focused on applying Monte Carlo simulation to probabilistic launch vehicle design and requirements verification. The approaches developed in this paper can be applied to other complex design efforts as well. Typically the verification must show that requirement "x" is met for at least "y" % of cases, with, say, 10% consumer risk or 90% confidence. Two particular aspects of making these runs for requirements verification will be explored in this paper. First, there are several types of uncertainties that should be handled in different ways, depending on when they become known (or not). The paper describes how to handle different types of uncertainties and how to develop vehicle models that can be used to examine their characteristics. This includes items that are not known exactly during the design phase but that will be known for each assembled vehicle (can be used to determine the payload capability and overall behavior of that vehicle), other items that become known before or on flight day (can be used for flight day trajectory design and go/no go decision), and items that remain unknown on flight day. Second, this paper explains a method (order statistics) for determining whether certain probabilistic requirements are met or not and enables the user to determine how many Monte Carlo samples are required. Order statistics is not new, but may not be known in general to the GN&C community. The methods also apply to determining the design values of parameters of interest in driving the vehicle design. The paper briefly discusses when it is desirable to fit a distribution to the experimental Monte Carlo results rather than using order statistics.
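
    The order-statistics result referred to above has a compact zero-failure (and k-failure) form: find the smallest number of Monte Carlo runs n such that observing at most k requirement violations still demonstrates, at the stated confidence, that at least a fraction y of cases meet the requirement. A scipy sketch follows; the y and confidence values below are illustrative of the paper's 'y% with 90% confidence' pattern, not specific program numbers.

        from scipy.stats import binom

        def runs_needed(y=0.99, confidence=0.90, k=0):
            """Smallest n with P(at most k failures | failure prob = 1 - y) <= 1 - confidence.

            Passing the test (<= k observed failures in n runs) then demonstrates the
            requirement is met for at least fraction y of cases at the given
            confidence (one-sided binomial / order-statistics bound)."""
            n = k + 1
            while binom.cdf(k, n, 1.0 - y) > 1.0 - confidence:
                n += 1
            return n

        print(runs_needed())        # 230 runs with zero failures allowed
        print(runs_needed(k=1))     # allowing one failure costs more runs (388)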

  16. Object-oriented approach to the automatic segmentation of bones from pediatric hand radiographs

    NASA Astrophysics Data System (ADS)

    Shim, Hyeonjoon; Liu, Brent J.; Taira, Ricky K.; Hall, Theodore R.

    1997-04-01

    The purpose of this paper is to develop a robust and accurate method that automatically segments phalangeal and epiphyseal bones from digital pediatric hand radiographs exhibiting various stages of growth. The development of this system draws principles from object-oriented design, model-guided analysis, and feedback control. A system architecture called 'the object segmentation machine' was implemented incorporating these design philosophies. The system is aided by a knowledge base where all model contours and other information, such as age, race, and sex, are stored. These models include object structure models, shape models, 1-D wrist profiles, and gray-level histogram models. Shape analysis is performed first by using an arc-length orientation transform to break down a given contour into elementary segments and curves. Then an interpretation tree is used as an inference engine to map known model contour segments to data contour segments obtained from the transform. Spatial and anatomical relationships among contour segments serve as constraints from the shape model. These constraints aid in generating a list of candidate matches. The candidate match with the highest confidence is chosen as the current intermediate result. Verification of intermediate results is performed by a feedback control loop.

  17. First-Order SPICE Modeling of Extreme-Temperature 4H-SiC JFET Integrated Circuits

    NASA Technical Reports Server (NTRS)

    Neudeck, Philip G.; Spry, David J.; Chen, Liang-Yu

    2016-01-01

    A separate submission to this conference reports that 4H-SiC Junction Field Effect Transistor (JFET) digital and analog Integrated Circuits (ICs) with two levels of metal interconnect have reproducibly demonstrated electrical operation at 500 C in excess of 1000 hours. While this progress expands the complexity and durability envelope of high temperature ICs, one important area for further technology maturation is the development of reasonably accurate and accessible computer-aided modeling and simulation tools for circuit design of these ICs. Towards this end, we report on development and verification of 25 C to 500 C SPICE simulation models of first order accuracy for this extreme-temperature durable 4H-SiC JFET IC technology. For maximum availability, the JFET IC modeling is implemented using the baseline-version SPICE NMOS LEVEL 1 model that is common to other variations of SPICE software and importantly includes the body-bias effect. The first-order accuracy of these device models is verified by direct comparison with measured experimental device characteristics.
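
    For readers unfamiliar with it, the baseline NMOS LEVEL 1 (Shichman-Hodges) drain-current equations with the body-bias effect that this modeling approach relies on can be sketched as follows. The parameter defaults below are illustrative placeholders, not the paper's fitted 4H-SiC JFET values.

        from math import sqrt

        def level1_id(vgs, vds, vbs, vto=-9.0, kp=2e-6, w_l=10.0,
                      lam=0.01, gamma=0.5, phi=0.7):
            """First-order SPICE NMOS LEVEL 1 drain current (A). The body-bias
            effect shifts the threshold:
            VT = VTO + GAMMA * (sqrt(PHI - VBS) - sqrt(PHI))."""
            vt = vto + gamma * (sqrt(max(phi - vbs, 0.0)) - sqrt(phi))
            vov = vgs - vt
            if vov <= 0.0:                       # cutoff
                return 0.0
            if vds < vov:                        # linear (triode) region
                return kp * w_l * (vov - vds / 2.0) * vds * (1.0 + lam * vds)
            return 0.5 * kp * w_l * vov**2 * (1.0 + lam * vds)   # saturation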

  18. Note: An improved calibration system with phase correction for electronic transformers with digital output

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cheng, Han-miao, E-mail: chenghanmiao@hust.edu.cn; Li, Hong-bin, E-mail: lihongbin@hust.edu.cn; State Key Laboratory of Advanced Electromagnetic Engineering and Technology, Wuhan 430074

The existing electronic transformer calibration systems employing data acquisition cards cannot satisfy some practical applications, because the calibration systems exhibit phase measurement errors when they work in the mode of receiving external synchronization signals. This paper proposes an improved calibration system scheme with phase correction to improve the phase measurement accuracy. We employ an NI PCI-4474 to design a calibration system, and the system has the potential to receive external synchronization signals and reach extremely high accuracy classes. Accuracy verification has been carried out at the China Electric Power Research Institute, and the results demonstrate that the system surpasses accuracy class 0.05. Furthermore, this system has been used to test the harmonics measurement accuracy of all-fiber optical current transformers. In the same process, we used an existing calibration system, and a comparison of the test results is presented. The improved system is suitable for the intended applications.

  19. Missing data reconstruction using Gaussian mixture models for fingerprint images

    NASA Astrophysics Data System (ADS)

    Agaian, Sos S.; Yeole, Rushikesh D.; Rao, Shishir P.; Mulawka, Marzena; Troy, Mike; Reinecke, Gary

    2016-05-01

One of the most important areas in biometrics is matching partial fingerprints in fingerprint databases. Recently, significant progress has been made in designing fingerprint identification systems for missing fingerprint information. However, dependable reconstruction of fingerprint images remains challenging due to the complexity and the ill-posed nature of the problem. In this article, both binary and gray-level images are reconstructed. This paper also presents a new similarity score to evaluate the performance of the reconstructed binary image. The proposed fingerprint image identification system can be automated and extended to numerous other security applications such as postmortem fingerprints, forensic science, investigations, artificial intelligence, robotics, access control, and financial security, as well as the verification of firearm purchasers, driver license applicants, etc.
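
    The general Gaussian-mixture reconstruction idea can be sketched as follows (a schematic illustration, not the authors' pipeline): fit a GMM to fully observed patch vectors, then replace the missing entries of an incomplete patch with the conditional mean of the most responsible mixture component.

        import numpy as np
        from sklearn.mixture import GaussianMixture

        def gmm_impute(train_patches, patch, observed_mask, n_components=8):
            """Fill the missing entries of `patch` using a GMM fit on fully
            observed training vectors of the same length."""
            gmm = GaussianMixture(n_components=n_components,
                                  covariance_type='full').fit(train_patches)
            o = observed_mask          # boolean: True where the pixel is known
            m = ~o
            # Pick the component most likely to have produced the observed part.
            best, best_ll = 0, -np.inf
            for k in range(n_components):
                mu, cov = gmm.means_[k], gmm.covariances_[k]
                diff = patch[o] - mu[o]
                cov_oo = cov[np.ix_(o, o)]
                ll = (-0.5 * diff @ np.linalg.solve(cov_oo, diff)
                      - 0.5 * np.linalg.slogdet(cov_oo)[1]
                      + np.log(gmm.weights_[k]))
                if ll > best_ll:
                    best, best_ll = k, ll
            mu, cov = gmm.means_[best], gmm.covariances_[best]
            # Conditional mean: mu_m + Sigma_mo Sigma_oo^{-1} (x_o - mu_o)
            filled = patch.copy()
            filled[m] = mu[m] + cov[np.ix_(m, o)] @ np.linalg.solve(
                cov[np.ix_(o, o)], patch[o] - mu[o])
            return filled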

  20. Development of an Ion Thruster and Power Processor for New Millennium's Deep Space 1 Mission

    NASA Technical Reports Server (NTRS)

    Sovey, James S.; Hamley, John A.; Haag, Thomas W.; Patterson, Michael J.; Pencil, Eric J.; Peterson, Todd T.; Pinero, Luis R.; Power, John L.; Rawlin, Vincent K.; Sarmiento, Charles J.

    1997-01-01

    The NASA Solar Electric Propulsion Technology Applications Readiness Program (NSTAR) will provide a single-string primary propulsion system to NASA's New Millennium Deep Space 1 Mission which will perform comet and asteroid flybys in the years 1999 and 2000. The propulsion system includes a 30-cm diameter ion thruster, a xenon feed system, a power processing unit, and a digital control and interface unit. A total of four engineering model ion thrusters, three breadboard power processors, and a controller have been built, integrated, and tested. An extensive set of development tests has been completed along with thruster design verification tests of 2000 h and 1000 h. An 8000 h Life Demonstration Test is ongoing and has successfully demonstrated more than 6000 h of operation. In situ measurements of accelerator grid wear are consistent with grid lifetimes well in excess of the 12,000 h qualification test requirement. Flight hardware is now being assembled in preparation for integration, functional, and acceptance tests.

  1. Transient analysis of an HTS DC power cable with an HVDC system

    NASA Astrophysics Data System (ADS)

    Dinh, Minh-Chau; Ju, Chang-Hyeon; Kim, Jin-Geun; Park, Minwon; Yu, In-Keun; Yang, Byeongmo

    2013-11-01

The operational characteristics of a superconducting DC power cable connected to a high-voltage direct current (HVDC) system are governed mainly by the HVDC control and protection system. To confirm how the cable operates with the HVDC system, verification using simulation tools is needed. This paper presents a transient analysis of a high temperature superconducting (HTS) DC power cable in connection with an HVDC system. The study was conducted via simulation of the HVDC system and a developed model of the HTS DC power cable using a real-time digital simulator (RTDS). The simulation covered several short-circuit cases that could have caused system damage. The simulation results show that during the faults the HTS DC power cable did not quench, because the HVDC controller reduced the fault current to some degree. These results could provide useful data for the protection design of a practical HVDC and HTS DC power cable system.

  2. Formal Verification of Digital Logic

    DTIC Science & Technology

    1991-12-01

The INVERT circuit was based upon VHDL code provided in the Zycad Reference Manual [32:Ch 10,73]. The other circuits were based upon VHDL code written by Capt Dave Banton. From HALFADD.PL: /* This file implements a simple half-adder that is built from inverters and 2-input NAND gates. It is based upon a Zycad VHDL file written by Capt Dave Banton, which is attached below the Prolog code. */
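
    Because a half-adder has a finite input space, the kind of gate-level verification described in this record can be illustrated by exhaustive equivalence checking against the arithmetic specification. This is a hedged Python rendering of the idea, not the report's Prolog or VHDL.

        def nand(a, b): return not (a and b)
        def inv(a):     return not a

        def half_adder(a, b):
            """Half-adder from INV/NAND2 only: sum = a XOR b, carry = a AND b."""
            n1 = nand(a, b)
            s  = nand(nand(a, n1), nand(b, n1))   # XOR built from four NAND gates
            c  = inv(n1)                          # AND = INV(NAND)
            return s, c

        # Specification: sum and carry are the two bits of a + b.
        for a in (False, True):
            for b in (False, True):
                s, c = half_adder(a, b)
                assert (int(c) << 1) + int(s) == int(a) + int(b)
        print("half-adder verified over all input combinations")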

  3. Wetland delineation with IKONOS high-resolution satellite imagery, Fort Custer Training Center, Battle Creek, Michigan, 2005

    USGS Publications Warehouse

    Fuller, L.M.; Morgan, T.R.; Aichele, Stephen S.

    2006-01-01

The Michigan Army National Guard’s Fort Custer Training Center (FCTC) in Battle Creek, Mich., has the responsibility to protect wetland resources on the training grounds while providing training opportunities and planning for future development at the facility. The National Wetlands Inventory (NWI) data have been the primary wetland-boundary resource, but a check on the scale and accuracy of the wetland boundary information for the Fort Custer Training Center was needed. In cooperation with the FCTC, the U.S. Geological Survey (USGS) used an early-spring IKONOS pan-sharpened satellite image to delineate the wetlands and create a more accurate wetland map for the FCTC. The USGS tested automated approaches (supervised and unsupervised classifications) to identify the wetland areas from the IKONOS satellite image, but the automated approaches alone did not yield accurate results. To ensure accurate wetland boundaries, the final wetland map was manually digitized on the basis of the automated supervised and unsupervised classifications, in combination with NWI data, field verifications, and visual interpretation of the IKONOS satellite image. The final wetland areas digitized from the IKONOS satellite imagery were similar to those in the NWI; however, the wetland boundaries differed in some areas, a few wetlands mapped in the NWI were determined from the IKONOS image and field verification not to be wetlands, and additional previously unmapped wetlands not recognized by the NWI were identified from the IKONOS image.

  4. ERTS-1 data applied to strip mining

    NASA Technical Reports Server (NTRS)

    Anderson, A. T.; Schubert, J.

    1976-01-01

    Two coal basins within the western region of the Potomac River Basin contain the largest strip-mining operations in western Maryland and West Virginia. The disturbed strip-mine areas were delineated along with the surrounding geological and vegetation features by using ERTS-1 data in both analog and digital form. The two digital systems employed were (1) the ERTS analysis system, a point-by-point digital analysis of spectral signatures based on known spectral values and (2) the LARS automatic data processing system. These two systems aided in efforts to determine the extent and state of strip mining in this region. Aircraft data, ground-verification information, and geological field studies also aided in the application of ERTS-1 imagery to perform an integrated analysis that assessed the adverse effects of strip mining. The results indicated that ERTS can both monitor and map the extent of strip mining to determine immediately the acreage affected and to indicate where future reclamation and revegetation may be necessary.

  5. Detection of aspen-conifer forest mixes from LANDSAT digital data. [Utah-Idaho Bear River Range

    NASA Technical Reports Server (NTRS)

    Jaynes, R. A.; Merola, J. A.

    1982-01-01

    Aspen, conifer and mixed aspen/conifer forests were mapped for a 15-quadrangle study area in the Utah-Idaho Bear River Range using LANDSAT multispectral scanner data. Digital classification and statistical analysis of LANDSAT data allowed the identification of six groups of signatures which reflect different types of aspen/conifer forest mixing. Photo interpretations of the print symbols suggest that such classes are indicative of mid to late seral aspen forests. Digital print map overlays and acreage calculations were prepared for the study area quadrangles. Further field verification is needed to acquire additional information about the nature of the forests. Single date LANDSAT analysis should be a cost effective means to index aspen forests which are at least in the mid seral phase of conifer invasion. Since aspen canopies tend to obscure understory conifers for early seral forests, a second date analysis, using data taken when aspens are leafless, could provide information about early seral aspen forests.

  6. Breast Mass Detection in Digital Mammogram Based on Gestalt Psychology

    PubMed Central

    Bu, Qirong; Liu, Feihong; Zhang, Min; Ren, Yu; Lv, Yi

    2018-01-01

Inspired by gestalt psychology, we combine human cognitive characteristics with the knowledge of radiologists in medical image analysis. In this paper, a novel framework is proposed to detect breast masses in digitized mammograms. It can be divided into three modules: sensation integration, semantic integration, and verification. After analyzing the process of radiologists' mammography screening, a series of visual rules based on the morphological characteristics of breast masses are presented and quantified by mathematical methods. The framework can be seen as an effective trade-off between bottom-up sensation and top-down recognition methods. This is a new exploratory method for the automatic detection of lesions. The experiments are performed on the Mammographic Image Analysis Society (MIAS) and Digital Database for Screening Mammography (DDSM) data sets. The sensitivity reached 92% at 1.94 false positives per image (FPI) on MIAS and 93.84% at 2.21 FPI on DDSM. Our framework achieves better performance than other algorithms. PMID:29854359

  7. Digital Pharmacovigilance and Disease Surveillance: Combining Traditional and Big-Data Systems for Better Public Health.

    PubMed

    Salathé, Marcel

    2016-12-01

The digital revolution has contributed to very large data sets (ie, big data) relevant for public health. The two major data sources are electronic health records from traditional health systems and patient-generated data. As the two data sources have complementary strengths (high veracity in the data from traditional sources; high velocity and variety in patient-generated data), they can be combined to build more robust public health systems. However, they also have unique challenges. Patient-generated data in particular are often completely unstructured and highly context dependent, posing essentially a machine-learning challenge. Some recent examples from infectious disease surveillance and adverse drug event monitoring demonstrate that the technical challenges can be solved. Despite these advances, the problem of verification remains, and unless traditional and digital epidemiologic approaches are combined, these data sources will be constrained by their intrinsic limits.

  8. Predicted and tested performance of durable TPS

    NASA Technical Reports Server (NTRS)

    Shideler, John L.

    1992-01-01

    The development of thermal protection systems (TPS) for aerospace vehicles involves combining material selection, concept design, and verification tests to evaluate the effectiveness of the system. The present paper reviews verification tests of two metallic and one carbon-carbon thermal protection system. The test conditions are, in general, representative of Space Shuttle design flight conditions which may be more or less severe than conditions required for future space transportation systems. The results of this study are intended to help establish a preliminary data base from which the designers of future entry vehicles can evaluate the applicability of future concepts to their vehicles.

  9. User-friendly design approach for analog layout design

    NASA Astrophysics Data System (ADS)

    Li, Yongfu; Lee, Zhao Chuan; Tripathi, Vikas; Perez, Valerio; Ong, Yoong Seang; Hui, Chiu Wing

    2017-03-01

Analog circuits are sensitive to changes in layout environment conditions, manufacturing processes, and variations. This paper presents an analog verification flow with five types of analog-focused layout constraint checks to assist engineers in identifying potential device mismatches and layout drawing mistakes. Compared to other solutions, our approach requires only the layout design, which is sufficient to recognize all the matched devices. Our approach simplifies data preparation and allows seamless integration into the layout environment with minimum disruption to the custom layout flow. This user-friendly analog verification flow gives engineers more confidence in the quality of their layouts.

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, SWINE WASTE ELECTRIC POWER AND HEAT PRODUCTION--MARTIN MACHINERY INTERNAL COMBUSTION ENGINE

    EPA Science Inventory

    Under EPA’s Environmental Technology Verification program, which provides objective and scientific third party analysis of new technology that can benefit the environment, a combined heat and power system designed by Martin Machinery was evaluated. This paper provides test result...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, HVLP COATING EQUIPMENT, SHARPE MANUFACTURING COMPANY PLATINUM 2012 HVLP SPRAY GUN

    EPA Science Inventory

This report presents the results of the verification test of the Sharpe Platinum 2012 high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the Sharpe Platinum, which is designed for use in automotive refinishing. The test coating chosen by Sharpe Manufacturi...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE - VORTECHNICS INC., VORTECHS® SYSTEM, MODEL 1000

    EPA Science Inventory

Verification testing of the Vortechnics, Inc. Vortechs® System, Model 1000 was conducted on a 0.25 acre portion of an elevated highway near downtown Milwaukee, Wisconsin. The Vortechs® System is designed to remove settleable and floatable pollutants from stormwater runoff. The Vortechs® ...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, C. LEE COOK DIVISION, DOVER CORPORATION, STATIC PAC (TM) SYSTEM, PHASE II REPORT

    EPA Science Inventory

    The Environmental Technology Verification report discusses the technology and performance of the Static Pac System, Phase II, natural gas reciprocating compressor rod packing manufactured by the C. Lee Cook Division, Dover Corporation. The Static Pac System is designed to seal th...

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, MIRATECH CORPORATION GECO 3001 AIR/FUEL RATIO CONTROLLER

    EPA Science Inventory

    Details on the verification test design, measurement test procedures, and Quality assurance/Quality Control (QA/QC) procedures can be found in the test plan titled Testing and Quality Assurance Plan, MIRATECH Corporation GECO 3100 Air/Fuel Ratio Controller (SRI 2001). It can be d...

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES: MITSUI ENGINEERING & SHIPBUILDING DIESEL PARTICULATE FILTER

    EPA Science Inventory

EPA's Environmental Technology Verification program is designed to further environmental protection by accelerating the acceptance and use of improved and cost-effective technologies. This is done by providing high-quality, peer reviewed data on technology performance to those in...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: STORMWATER SOURCE AREA TREATMENT DEVICE: HYDRO INTERNATIONAL DOWNSTREAM DEFENDER®

    EPA Science Inventory

    Verification testing of the Hydro International Downstream Defender® was conducted at the Madison Water Utility in Madison, Wisconsin. The system was designed for a drainage basin estimated at 1.9 acres in size, but during intense storm events, the system received water from an a...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT, LEAD IN DUST WIPE MEASUREMENT TECHNOLOGY, NITON LLC, X-RAY FLUORESCENCE SPECTRUM ANALYZER, XLT-700

    EPA Science Inventory

The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies, and to design efficient processes for conducting performance tests of innovative technologies...

  18. Simulation-based MDP verification for leading-edge masks

    NASA Astrophysics Data System (ADS)

    Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki

    2017-07-01

For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm, and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and been adopted, such as rule-based Mask Process Correction (MPC), model-based MPC, and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from the target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification, which became a necessity for OPC a decade ago, we see the same trend in MDP today. A simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field-tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask checks, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose-margin-related hotspots can also be detected by setting a correct detection threshold. In this paper, we demonstrate GPU acceleration for geometry processing and give examples of mask check results and performance data. GPU acceleration is necessary to make simulation-based mask MDP verification acceptable.
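
    The core comparison in a simulation-based mask check can be sketched schematically (illustrative only; TrueModel and the CDP are far more involved): given simulated edge positions at nominal and perturbed dose, flag sites whose edge placement error or dose sensitivity exceeds a threshold.

        import numpy as np

        def mask_check(target_edge_nm, edge_nominal_nm, edge_low_dose_nm,
                       edge_high_dose_nm, epe_max_nm=2.0, dose_swing_max_nm=3.0):
            """Flag hotspot sites from simulated mask contours. Inputs are
            1-D arrays of edge positions (nm) at matching sample sites."""
            epe = edge_nominal_nm - target_edge_nm                 # edge placement error
            swing = np.abs(edge_high_dose_nm - edge_low_dose_nm)   # dose sensitivity
            epe_hotspots = np.where(np.abs(epe) > epe_max_nm)[0]
            dose_hotspots = np.where(swing > dose_swing_max_nm)[0]
            return epe_hotspots, dose_hotspots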

  19. All-digital radar architecture

    NASA Astrophysics Data System (ADS)

    Molchanov, Pavlo A.

    2014-10-01

An all-digital radar architecture requires eliminating the mechanical scan system. A phased antenna array is necessarily large because the array elements must be co-located to very precise dimensions, and it needs a high-accuracy phase processing system to aggregate and distribute T/R module data to and from the antenna elements. Even a phased array cannot provide a wide field of view. A new, nature-inspired, all-digital radar architecture is proposed. The fly's eye consists of multiple angularly spaced sensors, giving the fly simultaneously the wide-area visual coverage it needs to detect and avoid the threats around it. A fly-eye radar antenna array consists of multiple directional antennas loosely distributed along the perimeter of a ground vehicle or aircraft and coupled with receiving/transmitting front-end modules connected by a digital interface to a central processor. A non-steering antenna array allows creating an all-digital radar with an extremely flexible architecture. The fly-eye radar architecture provides wide possibilities for digital modulation and different waveform generation. Simultaneous correlation and integration of thousands of signals per second from each point of the surveillance area not only allows detection of low-level signals (low-profile targets), but also helps to recognize and classify signals (targets) by using diverse signals, polarization modulation, and intelligent processing. The proposed all-digital radar architecture with a distributed directional antenna array can provide a 3D space vector to a jammer by verifying the direction of arrival of signal sources, and as a result provides jam/spoof protection not only for radar systems, but for communication systems and any navigation constellation system, for both encrypted and unencrypted signals, and for an unlimited number of closely positioned jammers.

  20. Digital flight control research

    NASA Technical Reports Server (NTRS)

    Potter, J. E.; Stern, R. G.; Smith, T. B.; Sinha, P.

    1974-01-01

    The results of studies which were undertaken to contribute to the design of digital flight control systems, particularly for transport aircraft are presented. In addition to the overall design considerations for a digital flight control system, the following topics are discussed in detail: (1) aircraft attitude reference system design, (2) the digital computer configuration, (3) the design of a typical digital autopilot for transport aircraft, and (4) a hybrid flight simulator.

  1. Test load verification through strain data analysis

    NASA Technical Reports Server (NTRS)

    Verderaime, V.; Harrington, F.

    1995-01-01

A traditional binding acceptance criterion for polycrystalline structures is the experimental verification of the ultimate factor of safety. At fracture, the induced strain is inelastic and about an order of magnitude greater than the design value at the maximum expected operational limit. In this extremely strained condition, the structure may rotate and displace under the applied verification load so as to unknowingly distort the load transfer into the static test article. The test may result in erroneously accepting a submarginal design or rejecting a reliable one. A technique was developed to identify, monitor, and assess the load transmission error through two back-to-back surface-measured strain signals. The technique is programmed for expediency and convenience. Though the method was developed to support affordable aerostructures, it is also applicable to most high-performance air and surface transportation structural systems.

  2. Performance verification and system integration tests of the pulse shape processor for the soft x-ray spectrometer onboard ASTRO-H

    NASA Astrophysics Data System (ADS)

    Takeda, Sawako; Tashiro, Makoto S.; Ishisaki, Yoshitaka; Tsujimoto, Masahiro; Seta, Hiromi; Shimoda, Yuya; Yamaguchi, Sunao; Uehara, Sho; Terada, Yukikatsu; Fujimoto, Ryuichi; Mitsuda, Kazuhisa

    2014-07-01

The soft X-ray spectrometer (SXS) aboard ASTRO-H is equipped with dedicated digital signal processing units called pulse shape processors (PSPs). The X-ray microcalorimeter system SXS has 36 sensor pixels, which are operated at 50 mK to measure the heat input of X-ray photons and realize an energy resolution of 7 eV FWHM in the range 0.3-12.0 keV. Front-end signal processing electronics are used to filter and amplify the electrical pulse output from the sensor and for analog-to-digital conversion. The digitized pulses from the 36 pixels are multiplexed and sent to the PSP over low-voltage differential signaling lines. Each of the two identical PSP units consists of an FPGA board, which hosts the hardware logic, and two CPU boards, which host the onboard software. The FPGA board triggers on every pixel event and stores the triggering information as a pulse waveform in the installed memory. The CPU boards read the event data to evaluate pulse heights by an optimal filtering algorithm. The evaluated X-ray photon data (including the pixel ID, energy, and arrival time information) are transferred to the satellite data recorder along with event quality information. The PSP units have been developed and tested with the engineering model (EM) and the flight model. Utilizing the EM PSP, we successfully verified the entire hardware system and the basic software design of the PSPs, including their communication capability and signal processing performance. In this paper, we show the key metrics of the EM test, such as accuracy and synchronicity of sampling clocks, event grading capability, and resultant energy resolution.
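
    The optimal filtering step can be illustrated with the textbook frequency-domain construction (a generic sketch, not the flight software; `noise_psd` is assumed to hold the noise power in each of the len(template)//2 + 1 rfft bins):

        import numpy as np

        def optimal_filter(template, noise_psd):
            """Time-domain optimal filter: weight each frequency bin of the
            average pulse template by 1/noise power, drop the DC term, and
            normalize so the filter returns height 1.0 for the template."""
            s = np.fft.rfft(template)
            h = np.conj(s) / noise_psd           # noise-whitened matched filter
            h[0] = 0.0                           # insensitive to baseline offset
            f = np.fft.irfft(h, n=len(template))
            return f / np.dot(f, template)       # unit response to the template

        def pulse_height(record, filt):
            """Amplitude (proportional to photon energy) of one triggered record."""
            return float(np.dot(filt, record))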

  3. Guidelines for mission integration, a summary report

    NASA Technical Reports Server (NTRS)

    1979-01-01

    Guidelines are presented for instrument/experiment developers concerning hardware design, flight verification, and operations and mission implementation requirements. Interface requirements between the STS and instruments/experiments are defined. Interface constraints and design guidelines are presented along with integrated payload requirements for Spacelab Missions 1, 2, and 3. Interim data are suggested for use during hardware development until more detailed information is developed when a complete mission and an integrated payload system are defined. Safety requirements, flight verification requirements, and operations procedures are defined.

  4. From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems

    DTIC Science & Technology

    2015-03-13

A. Lee. “A Programming Model for Time-Synchronized Distributed Real-Time Systems”. In: Proceedings of Real Time and Embedded Technology and Applications Symposium. 2007, pp. 259-268. ... From MetroII to Metronomy, Designing Contract-based Function-Architecture Co-simulation Framework for Timing Verification of Cyber-Physical Systems...

  5. Buddy Tag CONOPS and Requirements.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brotz, Jay Kristoffer; Deland, Sharon M.

    2015-12-01

This document defines the concept of operations (CONOPS) and the requirements for the Buddy Tag, which is conceived and designed in collaboration between Sandia National Laboratories and Princeton University under the Department of State Key Verification Assets Fund. The CONOPS describes how the tags are used to support verification of treaty limitations and is only defined to the extent necessary to support a tag design. The requirements define the necessary functions and desired non-functional features of the Buddy Tag at a high level.

  6. Release Fixed Heel Point (FHP) Accommodation Model Verification and Validation (V and V) Plan - Rev A

    DTIC Science & Technology

    2017-01-23

Keywords: occupant work space, central 90% of the Soldier population, encumbrance, posture and position, verification and validation, computer aided design. ... Human factors engineers could benefit by working with vehicle designers to perform virtual assessments in CAD when there is not enough time and/or funding to

  7. The 25 kW power module evolution study. Part 3: Conceptual design for power module evolution. Volume 6: WBS and dictionary

    NASA Technical Reports Server (NTRS)

    1979-01-01

Program elements of the power module (PM) system are identified, structured, and defined according to the planned work breakdown structure. Efforts required to design, develop, manufacture, test, check out, launch, and operate a protoflight-assembled 25 kW, 50 kW, and 100 kW PM include the preparation and delivery of related software, government-furnished equipment, space support equipment, ground support equipment, launch site verification software, orbital verification software, and all related data items.

  8. Commissioning and quality assurance of an integrated system for patient positioning and setup verification in particle therapy.

    PubMed

    Pella, A; Riboldi, M; Tagaste, B; Bianculli, D; Desplanques, M; Fontana, G; Cerveri, P; Seregni, M; Fattori, G; Orecchia, R; Baroni, G

    2014-08-01

In an increasing number of clinical indications, radiotherapy with accelerated particles shows relevant advantages when compared with high-energy X-ray irradiation. However, due to the finite range of ions, particle therapy can be severely compromised by setup errors and geometric uncertainties. The purpose of this work is to describe the commissioning and the design of the quality assurance procedures for the patient positioning and setup verification systems at the Italian National Center for Oncological Hadrontherapy (CNAO). The accuracy of the systems installed at CNAO and devoted to patient positioning and setup verification has been assessed using a laser tracking device. The accuracy in calibration and in image-based setup verification relying on the in-room X-ray imaging system was also quantified. Quality assurance tests to check the integration among all patient setup systems were designed, and records of daily QA tests since the start of clinical operation (2011) are presented. The overall accuracy of the motion of the patient positioning system and the patient verification system was proven to be below 0.5 mm under all the examined conditions, with median values below the 0.3 mm threshold. Image-based registration in phantom studies exhibited sub-millimetric accuracy in setup verification at both cranial and extra-cranial sites. The calibration residuals of the optical tracking system (OTS) were found to be consistent with expectations, with peak values below 0.3 mm. Quality assurance tests, performed daily before clinical operation, confirm adequate integration and sub-millimetric setup accuracy. Robotic patient positioning was successfully integrated with optical tracking and stereoscopic X-ray verification for patient setup in particle therapy. Sub-millimetric setup accuracy was achieved and is consistently verified in daily clinical operation.

  9. Evaluation of geotechnical monitoring data from the ESF North Ramp Starter Tunnel, April 1994 to June 1995. Revision 1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    NONE

    1995-11-01

This report presents the results of instrumentation measurements and observations made during construction of the North Ramp Starter Tunnel (NRST) of the Exploratory Studies Facility (ESF). The information in this report was developed as part of the Design Verification Study, Section 8.3.1.15.1.8 of the Yucca Mountain Site Characterization Plan (DOE 1988). The ESF is being constructed by the US Department of Energy (DOE) to evaluate the feasibility of locating a potential high-level nuclear waste repository on lands within and adjacent to the Nevada Test Site (NTS), Nye County, Nevada. The Design Verification Studies are performed to collect information during construction of the ESF that will be useful for design and construction of the potential repository. Four experiments make up the Design Verification Study: Evaluation of Mining Methods, Monitoring Drift Stability, Monitoring of Ground Support Systems, and the Air Quality and Ventilation Experiment. This report describes Sandia National Laboratories' (SNL) efforts in the first three of these experiments in the NRST.

  10. SLS Navigation Model-Based Design Approach

    NASA Technical Reports Server (NTRS)

    Oliver, T. Emerson; Anzalone, Evan; Geohagan, Kevin; Bernard, Bill; Park, Thomas

    2018-01-01

The SLS Program chose to implement a Model-based Design and Model-based Requirements approach for managing component design information and system requirements. This approach differs from previous large-scale design efforts at Marshall Space Flight Center, where design documentation alone conveyed information required for vehicle design and analysis and where extensive requirements sets were used to scope and constrain the design. The SLS Navigation Team has been responsible for the Program-controlled Design Math Models (DMMs) which describe and represent the performance of the Inertial Navigation System (INS) and the Rate Gyro Assemblies (RGAs) used by Guidance, Navigation, and Controls (GN&C). The SLS Navigation Team is also responsible for the navigation algorithms. The navigation algorithms are delivered for implementation on the flight hardware as a DMM. For the SLS Block 1-B design, the additional GPS Receiver hardware is managed as a DMM at the vehicle design level. This paper provides a discussion of the processes and methods used to engineer, design, and coordinate engineering trades and performance assessments using SLS practices as applied to the GN&C system, with a particular focus on the Navigation components. These include composing system requirements, requirements verification, model development, model verification and validation, and modeling and analysis approaches. The Model-based Design and Requirements approach does not reduce the effort associated with the design process versus previous processes used at Marshall Space Flight Center. Instead, the approach takes advantage of overlap between the requirements development and management process and the design and analysis process by efficiently combining the control (i.e., the requirement) and the design mechanisms. The design mechanism is the representation of the component behavior and performance in design and analysis tools. The focus in the early design process shifts from the development and management of design requirements to the development of usable models, model requirements, and model verification and validation efforts. The models themselves are represented in C/C++ code and accompanying data files. Under the idealized process, potential ambiguity in specification is reduced because the model must be implementable, versus a requirement, which is not necessarily subject to this constraint. Further, the models are shown to emulate the hardware during validation. For models developed by the Navigation Team, a common interface/standalone environment was developed. The common environment allows for easy implementation in design and analysis tools. Mechanisms such as unit test cases ensure implementation as the developer intended. The model verification and validation process provides a very high level of component design insight. The origin and implementation of the SLS variant of Model-based Design is described from the perspective of the SLS Navigation Team. The format of the models and the requirements is described. The Model-based Design approach has many benefits but is not without potential complications. Key lessons learned associated with the implementation of the Model-based Design approach and process from infancy to verification and certification are discussed.
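
    The pattern of a standalone model behind a fixed interface, pinned down by unit test cases, can be sketched generically (all names and numbers below are invented for illustration; the real DMMs are C/C++ models with accompanying data files):

        import unittest

        class RateGyroModel:
            """Illustrative stand-in for a DMM: a first-order sensor model
            behind a fixed init/step interface that any host tool can call."""
            def __init__(self, bias_dps=0.01, scale_factor=1.0005):
                self.bias = bias_dps
                self.sf = scale_factor
            def step(self, true_rate_dps: float) -> float:
                """One simulation step: measured rate from the true body rate."""
                return self.sf * true_rate_dps + self.bias

        class TestRateGyroModel(unittest.TestCase):
            def test_static_bias(self):
                # At zero input the output must equal the modeled bias exactly.
                self.assertAlmostEqual(RateGyroModel().step(0.0), 0.01)

        if __name__ == "__main__":
            unittest.main()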

  11. Towards the formal specification of the requirements and design of a processor interface unit

    NASA Technical Reports Server (NTRS)

    Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.

    1993-01-01

Work to formally specify the requirements and design of a Processor Interface Unit (PIU), a single-chip subsystem providing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault-tolerant computer system, is described. This system, the Fault-Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. The approaches that were developed for modeling the PIU requirements and for composition of the PIU subcomponents at high levels of abstraction are described. These approaches were used to specify and verify a nontrivial subset of the PIU behavior. The PIU specification in Higher Order Logic (HOL) is documented in a companion NASA contractor report entitled 'Towards the Formal Specification of the Requirements and Design of a Processor Interface Unit - HOL Listings.' The subsequent verification approach and HOL listings are documented in the NASA contractor reports entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit' and 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings.'

  12. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

MASOES is a multiagent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of the self-organizing and emergent phenomena without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what is proposed in MASOES for modeling self-organizing and emergent systems, the principles of the wisdom-of-crowds paradigm, and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested by modeling a community of free software developers that works in the bazaar style, as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.

  13. Examining the Characteristics of Digital Learning Games Designed by In-Service Teachers

    ERIC Educational Resources Information Center

    An, Yun-Jo; Cao, Li

    2017-01-01

    In order to better understand teachers' perspectives on the design and development of digital game-based learning environments, this study examined the characteristics of digital learning games designed by teachers. In addition, this study explored how game design and peer critique activities influenced their perceptions of digital game-based…

  14. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION: TEST REPORT OF MOBILE SOURCE EMISSION CONTROL DEVICES--PUREM NORTH AMERICA LLC, PMF GREENTEC 1004205.00.0 DIESEL PARTICULATE FILTER

    EPA Science Inventory

    The U.S. EPA has created the Environmental Technology Verification (ETV) program to provide high quality, peer reviewed data on technology performance to those involved in the design, distribution, financing, permitting, purchase, and use of environmental technologies. The Air Po...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: DEVILBISS JGHV-531-46FF HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the DeVilbiss JGHV-531-46FF high-volume, low-pressure pressure-feed spray gun, hereafter referred to as the DeVilbiss JGHV, which is designed for use in industrial finishing. The test coating chosen by ITW Industrial Fi...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION: Separation of Manure Solids from Flushed Swine Waste. Hoffland Environmental Inc. Drag Screen and Clarifier

    EPA Science Inventory

    Verification testing of the Hoffland Drag Screen and Clarifier was conducted at the North Carolina State University's Lake Wheeler Road Field Laboratory, in Raleigh, North Carolina. The farm is designed to operate as a research and teaching facility with the capacity for 250 so...

  18. 30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...

  19. 30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and turret-and-hull... Platform Verification Program: (i) Drilling, production, and pipeline risers, and riser tensioning systems...

  20. 30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...

  1. 30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...

  2. 30 CFR 250.910 - Which of my facilities are subject to the Platform Verification Program?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ..., production, and pipeline risers, and riser tensioning systems (each platform must be designed to accommodate all the loads imposed by all risers and riser does not have tensioning systems);(ii) Turrets and... are subject to the Platform Verification Program: (i) Drilling, production, and pipeline risers, and...

  3. 40 CFR 63.924 - Standards-Container Level 3 controls.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... selected by the owner or operator: (1) The enclosure shall be designed and operated in accordance with the criteria for a permanent total enclosure as specified in “Procedure T—Criteria for and Verification of a... enclosure. The owner or operator shall perform the verification procedure for the enclosure as specified in...

  4. 40 CFR 63.924 - Standards-Container Level 3 controls.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... selected by the owner or operator: (1) The enclosure shall be designed and operated in accordance with the criteria for a permanent total enclosure as specified in “Procedure T—Criteria for and Verification of a... enclosure. The owner or operator shall perform the verification procedure for the enclosure as specified in...

  5. Peer Review of a Formal Verification/Design Proof Methodology

    NASA Technical Reports Server (NTRS)

    1983-01-01

The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed. The technical issues related to proof methodologies are examined and summarized.

  6. Considerations in STS payload environmental verification

    NASA Technical Reports Server (NTRS)

    Keegan, W. B.

    1978-01-01

The current philosophy of the GSFC regarding environmental verification of Shuttle payloads is reviewed. In the structures area, increased emphasis will be placed on the use of analysis for design verification, with selective testing performed as necessary. Furthermore, as a result of recent cost optimization analysis, the multitier test program will presumably give way to a comprehensive test program at the major payload subassembly level after adequate workmanship at the component level has been verified. In the thermal vacuum area, thought is being given to modifying the approaches used for conventional spacecraft.

  7. The opto-mechanical design process: from vision to reality

    NASA Astrophysics Data System (ADS)

    Kvamme, E. Todd; Stubbs, David M.; Jacoby, Michael S.

    2017-08-01

    The design process for an opto-mechanical sub-system is discussed from requirements development through test. The process begins with a proper mission understanding and the development of requirements for the system. Preliminary design activities are then discussed with iterative analysis and design work being shared between the design, thermal, and structural engineering personnel. Readiness for preliminary review and the path to a final design review are considered. The value of prototyping and risk mitigation testing is examined with a focus on when it makes sense to execute a prototype test program. System level margin is discussed in general terms, and the practice of trading margin in one area of performance to meet another area is reviewed. Requirements verification and validation is briefly considered. Testing and its relationship to requirements verification concludes the design process.

  8. Distributed digital signal processors for multi-body structures

    NASA Technical Reports Server (NTRS)

    Lee, Gordon K.

    1990-01-01

Several digital filter designs were investigated which may be used to process sensor data from large space structures and to design digital hardware to implement the distributed signal processing architecture. Several experimental test articles are available at NASA Langley Research Center to evaluate these designs. A summary of some of the digital filter designs is presented, an evaluation of their characteristics relative to control design is discussed, and candidate hardware microcontroller/microcomputer components are given. Future activities include software evaluation of the digital filter designs and actual hardware implementation of some of the signal processor algorithms on an experimental testbed at NASA Langley.

  9. An integral equation formulation for predicting radiation patterns of a space shuttle annular slot antenna

    NASA Technical Reports Server (NTRS)

    Jones, J. E.; Richmond, J. H.

    1974-01-01

    An integral equation formulation is applied to predict pitch- and roll-plane radiation patterns of a thin VHF/UHF (very high frequency/ultra high frequency) annular slot communications antenna operating at several locations in the nose region of the space shuttle orbiter. Digital computer programs used to compute radiation patterns are given and the use of the programs is illustrated. Experimental verification of computed patterns is given from measurements made on 1/35-scale models of the orbiter.

  10. Programs for Testing an SSME-Monitoring System

    NASA Technical Reports Server (NTRS)

    Lang, Andre; Cecil, Jimmie; Heusinger, Ralph; Freestone, Kathleen; Blue, Lisa; Wilkerson, DeLisa; McMahon, Leigh Anne; Hall, Richard B.; Varnavas, Kosta; Smith, Keary

    2007-01-01

    A suite of computer programs has been developed for special test equipment (STE) that is used in verification testing of the Health Management Computer Integrated Rack Assembly (HMCIRA), a ground-based system of analog and digital electronic hardware and software for "flight-like" testing for development of components of an advanced health-management system for the space shuttle main engine (SSME). The STE software enables the STE to simulate the analog input and the data flow of an SSME test firing from start to finish.

  11. SSC Geopositional Assessment of the Advanced Wide Field Sensor

    NASA Technical Reports Server (NTRS)

    Ross, Kenton

    2007-01-01

The objective is to provide independent verification of IRS geopositional accuracy claims and of the internal geopositional characterization provided by Lutes (2005). Six sub-scenes (quads) were assessed, three from each AWiFS camera. Check points were manually matched to a digital orthophoto quarter quadrangle (DOQQ) reference (assumed accuracy approximately 5 m, RMSE) and were selected to meet or exceed the Federal Geographic Data Committee's guidelines. ESRI ArcGIS was used for data collection, and SSC-written MATLAB scripts were used for data analysis.
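
    The check-point analysis reduces to the standard FGDC/NSSDA computation, sketched here as a generic reimplementation (not the SSC MATLAB scripts):

        import numpy as np

        def horizontal_accuracy(dx_m, dy_m):
            """FGDC/NSSDA horizontal accuracy from check-point offsets (meters).
            Returns (RMSE_r, Accuracy_r at 95% confidence), using the standard
            1.7308 factor, which assumes RMSE_x and RMSE_y are roughly equal."""
            rmse_r = np.sqrt(np.mean(np.asarray(dx_m)**2 + np.asarray(dy_m)**2))
            return rmse_r, 1.7308 * rmse_r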

  12. Apollo experience report: Guidance and control systems. Engineering simulation program

    NASA Technical Reports Server (NTRS)

    Gilbert, D. W.

    1973-01-01

    The Apollo Program experience from early 1962 to July 1969 with respect to the engineering-simulation support and the problems encountered is summarized in this report. Engineering simulation in support of the Apollo guidance and control system is discussed in terms of design analysis and verification, certification of hardware in closed-loop operation, verification of hardware/software compatibility, and verification of both software and procedures for each mission. The magnitude, time, and cost of the engineering simulations are described with respect to hardware availability, NASA and contractor facilities (for verification of the command module, the lunar module, and the primary guidance, navigation, and control system), and scheduling and planning considerations. Recommendations are made regarding implementation of similar, large-scale simulations for future programs.

  13. Independent Verification Final Summary Report for the David Witherspoon, Inc. 1630 Site Knoxville, Tennessee

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    P.C. Weaver

    2009-04-29

The primary objective of the independent verification was to determine if BJC performed the appropriate actions to meet the specified "hot spot" cleanup criterion of 500 picocuries per gram (pCi/g) uranium-238 (U-238) in surface soil. Specific tasks performed by the independent verification team (IVT) to satisfy this objective included: 1) performing radiological walkover surveys, and 2) collecting soil samples for independent analyses. The independent verification (IV) efforts were designed to evaluate radioactive contaminants (specifically U-238) in the exposed surfaces below one foot of the original site grade, given that the top one-foot layer of soil on the site was removed in its entirety.

  14. Software verification plan for GCS. [guidance and control software

    NASA Technical Reports Server (NTRS)

    Dent, Leslie A.; Shagnea, Anita M.; Hayhurst, Kelly J.

    1990-01-01

    This verification plan is written as part of an experiment designed to study the fundamental characteristics of the software failure process. The experiment will be conducted using several implementations of software that were produced according to industry-standard guidelines, namely the Radio Technical Commission for Aeronautics RTCA/DO-178A guidelines, Software Consideration in Airborne Systems and Equipment Certification, for the development of flight software. This plan fulfills the DO-178A requirements for providing instructions on the testing of each implementation of software. The plan details the verification activities to be performed at each phase in the development process, contains a step by step description of the testing procedures, and discusses all of the tools used throughout the verification process.

  15. Equations for estimating Clark Unit-hydrograph parameters for small rural watersheds in Illinois

    USGS Publications Warehouse

    Straub, Timothy D.; Melching, Charles S.; Kocher, Kyle E.

    2000-01-01

    Simulation of the measured discharge hydrographs for the verification storms utilizing TC and R obtained from the estimation equations yielded good results. The error in peak discharge for 21 of the 29 verification storms was less than 25 percent, and the error in time-to-peak discharge for 18 of the 29 verification storms also was less than 25 percent. Therefore, applying the estimation equations to determine TC and R for design-storm simulation may result in reliable design hydrographs, as long as the physical characteristics of the watersheds under consideration are within the range of those characteristics for the watersheds in this study [area: 0.02-2.3 mi2, main-channel length: 0.17-3.4 miles, main-channel slope: 10.5-229 feet per mile, and insignificant percentage of impervious cover].
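
    For context, once TC and R are estimated, the Clark unit hydrograph is obtained by routing the watershed time-area histogram through a linear reservoir with storage coefficient R. A minimal sketch of that routing step under the standard formulation (not the USGS code):

        import numpy as np

        def clark_routing(inflow_cfs, r_hours, dt_hours):
            """Route the time-area-derived inflow through the Clark linear
            reservoir (storage S = R * outflow):
                O[i] = c * I[i] + (1 - c) * O[i-1],  c = dt / (R + 0.5 * dt)."""
            c = dt_hours / (r_hours + 0.5 * dt_hours)
            out = np.zeros(len(inflow_cfs))
            for i in range(1, len(inflow_cfs)):
                out[i] = c * inflow_cfs[i] + (1.0 - c) * out[i - 1]
            return out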

  16. Verification of the Icarus Material Response Tool

    NASA Technical Reports Server (NTRS)

    Schroeder, Olivia; Palmer, Grant; Stern, Eric; Schulz, Joseph; Muppidi, Suman; Martin, Alexandre

    2017-01-01

    Due to the complex physics encountered during reentry, material response solvers are used for two main purposes: to improve understanding of the physical phenomena, and to design and size thermal protection systems (TPS). Icarus is a three-dimensional, unstructured material response tool intended for design work while retaining the flexibility to easily implement physical models as needed. Because TPS selection and sizing is critical, it is of the utmost importance that design tools be extensively verified and validated before use. Verification tests aim to ensure that the numerical schemes and equations are implemented correctly, by comparison with analytical solutions and through grid-convergence tests.
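
    One concrete form of the verification described, comparison against analytical solutions combined with grid-convergence testing, is the observed-order-of-accuracy check. A minimal sketch, with hypothetical discretization errors standing in for real solver output:

        import math

        def observed_order(err_coarse, err_fine, refinement_ratio=2.0):
            """Observed order of accuracy from errors on two successively refined grids."""
            return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

        # Hypothetical L2 errors against an analytical solution on refined grids
        errors = [4.0e-3, 1.1e-3, 2.8e-4]
        for coarse, fine in zip(errors, errors[1:]):
            print(f"observed order ~ {observed_order(coarse, fine):.2f}")  # ~2 for a 2nd-order scheme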

  17. Data in support of qPCR primer design and verification in a Pink1 -/- rat model of Parkinson disease.

    PubMed

    Kelm-Nelson, Cynthia A; Stevenson, Sharon A; Ciucci, Michelle R

    2016-09-01

    Datasets provided in this article represent the Rattus norvegicus primer design and verification used in Pink1 -/- and wildtype Long Evans brain tissue. The accompanying tables include the relevant information describing the primer design for each transcript amplification: accession numbers, sequences, annealing temperatures, and product lengths. Additionally, results of Sanger sequencing of qPCR reaction products (FASTA-aligned sequences) are presented for genes of interest. Results, further interpretation, and discussion can be found in the original research article "Atp13a2 expression in the periaqueductal gray is decreased in the Pink1 -/- rat model of Parkinson disease" [1].

  18. Ambient and Cryogenic Alignment Verification and Performance of the Infrared Multi-Object Spectrometer

    NASA Technical Reports Server (NTRS)

    Connelly, Joseph A.; Ohl, Raymond G.; Mink, Ronald G.; Mentzell, J. Eric; Saha, Timo T.; Tveekrem, June L.; Hylan, Jason E.; Sparr, Leroy M.; Chambers, V. John; Hagopian, John G.

    2003-01-01

    The Infrared Multi-Object Spectrometer (IRMOS) is a facility instrument for the Kitt Peak National Observatory 4 and 2.1 meter telescopes. IRMOS is a near-IR (0.8 - 2.5 micron) spectrometer with low- to mid-resolving power (R = 300 - 3000). IRMOS produces simultaneous spectra of approximately 100 objects in its 2.8 x 2.0 arc-min field of view using a commercial Micro Electro-Mechanical Systems (MEMS) Digital Micro-mirror Device (DMD) from Texas Instruments. The IRMOS optical design consists of two imaging subsystems. The focal reducer images the focal plane of the telescope onto the DMD field stop, and the spectrograph images the DMD onto the detector. We describe ambient breadboard subsystem alignment and imaging performance of each stage independently, and the ambient and cryogenic imaging performance of the fully assembled instrument. Interferometric measurements of subsystem wavefront error serve to verify alignment, and are accomplished using a commercial, modified Twyman-Green laser unequal path interferometer. Image testing provides further verification of the optomechanical alignment method and a measurement of near-angle scattered light due to mirror small-scale surface error. Image testing is performed at multiple field points. A mercury-argon pencil lamp provides spectral lines at 546.1 nm and 1550 nm, and a CCD camera and IR camera are used as detectors. We use commercial optical modeling software to predict the point-spread function and its effect on instrument slit transmission and resolution. Our breadboard test results validate this prediction. We conclude with an instrument performance prediction for first light.

  19. THE REDMAPPER GALAXY CLUSTER CATALOG FROM DES SCIENCE VERIFICATION DATA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Rykoff, E. S.; Rozo, E.; Hollowood, D.

    We describe updates to the redMaPPer algorithm, a photometric red-sequence cluster finder specifically designed for large photometric surveys. The updated algorithm is applied to 150 deg^2 of Science Verification (SV) data from the Dark Energy Survey (DES), and to the Sloan Digital Sky Survey (SDSS) DR8 photometric data set. The DES SV catalog is locally volume limited and contains 786 clusters with richness λ > 20 (roughly equivalent to M500c ≳ 10^14 h70^-1 Msun) and 0.2 < z < 0.9. The DR8 catalog consists of 26,311 clusters with 0.08 < z < 0.6, with a sharply increasing richness threshold as a function of redshift for z ≳ 0.35. The photometric redshift performance of both catalogs is shown to be excellent, with photometric redshift uncertainties controlled at the σz/(1+z) ~ 0.01 level for z ≲ 0.7, rising to ~0.02 at z ~ 0.9 in DES SV. We make use of Chandra and XMM X-ray and South Pole Telescope Sunyaev-Zel'dovich data to show that the centering performance and mass-richness scatter are consistent with expectations based on prior runs of redMaPPer on SDSS data. We also show how the redMaPPer photo-z and richness estimates are relatively insensitive to imperfect star/galaxy separation and small-scale star masks.

  20. Artwork Interactive Design System (AIDS) program description

    NASA Technical Reports Server (NTRS)

    Johnson, B. T.; Taylor, J. F.

    1976-01-01

    An artwork interactive design system is described which provides the microelectronic circuit designer/engineer a tool to perform circuit design, automatic layout modification, standard cell design, and artwork verification at a graphics computer terminal, using a graphics tablet as the designer/computer interface.

  1. EVA Design, Verification, and On-Orbit Operations Support Using Worksite Analysis

    NASA Technical Reports Server (NTRS)

    Hagale, Thomas J.; Price, Larry R.

    2000-01-01

    The International Space Station (ISS) design is a very large and complex orbiting structure with thousands of Extravehicular Activity (EVA) worksites. These worksites are used to assemble and maintain the ISS. The challenge facing EVA designers was how to design, verify, and operationally support such a large number of worksites within cost and schedule. This has been solved through the practical use of computer aided design (CAD) graphical techniques that have been developed and used with a high degree of success over the past decade. The EVA design process allows analysts to work concurrently with hardware designers so that EVA equipment can be incorporated and structures configured to allow for EVA access and manipulation. Compliance with EVA requirements is strictly enforced during the design process. These techniques and procedures, coupled with neutral buoyancy underwater testing, have proven most valuable in the development, verification, and on-orbit support of planned or contingency EVA worksites.

  2. Advanced composite structures. [metal matrix composites - structural design criteria for spacecraft construction materials

    NASA Technical Reports Server (NTRS)

    1974-01-01

    A monograph is presented which establishes structural design criteria and recommends practices to ensure the design of sound composite structures, including composite-reinforced metal structures. (It does not discuss design criteria for fiber-glass composites and such advanced composite materials as beryllium wire or sapphire whiskers in a matrix material.) Although the criteria were developed for aircraft applications, they are general enough to be applicable to space vehicles and missiles as well. The monograph covers four broad areas: (1) materials, (2) design, (3) fracture control, and (4) design verification. The materials portion deals with such subjects as material system design, material design levels, and material characterization. The design portion includes panel, shell, and joint design, applied loads, internal loads, design factors, reliability, and maintainability. Fracture control includes such items as stress concentrations, service-life philosophy, and the management plan for control of fracture-related aspects of structural design using composite materials. Design verification discusses ways to prove flightworthiness.

  3. Design and verification of the miniature optical system for small object surface profile fast scanning

    NASA Astrophysics Data System (ADS)

    Chi, Sheng; Lee, Shu-Sheng; Huang, Jen-Yu; Lai, Ti-Yu; Jan, Chia-Ming; Hu, Po-Chi

    2016-04-01

    With the progress of optical technologies, a variety of commercial 3D surface contour scanners are on the market. Most are used to reconstruct the surface profile of molds or mechanical objects larger than 50 mm×50 mm×50 mm, with scanning systems measuring about 300 mm×300 mm×100 mm. Few optical systems have been commercialized for fast surface-profile scanning of objects smaller than 10 mm×10 mm×10 mm, so a miniature optical system has been designed and developed in this work for that purpose. Since the most common scanning method for such systems is line-scan technology, we developed a pseudo-phase-shifting digital projection technique based on projected fringes and phase reconstruction. A projector casts digital fringe patterns onto the object, and a CMOS camera records the fringe intensity images of the reference plane and of the sample object. The phase difference between plane and object is calculated from the fringe images, and the surface profile of the object is reconstructed from these phase differences. Traditional phase shifting is accomplished with a PZT actuator or a precisely controlled motor that moves the light source or grating, which is one of the limits on high-speed scanning. Compared with the traditional setup, we use a micro projector to project the digital fringe patterns onto the sample; this shortens the phase-shifting processing time and makes the controlled phase differences between shifted patterns more precise. In addition, the optical path was designed around a portable scanning device to minimize size and reduce the number of system components. A screwdriver section of about 7 mm×5 mm×5 mm was scanned and its surface profile successfully restored. The experimental results show that the measurement area of the system can be smaller than 10 mm×10 mm, the precision reaches +/-10 μm, and the scanning time for each surface of an object is less than 15 seconds. This demonstrates that the system has the potential to serve as a fast scanner for small-object surface profiles.
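
    The reconstruction described here can be illustrated with the standard four-step phase-shifting algorithm; the following numpy sketch uses synthetic fringes, since the paper's exact pipeline is not spelled out, and the array sizes and toy surface are assumptions.

        import numpy as np

        def four_step_phase(frames):
            """Wrapped phase from four fringe images shifted by 90 degrees each."""
            i0, i1, i2, i3 = frames
            return np.arctan2(i3 - i1, i0 - i2)

        # Synthetic data: carrier fringes, and the same fringes deformed by a toy surface
        h, w = 64, 64
        x = np.tile(np.linspace(0, 8 * np.pi, w), (h, 1))
        profile = 2.0 * np.outer(np.hanning(h), np.hanning(w))  # stand-in object height
        ref_frames = [np.cos(x + k * np.pi / 2) for k in range(4)]
        obj_frames = [np.cos(x + k * np.pi / 2 + profile) for k in range(4)]

        # The phase difference between object and reference encodes the surface profile
        dphi = np.angle(np.exp(1j * (four_step_phase(obj_frames) - four_step_phase(ref_frames))))
        print(np.abs(dphi - profile).max())  # near zero for this wrap-free toy case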

  4. Design of embedded endoscopic ultrasonic imaging system

    NASA Astrophysics Data System (ADS)

    Li, Ming; Zhou, Hao; Wen, Shijie; Chen, Xiodong; Yu, Daoyin

    2008-12-01

    The endoscopic ultrasonic imaging system is an important component of the endoscopic ultrasonography system (EUS). Through the ultrasonic probe, EUS detects the histological features of digestive organs; the ultrasonic echo is received by a reception circuit consisting of amplification, gain compensation, filtering, and A/D conversion stages. The endoscopic ultrasonic imaging system is the back-end processing system of the EUS: it receives the digitized ultrasonic echo modulated by the digestive tract wall from the reception circuit and, after digital signal processing such as demodulation, presents the histological features as images and characteristic data. Traditional endoscopic ultrasonic imaging systems are mainly based on image acquisition and processing chips connected to a personal computer over USB 2.0; they are expensive, structurally complicated, poorly portable, and difficult to popularize. To address these shortcomings, this paper presents a digital signal acquisition and processing design based on embedded technology, with an ARM-plus-FPGA hardware core replacing the traditional USB 2.0/personal computer design. Using a built-in FIFO and dual buffers, the FPGA implements ping-pong data storage while transferring image data to the ARM through the EBI bus by DMA, under ARM control, to achieve high-speed transmission. The ARM system displays the image each time a DMA transfer completes and performs system control, with drivers and applications running on the embedded operating system Windows CE, which provides a stable, safe, and reliable platform for the embedded device software. Benefiting from the graphical user interface (GUI) and good performance of Windows CE, the application not only clearly displays 511×511 pixel ultrasonic echo images but also provides a simple, friendly operating interface with mouse and touch screen, more convenient than traditional endoscopic ultrasonic imaging systems. We designed the complete embedded system, including the FPGA and ARM cores and peripheral circuits, the power network circuit, and the LCD display circuit; experimental verification showed proper ultrasonic image display, achieving the desired goal while avoiding the bulk and complexity of the traditional endoscopic ultrasonic imaging system.
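
    The dual-buffer ping-pong hand-off described above is implemented in FPGA hardware; purely for illustration, its control flow can be mimicked in Python, with the buffer size and framing chosen arbitrarily.

        import threading, queue

        # Two fixed buffers: the producer fills one while the consumer drains the other,
        # mirroring the FPGA dual-buffer / DMA hand-off described above.
        free, ready = queue.Queue(), queue.Queue()
        for buf in (bytearray(512), bytearray(512)):
            free.put(buf)

        def producer(n_frames):
            for i in range(n_frames):
                buf = free.get()                      # wait for a writable buffer
                buf[:] = bytes([i % 256]) * len(buf)  # stand-in for ultrasonic echo data
                ready.put(buf)                        # signal "DMA transfer complete"
            ready.put(None)                           # end-of-stream marker

        def consumer():
            while (buf := ready.get()) is not None:
                _ = buf[0]                            # stand-in for image display on the ARM side
                free.put(buf)                         # hand the buffer back for refilling

        t = threading.Thread(target=producer, args=(8,))
        t.start(); consumer(); t.join()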

  5. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved as follows. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction; hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification, which are directly transplanted from the face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.
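
    The imbalance that motivates TBT can be seen with toy data: class-specific equal-error thresholds may differ substantially, so any single global threshold is a compromise. The sketch below shows only this motivating observation, not the authors' TBT algorithm; the score distributions are invented.

        import numpy as np

        rng = np.random.default_rng(0)
        # Toy verification distances per class: (genuine attempts, impostor attempts)
        classes = {
            "A": (rng.normal(1.0, 0.2, 200), rng.normal(3.0, 0.5, 200)),
            "B": (rng.normal(2.2, 0.2, 200), rng.normal(4.0, 0.5, 200)),
        }

        def equal_error_threshold(genuine, impostor):
            """Distance threshold where false-accept and false-reject rates cross (grid scan)."""
            grid = np.linspace(min(genuine.min(), impostor.min()),
                               max(genuine.max(), impostor.max()), 1000)
            far = np.array([(impostor < t).mean() for t in grid])
            frr = np.array([(genuine >= t).mean() for t in grid])
            return grid[np.argmin(np.abs(far - frr))]

        for name, (gen, imp) in classes.items():
            print(name, f"class-optimal threshold ~ {equal_error_threshold(gen, imp):.2f}")
        # A single global threshold must compromise between these per-class optima.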

  6. Application of computer vision to automatic prescription verification in pharmaceutical mail order

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.

    2005-05-01

    In large-volume pharmaceutical mail order, before prescriptions are shipped, licensed pharmacists ensure that the drug in the bottle matches the information provided in the patient prescription. Typically, the pharmacist has about 2 seconds to complete the verification of one prescription. Performing about 1800 prescription verifications per hour is tedious and can generate human errors as a result of visual and brain fatigue. Available automatic drug verification systems are limited to a single pill at a time. This is not suitable for large-volume pharmaceutical mail order, where a prescription can have as many as 60 pills and where thousands of prescriptions are filled every day. In an attempt to reduce human fatigue and cost and to limit human error, the automatic prescription verification system (APVS) was invented to meet the needs of large-scale pharmaceutical mail order. This paper deals with the design and implementation of the first prototype online automatic prescription verification machine to perform the same task currently done by a pharmacist. The emphasis here is on the visual aspects of the machine. The system has been successfully tested on 43,000 prescriptions.

  7. Adjusting for partial verification or workup bias in meta-analyses of diagnostic accuracy studies.

    PubMed

    de Groot, Joris A H; Dendukuri, Nandini; Janssen, Kristel J M; Reitsma, Johannes B; Brophy, James; Joseph, Lawrence; Bossuyt, Patrick M M; Moons, Karel G M

    2012-04-15

    A key requirement in the design of diagnostic accuracy studies is that all study participants receive both the test under evaluation and the reference standard test. For a variety of practical and ethical reasons, sometimes only a proportion of patients receive the reference standard, which can bias the accuracy estimates. Numerous methods have been described for correcting this partial verification bias or workup bias in individual studies. In this article, the authors describe a Bayesian method for obtaining adjusted results from a diagnostic meta-analysis when partial verification or workup bias is present in a subset of the primary studies. The method corrects for verification bias without having to exclude primary studies with verification bias, thus preserving the main advantages of a meta-analysis: increased precision and better generalizability. The results of this method are compared with the existing methods for dealing with verification bias in diagnostic meta-analyses. For illustration, the authors use empirical data from a systematic review of studies of the accuracy of the immunohistochemistry test for diagnosis of human epidermal growth factor receptor 2 status in breast cancer patients.
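
    The Bayesian meta-analysis method itself is not reproduced here, but the single-study correction it generalizes can be sketched. Below is a minimal implementation of the classic Begg-Greenes adjustment, which assumes verification depends only on the index test result; the counts are hypothetical.

        def begg_greenes(n_pos, n_neg, ver_pos_d, ver_pos_nd, ver_neg_d, ver_neg_nd):
            """Verification-bias-corrected sensitivity and specificity.

            n_pos, n_neg: all subjects testing positive/negative on the index test.
            ver_*_d / ver_*_nd: verified subjects with/without disease in each arm.
            """
            p_pos = n_pos / (n_pos + n_neg)                  # P(T+)
            p_d_pos = ver_pos_d / (ver_pos_d + ver_pos_nd)   # P(D | T+)
            p_d_neg = ver_neg_d / (ver_neg_d + ver_neg_nd)   # P(D | T-)
            p_d = p_d_pos * p_pos + p_d_neg * (1 - p_pos)    # P(D)
            sensitivity = p_d_pos * p_pos / p_d
            specificity = (1 - p_d_neg) * (1 - p_pos) / (1 - p_d)
            return sensitivity, specificity

        # Hypothetical study: 300 T+, 700 T-; all T+ verified, only 10% of T- verified
        print(begg_greenes(300, 700, ver_pos_d=210, ver_pos_nd=90, ver_neg_d=7, ver_neg_nd=63))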

  8. Development and verification of a cementless novel tapered wedge stem for total hip arthroplasty.

    PubMed

    Faizan, Ahmad; Wuestemann, Thies; Nevelos, Jim; Bastian, Adam C; Collopy, Dermot

    2015-02-01

    Most current tapered wedge hip stems are based upon the original Mueller straight stem design introduced in 1977. These stems were designed with a single medial curvature and grow laterally to accommodate different sizes. In this preclinical study, the design and verification of a tapered wedge stem using computed tomography scans of 556 patients are presented. Computer simulation demonstrated that the novel stem, designed for proximal engagement, allowed for reduced distal fixation, particularly in the 40-60 year male population. Moreover, physical micromotion testing and finite element analysis demonstrated that the novel stem allowed for reduced micromotion. In summary, preclinical data suggest that the computed tomography based stem design described here may offer enhanced implant fit and reduced micromotion. Copyright © 2014 Elsevier Inc. All rights reserved.

  9. Design and verification of a turbofan swirl augmentor

    NASA Technical Reports Server (NTRS)

    Egan, W. J., Jr.; Shadowen, J. H.

    1978-01-01

    The paper discusses the details of the design and verification testing of a full-scale turbofan 'swirl' augmentor at sea level and altitude. No flameholders are required in the swirl augmentor since the radial motion of the hot pilot gases and subsequent combustion products provides a continuous ignition front across the stream. Results of rig testing of this full-scale swirl augmentor on an F100 engine, which are very encouraging, and future development plans are presented. The results validate the application of the centrifugal-force swirling flow concept to a turbofan augmentor.

  10. "Expert" Verification of Classroom-Based Indicators of Teaching and Learning Effectiveness for Professional Renewable Certification.

    ERIC Educational Resources Information Center

    Naik, Nitin S.; And Others

    The results of a statewide content verification survey of "expert" educators are provided; the survey was designed to verify indicators in the 1989-90 System for Teaching and Learning Assessment and Review (STAR) as reasonable expectations for beginning and/or experienced teachers (BETs) in Louisiana and as providing professional endorsement at the…

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: HVLP COATING EQUIPMENT, ITW AUTOMOTIVE REFINISHING, DEVILBISS GTI-600G, HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the DeVilbiss GTi-600G high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the DeVilbiss GTi, which is designed for use in automotive refinishing. The test coating chosen by ITW Automotive Refinis...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT: HVLP COATING EQUIPMENT, ITW AUTOMOTIVE REFINISHING, DEVILBISS FLG-631-318 HVLP SPRAY GUN

    EPA Science Inventory

    This report presents the results of the verification test of the DeVilbiss FLG-631-318 high-volume, low-pressure gravity-feed spray gun, hereafter referred to as the DeVilbiss FLG, which is designed for use in automotive refinishing. The test coating chosen by ITW Automotive Refi...

  13. A Self-Instructional Course in Student Financial Aid Administration. Module 13: Verification. Second Edition.

    ERIC Educational Resources Information Center

    Washington Consulting Group, Inc., Washington, DC.

    Module 13 of the 17-module self-instructional course on student financial aid administration (designed for novice financial aid administrators and other institutional personnel) focuses on the verification procedure for checking the accuracy of applicant data used in making financial aid awards. The full course provides an introduction to the…

  14. RELAP-7 Software Verification and Validation Plan: Requirements Traceability Matrix (RTM) Part 1 – Physics and numerical methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Choi, Yong Joon; Yoo, Jun Soo; Smith, Curtis Lee

    2015-09-01

    This INL plan comprehensively describes the Requirements Traceability Matrix (RTM) for the main physics and numerical methods of RELAP-7. The plan also describes the testing-based software verification and validation (SV&V) process, a set of specially designed software models used to test RELAP-7.

  15. Fabrication and verification testing of ETM 30 cm diameter ion thrusters

    NASA Technical Reports Server (NTRS)

    Collett, C.

    1977-01-01

    Engineering model designs and acceptance tests are described for the 800 and 900 series 30 cm electron bombardment thrusters. Modifications to the test console for a 1000 hr verification test were made. The 10,000 hr endurance test of the S/N 701 thruster is described, and post-test analysis results are included.

  16. AN ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PERFORMANCE TESTING OF THE INDUSTRIAL TEST SYSTEM, INC. CYANIDE REAGENTSTRIP™ TEST KIT

    EPA Science Inventory

    Cyanide can be present in various forms in water. The cyanide test kit evaluated in this verification study (Industrial Test System, Inc. Cyanide Reagent Strip™ Test Kit) was designed to detect free cyanide in water. This is done by converting cyanide in water to cyanogen...

  17. A New "Moodle" Module Supporting Automatic Verification of VHDL-Based Assignments

    ERIC Educational Resources Information Center

    Gutierrez, Eladio; Trenas, Maria A.; Ramos, Julian; Corbera, Francisco; Romero, Sergio

    2010-01-01

    This work describes a new "Moodle" module developed to support the practical content of a basic computer organization course. This module goes beyond the mere hosting of resources and assignments. It makes use of an automatic checking and verification engine that works on the VHDL designs submitted by the students. The module automatically…
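
    The abstract does not describe the engine's internals; purely as an illustration, an automatic checker might analyze, elaborate, and run a submitted design with an open-source simulator such as GHDL and compare the simulation output against a reference. A sketch under those assumptions (file names, entity names, and the expected-output convention are all hypothetical):

        import subprocess

        def check_submission(vhdl_file, testbench_entity, expected_output):
            """Analyze, elaborate, and run a student design with GHDL; compare stdout."""
            subprocess.run(["ghdl", "-a", vhdl_file], check=True)          # analyze
            subprocess.run(["ghdl", "-e", testbench_entity], check=True)   # elaborate
            result = subprocess.run(["ghdl", "-r", testbench_entity],
                                    capture_output=True, text=True, check=True)
            return result.stdout.strip() == expected_output.strip()

        # Hypothetical usage: the testbench prints a summary line for the DUT's responses
        # ok = check_submission("adder.vhd", "adder_tb", expected_output="all tests passed")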

  18. Standardized Radiation Shield Design Methods: 2005 HZETRN

    NASA Technical Reports Server (NTRS)

    Wilson, John W.; Tripathi, Ram K.; Badavi, Francis F.; Cucinotta, Francis A.

    2006-01-01

    Research conducted by the Langley Research Center through 1995, resulting in the HZETRN code, provides the current basis for shield design methods according to NASA STD-3000 (2005). With this new prominence, the database, basic numerical procedures, and algorithms are being re-examined, with new methods of verification and validation being implemented to capture a well-defined algorithm for the engineering design processes to be used in this early development phase of the Bush initiative. This process provides the methodology to transform the 1995 HZETRN research code into the 2005 HZETRN engineering code available for these early design processes. In this paper, we review the basic derivations, including new corrections to the codes to ensure improved numerical stability, and provide benchmarks for code verification.

  19. Design of an occulter testbed at flight Fresnel numbers

    NASA Astrophysics Data System (ADS)

    Sirbu, Dan; Kasdin, N. Jeremy; Kim, Yunjong; Vanderbei, Robert J.

    2015-01-01

    An external occulter is a spacecraft flown along the line-of-sight of a space telescope to suppress starlight and enable high-contrast direct imaging of exoplanets. Laboratory verification of occulter designs is necessary to validate the optical models used to design and predict occulter performance. At Princeton, we are designing and building a testbed that allows verification of scaled occulter designs whose suppressed shadow is mathematically identical to that of space occulters. Here, we present a sample design that operates at a flight Fresnel number and is thus representative of a realistic space mission. We present calculations of experimental limits arising from the finite size and propagation distance available in the testbed, limitations due to manufacturing feature size, and a non-ideal input beam. We demonstrate how the testbed is designed to be feature-size limited, and provide an estimate of the expected performance.
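
    The scaling at work is that the occulter shadow is governed by the Fresnel number N = a^2/(lambda*z), where a is the occulter radius, z the occulter-telescope separation, and lambda the wavelength; a lab configuration matching the flight N yields a mathematically identical shadow. A quick check with representative numbers, which are assumptions rather than values from the paper:

        def fresnel_number(radius_m, wavelength_m, distance_m):
            """Fresnel number N = a^2 / (lambda * z)."""
            return radius_m**2 / (wavelength_m * distance_m)

        # Assumed flight-like case: ~25 m occulter at ~40,000 km in 600 nm light
        n_flight = fresnel_number(25.0, 600e-9, 4.0e7)
        # Assumed lab case: ~1 cm mask at ~6.4 m, chosen so the Fresnel numbers match
        n_lab = fresnel_number(0.01, 600e-9, 6.4)
        print(f"flight N ~ {n_flight:.1f}, lab N ~ {n_lab:.1f}")  # both ~26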

  20. SMAP Verification and Validation Project - Final Report

    NASA Technical Reports Server (NTRS)

    Murry, Michael

    2012-01-01

    In 2007, the National Research Council (NRC) released the Decadal Survey of Earth science. For the coming decade, the survey identified 15 new space missions of significant scientific and application value for the National Aeronautics and Space Administration (NASA) to undertake. One of these missions was the Soil Moisture Active Passive (SMAP) mission, which NASA assigned to the Jet Propulsion Laboratory (JPL) in 2008. The goal of SMAP is to provide global, high-resolution mapping of soil moisture and its freeze/thaw state. The SMAP project recently passed its Critical Design Review and is proceeding with its fabrication and testing phase. Verification and Validation (V&V) is widely recognized as a critical component of systems engineering and is vital to the success of any space mission. V&V is the process used to check that a system meets its design requirements and specifications in order to fulfill its intended purpose. Verification often refers to the question "Have we built the system right?" whereas Validation asks "Have we built the right system?" Currently the SMAP V&V team is verifying design requirements through inspection, demonstration, analysis, or testing. An example of the SMAP V&V process is the verification of the antenna pointing accuracy with mathematical models, since it is not possible to provide the appropriate micro-gravity environment for testing the antenna on Earth before launch.
