NASA Astrophysics Data System (ADS)
Graham, Michelle; Gray, David
As wireless networks become increasingly ubiquitous, the demand for a method of locating a device has increased dramatically. Location Based Services are now commonplace but there are few methods of verifying or guaranteeing a location provided by a user without some specialised hardware, especially in larger scale networks. We propose a system for the verification of location claims, using proof gathered from neighbouring devices. In this paper we introduce a protocol to protect this proof gathering process, protecting the privacy of all involved parties and securing it from intruders and malicious claiming devices. We present the protocol in stages, extending the security of this protocol to allow for flexibility within its application. The Secure Location Verification Proof Gathering Protocol (SLVPGP) has been designed to function within the area of Vehicular Networks, although its application could be extended to any device with wireless & cryptographic capabilities.
Optimum structural design based on reliability and proof-load testing
NASA Technical Reports Server (NTRS)
Shinozuka, M.; Yang, J. N.
1969-01-01
Proof-load testing eliminates structures with strength less than the proof load and improves the reliability value used in analysis. It truncates the distribution function of strength at the proof load, thereby alleviating the need to verify a fitted distribution function at the lower tail portion, where data are usually nonexistent.
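As a hedged illustration of the truncation idea (notation ours, not taken from the report): if F_R(r) is the cumulative distribution function of strength before testing and r_p is the proof load, the strength of a structure that survives the test follows the conditional (truncated) distribution

    F_{R|proof}(r) = 0 for r < r_p,    F_{R|proof}(r) = (F_R(r) - F_R(r_p)) / (1 - F_R(r_p)) for r >= r_p,

so the fitted form of F_R below r_p never enters the reliability calculation.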
Mechanical verification of a schematic Byzantine clock synchronization algorithm
NASA Technical Reports Server (NTRS)
Shankar, Natarajan
1991-01-01
Schneider generalizes a number of protocols for Byzantine fault tolerant clock synchronization and presents a uniform proof for their correctness. The authors present a machine-checked proof of this schematic protocol that revises some of the details in Schneider's original analysis. The verification was carried out with the EHDM system developed at the SRI Computer Science Laboratory. The mechanically checked proofs include the verification that the egocentric mean function used in Lamport and Melliar-Smith's Interactive Convergence Algorithm satisfies the requirements of Schneider's protocol.
Thomson's Theorem of Electrostatics: Its Applications and Mathematical Verification
ERIC Educational Resources Information Center
Bakhoum, Ezzat G.
2008-01-01
A 100-year-old formula given by J. J. Thomson has recently found numerous applications in computational electrostatics and electromagnetics. Thomson himself never gave a proof of the formula, but a proof based on Differential Geometry was suggested by Jackson and later published by Pappas. Unfortunately, Differential Geometry, being a…
NASA Technical Reports Server (NTRS)
Miner, Paul S.
1993-01-01
A critical function in a fault-tolerant computer architecture is the synchronization of the redundant computing elements. The synchronization algorithm must include safeguards to ensure that failed components do not corrupt the behavior of good clocks. Reasoning about fault-tolerant clock synchronization is difficult because of the possibility of subtle interactions involving failed components. Therefore, mechanical proof systems are used to ensure that the verification of the synchronization system is correct. In 1987, Schneider presented a general proof of correctness for several fault-tolerant clock synchronization algorithms. Subsequently, Shankar verified Schneider's proof by using the mechanical proof system EHDM. This proof ensures that any system satisfying its underlying assumptions will provide Byzantine fault-tolerant clock synchronization. The utility of Shankar's mechanization of Schneider's theory for the verification of clock synchronization systems is explored. Some limitations of Shankar's mechanically verified theory were encountered. With minor modifications to the theory, a mechanically checked proof is provided that removes these limitations. The revised theory also allows for proven recovery from transient faults. Use of the revised theory is illustrated with the verification of an abstract design of a clock synchronization system.
Crowd Sourced Formal Verification-Augmentation (CSFV-A)
2016-06-01
The Crowd Sourced Formal Verification (CSFV) program built games that recast FV problems into puzzles to make these problems more accessible, increasing the manpower to...construct FV proofs. This effort supported the CSFV program by hosting the games on a public website, and analyzed the gameplay for efficiency to...provide FV proofs.
Formalization of the Integral Calculus in the PVS Theorem Prover
NASA Technical Reports Server (NTRS)
Butler, Ricky W.
2004-01-01
The PVS theorem prover is a widely used formal verification tool for the analysis of safety-critical systems. Although the PVS prover is fully equipped to support deduction in a very general logical framework, namely higher-order logic, it must nevertheless be augmented with the definitions and associated theorems for every branch of mathematics and computer science that is used in a verification. This is a formidable task, ultimately requiring the contributions of researchers and developers all over the world. This paper reports on the formalization of the integral calculus in the PVS theorem prover. All of the basic definitions and theorems covered in a first course on integral calculus have been completed. The theory and proofs were based on Rosenlicht's classic text on real analysis and follow the traditional epsilon-delta method. The goal of this work was to provide a practical set of PVS theories that could be used for the verification of hybrid systems that arise in air traffic management systems and other aerospace applications. All of the basic linearity, integrability, boundedness, and continuity properties of the integral calculus were proved. The work culminated in the proof of the Fundamental Theorem of Calculus. There is a brief discussion of why mechanically checked proofs are so much longer than standard mathematics textbook proofs.
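For reference, the culminating result can be stated in the usual epsilon-delta setting (this is the textbook statement, not the PVS source): if f is continuous on [a, b] and

    F(x) = \int_a^x f(t) \, dt,

then F is differentiable on (a, b) with F'(x) = f(x), and for any antiderivative G of f,

    \int_a^b f(t) \, dt = G(b) - G(a).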
Peer Review of a Formal Verification/Design Proof Methodology
NASA Technical Reports Server (NTRS)
1983-01-01
The role of formal verification techniques in system validation was examined. The value and the state of the art of performance proving for fault-tolerant computers were assessed. The investigation, development, and evaluation of performance-proving tools were reviewed, and the technical issues related to proof methodologies were examined and summarized.
Experimental preparation and verification of quantum money
NASA Astrophysics Data System (ADS)
Guan, Jian-Yu; Arrazola, Juan Miguel; Amiri, Ryan; Zhang, Weijun; Li, Hao; You, Lixing; Wang, Zhen; Zhang, Qiang; Pan, Jian-Wei
2018-03-01
A quantum money scheme enables a trusted bank to provide untrusted users with verifiable quantum banknotes that cannot be forged. In this work, we report a proof-of-principle experimental demonstration of the preparation and verification of unforgeable quantum banknotes. We employ a security analysis that takes experimental imperfections fully into account. We measure a total of 3.6 × 10^6 states in one verification round, limiting the forging probability to 10^-7 based on the security analysis. Our results demonstrate the feasibility of preparing and verifying quantum banknotes using currently available experimental techniques.
An Empirical Evaluation of Automated Theorem Provers in Software Certification
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd; Schumann, Johann
2004-01-01
We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). We discuss the unique requirements this application places on the ATPs, focusing on automation, proof checking, and usability. For full automation, however, the obligations must be aggressively preprocessed and simplified, and we demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATPs to solve the proof tasks. Our results are based on 13 certification experiments that led to more than 25,000 proof tasks which have each been attempted by Vampire, Spass, e-setheo, and Otter. The proofs found by Otter have been proof-checked by IVY.
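As a hedged illustration of the kind of obligation such Hoare-style safety certification produces (the property and symbols are ours, not NASA's), an array-bounds safety condition inside a loop might take the form

    \forall i. (0 <= i \wedge i < n \wedge n <= length(a)) \Rightarrow (0 <= i \wedge i < length(a)),

which aggressive preprocessing can rewrite into trivially valid arithmetic facts before the remaining residue is handed to the first-order provers.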
Analytic proof of the existence of the Lorenz attractor in the extended Lorenz model
NASA Astrophysics Data System (ADS)
Ovsyannikov, I. I.; Turaev, D. V.
2017-01-01
We give an analytic (free of computer assistance) proof of the existence of a classical Lorenz attractor for an open set of parameter values of the Lorenz model in the form of Yudovich-Morioka-Shimizu. The proof is based on detection of a homoclinic butterfly with a zero saddle value and rigorous verification of one of the Shilnikov criteria for the birth of the Lorenz attractor; we also supply a proof for this criterion. The results are applied in order to give an analytic proof for the existence of a robust, pseudohyperbolic strange attractor (the so-called discrete Lorenz attractor) for an open set of parameter values in a 4-parameter family of 3D Henon-like diffeomorphisms.
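For orientation only, the classical Lorenz system (not the extended Yudovich-Morioka-Shimizu form analyzed in the paper) is

    \dot{x} = \sigma (y - x), \quad \dot{y} = x (\rho - z) - y, \quad \dot{z} = x y - \beta z,

with the familiar attractor observed numerically at \sigma = 10, \rho = 28, \beta = 8/3; the cited result replaces such numerics with an analytic argument for a related extended family.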
Formal verification of an oral messages algorithm for interactive consistency
NASA Technical Reports Server (NTRS)
Rushby, John
1992-01-01
The formal specification and verification of an algorithm for Interactive Consistency based on the Oral Messages algorithm for Byzantine Agreement is described. We compare our treatment with that of Bevier and Young, who presented a formal specification and verification for a very similar algorithm. Unlike Bevier and Young, who observed that 'the invariant maintained in the recursive subcases of the algorithm is significantly more complicated than is suggested by the published proof' and who found its formal verification 'a fairly difficult exercise in mechanical theorem proving,' our treatment is very close to the previously published analysis of the algorithm, and our formal specification and verification are straightforward. This example illustrates how delicate choices in the formulation of the problem can have significant impact on the readability of its formal specification and on the tractability of its formal verification.
Klein, Gerwin; Andronick, June; Keller, Gabriele; Matichuk, Daniel; Murray, Toby; O'Connor, Liam
2017-10-13
We present recent work on building and scaling trustworthy systems with formal, machine-checkable proof from the ground up, including the operating system kernel, at the level of binary machine code. We first give a brief overview of the seL4 microkernel verification and how it can be used to build verified systems. We then show two complementary techniques for scaling these methods to larger systems: proof engineering, to estimate verification effort; and code/proof co-generation, for scalable development of provably trustworthy applications. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).
Formal Verification of Large Software Systems
NASA Technical Reports Server (NTRS)
Yin, Xiang; Knight, John
2010-01-01
We introduce a scalable proof structure to facilitate formal verification of large software systems. In our approach, we mechanically synthesize an abstract specification from the software implementation, match its static operational structure to that of the original specification, and organize the proof as the conjunction of a series of lemmas about the specification structure. By setting up a different lemma for each distinct element and proving each lemma independently, we obtain the important benefit that the proof scales easily for large systems. We present details of the approach and an illustration of its application on a challenge problem from the security domain.
Proceedings of the Second NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Munoz, Cesar (Editor)
2010-01-01
This publication contains the proceedings of the Second NASA Formal Methods Symposium sponsored by the National Aeronautics and Space Administration and held in Washington D.C. April 13-15, 2010. Topics covered include: Decision Engines for Software Analysis using Satisfiability Modulo Theories Solvers; Verification and Validation of Flight-Critical Systems; Formal Methods at Intel -- An Overview; Automatic Review of Abstract State Machines by Meta Property Verification; Hardware-independent Proofs of Numerical Programs; Slice-based Formal Specification Measures -- Mapping Coupling and Cohesion Measures to Formal Z; How Formal Methods Impels Discovery: A Short History of an Air Traffic Management Project; A Machine-Checked Proof of A State-Space Construction Algorithm; Automated Assume-Guarantee Reasoning for Omega-Regular Systems and Specifications; Modeling Regular Replacement for String Constraint Solving; Using Integer Clocks to Verify the Timing-Sync Sensor Network Protocol; Can Regulatory Bodies Expect Efficient Help from Formal Methods?; Synthesis of Greedy Algorithms Using Dominance Relations; A New Method for Incremental Testing of Finite State Machines; Verification of Faulty Message Passing Systems with Continuous State Space in PVS; Phase Two Feasibility Study for Software Safety Requirements Analysis Using Model Checking; A Prototype Embedding of Bluespec System Verilog in the PVS Theorem Prover; SimCheck: An Expressive Type System for Simulink; Coverage Metrics for Requirements-Based Testing: Evaluation of Effectiveness; Software Model Checking of ARINC-653 Flight Code with MCP; Evaluation of a Guideline by Formal Modelling of Cruise Control System in Event-B; Formal Verification of Large Software Systems; Symbolic Computation of Strongly Connected Components Using Saturation; Towards the Formal Verification of a Distributed Real-Time Automotive System; Slicing AADL Specifications for Model Checking; Model Checking with Edge-valued Decision Diagrams; and Data-flow based Model Analysis.
Explaining Verification Conditions
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2006-01-01
The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
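A minimal sketch of the labelling idea, using the standard Hoare assignment axiom (the label syntax is illustrative, not the paper's notation):

    unlabelled:  {Q[e/x]}  x := e  {Q}
    labelled:    {(Q[e/x])^(asgn@L)}  x := e  {Q}

Every fragment of a verification condition that descends from this rule then carries the tag asgn@L, which can later be rendered as a sentence such as "this conjunct ensures that the assignment at location L establishes Q".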
Hardware proofs using EHDM and the RSRE verification methodology
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Sjogren, Jon A.
1988-01-01
A methodology for hardware verification developed by the Royal Signals and Radar Establishment (RSRE) is examined in the context of SRI International's Enhanced Hierarchical Design Methodology (EHDM) specification/verification system. The methodology utilizes a four-level specification hierarchy with the following levels: functional level, finite automata model, block model, and circuit level. The properties of a level are proved as theorems in the level below it. This methodology is applied to a 6-bit counter problem and is critically examined. The specifications are written in EHDM's specification language, Extended Special, and the proofs are carried out with EHDM's theorem prover; observations are made for improving both the RSRE methodology and the EHDM system.
A mechanized process algebra for verification of device synchronization protocols
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas
1992-01-01
We describe the formalization of a process algebra based on CCS within the Higher Order Logic (HOL) theorem-proving system. The representation of four types of device interactions and a correctness proof of the communication between a microprocessor and an MMU are presented.
Temporal Specification and Verification of Real-Time Systems.
1991-08-30
of concrete real-time systems can be modeled adequately. Specification: We present two conservative extensions of temporal logic that allow for the...logic. We present both model-checking algorithms for the automatic verification of finite-state real-time systems and proof methods for the deductive verification of real-time systems.
A physical zero-knowledge object-comparison system for nuclear warhead verification
Philippe, Sébastien; Goldston, Robert J.; Glaser, Alexander; d'Errico, Francesco
2016-01-01
Zero-knowledge proofs are mathematical cryptographic methods to demonstrate the validity of a claim while providing no further information beyond the claim itself. The possibility of using such proofs to process classified and other sensitive physical data has attracted attention, especially in the field of nuclear arms control. Here we demonstrate a non-electronic fast neutron differential radiography technique using superheated emulsion detectors that can confirm that two objects are identical without revealing their geometry or composition. Such a technique could form the basis of a verification system that could confirm the authenticity of nuclear weapons without sharing any secret design information. More broadly, by demonstrating a physical zero-knowledge proof that can compare physical properties of objects, this experiment opens the door to developing other such secure proof-systems for other applications. PMID:27649477
Towards composition of verified hardware devices
NASA Technical Reports Server (NTRS)
Schubert, E. Thomas; Levitt, K.; Cohen, G. C.
1991-01-01
Computers are being used where no affordable level of testing is adequate. Safety- and life-critical systems must find a replacement for exhaustive testing to guarantee their correctness. Hardware verification research, which establishes correctness through mathematical proof, has focused on device verification and has largely ignored verification of system composition. To address these deficiencies, we examine how the current hardware verification methodology can be extended to verify complete systems.
Families of Functions and Functions of Proof
ERIC Educational Resources Information Center
Landman, Greisy Winicki
2002-01-01
This article describes an activity for secondary school students that may constitute an appropriate opportunity to discuss with them the idea of proof, particularly in an algebraic context. During the activity the students may experience and understand some of the roles played by proof in mathematics in addition to verification of truth:…
Using Dynamic Geometry to Expand Mathematics Teachers' Understanding of Proof
ERIC Educational Resources Information Center
de Villiers, Michael
2004-01-01
This paper gives a broad descriptive account of some activities that the author has designed using Sketchpad to develop teachers' understanding of other functions of proof than just the traditional function of 'verification'. These other functions of proof illustrated here are those of explanation, discovery and systematization (in the context of…
Verification and Validation of KBS with Neural Network Components
NASA Technical Reports Server (NTRS)
Wen, Wu; Callahan, John
1996-01-01
Artificial Neural Networks (ANNs) play an important role in developing robust Knowledge Based Systems (KBS). The ANN-based components used in these systems learn to give appropriate predictions through training with correct input-output data patterns. Unlike a traditional KBS that depends on a rule database and a production engine, an ANN-based system mimics the decisions of an expert without explicitly formulating if-then rules. In fact, ANNs demonstrate their superiority when such if-then rules are hard for a human expert to generate. Verification of a traditional knowledge-based system is based on proof of consistency and completeness of the rule knowledge base and correctness of the production engine. These techniques, however, cannot be directly applied to ANN-based components. In this position paper, we propose a verification and validation procedure for KBS with ANN-based components. The essence of the procedure is to obtain an accurate system specification through incremental modification of the specifications using an ANN rule extraction algorithm.
20 CFR 30.106 - Can OWCP request employment verification from other sources?
Code of Federal Regulations, 2010 CFR
2010-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...
20 CFR 30.106 - Can OWCP request employment verification from other sources?
Code of Federal Regulations, 2011 CFR
2011-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...
20 CFR 30.106 - Can OWCP request employment verification from other sources?
Code of Federal Regulations, 2012 CFR
2012-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...
20 CFR 30.106 - Can OWCP request employment verification from other sources?
Code of Federal Regulations, 2014 CFR
2014-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... for other entities to provide OWCP with the information necessary to verify an employment history...
Formal verification of a microcoded VIPER microprocessor using HOL
NASA Technical Reports Server (NTRS)
Levitt, Karl; Arora, Tejkumar; Leung, Tony; Kalvala, Sara; Schubert, E. Thomas; Windley, Philip; Heckman, Mark; Cohen, Gerald C.
1993-01-01
The Royal Signals and Radar Establishment (RSRE) and members of the Hardware Verification Group at Cambridge University conducted a joint effort to prove the correspondence between the electronic block model and the top level specification of Viper. Unfortunately, the proof became too complex and unmanageable within the given time and funding constraints, and is thus incomplete as of the date of this report. This report describes an independent attempt to use the HOL (Cambridge Higher Order Logic) mechanical verifier to verify Viper. Deriving from recent results in hardware verification research at UC Davis, the approach has been to redesign the electronic block model to make it microcoded and to structure the proof in a series of decreasingly abstract interpreter levels, the lowest being the electronic block level. The highest level is the RSRE Viper instruction set. Owing to the new approach and some results on the proof of generic interpreters as applied to simple microprocessors, this attempt required an effort approximately an order of magnitude less than the previous one.
Automated Verification of Specifications with Typestates and Access Permissions
NASA Technical Reports Server (NTRS)
Siminiceanu, Radu I.; Catano, Nestor
2011-01-01
We propose an approach to formally verify Plural specifications based on access permissions and typestates, by model-checking automatically generated abstract state-machines. Our exhaustive approach captures all the possible behaviors of abstract concurrent programs implementing the specification. We describe the formal methodology employed by our technique and provide an example as proof of concept for the state-machine construction rules. The implementation of a fully automated algorithm to generate and verify models, now underway, will provide model-checking support for the Plural tool, which at present supports only program verification via data flow analysis (DFA).
Key Ideas: What Are They and How Can They Help Us Understand How People View Proof?
ERIC Educational Resources Information Center
Raman, Manya
2003-01-01
Examines the views of proof held by university-level mathematics students and teachers. Develops a framework for characterizing people's views of proof based on a distinction between public and private aspects of proof and the key ideas that link these two domains. (Author/KHR)
DOE Office of Scientific and Technical Information (OSTI.GOV)
McCune, W.; Shumsky, O.
2000-02-04
IVY is a verified theorem prover for first-order logic with equality. It is coded in ACL2, and it makes calls to the theorem prover Otter to search for proofs and to the program MACE to search for countermodels. Verifications of Otter and MACE are not practical because they are coded in C. Instead, Otter and MACE give detailed proofs and models that are checked by verified ACL2 programs. In addition, the initial conversion to clause form is done by verified ACL2 code. The verification is done with respect to finite interpretations.
Security Verification Techniques Applied to PatchLink COTS Software
NASA Technical Reports Server (NTRS)
Gilliam, David P.; Powell, John D.; Bishop, Matt; Andrew, Chris; Jog, Sameer
2006-01-01
Verification of the security of software artifacts is a challenging task. An integrated approach that combines verification techniques can increase the confidence in the security of software artifacts. Such an approach has been developed by the Jet Propulsion Laboratory (JPL) and the University of California at Davis (UC Davis). Two security verification instruments were developed and then piloted on PatchLink's UNIX Agent, a Commercial-Off-The-Shelf (COTS) software product, to assess the value of the instruments and the approach. The two instruments are the Flexible Modeling Framework (FMF) -- a model-based verification instrument (JPL), and a Property-Based Tester (UC Davis). Security properties were formally specified for the COTS artifact and then verified using these instruments. The results were then reviewed to determine the effectiveness of the approach and the security of the COTS product.
What are the ultimate limits to computational techniques: verifier theory and unverifiability
NASA Astrophysics Data System (ADS)
Yampolskiy, Roman V.
2017-09-01
Despite significant developments in proof theory, surprisingly little attention has been devoted to the concept of proof verifiers. In particular, the mathematical community may be interested in studying different types of proof verifiers (people, programs, oracles, communities, superintelligences) as mathematical objects. Such an effort could reveal their properties, their powers and limitations (particularly in human mathematicians), minimum and maximum complexity, as well as self-verification and self-reference issues. We propose an initial classification system for verifiers and provide some rudimentary analysis of solved and open problems in this important domain. Our main contribution is a formal introduction of the notion of unverifiability, for which the paper could serve as a general citation in domains of theorem proving, as well as software and AI verification.
Proof Rules for Automated Compositional Verification through Learning
NASA Technical Reports Server (NTRS)
Barringer, Howard; Giannakopoulou, Dimitra; Pasareanu, Corina S.
2003-01-01
Compositional proof systems not only enable the stepwise development of concurrent processes but also provide a basis to alleviate the state explosion problem associated with model checking. An assume-guarantee style of specification and reasoning has long been advocated to achieve compositionality. However, this style of reasoning is often non-trivial, typically requiring human input to determine appropriate assumptions. In this paper, we present novel assume-guarantee rules in the setting of finite labelled transition systems with blocking communication. We show how these rules can be applied in an iterative and fully automated fashion within a framework based on learning.
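The best-known rule of this style is the asymmetric assume-guarantee rule, shown here in standard triple notation (the paper's rules for blocking communication refine this pattern):

    <A> M1 <P>    and    <true> M2 <A>
    ------------------------------------
          <true> M1 || M2 <P>

where the assumption A is exactly what the learning framework (for example, an L*-style algorithm) constructs iteratively instead of requiring human input.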
Damage Detection and Verification System (DDVS) for In-Situ Health Monitoring
NASA Technical Reports Server (NTRS)
Williams, Martha K.; Lewis, Mark; Szafran, J.; Shelton, C.; Ludwig, L.; Gibson, T.; Lane, J.; Trautwein, T.
2015-01-01
Project presentation for the Game Changing Program Smart Book release. The Damage Detection and Verification System (DDVS) expands the damage detection capabilities of the Flat Surface Damage Detection System (FSDDS) sensory panels and includes an autonomous inspection capability utilizing cameras and dynamic computer vision algorithms to verify system health. The objectives of this formulation task are to establish the concept of operations, formulate the system requirements for a potential ISS flight experiment, and develop a preliminary design of an autonomous inspection capability system that will be demonstrated as a proof-of-concept ground-based damage detection and inspection system.
WRAP-RIB antenna technology development
NASA Technical Reports Server (NTRS)
Freeland, R. E.; Garcia, N. F.; Iwamoto, H.
1985-01-01
The wrap-rib deployable antenna concept development is based on a combination of hardware development and testing along with extensive supporting analysis. The proof-of-concept hardware models are large enough that they address the same basic problems of design, fabrication, assembly, and test as the full-scale systems, which were selected to be 100 meters at the beginning of the program. The hardware evaluation program consists of functional performance tests, design verification tests, and analytical model verification tests. Functional testing consists of kinematic deployment, mesh management, and verification of mechanical packaging efficiencies. Design verification consists of rib contour precision measurement, rib cross-section variation evaluation, rib materials characterization, and manufacturing imperfections assessment. Analytical model verification and refinement include mesh stiffness measurement, rib static and dynamic testing, mass measurement, and rib cross-section characterization. This concept has been considered for a number of potential applications, including mobile communications, VLBI, and aircraft surveillance; in fact, baseline system configurations were developed by JPL, using the appropriate wrap-rib antenna, for all three classes of applications.
Design and Verification of a Distributed Communication Protocol
NASA Technical Reports Server (NTRS)
Munoz, Cesar A.; Goodloe, Alwyn E.
2009-01-01
The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
Formally verifying Ada programs which use real number types
NASA Technical Reports Server (NTRS)
Sutherland, David
1986-01-01
Formal verification is applied to programs which use real number arithmetic operations (mathematical programs). Formal verification of a program P consists of creating a mathematical model of P, stating the desired properties of P in a formal logical language, and proving that the mathematical model has the desired properties using a formal proof calculus. The development and verification of the mathematical model are discussed.
Development of Sample Verification System for Sample Return Missions
NASA Technical Reports Server (NTRS)
Toda, Risaku; McKinney, Colin; Jackson, Shannon P.; Mojarradi, Mohammad; Trebi-Ollennu, Ashitey; Manohara, Harish
2011-01-01
This paper describes the development of a proof-of-concept sample verification system (SVS) for in-situ mass measurement of planetary rock and soil samples in future robotic sample return missions. Our proof-of-concept SVS device contains a 10 cm diameter pressure-sensitive elastic membrane placed at the bottom of a sample canister. The membrane deforms under the weight of the accumulating planetary sample. The membrane is positioned in proximity to an opposing substrate with a narrow gap. The deformation of the membrane narrows the gap, resulting in increased capacitance between the two nearly parallel plates. Capacitance readout circuitry on a nearby printed circuit board (PCB) transmits data via a low-voltage differential signaling (LVDS) interface. The fabricated SVS proof-of-concept device has successfully demonstrated a capacitance change of approximately 1 pF per gram.
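The measurement principle can be sketched with the parallel-plate approximation (our notation; the flight geometry is more complex): with plate area A, permittivity \varepsilon_0, and gap d,

    C = \varepsilon_0 A / d,   so   \Delta C = \varepsilon_0 A (1/(d - \Delta d) - 1/d)

for a membrane deflection \Delta d, and calibrating \Delta C against known masses gives the quoted sensitivity of roughly 1 pF per gram.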
2016-02-01
proof in mathematics. For example, consider the proof of the Pythagorean Theorem illustrated at http://www.cut-the-knot.org/pythagoras/ where 112...methods and tools have made significant progress in their ability to model software designs and prove correctness theorems about the systems modeled..."assumption criticality" or "theorem root set size" SITAPS detects potentially brittle verification cases. SITAPS provides tools and techniques that
Soft Drinks, Mind Reading, and Number Theory
ERIC Educational Resources Information Center
Schultz, Kyle T.
2009-01-01
Proof is a central component of mathematicians' work, used for verification, explanation, discovery, and communication. Unfortunately, high school students' experiences with proof are often limited to verifying mathematical statements or relationships that are already known to be true. As a result, students often fail to grasp the true nature of…
28 CFR 79.16 - Proof of medical condition.
Code of Federal Regulations, 2014 CFR
2014-07-01
... Program to contact the appropriate state cancer or tumor registry. The Program will accept as proof of medical condition verification from the state cancer or tumor registry that it possesses medical records... Cancer Institute can make a diagnosis of leukemia to a reasonable degree of medical certainty: (i) Bone...
28 CFR 79.16 - Proof of medical condition.
Code of Federal Regulations, 2013 CFR
2013-07-01
... Program to contact the appropriate state cancer or tumor registry. The Program will accept as proof of medical condition verification from the state cancer or tumor registry that it possesses medical records... Cancer Institute can make a diagnosis of leukemia to a reasonable degree of medical certainty: (i) Bone...
28 CFR 79.16 - Proof of medical condition.
Code of Federal Regulations, 2012 CFR
2012-07-01
... Program to contact the appropriate state cancer or tumor registry. The Program will accept as proof of medical condition verification from the state cancer or tumor registry that it possesses medical records... Cancer Institute can make a diagnosis of leukemia to a reasonable degree of medical certainty: (i) Bone...
32 CFR Appendix B to Part 324 - System of Records Notice
Code of Federal Regulations, 2010 CFR
2010-07-01
..., punctuation, and spaces. 2. Security classification. Self explanatory. (DoD does not publish this caption... birth, etc.); and any description of proof of identity for verification purposes required for personal... verification. If appropriate, the individual may be referred to the system manager or another DFAS official who...
Preparation of Drug-loaded Chitosan Microspheres and Its Application in Paper-based PVC Wallpaper
NASA Astrophysics Data System (ADS)
Lin, Hui; Chen, Lihui; Yan, Guiyang; Chen, Feng; Huang, Liulian
2018-03-01
Screening tests showed that drug-loaded chitosan microspheres with an average particle size of 615 nm can be prepared with NaF as the mold-proof drug, chitosan as the drug carrier, and sodium tripolyphosphate as the cross-linking agent, and that loading them onto the base-paper surface of paper-based PVC wallpaper improves its resistance to Aspergillus niger. The results show that NaF and chitosan have a synergistic mold-proof effect; the mold-proof effect of the wallpaper may be improved by increasing the dose of chitosan; and when the mass ratio of NaF, sodium tripolyphosphate, and chitosan was 2:7:28, paper-based PVC wallpaper with good mold-proof properties was obtained.
Formal specification and verification of Ada software
NASA Technical Reports Server (NTRS)
Hird, Geoffrey R.
1991-01-01
The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.
2007-03-01
Characterisation. In Nanotechnology Aerospace Applications - 2006 (pp. 4-1 to 4-8). Educational Notes RTO-EN-AVT-129bis, Paper 4. Neuilly-sur-Seine, France: RTO. (Residue of a commercialisation-process diagram; recoverable stages: Concept/Idea, Proof-of-Principle, Trial Samples, Engineering Verification Samples, Design Verification Samples, with SEIC, Systems Engineering for Commercialisation, linking design houses, engineering and R&D users and integrators, and fabs and wafer processing.)
Post-OPC verification using a full-chip pattern-based simulation verification method
NASA Astrophysics Data System (ADS)
Hung, Chi-Yuan; Wang, Ching-Heng; Ma, Cliff; Zhang, Gary
2005-11-01
In this paper, we evaluated and investigated techniques for performing fast full-chip post-OPC verification using a commercial product platform. A number of databases from several technology nodes, i.e. 0.13um, 0.11um, and 90nm, were used in the investigation. Although it has been proven that in most cases our OPC technology is robust in general, given the variety of tape-outs with complicated design styles and technologies it is difficult to develop a "complete or bullet-proof" OPC algorithm that covers every possible layout pattern. In the evaluation, among dozens of databases, errors were found in some OPC databases by model-based post-OPC checking; such errors could cost significantly in manufacturing (reticle, wafer process) and, more importantly, in production delay. From such full-chip OPC database verification, we have learned that optimizing OPC models and recipes on a limited set of test chip designs may not provide sufficient coverage across the range of designs to be produced in the process, and fatal errors (such as pinch or bridge), poor CD distribution, and process-sensitive patterns may still occur. As a result, more than one reticle tape-out cycle is not uncommon to prove models and recipes that approach the center of process for a range of designs. We therefore describe a full-chip pattern-based simulation verification flow that serves both OPC model and recipe development as well as post-OPC verification after production release of the OPC. Lastly, we discuss the differentiation of the new pattern-based and conventional edge-based verification tools and summarize the advantages of our new tool and methodology: (1) accuracy: superior inspection algorithms, down to 1 nm accuracy, with the new pattern-based approach; (2) high-speed performance: pattern-centric algorithms give the best full-chip inspection efficiency; (3) powerful analysis capability: flexible error distribution, grouping, interactive viewing, and hierarchical pattern extraction to narrow down to unique patterns/cells.
Towards the formal verification of the requirements and design of a processor interface unit
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
The formal verification of the design and partial requirements for a Processor Interface Unit (PIU) using the Higher Order Logic (HOL) theorem-proving system is described. The processor interface unit is a single-chip subsystem within a fault-tolerant embedded system under development within the Boeing Defense and Space Group. It provides the opportunity to investigate the specification and verification of a real-world subsystem within a commercially-developed fault-tolerant computer. An overview of the PIU verification effort is given. The actual HOL listings from the verification effort are documented in a companion NASA contractor report entitled 'Towards the Formal Verification of the Requirements and Design of a Processor Interface Unit - HOL Listings', including the general-purpose HOL theories and definitions that support the PIU verification as well as tactics used in the proofs.
NASA Technical Reports Server (NTRS)
Whalen, Michael; Schumann, Johann; Fischer, Bernd
2002-01-01
Code certification is a lightweight approach to demonstrate software quality on a formal level. Its basic idea is to require producers to provide formal proofs that their code satisfies certain quality properties. These proofs serve as certificates which can be checked independently. Since code certification uses the same underlying technology as program verification, it also requires many detailed annotations (e.g., loop invariants) to make the proofs possible. However, manually adding these annotations to the code is time-consuming and error-prone. We address this problem by combining code certification with automatic program synthesis. We propose an approach to generate simultaneously, from a high-level specification, both code and all annotations required to certify the generated code. Here, we describe a certification extension of AUTOBAYES, a synthesis tool which automatically generates complex data analysis programs from compact specifications. AUTOBAYES contains sufficient high-level domain knowledge to generate detailed annotations. This allows us to use a general-purpose verification condition generator to produce a set of proof obligations in first-order logic. The obligations are then discharged using the automated theorem prover E-SETHEO. We demonstrate our approach by certifying operator safety for a generated iterative data classification program without manual annotation of the code.
Verified compilation of Concurrent Managed Languages
2017-11-01
designs for compiler intermediate representations that facilitate mechanized proofs and verification; and (d) a realistic case study that combines these...ideas to prove the correctness of a state-of-the-art concurrent garbage collector. Even though concurrency is a pervasive part of modern software and hardware systems, it has often been ignored in safety-critical system designs.
Age verification cards fail to fully prevent minors from accessing tobacco products.
Kanda, Hideyuki; Osaki, Yoneatsu; Ohida, Takashi; Kaneita, Yoshitaka; Munezawa, Takeshi
2011-03-01
Proper age verification can prevent minors from accessing tobacco products. For this reason, electronic locking devices based on a proof-of-age system utilising cards were installed in almost every tobacco vending machine across Japan and Germany to restrict sales to minors. We aimed to clarify the associations between the amount smoked by high school students and the usage of age verification cards by conducting a nationwide cross-sectional survey of students in Japan. This survey was conducted in 2008. We asked high school students, aged 13-18 years, in Japan about their smoking behaviour, where they purchase cigarettes, whether or not they have used age verification cards, and if so, how they obtained the card. As the amount smoked increased, the prevalence of purchasing cigarettes from vending machines also rose for both males and females. The percentage of those with experience of using an age verification card was also higher among those who smoked more. Somebody outside the family was the top source for obtaining cards. Surprisingly, around 5% of males and females belonging to the group with the highest smoking levels applied for cards themselves. Age verification cards cannot fully prevent minors from accessing tobacco products. These findings suggest that a total ban on tobacco vending machines, not an age verification system, is needed to prevent sales to minors.
DOT National Transportation Integrated Search
2016-11-21
Work presented herein is an addendum to the final report for NCDOT Project 2011-05 entitled: Field Verification of Undercut Criteria and Alternatives for Subgrade Stabilization in the Piedmont Area. The objective of the addendum work is to p...
Hand Grasping Synergies As Biometrics.
Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana
2017-01-01
Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. This includes iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement that integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies, namely postural synergies. In this proof-of-concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security.
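A minimal sketch of how such movement synergies can be extracted with principal component analysis, assuming trials are stored as flattened joint angular velocity profiles (the file name, array shapes, and component count are illustrative, not the authors' pipeline):

    import numpy as np
    from sklearn.decomposition import PCA

    # X: one row per grasp trial; each row is a flattened profile of
    # 10 joint angular velocities over T time samples (10*T columns).
    X = np.load("grasp_velocity_profiles.npy")  # hypothetical data file

    pca = PCA(n_components=6)          # keep a small set of synergies
    weights = pca.fit_transform(X)     # per-trial synergy weights
    synergies = pca.components_        # spatiotemporal synergies, shape (6, 10*T)
    print(pca.explained_variance_ratio_)

    # A verification system could enrol a user's mean weight vector and
    # accept a probe trial when its weights fall within a distance threshold.

In such a scheme, the equal error rate is obtained by sweeping that acceptance threshold to the point where false accepts equal false rejects.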
Space shuttle orbit maneuvering engine reusable thrust chamber program
NASA Technical Reports Server (NTRS)
Senneff, J. M.
1975-01-01
The feasibility of potential reusable thrust chamber concepts is studied. Propellant candidates were examined and analytically combined with potential cooling schemes. A database of engine data which would assist in configuration selection was produced. The database verification was performed by the demonstration of a thrust chamber of a selected coolant scheme design. A full-scale insulated columbium thrust chamber was used for propellant coolant configurations. Combustion stability of the injectors and a reduced-size thrust chamber were experimentally verified as proof-of-concept demonstrations of the design and study results.
A Machine-Checked Proof of A State-Space Construction Algorithm
NASA Technical Reports Server (NTRS)
Catano, Nestor; Siminiceanu, Radu I.
2010-01-01
This paper presents the correctness proof of Saturation, an algorithm for generating state spaces of concurrent systems, implemented in the SMART tool. Unlike the Breadth First Search exploration algorithm, which is easy to understand and formalise, Saturation is a complex algorithm, employing a mutually-recursive pair of procedures that compute a series of non-trivial, nested local fixed points, corresponding to a chaotic fixed point strategy. A pencil-and-paper proof of Saturation exists, but a machine checked proof had never been attempted. The key element of the proof is the characterisation theorem of saturated nodes in decision diagrams, stating that a saturated node represents a set of states encoding a local fixed-point with respect to firing all events affecting only the node's level and levels below. For our purpose, we have employed the Prototype Verification System (PVS) for formalising the Saturation algorithm, its data structures, and for conducting the proofs.
Kabir, Muhammad N.; Alginahi, Yasser M.
2014-01-01
This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover media to achieve its goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
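A minimal sketch of the zero-watermarking idea, in which a verification token is derived from the cover text rather than embedded in it (the normalization step and HMAC construction are our illustration, not the paper's algorithm, and this sketch checks integrity only rather than localizing tampering):

    import hmac, hashlib, unicodedata

    def generate_token(text: str, key: bytes) -> str:
        # Derive a zero-watermark; the cover text is never modified.
        normalized = unicodedata.normalize("NFC", text)
        return hmac.new(key, normalized.encode("utf-8"), hashlib.sha256).hexdigest()

    def verify(text: str, key: bytes, token: str) -> bool:
        # Any change to the text changes the recomputed token.
        return hmac.compare_digest(generate_token(text, key), token)

    key = b"shared-secret"                             # hypothetical key handling
    token = generate_token("Sensitive clause.", key)   # lodged with a trusted party
    print(verify("Sensitive clause.", key, token))     # True
    print(verify("Sensitive clause!", key, token))     # False: tampering detected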
Report on the formal specification and partial verification of the VIPER microprocessor
NASA Technical Reports Server (NTRS)
Brock, Bishop; Hunt, Warren A., Jr.
1991-01-01
The formal specification and partial verification of the VIPER microprocessor is reviewed. The VIPER microprocessor was designed by RSRE, Malvern, England, for safety critical computing applications (e.g., aircraft, reactor control, medical instruments, armaments). The VIPER was carefully specified and partially verified in an attempt to provide a microprocessor with completely predictable operating characteristics. The specification of VIPER is divided into several levels of abstraction, from a gate-level description up to an instruction execution model. Although the consistency between certain levels was demonstrated with mechanically-assisted mathematical proof, the formal verification of VIPER was never completed.
Code of Federal Regulations, 2011 CFR
2011-01-01
... published on a schedule designed to provide the public with information about their Government on a timely... Federal Register for transmittal of statements and charts and for the verification of proofs. Failure to...
NASA Astrophysics Data System (ADS)
Benfenati, Francesco; Beretta, Gian Paolo
2018-04-01
We show that to prove the Onsager relations using the microscopic time reversibility one necessarily has to make an ergodic hypothesis, or a hypothesis closely linked to that. This is true in all the proofs of the Onsager relations in the literature: from the original proof by Onsager, to more advanced proofs in the context of linear response theory and the theory of Markov processes, to the proof in the context of the kinetic theory of gases. The only three proofs that do not require any kind of ergodic hypothesis are based on additional hypotheses on the macroscopic evolution: Ziegler's maximum entropy production principle (MEPP), the principle of time reversal invariance of the entropy production, or the steepest entropy ascent principle (SEAP).
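For context, the relations in question state that when thermodynamic fluxes J_i respond linearly to forces X_j,

    J_i = \sum_j L_{ij} X_j   with   L_{ij} = L_{ji}

(and L_{ij}(B) = L_{ji}(-B) in a magnetic field B); it is the derivation of this symmetry from microscopic time reversibility that the paper argues always requires an ergodic-type or additional macroscopic hypothesis.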
High Pressure Regenerative Turbine Engine: 21st Century Propulsion
NASA Technical Reports Server (NTRS)
Lear, W. E.; Laganelli, A. L.; Senick, Paul (Technical Monitor)
2001-01-01
A novel semi-closed cycle gas turbine engine was demonstrated and was found to meet the program goals. The proof-of-principle test of the High Pressure Regenerative Turbine Engine produced data that agreed well with models, enabling more confidence in designing future prototypes based on this concept. Emission levels were significantly reduced as predicted as a natural attribute of this power cycle. Engine testing over a portion of the operating range allowed verification of predicted power increases compared to the baseline.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helton, Jon C.; Brooks, Dusty Marie; Sallaberry, Cedric Jean-Marie.
Representations are developed and illustrated for the distribution of link property values at the time of link failure in the presence of aleatory uncertainty in link properties. The following topics are considered: (i) defining properties for weak links and strong links, (ii) cumulative distribution functions (CDFs) for link failure time, (iii) integral-based derivation of CDFs for link property at time of link failure, (iv) sampling-based approximation of CDFs for link property at time of link failure, (v) verification of integral-based and sampling-based determinations of CDFs for link property at time of link failure, (vi) distributions of link properties conditional on time of link failure, and (vii) equivalence of two different integral-based derivations of CDFs for link property at time of link failure.
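As a rough illustration of item (iv), the sketch below builds a sampling-based approximation of the CDF of a link property at the time of link failure. The weak-link failure-temperature distribution and the linear thermal environment are hypothetical stand-ins, not the report's models.

import numpy as np

rng = np.random.default_rng(0)
n = 100_000
# Hypothetical aleatory-uncertain link property: temperature at which the weak link fails.
fail_temp = rng.normal(loc=600.0, scale=25.0, size=n)
# Hypothetical thermal environment T(t) = T0 + rate * t maps the property to a failure time.
T0, rate = 300.0, 5.0
fail_time = (fail_temp - T0) / rate
# Empirical (sampling-based) CDF of the link property at time of link failure.
prop_sorted = np.sort(fail_temp)                 # CDF support
cdf = np.arange(1, n + 1) / n                    # CDF values
print("P[property <= 600 at failure]:", (fail_temp <= 600.0).mean())
print("median failure time:", np.median(fail_time))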
Hand Grasping Synergies As Biometrics
Patel, Vrajeshri; Thukral, Poojita; Burns, Martin K.; Florescu, Ionut; Chandramouli, Rajarathnam; Vinjamuri, Ramana
2017-01-01
Recently, the need for more secure identity verification systems has driven researchers to explore other sources of biometrics. This includes iris patterns, palm print, hand geometry, facial recognition, and movement patterns (hand motion, gait, and eye movements). Identity verification systems may benefit from the complexity of human movement that integrates multiple levels of control (neural, muscular, and kinematic). Using principal component analysis, we extracted spatiotemporal hand synergies (movement synergies) from an object grasping dataset to explore their use as a potential biometric. These movement synergies are in the form of joint angular velocity profiles of 10 joints. We explored the effect of joint type, digit, number of objects, and grasp type. In its best configuration, movement synergies achieved an equal error rate of 8.19%. While movement synergies can be integrated into an identity verification system with motion capture ability, we also explored a camera-ready version of hand synergies—postural synergies. In this proof of concept system, postural synergies performed well, but only when specific postures were chosen. Based on these results, hand synergies show promise as a potential biometric that can be combined with other hand-based biometrics for improved security. PMID:28512630
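A small, self-contained example of the equal-error-rate metric quoted above (8.19%); the scores below are synthetic, not the study's synergy-based scores.

import numpy as np

rng = np.random.default_rng(1)
genuine = rng.normal(0.8, 0.10, 500)    # similarity scores for true identity claims
impostor = rng.normal(0.5, 0.15, 500)   # similarity scores for false identity claims

thresholds = np.linspace(0.0, 1.0, 1001)
far = np.array([(impostor >= t).mean() for t in thresholds])  # false accept rate
frr = np.array([(genuine < t).mean() for t in thresholds])    # false reject rate

i = np.argmin(np.abs(far - frr))        # operating point where FAR ~= FRR
print(f"EER ~ {(far[i] + frr[i]) / 2:.3f} at threshold {thresholds[i]:.3f}")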
Device independence for two-party cryptography and position verification with memoryless devices
NASA Astrophysics Data System (ADS)
Ribeiro, Jérémy; Thinh, Le Phuc; Kaniewski, Jedrzej; Helsen, Jonas; Wehner, Stephanie
2018-06-01
Quantum communication has demonstrated its usefulness for quantum cryptography far beyond quantum key distribution. One domain is two-party cryptography, whose goal is to allow two parties who may not trust each other to solve joint tasks. Another interesting application is position-based cryptography whose goal is to use the geographical location of an entity as its only identifying credential. Unfortunately, security of these protocols is not possible against an all-powerful adversary. However, if we impose some realistic physical constraints on the adversary, there exist protocols for which security can be proven, but these so far relied on the knowledge of the quantum operations performed during the protocols. In this work we improve the device-independent security proofs of Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004] for two-party cryptography (with memoryless devices) and we add a security proof for device-independent position verification (also memoryless devices) under different physical constraints on the adversary. We assess the quality of the devices by observing a Bell violation, and, as for Kaniewski and Wehner [New J. Phys. 18, 055004 (2016), 10.1088/1367-2630/18/5/055004], security can be attained for any violation of the Clauser-Horne-Shimony-Holt inequality.
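For readers unfamiliar with the Bell test used here, the snippet below evaluates the Clauser-Horne-Shimony-Holt (CHSH) combination whose violation (|S| > 2) certifies the devices; the correlator values are the ideal quantum-mechanical ones for optimal measurement settings, not data from the paper.

import math

def chsh(E_ab, E_abp, E_apb, E_apbp):
    # CHSH value S = E(a,b) + E(a,b') + E(a',b) - E(a',b').
    return E_ab + E_abp + E_apb - E_apbp

ideal = 1 / math.sqrt(2)                 # singlet-state correlator at the optimal angles
S = chsh(ideal, ideal, ideal, -ideal)    # = 2*sqrt(2), the Tsirelson bound
print(S, S > 2)                          # any S > 2 witnesses nonclassical devices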
Machine Learning-based Intelligent Formal Reasoning and Proving System
NASA Astrophysics Data System (ADS)
Chen, Shengqing; Huang, Xiaojian; Fang, Jiaze; Liang, Jia
2018-03-01
Reasoning systems can be used in many fields, and improving reasoning efficiency is at the core of their design. By combining a formal description of proofs with a rule-matching algorithm and introducing a machine learning algorithm, the resulting intelligent formal reasoning and verification system achieves high efficiency. The experimental results show that the system can verify the correctness of propositional logic reasoning and reuse propositional reasoning results, so as to obtain the implicit knowledge in the knowledge base and provide a basic reasoning model for the construction of intelligent systems.
Online 3D EPID-based dose verification: Proof of concept.
Spreeuw, Hanno; Rozendaal, Roel; Olaciregui-Ruiz, Igor; González, Patrick; Mans, Anton; Mijnheer, Ben; van Herk, Marcel
2016-07-01
Delivery errors during radiotherapy may lead to medical harm and reduced life expectancy for patients. Such serious incidents can be avoided by performing dose verification online, i.e., while the patient is being irradiated, creating the possibility of halting the linac in case of a large overdosage or underdosage. The offline EPID-based 3D in vivo dosimetry system clinically employed at our institute is in principle suited for online treatment verification, provided the system is able to complete 3D dose reconstruction and verification within 420 ms, the present acquisition time of a single EPID frame. It is the aim of this study to show that our EPID-based dosimetry system can be made fast enough to achieve online 3D in vivo dose verification. The current dose verification system was sped up in two ways. First, a new software package was developed to perform all computations that are not dependent on portal image acquisition separately, thus removing the need for doing these calculations in real time. Second, the 3D dose reconstruction algorithm was sped up via a new, multithreaded implementation. Dose verification was implemented by comparing planned with reconstructed 3D dose distributions delivered to two regions in a patient: the target volume and the nontarget volume receiving at least 10 cGy. In both volumes, the mean dose is compared, while in the nontarget volume, the near-maximum dose (D2) is compared as well. The real-time dosimetry system was tested by irradiating an anthropomorphic phantom with three VMAT plans: a 6 MV head-and-neck treatment plan, a 10 MV rectum treatment plan, and a 10 MV prostate treatment plan. In all plans, two types of serious delivery errors were introduced. The functionality of automatically halting the linac was also implemented and tested. The precomputation time per treatment was ∼180 s/treatment arc, depending on gantry angle resolution. The complete processing of a single portal frame, including dose verification, took 266 ± 11 ms on a dual octocore Intel Xeon E5-2630 CPU running at 2.40 GHz. The introduced delivery errors were detected after 5-10 s irradiation time. A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for two different kinds of gross delivery errors. Thus, online 3D dose verification has been technologically achieved.
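The comparison described above (mean dose in both volumes, near-maximum dose D2 in the nontarget volume) can be sketched on synthetic dose grids as follows; the grids, masks, and tolerances are illustrative assumptions, not the clinical implementation.

import numpy as np

rng = np.random.default_rng(2)
planned = rng.uniform(0.0, 2.0, size=(64, 64, 64))               # planned dose grid [Gy]
reconstructed = planned * rng.normal(1.0, 0.02, planned.shape)    # EPID-reconstructed dose
target = planned > 1.8                                            # hypothetical target mask
nontarget = (planned >= 0.10) & ~target                           # >= 10 cGy, outside target

def d2(dose):
    return np.percentile(dose, 98)       # near-maximum dose D2 (hottest 2%)

for name, mask in [("target", target), ("nontarget", nontarget)]:
    dm_plan, dm_rec = planned[mask].mean(), reconstructed[mask].mean()
    print(f"{name}: mean dose deviation {100 * (dm_rec - dm_plan) / dm_plan:+.1f}%")
dev_d2 = 100 * (d2(reconstructed[nontarget]) - d2(planned[nontarget])) / d2(planned[nontarget])
print(f"nontarget: D2 deviation {dev_d2:+.1f}%")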
ORION - Crew Module Side Hatch: Proof Pressure Test Anomaly Investigation
NASA Technical Reports Server (NTRS)
Evernden, Brent A.; Guzman, Oscar J.
2018-01-01
The Orion Multi-Purpose Crew Vehicle program was performing a proof pressure test on an engineering development unit (EDU) of the Orion Crew Module Side Hatch (CMSH) assembly. The purpose of the proof test was to demonstrate structural capability, with margin, at 1.5 times the maximum design pressure, before integrating the CMSH to the Orion Crew Module structural test article for subsequent pressure testing. The pressure test was performed at lower pressures of 3 psig, 10 psig and 15.75 psig with no apparent abnormal behavior or leaking. During pressurization to proof pressure of 23.32 psig, a loud 'pop' was heard at 21.3 psig. Upon review into the test cell, it was noted that the hatch had prematurely separated from the proof test fixture, thus immediately ending the test. The proof pressure test was expected to be a simple verification but has since evolved into a significant failure investigation conducted jointly by Lockheed Martin and NASA.
NASA Technical Reports Server (NTRS)
Fura, David A.; Windley, Phillip J.; Cohen, Gerald C.
1993-01-01
This technical report contains the Higher-Order Logic (HOL) listings of the partial verification of the requirements and design for a commercially developed processor interface unit (PIU). The PIU is an interface chip performing memory interface, bus interface, and additional support services for a commercial microprocessor within a fault tolerant computer system. This system, the Fault Tolerant Embedded Processor (FTEP), is targeted towards applications in avionics and space requiring extremely high levels of mission reliability, extended maintenance-free operation, or both. This report contains the actual HOL listings of the PIU verification as it currently exists. Section 2 of this report contains general-purpose HOL theories and definitions that support the PIU verification. These include arithmetic theories dealing with inequalities and associativity, and a collection of tactics used in the PIU proofs. Section 3 contains the HOL listings for the completed PIU design verification. Section 4 contains the HOL listings for the partial requirements verification of the P-Port.
This verification test was conducted according to procedures specified in the Test/QA Plan for Verification of Enzyme-Linked Immunosorbent Assay (ELISA) Test Kits for the Quantitative Determination of Endocrine Disrupting Compounds (EDCs) in Aqueous Phase Samples. Deviations to the...
Student-Teacher Linkage Verification: Model Process and Recommendations
ERIC Educational Resources Information Center
Watson, Jeffery; Graham, Matthew; Thorn, Christopher A.
2012-01-01
As momentum grows for tracking the role of individual educators in student performance, school districts across the country are implementing projects that involve linking teachers to their students. Programs that link teachers to student outcomes require a verification process for student-teacher linkages. Linkage verification improves accuracy by…
Data Mashups: Linking Human Health and Wellbeing with Weather, Climate and the Environment
NASA Astrophysics Data System (ADS)
Fleming, L. E.; Sarran, C.; Golding, B.; Haines, A.; Kessel, A.; Djennad, M.; Hajat, S.; Nichols, G.; Gordon Brown, H.; Depledge, M.
2016-12-01
A large part of the global disease burden can be linked to environmental factors, underpinned by unhealthy behaviours. Research into these linkages suffers from lack of common tools and databases for investigations across many different scientific disciplines to explore these complex associations. The MEDMI (Medical and Environmental Data Mash-up Infrastructure) Partnership brings together leading organisations and researchers in climate, weather, environment, and human health. We have created a proof-of-concept central data and analysis system with the UK Met Office and Public Health England data as the internet-based MEDMI Platform (www.data-mashup.org.uk) to serve as a common resource for researchers to link and analyse complex meteorological, environmental and epidemiological data in the UK. The Platform is hosted on its own dedicated server, with secure internet and in-person access with appropriate safeguards for ethical, copyright, security, preservation, and data sharing issues. Via the Platform, there is a demonstration Browser Application with access to user-selected subsets of the data for: a) analyses using time series (e.g. mortality/environmental variables), and b) data visualizations (e.g. infectious diseases/environmental variables). One demonstration project is linking climate change, harmful algal blooms and oceanographic modelling building on the hydrodynamic-biogeochemical coupled models; in situ and satellite observations as well as UK HAB data and hospital episode statistics data are being used for model verification and future forecasting. The MEDMI Project provides a demonstration of the potential, barriers, and challenges of these "data mashups" of environment and health data. Although there remain many challenges to creating and sustaining such a shared resource, these activities and resources are essential to truly explore the complex interactions between climate and other environmental change and health at the local and global scale.
Cymatics for the cloaking of flexural vibrations in a structured plate
Misseroni, D.; Colquitt, D. J.; Movchan, A. B.; Movchan, N. V.; Jones, I. S.
2016-01-01
Based on rigorous theoretical findings, we present a proof-of-concept design for a structured square cloak enclosing a void in an elastic lattice. We implement high-precision fabrication and experimental testing of an elastic invisibility cloak for flexural waves in a mechanical lattice. This is accompanied by verifications and numerical modelling performed through finite element simulations. The primary advantage of our square lattice cloak, over other designs, is the straightforward implementation and the ease of construction. The elastic lattice cloak, implemented experimentally, shows high efficiency. PMID:27068339
Probabilistic Sizing and Verification of Space Ceramic Structures
NASA Astrophysics Data System (ADS)
Denaux, David; Ballhause, Dirk; Logut, Daniel; Lucarelli, Stefano; Coe, Graham; Laine, Benoit
2012-07-01
Sizing of ceramic parts is best optimised using a probabilistic approach which takes into account the preexisting flaw distribution in the ceramic part to compute a probability of failure of the part depending on the applied load, instead of a maximum allowable load as for a metallic part. This requires extensive knowledge of the material itself but also an accurate control of the manufacturing process. In the end, risk reduction approaches such as proof testing may be used to lower the final probability of failure of the part. Sizing and verification of ceramic space structures have been performed by Astrium for more than 15 years, both with Zerodur and SiC: Silex telescope structure, Seviri primary mirror, Herschel telescope, Formosat-2 instrument, and other ceramic structures flying today. Throughout this period of time, Astrium has investigated and developed experimental ceramic analysis tools based on the Weibull probabilistic approach. In the scope of the ESA/ESTEC study: “Mechanical Design and Verification Methodologies for Ceramic Structures”, which is to be concluded in the beginning of 2012, existing theories, technical state-of-the-art from international experts, and Astrium experience with probabilistic analysis tools have been synthesized into a comprehensive sizing and verification method for ceramics. Both classical deterministic and more optimised probabilistic methods are available, depending on the criticality of the item and on optimisation needs. The methodology, based on proven theory, has been successfully applied to demonstration cases and has shown its practical feasibility.
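For orientation, the standard two-parameter Weibull relations underlying such probabilistic sizing and proof testing are sketched below in LaTeX; the symbols are generic (Weibull modulus m, characteristic strength \sigma_0, proof load \sigma_p), not values from the study, and effects such as proof-test damage and subcritical crack growth are neglected.

P_f(\sigma) \;=\; 1 - \exp\!\left[-\left(\frac{\sigma}{\sigma_0}\right)^{m}\right],
\qquad
P_f(\sigma \mid \text{survived } \sigma_p) \;=\;
\begin{cases}
  0, & \sigma \le \sigma_p,\\[4pt]
  \dfrac{P_f(\sigma) - P_f(\sigma_p)}{1 - P_f(\sigma_p)}, & \sigma > \sigma_p.
\end{cases}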
Verification of Ceramic Structures
NASA Astrophysics Data System (ADS)
Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit
2012-07-01
In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).
Mosakhani, N; Sarhadi, V; Panula, P; Partinen, M; Knuutila, S
2017-11-01
Narcolepsy is a neurological sleep disorder characterized by excessive daytime sleepiness and nighttime sleep disturbance. Among children and adolescents vaccinated with Pandemrix vaccine in Finland and Sweden, the number of narcolepsy cases increased. Our aim was to identify miRNAs involved in narcolepsy and their association with Pandemrix vaccination. We performed global miRNA profiling by miRNA microarrays followed by RT-PCR verification on 20 narcolepsy patients (Pandemrix-associated and Pandemrix-non-associated) and 17 controls (vaccinated and non-vaccinated). Between all narcolepsy patients and controls, 11 miRNAs were differentially expressed; 17 miRNAs showed significantly differential expression between Pandemrix-non-associated narcolepsy patients and non-vaccinated healthy controls. MiR-188-5p and miR-4499 were over-expressed in narcolepsy patients vs healthy controls. Two miRNAs, miR-1470 and miR-4455, were under-expressed in Pandemrix-associated narcolepsy patients vs Pandemrix-non-associated narcolepsy patients. We identified miRNA expression patterns in narcolepsy patients that linked them to mRNA targets known to be involved in brain-related pathways or brain disorders. © 2017 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Advanced turboprop testbed systems study
NASA Technical Reports Server (NTRS)
Goldsmith, I. M.
1982-01-01
The proof of concept, feasibility, and verification of the advanced prop fan and of the integrated advanced prop fan aircraft are established. The use of existing hardware is compatible with having a successfully expedited testbed ready for flight. A prop fan testbed aircraft is definitely feasible and necessary for verification of prop fan/prop fan aircraft integrity. The Allison T701 is most suitable as a propulsor and modification of existing engine and propeller controls are adequate for the testbed. The airframer is considered the logical overall systems integrator of the testbed program.
NASA Technical Reports Server (NTRS)
Owre, Sam; Shankar, Natarajan
1997-01-01
PVS (Prototype Verification System) is a general-purpose environment for developing specifications and proofs. This document deals primarily with the abstract datatype mechanism in PVS which generates theories containing axioms and definitions for a class of recursive datatypes. The concepts underlying the abstract datatype mechanism are illustrated using ordered binary trees as an example. Binary trees are described by a PVS abstract datatype that is parametric in its value type. The type of ordered binary trees is then presented as a subtype of binary trees where the ordering relation is also taken as a parameter. We define the operations of inserting an element into, and searching for an element in an ordered binary tree; the bulk of the report is devoted to PVS proofs of some useful properties of these operations. These proofs illustrate various approaches to proving properties of abstract datatype operations. They also describe the built-in capabilities of the PVS proof checker for simplifying abstract datatype expressions.
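The report's artifacts are PVS theories, but the two operations it specifies and reasons about translate directly into the following Python analogue (illustrative only); the property checked by the final assertion is of the same flavour as the PVS lemmas.

from dataclasses import dataclass
from typing import Optional

@dataclass
class Node:
    value: int
    left: Optional["Node"] = None
    right: Optional["Node"] = None

def insert(t: Optional[Node], x: int) -> Node:
    # Insert x, preserving the ordering invariant (left subtree < value < right subtree).
    if t is None:
        return Node(x)
    if x < t.value:
        return Node(t.value, insert(t.left, x), t.right)
    if x > t.value:
        return Node(t.value, t.left, insert(t.right, x))
    return t  # already present

def search(t: Optional[Node], x: int) -> bool:
    # Membership test that relies on the ordering invariant.
    if t is None:
        return False
    if x < t.value:
        return search(t.left, x)
    if x > t.value:
        return search(t.right, x)
    return True

t = None
for v in [5, 2, 8, 1]:
    t = insert(t, v)
assert search(t, 8) and not search(t, 7)   # the kind of property proved in PVS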
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Divito, Ben L.; Holloway, C. Michael
1994-01-01
In this paper the design and formal verification of the lower levels of the Reliable Computing Platform (RCP), a fault-tolerant computing system for digital flight control applications, are presented. The RCP uses NMR-style redundancy to mask faults and internal majority voting to flush the effects of transient faults. Two new layers of the RCP hierarchy are introduced: the Minimal Voting refinement (DA_minv) of the Distributed Asynchronous (DA) model and the Local Executive (LE) Model. Both the DA_minv model and the LE model are specified formally and have been verified using the Ehdm verification system. All specifications and proofs are available electronically via the Internet using anonymous FTP or World Wide Web (WWW) access.
Compositional Verification of a Communication Protocol for a Remotely Operated Vehicle
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar A.
2009-01-01
This paper presents the specification and verification in the Prototype Verification System (PVS) of a protocol intended to facilitate communication in an experimental remotely operated vehicle used by NASA researchers. The protocol is defined as a stack-layered composition of simpler protocols. It can be seen as the vertical composition of protocol layers, where each layer performs input and output message processing, and the horizontal composition of different processes concurrently inhabiting the same layer, where each process satisfies a distinct requirement. It is formally proven that the protocol components satisfy certain delivery guarantees. Compositional techniques are used to prove these guarantees also hold in the composed system. Although the protocol itself is not novel, the methodology employed in its verification extends existing techniques by automating the tedious and usually cumbersome part of the proof, thereby making the iterative design process of protocols feasible.
Middle School Children's Mathematical Reasoning and Proving Schemes
ERIC Educational Resources Information Center
Liu, Yating; Manouchehri, Azita
2013-01-01
In this work we explored proof schemes used by 41 middle school students when confronted with four mathematical propositions that demanded verification of accuracy of statements. The students' perception of mathematically complete vs. convincing arguments in different mathematics branches was also elicited. Lastly, we considered whether the…
Proving the correctness of the flight director program EADIFD, volume 1
NASA Technical Reports Server (NTRS)
Lee, F. J.; Maurer, W. D.
1977-01-01
EADIFD is written in symbolic assembly language for execution on the C4000 airborne computer. It is a subprogram of an aircraft navigation and guidance program and is used to generate pitch and roll command signals for use in terminal airspace. The proof of EADIFD was carried out by an inductive assertion method consisting of two parts, a verification condition generator and a source language independent proof checker. With the specifications provided by NASA, EADIFD was proved correct. The termination of the program is guaranteed and the program contains no instructions that can modify it under any conditions.
Using Automated Theorem Provers to Certify Auto-Generated Aerospace Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd; Schumann, Johann
2004-01-01
We describe a system for the automated certification of safety properties of NASA software. The system uses Hoare-style program verification technology to generate proof obligations which are then processed by an automated first-order theorem prover (ATP). For full automation, however, the obligations must be aggressively preprocessed and simplified. We describe the unique requirements this places on the ATP and demonstrate how the individual simplification stages, which are implemented by rewriting, influence the ability of the ATP to solve the proof tasks. Experiments on more than 25,000 tasks were carried out using Vampire, Spass, and e-setheo.
Testing First-Order Logic Axioms in AutoCert
NASA Technical Reports Server (NTRS)
Ahn, Ki Yung; Denney, Ewen
2009-01-01
AutoCert [2] is a formal verification tool for machine-generated code in safety-critical domains, such as aerospace control code generated from MathWorks Real-Time Workshop. AutoCert uses Automated Theorem Provers (ATPs) [5] based on First-Order Logic (FOL) to formally verify safety and functional correctness properties of the code. These ATPs try to build proofs based on user-provided domain-specific axioms, which can be arbitrary First-Order Formulas (FOFs). These axioms are the most crucial part of the trusted base, since proofs can be submitted to a proof checker, removing the need to trust the prover, and AutoCert itself plays the part of checking the code generator. However, formulating axioms correctly (i.e. precisely as the user had really intended) is non-trivial in practice. The challenge of axiomatization arises along several dimensions. First, the domain knowledge has its own complexity. AutoCert has been used to verify mathematical requirements on navigation software that carries out various geometric coordinate transformations involving matrices and quaternions. Axiomatic theories for such constructs are complex enough that mistakes are not uncommon. Second, adjusting axioms for ATPs can add even more complexity. The axioms frequently need to be modified in order to have them in a form suitable for use with ATPs. Such modifications tend to obscure the axioms further. Third, assessing the validity of the axioms from the output of existing ATPs is very hard since theorem provers typically do not give any examples or counterexamples.
Filippidis, Filippos T; Agaku, Israel T; Connolly, Gregory N; Vardavas, Constantine I
2014-04-01
This study assessed trends in age verification prior to cigarette sales to U.S. middle and high school students, and refusal to sell cigarettes to students aged <18 years during 2000-2009. Data were obtained from the 2000-2009 National Youth Tobacco Survey. Trends during 2000-2009 were assessed using binary logistic regression (p<0.05). The proportion of all students, who reported being asked to show proof of age prior to a cigarette purchase in the past 30 days did not change significantly between 2000 (46.9%) and 2009 (44.9%) (p=0.529 for linear trend). No significant trend in the proportion of students aged < 18 years who were refused a sale when attempting to buy cigarettes was observed between 2000 (39.8%) and 2009 (36.7%) (p=0.283 for linear trend). Refusal of a cigarette sale was significantly higher among under-aged boys compared to girls (adjusted odds ratio=1.48; 95% confidence interval: 1.28-1.70). About half of U.S. middle and high school students who reported making a cigarette purchase were not asked for proof of age, and about three of five under-aged buyers successfully made a cigarette purchase in 2009. Intensified implementation and enforcement of policies requiring age verification among youths is warranted to reduce access and use of tobacco products. Copyright © 2014 Elsevier Inc. All rights reserved.
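The linear-trend test mentioned above can be sketched as a binary logistic regression of the yes/no outcome on survey year; the data generated below are synthetic stand-ins for the NYTS microdata, and statsmodels is only one possible tool for the fit.

import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(3)
years = rng.integers(2000, 2010, size=5000)           # survey year per respondent
p_asked = 0.47 - 0.002 * (years - 2000)               # hypothetical, nearly flat trend
asked = rng.binomial(1, p_asked)                       # 1 = asked to show proof of age

X = sm.add_constant(years - 2000)                      # intercept + linear year term
fit = sm.Logit(asked, X).fit(disp=False)
print(fit.params)                                      # slope near zero -> no trend
print("p-value for linear trend:", fit.pvalues[1])     # compare against 0.05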
Numerical proof of stability of roll waves in the small-amplitude limit for inclined thin film flow
NASA Astrophysics Data System (ADS)
Barker, Blake
2014-10-01
We present a rigorous numerical proof based on interval arithmetic computations categorizing the linearized and nonlinear stability of periodic viscous roll waves of the KdV-KS equation modeling weakly unstable flow of a thin fluid film on an incline in the small-amplitude KdV limit. The argument proceeds by verification of a stability condition derived by Bar-Nepomnyashchy and Johnson-Noble-Rodrigues-Zumbrun involving inner products of various elliptic functions arising through the KdV equation. One key point in the analysis is a bootstrap argument balancing the extremely poor sup norm bounds for these functions against the extremely good convergence properties for analytic interpolation in order to obtain a feasible computation time. Another is the way of handling analytic interpolation in several variables by a two-step process carving up the parameter space into manageable pieces for rigorous evaluation. These and other general aspects of the analysis should serve as blueprints for more general analyses of spectral stability.
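The core mechanic of such a rigorous numerical proof is that every quantity is enclosed in an interval and a condition is accepted only when the entire enclosure satisfies it. The toy example below uses mpmath's interval context on a simple positivity check; the actual stability condition involves elliptic-function inner products that are not reproduced here.

from mpmath import iv

iv.dps = 30                                           # working precision (decimal digits)
x = iv.mpf(["0.333333333333", "0.333333333334"])      # interval enclosure of an input value
enclosure = iv.exp(x) - 1 - x                         # enclosure of exp(x) - 1 - x

# The quantity is certified positive only if the lower endpoint of the enclosure is positive.
print(enclosure)
assert enclosure.a > 0, "sign not rigorously verified at this precision"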
21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.
Code of Federal Regulations, 2013 CFR
2013-04-01
... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...
21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.
Code of Federal Regulations, 2012 CFR
2012-04-01
... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...
21 CFR 1311.25 - Requirements for obtaining a CSOS digital certificate.
Code of Federal Regulations, 2014 CFR
2014-04-01
... public keys, the corresponding private key must be used to sign the certificate request. Verification of the signature using the public key in the request will serve as proof of possession of the private key. ... certification of the public digital signature key. After the request is approved, the Certification Authority...
Dynamic characterization and microprocessor control of the NASA/UVA proof mass actuator
NASA Technical Reports Server (NTRS)
Zimmerman, D. C.; Inman, D. J.; Horner, G. C.
1984-01-01
The self-contained electromagnetic-reaction-type force-actuator system developed by NASA/UVA for the verification of spacecraft-structure vibration-control laws is characterized and demonstrated. The device is controlled by a dedicated microprocessor and has dynamic characteristics determined by Fourier analysis. Test data on a cantilevered beam are shown.
Experimental verification of a model of a two-link flexible, lightweight manipulator. M.S. Thesis
NASA Technical Reports Server (NTRS)
Huggins, James David
1988-01-01
Experimental verification is presented for an assumed modes model of a large, two-link, flexible manipulator designed and constructed in the School of Mechanical Engineering at Georgia Institute of Technology. The structure was designed to have typical characteristics of a lightweight manipulator.
Test/QA Plan for Verification of Microcystin Test Kits
Microcystin test kits are used to quantitatively measure total microcystin in recreational waters. These test kits are based on enzyme-linked immunosorbent assays (ELISA) with antibodies that bind specifically to microcystins or phosphate activity inhibition where the phosphatas...
Automatic Estimation of Verified Floating-Point Round-Off Errors via Static Analysis
NASA Technical Reports Server (NTRS)
Moscato, Mariano; Titolo, Laura; Dutle, Aaron; Munoz, Cesar A.
2017-01-01
This paper introduces a static analysis technique for computing formally verified round-off error bounds of floating-point functional expressions. The technique is based on a denotational semantics that computes a symbolic estimation of floating-point round-off errors along with a proof certificate that ensures its correctness. The symbolic estimation can be evaluated on concrete inputs using rigorous enclosure methods to produce formally verified numerical error bounds. The proposed technique is implemented in the prototype research tool PRECiSA (Program Round-off Error Certifier via Static Analysis) and used in the verification of floating-point programs of interest to NASA.
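PRECiSA itself is not shown here; the hand-made sketch below only illustrates what a round-off error bound means for a single expression, by comparing a double-precision evaluation against exact rational arithmetic and against the standard first-order worst-case bound built from the unit roundoff u = 2^-53.

from fractions import Fraction

def f_float(x, y, z):
    return (x + y) * z                       # each operation rounds to the nearest double

def f_exact(x, y, z):
    X, Y, Z = map(Fraction, (x, y, z))
    return (X + Y) * Z                       # exact rational arithmetic on the same inputs

x, y, z = 0.1, 0.2, 0.3
u = 2.0 ** -53
actual_err = abs(Fraction(f_float(x, y, z)) - f_exact(x, y, z))
# Two roundings, each with relative error <= u, give |error| <= |exact| * (2u + u^2).
bound = abs(f_exact(x, y, z)) * (2 * u + u * u)
print(float(actual_err), float(bound), actual_err <= bound)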
Synthesizing Safety Conditions for Code Certification Using Meta-Level Programming
NASA Technical Reports Server (NTRS)
Eusterbrock, Jutta
2004-01-01
In code certification the code consumer publishes a safety policy and the code producer generates a proof that the produced code is in compliance with the published safety policy. In this paper, a novel viewpoint approach towards an implementational re-use oriented framework for code certification is taken. It adopts ingredients from Necula's approach for proof-carrying code, but in this work safety properties can be analyzed on a higher code level than assembly language instructions. It consists of three parts: (1) The specification language is extended to include generic pre-conditions that shall ensure safety at all states that can be reached during program execution. Actual safety requirements can be expressed by providing domain-specific definitions for the generic predicates which act as interface to the environment. (2) The Floyd-Hoare inductive assertion method is refined to obtain proof rules that allow the derivation of the proof obligations in terms of the generic safety predicates. (3) A meta-interpreter is designed and experimentally implemented that enables automatic synthesis of proof obligations for submitted programs by applying the modified Floyd-Hoare rules. The proof obligations have two separate conjuncts, one for functional correctness and another for the generic safety obligations. Proof of the generic obligations, having provided the actual safety definitions as context, ensures domain-specific safety of program execution in a particular environment and is simpler than full program verification.
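As a schematic illustration (not taken from the paper) of a Floyd-Hoare proof obligation with a generic safety predicate, consider an array update under precondition P; the generated verification condition separates functional correctness from the generic safety conjunct, whose definition (here a hypothetical safe_idx) is supplied by the domain-specific safety policy:

\{\, P \,\}\; a[i] := e \;\{\, Q \,\}
\quad\Longrightarrow\quad
\underbrace{P \rightarrow Q[a \mapsto a\langle i := e\rangle]}_{\text{functional correctness}}
\;\wedge\;
\underbrace{P \rightarrow \mathit{safe\_idx}(a, i)}_{\text{generic safety obligation}}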
Towards Formal Verification of a Separation Microkernel
NASA Astrophysics Data System (ADS)
Butterfield, Andrew; Sanan, David; Hinchey, Mike
2013-08-01
The best approach to verifying an IMA separation kernel is to use a (fixed) time-space partitioning kernel with a multiple independent levels of separation (MILS) architecture. We describe an activity that explores the cost and feasibility of doing a formal verification of such a kernel to the Common Criteria (CC) levels mandated by the Separation Kernel Protection Profile (SKPP). We are developing a Reference Specification of such a kernel, and are using higher-order logic (HOL) to construct formal models of this specification and key separation properties. We then plan to do a dry run of part of a formal proof of those properties using the Isabelle/HOL theorem prover.
NASA Astrophysics Data System (ADS)
Zint, M.; Stock, K.; Graser, R.; Ertl, T.; Brauer, E.; Heyninck, J.; Vanbiervliet, J.; Dhondt, S.; De Ceuninck, P.; Hibst, R.
2015-03-01
The presented work describes the development and verification of a novel optical, powder-free intra-oral scanner based on chromatic confocal technology combined with a multifocal approach. The proof of concept for a chromatic confocal area scanner for intra-oral scanning is given. Several prototype scanners passed a verification process showing an average accuracy (distance deviation on flat surfaces) of less than 31μm +/- 21μm and a reproducibility of less than 4μm +/- 3μm. Compared to a tactile measurement on a full jaw model fitted with 4mm ceramic spheres the measured average distance deviation between the spheres was 49μm +/- 12μm for scans of up to 8 teeth (3- unit bridge, single Quadrant) and 104μm +/- 82μm for larger scans and full jaws. The average deviation of the measured sphere diameter compared to the tactile measurement was 27μm +/- 14μm. Compared to μCT scans of plaster models equipped with human teeth the average standard deviation on up to 3 units was less than 55μm +/- 49μm whereas the reproducibility of the scans was better than 22μm +/- 10μm.
20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?
Code of Federal Regulations, 2010 CFR
2010-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...
20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?
Code of Federal Regulations, 2011 CFR
2011-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...
20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?
Code of Federal Regulations, 2014 CFR
2014-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...
20 CFR 30.105 - What must DOE do after an employee or survivor files a claim?
Code of Federal Regulations, 2012 CFR
2012-04-01
... Filing Claims; Evidence and Burden of Proof; Special Procedures for Certain Cancer Claims Verification of... employment history provided by the claimant. Upon receipt of such a request, DOE will complete Form EE-5 as... concurs with the employment history provided by the claimant, that it disagrees with such history, or that...
Planning with Open Eyes and Open Hearts: An Alternative to Excessive Positivism
ERIC Educational Resources Information Center
O'Brien, John
2007-01-01
In this article, the author offers his critique on Holburn and Cea's notion on "excessive positivism" that person-centered planners are overconcerned with scientific verification and logical proof. The author believes that Holburn and Cea's notion blurs the important messages they have for person-centered planners by leading toward a debate about…
General Dynamic (GD) Launch Waveform On-Orbit Performance Report
NASA Technical Reports Server (NTRS)
Briones, Janette C.; Shalkhauser, Mary Jo
2014-01-01
The purpose of this report is to present the results from the GD SDR on-orbit performance testing using the launch waveform over TDRSS. The tests include the evaluation of well-tested waveform modes, the operation of RF links that are expected to have high margins, the verification of forward and return link operation (including full duplex), the verification of non-coherent operational models, and the verification of radio at-launch operational frequencies. This report also outlines the launch waveform tests conducted and comparisons to the results obtained from ground testing.
Report on the formal specification and partial verification of the VIPER microprocessor
NASA Technical Reports Server (NTRS)
Brock, Bishop; Hunt, Warren A., Jr.
1991-01-01
The VIPER microprocessor chip is partitioned into four levels of abstractions. At the highest level, VIPER is described with decreasingly abstract sets of functions in LCF-LSM. At the lowest level are the gate-level models in proprietary CAD languages. The block-level and gate-level specifications are also given in the ELLA simulation language. Among VIPER's deficiencies are the fact that there is no notion of external events in the top-level specification, and it is impossible to use the top-level specifications to prove abstract properties of programs running on VIPER computers. There is no complete proof that the gate-level specifications implement the top-level specifications. Cohn's proof that the major-state machine correctly implements the top-level specifications has no formal connection with any of the other proof attempts. None of the latter address resetting the machine, memory timeout, forced error, or single step modes.
NASA Technical Reports Server (NTRS)
Denney, Ewen W.; Fischer, Bernd
2009-01-01
Model-based development and automated code generation are increasingly used for production code in safety-critical applications, but since code generators are typically not qualified, the generated code must still be fully tested, reviewed, and certified. This is particularly arduous for mathematical and control engineering software which requires reviewers to trace subtle details of textbook formulas and algorithms to the code, and to match requirements (e.g., physical units or coordinate frames) not represented explicitly in models or code. Both tasks are complicated by the often opaque nature of auto-generated code. We address these problems by developing a verification-driven approach to traceability and documentation. We apply the AUTOCERT verification system to identify and then verify mathematical concepts in the code, based on a mathematical domain theory, and then use these verified traceability links between concepts, code, and verification conditions to construct a natural language report that provides a high-level structured argument explaining why and how the code uses the assumptions and complies with the requirements. We have applied our approach to generate review documents for several sub-systems of NASA's Project Constellation.
American Burn Association Consensus Statements
2013-08-01
stream infections, catheter-associated urinary tract infections, and ventilator-associated pneumonias in spite of many fewer catheters per 1000...multiple attempts the burn community has never been able to link ABA verification with enhanced reimbursement. It is clear that to accomplish this...goal, we must be able to link verification with improved outcomes, decreased mortality, shorter hospital stay, reduced hospital-acquired
NASA Technical Reports Server (NTRS)
Moser, Louise; Melliar-Smith, Michael; Schwartz, Richard
1987-01-01
A SIFT reliable aircraft control computer system, designed to meet the ultrahigh reliability required for safety critical flight control applications by use of processor replications and voting, was constructed for SRI, and delivered to NASA Langley for evaluation in the AIRLAB. To increase confidence in the reliability projections for SIFT, produced by a Markov reliability model, SRI constructed a formal specification, defining the meaning of reliability in the context of flight control. A further series of specifications defined, in increasing detail, the design of SIFT down to pre- and post-conditions on Pascal code procedures. Mechanically checked mathematical proofs were constructed to demonstrate that the more detailed design specifications for SIFT do indeed imply the formal reliability requirement. An additional specification defined some of the assumptions made about SIFT by the Markov model, and further proofs were constructed to show that these assumptions, as expressed by that specification, did indeed follow from the more detailed design specifications for SIFT. This report provides an outline of the methodology used for this hierarchical specification and proof, and describes the various specifications and proofs performed.
Control and Non-Payload Communications (CNPC) Prototype Radio Verification Test Report
NASA Technical Reports Server (NTRS)
Bishop, William D.; Frantz, Brian D.; Thadhani, Suresh K.; Young, Daniel P.
2017-01-01
This report provides an overview and results from the verification of the specifications that define the operational capabilities of the airborne and ground, L Band and C Band, Command and Non-Payload Communications radio link system. An overview of system verification is provided along with an overview of the operation of the radio. Measurement results are presented for verification of the radio's operation.
Bohata, J; Zvanovec, S; Pesek, P; Korinek, T; Mansour Abadi, M; Ghassemlooy, Z
2016-03-10
This paper describes the experimental verification of the utilization of long-term evolution radio over fiber (RoF) and radio over free space optics (RoFSO) systems using dual-polarization signals for cloud radio access network applications, determining the specific utilization limits. A number of free space optics configurations are proposed and investigated under different atmospheric turbulence regimes in order to recommend the best setup configuration. We show that the performance of the proposed link, based on the combination of RoF and RoFSO for 64 QAM at 2.6 GHz, is more affected by the turbulence, based on a measured error vector magnitude difference of 5.5%. It is further demonstrated that the proposed systems can offer higher noise immunity under particular scenarios with a signal-to-noise ratio reliability limit of 5 dB in the radio frequency domain for RoF and 19.3 dB in the optical domain for a combination of RoF and RoFSO links.
Blount, G.; Gorensek, M.; Hamm, L.; ...
2014-12-31
Partnering in Innovation, Inc. (Pi-Innovation) introduces an aqueous post-combustion carbon dioxide (CO₂) capture system (Pi-CO₂) that offers high market value by directly addressing the primary constraints limiting beneficial re-use markets (lowering parasitic energy costs, reducing delivered cost of capture, eliminating the need for special solvents, etc.). A highly experienced team has completed initial design, modeling, manufacturing verification, and financial analysis for commercial market entry. Coupled thermodynamic and thermal-hydraulic mass transfer modeling results fully support proof of concept. Pi-CO₂ has the potential to lower total cost and risk to levels sufficient to stimulate global demand for CO₂ from local industrial sources.
Verification or Proof: Justification of Pythagoras' Theorem in Chinese Mathematics Classrooms
ERIC Educational Resources Information Center
Huang, Rongjin
2005-01-01
This paper presents key findings of my research on the approaches to justification by investigating how a sample of teachers in Hong Kong and Shanghai taught the topic of Pythagoras' theorem. In this study, 8 Hong Kong videos taken from TIMSS 1999 Video Study and 11 Shanghai videos videotaped by the researcher comprised the database. It was found that…
State-Based Implicit Coordination and Applications
NASA Technical Reports Server (NTRS)
Narkawicz, Anthony J.; Munoz, Cesar A.
2011-01-01
In air traffic management, pairwise coordination is the ability to achieve separation requirements when conflicting aircraft simultaneously maneuver to solve a conflict. Resolution algorithms are implicitly coordinated if they provide coordinated resolution maneuvers to conflicting aircraft when only surveillance data, e.g., position and velocity vectors, is periodically broadcast by the aircraft. This paper proposes an abstract framework for reasoning about state-based implicit coordination. The framework consists of a formalized mathematical development that enables and simplifies the design and verification of implicitly coordinated state-based resolution algorithms. The use of the framework is illustrated with several examples of algorithms and formal proofs of their coordination properties. The work presented here supports the safety case for a distributed self-separation air traffic management concept where different aircraft may use different conflict resolution algorithms and be assured that separation will be maintained.
[Relevance of medical rehabilitation in disease management programmes].
Lüngen, M; Lauterbach, K W
2003-10-01
Disease management programmes will increasingly be introduced in Germany due to the new risk adjustment scheme. The first disease management programmes started in 2003 for breast cancer and diabetes mellitus type II. German rehabilitation will have to face several challenges. Disease management programmes are strongly based on the notion of evidence, so proof of the efficacy of a care-giving task should be available. Evidence for the specifically German rehabilitation treatments must therefore be verified. However, integration of rehabilitation in disease management programmes could lead to changes in the alignment of German rehabilitation. The essence of German rehabilitation, notably its holistic approach, could get lost with integration in disease management programmes.
NASA Technical Reports Server (NTRS)
Nickle, F. R.; Freeman, Arthur B.
1939-01-01
The safety of remotely operated vehicles depends on the correctness of the distributed protocol that facilitates the communication between the vehicle and the operator. A failure in this communication can result in catastrophic loss of the vehicle. To complicate matters, the communication system may be required to satisfy several, possibly conflicting, requirements. The design of protocols is typically an informal process based on successive iterations of a prototype implementation. Yet distributed protocols are notoriously difficult to get correct using such informal techniques. We present a formal specification of the design of a distributed protocol intended for use in a remotely operated vehicle, which is built from the composition of several simpler protocols. We demonstrate proof strategies that allow us to prove properties of each component protocol individually while ensuring that the property is preserved in the composition forming the entire system. Given that designs are likely to evolve as additional requirements emerge, we show how we have automated most of the repetitive proof steps to enable verification of rapidly changing designs.
SPECS: Secure and Privacy Enhancing Communications Schemes for VANETs
NASA Astrophysics Data System (ADS)
Chim, T. W.; Yiu, S. M.; Hui, L. C. K.; Jiang, Zoe L.; Li, Victor O. K.
Vehicular ad hoc network (VANET) is an emerging type of networks which facilitates vehicles on roads to communicate for driving safety. The basic idea is to allow arbitrary vehicles to broadcast ad hoc messages (e.g. traffic accidents) to other vehicles. However, this raises the concern of security and privacy. Messages should be signed and verified before they are trusted while the real identity of vehicles should not be revealed, but traceable by authorized party. Existing solutions either rely heavily on a tamper-proof hardware device, or cannot satisfy the privacy requirement and do not have an effective message verification scheme. In this paper, we provide a software-based solution which makes use of only two shared secrets to satisfy the privacy requirement and gives lower message overhead and at least 45% higher successful rate than previous solutions in the message verification phase using the bloom filter and the binary search techniques. We also provide the first group communication protocol to allow vehicles to authenticate and securely communicate with others in a group of known vehicles.
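The abstract names the Bloom filter and binary search as the mechanisms that speed up message verification; below is a minimal, generic Bloom-filter sketch of the membership check involved (parameters and hashing are illustrative, not the SPECS design).

import hashlib

class BloomFilter:
    def __init__(self, m_bits: int = 1024, k_hashes: int = 4):
        self.m, self.k = m_bits, k_hashes
        self.bits = bytearray(m_bits // 8)

    def _positions(self, item: bytes):
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(2, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: bytes):
        for p in self._positions(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def might_contain(self, item: bytes) -> bool:   # no false negatives, rare false positives
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._positions(item))

bf = BloomFilter()
bf.add(b"verified-signature-of-msg-42")
print(bf.might_contain(b"verified-signature-of-msg-42"))   # True
print(bf.might_contain(b"unknown-message"))                 # False with high probability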
An Event Driven Hybrid Identity Management Approach to Privacy Enhanced e-Health
Sánchez-Guerrero, Rosa; Almenárez, Florina; Díaz-Sánchez, Daniel; Marín, Andrés; Arias, Patricia; Sanvido, Fabio
2012-01-01
Credential-based authorization offers interesting advantages for ubiquitous scenarios involving limited devices such as sensors and personal mobile equipment: the verification can be done locally; it offers a lower computational cost than its competitors for issuing, storing, and verification; and it naturally supports rights delegation. The main drawback is the revocation of rights. Revocation requires handling potentially large revocation lists, or using protocols to check the revocation status, bringing extra communication costs not acceptable for sensors and other limited devices. Moreover, the effective revocation consent—considered as a privacy rule in sensitive scenarios—has not been fully addressed. This paper proposes an event-based mechanism empowering a new concept, the sleepyhead credentials, which allows time constraints and explicit revocation to be replaced by activating and deactivating authorization rights according to events. Our approach is to integrate this concept in IdM systems in a hybrid model supporting delegation, which can be an interesting alternative for scenarios where revocation of consent and user privacy are critical. The delegation includes a SAML compliant protocol, which we have validated through a proof-of-concept implementation. This article also explains the mathematical model describing the event-based model and offers estimations of the overhead introduced by the system. The paper focuses on health care scenarios, where we show the flexibility of the proposed event-based user consent revocation mechanism. PMID:22778634
An event driven hybrid identity management approach to privacy enhanced e-health.
Sánchez-Guerrero, Rosa; Almenárez, Florina; Díaz-Sánchez, Daniel; Marín, Andrés; Arias, Patricia; Sanvido, Fabio
2012-01-01
Credential-based authorization offers interesting advantages for ubiquitous scenarios involving limited devices such as sensors and personal mobile equipment: the verification can be done locally; it offers a lower computational cost than its competitors for issuing, storing, and verification; and it naturally supports rights delegation. The main drawback is the revocation of rights. Revocation requires handling potentially large revocation lists, or using protocols to check the revocation status, bringing extra communication costs not acceptable for sensors and other limited devices. Moreover, the effective revocation consent--considered as a privacy rule in sensitive scenarios--has not been fully addressed. This paper proposes an event-based mechanism empowering a new concept, the sleepyhead credentials, which allows time constraints and explicit revocation to be replaced by activating and deactivating authorization rights according to events. Our approach is to integrate this concept in IdM systems in a hybrid model supporting delegation, which can be an interesting alternative for scenarios where revocation of consent and user privacy are critical. The delegation includes a SAML compliant protocol, which we have validated through a proof-of-concept implementation. This article also explains the mathematical model describing the event-based mechanism and offers estimations of the overhead introduced by the system. The paper focuses on health care scenarios, where we show the flexibility of the proposed event-based user consent revocation mechanism.
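A toy sketch of the event-driven activation idea ("sleepyhead" credentials) described above, with invented event names and classes and ignoring the SAML protocol details:

```python
# Toy illustration of event-driven credential activation/deactivation in the
# spirit of "sleepyhead" credentials; the event names and classes are invented
# and do not reflect the paper's SAML-based protocol.
class SleepyCredential:
    def __init__(self, subject, right, activate_on, deactivate_on):
        self.subject, self.right = subject, right
        self.activate_on, self.deactivate_on = activate_on, deactivate_on
        self.active = False

    def handle(self, event):
        # Rights are dormant until an activating event and suspended again on a
        # deactivating event, so no explicit revocation list is consulted.
        if event == self.activate_on:
            self.active = True
        elif event == self.deactivate_on:
            self.active = False

    def authorizes(self, subject, right):
        return self.active and subject == self.subject and right == self.right

cred = SleepyCredential("dr_lee", "read_ehr:patient42",
                        activate_on="admission:patient42",
                        deactivate_on="discharge:patient42")

for event in ["admission:patient42", "discharge:patient42"]:
    cred.handle(event)
    print(event, "->", "granted" if cred.authorizes("dr_lee", "read_ehr:patient42") else "denied")
```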
An RFID solution for enhancing inpatient medication safety with real-time verifiable grouping-proof.
Chen, Yu-Yi; Tsai, Meng-Lin
2014-01-01
The occurrence of a medication error can threaten patient safety. The medication administration process is complex and cumbersome, and nursing staff are prone to error when they are tired. Proper Information Technology (IT) can assist the nurse in correct medication administration. We review a recent proposal regarding a leading-edge solution to enhance inpatient medication safety by using RFID technology. The proof mechanism is the kernel concept in their design and is worth studying in order to develop a well-designed grouping-proof scheme. Other RFID grouping-proof protocols could be similarly applied in administering physician orders. We improve on the weaknesses of previous works and develop a reading-order-independent RFID grouping-proof scheme in this paper. In our scheme, tags are queried and verified under the direct control of the authorized reader without connecting to the back-end database server. Immediate verification in our design makes this application more portable and efficient, and critical security issues have been analyzed using a threat model. Our scheme is suitable for the safe drug administration scenario and the drug package scenario in a hospital environment to enhance inpatient medication safety. It automatically checks for the correct drug unit-dose and appropriate inpatient treatments. Copyright © 2013. Published by Elsevier Ireland Ltd.
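For illustration only, a much-simplified order-independent grouping proof can be built by having each tag MAC the reader's nonce and XOR-combining the responses; this is a generic sketch under assumed keys, not the protocol proposed in the paper:

```python
# Simplified illustration (not the authors' protocol): an order-independent
# grouping proof where each tag MACs the reader's nonce and the proof is the
# XOR of all tag responses, so the same set of tags yields the same proof
# regardless of reading order. Keys and tag IDs are hypothetical.
import hmac, hashlib, os

tag_keys = {"tag_drugA": b"k1-secret", "tag_drugB": b"k2-secret", "tag_patient": b"k3-secret"}

def tag_response(key: bytes, nonce: bytes) -> bytes:
    return hmac.new(key, nonce, hashlib.sha256).digest()

def grouping_proof(keys, nonce):
    proof = bytes(32)
    for key in keys:
        proof = bytes(a ^ b for a, b in zip(proof, tag_response(key, nonce)))
    return proof

nonce = os.urandom(16)
p1 = grouping_proof([tag_keys["tag_drugA"], tag_keys["tag_drugB"], tag_keys["tag_patient"]], nonce)
p2 = grouping_proof([tag_keys["tag_patient"], tag_keys["tag_drugA"], tag_keys["tag_drugB"]], nonce)
print("order independent:", p1 == p2)  # a verifier holding the same keys recomputes and compares
```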
Shuttle Communications and Tracking Systems Modeling and TDRSS Link Simulations Studies
NASA Technical Reports Server (NTRS)
Chie, C. M.; Dessouky, K.; Lindsey, W. C.; Tsang, C. S.; Su, Y. T.
1985-01-01
An analytical simulation package (LinCsim) which allows the analytical verification of data transmission performance through TDRSS satellites was modified. The work involved the modeling of the user transponder, TDRS, TDRS ground terminal, and link dynamics for forward and return links based on the TDRSS performance specifications (4) and the critical design reviews. The scope of this effort has recently been expanded to include the effects of radio frequency interference (RFI) on the bit error rate (BER) performance of the S-band return links. The RFI environment and the modified TDRSS satellite and ground station hardware are being modeled in accordance with their description in the applicable documents.
Direct and full-scale experimental verifications towards ground-satellite quantum key distribution
NASA Astrophysics Data System (ADS)
Wang, Jian-Yu; Yang, Bin; Liao, Sheng-Kai; Zhang, Liang; Shen, Qi; Hu, Xiao-Fang; Wu, Jin-Cai; Yang, Shi-Ji; Jiang, Hao; Tang, Yan-Lin; Zhong, Bo; Liang, Hao; Liu, Wei-Yue; Hu, Yi-Hua; Huang, Yong-Mei; Qi, Bo; Ren, Ji-Gang; Pan, Ge-Sheng; Yin, Juan; Jia, Jian-Jun; Chen, Yu-Ao; Chen, Kai; Peng, Cheng-Zhi; Pan, Jian-Wei
2013-05-01
Quantum key distribution (QKD) provides the only intrinsically unconditionally secure method of communication, based on the principles of quantum mechanics. Compared with fibre-based demonstrations, free-space links could provide the most appealing solution for communication over much larger distances. Despite significant efforts, all realizations to date rely on stationary sites. Experimental verifications are therefore extremely crucial for applications to a typical low Earth orbit satellite. To achieve direct and full-scale verifications of our set-up, we have carried out three independent experiments with a decoy-state QKD system and overcome all of the relevant conditions. The system is operated on a moving platform (using a turntable), on a floating platform (using a hot-air balloon), and with a high-loss channel to demonstrate performance under conditions of rapid motion, attitude change, vibration, random movement of satellites, and a high-loss regime. The experiments address wide ranges of all leading parameters relevant to low Earth orbit satellites. Our results pave the way towards ground-satellite QKD and a global quantum communication network.
Formal Safety Certification of Aerospace Software
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2005-01-01
In principle, formal methods offer many advantages for aerospace software development: they can help to achieve ultra-high reliability, and they can be used to provide evidence of the reliability claims which can then be subjected to external scrutiny. However, despite years of research and many advances in the underlying formalisms of specification, semantics, and logic, formal methods are not much used in practice. In our opinion this is related to three major shortcomings. First, the application of formal methods is still expensive because they are labor- and knowledge-intensive. Second, they are difficult to scale up to complex systems because they are based on deep mathematical insights about the behavior of the systems (i.e., they rely on the "heroic proof"). Third, the proofs can be difficult to interpret, and typically stand in isolation from the original code. In this paper, we describe a tool for formally demonstrating safety-relevant aspects of aerospace software, which largely circumvents these problems. We focus on safety properties because it has been observed that safety violations such as out-of-bounds memory accesses or use of uninitialized variables constitute the majority of the errors found in the aerospace domain. In our approach, safety means that the program will not violate a set of rules that can range from simple memory access rules to high-level flight rules. These different safety properties are formalized as different safety policies in Hoare logic, which are then used by a verification condition generator along with the code and logical annotations in order to derive formal safety conditions; these are then proven using an automated theorem prover. Our certification system is currently integrated into a model-based code generation toolset that generates the annotations together with the code. However, this automated formal certification technology is not exclusively constrained to our code generator and could, in principle, also be integrated with other code generators such as Real-Time Workshop or even applied to legacy code. Our approach circumvents the historical problems with formal methods by increasing the degree of automation on all levels. The restriction to safety policies (as opposed to arbitrary functional behavior) results in simpler proof problems that can generally be solved by fully automatic theorem provers. An automated linking mechanism between the safety conditions and the code provides some of the traceability mandated by process standards such as DO-178B. An automated explanation mechanism uses semantic markup added by the verification condition generator to produce natural-language explanations of the safety conditions and thus supports their interpretation in relation to the code. An automatically generated certification browser lets users inspect the (generated) code along with the safety conditions (including textual explanations), and uses hyperlinks to automate tracing between the two levels. Here, the explanations reflect the logical structure of the safety obligation, but the mechanism can in principle be customized using different sets of domain concepts. The interface also provides some limited control over the certification process itself.
Our long-term goal is a seamless integration of certification, code generation, and manual coding that results in a "certified pipeline" in which specifications are automatically transformed into executable code, together with the supporting artifacts necessary for achieving and demonstrating the high level of assurance needed in the aerospace domain.
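To make the safety-policy idea concrete, here is a toy verification-condition generator for an array-bounds policy over a tiny invented straight-line language; it is only a sketch of the general approach, not the certification tool described above:

```python
# Toy illustration of the idea behind a safety-policy verification condition
# generator: for each array access in a tiny straight-line program we emit the
# obligation 0 <= index < length and discharge the trivially checkable ones.
# The mini-language and checker are invented for illustration only.
ARRAY_LEN = {"buf": 10}

program = [
    ("assign", "i", 3),          # i := 3
    ("store", "buf", "i"),       # buf[i] := ...
    ("assign", "j", 12),         # j := 12
    ("store", "buf", "j"),       # buf[j] := ...  (violates the policy)
]

def generate_vcs(prog):
    env, vcs = {}, []
    for op, *args in prog:
        if op == "assign":
            var, val = args
            env[var] = val
        elif op == "store":
            arr, idx_var = args
            idx = env.get(idx_var)
            provable = idx is not None and 0 <= idx < ARRAY_LEN[arr]
            vcs.append((f"0 <= {idx_var} < len({arr})", provable))
    return vcs

for condition, provable in generate_vcs(program):
    print(condition, "->", "discharged" if provable else "NOT provable")
```

A real verification condition generator would work over Hoare triples with the code's logical annotations and hand the resulting conditions to an automated theorem prover rather than a constant-folding check.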
The USEPA’s ToxCast program is developing a novel approach to chemical toxicity testing using high-throughput screening (HTS) assays to rapidly test thousands of chemicals against hundreds of in vitro molecular targets. This approach is based on the premise that in vitro HTS bioa...
Ultrasound functional imaging in an ex vivo beating porcine heart platform
NASA Astrophysics Data System (ADS)
Petterson, Niels J.; Fixsen, Louis S.; Rutten, Marcel C. M.; Pijls, Nico H. J.; van de Vosse, Frans N.; Lopata, Richard G. P.
2017-12-01
In recent years, novel ultrasound functional imaging (UFI) techniques have been introduced to assess cardiac function by measuring, e.g. cardiac output (CO) and/or myocardial strain. Verification and reproducibility assessment in a realistic setting remain major issues. Simulations and phantoms are often unrealistic, whereas in vivo measurements often lack crucial hemodynamic parameters or ground truth data, or suffer from the large physiological and clinical variation between patients when attempting clinical validation. Controlled validation in certain pathologies is cumbersome and often requires the use of lab animals. In this study, an isolated beating pig heart setup was adapted and used for performance assessment of UFI techniques such as volume assessment and ultrasound strain imaging. The potential of performing verification and reproducibility studies was demonstrated. For proof-of-principle, validation of UFI in pathological hearts was examined. Ex vivo porcine hearts (n = 6, slaughterhouse waste) were resuscitated and attached to a mock circulatory system. Radio frequency ultrasound data of the left ventricle were acquired in five short axis views and one long axis view. Based on these slices, the CO was measured, where verification was performed using flow sensor measurements in the aorta. Strain imaging was performed providing radial, circumferential and longitudinal strain to assess reproducibility and inter-subject variability under steady conditions. Finally, strains in healthy hearts were compared to a heart with an implanted left ventricular assist device, simulating a failing, supported heart. Good agreement between ultrasound and flow sensor based CO measurements was found. Strains were highly reproducible (intraclass correlation coefficients >0.8). Differences were found due to biological variation and condition of the hearts. Strain magnitude and patterns in the assisted heart were available for different pump action, revealing large changes compared to the normal condition. The setup provides a valuable benchmarking platform for UFI techniques. Future studies will include work on different pathologies and other means of measurement verification.
Verification of Concurrent Programs. Part II. Temporal Proof Principles.
1981-09-01
not modify any of the shared program variables. In order to ensure the correct synchronization between the processes we use three semaphore variables...direct, simple, and intuitive rules for the establishment of these properties. They usually replace long but repetitively similar chains of primitive ...modify the variables on which Q actually depends. A typical case is that of semaphores. We have the following property: The Semaphore Variable Rule
A resonance based model of biological evolution
NASA Astrophysics Data System (ADS)
Damasco, Achille; Giuliani, Alessandro
2017-04-01
We propose a coarse-grained physical model of evolution. The proposed model is, at least in principle, amenable to experimental verification, even if this looks like a conundrum: evolution is a unique historical process and the tape cannot be reversed and played again. Nevertheless, we can imagine a phenomenological scenario tailored upon state transitions in physical chemistry, in which different agents of evolution play the role of the elements of a state transition, like thermal noise or resonance effects. The abstract model we propose can be of help for sketching hypotheses and getting rid of some well-known features of natural history like the so-called Cambrian explosion. The possibility of an experimental proof of the model is discussed as well.
Deciding Full Branching Time Logic by Program Transformation
NASA Astrophysics Data System (ADS)
Pettorossi, Alberto; Proietti, Maurizio; Senni, Valerio
We present a method, based on logic program transformation, for verifying Computation Tree Logic (CTL*) properties of finite state reactive systems. The finite state systems and the CTL* properties we want to verify are encoded as logic programs on infinite lists. Our verification method consists of two steps. In the first step we transform the logic program that encodes the given system and the given property into a monadic ω-program, that is, a stratified program defining nullary or unary predicates on infinite lists. This transformation is performed by applying unfold/fold rules that preserve the perfect model of the initial program. In the second step we verify the property of interest by using a proof method for monadic ω-programs.
NASA Technical Reports Server (NTRS)
Defeo, P.; Doane, D.; Saito, J.
1982-01-01
A Digital Flight Control Systems Verification Laboratory (DFCSVL) has been established at NASA Ames Research Center. This report describes the major elements of the laboratory, the research activities that can be supported in the area of verification and validation of digital flight control systems (DFCS), and the operating scenarios within which these activities can be carried out. The DFCSVL consists of a palletized dual-dual flight-control system linked to a dedicated PDP-11/60 processor. Major software support programs are hosted in a remotely located UNIVAC 1100 accessible from the PDP-11/60 through a modem link. Important features of the DFCSVL include extensive hardware and software fault insertion capabilities, a real-time closed loop environment to exercise the DFCS, an integrated set of software verification tools, and a user-oriented interface to all the resources and capabilities.
Mapping {sup 15}O Production Rate for Proton Therapy Verification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Grogg, Kira; Alpert, Nathaniel M.; Zhu, Xuping
Purpose: This work was a proof-of-principle study for the evaluation of oxygen-15 ({sup 15}O) production as an imaging target through the use of positron emission tomography (PET), to improve verification of proton treatment plans and to study the effects of perfusion. Methods and Materials: Dynamic PET measurements of irradiation-produced isotopes were made for a phantom and rabbit thigh muscles. The rabbit muscle was irradiated and imaged under both live and dead conditions. A differential equation was fitted to phantom and in vivo data, yielding estimates of {sup 15}O production and clearance rates, which were compared to live versus dead rates for the rabbit and to Monte Carlo predictions. Results: PET clearance rates agreed with decay constants of the dominant radionuclide species in 3 different phantom materials. In 2 oxygen-rich materials, the ratio of {sup 15}O production rates agreed with the expected ratio. In the dead rabbit thighs, the dynamic PET concentration histories were accurately described using {sup 15}O decay constant, whereas the live thigh activity decayed faster. Most importantly, the {sup 15}O production rates agreed within 2% (P>.5) between conditions. Conclusions: We developed a new method for quantitative measurement of {sup 15}O production and clearance rates in the period immediately following proton therapy. Measurements in the phantom and rabbits were well described in terms of {sup 15}O production and clearance rates, plus a correction for other isotopes. These proof-of-principle results support the feasibility of detailed verification of proton therapy treatment delivery. In addition, {sup 15}O clearance rates may be useful in monitoring permeability changes due to therapy.
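A minimal sketch of the kind of dynamic-fit analysis described, using a one-compartment production/clearance model with synthetic numbers (this is not the study's analysis code):

```python
# Sketch (not the study's code) of estimating an isotope production rate P and
# clearance rate k from a dynamic activity curve, using dC/dt = P*beam_on(t) - k*C:
# growth (P/k)(1 - exp(-k t)) during the beam, exponential clearance afterwards.
# All numbers below are synthetic placeholders.
import numpy as np
from scipy.optimize import curve_fit

T_BEAM = 60.0  # s of irradiation (assumed)

def activity(t, P, k):
    t = np.asarray(t, dtype=float)
    grow = (P / k) * (1.0 - np.exp(-k * np.minimum(t, T_BEAM)))   # during beam
    decay = np.exp(-k * np.clip(t - T_BEAM, 0.0, None))           # after beam-off
    return grow * decay

rng = np.random.default_rng(0)
t = np.linspace(0, 600, 121)
true_P, true_k = 50.0, np.log(2) / 122.2          # 15O half-life ~122 s
data = activity(t, true_P, true_k) + rng.normal(0, 20, t.size)

(P_fit, k_fit), _ = curve_fit(activity, t, data, p0=[10.0, 0.01],
                              bounds=([0.0, 1e-4], [np.inf, 1.0]))
print(f"P = {P_fit:.1f} (true {true_P}),  k = {k_fit:.5f} 1/s (true {true_k:.5f})")
```

In live tissue the fitted clearance rate would exceed the physical decay constant because perfusion removes activity as well, which is the contrast the study exploits.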
Human factors engineering verification and validation for APR1400 computerized control room
DOE Office of Scientific and Technical Information (OSTI.GOV)
Shin, Y. C.; Moon, H. K.; Kim, J. H.
2006-07-01
This paper introduces the Advanced Power Reactor 1400 (APR1400) HFE V and V activities the Korea Hydro Nuclear Plant Co. LTD. (KHNP) has performed for the last 10 years and some of the lessons learned through these activities. The features of APR1400 main control room include large display panel, redundant compact workstations, computer-based procedure, and safety console. Several iterations of human factors evaluations have been performed from small scale proof of concept tests to large scale integrated system tests for identifying human engineering deficiencies in the human system interface design. Evaluations in the proof of concept test were focused on checking the presence of any show stopper problems in the design concept. Later evaluations were mostly for finding design problems and for assuring the resolution of human factors issues of advanced control room. The results of design evaluations were useful not only for refining the control room design, but also for licensing the standard design. Several versions of APR1400 mock-ups with dynamic simulation models of currently operating Korea Standard Nuclear Plant (KSNP) have been used for the evaluations with the participation of operators from KSNP plants. (authors)
Position paper: the science of deep specification.
Appel, Andrew W; Beringer, Lennart; Chlipala, Adam; Pierce, Benjamin C; Shao, Zhong; Weirich, Stephanie; Zdancewic, Steve
2017-10-13
We introduce our efforts within the project 'The science of deep specification' to work out the key formal underpinnings of industrial-scale formal specifications of software and hardware components, anticipating a world where large verified systems are routinely built out of smaller verified components that are also used by many other projects. We identify an important class of specification that has already been used in a few experiments that connect strong component-correctness theorems across the work of different teams. To help popularize the unique advantages of that style, we dub it deep specification, and we say that it encompasses specifications that are rich, two-sided, formal and live (terms that we define in the article). Our core team is developing a proof-of-concept system (based on the Coq proof assistant) whose specification and verification work is divided across largely decoupled subteams at our four institutions, encompassing hardware microarchitecture, compilers, operating systems and applications, along with cross-cutting principles and tools for effective specification. We also aim to catalyse interest in the approach, not just by basic researchers but also by users in industry. This article is part of the themed issue 'Verified trustworthy software systems'. © 2017 The Author(s).
Secure Oblivious Hiding, Authentication, Tamper Proofing, and Verification Techniques
2002-08-01
compressing the bit-planes. The algorithm always starts with inspecting the 5th LSB plane. For color images, all three color-channels are compressed...use classical encryption engines, such as IDEA or DES. These algorithms have a fixed encryption block size, and, depending on the image dimensions, we...information can be stored either in a separate file, in the image header, or embedded in the image itself utilizing the modern concepts of steganography
Formal System Verification - Extension 2
2012-08-08
vision of truly trustworthy systems has been to provide a formally verified microkernel basis. We have previously developed the seL4 microkernel...together with a formal proof (in the theorem prover Isabelle/HOL) of its functional correctness [6]. This means that all the behaviours of the seL4 C...source code are included in the high-level, formal specification of the kernel. This work enabled us to provide further formal guarantees about seL4, in
NASA Astrophysics Data System (ADS)
Acero, R.; Santolaria, J.; Pueo, M.; Aguilar, J. J.; Brau, A.
2015-11-01
High-range measuring equipment like laser trackers needs large-dimension calibrated reference artifacts in its calibration and verification procedures. In this paper, a new verification procedure for portable coordinate measuring instruments, based on the generation and evaluation of virtual distances with an indexed metrology platform, is developed. This methodology enables the definition of an unlimited number of reference distances without materializing them in a physical gauge to be used as a reference. The generation of the virtual points and the reference lengths derived from them is linked to the concept of the indexed metrology platform and to knowledge of the relative position and orientation of its upper and lower platforms with high accuracy. It is the measuring instrument, together with the indexed metrology platform, that remains still, while the virtual mesh rotates around them. As a first step, the virtual distances technique is applied to a laser tracker in this work. The experimental verification procedure of the laser tracker with virtual distances is simulated and further compared with the conventional verification procedure of the laser tracker with the indexed metrology platform. The results obtained in terms of volumetric performance of the laser tracker proved the suitability of the virtual distances methodology in calibration and verification procedures for portable coordinate measuring instruments, broadening and expanding the possibilities for the definition of reference distances in these procedures.
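A simplified numerical illustration of the virtual-distance idea, assuming a single target and an ideal, exactly known platform rotation (not the paper's algorithm or error model):

```python
# Simplified illustration (not the paper's procedure): a single physical target,
# seen from two indexed platform orientations whose relative rotation is known
# with high accuracy, defines a pair of "virtual points" whose separation can be
# predicted from the rotation and compared with the instrument's own measurement.
import numpy as np

def rot_z(theta_deg):
    t = np.radians(theta_deg)
    return np.array([[np.cos(t), -np.sin(t), 0.0],
                     [np.sin(t),  np.cos(t), 0.0],
                     [0.0,        0.0,       1.0]])

target = np.array([3.0, 0.5, 0.2])           # target position in the platform frame (assumed)

# Virtual points: the target expressed in the instrument frame at two platform indexes.
p0 = rot_z(0.0) @ target
p1 = rot_z(60.0) @ target

reference = np.linalg.norm(p1 - p0)           # reference length from the calibrated rotation
measured = reference + 0.00004                # instrument's measured length (synthetic error)
print(f"reference {reference:.6f} m, deviation {abs(measured - reference) * 1e6:.1f} um")
```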
Linking individual medicare health claims data with work-life claims and other administrative data.
Mokyr Horner, Elizabeth; Cullen, Mark R
2015-09-30
Researchers investigating health outcomes for populations over age 65 can utilize Medicare claims data, but these data include no direct information about individuals' health prior to age 65 and are not typically linkable to files containing data on exposures and behaviors during their worklives. The current paper is a proof of concept of merging employers' administrative data and private, employment-based health claims with Medicare data. Characteristics of the linked data, including sensitivity and specificity, are evaluated with an eye toward potential uses of such linked data. This paper uses a sample of former manufacturing workers from an industrial cohort as a test case. The dataset created by this integration could be useful to research in areas such as social epidemiology and occupational health. Medicare and employment administrative data were linked for a large cohort of manufacturing workers (employed at some point during 1996-2008) who transitioned onto Medicare between 2001 and 2009. Data on work-life health, including biometric indicators, were used to predict health at age 65 and to investigate the concordance of employment-based insurance claims with subsequent Medicare insurance claims. Chronic diseases were found to have relatively high levels of concordance between employment-based private insurance and subsequent Medicare insurance. Information about patient health prior to receipt of Medicare, including biometric indicators, was found to predict health at age 65. Combining these data allows for evaluation of continuous health trajectories, as well as modeling later-life health as a function of work-life behaviors and exposures. It also provides a potential endpoint for occupational health research. This is the first harmonization of its kind, providing a proof of concept. The dataset created by this integration could be useful for research in areas such as social epidemiology and occupational health.
LIFDAR: A Diagnostic Tool for the Ionosphere
NASA Astrophysics Data System (ADS)
Kia, O. E.; Rodgers, C. T.; Batholomew, J. L.
2011-12-01
ITT Corporation proposes a novel system to measure and monitor the ion species within the Earth's ionosphere, called Laser Induced Fluorescence Detection and Ranging (LIFDAR). Unlike current ionosphere measurements that detect electrons and the magnetic field, LIFDAR remotely measures the major ion species contributing to the electron plasma. The LIFDAR dataset has the added capability to demonstrate stratification and classification of the layers of the ionosphere to ultimately give a true tomographic view. We propose a proof-of-concept study using existing atmospheric LIDAR sensors combined with a mountaintop observatory for a single ion species that is prevalent in all layers of the atmosphere. We envision the LIFDAR concept will enable verification, validation, and exploration of the physics of the magneto-hydrodynamic models used in the ionosphere forecasting community. The LIFDAR dataset will provide the necessary ion and electron density data for the system-wide data gap. To begin the proof of concept, we present the science justification of the LIFDAR system based on the modeled photon budget. This analysis is based on the fluorescence of ionized oxygen within the ionosphere versus altitude. We use existing model abundance data of the ionosphere during normal and perturbed states. We propagate the photon uncertainties from the laser source through the atmosphere to the plasma and back to the collecting optics and detector. We calculate the expected photon budget to determine signal-to-noise estimates based on the targeted altitude and detection efficiency. Finally, we use these results to derive a LIFDAR observation strategy compatible with operational parameters.
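An order-of-magnitude photon-budget sketch in the spirit of the analysis described; every value below is an assumed placeholder rather than a LIFDAR design number:

```python
# Back-of-the-envelope photon budget in the spirit of a lidar/LIF range equation;
# all parameter values are assumed placeholders, not LIFDAR design values.
import math

E_pulse = 0.5                 # J per laser pulse (assumed)
wavelength = 500e-9           # m (assumed)
h, c = 6.626e-34, 3.0e8
n_tx = E_pulse * wavelength / (h * c)          # transmitted photons per pulse

range_m = 300e3               # slant range to the ionospheric layer (assumed)
aperture_d = 1.0              # m telescope diameter (assumed)
omega_frac = (math.pi * (aperture_d / 2) ** 2) / (4 * math.pi * range_m ** 2)

p_scatter = 1e-6              # probability a photon is re-emitted toward the receiver (assumed)
eta = 0.05                    # optics + detector efficiency (assumed)
n_pulses = 1000               # pulses integrated

signal = n_tx * p_scatter * omega_frac * eta * n_pulses
background = 0.1 * n_pulses   # background counts per pulse (assumed)
snr = signal / math.sqrt(signal + background)   # Poisson-limited estimate
print(f"signal counts ~ {signal:.1f}, Poisson-limited SNR ~ {snr:.1f}")
```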
A formally verified algorithm for interactive consistency under a hybrid fault model
NASA Technical Reports Server (NTRS)
Lincoln, Patrick; Rushby, John
1993-01-01
Consistent distribution of single-source data to replicated computing channels is a fundamental problem in fault-tolerant system design. The 'Oral Messages' (OM) algorithm solves this problem of Interactive Consistency (Byzantine Agreement) assuming that all faults are worst-case. Thambidurai and Park introduced a 'hybrid' fault model that distinguished three fault modes: asymmetric (Byzantine), symmetric, and benign; they also exhibited, along with an informal 'proof of correctness', a modified version of OM. Unfortunately, their algorithm is flawed. The discipline of mechanically checked formal verification eventually enabled us to develop a correct algorithm for Interactive Consistency under the hybrid fault model. This algorithm withstands $a$ asymmetric, $s$ symmetric, and $b$ benign faults simultaneously, using $m+1$ rounds, provided $n > 2a + 2s + b + m$ and $m \geq a$. We present this algorithm, discuss its subtle points, and describe its formal specification and verification in PVS. We argue that formal verification systems such as PVS are now sufficiently effective that their application to fault-tolerance algorithms should be considered routine.
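For context, a toy simulation of the classic recursive Oral Messages algorithm OM(m), with the hybrid-fault bound used only to size the system; this is the unmodified OM with a simple value-flipping fault, not the corrected hybrid-fault algorithm verified in PVS:

```python
# Toy simulation of the classic recursive Oral Messages algorithm OM(m) with a
# simple per-receiver value-flipping fault -- an illustration of the style of
# algorithm being discussed, NOT the corrected hybrid-fault algorithm.
from collections import Counter

def send(value, src, dst, faulty):
    # An arbitrarily faulty sender sends different (flipped) values per receiver.
    return (value + 1 + dst) % 2 if src in faulty else value

def om(m, commander, value, lieutenants, faulty):
    """Return the value each lieutenant decides on."""
    received = {p: send(value, commander, p, faulty) for p in lieutenants}
    if m == 0:
        return received
    # Each lieutenant q relays what it received, acting as commander of OM(m-1).
    relayed = {q: om(m - 1, q, received[q], [p for p in lieutenants if p != q], faulty)
               for q in lieutenants}
    decided = {}
    for p in lieutenants:
        votes = [received[p]] + [relayed[q][p] for q in lieutenants if q != p]
        decided[p] = Counter(votes).most_common(1)[0][0]
    return decided

a, s, b, m = 1, 0, 0, 1
n = 2 * a + 2 * s + b + m + 1           # smallest n satisfying n > 2a + 2s + b + m
nodes = list(range(n))
faulty = {1}                             # one arbitrarily faulty lieutenant
result = om(m, 0, 1, nodes[1:], faulty)  # non-faulty commander broadcasts value 1
good = [v for p, v in result.items() if p not in faulty]
print("agreement among non-faulty lieutenants:", len(set(good)) == 1)
```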
Acoustic emission frequency discrimination
NASA Technical Reports Server (NTRS)
Sugg, Frank E. (Inventor); Graham, Lloyd J. (Inventor)
1988-01-01
In acoustic emission nondestructive testing, broadband frequency noise is distinguished from narrow banded acoustic emission signals, since the latter are valid events indicative of structural flaws in the material being examined. This is accomplished by separating out those signals which contain frequency components both within and beyond (either above or below) the range of valid acoustic emission events. Application to acoustic emission monitoring during nondestructive bond verification and proof loading of undensified tiles on the Space Shuttle Orbiter is considered.
Physical cryptographic verification of nuclear warheads
Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.
2016-01-01
How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times. PMID:27432959
Model Checking Failed Conjectures in Theorem Proving: A Case Study
NASA Technical Reports Server (NTRS)
Pike, Lee; Miner, Paul; Torres-Pomales, Wilfredo
2004-01-01
Interactive mechanical theorem proving can provide high assurance of correct design, but it can also be a slow iterative process. Much time is spent determining why a proof of a conjecture is not forthcoming. In some cases, the conjecture is false and in others, the attempted proof is insufficient. In this case study, we use the SAL family of model checkers to generate a concrete counterexample to an unproven conjecture specified in the mechanical theorem prover, PVS. The focus of our case study is the ROBUS Interactive Consistency Protocol. We combine the use of a mechanical theorem prover and a model checker to expose a subtle flaw in the protocol that occurs under a particular scenario of faults and processor states. Uncovering the flaw allows us to mend the protocol and complete its general verification in PVS.
Physical cryptographic verification of nuclear warheads
NASA Astrophysics Data System (ADS)
Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; Vavrek, Jayson R.
2016-08-01
How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.
Physical cryptographic verification of nuclear warheads.
Kemp, R Scott; Danagoulian, Areg; Macdonald, Ruaridh R; Vavrek, Jayson R
2016-08-02
How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. These techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.
The EPA's National Risk Management Research Laboratory (NRMRL) and its verification organization partner, Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center recently evaluated the performance of the Abraxis Ecologenia Ethynylestradiol (EE2) ...
A Comparison of Single-Cycle Versus Multiple-Cycle Proof Testing Strategies
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Millwater, H. R.; Russell, D. A.; Millwater, H. R.
1999-01-01
Single-cycle and multiple-cycle proof testing (SCPT and MCPT) strategies for reusable aerospace propulsion system components are critically evaluated and compared from a rigorous elastic-plastic fracture mechanics perspective. Earlier MCPT studies are briefly reviewed. New J-integral estimation methods for semielliptical surface cracks and cracks at notches are derived and validated. Engineering methods are developed to characterize crack growth rates during elastic-plastic fatigue crack growth (FCG) and the tear-fatigue interaction near instability. Surface crack growth experiments are conducted with Inconel 718 to characterize tearing resistance, FCG under small-scale yielding and elastic-plastic conditions, and crack growth during simulated MCPT. Fractography and acoustic emission studies provide additional insight. The relative merits of SCPT and MCPT are directly compared using a probabilistic analysis linked with an elastic-plastic crack growth computer code. The conditional probability of failure in service is computed for a population of components that have survived a previous proof test, based on an assumed distribution of initial crack depths. Parameter studies investigate the influence of proof factor, tearing resistance, crack shape, initial crack depth distribution, and notches on the MCPT versus SCPT comparison. The parameter studies provide a rational basis to formulate conclusions about the relative advantages and disadvantages of SCPT and MCPT. Practical engineering guidelines are proposed to help select the optimum proof test protocol in a given application.
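A Monte Carlo sketch of the conditional-probability-of-failure calculation described above, using a deliberately simplified Paris-law growth model and placeholder parameters rather than the report's elastic-plastic analysis:

```python
# Monte Carlo sketch (placeholder parameters, simplified through-crack and
# Paris-law idealizations) of the conditional probability of failure in service
# for components that survived a single-cycle proof test.
import numpy as np

rng = np.random.default_rng(1)
N = 200_000
a0 = rng.lognormal(mean=np.log(1e-3), sigma=1.0, size=N)     # initial crack depth (m), assumed

def critical_depth(stress_mpa, k_ic=60.0):
    # a_crit from K = stress * sqrt(pi * a) = K_Ic (simple through-crack idealization)
    return (k_ic / (stress_mpa * np.sqrt(np.pi))) ** 2

service_stress, proof_factor = 300.0, 1.2
survived = a0[a0 < critical_depth(proof_factor * service_stress)]   # proof test screens deep cracks

# Simplified service life: Paris law da/dN = C * (dK)^m integrated in coarse blocks.
C, m_exp, n_cycles, n_blocks = 1e-12, 3.0, 100_000, 20
a = survived.copy()
for _ in range(n_blocks):
    dK = service_stress * np.sqrt(np.pi * a)          # MPa*sqrt(m)
    a = a + C * dK ** m_exp * (n_cycles / n_blocks)

p_fail = np.mean(a >= critical_depth(service_stress))
print(f"conditional probability of service failure: {p_fail:.2e}")
```

Repeating the screening step with several proof cycles (and a crack-growth increment between them) is the kind of comparison the report carries out with a far more detailed elastic-plastic model.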
A Comparison of Single-Cycle Versus Multiple-Cycle Proof Testing Strategies
NASA Technical Reports Server (NTRS)
McClung, R. C.; Chell, G. G.; Millwater, H. R.; Russell, D. A.; Orient, G. E.
1996-01-01
Single-cycle and multiple-cycle proof testing (SCPT and MCPT) strategies for reusable aerospace propulsion system components are critically evaluated and compared from a rigorous elastic-plastic fracture mechanics perspective. Earlier MCPT studies are briefly reviewed. New J-integral estimation methods for semi-elliptical surface cracks and cracks at notches are derived and validated. Engineering methods are developed to characterize crack growth rates during elastic-plastic fatigue crack growth (FCG) and the tear-fatigue interaction near instability. Surface crack growth experiments are conducted with Inconel 718 to characterize tearing resistance, FCG under small-scale yielding and elastic-plastic conditions, and crack growth during simulated MCPT. Fractography and acoustic emission studies provide additional insight. The relative merits of SCPT and MCPT are directly compared using a probabilistic analysis linked with an elastic-plastic crack growth computer code. The conditional probability of failure in service is computed for a population of components that have survived a previous proof test, based on an assumed distribution of initial crack depths. Parameter studies investigate the influence of proof factor, tearing resistance, crack shape, initial crack depth distribution, and notches on the MCPT vs. SCPT comparison. The parameter studies provide a rational basis to formulate conclusions about the relative advantages and disadvantages of SCPT and MCPT. Practical engineering guidelines are proposed to help select the optimum proof test protocol in a given application.
Demonstration of measurement-only blind quantum computing
NASA Astrophysics Data System (ADS)
Greganti, Chiara; Roehsner, Marie-Christine; Barz, Stefanie; Morimae, Tomoyuki; Walther, Philip
2016-01-01
Blind quantum computing allows for secure cloud networks of quasi-classical clients and a fully fledged quantum server. Recently, a new protocol has been proposed, which requires a client to perform only measurements. We demonstrate a proof-of-principle implementation of this measurement-only blind quantum computing, exploiting a photonic setup to generate four-qubit cluster states for computation and verification. Feasible technological requirements for the client and the device-independent blindness make this scheme very applicable for future secure quantum networks.
The augmented Lagrangian method for parameter estimation in elliptic systems
NASA Technical Reports Server (NTRS)
Ito, Kazufumi; Kunisch, Karl
1990-01-01
In this paper a new technique for the estimation of parameters in elliptic partial differential equations is developed. It is a hybrid method combining the output-least-squares and the equation error method. The new method is realized by an augmented Lagrangian formulation, and convergence as well as rate of convergence proofs are provided. Technically the critical step is the verification of a coercivity estimate of an appropriately defined Lagrangian functional. To obtain this coercivity estimate a seminorm regularization technique is used.
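A hedged sketch, in standard notation, of the kind of hybrid output-least-squares/equation-error augmented Lagrangian functional being described, for the model problem of estimating a diffusion coefficient (the notation is illustrative, not taken from the paper):

```latex
% Sketch of a hybrid OLS/equation-error augmented Lagrangian for estimating the
% coefficient $q$ in $-\nabla\cdot(q\nabla u)=f$ from data $z$; notation assumed.
\begin{align*}
  e(q,u) &= -\nabla\cdot(q\nabla u) - f
    && \text{(equation error, measured in a suitable dual norm)}\\
  L_c(q,u;\lambda) &= \tfrac{1}{2}\,\|u - z\|_{L^2}^{2}
    + \langle \lambda,\, e(q,u)\rangle
    + \tfrac{c}{2}\,\|e(q,u)\|^{2},
\end{align*}
```

The functional is minimized over admissible pairs $(q,u)$ with the multiplier updated by $\lambda^{k+1} = \lambda^{k} + c\, e(q^{k},u^{k})$; as the abstract notes, the convergence argument hinges on a coercivity estimate for this Lagrangian near the solution.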
A 300 GHz collective scattering diagnostic for low temperature plasmas.
Hardin, Robert A; Scime, Earl E; Heard, John
2008-10-01
A compact and portable 300 GHz collective scattering diagnostic employing a homodyne detection scheme has been constructed and installed on the hot helicon experiment (HELIX). Verification of the homodyne detection scheme was accomplished with a rotating grooved aluminum wheel to Doppler shift the interaction beam. The HELIX chamber geometry and collection optics allow measurement of scattering angles ranging from 60 degrees to 90 degrees. Artificially driven ion-acoustic waves are also being investigated as a proof-of-principle test for the diagnostic system.
Solar energy heating system design package for a single-family residence at New Castle, Pennsylvania
NASA Technical Reports Server (NTRS)
1977-01-01
The design of a solar heating and hot water system for a single family dwelling is described. Cost trade studies on the energy conservation and architectural features of the solar house are discussed. The present status of verification for the single family heating system, i.e., proof that the components and the system meet applicable physical and functional requirements, is reported. The system integration drawings, the major subsystems drawings, and the architect's specifications and plans are included.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Helton, Jon C.; Brooks, Dusty Marie; Sallaberry, Cedric Jean-Marie.
Representations for margins associated with loss of assured safety (LOAS) for weak link (WL)/strong link (SL) systems involving multiple time-dependent failure modes are developed. The following topics are described: (i) defining properties for WLs and SLs, (ii) background on cumulative distribution functions (CDFs) for link failure time, link property value at link failure, and time at which LOAS occurs, (iii) CDFs for failure time margins defined by (time at which SL system fails) – (time at which WL system fails), (iv) CDFs for SL system property values at LOAS, (v) CDFs for WL/SL property value margins defined by (property value at which SL system fails) – (property value at which WL system fails), and (vi) CDFs for SL property value margins defined by (property value of failing SL at time of SL system failure) – (property value of this SL at time of WL system failure). Included in this presentation is a demonstration of a verification strategy based on defining and approximating the indicated margin results with (i) procedures based on formal integral representations and associated quadrature approximations and (ii) procedures based on algorithms for sampling-based approximations.
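A sampling-based sketch of one of the margin CDFs described (the failure-time margin of item (iii)), with placeholder Weibull failure-time distributions:

```python
# Sampling-based sketch of a failure-time margin CDF, margin = (SL system failure
# time) - (WL system failure time); the Weibull parameters are placeholders, not
# values from the study.
import numpy as np

rng = np.random.default_rng(7)
N = 100_000
t_wl = rng.weibull(2.0, N) * 40.0      # weak-link failure times (s), assumed
t_sl = rng.weibull(3.0, N) * 90.0      # strong-link failure times (s), assumed
margin = t_sl - t_wl                    # positive margin = WL fails first (desired)

def empirical_cdf(x, grid):
    return np.searchsorted(np.sort(x), grid, side="right") / x.size

grid = np.linspace(margin.min(), margin.max(), 9)
for g, p in zip(grid, empirical_cdf(margin, grid)):
    print(f"P(margin <= {g:7.1f} s) = {p:.4f}")
print("P(loss of assured safety) ~", np.mean(margin <= 0.0))
```

In the verification strategy described, such sampling-based estimates would be cross-checked against quadrature approximations of the corresponding integral representations.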
The EPA's National Risk Management Research Laboratory (NRMRL) and its verification organization partner, Battelle, operate the Advanced Monitoring Systems (AMS) Center under ETV. The AMS Center recently evaluated the performance of the Abraxis 17(beta)-estradiol (E2) magnetic p...
Proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification
NASA Technical Reports Server (NTRS)
Ewen, Denney, W. (Editor); Jensen, Thomas (Editor)
2009-01-01
This NASA conference publication contains the proceedings of the Third International Workshop on Proof-Carrying Code and Software Certification, held as part of LICS in Los Angeles, CA, USA, on August 15, 2009. Software certification demonstrates the reliability, safety, or security of software systems in such a way that it can be checked by an independent authority with minimal trust in the techniques and tools used in the certification process itself. It can build on existing validation and verification (V&V) techniques but introduces the notion of explicit software certificates, which contain all the information necessary for an independent assessment of the demonstrated properties. One such example is proof-carrying code (PCC), which is an important and distinctive approach to enhancing trust in programs. It provides a practical framework for independent assurance of program behavior, especially where source code is not available, or the code author and user are unknown to each other. The workshop will address theoretical foundations of logic-based software certification as well as practical examples and work on alternative application domains. Here "certificate" is construed broadly, to include not just mathematical derivations and proofs but also safety and assurance cases, or any formal evidence that supports the semantic analysis of programs: that is, evidence about an intrinsic property of code and its behaviour that can be independently checked by any user, intermediary, or third party. These guarantees mean that software certificates raise trust in the code itself, distinct from and complementary to any existing trust in the creator of the code, the process used to produce it, or its distributor. In addition to the contributed talks, the workshop featured two invited talks, by Kelly Hayhurst and Andrew Appel. The PCC 2009 website can be found at http://ti.arc.nasa.gov/event/pcc091.
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2011-01-01
This report presents a deductive proof of a self-stabilizing distributed clock synchronization protocol. It is focused on the distributed clock synchronization of an arbitrary, non-partitioned digraph ranging from fully connected to 1-connected networks of nodes while allowing for differences in the network elements. This protocol does not rely on assumptions about the initial state of the system, and no central clock or a centrally generated signal, pulse, or message is used. Nodes are anonymous, i.e., they do not have unique identities. There is no theoretical limit on the maximum number of participating nodes. The only constraint on the behavior of the node is that the interactions with other nodes are restricted to defined links and interfaces. We present a deductive proof of the correctness of the protocol as it applies to the networks with unidirectional and bidirectional links. We also confirm the claims of determinism and linear convergence.
NASA Astrophysics Data System (ADS)
Gohlke, Martin; Schuldt, Thilo; Weise, Dennis; Cordero, Jorge; Peters, Achim; Johann, Ulrich; Braxmaier, Claus
2017-11-01
The gravitational wave detector LISA utilizes as current baseline a high sensitivity Optical Readout (ORO) for measuring the relative position and tilt of a free flying proof mass with respect to the satellite housing. The required sensitivities in the frequency band from 30 μHz to 1 Hz are ~pm/√Hz for the translation and ~nrad/√Hz for the tilt measurement. EADS Astrium, in collaboration with the Humboldt University Berlin and the University of Applied Sciences Konstanz, has realized a prototype ORO over the past years. The interferometer is based on a highly symmetric design where both measurement and reference beam have a similar optical pathlength, and the same frequency and polarization. The technique of differential wavefront sensing (DWS) for tilt measurement is implemented. With our setup, noise levels below 5 pm/√Hz for translation and below 10 nrad/√Hz for tilt measurements - both for frequencies above 10 mHz - were demonstrated. We give an overview over the experimental setup, its current performance and the planned improvements. We also discuss the application to first verification of critical LISA aspects. As example we present measurements of the coefficient of thermal expansion (CTE) of various carbon fiber reinforced plastic (CFRP) including a "near-zero-CTE" tube.
Physical cryptographic verification of nuclear warheads
Kemp, R. Scott; Danagoulian, Areg; Macdonald, Ruaridh R.; ...
2016-07-18
How does one prove a claim about a highly sensitive object such as a nuclear weapon without revealing information about the object? This paradox has challenged nuclear arms control for more than five decades. We present a mechanism in the form of an interactive proof system that can validate the structure and composition of an object, such as a nuclear warhead, to arbitrary precision without revealing either its structure or composition. We introduce a tomographic method that simultaneously resolves both the geometric and isotopic makeup of an object. We also introduce a method of protecting information using a provably secure cryptographic hash that does not rely on electronics or software. Finally, these techniques, when combined with a suitable protocol, constitute an interactive proof system that could reject hoax items and clear authentic warheads with excellent sensitivity in reasonably short measurement times.
Vavrek, Jayson R; Henderson, Brian S; Danagoulian, Areg
2018-04-24
Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618-8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy "genuine" and "hoax" objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.
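A schematic of the template-versus-candidate comparison step under Poisson counting statistics; the line labels, counts, and threshold below are synthetic and do not reproduce the experiment's analysis:

```python
# Sketch of comparing candidate and template NRF peak counts assuming Poisson
# statistics; the line labels, counts, and significance threshold are synthetic
# placeholders, not values from the experiment.
import math

template_counts = {"line near 2.245 MeV": 1210, "line near 2.212 MeV": 980}     # assumed
candidate_counts = {"line near 2.245 MeV": 1165, "line near 2.212 MeV": 1490}   # hoax-like excess

def z_score(n_template, n_candidate):
    # Difference of two independent Poisson counts, normal approximation.
    return (n_candidate - n_template) / math.sqrt(n_template + n_candidate)

for line in template_counts:
    z = z_score(template_counts[line], candidate_counts[line])
    verdict = "consistent" if abs(z) < 3.0 else "inconsistent"
    print(f"{line}: z = {z:+.1f} -> {verdict}")
```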
Detection of Peptide-based nanoparticles in blood plasma by ELISA.
Bode, Gerard H; Pickl, Karin E; Sanchez-Purrà, Maria; Albaiges, Berta; Borrós, Salvador; Pötgens, Andy J G; Schmitz, Christoph; Sinner, Frank M; Losen, Mario; Steinbusch, Harry W M; Frank, Hans-Georg; Martinez-Martinez, Pilar
2015-01-01
The aim of the current study was to develop a method to detect peptide-linked nanoparticles in blood plasma. A convenient enzyme linked immunosorbent assay (ELISA) was developed for the detection of peptides functionalized with biotin and fluorescein groups. As a proof of principle, polymerized pentafluorophenyl methacrylate nanoparticles linked to biotin-carboxyfluorescein labeled peptides were intravenously injected in Wistar rats. Serial blood plasma samples were analyzed by ELISA and by liquid chromatography mass spectrometry (LC/MS) technology. The ELISA based method for the detection of FITC labeled peptides had a detection limit of 1 ng/mL. We were able to accurately measure peptides bound to pentafluorophenyl methacrylate nanoparticles in blood plasma of rats, and similar results were obtained by LC/MS. We detected FITC-labeled peptides on pentafluorophenyl methacrylate nanoparticles after injection in vivo. This method can be extended to detect nanoparticles with different chemical compositions.
Detection of Peptide-Based Nanoparticles in Blood Plasma by ELISA
Bode, Gerard H.; Pickl, Karin E.; Sanchez-Purrà, Maria; Albaiges, Berta; Borrós, Salvador; Pötgens, Andy J. G.; Schmitz, Christoph; Sinner, Frank M.; Losen, Mario; Steinbusch, Harry W. M.; Frank, Hans-Georg; Martinez-Martinez, Pilar
2015-01-01
Aims The aim of the current study was to develop a method to detect peptide-linked nanoparticles in blood plasma. Materials & Methods A convenient enzyme linked immunosorbent assay (ELISA) was developed for the detection of peptides functionalized with biotin and fluorescein groups. As a proof of principle, polymerized pentafluorophenyl methacrylate nanoparticles linked to biotin-carboxyfluorescein labeled peptides were intravenously injected in Wistar rats. Serial blood plasma samples were analyzed by ELISA and by liquid chromatography mass spectrometry (LC/MS) technology. Results The ELISA based method for the detection of FITC labeled peptides had a detection limit of 1 ng/mL. We were able to accurately measure peptides bound to pentafluorophenyl methacrylate nanoparticles in blood plasma of rats, and similar results were obtained by LC/MS. Conclusions We detected FITC-labeled peptides on pentafluorophenyl methacrylate nanoparticles after injection in vivo. This method can be extended to detect nanoparticles with different chemical compositions. PMID:25996618
DOE Office of Scientific and Technical Information (OSTI.GOV)
Taylor, Ronald C.; Sanfilippo, Antonio P.; McDermott, Jason E.
2011-02-18
Transcriptional regulatory networks are being determined using “reverse engineering” methods that infer connections based on correlations in gene state. Corroboration of such networks through independent means such as evidence from the biomedical literature is desirable. Here, we explore a novel approach, a bootstrapping version of our previous Cross-Ontological Analytic method (XOA) that can be used for semi-automated annotation and verification of inferred regulatory connections, as well as for discovery of additional functional relationships between the genes. First, we use our annotation and network expansion method on a biological network learned entirely from the literature. We show how new relevant links between genes can be iteratively derived using a gene similarity measure based on the Gene Ontology that is optimized on the input network at each iteration. Second, we apply our method to annotation, verification, and expansion of a set of regulatory connections found by the Context Likelihood of Relatedness algorithm.
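A schematic of the iterative expansion loop described, with stand-in similarity scores and a simple threshold rule in place of the GO-based XOA measure:

```python
# Schematic of an iterative, similarity-driven network expansion loop; the
# similarity scores and the threshold rule are stand-ins, not the actual
# GO-based XOA measure.
import itertools

similarity = {                      # assumed pairwise gene-similarity scores
    ("geneA", "geneB"): 0.92, ("geneA", "geneC"): 0.55, ("geneA", "geneD"): 0.30,
    ("geneB", "geneC"): 0.88, ("geneB", "geneD"): 0.40, ("geneC", "geneD"): 0.90,
}
all_genes = ["geneA", "geneB", "geneC", "geneD"]

def sim(a, b):
    return similarity.get((a, b), similarity.get((b, a), 0.0))

network = {("geneA", "geneB"), ("geneB", "geneC")}    # seed: literature-derived links

for iteration in range(5):
    threshold = min(sim(a, b) for a, b in network)     # re-optimized on the current network
    candidates = {tuple(sorted(p)) for p in itertools.combinations(all_genes, 2)} - network
    added = {p for p in candidates if sim(*p) >= threshold}
    if not added:
        break
    network |= added
    print(f"iteration {iteration}: added {sorted(added)}")

print("final network:", sorted(network))
```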
Advanced verification methods for OVI security ink
NASA Astrophysics Data System (ADS)
Coombs, Paul G.; McCaffery, Shaun F.; Markantes, Tom
2006-02-01
OVI security ink +, incorporating OVP security pigment* microflakes, enjoys a history of effective document protection. This security feature provides not only first-line recognition by the person on the street, but also facilitates machine-readability. This paper explores the evolution of OVI reader technology from proof-of-concept to miniaturization. Three different instruments have been built to advance the technology of OVI machine verification. A bench-top unit has been constructed which allows users to automatically verify a multitude of different banknotes and OVI images. In addition, high speed modules were fabricated and tested in a state of the art banknote sorting machine. Both units demonstrate the ability of modern optical components to illuminate and collect light reflected from the interference platelets within OVI ink. Electronic hardware and software convert and process the optical information in milliseconds to accurately determine the authenticity of the security feature. Most recently, OVI ink verification hardware has been miniaturized and simplified providing yet another platform for counterfeit protection. These latest devices provide a tool for store clerks and bank tellers to unambiguously determine the validity of banknotes in the time period it takes the cash drawer to be opened.
Study on micro-water measurement method based on SF6 insulation equipment in high altitude area
NASA Astrophysics Data System (ADS)
Zhang, Han; Liu, Yajin; Yan, Jun; Liu, Zhijian; Yan, Yongfei
2018-06-01
Moisture content is an important indicator of the insulation and arc-extinguishing performance of SF6 insulated electrical equipment. Research shows that moisture measurements are strongly influenced by the pressure at altitude, and that applying the pressure correction and the temperature correction in different orders yields different results. Therefore, in this paper we study the pressure and temperature environment of moisture testing of SF6 gas insulated equipment in the power industry. First, the PVT characteristics of pure SF6 gas and water vapor were analyzed and the necessity of pressure correction was established; then the Pitzer-Veli equation for SF6 gas was combined with the Pitzer-Veli equation for water to fit a PVT equation of state for SF6-H2O suitable for the electric power industry, from which a correction formula for moisture measurement in SF6 gas was deduced. Finally, experiments were used to optimize and verify the correction formula on SF6 electrical equipment, demonstrating its applicability and effectiveness.
Zhang, Zheshen; Voss, Paul L
2009-07-06
We propose a continuous variable based quantum key distribution protocol that makes use of discretely signaled coherent light and reverse error reconciliation. We present a rigorous security proof against collective attacks with realistic lossy, noisy quantum channels, imperfect detector efficiency, and detector electronic noise. This protocol is promising for convenient, high-speed operation at link distances up to 50 km with the use of post-selection.
ERIC Educational Resources Information Center
Kwon, Junehee; Lee, Yee Ming; Park, Eunhye; Wang, Yujia; Rushing, Keith
2017-01-01
Purpose/Objectives: This study assessed current practices and attitudes of school nutrition program (SNP) management staff regarding free and reduced-price (F-RP) meal application and verification in SNPs. Methods: Stratified, randomly selected 1,500 SNP management staff in 14 states received a link to an online questionnaire and/or a printed…
Automated solar panel assembly line
NASA Technical Reports Server (NTRS)
Somberg, H.
1981-01-01
The initial stage of the automated solar panel assembly line program was devoted to concept development and proof of approach through simple experimental verification. In this phase, laboratory bench models were built to demonstrate and verify concepts. Following this phase was machine design and integration of the various machine elements. The third phase was machine assembly and debugging. In this phase, the various elements were operated as a unit and modifications were made as required. The final stage of development was the demonstration of the equipment in a pilot production operation.
Experimental Blind Quantum Computing for a Classical Client.
Huang, He-Liang; Zhao, Qi; Ma, Xiongfeng; Liu, Chang; Su, Zu-En; Wang, Xi-Lin; Li, Li; Liu, Nai-Le; Sanders, Barry C; Lu, Chao-Yang; Pan, Jian-Wei
2017-08-04
To date, blind quantum computing demonstrations require clients to have weak quantum devices. Here we implement a proof-of-principle experiment for completely classical clients. Via classically interacting with two quantum servers that share entanglement, the client accomplishes the task of having the number 15 factorized by servers who are denied information about the computation itself. This concealment is accompanied by a verification protocol that tests servers' honesty and correctness. Our demonstration shows the feasibility of completely classical clients and thus is a key milestone towards secure cloud quantum computing.
Formal Verification of Curved Flight Collision Avoidance Maneuvers: A Case Study
2009-08-01
easily by the Pythagorean theorem (i.e., (2r)^2 = r^2 + x_1^2 for the triangle enclosed by h, x, c in Fig. 7a): x = (sqrt((2r)^2 - r^2), 0) = (sqrt(3) r, 0). (4) ... region [10]. Most notably, the separation proof in Section 4.7 is by overapproximation and tolerates asymmetric distances to c (Fig. 7b). Theorem 1 ... Theorem 1 is already sufficiently general, but the computational complexity is high. It would be interesting future work to see if the informal robustness ...
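Cleaned up, the quoted geometric step appears to be the following (a reconstruction of the snippet, reading x_1 as the horizontal leg of the right triangle whose hypotenuse is 2r and whose other leg is r):

```latex
% Reconstruction of the quoted step; notation follows the snippet, not the full paper.
\begin{align}
  (2r)^2 &= r^2 + x_1^2
  \quad\Longrightarrow\quad
  x_1 = \sqrt{(2r)^2 - r^2} = \sqrt{3}\,r, \\
  x &= \bigl(\sqrt{3}\,r,\; 0\bigr).
\end{align}
```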
NASA Technical Reports Server (NTRS)
Dunham, J. R. (Editor); Knight, J. C. (Editor)
1982-01-01
The state of the art in the production of crucial software for flight control applications was addressed. The association between reliability metrics and software is considered. Thirteen software development projects are discussed. A short-term need for research in the areas of tool development and software fault tolerance was indicated. For the long term, research in formal verification or proof methods was recommended. Formal specification and software reliability modeling were recommended as topics for both short- and long-term research.
Experimental Blind Quantum Computing for a Classical Client
NASA Astrophysics Data System (ADS)
Huang, He-Liang; Zhao, Qi; Ma, Xiongfeng; Liu, Chang; Su, Zu-En; Wang, Xi-Lin; Li, Li; Liu, Nai-Le; Sanders, Barry C.; Lu, Chao-Yang; Pan, Jian-Wei
2017-08-01
To date, blind quantum computing demonstrations require clients to have weak quantum devices. Here we implement a proof-of-principle experiment for completely classical clients. Via classically interacting with two quantum servers that share entanglement, the client accomplishes the task of having the number 15 factorized by servers who are denied information about the computation itself. This concealment is accompanied by a verification protocol that tests servers' honesty and correctness. Our demonstration shows the feasibility of completely classical clients and thus is a key milestone towards secure cloud quantum computing.
QIPS: quantum information and quantum physics in space
NASA Astrophysics Data System (ADS)
Schmitt-Manderbach, Tobias; Scheidl, Thomas; Ursin, Rupert; Tiefenbacher, Felix; Weier, Henning; Fürst, Martin; Jennewein, T.; Perdigues, J.; Sodnik, Z.; Rarity, J.; Zeilinger, Anton; Weinfurter, Harald
2017-11-01
The aim of the QIPS project (financed by ESA) is to explore quantum phenomena and to demonstrate quantum communication over long distances. Based on the current state of the art, a first study investigating the feasibility of space-based quantum communication has to establish goals for mid-term and long-term missions, but also has to test the feasibility of key issues in a long-distance ground-to-ground experiment. We have therefore designed a proof-of-concept demonstration for establishing single-photon links over a distance of 144 km between the Canary Islands of La Palma and Tenerife to evaluate the main limitations for future space experiments. Here we report on the progress of this project and present first measurements of crucial parameters of the optical free-space link.
NASA Technical Reports Server (NTRS)
Rushby, John; Miner, Paul S. (Technical Monitor)
2002-01-01
Airplanes are certified as a whole: there is no established basis for separately certifying some components, particularly software-intensive ones, independently of their specific application in a given airplane. The absence of separate certification inhibits the development of modular components that could be largely "precertified" and used in several different contexts within a single airplane, or across many different airplanes. In this report, we examine the issues in modular certification of software components and propose an approach based on assume-guarantee reasoning. We extend the method from verification to certification by considering behavior in the presence of failures. This exposes the need for partitioning, and separation of assumptions and guarantees into normal and abnormal cases. We then identify three classes of property that must be verified within this framework: safe function, true guarantees, and controlled failure. We identify a particular assume-guarantee proof rule (due to McMillan) that is appropriate to the applications considered, and formally verify its soundness in PVS.
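The McMillan rule itself is not stated in this summary. For orientation only, a simple non-circular assume-guarantee composition rule has the following shape, where "⟨A⟩ M ⟨G⟩" reads "component M guarantees G whenever its environment satisfies A"; the particular rule verified in PVS in the report may differ:

```latex
% A simple non-circular assume-guarantee composition rule, shown only to
% illustrate the style of reasoning; not necessarily the exact rule verified in PVS.
\[
  \frac{\langle A \rangle\, M_1\, \langle G_1 \rangle
        \qquad
        \langle A \wedge G_1 \rangle\, M_2\, \langle G_2 \rangle}
       {\langle A \rangle\, M_1 \parallel M_2\, \langle G_1 \wedge G_2 \rangle}
\]
```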
Automated Installation Verification of COMSOL via LiveLink for MATLAB
DOE Office of Scientific and Technical Information (OSTI.GOV)
Crowell, Michael W
Verifying that a local software installation performs as the developer intends is a potentially time-consuming but necessary step for nuclear safety-related codes. Automating this process not only saves time, but can increase reliability and scope of verification compared to ‘hand’ comparisons. While COMSOL does not include automatic installation verification as many commercial codes do, it does provide tools such as LiveLink™ for MATLAB® and the COMSOL API for use with Java® through which the user can automate the process. Here we present a successful automated verification example of a local COMSOL 5.0 installation for nuclear safety-related calculations at the Oak Ridge National Laboratory’s High Flux Isotope Reactor (HFIR).
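The LiveLink/API driver code is not shown in the abstract. The fragment below is only a sketch of the comparison step such an automated run might end with: computed results are checked against stored reference values within a per-quantity tolerance (Python; the file names, CSV layout and tolerance are hypothetical and not taken from the report).

```python
# Sketch of the final comparison step of an automated installation-verification run:
# results exported by the solver are compared against stored reference values.
# File names, CSV columns and the tolerance are hypothetical.
import csv

def verify(reference_csv: str, computed_csv: str, rel_tol: float = 1e-3) -> bool:
    def load(path):
        with open(path, newline="") as f:
            return {row["quantity"]: float(row["value"]) for row in csv.DictReader(f)}

    ref, comp = load(reference_csv), load(computed_csv)
    all_ok = True
    for name, ref_val in ref.items():
        comp_val = comp.get(name)
        ok = comp_val is not None and abs(comp_val - ref_val) <= rel_tol * abs(ref_val)
        all_ok &= ok
        print(f"{name:30s} ref={ref_val:.6g} computed={comp_val} -> {'PASS' if ok else 'FAIL'}")
    return all_ok

if __name__ == "__main__":
    # e.g. verify("reference_results.csv", "local_install_results.csv")
    pass
```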
Optical stabilization for time transfer infrastructure
NASA Astrophysics Data System (ADS)
Vojtech, Josef; Altmann, Michal; Skoda, Pavel; Horvath, Tomas; Slapak, Martin; Smotlacha, Vladimir; Havlis, Ondrej; Munster, Petr; Radil, Jan; Kundrat, Jan; Altmannova, Lada; Velc, Radek; Hula, Miloslav; Vohnout, Rudolf
2017-08-01
In this paper, we propose and present verification of all-optical methods for stabilization of the end-to-end delay of an optical fiber link. These methods are verified for deployment within an infrastructure for accurate time and stable frequency distribution, based on sharing fibers that carry live data traffic in a research and educational network. The methods range from path length control, through a temperature conditioning method, to transmit wavelength control. Attention is given to achieving continuous control over a relatively broad range of delays. We summarize design rules for delay stabilization based on the character and the total jitter of the delay.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Vavrek, Jayson R.; Henderson, Brian S.; Danagoulian, Areg
Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618–8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy “genuine” and “hoax” objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.
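The abstract does not spell out the statistical test. Purely as an illustration of the comparison step, the sketch below treats the counts in a region of interest near 2.2 MeV as Poisson distributed and compares a candidate measurement against the template with an approximate z-test (Python; all counts and the 3-sigma threshold are invented, and the protocol's actual test may differ).

```python
# Illustrative comparison of region-of-interest counts from a candidate warhead
# measurement against an authenticated template, assuming Poisson statistics.
# The counts and the acceptance threshold below are made up.
import math

def poisson_z(candidate_counts: int, template_counts: int) -> float:
    """Approximate z-statistic for the difference of two independent Poisson counts."""
    return (candidate_counts - template_counts) / math.sqrt(candidate_counts + template_counts)

template = 10_450      # counts in the ROI from the authenticated template
genuine = 10_520       # candidate consistent with the template
hoax = 12_300          # candidate with altered isotopics -> excess transmission near 2.2 MeV

for label, counts in [("genuine proxy", genuine), ("hoax proxy", hoax)]:
    z = poisson_z(counts, template)
    verdict = "consistent with template" if abs(z) < 3.0 else "rejected (inconsistent)"
    print(f"{label}: z = {z:+.1f} -> {verdict}")
```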
Reactive system verification case study: Fault-tolerant transputer communication
NASA Technical Reports Server (NTRS)
Crane, D. Francis; Hamory, Philip J.
1993-01-01
A reactive program is one which engages in an ongoing interaction with its environment. A system which is controlled by an embedded reactive program is called a reactive system. Examples of reactive systems are aircraft flight management systems, bank automatic teller machine (ATM) networks, airline reservation systems, and computer operating systems. Reactive systems are often naturally modeled (for logical design purposes) as a composition of autonomous processes which progress concurrently and which communicate to share information and/or to coordinate activities. Formal (i.e., mathematical) frameworks for system verification are tools used to increase the users' confidence that a system design satisfies its specification. A framework for reactive system verification includes formal languages for system modeling and for behavior specification and decision procedures and/or proof-systems for verifying that the system model satisfies the system specifications. Using the Ostroff framework for reactive system verification, an approach to achieving fault-tolerant communication between transputers was shown to be effective. The key components of the design, the decoupler processes, may be viewed as discrete-event-controllers introduced to constrain system behavior such that system specifications are satisfied. The Ostroff framework was also effective. The expressiveness of the modeling language permitted construction of a faithful model of the transputer network. The relevant specifications were readily expressed in the specification language. The set of decision procedures provided was adequate to verify the specifications of interest. The need for improved support for system behavior visualization is emphasized.
Vavrek, Jayson R.; Henderson, Brian S.; Danagoulian, Areg
2018-04-10
Future nuclear arms reduction efforts will require technologies to verify that warheads slated for dismantlement are authentic without revealing any sensitive weapons design information to international inspectors. Despite several decades of research, no technology has met these requirements simultaneously. Recent work by Kemp et al. [Kemp RS, Danagoulian A, Macdonald RR, Vavrek JR (2016) Proc Natl Acad Sci USA 113:8618–8623] has produced a novel physical cryptographic verification protocol that approaches this treaty verification problem by exploiting the isotope-specific nature of nuclear resonance fluorescence (NRF) measurements to verify the authenticity of a warhead. To protect sensitive information, the NRF signal from the warhead is convolved with that of an encryption foil that contains key warhead isotopes in amounts unknown to the inspector. The convolved spectrum from a candidate warhead is statistically compared against that from an authenticated template warhead to determine whether the candidate itself is authentic. Here we report on recent proof-of-concept warhead verification experiments conducted at the Massachusetts Institute of Technology. Using high-purity germanium (HPGe) detectors, we measured NRF spectra from the interrogation of proxy “genuine” and “hoax” objects by a 2.52 MeV endpoint bremsstrahlung beam. The observed differences in NRF intensities near 2.2 MeV indicate that the physical cryptographic protocol can distinguish between proxy genuine and hoax objects with high confidence in realistic measurement times.
Space shuttle engineering and operations support. Avionics system engineering
NASA Technical Reports Server (NTRS)
Broome, P. A.; Neubaur, R. J.; Welsh, R. T.
1976-01-01
The shuttle avionics integration laboratory (SAIL) requirements for supporting the Spacelab/orbiter avionics verification process are defined. The principal topics are a Spacelab avionics hardware assessment, test operations center/electronic systems test laboratory (TOC/ESL) data processing requirements definition, SAIL (Building 16) payload accommodations study, and projected funding and test scheduling. Because of the complex nature of the Spacelab/orbiter computer systems, the PCM data link, and the high rate digital data system hardware/software relationships, early avionics interface verification is required. The SAIL is a prime candidate test location to accomplish this early avionics verification.
Zhang, Can; Cui, Hanyu; Han, Yufeng; Yu, Fangfang; Shi, Xiaoman
2018-02-01
A biomimetic enzyme-linked immunosorbent assay (BELISA) based on molecularly imprinted polymers on paper (MIPs-paper) with specific recognition was developed. As a detector, the surface of the paper was modified with γ-MAPS by hydrolytic action, and the MIP layer was anchored on the γ-MAPS-modified paper by copolymerization to construct the artificial antibody. Through a series of experiments and verification, we successfully obtained the MIPs-paper and established a BELISA for the detection of carbaryl. The MIPs-paper-based BELISA was applied to detect carbaryl in real samples and was validated against an enzyme-linked immunosorbent assay (ELISA) based on an anti-carbaryl biological antibody. The results of the two methods (BELISA and ELISA) were well correlated (R² = 0.944). The established MIPs-paper BELISA method exhibits the advantages of low cost, higher stability, and being re-generable, and can be applied as a convenient tool for the fast and efficient detection of carbaryl. Copyright © 2017. Published by Elsevier Ltd.
Availability of buprenorphine on the Internet for purchase without a prescription
Bachhuber, Marcus A.; Cunningham, Chinazo O.
2012-01-01
Background Use of illicit buprenorphine is increasingly recognized, but it is unknown if the Internet currently represents an accessible source. Methods A series of Internet searches were conducted. Twenty searches were performed on two different search engines. The first 100 results of each search were classified into categories based on content. All Internet pharmacies were searched for buprenorphine preparations and if available, sites were examined to determine if a prescription was required for purchase, for the cost of buprenorphine, the geographical origin of the pharmacy, and evidence of validation by an online pharmacy verification service. Results Of the 2,000 links examined, 1422 were unique. Six percent of links were to illicit commercial sites, 2% were to legitimate commercial sites, and 2% were to illicit portal sites, which contained links to many illicit commercial sites. Twenty pharmacies offering buprenorphine for purchase without a prescription were identified. The monthly cost of a typical starting dose of 2 mg buprenorphine daily ranged between $232 and $1,163 USD. No pharmacies were listed by online pharmacy verification services. Conclusion Twenty online pharmacies advertising buprenorphine formulations for sale without a prescription were identified. Prices varied widely between illicit pharmacies but were uniformly more expensive than legitimate pharmacies. Illicitly obtained buprenorphine formulations appear to be relatively inaccessible and at high cost on the Internet. PMID:23201172
Dynamic enhancement of drug product labels to support drug safety, efficacy, and effectiveness.
Boyce, Richard D; Horn, John R; Hassanzadeh, Oktie; Waard, Anita de; Schneider, Jodi; Luciano, Joanne S; Rastegar-Mojarad, Majid; Liakata, Maria
2013-01-26
Out-of-date or incomplete drug product labeling information may increase the risk of otherwise preventable adverse drug events. In recognition of these concerns, the United States Food and Drug Administration (FDA) requires drug product labels to include specific information. Unfortunately, several studies have found that drug product labeling fails to keep current with the scientific literature. We present a novel approach to addressing this issue. The primary goal of this novel approach is to better meet the information needs of persons who consult the drug product label for information on a drug's efficacy, effectiveness, and safety. Using FDA product label regulations as a guide, the approach links drug claims present in drug information sources available on the Semantic Web with specific product label sections. Here we report on pilot work that establishes the baseline performance characteristics of a proof-of-concept system implementing the novel approach. Claims from three drug information sources were linked to the Clinical Studies, Drug Interactions, and Clinical Pharmacology sections of the labels for drug products that contain one of 29 psychotropic drugs. The resulting Linked Data set maps 409 efficacy/effectiveness study results, 784 drug-drug interactions, and 112 metabolic pathway assertions derived from three clinically-oriented drug information sources (ClinicalTrials.gov, the National Drug File - Reference Terminology, and the Drug Interaction Knowledge Base) to the sections of 1,102 product labels. Proof-of-concept web pages were created for all 1,102 drug product labels that demonstrate one possible approach to presenting information that dynamically enhances drug product labeling. We found that approximately one in five efficacy/effectiveness claims were relevant to the Clinical Studies section of a psychotropic drug product, with most relevant claims providing new information. We also identified several cases where all of the drug-drug interaction claims linked to the Drug Interactions section for a drug were potentially novel. The baseline performance characteristics of the proof-of-concept will enable further technical and user-centered research on robust methods for scaling the approach to the many thousands of product labels currently on the market.
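As a toy illustration of the core mapping, the sketch below links claim records to the label section they enhance (Python; the claim records, drug names and the "DIKB" abbreviation are hypothetical, and the real system publishes these links as Linked Data rather than in-memory dictionaries).

```python
# Toy sketch: claims harvested from drug information sources are routed to the
# product-label section they are relevant to. Section names follow FDA labeling
# conventions; the claim records and drug names are invented.
from collections import defaultdict

SECTION_BY_CLAIM_TYPE = {
    "efficacy_study":        "Clinical Studies",
    "drug_drug_interaction": "Drug Interactions",
    "metabolic_pathway":     "Clinical Pharmacology",
}

claims = [
    {"drug": "drug_a", "type": "efficacy_study",        "source": "ClinicalTrials.gov", "text": "..."},
    {"drug": "drug_a", "type": "drug_drug_interaction", "source": "DIKB",               "text": "..."},
    {"drug": "drug_b", "type": "metabolic_pathway",     "source": "NDF-RT",             "text": "..."},
]

label_enhancements = defaultdict(lambda: defaultdict(list))
for claim in claims:
    section = SECTION_BY_CLAIM_TYPE[claim["type"]]
    label_enhancements[claim["drug"]][section].append(claim)

for drug, sections in label_enhancements.items():
    for section, linked in sections.items():
        print(f"{drug}: {section}: {len(linked)} linked claim(s)")
```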
Dynamic enhancement of drug product labels to support drug safety, efficacy, and effectiveness
2013-01-01
Out-of-date or incomplete drug product labeling information may increase the risk of otherwise preventable adverse drug events. In recognition of these concerns, the United States Food and Drug Administration (FDA) requires drug product labels to include specific information. Unfortunately, several studies have found that drug product labeling fails to keep current with the scientific literature. We present a novel approach to addressing this issue. The primary goal of this novel approach is to better meet the information needs of persons who consult the drug product label for information on a drug’s efficacy, effectiveness, and safety. Using FDA product label regulations as a guide, the approach links drug claims present in drug information sources available on the Semantic Web with specific product label sections. Here we report on pilot work that establishes the baseline performance characteristics of a proof-of-concept system implementing the novel approach. Claims from three drug information sources were linked to the Clinical Studies, Drug Interactions, and Clinical Pharmacology sections of the labels for drug products that contain one of 29 psychotropic drugs. The resulting Linked Data set maps 409 efficacy/effectiveness study results, 784 drug-drug interactions, and 112 metabolic pathway assertions derived from three clinically-oriented drug information sources (ClinicalTrials.gov, the National Drug File – Reference Terminology, and the Drug Interaction Knowledge Base) to the sections of 1,102 product labels. Proof-of-concept web pages were created for all 1,102 drug product labels that demonstrate one possible approach to presenting information that dynamically enhances drug product labeling. We found that approximately one in five efficacy/effectiveness claims were relevant to the Clinical Studies section of a psychotropic drug product, with most relevant claims providing new information. We also identified several cases where all of the drug-drug interaction claims linked to the Drug Interactions section for a drug were potentially novel. The baseline performance characteristics of the proof-of-concept will enable further technical and user-centered research on robust methods for scaling the approach to the many thousands of product labels currently on the market. PMID:23351881
System Engineering for J-2X Development: The Simpler, the Better
NASA Technical Reports Server (NTRS)
Kelly, William M.; Greasley, Paul; Greene, William D.; Ackerman, Peter
2008-01-01
The Ares I and Ares V vehicles will utilize the J-2X rocket engine, developed for NASA by the Pratt and Whitney Rocketdyne Company (PWR), as the upper stage engine (USE). The J-2X is an improved, higher-power version of the original J-2 engine used for Apollo. System Engineering (SE) facilitates direct and open discussions of issues and problems. This simple idea is often overlooked in large, complex engineering development programs. Definition and distribution of requirements from the engine level to the component level is controlled by Allocation Reports, which break down numerical design objectives (weight, reliability, etc.) into quantitative goals for each component area. Linked databases of design and verification requirements help eliminate redundancy and potential mistakes inherent in separated systems. Another tool, the Architecture Design Description (ADD), is used to control the J-2X system architecture and effectively communicate configuration changes to those involved in the design process. But the proof of an effective process is in successful program accomplishment. SE is the methodology being used to meet the challenge of completing J-2X engine certification two years ahead of any engine program ever developed at PWR. This paper describes the simple, better SE tools and techniques used to achieve this success.
Slide-Ring Materials Using Cyclodextrin.
Ito, Kohzo
2017-01-01
We have recently synthesized slide-ring materials using cyclodextrin by cross-linking polyrotaxanes, a typical supramolecule. The slide-ring materials have polymer chains with bulky end groups topologically interlocked by figure-of-eight-shaped junctions. This means that the cross-links can slide along the polymer chains like pulleys to relax the tension of the backbone polymer chains. The slide-ring materials also differ from conventional polymers in that the entropy of the rings affects the elasticity. As a result, slide-ring materials show a quite small Young's modulus that is not proportional to the cross-linking density. This concept can be applied to a wide variety of polymeric materials as well as gels. In particular, slide-ring materials show remarkable scratch-proof properties in coating materials for automobiles, cell phones, mobile computers, and so on. Further current applications include vibration-proof insulation materials for loudspeakers, highly abrasive polishing media, dielectric actuators, and so on.
Building the Qualification File of EGNOS with DOORS
NASA Astrophysics Data System (ADS)
Fabre, J.
2008-08-01
EGNOS, the European Satellite-Based Augmentation System (SBAS) to GPS, is getting to its final deployment and being initially operated towards qualification and certification to reach operational capability by 2008/2009. A very important milestone in the development process is the System Qualification Review (QR). As the verification phase aims at demonstrating that the EGNOS System design meets the applicable requirements, the QR declares the completion of verification activities. The main document to present at QR is a consolidated, consistent and complete Qualification file. The information included shall give confidence to the QR reviewers that the performed qualification activities are completed. Therefore, an important issue for the project team is to focus on synthetic and consistent information, and to make the presentation as clear as possible. Traceability to applicable requirements shall be systematically presented. Moreover, in order to support verification justification, reference to details shall be available, and the reviewer shall have the possibility to link automatically to the documents including this detailed information. In that frame, Thales Alenia Space has implemented a strong support in terms of methodology and tool, to provide to System Engineering and Verification teams a single reference technical database, in which all team members consult the applicable requirements, compliance, justification, design data and record the information necessary to build the final Qualification file. This paper presents the EGNOS context, the Qualification file contents, and the methodology implemented, based on Thales Alenia Space practices and in line with ECSS. Finally, it shows how the Qualification file is built in a DOORS environment.
NASA Astrophysics Data System (ADS)
Tamilarasan, Ilavarasan; Saminathan, Brindha; Murugappan, Meenakshi
2016-04-01
The past decade has seen the phenomenal usage of orthogonal frequency division multiplexing (OFDM) in the wired as well as wireless communication domains, and it has also been proposed in the literature as a future-proof technique for implementing flexible resource allocation in cognitive optical networks. Fiber impairment assessment and adaptive compensation become critical in such implementations. A comprehensive analytical model for impairments in OFDM-based fiber links is developed. The proposed model includes the combined impact of laser phase fluctuations, fiber dispersion, self-phase modulation, cross-phase modulation, four-wave mixing, the nonlinear phase noise due to the interaction of amplified spontaneous emission with fiber nonlinearities, and the photodetector noises. The bit error rate expression for the proposed model is derived based on error vector magnitude (EVM) estimation. The performance of the proposed model is analyzed and compared for dispersion-compensated and uncompensated backbone/backhaul links. The results suggest that OFDM would perform better over uncompensated links than over compensated links due to the negligible FWM effects, and that there is a need for flexible compensation. The proposed model can be employed in cognitive optical networks for accurate assessment of fiber-related impairments.
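The paper's full multi-impairment BER expression is not given in the abstract. The basic link between EVM and error rate can be illustrated by the Gray-coded QPSK special case with additive Gaussian noise, where the rms EVM approximates the reciprocal square root of the SNR:

```latex
% Illustrative special case (Gray-coded QPSK, additive Gaussian noise);
% not the paper's full multi-impairment expression.
\[
  \mathrm{EVM}_{\mathrm{rms}} \approx \frac{1}{\sqrt{\mathrm{SNR}}},
  \qquad
  \mathrm{BER}_{\mathrm{QPSK}} \approx Q\!\left(\sqrt{\mathrm{SNR}}\right)
  \approx Q\!\left(\frac{1}{\mathrm{EVM}_{\mathrm{rms}}}\right),
  \qquad
  Q(x) = \tfrac{1}{2}\operatorname{erfc}\!\left(\tfrac{x}{\sqrt{2}}\right).
\]
```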
NASA Technical Reports Server (NTRS)
Linley, L. J.; Luper, A. B.; Dunn, J. H.
1982-01-01
The Bureau of Mines, U.S. Department of the Interior, is reviewing explosion protection methods for use in gassy coal mines. This performance criteria guideline evaluates three explosion protection methods for machines electrically powered at voltages up to 15,000 volts ac. A sufficient amount of basic research has been accomplished to verify that the explosion-proof and pressurized enclosure methods can provide adequate explosion protection with the present state of the art up to 15,000 volts ac. The routine application of the potted enclosure as a stand-alone protection method requires further investigation or development in order to clarify performance criteria and verification and certification requirements. An extensive literature search, a series of high-voltage tests, and a design evaluation of the three explosion protection methods indicate that the explosion-proof, pressurized, and potted enclosures can all be used to enclose up to 15,000 volts ac.
A New On-Line Diagnosis Protocol for the SPIDER Family of Byzantine Fault Tolerant Architectures
NASA Technical Reports Server (NTRS)
Geser, Alfons; Miner, Paul S.
2004-01-01
This paper presents the formal verification of a new protocol for online distributed diagnosis for the SPIDER family of architectures. An instance of the Scalable Processor-Independent Design for Electromagnetic Resilience (SPIDER) architecture consists of a collection of processing elements communicating over a Reliable Optical Bus (ROBUS). The ROBUS is a specialized fault-tolerant device that guarantees Interactive Consistency, Distributed Diagnosis (Group Membership), and Synchronization in the presence of a bounded number of physical faults. Formal verification of the original SPIDER diagnosis protocol provided a detailed understanding that led to the discovery of a significantly more efficient protocol. The original protocol was adapted from the formally verified protocol used in the MAFT architecture. It required O(N) message exchanges per defendant to correctly diagnose failures in a system with N nodes. The new protocol achieves the same diagnostic fidelity, but only requires O(1) exchanges per defendant. This paper presents this new diagnosis protocol and a formal proof of its correctness using PVS.
A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks.
Baig, Ahmed Fraz; Hassan, Khwaja Mansoor Ul; Ghani, Anwar; Chaudhry, Shehzad Ashraf; Khan, Imran; Ashraf, Muhammad Usman
2018-01-01
Global Mobility Networks (GLOMONETs) in wireless communication permit the global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found Mun et al.'s scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that Lee et al.'s scheme lacks user anonymity, provides inefficient user authentication, is vulnerable to replay and DoS attacks, and lacks local password verification. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols.
Verification of Numerical Programs: From Real Numbers to Floating Point Numbers
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn E.; Munoz, Cesar; Kirchner, Florent; Correnson, Loiec
2013-01-01
Numerical algorithms lie at the heart of many safety-critical aerospace systems. The complexity and hybrid nature of these systems often requires the use of interactive theorem provers to verify that these algorithms are logically correct. Usually, proofs involving numerical computations are conducted in the infinitely precise realm of the field of real numbers. However, numerical computations in these algorithms are often implemented using floating point numbers. The use of a finite representation of real numbers introduces uncertainties as to whether the properties verified in the theoretical setting hold in practice. This short paper describes work in progress aimed at addressing these concerns. Given a formally proven algorithm, written in the Program Verification System (PVS), the Frama-C suite of tools is used to identify sufficient conditions and verify that under such conditions the rounding errors arising in a C implementation of the algorithm do not affect its correctness. The technique is illustrated using an algorithm for detecting loss of separation among aircraft.
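A minimal illustration of the gap between the two settings, using a separation-style predicate (Python; the positions and the 5 NM threshold are made-up values, not taken from the paper's algorithm):

```python
# A property that holds exactly over the reals can flip under floating point:
# the chosen positions form a 3-4-5 triangle lying exactly on the separation
# boundary, but rounding in the coordinate differences changes the verdict.
def horizontally_separated(dx_nm: float, dy_nm: float, min_sep_nm: float = 5.0) -> bool:
    return dx_nm * dx_nm + dy_nm * dy_nm >= min_sep_nm * min_sep_nm

# Exactly representable inputs: (3, 4) lies on the boundary, so separation holds.
print(horizontally_separated(3.0, 4.0))                # True

# The same geometry obtained by differencing coordinates: over the reals
# 4.1 - 1.1 = 3 and 5.1 - 1.1 = 4, but in binary64 the first difference is
# 2.9999999999999996, and the predicate now (incorrectly) reports a violation.
print(4.1 - 1.1, 5.1 - 1.1)
print(horizontally_separated(4.1 - 1.1, 5.1 - 1.1))    # False
```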
Formal Methods for Verification and Validation of Partial Specifications: A Case Study
NASA Technical Reports Server (NTRS)
Easterbrook, Steve; Callahan, John
1997-01-01
This paper describes our work exploring the suitability of formal specification methods for independent verification and validation (IV&V) of software specifications for large, safety critical systems. An IV&V contractor often has to perform rapid analysis on incomplete specifications, with no control over how those specifications are represented. Lightweight formal methods show significant promise in this context, as they offer a way of uncovering major errors, without the burden of full proofs of correctness. We describe a case study of the use of partial formal models for V&V of the requirements for Fault Detection Isolation and Recovery on the space station. We conclude that the insights gained from formalizing a specification are valuable, and it is the process of formalization, rather than the end product that is important. It was only necessary to build enough of the formal model to test the properties in which we were interested. Maintenance of fidelity between multiple representations of the same requirements (as they evolve) is still a problem, and deserves further study.
Fracture mechanics life analytical methods verification testing
NASA Technical Reports Server (NTRS)
Favenesi, J. A.; Clemons, T. G.; Riddell, W. T.; Ingraffea, A. R.; Wawrzynek, P. A.
1994-01-01
The objective was to evaluate NASCRAC (trademark) version 2.0, a second generation fracture analysis code, for verification and validity. NASCRAC was evaluated using a combination of comparisons to the literature, closed-form solutions, numerical analyses, and tests. Several limitations and minor errors were detected. Additionally, a number of major flaws were discovered. These major flaws were generally due to application of a specific method or theory, not due to programming logic. Results are presented for the following program capabilities: K versus a, J versus a, crack opening area, life calculation due to fatigue crack growth, tolerable crack size, proof test logic, tearing instability, creep crack growth, crack transitioning, crack retardation due to overloads, and elastic-plastic stress redistribution. It is concluded that the code is an acceptable fracture tool for K solutions of simplified geometries, for a limited number of J and crack opening area solutions, and for fatigue crack propagation with the Paris equation and constant amplitude loads when the Paris equation is applicable.
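For the constant-amplitude fatigue case mentioned above, a minimal sketch of the kind of Paris-equation life integration involved is shown below (Python; the geometry factor, material constants and loading are invented, and this is not NASCRAC code).

```python
# Constant-amplitude fatigue crack growth with the Paris equation
# da/dN = C * (dK)^m, with dK = Y * dS * sqrt(pi * a) for a constant geometry
# factor Y. Illustration only; constants below are made-up, roughly
# aluminium-like values.
import math

def cycles_to_grow(a0_m: float, af_m: float, delta_sigma_mpa: float,
                   C: float, m: float, Y: float = 1.0, steps: int = 100_000) -> float:
    """Numerically integrate dN = da / (C * dK^m) from initial to final crack size.

    Crack lengths in metres, stress range in MPa, dK in MPa*sqrt(m),
    C in m/cycle per (MPa*sqrt(m))^m.
    """
    da = (af_m - a0_m) / steps
    cycles, a = 0.0, a0_m
    for _ in range(steps):
        delta_k = Y * delta_sigma_mpa * math.sqrt(math.pi * a)
        cycles += da / (C * delta_k ** m)
        a += da
    return cycles

N = cycles_to_grow(a0_m=1e-3, af_m=10e-3, delta_sigma_mpa=100.0, C=1e-11, m=3.0)
print(f"predicted life = {N:,.0f} cycles")
```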
A lightweight and secure two factor anonymous authentication protocol for Global Mobility Networks
2018-01-01
Global Mobility Networks (GLOMONETs) in wireless communication permit the global roaming services that enable a user to leverage mobile services in any foreign country. Technological growth in wireless communication is also accompanied by new security threats and challenges. A threat-proof authentication protocol in wireless communication may overcome the security flaws by allowing only legitimate users to access a particular service. Recently, Lee et al. found Mun et al.'s scheme vulnerable to different attacks and proposed an advanced secure scheme to overcome the security flaws. However, this article points out that Lee et al.'s scheme lacks user anonymity, provides inefficient user authentication, is vulnerable to replay and DoS attacks, and lacks local password verification. Furthermore, this article presents a more robust anonymous authentication scheme to handle the threats and challenges found in Lee et al.'s protocol. The proposed protocol is formally verified with an automated tool (ProVerif). The proposed protocol has superior efficiency in comparison to the existing protocols. PMID:29702675
NASA Astrophysics Data System (ADS)
Takenaka, Hideki; Koyama, Yoshisada; Akioka, Maki; Kolev, Dimitar; Iwakiri, Naohiko; Kunimori, Hiroo; Carrasco-Casado, Alberto; Munemasa, Yasushi; Okamoto, Eiji; Toyoshima, Morio
2016-03-01
Research and development of space optical communications is conducted at the National Institute of Information and Communications Technology (NICT). NICT developed the Small Optical TrAnsponder (SOTA), which was embarked on a 50 kg-class satellite and launched into a low Earth orbit (LEO). Space-to-ground laser communication experiments have been conducted with the SOTA. Atmospheric turbulence causes signal fading and is an issue to be solved in satellite-to-ground laser communication links. Therefore, as error-correcting functions, a Reed-Solomon (RS) code and a Low-Density Generator Matrix (LDGM) code are implemented in the communication system onboard the SOTA. In this paper, we present the in-orbit verification results of SOTA, including the characteristics of these functions, the communication performance with the LDGM code over satellite-to-ground atmospheric paths, the link budget analysis, and the comparison between theoretical and experimental results.
Regression Verification Using Impact Summaries
NASA Technical Reports Server (NTRS)
Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana
2013-01-01
Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete; they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound.
In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed impact summaries. The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction- and decomposition-based approaches [10, 12], as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
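As a toy illustration of the final equivalence-checking step with an off-the-shelf decision procedure, the sketch below asks Z3 whether two hand-written "versions" of a function can ever disagree (Python, assuming the z3-solver package is installed; the summaries here are written by hand, not produced by symbolic execution or the paper's tooling).

```python
# Toy equivalence check: ask an SMT solver whether two versions of a function
# can ever produce different results. Requires the z3-solver package.
from z3 import Int, If, Solver, unsat

x = Int("x")

# Version 1: original branching definition.
v1 = If(x >= 0, 2 * x, 0)
# Version 2: refactored form, intended to be equivalent.
v2 = If(x > 0, x + x, If(x == 0, 0, 0))

s = Solver()
s.add(v1 != v2)                 # search for a counterexample to equivalence
if s.check() == unsat:
    print("versions are equivalent for all inputs")
else:
    print("counterexample:", s.model())
```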
A Sub-ps Stability Time Transfer Method Based on Optical Modems.
Frank, Florian; Stefani, Fabio; Tuckey, Philip; Pottie, Paul-Eric
2018-06-01
Coherent optical fiber links have recently demonstrated their ability to compare the most advanced optical clocks on a continental scale. The outstanding performance of these optical clocks is stimulating the community to build much more stable time scales and to develop the means to compare them. Optical fiber links are one solution that needs to be explored. Here, we investigate a new method to transfer time based on optical demodulation of a phase step imprinted onto the optical carrier. We show the implementation of a proof-of-principle experiment over 86 km of urban fiber, and report transfer of a 1 pulse-per-second signal with sub-ps time-interval stability from 10 s to one day of measurement time. Prospects for future development and implementation in active telecommunication networks, regarding not only performance but also compatibility, conclude this paper.
ECC-based grouping-proof RFID for inpatient medication safety.
Lin, Qiping; Zhang, Fangguo
2012-12-01
Several papers were proposed in which symmetric cryptography was used to design RFID grouping-proof for medication safety in the Journal of Medical Systems. However, if we want to ensure privacy, authentication and protection against the tracking of RFID-tags without losing system scalability, we must design an asymmetric cryptography-based RFID. This paper will propose a new ECC-based grouping-proof for RFID. Our ECC-based grouping-proof reduces the computation of tags and prevents timeout problems from occurring in n-party grouping-proof protocol. Based on asymmetric cryptography, the proposed scheme is practical, secure and efficient for medication applications.
Wu, Hao; Wang, Ruoxu; Liu, Deming; Fu, Songnian; Zhao, Can; Wei, Huifeng; Tong, Weijun; Shum, Perry Ping; Tang, Ming
2016-04-01
We proposed and demonstrated a few-mode fiber (FMF) based optical-fiber sensor for distributed curvature measurement through quasi-single-mode Brillouin frequency shift (BFS). By central-alignment splicing FMF and single-mode fiber (SMF) with a fusion taper, a SMF-components-compatible distributed curvature sensor based on FMF is realized using the conventional Brillouin optical time-domain analysis system. The distributed BFS change induced by bending in FMF has been theoretically and experimentally investigated. The precise BFS response to the curvature along the fiber link has been calibrated. A proof-of-concept experiment is implemented to validate its effectiveness in distributed curvature measurement.
On the Connectivity and Multihop Delay of Ad Hoc Cognitive Radio Networks
2011-04-01
that we can move fast enough such that the driving time on the road is negligible. When the secondary network is instantaneously connected, there ... 1.10], and the proof of Lemma 4 is based on properties of the diameter of the finite connected components formed by communication links in an intermittently ... realization of the destination. We can see that if the secondary network is instantaneously connected (Fig. 5-(a)), the ratio decreases very fast as ...
Gómez, Angel; Seyle, D Conor; Huici, Carmen; Swann, William B
2009-12-01
Recent research has demonstrated self-verification strivings in groups, such that people strive to verify collective identities, which are personal self-views (e.g., "sensitive") associated with group membership (e.g., "women"). Such demonstrations stop short of showing that the desire for self-verification can fully transcend the self-other barrier, as in people working to verify ingroup identities (e.g., "Americans are loud") even when such identities are not self-descriptive ("I am quiet and unassuming"). Five studies focus on such ingroup verification strivings. Results indicate that people prefer to interact with individuals who verify their ingroup identities over those who enhance these identities (Experiments 1-5). Strivings for ingroup identity verification were independent of the extent to which the identities were self-descriptive but were stronger among participants who were highly invested in their ingroup identities, as reflected in high certainty of these identities (Experiments 1-4) and high identification with the group (Experiments 1-5). In addition, whereas past demonstrations of self-verification strivings have been limited to efforts to verify the content of identities (Experiments 1 to 3), the findings also show that they strive to verify the valence of their identities (i.e., the extent to which the identities are valued; Experiments 4 and 5). Self-verification strivings, rather than self-enhancement strivings, appeared to motivate participants' strivings for ingroup identity verification. Links to collective self-verification strivings and social identity theory are discussed.
Automated analysis in generic groups
NASA Astrophysics Data System (ADS)
Fagerholm, Edvard
This thesis studies automated methods for analyzing hardness assumptions in generic group models, following ideas of symbolic cryptography. We define a broad class of generic and symbolic group models for different settings, namely symmetric or asymmetric (leveled) k-linear groups, and prove "computational soundness" theorems for the symbolic models. Based on this result, we formulate a master theorem that relates the hardness of an assumption to solving problems in polynomial algebra. We systematically analyze these problems, identifying different classes of assumptions, and obtain decidability and undecidability results. Then, we develop automated procedures for verifying the conditions of our master theorems, and thus the validity of hardness assumptions in generic group models. The concrete outcome is an automated tool, the Generic Group Analyzer, which takes as input the statement of an assumption, and outputs either a proof of its generic hardness or an algebraic attack against the assumption. Structure-preserving signatures are signature schemes defined over bilinear groups in which messages, public keys and signatures are group elements, and the verification algorithm consists of evaluating "pairing-product equations". Recent work on structure-preserving signatures studies optimality of these schemes in terms of the number of group elements needed in the verification key and the signature, and the number of pairing-product equations in the verification algorithm. While the size of keys and signatures is crucial for many applications, another aspect of performance is the time it takes to verify a signature. The most expensive operation during verification is the computation of pairings. However, the concrete number of pairings is not captured by the number of pairing-product equations considered in earlier work. We consider the question of what is the minimal number of pairing computations needed to verify structure-preserving signatures. We build an automated tool to search for structure-preserving signatures matching a template. Through exhaustive search we conjecture lower bounds for the number of pairings required in the Type II setting and prove our conjecture to be true. Finally, our tool exhibits examples of structure-preserving signatures matching the lower bounds, which proves tightness of our bounds, as well as improves on previously known structure-preserving signature schemes.
On the engineering of crucial software
NASA Technical Reports Server (NTRS)
Pratt, T. W.; Knight, J. C.; Gregory, S. T.
1983-01-01
The various aspects of the conventional software development cycle are examined. This cycle was the basis of the augmented approach contained in the original grant proposal. This cycle was found inadequate for crucial software development, and the justification for this opinion is presented. Several possible enhancements to the conventional software cycle are discussed. Software fault tolerance, a possible enhancement of major importance, is discussed separately. Formal verification using mathematical proof is considered. Automatic programming is a radical alternative to the conventional cycle and is discussed. Recommendations for a comprehensive approach are presented, and various experiments which could be conducted in AIRLAB are described.
A Test Generation Framework for Distributed Fault-Tolerant Algorithms
NASA Technical Reports Server (NTRS)
Goodloe, Alwyn; Bushnell, David; Miner, Paul; Pasareanu, Corina S.
2009-01-01
Heavyweight formal methods such as theorem proving have been successfully applied to the analysis of safety critical fault-tolerant systems. Typically, the models and proofs performed during such analysis do not inform the testing process of actual implementations. We propose a framework for generating test vectors from specifications written in the Prototype Verification System (PVS). The methodology uses a translator to produce a Java prototype from a PVS specification. Symbolic (Java) PathFinder is then employed to generate a collection of test cases. A small example is employed to illustrate how the framework can be used in practice.
Tackling Africa's digital divide
NASA Astrophysics Data System (ADS)
Lavery, Martin P. J.; Abadi, Mojtaba Mansour; Bauer, Ralf; Brambilla, Gilberto; Cheng, Ling; Cox, Mitchell A.; Dudley, Angela; Ellis, Andrew D.; Fontaine, Nicolas K.; Kelly, Anthony E.; Marquardt, Christoph; Matlhane, Selaelo; Ndagano, Bienvenu; Petruccione, Francesco; Slavík, Radan; Romanato, Filippo; Rosales-Guzmán, Carmelo; Roux, Filippus S.; Roux, Kobus; Wang, Jian; Forbes, Andrew
2018-05-01
Innovations in `sustainable' photonics technologies such as free-space optical links and solar-powered equipment provide developing countries with new cost-effective opportunities for deploying future-proof telecommunication networks.
Translating expert system rules into Ada code with validation and verification
NASA Technical Reports Server (NTRS)
Becker, Lee; Duckworth, R. James; Green, Peter; Michalson, Bill; Gosselin, Dave; Nainani, Krishan; Pease, Adam
1991-01-01
The purpose of this ongoing research and development program is to develop software tools which enable the rapid development, upgrading, and maintenance of embedded real-time artificial intelligence systems. The goals of this phase of the research were to investigate the feasibility of developing software tools which automatically translate expert system rules into Ada code, and to develop methods for performing validation and verification testing of the resultant expert system. A prototype system was demonstrated which automatically translated rules from an Air Force expert system, and prototype test tools detected errors in the execution of the resultant system. The method and prototype tools for converting AI representations into Ada code, by converting the rules into Ada code modules and then linking them with an Activation Framework based run-time environment to form an executable load module, are discussed. This method is based upon the use of Evidence Flow Graphs, which are a data flow representation for intelligent systems. The development of prototype test generation and evaluation software, which was used to test the resultant code, is discussed. This testing was performed automatically using Monte Carlo techniques based upon a constraint-based description of the required performance for the system.
NASA Astrophysics Data System (ADS)
Mitryk, Shawn; Mueller, Guido
The Laser Interferometer Space Antenna (LISA) is a space-based modified Michelson interferometer designed to measure gravitational radiation in the frequency range from 30 uHz to 1 Hz. The interferometer measurement system (IMS) utilizes one-way laser phase measurements to cancel the laser phase noise, reconstruct the proof-mass motion, and extract the gravitational wave (GW) induced laser phase modulations in post-processing using a technique called time-delay interferometry (TDI). Unfortunately, there exist few hardware verification experiments of the IMS. The University of Florida LISA Interferometry Simulator (UFLIS) is designed to perform hardware-in-the-loop simulations of the LISA interferometry system, modeling the characteristics of the LISA mission as accurately as possible. This depends, first, on replicating the laser pre-stabilization by locking the laser phase to an ultra-stable Zerodur cavity length reference using the PDH locking method. Phase measurements of LISA-like photodetector beat-notes are taken using the UF-phasemeter (PM), which can measure the laser beat-note frequency to within an accuracy of 0.22 uHz. The inter-spacecraft (SC) laser links, including the time delay due to the 5 Gm light travel time along the LISA arms, the laser Doppler shifts due to differential SC motion, and the GW-induced laser phase modulations, are simulated electronically using the electronic phase delay (EPD) unit. The EPD unit replicates the laser field propagation between SC by measuring a photodetector beat-note frequency with the UF-phasemeter and storing the information in memory. After the requested delay time, the frequency information is added to a Doppler offset and a GW-like frequency modulation. The signal is then regenerated with the inter-SC laser phase effects applied. Utilizing these components, I will present the first complete TDI simulations performed using the UFLIS. The LISA model is presented alongside the simulation, comparing the generation and measurement of LISA-like signals. Phasemeter measurements are used in post-processing and combined in the linear combinations defined by TDI, thus canceling the laser phase and phase-lock loop noise to extract the applied GW modulation buried under the noise. Nine orders of magnitude of common-mode laser noise cancellation are achieved at a frequency of 1 mHz, and the GW signal is clearly visible after the laser and PLL noise cancellation.
Zhao, Hongbo; Chen, Yuying; Feng, Wenquan; Zhuang, Chen
2018-05-25
Inter-satellite links are an important component of the new generation of satellite navigation systems, characterized by low signal-to-noise ratio (SNR), complex electromagnetic interference and the short time slot of each satellite, which brings difficulties to the acquisition stage. The inter-satellite link in both Global Positioning System (GPS) and BeiDou Navigation Satellite System (BDS) adopt the long code spread spectrum system. However, long code acquisition is a difficult and time-consuming task due to the long code period. Traditional folding methods such as extended replica folding acquisition search technique (XFAST) and direct average are largely restricted because of code Doppler and additional SNR loss caused by replica folding. The dual folding method (DF-XFAST) and dual-channel method have been proposed to achieve long code acquisition in low SNR and high dynamic situations, respectively, but the former is easily affected by code Doppler and the latter is not fast enough. Considering the environment of inter-satellite links and the problems of existing algorithms, this paper proposes a new long code acquisition algorithm named dual-channel acquisition method based on the extended replica folding algorithm (DC-XFAST). This method employs dual channels for verification. Each channel contains an incoming signal block. Local code samples are folded and zero-padded to the length of the incoming signal block. After a circular FFT operation, the correlation results contain two peaks of the same magnitude and specified relative position. The detection process is eased through finding the two largest values. The verification takes all the full and partial peaks into account. Numerical results reveal that the DC-XFAST method can improve acquisition performance while acquisition speed is guaranteed. The method has a significantly higher acquisition probability than folding methods XFAST and DF-XFAST. Moreover, with the advantage of higher detection probability and lower false alarm probability, it has a lower mean acquisition time than traditional XFAST, DF-XFAST and zero-padding.
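A minimal sketch of the folding and circular-FFT correlation idea shared by these methods is given below (Python with NumPy; chip-rate sampling, no Doppler and no noise are assumed, the code length and offset are invented, and the dual-channel verification that distinguishes DC-XFAST is omitted).

```python
# Folding + circular-FFT correlation sketch in the spirit of XFAST-style long
# code acquisition. Parameters are illustrative, not those of a real GPS/BDS
# inter-satellite link signal; the dual-channel verification stage is omitted.
import numpy as np

rng = np.random.default_rng(0)
long_code = rng.choice([-1.0, 1.0], size=8192)   # stand-in for a long PN code
block_len = 1024                                 # length of the incoming signal block
true_offset = 3000                               # unknown code phase to recover

# Incoming block: a window of the long code starting at the unknown offset.
incoming = long_code[true_offset:true_offset + block_len]

# Fold the local code: stack segments of length block_len and sum them, so one
# short correlation covers every possible segment at once (at some SNR cost).
folded = long_code.reshape(-1, block_len).sum(axis=0)

# Circular correlation via FFT; the peak lands at the code phase modulo block_len.
corr = np.fft.ifft(np.fft.fft(folded) * np.conj(np.fft.fft(incoming))).real
peak = int(np.argmax(np.abs(corr)))
print("estimated phase modulo block length:", peak, "| true:", true_offset % block_len)

# Resolving which segment the peak came from (and rejecting false alarms) needs a
# further stage; DC-XFAST does this with a second channel and the known relative
# position of its two correlation peaks.
```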
Verification and validation of RADMODL Version 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kimball, K.D.
1993-03-01
RADMODL is a system of linked computer codes designed to calculate the radiation environment following an accident in which nuclear materials are released. The RADMODL code and the corresponding Verification and Validation (V&V) calculations (Appendix A) were developed for Westinghouse Savannah River Company (WSRC) by EGS Corporation (EGS). Each module of RADMODL is an independent code and was verified separately. The full system was validated by comparing the output of the various modules with the corresponding output of a previously verified version of the modules. The results of the verification and validation tests show that RADMODL correctly calculates the transport of radionuclides and radiation doses. As a result of this verification and validation effort, RADMODL Version 1.0 is certified for use in calculating the radiation environment following an accident.
Mealier, Anne-Laure; Pointeau, Gregoire; Mirliaz, Solène; Ogawa, Kenji; Finlayson, Mark; Dominey, Peter F.
2017-01-01
It has been proposed that starting from meaning that the child derives directly from shared experience with others, adult narrative enriches this meaning and its structure, providing causal links between unseen intentional states and actions. This would require a means for representing meaning from experience—a situation model—and a mechanism that allows information to be extracted from sentences and mapped onto the situation model that has been derived from experience, thus enriching that representation. We present a hypothesis and theory concerning how the language processing infrastructure for grammatical constructions can naturally be extended to narrative constructions to provide a mechanism for using language to enrich meaning derived from physical experience. Toward this aim, the grammatical construction models are augmented with additional structures for representing relations between events across sentences. Simulation results demonstrate proof of concept for how the narrative construction model supports multiple successive levels of meaning creation which allows the system to learn about the intentionality of mental states, and argument substitution which allows extensions to metaphorical language and analogical problem solving. Cross-linguistic validity of the system is demonstrated in Japanese. The narrative construction model is then integrated into the cognitive system of a humanoid robot that provides the memory systems and world-interaction required for representing meaning in a situation model. In this context proof of concept is demonstrated for how the system enriches meaning in the situation model that has been directly derived from experience. In terms of links to empirical data, the model predicts strong usage based effects: that is, that the narrative constructions used by children will be highly correlated with those that they experience. It also relies on the notion of narrative or discourse function words. Both of these are validated in the experimental literature. PMID:28861011
Model Based Verification of Cyber Range Event Environments
2015-12-10
Model Based Verification of Cyber Range Event Environments Suresh K. Damodaran MIT Lincoln Laboratory 244 Wood St., Lexington, MA, USA...apply model based verification to cyber range event environment configurations, allowing for the early detection of errors in event environment...Environment Representation (CCER) ontology. We also provide an overview of a methodology to specify verification rules and the corresponding error
Design and evaluation of a wireless sensor network based aircraft strength testing system.
Wu, Jian; Yuan, Shenfang; Zhou, Genyuan; Ji, Sai; Wang, Zilong; Wang, Yang
2009-01-01
The verification of aerospace structures, including full-scale fatigue and static test programs, is essential for structure strength design and evaluation. However, the current overall ground strength testing systems employ a large number of wires for communication among sensors and data acquisition facilities. The centralized data processing makes test programs lack efficiency and intelligence. Wireless sensor network (WSN) technology might be expected to address the limitations of cable-based aeronautical ground testing systems. This paper presents a wireless sensor network based aircraft strength testing (AST) system design and its evaluation on a real aircraft specimen. In this paper, a miniature, high-precision, and shock-proof wireless sensor node is designed for multi-channel strain gauge signal conditioning and monitoring. A cluster-star network topology protocol and application layer interface are designed in detail. To verify the functionality of the designed wireless sensor network for strength testing capability, a multi-point WSN based AST system is developed for static testing of a real aircraft undercarriage. Based on the designed wireless sensor nodes, the wireless sensor network is deployed to gather, process, and transmit strain gauge signals and monitor results under different static test loads. This paper shows the efficiency of the wireless sensor network based AST system, compared to a conventional AST system.
Chambers, Andrew G; Percy, Andrew J; Simon, Romain; Borchers, Christoph H
2014-04-01
Accurate cancer biomarkers are needed for early detection, disease classification, prediction of therapeutic response and monitoring treatment. While there appears to be no shortage of candidate biomarker proteins, a major bottleneck in the biomarker pipeline continues to be their verification by enzyme linked immunosorbent assays. Multiple reaction monitoring (MRM), also known as selected reaction monitoring, is a targeted mass spectrometry approach to protein quantitation and is emerging to bridge the gap between biomarker discovery and clinical validation. Highly multiplexed MRM assays are readily configured and enable simultaneous verification of large numbers of candidates facilitating the development of biomarker panels which can increase specificity. This review focuses on recent applications of MRM to the analysis of plasma and serum from cancer patients for biomarker verification. The current status of this approach is discussed along with future directions for targeted mass spectrometry in clinical biomarker validation.
40 CFR 1066.250 - Base inertia verification.
Code of Federal Regulations, 2013 CFR
2013-07-01
... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...
40 CFR 1066.250 - Base inertia verification.
Code of Federal Regulations, 2012 CFR
2012-07-01
... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...
40 CFR 1066.250 - Base inertia verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Base inertia verification. 1066.250... CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.250 Base inertia verification. (a) Overview. This section describes how to verify the dynamometer's base inertia. (b) Scope and frequency...
NASA Astrophysics Data System (ADS)
Karam, Walid; Mokbel, Chafic; Greige, Hanna; Chollet, Gerard
2006-05-01
A GMM based audio visual speaker verification system is described and an Active Appearance Model with a linear speaker transformation system is used to evaluate the robustness of the verification. An Active Appearance Model (AAM) is used to automatically locate and track a speaker's face in a video recording. A Gaussian Mixture Model (GMM) based classifier (BECARS) is used for face verification. GMM training and testing are accomplished on DCT based extracted features of the detected faces. On the audio side, speech features are extracted and used for speaker verification with the GMM based classifier. Fusion of both audio and video modalities for audio visual speaker verification is compared with face verification and speaker verification systems. To improve the robustness of the multimodal biometric identity verification system, an audio visual imposture system is envisioned. It consists of an automatic voice transformation technique that an impostor may use to assume the identity of an authorized client. Features of the transformed voice are then combined with the corresponding appearance features and fed into the GMM based system BECARS for training. An attempt is made to increase the acceptance rate of the impostor and to analyze the robustness of the verification system. Experiments are being conducted on the BANCA database, with a prospect of experimenting on the PDAtabase newly developed within the scope of the SecurePhone project.
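A hedged sketch of the GMM likelihood-ratio scoring such a verifier relies on, using scikit-learn's GaussianMixture as a stand-in for BECARS and random vectors in place of real DCT/speech features (all sizes and the decision threshold are illustrative):

```python
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)

# Stand-ins for extracted features (e.g., DCT coefficients of face patches or speech features)
world_feats  = rng.normal(0.0, 1.0, size=(2000, 20))  # background ("world") data
client_feats = rng.normal(0.5, 1.0, size=(400, 20))   # enrollment data for one client
test_feats   = rng.normal(0.5, 1.0, size=(100, 20))   # test segment claiming that identity

# Background model and client model
ubm    = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(world_feats)
client = GaussianMixture(n_components=8, covariance_type="diag", random_state=0).fit(client_feats)

# Verification score: average log-likelihood ratio of the test features
score = client.score(test_feats) - ubm.score(test_feats)
accept = score > 0.0        # in practice the threshold is tuned on a development set
print(f"LLR score = {score:.3f}, accept = {accept}")
```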
Szajek, Krzysztof; Wierszycki, Marcin
2016-01-01
Dental implant designing is a complex process which considers many limitations, both biological and mechanical in nature. In earlier studies, a complete procedure for improvement of a two-component dental implant was proposed. However, the optimization tasks carried out required an assumption about a representative load case, which raised doubts about optimality for the other load cases. This paper deals with verification of the optimal design in the context of fatigue life, and its main goal is to answer the question whether the assumed load scenario (solely horizontal occlusal load) leads to a design which is also "safe" for oblique occlusal loads regardless of the angle from the implant axis. The verification is carried out with a series of finite element analyses for a wide spectrum of physiologically justified loads. The design of experiments methodology with the full factorial technique is utilized. All computations are done in the Abaqus suite. The maximal Mises stress and normalized effective stress amplitude for various load cases are discussed and compared with the assumed "safe" limit (equivalent of fatigue life for 5e6 cycles). The obtained results prove that the coronal-apical load component should be taken into consideration in the two-component dental implant when fatigue life is optimized. However, its influence in the analyzed case is small and does not change the fact that the fatigue life improvement is observed for all components within the whole range of analyzed loads.
Accounting for Proof Test Data in a Reliability Based Design Optimization Framework
NASA Technical Reports Server (NTRS)
Ventor, Gerharad; Scotti, Stephen J.
2012-01-01
This paper investigates the use of proof (or acceptance) test data during the reliability based design optimization of structural components. It is assumed that every component will be proof tested and that the component will only enter into service if it passes the proof test. The goal is to reduce the component weight, while maintaining high reliability, by exploiting the proof test results during the design process. The proposed procedure results in the simultaneous design of the structural component and the proof test itself and provides the designer with direct control over the probability of failing the proof test. The procedure is illustrated using two analytical example problems and the results indicate that significant weight savings are possible when exploiting the proof test results during the design process.
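The mechanism exploited here can be made concrete with a small Monte Carlo sketch (assumed normal strength and load distributions with illustrative numbers, not the paper's examples): a component that survives the proof test is drawn from a strength distribution truncated at the proof load, so the reliability credited to surviving components rises, at the cost of scrapping part of the production.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Illustrative distributions (not from the paper)
strength = rng.normal(100.0, 10.0, n)   # component strength
load     = rng.normal(70.0, 10.0, n)    # service load, also uncertain
proof    = 90.0                         # proof load applied before entering service

# Reliability without proof testing: every manufactured component enters service
r_no_proof = np.mean(strength > load)

# With proof testing: only components whose strength exceeds the proof load enter service
survivors = strength > proof
r_after_proof = np.mean(strength[survivors] > load[survivors])

print(f"fraction scrapped by the proof test:  {1 - survivors.mean():.4f}")
print(f"reliability without proof test:       {r_no_proof:.6f}")
print(f"reliability after passing proof test: {r_after_proof:.6f}")
```

In the reliability-based design optimization setting this trade-off is what allows the component to be made lighter while the in-service reliability target is still met.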
ERIC Educational Resources Information Center
Wiese, Eliane S.; Koedinger, Kenneth R.
2017-01-01
This paper proposes "grounded feedback" as a way to provide implicit verification when students are working with a novel representation. In grounded feedback, students' responses are in the target, to-be-learned representation, and those responses are reflected in a more-accessible linked representation that is intrinsic to the domain.…
Regenerative memory in time-delayed neuromorphic photonic resonators
NASA Astrophysics Data System (ADS)
Romeira, B.; Avó, R.; Figueiredo, José M. L.; Barland, S.; Javaloyes, J.
2016-01-01
We investigate a photonic regenerative memory based upon a neuromorphic oscillator with a delayed self-feedback (autaptic) connection. We disclose the existence of a unique temporal response characteristic of localized structures enabling an ideal support for bits in an optical buffer memory for storage and reshaping of data information. We link our experimental implementation, based upon a nanoscale nonlinear resonant tunneling diode driving a laser, to the paradigm of neuronal activity, the FitzHugh-Nagumo model with delayed feedback. This proof-of-concept photonic regenerative memory might constitute a building block for a new class of neuron-inspired photonic memories that can handle high bit-rate optical signals.
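For readers unfamiliar with the underlying model, a minimal numerical sketch of a FitzHugh-Nagumo unit with delayed self-feedback follows (forward-Euler integration with a constant pre-history and illustrative parameter values, not fitted to the resonant-tunneling-diode experiment).

```python
import numpy as np

# FitzHugh-Nagumo with delayed self-feedback (illustrative parameters only):
#   dv/dt = v - v^3/3 - w + I + k * v(t - tau)
#   dw/dt = eps * (v + a - b * w)
dt, t_end = 0.01, 200.0
eps, a, b = 0.08, 0.7, 0.8
I, k, tau = 0.5, 0.3, 20.0

n = int(t_end / dt)
d = int(tau / dt)                          # delay expressed in integration steps
v, w = np.zeros(n), np.zeros(n)
v[0] = -1.0

for i in range(n - 1):
    v_delayed = v[i - d] if i >= d else v[0]   # constant history before t = tau
    dv = v[i] - v[i] ** 3 / 3.0 - w[i] + I + k * v_delayed
    dw = eps * (v[i] + a - b * w[i])
    v[i + 1] = v[i] + dt * dv
    w[i + 1] = w[i] + dt * dw

# Count upward threshold crossings of the fast variable as a crude activity measure
crossings = int(np.sum((v[:-1] < 1.0) & (v[1:] >= 1.0)))
print("threshold crossings:", crossings)
```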
Surrogate based wind farm layout optimization using manifold mapping
NASA Astrophysics Data System (ADS)
Kaja Kamaludeen, Shaafi M.; van Zuijle, Alexander; Bijl, Hester
2016-09-01
High computational cost associated with high fidelity wake models such as RANS or LES serves as a primary bottleneck to performing a direct high fidelity wind farm layout optimization (WFLO) using accurate CFD based wake models. Therefore, a surrogate based multi-fidelity WFLO methodology (SWFLO) is proposed. The surrogate model is built using an SBO method referred to as manifold mapping (MM). As a verification, optimization of the spacing between two staggered wind turbines was performed using the proposed surrogate based methodology and the performance was compared with that of direct optimization using the high fidelity model. Significant reduction in computational cost was achieved using MM: a maximum computational cost reduction of 65%, while arriving at the same optima as the direct high fidelity optimization. The similarity between the responses of the models, and the number and position of the mapping points, strongly influence the computational efficiency of the proposed method. As a proof of concept, a realistic WFLO of a small 7-turbine wind farm is performed using the proposed surrogate based methodology. Two variants of the Jensen wake model with different decay coefficients were used as the fine and coarse models. The proposed SWFLO method arrived at the same optima as the fine model with far fewer fine model simulations.
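To indicate why two decay coefficients can play the role of coarse and fine models, here is a minimal sketch of the Jensen (top-hat) wake deficit commonly used in such layout studies (turbine radius, thrust coefficient, spacings, and the two decay coefficients are illustrative, not the paper's values).

```python
import numpy as np

def jensen_deficit(x, ct=0.8, r0=40.0, k=0.05):
    """Fractional velocity deficit a distance x downstream of a turbine (Jensen wake model)."""
    return (1.0 - np.sqrt(1.0 - ct)) / (1.0 + k * x / r0) ** 2

x = np.array([200.0, 400.0, 800.0])      # downstream spacings in metres
for k in (0.04, 0.075):                  # two decay coefficients = two model fidelities
    u_ratio = 1.0 - jensen_deficit(x, k=k)
    print(f"k = {k}: downstream wind-speed fraction = {np.round(u_ratio, 3)}")
```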
Stepwise construction of a metabolic network in Event-B: The heat shock response.
Sanwal, Usman; Petre, Luigia; Petre, Ion
2017-12-01
There is a high interest in constructing large, detailed computational models for biological processes. This is often done by putting together existing submodels and adding to them extra details/knowledge. The result of such approaches is usually a model that can only answer questions on a very specific level of detail, and thus, ultimately, is of limited use. We focus instead on an approach to systematically add details to a model, with formal verification of its consistency at each step. In this way, one obtains a set of reusable models, at different levels of abstraction, to be used for different purposes depending on the question to address. We demonstrate this approach using Event-B, a computational framework introduced to develop formal specifications of distributed software systems. We first describe how to model generic metabolic networks in Event-B. Then, we apply this method for modeling the biological heat shock response in eukaryotic cells, using Event-B refinement techniques. The advantage of using Event-B consists in having refinement as an intrinsic feature; this yields as a final result not only a correct model, but a chain of models automatically linked by refinement, each of which is provably correct and reusable. This is a proof of concept that refinement in Event-B is suitable for biomodeling, helping to master biological complexity. Copyright © 2017 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Karabat, Cagatay; Kiraz, Mehmet Sabir; Erdogan, Hakan; Savas, Erkay
2015-12-01
In this paper, we introduce a new biometric verification and template protection system which we call THRIVE. The system includes novel enrollment and authentication protocols based on threshold homomorphic encryption where a private key is shared between a user and a verifier. In the THRIVE system, only encrypted binary biometric templates are stored in a database and verification is performed via homomorphically randomized templates, thus, original templates are never revealed during authentication. Due to the underlying threshold homomorphic encryption scheme, a malicious database owner cannot perform full decryption on encrypted templates of the users in the database. In addition, security of the THRIVE system is enhanced using a two-factor authentication scheme involving user's private key and biometric data. Using simulation-based techniques, the proposed system is proven secure in the malicious model. The proposed system is suitable for applications where the user does not want to reveal her biometrics to the verifier in plain form, but needs to prove her identity by using biometrics. The system can be used with any biometric modality where a feature extraction method yields a fixed size binary template and a query template is verified when its Hamming distance to the database template is less than a threshold. The overall connection time for the proposed THRIVE system is estimated to be 336 ms on average for 256-bit biometric templates on a desktop PC running with quad core 3.2 GHz CPUs at 10 Mbit/s up/down link connection speed. Consequently, the proposed system can be efficiently used in real-life applications.
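The acceptance rule stated at the end (accept a query when its Hamming distance to the enrolled binary template is below a threshold) is simple to sketch in the clear; in THRIVE this comparison is carried out on homomorphically randomized templates, which the plain-text toy below deliberately omits (template length, noise rate, and threshold are illustrative).

```python
import numpy as np

rng = np.random.default_rng(0)

template_bits = 256
threshold = 64                                   # illustrative; tuned from enrollment data in practice

enrolled = rng.integers(0, 2, template_bits)     # stored binary template
noise = (rng.random(template_bits) < 0.10).astype(int)
genuine  = (enrolled + noise) % 2                # same user, ~10% bit errors
impostor = rng.integers(0, 2, template_bits)     # unrelated template

def verify(query, template, threshold):
    """Accept when the Hamming distance to the enrolled template is below the threshold."""
    return int(np.sum(query != template)) < threshold

print("genuine accepted: ", verify(genuine, enrolled, threshold))
print("impostor accepted:", verify(impostor, enrolled, threshold))
```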
NASA Astrophysics Data System (ADS)
Cai, Xiushan; Meng, Lingxin; Zhang, Wei; Liu, Leipo
2018-03-01
We establish robustness of the predictor feedback control law to perturbations appearing at the system input for affine nonlinear systems with time-varying input delay and additive disturbances. Furthermore, it is shown that it is inverse optimal with respect to a differential game problem. All of the stability and inverse optimality proofs are based on the infinite-dimensional backstepping transformation and an appropriate Lyapunov functional. A single-link manipulator subject to input delays and disturbances is given to illustrate the validity of the proposed method.
Applications of Machine Learning to Downscaling and Verification
NASA Astrophysics Data System (ADS)
Prudden, R.
2017-12-01
Downscaling, sometimes known as super-resolution, means converting model data into a more detailed local forecast. It is a problem which could be highly amenable to machine learning approaches, provided that sufficient historical forecast data and observations are available. It is also closely linked to the subject of verification, since improving a forecast requires a way to measure that improvement. This talk will describe some early work towards downscaling Met Office ensemble forecasts, and discuss how the output may be usefully evaluated.
Specification and verification of gate-level VHDL models of synchronous and asynchronous circuits
NASA Technical Reports Server (NTRS)
Russinoff, David M.
1995-01-01
We present a mathematical definition of hardware description language (HDL) that admits a semantics-preserving translation to a subset of VHDL. Our HDL includes the basic VHDL propagation delay mechanisms and gate-level circuit descriptions. We also develop formal procedures for deriving and verifying concise behavioral specifications of combinational and sequential devices. The HDL and the specification procedures have been formally encoded in the computational logic of Boyer and Moore, which provides a LISP implementation as well as a facility for mechanical proof-checking. As an application, we design, specify, and verify a circuit that achieves asynchronous communication by means of the biphase mark protocol.
Experimental verification of rank 1 chaos in switch-controlled Chua circuit.
Oksasoglu, Ali; Ozoguz, Serdar; Demirkol, Ahmet S; Akgul, Tayfun; Wang, Qiudong
2009-03-01
In this paper, we provide the first experimental proof for the existence of rank 1 chaos in the switch-controlled Chua circuit by following a step-by-step procedure given by the theory of rank 1 maps. At the center of this procedure is a periodically kicked limit cycle obtained from the unforced system. Then, this limit cycle is subjected to periodic kicks by adding externally controlled switches to the original circuit. Both the smooth nonlinearity and the piecewise linear cases are considered in this experimental investigation. Experimental results are found to be in concordance with the conclusions of the theory.
Rui, Jia-bai; Zheng, Chuan-xian; Zeng, Qing-tang
2002-12-01
Objective. To test and demonstrate an embryonic form of our future space station ECLSS, which will also form an advanced research and test ground facility. Method. The following functions of the system were tested and demonstrated: integrated solid amine CO2 collection and concentration, Sabatier CO2 reduction, urine processing by thermoelectric integrated membrane evaporation, solid polymer water electrolysis O2 generation, concentrated ventilation, temperature and humidity control, the measurement and control system, and other non-regenerative techniques. All of these were demonstrated in a sealed adiabatic module and passed the proof-tests. Result. The principal technical requirements of the system and of each regenerative subsystem were met. The integration of the overall system and each subsystem was successful, and a partially closed loop of the integrated system was essentially realized. Conclusion. The reasonableness of the project design was verified, and the major system technical requirements were satisfied. The suitability and harmonization between the overall system and each subsystem were good, the system operated normally, and the parameters measured were correct.
NASA Astrophysics Data System (ADS)
Gao, Dongyang; Zheng, Xiaobing; Li, Jianjun; Hu, Youbo; Xia, Maopeng; Salam, Abdul; Zhang, Peng
2018-03-01
Based on the spontaneous parametric downconversion process, we propose a novel self-calibration radiometer scheme which can self-calibrate the degradation of its own response and ultimately monitor the fluctuation of a target radiation. The monitoring results are independent of the radiometer's own degradation and are not tied to the primary standard detector scale. The principle and feasibility of the proposed scheme were verified by observing a bromine-tungsten lamp. A relative standard deviation of 0.39% was obtained for the stable bromine-tungsten lamp. The results show that the principle of the proposed scheme is sound. The proposed scheme could make a significant breakthrough in the self-calibration issue on space platforms.
Research on key technology of the verification system of steel rule based on vision measurement
NASA Astrophysics Data System (ADS)
Jia, Siyuan; Wang, Zhong; Liu, Changjie; Fu, Luhua; Li, Yiming; Lu, Ruijun
2018-01-01
The steel rule plays an important role in quantity transmission. However, the traditional verification method for steel rules, based on manual operation and reading, results in low precision and low efficiency. A machine vision based verification system for steel rules is designed with reference to JJG1-1999, Verification Regulation of Steel Rule [1]. What differentiates this system is that it uses a new calibration method for the pixel equivalent and decontaminates the surface of the steel rule. Experiments show that these two methods fully meet the requirements of the verification system. Measurement results strongly indicate that these methods not only meet the precision requirements of the verification regulation, but also improve the reliability and efficiency of the verification system.
Systematic Model-in-the-Loop Test of Embedded Control Systems
NASA Astrophysics Data System (ADS)
Krupp, Alexander; Müller, Wolfgang
Current model-based development processes offer new opportunities for verification automation, e.g., in automotive development. The duty of functional verification is the detection of design flaws. Current functional verification approaches exhibit a major gap between requirement definition and formal property definition, especially when analog signals are involved. Besides the lack of methodical support for natural language formalization, there does not exist a standardized and accepted means for formal property definition as a target for verification planning. This article addresses several shortcomings of embedded system verification. An Enhanced Classification Tree Method is developed based on the established Classification Tree Method for Embedded Systems (CTM/ES), which applies a hardware verification language to define a verification environment.
Guidelines for Proof Test Analysis
NASA Technical Reports Server (NTRS)
Chell, G. G.; McClung, R. C.; Kuhlman, C. J.; Russell, D. A.; Garr, K.; Donnelly, B.
1999-01-01
These guidelines integrate state-of-the-art elastic-plastic fracture mechanics (EPFM) and proof test implementation issues into a comprehensive proof test analysis procedure in the form of a road map which identifies the types of data, fracture mechanics based parameters, and calculations needed to perform flaw screening and minimum proof load analyses of fracture critical components. Worked examples are presented to illustrate the application of the road map to proof test analysis. The state-of-the art fracture technology employed in these guidelines is based on the EPFM parameter, J, and a pictorial representation of a J fracture analysis, called the failure assessment diagram (FAD) approach. The recommended fracture technology is validated using finite element J results, and laboratory and hardware fracture test results on the nickel-based superalloy Inconel 718, the aluminum alloy 2024-T3511, and ferritic pressure vessel steels. In all cases the laboratory specimens and hardware failed by ductile mechanisms. Advanced proof test analyses involving probability analysis and multiple-cycle proof testing (MCPT) are addressed. Finally, recommendations are provided on how to account for the effects of the proof test overload on subsequent service fatigue and fracture behaviors.
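To make the FAD idea concrete: an assessment point with coordinates Kr (proximity to brittle fracture) and Lr (proximity to plastic collapse) is acceptable if it lies inside the failure assessment curve. The sketch below uses the widely quoted R6 Option 1 curve as a generic stand-in for the J-based curve developed in the guidelines, and all input numbers are illustrative rather than taken from the worked examples.

```python
import math

def fad_curve(lr):
    """R6 Option 1 failure assessment curve, used here as a generic stand-in."""
    return (1.0 - 0.14 * lr ** 2) * (0.3 + 0.7 * math.exp(-0.65 * lr ** 6))

# Illustrative assessment point (not from the guidelines' worked examples)
K_applied, K_mat = 40.0, 110.0       # applied stress intensity factor vs toughness, MPa*sqrt(m)
sigma_ref, sigma_y = 180.0, 350.0    # reference stress vs yield strength, MPa

Kr = K_applied / K_mat               # brittle-fracture ratio
Lr = sigma_ref / sigma_y             # plastic-collapse ratio

acceptable = Kr <= fad_curve(Lr)
print(f"Kr = {Kr:.3f}, Lr = {Lr:.3f}, curve = {fad_curve(Lr):.3f}, acceptable = {acceptable}")
```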
Towards a semantic PACS: Using Semantic Web technology to represent imaging data.
Van Soest, Johan; Lustberg, Tim; Grittner, Detlef; Marshall, M Scott; Persoon, Lucas; Nijsten, Bas; Feltens, Peter; Dekker, Andre
2014-01-01
The DICOM standard is ubiquitous within medicine. However, improved DICOM semantics would significantly enhance search operations. Furthermore, the databases of current PACS systems are not flexible enough for the demands of image analysis research. In this paper, we investigated whether Semantic Web technology can be used to store and represent metadata of DICOM image files, as well as to link additional computational results to image metadata. Therefore, we developed a proof of concept containing two applications: one to store commonly used DICOM metadata in an RDF repository, and one to calculate imaging biomarkers based on DICOM images and store the biomarker values in an RDF repository. This enabled us to search for all patients with a gross tumor volume calculated to be larger than 50 cc. We have shown that we can successfully store the DICOM metadata in an RDF repository and are refining our proof of concept with regard to volume naming, value representation, and the applications themselves.
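A minimal sketch of the kind of query this proof of concept enables, written with rdflib and a made-up namespace and predicate (the authors' actual ontology and repository are not specified in the abstract):

```python
from rdflib import Graph, Literal, Namespace
from rdflib.namespace import XSD

EX = Namespace("http://example.org/pacs/")   # hypothetical namespace, not the authors' ontology
g = Graph()

# Link a computed imaging biomarker (gross tumor volume, in cc) to each patient
for pid, gtv in [("patient1", 72.5), ("patient2", 31.0)]:
    g.add((EX[pid], EX.grossTumorVolume, Literal(gtv, datatype=XSD.double)))

# Find all patients whose computed GTV exceeds 50 cc
query = """
PREFIX ex: <http://example.org/pacs/>
SELECT ?patient ?gtv WHERE {
  ?patient ex:grossTumorVolume ?gtv .
  FILTER(?gtv > 50)
}
"""
for row in g.query(query):
    print(row.patient, float(row.gtv))
```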
Hylemetry versus Biometry: a new method to certificate the lithography authenticity
NASA Astrophysics Data System (ADS)
Schirripa Spagnolo, Giuseppe; Cozzella, Lorenzo; Simonetti, Carla
2011-06-01
When we buy an artwork, a certificate of authenticity contains specific details about the artwork. Unfortunately, these certificates are often exchanged between similar artworks: the same document is supplied by the seller to certify originality. In this way the buyer will have a copy of an original certificate attesting that a non-original artwork is an original one. A solution to this problem would be a system that links together the certificate and a specific artwork. To do this it is necessary, for a single artwork, to find unique, unrepeatable, and unchangeable characteristics. In this paper we propose a new lithography certification based on the distribution of the color spots which compose the lithography itself. Thanks to the high-resolution acquisition media available today, it is possible to use analysis methods typical of speckle metrology. In particular, in the verification phase it is only necessary to acquire the same portion of the lithography, extract the verification information, use the private key to obtain the corresponding information from the certificate, and compare the two pieces of information using a comparison threshold. To handle possible rotation and translation, image correlation techniques used in speckle metrology are applied to determine and correct the translation and rotation errors, allowing the extracted and acquired images to be compared under the best conditions and granting a correct originality verification.
Boudot's Range-Bounded Commitment Scheme Revisited
NASA Astrophysics Data System (ADS)
Cao, Zhengjun; Liu, Lihua
Checking whether a committed integer lies in a specific interval has many cryptographic applications. In Eurocrypt'98, Chan et al. proposed an instantiation (CFT Proof). Based on CFT, Boudot presented a popular range-bounded commitment scheme in Eurocrypt'2000. Both CFT Proof and Boudot Proof are based on the encryption E(x, r)=g^xh^r mod n, where n is an RSA modulus whose factorization is unknown by the prover. They did not use a single base as usual. Thus an increase in cost occurs. In this paper, we show that it suffices to adopt a single base. The cost of the modified Boudot Proof is about half of that of the original scheme. Moreover, the key restriction in the original scheme, i.e., both the discrete logarithm of g in base h and the discrete logarithm of h in base g are unknown by the prover, which is a potential menace to the Boudot Proof, is definitely removed.
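A toy sketch of the commitment under discussion, E(x, r) = g^x · h^r mod n (the parameters below are far too small to be secure; a real instantiation needs a large RSA modulus whose factorization, and the mutual discrete logarithms of g and h, are unknown to the prover):

```python
import secrets

# Toy parameters, illustration only; real schemes use a large RSA modulus of unknown factorization
n = 104729 * 104723          # product of two primes, far too small for any real use
g, h = 7, 11

def commit(x, r):
    """Commitment E(x, r) = g^x * h^r mod n, as in the CFT/Boudot constructions."""
    return (pow(g, x, n) * pow(h, r, n)) % n

x = 42                       # the committed integer (e.g., claimed to lie in some interval)
r = secrets.randbelow(n)     # blinding randomness
c = commit(x, r)
print("commitment:", c)
# Opening: the prover reveals (x, r) and the verifier recomputes and compares
print("opens correctly:", c == commit(x, r))
```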
The transition to formal thinking in mathematics
NASA Astrophysics Data System (ADS)
Tall, David
2008-09-01
This paper focuses on the changes in thinking involved in the transition from school mathematics to formal proof in pure mathematics at university. School mathematics is seen as a combination of visual representations, including geometry and graphs, together with symbolic calculations and manipulations. Pure mathematics in university shifts towards a formal framework of axiomatic systems and mathematical proof. In this paper, the transition in thinking is formulated within a framework of `three worlds of mathematics'- the `conceptual-embodied' world based on perception, action and thought experiment, the `proceptual-symbolic' world of calculation and algebraic manipulation compressing processes such as counting into concepts such as number, and the `axiomatic-formal' world of set-theoretic concept definitions and mathematical proof. Each `world' has its own sequence of development and its own forms of proof that may be blended together to give a rich variety of ways of thinking mathematically. This reveals mathematical thinking as a blend of differing knowledge structures; for instance, the real numbers blend together the embodied number line, symbolic decimal arithmetic and the formal theory of a complete ordered field. Theoretical constructs are introduced to describe how genetic structures set before birth enable the development of mathematical thinking, and how experiences that the individual has met before affect their personal growth. These constructs are used to consider how students negotiate the transition from school to university mathematics as embodiment and symbolism are blended with formalism. At a higher level, structure theorems proved in axiomatic theories link back to more sophisticated forms of embodiment and symbolism, revealing the intimate relationship between the three worlds.
A small chance of paradise —Equivalence of balanced states
NASA Astrophysics Data System (ADS)
Krawczyk, M. J.; Kaluzny, S.; Kulakowski, K.
2017-06-01
A social network is modeled by a complete graph of N nodes, with interpersonal relations represented by links. In the framework of the Heider balance theory, we prove numerically that the probability of each balanced state is the same. This means, in particular, that the probability of the paradise state, where all relations are positive, is 2^(1-N). The proof is performed within two models. In the first, relations change continuously in time, and the proof is performed only for N = 3 with the methods of nonlinear dynamics. The second model is the Constrained Triad Dynamics, as introduced by Antal, Krapivsky and Redner in 2005. In the latter case, the proof makes use of the symmetries of the network of system states and it is completed for 3 ≤ N ≤ 7.
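The stated probability can be checked by brute force for small N: enumerating all sign assignments of the complete graph and keeping those in which every triangle is balanced gives 2^(N-1) balanced states, so a uniform distribution over balanced states (which is what the paper establishes for its dynamics) assigns each of them, including paradise, probability 2^(1-N). A small enumeration sketch:

```python
from itertools import combinations, product

def count_balanced_states(n):
    """Count sign assignments of the complete graph K_n in which every triangle is balanced."""
    edges = list(combinations(range(n), 2))
    triangles = list(combinations(range(n), 3))
    count = 0
    for signs in product((+1, -1), repeat=len(edges)):
        s = dict(zip(edges, signs))
        if all(s[(i, j)] * s[(i, k)] * s[(j, k)] == 1 for i, j, k in triangles):
            count += 1
    return count

for n in range(3, 7):
    b = count_balanced_states(n)          # equals 2**(n - 1)
    print(f"N={n}: {b} balanced states, so each one has probability 1/{b} = 2^(1-{n})")
```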
NASA Astrophysics Data System (ADS)
Meng, Bowen; Xing, Lei; Han, Bin; Koong, Albert; Chang, Daniel; Cheng, Jason; Li, Ruijiang
2013-11-01
Non-coplanar beams are important for treatment of both cranial and noncranial tumors. Treatment verification of such beams with couch rotation/kicks, however, is challenging, particularly for the application of cone beam CT (CBCT). In this situation, only limited and unconventional imaging angles are feasible to avoid collision between the gantry, couch, patient, and on-board imaging system. The purpose of this work is to develop a CBCT verification strategy for patients undergoing non-coplanar radiation therapy. We propose an image reconstruction scheme that integrates a prior image constrained compressed sensing (PICCS) technique with image registration. Planning CT or CBCT acquired at the neutral position is rotated and translated according to the nominal couch rotation/translation to serve as the initial prior image. Here, the nominal couch movement is chosen to have a rotational error of 5° and translational error of 8 mm from the ground truth in one or more axes or directions. The proposed reconstruction scheme alternates between two major steps. First, an image is reconstructed using the PICCS technique implemented with total-variation minimization and simultaneous algebraic reconstruction. Second, the rotational/translational setup errors are corrected and the prior image is updated by applying rigid image registration between the reconstructed image and the previous prior image. The PICCS algorithm and rigid image registration are alternated iteratively until the registration results fall below a predetermined threshold. The proposed reconstruction algorithm is evaluated with an anthropomorphic digital phantom and a physical head phantom. The proposed algorithm provides useful volumetric images for patient setup using projections with an angular range as small as 60°. It reduced the translational setup errors from 8 mm to generally <1 mm and the rotational setup errors from 5° to <1°. Compared with the PICCS algorithm alone, the integration of rigid registration significantly improved the reconstructed image quality, with a reduction of typically 2-3 fold (up to 100-fold) in root-mean-square image error. The proposed algorithm provides a remedy for solving the problem of non-coplanar CBCT reconstruction from a limited angle of projections by combining the PICCS technique and rigid image registration in an iterative framework. In this proof of concept study, non-coplanar beams with couch rotations of 45° can be effectively verified with the CBCT technique.
Down-conversion IM-DD RF photonic link utilizing MQW MZ modulator.
Xu, Longtao; Jin, Shilei; Li, Yifei
2016-04-18
We present the first down-conversion intensity modulated-direct detection (IM-DD) RF photonic link that achieves frequency down-conversion using the nonlinear optical phase modulation inside a Mach-Zehnder (MZ) modulator. The nonlinear phase modulation is very sensitive and it can enable high RF-to-IF conversion efficiency. Furthermore, the link linearity is enhanced by canceling the nonlinear distortions from the nonlinear phase modulation and the MZ interferometer. Proof-of-concept measurement was performed. The down-conversion IM-DD link demonstrated 28dB improvement in distortion levels over that of a conventional IM-DD link using a LiNbO3 MZ modulator.
Kendig, Catherine Elizabeth
2016-06-01
"Proof of concept" is a phrase frequently used in descriptions of research sought in program announcements, in experimental studies, and in the marketing of new technologies. It is often coupled with either a short definition or none at all, its meaning assumed to be fully understood. This is problematic. As a phrase with potential implications for research and technology, its assumed meaning requires some analysis to avoid it becoming a descriptive category that refers to all things scientifically exciting. I provide a short analysis of proof of concept research and offer an example of it within synthetic biology. I suggest that not only are there activities that circumscribe new epistemological categories but there are also associated normative ethical categories or principles linked to the research. I examine these and provide an outline for an alternative ethical account to describe these activities that I refer to as "extended agency ethics". This view is used to explain how the type of research described as proof of concept also provides an attendant proof of principle that is the result of decision-making that extends across practitioners, their tools, techniques, and the problem solving activities of other research groups.
Pyles, Lee; Hemmati, Pouya; Pan, J; Yu, Xiaoju; Liu, Ke; Wang, Jing; Tsakistos, Andreas; Zheleva, Bistra; Shao, Weiguang; Ni, Quan
2017-04-01
A system for collection, distribution, and long-distance, asynchronous interpretation of cardiac auscultation has been developed and field-tested in rural China. We initiated a proof-of-concept test as a critical component of design of a system to allow rural physicians with little experience in evaluation of congenital heart disease (CHD) to obtain assistance in diagnosis and management of children with significant heart disease. The project tested the hypothesis that acceptable screening of heart murmurs could be accomplished using a digital stethoscope and internet cloud transmittal to deliver phonocardiograms to an experienced observer. Of the 7993 children who underwent school-based screening in the Menghai District of Yunnan Province, People's Republic of China, 149 had a murmur noted by a screener. They had digital heart sounds and phonocardiograms collected with the HeartLink tele-auscultation system, and underwent echocardiography by a cardiology resident from the First Affiliated Hospital of Kunming Medical University. The digital phonocardiograms, stored on a cloud server, were later remotely reviewed by a board-certified American pediatric cardiologist. Fourteen of these subjects were found to have CHD confirmed by echocardiogram. Using the HeartLink system, the pediatric cardiologist identified 11 of the 14 subjects with pathological murmurs, and missed three subjects with atrial septal defects, which were incorrectly identified as venous hum or Still's murmur. In addition, ten subjects were recorded as having pathological murmurs, when no CHD was confirmed by echocardiography during the field study. The overall test accuracy was 91% with 78.5% sensitivity and 92.6% specificity. This proof-of-concept study demonstrated the feasibility of differentiating pathologic murmurs due to CHD from normal functional heart murmurs with the HeartLink system. This field study is an initial step to develop a cost-effective CHD screening strategy in low-resource settings with a shortage of trained medical professionals and pediatric heart programs.
Modifying the ECC-based grouping-proof RFID system to increase inpatient medication safety.
Ko, Wen-Tsai; Chiou, Shin-Yan; Lu, Erl-Huei; Chang, Henry Ker-Chang
2014-09-01
RFID technology is increasingly used in applications that require tracking, identification, and authentication. It attaches RFID-readable tags to objects for identification and execution of specific RFID-enabled applications. Recently, research has focused on the use of grouping-proofs for preserving privacy in RFID applications, wherein a proof of two or more tags must be simultaneously scanned. In 2010, a privacy-preserving grouping proof protocol for RFID based on ECC in public-key cryptosystem was proposed but was shown to be vulnerable to tracking attacks. A proposed enhancement protocol was also shown to have defects which prevented proper execution. In 2012, Lin et al. proposed a more efficient RFID ECC-based grouping proof protocol to promote inpatient medication safety. However, we found this protocol is also vulnerable to tracking and impersonation attacks. We then propose a secure privacy-preserving RFID grouping proof protocol for inpatient medication safety and demonstrate its resistance to such attacks.
Figueiro, Ana Claudia; de Araújo Oliveira, Sydia Rosana; Hartz, Zulmira; Couturier, Yves; Bernier, Jocelyne; do Socorro Machado Freire, Maria; Samico, Isabella; Medina, Maria Guadalupe; de Sa, Ronice Franco; Potvin, Louise
2017-03-01
Public health interventions are increasingly represented as complex systems. Research tools for capturing the dynamics of intervention processes, however, are practically non-existent. This paper describes the development and proof-of-concept process of an analytical tool, the critical event card (CEC), which supports the representation and analysis of complex interventions' evolution, based on critical events. Drawing on actor-network theory (ANT), we developed and field-tested the tool using three innovative health interventions in northeastern Brazil. The interventions aimed to promote health equity through intersectoral approaches, were engaged in participatory evaluation, and were linked to professional training programs. The development of the CEC involved practitioners and researchers from the projects. Proof of concept was based on document analysis, face-to-face interviews and focus groups. The analytical categories of the CEC allow critical events to be identified and described as milestones in the evolution of complex interventions. The categories are (1) event description; (2) actants (human and non-human) involved; (3) interactions between actants; (4) mediations performed; (5) actions performed; (6) inscriptions produced; and (7) consequences for the interventions. The CEC provides a tool to analyze and represent the complex and dynamic evolution of intersectoral interventions.
Identification of host response signatures of infection.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Branda, Steven S.; Sinha, Anupama; Bent, Zachary
2013-02-01
Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficient and reliable distinguishing between infected vs healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged, engineered), and readily-accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones comprised of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sophisticated sample preparation pipelines, and developed new SGS data analysis tools and strategies, in order to pioneer use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). This work provides a strong foundation for large-scale, highly-efficient efforts to identify and verify infection-specific host NA signatures in human populations.
NASA Astrophysics Data System (ADS)
Raaymakers, B. W.; Jürgenliemk-Schulz, I. M.; Bol, G. H.; Glitzner, M.; Kotte, A. N. T. J.; van Asselen, B.; de Boer, J. C. J.; Bluemink, J. J.; Hackett, S. L.; Moerland, M. A.; Woodings, S. J.; Wolthaus, J. W. H.; van Zijp, H. M.; Philippens, M. E. P.; Tijssen, R.; Kok, J. G. M.; de Groot-van Breugel, E. N.; Kiekebosch, I.; Meijers, L. T. C.; Nomden, C. N.; Sikkes, G. G.; Doornaert, P. A. H.; Eppinga, W. S. C.; Kasperts, N.; Kerkmeijer, L. G. W.; Tersteeg, J. H. A.; Brown, K. J.; Pais, B.; Woodhead, P.; Lagendijk, J. J. W.
2017-12-01
The integration of 1.5 T MRI functionality with a radiotherapy linear accelerator (linac) has been pursued since 1999 by the UMC Utrecht in close collaboration with Elekta and Philips. The idea behind this integrated device is to offer unrivalled, online and real-time, soft-tissue visualization of the tumour and the surroundings for more precise radiation delivery. The proof of concept of this device was given in 2009 by demonstrating simultaneous irradiation and MR imaging on phantoms, since then the device has been further developed and commercialized by Elekta. The aim of this work is to demonstrate the clinical feasibility of online, high-precision, high-field MRI guidance of radiotherapy using the first clinical prototype MRI-Linac. Four patients with lumbar spine bone metastases were treated with a 3 or 5 beam step-and-shoot IMRT plan. The IMRT plan was created while the patient was on the treatment table and based on the online 1.5 T MR images; pre-treatment CT was deformably registered to the online MRI to obtain Hounsfield values. Bone metastases were chosen as the first site as these tumors can be clearly visualized on MRI and the surrounding spine bone can be detected on the integrated portal imager. This way the portal images served as an independent verification of the MRI based guidance to quantify the geometric precision of radiation delivery. Dosimetric accuracy was assessed post-treatment from phantom measurements with an ionization chamber and film. Absolute doses were found to be highly accurate, with deviations ranging from 0.0% to 1.7% in the isocenter. The geometrical, MRI based targeting as confirmed using portal images was better than 0.5 mm, ranging from 0.2 mm to 0.4 mm. In conclusion, high precision, high-field, 1.5 T MRI guided radiotherapy is clinically feasible.
New Proofs of Some q-Summation and q-Transformation Formulas
Liu, Xian-Fang; Bi, Ya-Qing; Luo, Qiu-Ming
2014-01-01
We obtain an expectation formula and give probabilistic proofs of some summation and transformation formulas of q-series based on our expectation formula. Although these formulas are not in themselves probabilistic results, the proofs given are based on probabilistic concepts. PMID:24895675
NASA Contractor Report: Guidelines for Proof Test Analysis
NASA Technical Reports Server (NTRS)
Chell, G. G.; McClung, R. C.; Kuhlman, C. J.; Russell, D. A.; Garr, K.; Donnelly, B.
1997-01-01
These Guidelines integrate state-of-the-art Elastic-Plastic Fracture Mechanics (EPFM) and proof test implementation issues into a comprehensive proof test analysis procedure in the form of a Road Map which identifies the types of data, fracture mechanics based parameters, and calculations needed to perform flaw screening and minimum proof load analyses of fracture critical components. Worked examples are presented to illustrate the application of the Road Map to proof test analysis. The state-of-the-art fracture technology employed in these Guidelines is based on the EPFM parameter, J, and a pictorial representation of a J fracture analysis, called the Failure Assessment Diagram (FAD) approach. The recommended fracture technology is validated using finite element J results, and laboratory and hardware fracture test results on the nickel-based superalloy IN-718, the aluminum alloy 2024-T3511, and ferritic pressure vessel steels. In all cases the laboratory specimens and hardware failed by ductile mechanisms. Advanced proof test analyses involving probability analysis and Multiple Cycle Proof Testing (MCPT) are addressed. Finally, recommendations are provided on how to account for the effects of the proof test overload on subsequent service fatigue and fracture behaviors.
High performance data transfer
NASA Astrophysics Data System (ADS)
Cottrell, R.; Fang, C.; Hanushevsky, A.; Kreuger, W.; Yang, W.
2017-10-01
The exponentially increasing need for high speed data transfer is driven by big data and cloud computing, together with the needs of data intensive science, High Performance Computing (HPC), defense, the oil and gas industry, etc. We report on the Zettar ZX software. This has been developed since 2013 to meet these growing needs by providing high performance data transfer and encryption in a scalable, balanced, easy to deploy and use way while minimizing power and space utilization. In collaboration with several commercial vendors, Proofs of Concept (PoC) consisting of clusters have been put together using off-the-shelf components to test the ZX scalability and ability to balance services using multiple cores and links. The PoCs are based on SSD flash storage that is managed by a parallel file system. Each cluster occupies 4 rack units. Using the PoCs, between clusters we have achieved almost 200Gbps memory to memory over two 100Gbps links, and 70Gbps parallel file to parallel file with encryption over a 5000 mile 100Gbps link.
On Mathematicians' Proof Skimming: A Reply to Inglis and Alcock
ERIC Educational Resources Information Center
Weber, Keith; Mejia-Ramos, Juan Pablo
2013-01-01
In a recent article, Inglis and Alcock (2012) contended that their data challenge the claim that when mathematicians validate proofs, they initially skim a proof to grasp its main idea before reading individual parts of the proof more carefully. This result is based on the fact that when mathematicians read proofs in their study, on average their…
Rethinking the Discovery Function of Proof within the Context of Proofs and Refutations
ERIC Educational Resources Information Center
Komatsu, Kotaro; Tsujiyama, Yosuke; Sakamaki, Aruta
2014-01-01
Proof and proving are important components of school mathematics and have multiple functions in mathematical practice. Among these functions of proof, this paper focuses on the discovery function that refers to invention of a new statement or conjecture by reflecting on or utilizing a constructed proof. Based on two cases in which eighth and ninth…
A UAV-based gas sensing system for detecting fugitive methane emissions
NASA Astrophysics Data System (ADS)
Hugenholtz, C.; Barchyn, T.; Myshak, S.; Bauer, J.
2016-12-01
Methane is one of the most prevalent greenhouse gases emitted by human activities and is a major component of government-led initiatives to reduce GHG emissions in Canada, the USA, and elsewhere. In light of growing demand for measurements and verification of atmospheric methane concentration across the oil and gas supply chain, an autonomous airborne gas sensing system was developed that combines a small UAV and a lightweight gas monitor. This paper outlines the technology, analytics, and presents data from a case study to demonstrate the proof of concept. The UAV is a fixed-wing (2.2 m wingspan), battery-operated platform, with a flight endurance of 80-120 minutes. The gas sensor onboard the UAV is a tunable diode laser absorption spectrometer that uses an integrated transmitter/receiver unit and a remote, passive retro-reflector. The transmitter is attached to one of the winglets, while the other is coated with reflective material. The total weight of the UAV and gas sensor is 4.3 kg. During flight, the system operates autonomously, acquiring averages of raw measurements at 1 Hz, with a recorded resolution of 0.0455 ppm. The onboard measurement and control unit (MCU) for the gas sensor is integrated with the UAV autopilot in order to provide time-stamped and geotagged concentration measurements, and to provide real-time flight adjustments when concentration exceeds a pre-determined threshold. The data are retrieved from the MCU when the mission is complete. In order to demonstrate the proof of concept, we present results from a case study and outline opportunities for translating the measurements into decision making.
Coherent inductive communications link for biomedical applications
NASA Technical Reports Server (NTRS)
Hogrefe, Arthur F. (Inventor); Radford, Wade E. (Inventor)
1985-01-01
A two-way coherent inductive communications link between an external transceiver and an internal transceiver located in a biologically implanted programmable medical device. Digitally formatted command data and programming data are transmitted to the implanted medical device by frequency-shift keying the inductive communications link. The internal transceiver is powered by the inductive field between the internal and external transceivers. Digitally formatted data are transmitted to the external transceiver by the internal transceiver amplitude-modulating the inductive field. Immediate verification of the establishment of a reliable communications link is provided by determining the existence of frequency lock and bit-phase lock between the internal and external transceivers.
Investigation, Development, and Evaluation of Performance Proving for Fault-tolerant Computers
NASA Technical Reports Server (NTRS)
Levitt, K. N.; Schwartz, R.; Hare, D.; Moore, J. S.; Melliar-Smith, P. M.; Shostak, R. E.; Boyer, R. S.; Green, M. W.; Elliott, W. D.
1983-01-01
A number of methodologies for verifying systems and computer based tools that assist users in verifying their systems were developed. These tools were applied to verify in part the SIFT ultrareliable aircraft computer. Topics covered included: STP theorem prover; design verification of SIFT; high level language code verification; assembly language level verification; numerical algorithm verification; verification of flight control programs; and verification of hardware logic.
High-speed laser communications in UAV scenarios
NASA Astrophysics Data System (ADS)
Griethe, Wolfgang; Gregory, Mark; Heine, Frank; Kämpfner, Hartmut
2011-05-01
Optical links, based on coherent homodyne detection and BPSK modulation with bidirectional data transmission of 5.6 Gbps over distances of about 5,000 km and a BER of 10^-8, have been sufficiently verified in space. The verification results show that this technology is suitable not only for space applications but also for applications in the troposphere. After a brief description of the Laser Communication Terminal (LCT) for space applications, the paper then discusses the future utilization of satellite-based optical data links for Beyond Line of Sight (BLOS) operations of High Altitude Long Endurance (HALE) Unmanned Aerial Vehicles (UAV). It is shown that the use of optical frequencies is the only logical consequence of an ever-increasing demand for bandwidth. In terms of Network Centric Warfare it is highly recommended that Unmanned Aircraft Systems (UAS) of the future should incorporate this technology, which allows almost unlimited bandwidth. The advantages of optical communications especially for Intelligence, Surveillance and Reconnaissance (ISR) are underlined. Moreover, the preliminary design concept of an airborne laser communication terminal is described. Since optical bi-directional links have been tested between an LCT in space and a TESAT Optical Ground Station (OGS), preliminary analysis on tracking and BER performance and the impact of atmospheric disturbances on coherent links will be presented.
Framework for rapid assessment and adoption of new vector control tools.
Vontas, John; Moore, Sarah; Kleinschmidt, Immo; Ranson, Hilary; Lindsay, Steve; Lengeler, Christian; Hamon, Nicholas; McLean, Tom; Hemingway, Janet
2014-04-01
Evidence-informed health policy making is reliant on systematic access to, and appraisal of, the best available research evidence. This review suggests a strategy to improve the speed at which evidence is gathered on new vector control tools (VCTs) using a framework based on measurements of the vectorial capacity of an insect population to transmit disease. We explore links between indicators of VCT efficacy measurable in small-scale experiments that are relevant to entomological and epidemiological parameters measurable only in large-scale proof-of-concept randomised control trials (RCTs). We hypothesise that once RCTs establish links between entomological and epidemiological indicators then rapid evaluation of new products within the same product category may be conducted through smaller scale experiments without repetition of lengthy and expensive RCTs. Copyright © 2014 Elsevier Ltd. All rights reserved.
A formal approach to the analysis of clinical computer-interpretable guideline modeling languages.
Grando, M Adela; Glasspool, David; Fox, John
2012-01-01
To develop proof strategies to formally study the expressiveness of workflow-based languages, and to investigate their applicability to clinical computer-interpretable guideline (CIG) modeling languages. We propose two strategies for studying the expressiveness of workflow-based languages based on a standard set of workflow patterns expressed as Petri nets (PNs) and notions of congruence and bisimilarity from process calculus. Proof that a PN-based pattern P can be expressed in a language L can be carried out semi-automatically. Proof that a language L cannot provide the behavior specified by a PN-based pattern P requires proof by exhaustion based on analysis of cases and cannot be performed automatically. The proof strategies are generic but we exemplify their use with a particular CIG modeling language, PROforma. To illustrate the method we evaluate the expressiveness of PROforma against three standard workflow patterns and compare our results with a previous similar but informal comparison. We show that the two proof strategies are effective in evaluating a CIG modeling language against standard workflow patterns. We find that using the proposed formal techniques we obtain different results from a comparable, previously published but less formal study. We discuss the utility of these analyses as the basis for principled extensions to CIG modeling languages. Additionally we explain how the same proof strategies can be reused to prove the satisfaction of patterns expressed in the declarative language CIGDec. The proof strategies we propose are useful tools for analysing the expressiveness of CIG modeling languages. This study provides good evidence of the benefits of applying formal methods of proof over semi-formal ones. Copyright © 2011 Elsevier B.V. All rights reserved.
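As a concrete illustration of the kind of PN-based workflow pattern that such proof strategies reason about, the following minimal Python sketch (illustrative only, not the paper's formalization and far simpler than PROforma's semantics) encodes the basic sequence pattern as a Petri net with token firing.

```python
# Minimal Petri-net sketch (illustrative only, not the paper's formalization):
# the basic "sequence" workflow pattern as places/transitions with token firing.
class PetriNet:
    def __init__(self, places, transitions):
        self.marking = dict(places)            # place -> token count
        self.transitions = transitions         # name -> (input places, output places)

    def enabled(self, t):
        ins, _ = self.transitions[t]
        return all(self.marking[p] > 0 for p in ins)

    def fire(self, t):
        assert self.enabled(t), f"transition {t} not enabled"
        ins, outs = self.transitions[t]
        for p in ins:
            self.marking[p] -= 1
        for p in outs:
            self.marking[p] += 1

# Sequence pattern: task A must complete before task B can start.
net = PetriNet(places={"start": 1, "mid": 0, "end": 0},
               transitions={"A": (["start"], ["mid"]),
                            "B": (["mid"], ["end"])})
net.fire("A")
net.fire("B")
print(net.marking)   # {'start': 0, 'mid': 0, 'end': 1}
```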
Applications of square-related theorems
NASA Astrophysics Data System (ADS)
Srinivasan, V. K.
2014-04-01
The square centre of a given square is the point of intersection of its two diagonals. When two squares of different side lengths share the same square centre, there are in general four diagonals that go through the same square centre. The Two Squares Theorem developed in this paper summarizes some nice theoretical conclusions that can be obtained when two squares of different side lengths share the same square centre. These results provide the theoretical basis for two of the constructions given in the book of H.S. Hall and F.H. Stevens, 'A Shorter School Geometry, Part 1, Metric Edition'. On page 134 of this book, the authors present, in exercise 4, a practical construction which leads to a verification of the Pythagorean theorem. Subsequently, in Theorems 29 and 30, the authors present the standard proofs of the Pythagorean theorem and its converse. On page 140, the authors present, in exercise 15, what amounts to a geometric construction, whose verification involves a simple algebraic identity. Both constructions are of great importance and can be replicated by using the standard equipment provided in a 'geometry toolbox' carried by students in high schools. The author hopes that the results proved in this paper, in conjunction with the two constructions from the above-mentioned book, would provide high school students with an appreciation of the celebrated theorem of Pythagoras. The diagrams that accompany this document are based on the free software GeoGebra. The author formally acknowledges his indebtedness to the creators of this free software at the end of this document.
The capability of lithography simulation based on MVM-SEM® system
NASA Astrophysics Data System (ADS)
Yoshikawa, Shingo; Fujii, Nobuaki; Kanno, Koichi; Imai, Hidemichi; Hayano, Katsuya; Miyashita, Hiroyuki; Shida, Soichi; Murakawa, Tsutomu; Kuribara, Masayuki; Matsumoto, Jun; Nakamura, Takayuki; Matsushita, Shohei; Hara, Daisuke; Pang, Linyong
2015-10-01
Lithography at the 1X nm technology node uses SMO-ILT, NTD, or even more complex patterns. Consequently, in mask defect inspection, defect verification becomes more difficult because many nuisance defects are detected in aggressive mask features. One key technology in mask manufacture is defect verification using an aerial image simulator or other printability simulation. AIMS™ technology shows excellent correlation with the wafer and is the standard tool for defect verification; however, it is difficult to apply when hundreds of defects or more must be verified. We previously reported the capability of defect verification based on lithography simulation with a SEM system whose architecture and software show excellent correlation for simple line-and-space patterns [1]. In this paper, we use a next-generation SEM system combined with a lithography simulation tool for SMO-ILT, NTD, and other complex-pattern lithography. Furthermore, we will use three-dimensional (3D) lithography simulation based on the Multi Vision Metrology SEM system. Finally, we will confirm the performance of the 2D and 3D lithography simulation based on the SEM system for photomask verification.
NASA Technical Reports Server (NTRS)
Payne, M. H.
1973-01-01
The bounds for the normalized associated Legendre functions P_nm were studied to provide a rational basis for the truncation of the geopotential series in spherical harmonics in various orbital analyses. The conjecture is made that the largest maximum of the normalized associated Legendre function lies in a particular interval expressed in terms of the greatest integer function. A procedure is developed for verifying this conjecture. An on-line algebraic manipulator, IAM, is used to implement the procedure, and the verification is carried out for all n equal to or less than 2m, for m = 1 through 6. A rigorous proof of the conjecture is not available.
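As a rough illustration of how such a conjecture could be probed numerically today (this is not the IAM-based procedure of the report, and the normalization shown is only one common convention), the following Python sketch locates the largest maximum of a normalized associated Legendre function on a grid.

```python
# Illustrative numerical check (not the report's IAM-based procedure): locate the
# largest maximum of a normalized associated Legendre function P_nm on a grid.
import numpy as np
from math import factorial, sqrt
from scipy.special import lpmv

def normalized_pnm(n, m, x):
    # One common normalization: the integral of P_nm(x)^2 over [-1, 1] equals 1.
    norm = sqrt((2 * n + 1) / 2 * factorial(n - m) / factorial(n + m))
    return norm * lpmv(m, n, x)

def largest_max_location(n, m, npts=20001):
    x = np.linspace(-1.0, 1.0, npts)
    y = np.abs(normalized_pnm(n, m, x))
    return x[np.argmax(y)], y.max()

for m in range(1, 7):
    for n in range(m, 2 * m + 1):
        x_star, val = largest_max_location(n, m)
        print(f"n={n:2d} m={m} peak |P_nm| = {val:.4f} at x = {x_star:+.4f}")
```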
Travel Vaccines Enter the Digital Age: Creating a Virtual Immunization Record
Wilson, Kumanan; Atkinson, Katherine M.; Bell, Cameron P.
2016-01-01
At present, proof of immunization against diseases such as yellow fever is required at some international borders in concordance with the International Health Regulations. The current standard, the International Certificate of Vaccination or Prophylaxis (ICVP), has limitations as a paper record including the possibility of being illegible, misplaced, or damaged. We believe that a complementary, digital record would offer advantages to public health and travelers alike. These include enhanced availability and reliability, potential to include lot specific information, and integration with immunization information systems. Challenges exist in implementation, particularly pertaining to verification at border crossings. We describe a potential course for the development and implementation of a digital ICVP record. PMID:26711516
NASA airframe structural integrity program
NASA Technical Reports Server (NTRS)
Harris, Charles E.
1991-01-01
NASA has initiated a research program with the long-term objective of supporting the aerospace industry in addressing issues related to the aging commercial transport fleet. The interdisciplinary program combines advanced fatigue crack growth prediction methodology with innovative nondestructive examination technology with the focus on multi-site damage (MSD) at riveted connections. A fracture mechanics evaluation of the concept of pressure proof testing the fuselage to screen for MSD has been completed. Also, a successful laboratory demonstration of the ability of the thermal flux method to detect disbonds at riveted lap splice joints has been conducted. All long-term program elements have been initiated and the plans for the methodology verification program are being coordinated with the airframe manufacturers.
NASA airframe structural integrity program
NASA Technical Reports Server (NTRS)
Harris, Charles E.
1990-01-01
NASA initiated a research program with the long-term objective of supporting the aerospace industry in addressing issues related to the aging of the commercial transport fleet. The program combines advanced fatigue crack growth prediction methodology with innovative nondestructive examination technology, with the focus on multi-site damage (MSD) at riveted connections. A fracture mechanics evaluation of the concept of pressure proof testing the fuselage to screen for MSD was completed. A successful laboratory demonstration of the ability of the thermal flux method to detect disbonds at riveted lap splice joints was conducted. All long-term program elements were initiated, and the plans for the methodology verification program are being coordinated with the airframe manufacturers.
NASA Technical Reports Server (NTRS)
Rinehart, Maegan L.
2011-01-01
The purpose of this activity is to provide the Mechanical Components Test Facility (MCTF) with the capability to obtain electronic leak test and proof pressure data. Payload and Components Real-time Automated Test System (PACRATS) data acquisition software will be utilized to display real-time data. It will record leak rates and pressure/vacuum levels simultaneously. This added functionality will provide electronic leak test and pressure data at specified sampling frequencies. Electronically stored data will provide ES61 with increased data security, analysis, and accuracy. The tasks performed in this procedure verify PACRATS only and are not intended to provide verification for MCTF equipment.
NASA Astrophysics Data System (ADS)
Kim, Cheol-kyun; Kim, Jungchan; Choi, Jaeseung; Yang, Hyunjo; Yim, Donggyu; Kim, Jinwoong
2007-03-01
As the minimum transistor length shrinks, the variation and uniformity of transistor length seriously affect device performance, so the importance of optical proximity correction (OPC) and resolution enhancement technology (RET) cannot be overemphasized. However, the OPC process is regarded by some as a necessary evil for device performance. In fact, every group, including process and design, is interested in the whole-chip CD variation trend and CD uniformity, which represent the real wafer. Recently, design-based metrology systems have become capable of detecting differences between the design database and the wafer SEM image, and they are able to extract information on whole-chip CD variation. Based on these results, OPC abnormalities were identified and design feedback items were also disclosed. Other approaches are pursued by EDA companies, such as model-based OPC verification. Model-based verification is performed over the full chip area using a well-calibrated model; its objective is the prediction of potential weak points on the wafer and fast feedback to OPC and design before reticle fabrication. In order to achieve robust design and sufficient device margin, an appropriate combination of a design-based metrology system and model-based verification tools is very important. Therefore, we evaluated a design-based metrology system and matched it with a model-based verification system to find the optimum combination between the two. In our study, a large amount of data from wafer results was classified and analyzed by statistical methods and sorted into OPC feedback and design feedback items. Additionally, a novel DFM flow is proposed that uses the combination of design-based metrology and model-based verification tools.
A methodology for the rigorous verification of plasma simulation codes
NASA Astrophysics Data System (ADS)
Riva, Fabio
2016-10-01
The methodology used to assess the reliability of numerical simulation codes constitutes the Verification and Validation (V&V) procedure. V&V is composed of two separate tasks: the verification, which is a mathematical issue aimed at assessing that the physical model is correctly solved, and the validation, which determines the consistency of the code results, and therefore of the physical model, with experimental data. In the present talk we focus our attention on the verification, which in turn is composed of the code verification, targeted to assess that a physical model is correctly implemented in a simulation code, and the solution verification, which quantifies the numerical error affecting a simulation. Bridging the gap between plasma physics and other scientific domains, we introduced for the first time in our domain a rigorous methodology for code verification, based on the method of manufactured solutions, as well as a solution verification based on the Richardson extrapolation. This methodology was applied to GBS, a three-dimensional fluid code based on a finite difference scheme, used to investigate plasma turbulence in basic plasma physics experiments and in the tokamak scrape-off layer. Overcoming the difficulty of dealing with a numerical method intrinsically affected by statistical noise, we have now generalized the rigorous verification methodology to simulation codes based on the particle-in-cell algorithm, which are employed to solve the Vlasov equation in the investigation of a number of plasma physics phenomena.
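To make the solution-verification step concrete, the following generic Python sketch (not the GBS tooling; the numbers are illustrative) estimates the observed order of convergence and the Richardson-extrapolated value from results on three uniformly refined grids.

```python
# Hedged sketch of solution verification via Richardson extrapolation (generic,
# not GBS-specific): estimate the observed order of convergence and the
# extrapolated grid-converged value from solutions on three refined grids.
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order p from three grid levels with constant refinement ratio r."""
    return math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine)) / math.log(r)

def richardson_extrapolate(f_medium, f_fine, r, p):
    """Richardson-extrapolated estimate of the grid-converged value."""
    return f_fine + (f_fine - f_medium) / (r**p - 1)

# Example: a quantity computed on grids with spacing h, h/2, h/4 (made-up values).
f_h, f_h2, f_h4 = 1.1000, 1.0260, 1.0065
p = observed_order(f_h, f_h2, f_h4, r=2.0)
f_exact = richardson_extrapolate(f_h2, f_h4, r=2.0, p=p)
print(f"observed order p = {p:.2f}, extrapolated value = {f_exact:.4f}")
```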
Stork Color Proofing Technology
NASA Astrophysics Data System (ADS)
Ekman, C. Frederick
1989-04-01
For the past few years, Stork Colorproofing B.V. has been marketing an analog color proofing system in Europe based on electrophotographic technology it pioneered for the purpose of high resolution, high fidelity color imaging in the field of the Graphic Arts. Based in part on this technology, it will make available on a commercial basis a digital color proofing system in 1989. Proofs from both machines will provide an exact reference for the user and will look, feel, and behave in a reproduction sense like the printed press sheet.
Runtime verification of embedded real-time systems.
Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg
We present a runtime verification framework that allows on-line monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete time setting. We design observer algorithms for the time-bounded modalities of ptMTL, which take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability and thus facilitate applications of the framework in both a prototyping and a post-deployment phase of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic in both the point in time at which the operator is executed and the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system-under-test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
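As a simple software analogue of the observer idea (the paper's algorithms are hardware blocks and considerably more refined), the following Python sketch monitors the bounded past-time "Once" modality over a sliding window.

```python
# Software analogue (not the paper's hardware observer) of a past-time MTL
# monitor for the bounded "Once" operator: at each step, report whether the
# proposition held at least once within the last N time steps.
from collections import deque

class BoundedOnceMonitor:
    def __init__(self, bound_n: int):
        self.window = deque(maxlen=bound_n)   # sliding window of recent truth values

    def step(self, phi_holds: bool) -> bool:
        """Consume one sample of the monitored proposition; return the verdict
        of 'Once within the last N steps' at the current step."""
        self.window.append(phi_holds)
        return any(self.window)

mon = BoundedOnceMonitor(bound_n=3)
trace = [False, True, False, False, False, True]
print([mon.step(v) for v in trace])
# [False, True, True, True, False, True]
```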
Supersonic Gas-Liquid Cleaning System
NASA Technical Reports Server (NTRS)
Kinney, Frank
1996-01-01
The Supersonic Gas-Liquid Cleaning System Research Project consisted mainly of a feasibility study, including theoretical and engineering analysis, of a proof-of-concept prototype of this particular cleaning system developed by NASA-KSC. The cleaning system utilizes gas-liquid supersonic nozzles to generate high impingement velocities at the surface of the device to be cleaned. The cleaning fluid being accelerated to these high velocities may consist of any solvent or liquid, including water. Compressed air or any inert gas is used to provide the conveying medium for the liquid, as well as substantially reduce the total amount of liquid needed to perform adequate surface cleaning and cleanliness verification. This type of aqueous cleaning system is considered to be an excellent way of conducting cleaning and cleanliness verification operations as replacements for the use of CFC 113 which must be discontinued by 1995. To utilize this particular cleaning system in various cleaning applications for both the Space Program and the commercial market, it is essential that the cleaning system, especially the supersonic nozzle, be characterized for such applications. This characterization consisted of performing theoretical and engineering analysis, identifying desirable modifications/extensions to the basic concept, evaluating effects of variations in operating parameters, and optimizing hardware design for specific applications.
Using Sudoku to Introduce Proof Techniques
ERIC Educational Resources Information Center
Snyder, Brian A.
2010-01-01
In this article we show how the Sudoku puzzle and the three simple rules determining its solution can be used as an introduction to proof-based mathematics. In the completion of the puzzle, students can construct multi-step solutions that involve sequencing of steps, use methods such as backtracking and proof by cases, and proof by contradiction…
Minagawa, Hiroko; Yasui, Yoshihiro; Adachi, Hirokazu; Ito, Miyabi; Hirose, Emi; Nakamura, Noriko; Hata, Mami; Kobayashi, Shinichi; Yamashita, Teruo
2015-11-09
Japan was verified as having achieved measles elimination by the Measles Regional Verification Commission in the Western Pacific Region in March 2015. Verification of measles elimination implies the absence of continuous endemic transmission. After the last epidemic in 2007, with an estimated 18,000 cases, Japan introduced nationwide case-based measles surveillance in January 2008. Laboratory diagnosis for all suspected measles cases is essentially required by law, and virus detection tests are mostly performed by municipal public health institutes. Despite relatively high vaccination coverage and vigorous response to every case by the local health center staff, outbreaks of measles are repeatedly observed in Aichi Prefecture, Japan. Measles virus N and H gene detection by nested double RT-PCR was performed on all specimens collected from suspected cases and transferred to our institute. Genotyping and further molecular epidemiological analyses were performed with the direct nucleotide sequence data of appropriate PCR products. Between 2010 and 2014, specimens from 389 patients suspected of measles were tested in our institute. Genotypes D9, D8, H1 and B3 were detected. Further molecular epidemiological analyses were helpful in establishing links between patients, and sometimes useful in discriminating one outbreak from another. All virus-positive cases, including 49 cases involved in three outbreaks without any obvious epidemiological link with importation, were considered import-related based on the nucleotide sequence information. The chain of transmission in the latest outbreak in 2014 terminated after the third generation, much earlier than in the 2010-11 outbreak (six generations). Since 2010, almost all measles cases reported in Aichi Prefecture have been either imported or import-related, based primarily on the genotypes and nucleotide sequences of measles virus detected. In addition, genotyping and molecular epidemiological analyses are indispensable to prove the interruption of endemic transmission when importations of measles are repeatedly observed. Copyright © 2015 Elsevier Ltd. All rights reserved.
CD volume design and verification
NASA Technical Reports Server (NTRS)
Li, Y. P.; Hughes, J. S.
1993-01-01
In this paper, we describe a prototype for CD-ROM volume design and verification. This prototype allows users to create their own model of CD volumes by modifying a prototypical model. Rule-based verification of the test volumes can then be performed against the volume definition. This working prototype has proven the concept of model-driven, rule-based design and verification for large quantities of data. The model defined for the CD-ROM volumes becomes a data model as well as an executable specification.
Proof test methodology for composites
NASA Technical Reports Server (NTRS)
Wu, Edward M.; Bell, David K.
1992-01-01
The special requirements for proof test of composites are identified based on the underlying failure process of composites. Two proof test methods are developed to eliminate the inevitable weak fiber sites without also causing flaw clustering which weakens the post-proof-test composite. Significant reliability enhancement by these proof test methods has been experimentally demonstrated for composite strength and composite life in tension. This basic proof test methodology is relevant to the certification and acceptance of critical composite structures. It can also be applied to the manufacturing process development to achieve zero-reject for very large composite structures.
Software Validation via Model Animation
NASA Technical Reports Server (NTRS)
Dutle, Aaron M.; Munoz, Cesar A.; Narkawicz, Anthony J.; Butler, Ricky W.
2015-01-01
This paper explores a new approach to validating software implementations that have been produced from formally-verified algorithms. Although visual inspection gives some confidence that the implementations faithfully reflect the formal models, it does not provide complete assurance that the software is correct. The proposed approach, which is based on animation of formal specifications, compares the outputs computed by the software implementations on a given suite of input values to the outputs computed by the formal models on the same inputs, and determines if they are equal up to a given tolerance. The approach is illustrated on a prototype air traffic management system that computes simple kinematic trajectories for aircraft. Proofs for the mathematical models of the system's algorithms are carried out in the Prototype Verification System (PVS). The animation tool PVSio is used to evaluate the formal models on a set of randomly generated test cases. Output values computed by PVSio are compared against output values computed by the actual software. This comparison improves the assurance that the translation from formal models to code is faithful and that, for example, floating point errors do not greatly affect correctness and safety properties.
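A minimal sketch of the comparison harness, with hypothetical stand-ins for the formal model and the fielded code rather than the actual PVS/PVSio setup, might look like this:

```python
# Minimal sketch of the comparison idea (hypothetical names, not the PVS/PVSio
# tooling): run the software implementation and the animated formal model on the
# same random inputs and check that the outputs agree within a tolerance.
import random

def model_trajectory(speed, heading):         # stand-in for the formal model output
    return speed * 0.514444, heading % 360.0

def software_trajectory(speed, heading):      # stand-in for the fielded implementation
    return speed * 0.5144, heading % 360.0    # slight deliberate discrepancy

def validate(n_cases=1000, tol=1e-2):
    failures = 0
    for _ in range(n_cases):
        speed, heading = random.uniform(0, 600), random.uniform(-720, 720)
        m, s = model_trajectory(speed, heading), software_trajectory(speed, heading)
        if any(abs(a - b) > tol for a, b in zip(m, s)):
            failures += 1
    return failures

print(f"{validate()} of 1000 random cases exceeded the tolerance")
```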
Privacy preservation and authentication on secure geographical routing in VANET
NASA Astrophysics Data System (ADS)
Punitha, A.; Manickam, J. Martin Leo
2017-05-01
Vehicular Ad hoc Networks (VANETs) play an important role in vehicle-to-vehicle communication as they offer a high level of safety and convenience to drivers. In order to increase the level of security and safety in VANETs, in this paper we propose a Privacy Preservation and Authentication on Secure Geographical Routing Protocol (PPASGR) for VANET. It provides security by detecting and preventing malicious nodes through two directional antennas, a forward (f-antenna) and a backward (b-antenna). Malicious nodes are detected by direction detection, consistency detection and conflict detection. The location of a trusted neighbour is identified using a TNT-based location verification scheme. After the implementation of the Vehicle Tamper Proof Device (VTPD), a Trusted Authority (TA) is generated that produces the anonymous credentials. Finally, the VTPD generates a pseudo-identity using the TA, from which the real identity of the sender can be retrieved. Through this approach, authentication, integrity and confidentiality of routing packets can be achieved. The simulation results show that the proposed approach reduces the packet drop due to attacks and improves the packet delivery ratio.
A Byzantine-Fault Tolerant Self-Stabilizing Protocol for Distributed Clock Synchronization Systems
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar R.
2006-01-01
Embedded distributed systems have become an integral part of safety-critical computing applications, necessitating system designs that incorporate fault-tolerant clock synchronization in order to achieve ultra-reliable assurance levels. Many efficient clock synchronization protocols do not, however, address Byzantine failures, and most protocols that do tolerate Byzantine failures do not self-stabilize. The Byzantine self-stabilizing clock synchronization algorithms that exist in the literature are based either on unjustifiably strong assumptions about initial synchrony of the nodes or on the existence of a common pulse at the nodes. The Byzantine self-stabilizing clock synchronization protocol presented here does not rely on any assumptions about the initial state of the clocks. Furthermore, there is neither a central clock nor an externally generated pulse system. The proposed protocol converges deterministically, is scalable, and self-stabilizes in a short amount of time. The convergence time is linear with respect to the self-stabilization period. Proofs of the correctness of the protocol as well as the results of formal verification efforts are reported.
Potential Collaborative Research topics with Korea’s Agency for Defense Development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Farrar, Charles R.; Todd, Michael D.
2012-08-23
This presentation provides a high-level summary of current research activities at the Los Alamos National Laboratory (LANL)-University of California Jacobs School of Engineering (UCSD) Engineering Institute that will be presented at Korea's Agency for Defense Development (ADD). These research activities are at the basic engineering science level, with levels of maturity ranging from initial concepts to field proof-of-concept demonstrations. We believe that all of these activities are appropriate for collaborative research with ADD, subject to approval by each institution. All the activities summarized herein share the common theme that they are multi-disciplinary in nature and typically involve the integration of high-fidelity predictive modeling, advanced sensing technologies, and new developments in information technology. These activities include: Wireless Sensor Systems, Swarming Robot sensor systems, Advanced signal processing (compressed sensing) and pattern recognition, Model Verification and Validation, Optimal/robust sensor system design, Haptic systems for large-scale data processing, Cyber-physical security for robots, Multi-source energy harvesting, Reliability-based approaches to damage prognosis, SHMTools software development, and Cyber-physical systems advanced study institute.
Improving the fiber coupling efficiency for DARWIN by loss-less shaping of the receive beams
NASA Astrophysics Data System (ADS)
Voland, Ch.; Weigel, Th.; Dreischer, Th.; Wallner, O.; Ergenzinger, K.; Ries, H.; Jetter, R.; Vosteen, A.
2017-11-01
For the DARWIN mission the extremely low planet signal levels require an optical instrument design with utmost efficiency to guarantee the required science performance. By shaping the transverse amplitude and phase distributions of the receive beams, the singlemode fibre coupling efficiency can be increased to almost 100%, thus allowing for a gain of more than 20% compared to conventional designs. We show that the use of "tailored freeform surfaces" for purpose of beam shaping dramatically reduces the coupling degradations, which otherwise result from mode mismatch between the Airy pattern of the image and the fibre mode, and therefore allows for achieving a performance close to the physical limitations. We present an application of tailored surfaces for building a beam shaping optics that shall enhance fibre coupling performance as core part of a space based interferometer in the future DARWIN mission and present performance predictions by wave-optical simulations. We assess the feasibility of manufacturing the corresponding tailored surfaces and describe the proof of concept demonstrator we use for experimental performance verification.
Dynamic analysis and control of lightweight manipulators with flexible parallel link mechanisms
NASA Technical Reports Server (NTRS)
Lee, Jeh Won
1991-01-01
The flexible parallel link mechanism is designed for increased rigidity to resist buckling when it carries a heavy payload. Compared to a one-link flexible manipulator, a two-link flexible manipulator, especially one with a flexible parallel mechanism, has more complicated characteristics in dynamics and control. The objective of this research is the theoretical analysis and the experimental verification of the dynamics and control of a two-link flexible manipulator with a flexible parallel link mechanism. Nonlinear equations of motion of the lightweight manipulator are derived by the Lagrangian method in symbolic form to better understand the structure of the dynamic model. A manipulator with a flexible parallel link mechanism is a constrained dynamic system whose equations are sensitive to numerical integration error. This constrained system is solved using singular value decomposition of the constraint Jacobian matrix. The discrepancies between the analytical model and the experiment are explained using a simplified and a detailed finite element model. The step response of the analytical model and the TREETOPS model match each other well. The nonlinear dynamics is studied using a sinusoidal excitation. The effect of actuator dynamics on a flexible robot was investigated; these effects are explained theoretically and experimentally using root loci and Bode plots. As a performance baseline for the advanced control scheme, a simple decoupled feedback scheme is applied.
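The following short numpy sketch (generic, not the thesis code) illustrates the SVD-based handling of a constraint Jacobian: its null-space basis parameterizes velocities that satisfy the constraints exactly.

```python
# Hedged sketch of the numerical idea (generic, not the thesis code): use the
# SVD of the constraint Jacobian J to obtain a null-space basis, so generalized
# velocities can be parameterized as qdot = N @ z with J @ qdot = 0.
import numpy as np

def nullspace_basis(J, tol=1e-10):
    U, s, Vt = np.linalg.svd(J)
    rank = int(np.sum(s > tol))
    return Vt[rank:].T            # columns span the null space of J

# Toy constraint Jacobian for a 4-coordinate system with 2 independent constraints.
J = np.array([[1.0, 0.0, -1.0, 0.0],
              [0.0, 1.0, 0.0, -1.0]])
N = nullspace_basis(J)
z = np.array([0.3, -0.1])          # independent (minimal) velocity coordinates
qdot = N @ z
print(np.allclose(J @ qdot, 0.0))  # True: the constraint is satisfied exactly
```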
Research-Based Interventions in the Area of Proof: The Past, the Present, and the Future
ERIC Educational Resources Information Center
Stylianides, Gabriel J.; Stylianides, Andreas J.
2017-01-01
The concept of "proof" has attracted considerable research attention over the past decades in part due to its indisputable importance to the discipline of mathematics and to students' learning of mathematics. Yet, the teaching and learning of proof is an instructionally arduous territory, with proof being recognized as a hard-to-teach…
Using Toulmin Analysis to Analyse an Instructor's Proof Presentation in Abstract Algebra
ERIC Educational Resources Information Center
Fukawa-Connelly, Timothy
2014-01-01
This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of…
Bridging the Gap between Graphical Arguments and Verbal-Symbolic Proofs in a Real Analysis Context
ERIC Educational Resources Information Center
Zazkis, Dov; Weber, Keith; Mejía-Ramos, Juan Pablo
2016-01-01
We examine a commonly suggested proof construction strategy from the mathematics education literature--that students first produce a graphical argument and then work to construct a verbal-symbolic proof based on that graphical argument. The work of students who produce such graphical arguments when solving proof construction tasks was analyzed to…
Itri, Francesco; Monti, Daria Maria; Chino, Marco; Vinciguerra, Roberto; Altucci, Carlo; Lombardi, Angela; Piccoli, Renata; Birolo, Leila; Arciello, Angela
2017-10-07
The identification of protein-protein interaction networks in living cells is becoming increasingly fundamental to elucidate main biological processes and to understand disease molecular bases on a system-wide level. We recently described a method (LUCK, Laser UV Cross-linKing) to cross-link interacting protein surfaces in living cells by UV laser irradiation. By using this innovative methodology, that does not require any protein modification or cell engineering, here we demonstrate that, upon UV laser irradiation of HeLa cells, a direct interaction between GAPDH and alpha-enolase was "frozen" by a cross-linking event. We validated the occurrence of this direct interaction by co-immunoprecipitation and Immuno-FRET analyses. This represents a proof of principle of the LUCK capability to reveal direct protein interactions in their physiological environment. Copyright © 2017 Elsevier Inc. All rights reserved.
A New Integrated Threshold Selection Methodology for Spatial Forecast Verification of Extreme Events
NASA Astrophysics Data System (ADS)
Kholodovsky, V.
2017-12-01
Extreme weather and climate events such as heavy precipitation, heat waves and strong winds can cause extensive damage to society in terms of human lives and financial losses. As the climate changes, it is important to understand how extreme weather events may change as a result. Climate and statistical models are often independently used to model those phenomena. To better assess the performance of climate models, a variety of spatial forecast verification methods have been developed. However, spatial verification metrics that are widely used in comparing mean states, in most cases, do not have an adequate theoretical justification for benchmarking extreme weather events. We propose a new integrated threshold selection methodology for spatial forecast verification of extreme events that couples existing pattern recognition indices with high threshold choices. This integrated approach has three main steps: 1) dimension reduction; 2) geometric domain mapping; and 3) threshold clustering. We apply this approach to an observed precipitation dataset over CONUS. The results are evaluated by displaying the threshold distribution seasonally, monthly and annually. The method offers the user the flexibility of selecting a high threshold that is linked to desired geometrical properties. The proposed high-threshold methodology could either complement existing spatial verification methods, where threshold selection is arbitrary, or be directly applicable in extreme value theory.
NASA Astrophysics Data System (ADS)
Prasad, Narasimha S.; Kratovil, Patrick T.; Tucker, Sara C.; Vallestero, Neil J.; Khusid, Mark
2004-01-01
A free-space, line-of-sight, ground-based optical link at 1.5 microns is attractive for tactical communications because it would provide eye-safety, covertness and jam-proof operation. However, the effects of atmospheric turbulence have to be appropriately mitigated to achieve an acceptable bit-error rate (BER) for reliable dissemination of information. Models to predict the achievable BER at 1.5 microns for several beam propagation schemes, including beam scanning, have been developed for various turbulence conditions. In this paper, we report performance characterization of free-space, high-data-rate (>1 Gb/s) beam propagation parameters at 1.5 microns for achieving BER reduction in the presence of turbulence. For standard free-space optical links, the mean SNR limits the achievable BER to less than 10^-6 for Cn2 (the structure constant of refractive index fluctuations) around 10^-12 m^-2/3. To validate these models, simultaneous measurements of the structure constant of refractive index fluctuations, Cn2, and the coherence diameter over tactical ranges have been carried out and analyzed. The effect of input beam conditioning to reduce BER levels has been explored. Furthermore, single and multiple transmit beams in conjunction with single and multiple detector arrangements have been examined. Based on these measurements, it is shown that the advantages of input beam conditioning coupled with modified receiver geometric characteristics would provide a path for BER reduction and hence appreciable enhancements in data link reliability.
NASA Technical Reports Server (NTRS)
Powell, John D.
2003-01-01
This document discusses the verification of the Secure Socket Layer (SSL) communication protocol as a demonstration of the Model Based Verification (MBV) portion of the verification instrument set being developed under the Reducing Software Security Risk (RSSR) Through an Integrated Approach research initiative. Code Q of the National Aeronautics and Space Administration (NASA) funds this project. The NASA Goddard Independent Verification and Validation (IV&V) facility manages this research program at the NASA agency level, and the Assurance Technology Program Office (ATPO) manages the research locally at the Jet Propulsion Laboratory (California Institute of Technology), where the research is being carried out.
Integrating Fingerprint Verification into the Smart Card-Based Healthcare Information System
NASA Astrophysics Data System (ADS)
Moon, Daesung; Chung, Yongwha; Pan, Sung Bum; Park, Jin-Won
2009-12-01
As VLSI technology has improved, smart cards employing 32-bit processors have been released, and more personal information, such as medical and financial data, can be stored in the card. Thus, it becomes important to protect the personal information stored in the card. Verification of the card holder's identity using a fingerprint has advantages over the present practices of Personal Identification Numbers (PINs) and passwords. However, the computational workload of fingerprint verification is much heavier than that of the typical PIN-based solution. In this paper, we consider three strategies for implementing fingerprint verification in a smart card environment and how to distribute the modules of fingerprint verification between the smart card and the card reader. We first evaluate the number of instructions of each step of a typical fingerprint verification algorithm, and estimate the execution time of several cryptographic algorithms needed to guarantee the security/privacy of the fingerprint data transmitted in the smart card with the client-server environment. Based on the evaluation results, we analyze each scenario with respect to the security level and the real-time execution requirements in order to implement fingerprint verification in the smart card with the client-server environment.
Simulation environment based on the Universal Verification Methodology
NASA Astrophysics Data System (ADS)
Fiergolski, A.
2017-01-01
Universal Verification Methodology (UVM) is a standardized approach to verifying integrated circuit designs, targeting Coverage-Driven Verification (CDV). It combines automatic test generation, self-checking testbenches, and coverage metrics to indicate progress in the design verification. The flow of CDV differs from the traditional directed-testing approach. With CDV, a testbench developer starts with a structured plan by setting the verification goals. Those goals are then targeted by the developed testbench, which generates legal stimuli and sends them to the device under test (DUT). Progress is measured by coverage monitors added to the simulation environment; in this way, non-exercised functionality can be identified. Moreover, additional scoreboards indicate undesired DUT behaviour. Such verification environments were developed for three recent ASIC and FPGA projects which have successfully implemented the new work-flow: (1) the CLICpix2 65 nm CMOS hybrid pixel readout ASIC design; (2) the C3PD 180 nm HV-CMOS active sensor ASIC design; (3) the FPGA-based DAQ system of the CLICpix chip. This paper, based on the experience from the above projects, briefly introduces UVM and presents a set of tips and advice applicable at different stages of the verification process cycle.
In vivo dose verification method in catheter based high dose rate brachytherapy.
Jaselskė, Evelina; Adlienė, Diana; Rudžianskas, Viktoras; Urbonavičius, Benas Gabrielis; Inčiūra, Arturas
2017-12-01
In vivo dosimetry is a powerful tool for dose verification in radiotherapy. Its application in high dose rate (HDR) brachytherapy is usually limited to the estimation of gross errors, due to the inability of the dosimetry system/method to record non-uniform dose distributions in steep dose-gradient fields close to the radioactive source. In vivo dose verification in interstitial catheter-based HDR brachytherapy is crucial, since the treatment is performed by inserting the radioactive source at certain positions within catheters that are pre-implanted into the tumour. We propose an in vivo dose verification method for this type of brachytherapy treatment which is based on the comparison between experimentally measured and theoretical dose values calculated at well-defined locations corresponding to dosemeter positions in the catheter. Dose measurements were performed using TLD 100-H rods (6 mm long, 1 mm diameter) inserted in certain sequences into an additionally pre-implanted dosimetry catheter. The adjustment of dosemeter positioning in the catheter was performed using reconstructed CT scans of the patient with pre-implanted catheters. Doses to three Head & Neck cancer patients and one Breast cancer patient were measured during several randomly selected treatment fractions. It was found that the average experimental dose error varied from 4.02% to 12.93% during independent in vivo dosimetry control measurements for the selected Head & Neck cancer patients, and from 7.17% to 8.63% for the Breast cancer patient. The average experimental dose error was below the AAPM-recommended margin of 20% and did not exceed the measurement uncertainty of 17.87% estimated for this type of dosemeter. A tendency of slightly increasing average dose error was observed in every following treatment fraction of the same patient. It was linked to changes in the theoretically estimated dosemeter positions due to possible patient organ movement between treatment fractions, since catheter reconstruction was performed for the first treatment fraction only. These findings indicate potential for further average dose error reduction in catheter-based brachytherapy by at least 2-3% if catheter locations are adjusted before each following treatment fraction; however, this requires more detailed investigation. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
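The per-fraction comparison described above amounts to a percent deviation at each dosemeter position; a trivial Python sketch with illustrative (non-patient) numbers:

```python
# Simple sketch of the per-fraction comparison described above (illustrative
# values, not patient data): percent deviation between measured and calculated
# doses at each dosemeter position, and their average for the fraction.
measured_cGy   = [101.2, 96.5, 88.0, 74.3]     # TLD readings along the catheter
calculated_cGy = [ 98.0, 92.1, 84.6, 70.9]     # planned doses at the same positions

errors = [abs(m - c) / c * 100.0 for m, c in zip(measured_cGy, calculated_cGy)]
avg_error = sum(errors) / len(errors)

print([f"{e:.1f}%" for e in errors], f"average = {avg_error:.2f}%")
# Values above a chosen action level (e.g. the 20% margin cited above) would
# prompt investigation of source positioning or catheter reconstruction.
```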
Bioluminescence Monitoring of Neuronal Activity in Freely Moving Zebrafish Larvae
Knafo, Steven; Prendergast, Andrew; Thouvenin, Olivier; Figueiredo, Sophie Nunes; Wyart, Claire
2017-01-01
The proof of concept for bioluminescence monitoring of neural activity in zebrafish with the genetically encoded calcium indicator GFP-aequorin has been previously described (Naumann et al., 2010), but challenges remain. First, bioluminescence signals originating from a single muscle fiber can constitute a major pitfall. Second, bioluminescence signals emanating from neurons only are very small. To improve signals while verifying specificity, we provide an optimized four-step protocol achieving: 1) selective expression of a zebrafish codon-optimized GFP-aequorin, 2) efficient soaking of larvae in the GFP-aequorin substrate coelenterazine, 3) bioluminescence monitoring of neural activity from motor neurons in free-tailed moving animals performing acoustic escapes, and 4) verification of the absence of muscle expression using immunohistochemistry. PMID:29130058
Formal specification and mechanical verification of SIFT - A fault-tolerant flight control system
NASA Technical Reports Server (NTRS)
Melliar-Smith, P. M.; Schwartz, R. L.
1982-01-01
The paper describes the methodology being employed to demonstrate rigorously that the SIFT (software-implemented fault-tolerant) computer meets its requirements. The methodology uses a hierarchy of design specifications, expressed in the mathematical domain of multisorted first-order predicate calculus. The most abstract of these, from which almost all details of mechanization have been removed, represents the requirements on the system for reliability and intended functionality. Successive specifications in the hierarchy add design and implementation detail until the PASCAL programs implementing the SIFT executive are reached. A formal proof that a SIFT system in a 'safe' state operates correctly despite the presence of arbitrary faults has been completed all the way from the most abstract specifications to the PASCAL program.
Nuclear disarmament verification via resonant phenomena.
Hecla, Jake J; Danagoulian, Areg
2018-03-28
Nuclear disarmament treaties are not sufficient in and of themselves to neutralize the existential threat of nuclear weapons. Technologies are necessary for verifying the authenticity of the nuclear warheads undergoing dismantlement before counting them toward a treaty partner's obligation. Here we present a concept that leverages isotope-specific nuclear resonance phenomena to authenticate a warhead's fissile components by comparing them to a previously authenticated template. All information is encrypted in the physical domain in a manner that amounts to a physical zero-knowledge proof system. Using Monte Carlo simulations, the system is shown to reveal no isotopic or geometric information about the weapon, while readily detecting hoaxing attempts. This nuclear technique can dramatically increase the reach and trustworthiness of future nuclear disarmament treaties.
Ultrasonography in diagnosing clinically occult groin hernia: systematic review and meta-analysis.
Kwee, Robert M; Kwee, Thomas C
2018-05-14
To provide an updated systematic review on the performance of ultrasonography (US) in diagnosing clinically occult groin hernia. A systematic search was performed in MEDLINE and Embase. Methodological quality of included studies was assessed. Accuracy data of US in detecting clinically occult groin hernia were extracted. Positive predictive value (PPV) was pooled with a random effects model. For studies investigating the performance of US in hernia type classification (inguinal vs femoral), correctly classified proportion was assessed. Sixteen studies were included. In the two studies without verification bias, sensitivities were 29.4% [95% confidence interval (CI), 15.1-47.5%] and 90.9% (95% CI, 70.8-98.9%); specificities were 90.0% (95% CI, 80.5-95.9%) and 90.6% (95% CI, 83.0-95.6%). Verification bias or a variation of it (i.e. study limited to only subjects with definitive proof of disease status) was present in all other studies. Sensitivity, specificity, and negative predictive value (NPV) were not pooled. PPV ranged from 58.8 to 100%. Pooled PPV, based on data from ten studies with low risk of bias and no applicability concerns with respect to patient selection, was 85.6% (95% CI, 76.5-92.7%). Proportion of correctly classified hernias, based on data from four studies, ranged between 94.4% and 99.1%. Sensitivity, specificity and NPV of US in detecting clinically occult groin hernia cannot reliably be determined based on current evidence. Further studies are necessary. Accuracy may strongly depend on the examiner's skills. PPV is high. Inguinal and femoral hernias can reliably be differentiated by US. • Sensitivity, specificity and NPV of ultrasound in detecting clinically occult groin hernia cannot reliably be determined based on current evidence. • Accuracy may strongly depend on the examiner's skills. • PPV of US in detection of clinically occult groin hernia is high [pooled PPV of 85.6% (95% confidence interval, 76.5-92.7%)]. • US has very high performance in correctly differentiating between clinically occult inguinal and femoral hernia (correctness of 94.4-99.1%).
The SeaHorn Verification Framework
NASA Technical Reports Server (NTRS)
Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.
2015-01-01
In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
40 CFR 1065.550 - Gas analyzer range verification and drift verification.
Code of Federal Regulations, 2014 CFR
2014-07-01
... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations... concentration subcomponents (e.g., THC and CH4 for NMHC) separately. For example, for NMHC measurements, perform drift verification on NMHC; do not verify THC and CH4 separately. (2) Drift verification requires two...
Verification of Autonomous Systems for Space Applications
NASA Technical Reports Server (NTRS)
Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.
2006-01-01
Autonomous software, especially if it is based on models, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, it also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.
Manoufali, Mohamed; Bialkowski, Konstanty; Mohammed, Beadaa Jasem; Mills, Paul C; Abbosh, Amin
2018-01-01
A near-field inductive-coupling link can establish a reliable power source for a batteryless implantable medical device based on Faraday's law of induction. In this paper, the design, modeling, and experimental verification of an inductive-coupling link between an off-body loop antenna and a 0.9 three-dimensional (3-D) bowtie brain implantable antenna are presented. To ensure reliability of the design, the implantable antenna is embedded in the cerebral spinal fluid of a realistic human head model. Exposure, temperature, and propagation simulations of the near electromagnetic fields in a frequency-dispersive head model were carried out to comply with the IEEE safety standards. Concertedly, a fabrication process for the implantable antenna is proposed, which can be extended to devise and miniaturize different 3-D geometric shapes. The performance of the proposed inductive link was tested in a biological environment: in vitro measurements of the fabricated prototypes were carried out in a pig's head and a piglet, and the link gain was demonstrated in both. The in vitro measurement results showed that the proposed 3-D implantable antenna is suitable for integration with a miniaturized batteryless brain implantable medical device (BIMD).
Using SysML for verification and validation planning on the Large Synoptic Survey Telescope (LSST)
NASA Astrophysics Data System (ADS)
Selvy, Brian M.; Claver, Charles; Angeli, George
2014-08-01
This paper provides an overview of the tool, language, and methodology used for Verification and Validation Planning on the Large Synoptic Survey Telescope (LSST) Project. LSST has implemented a Model Based Systems Engineering (MBSE) approach as a means of defining all systems engineering planning and definition activities that have historically been captured in paper documents. Specifically, LSST has adopted the Systems Modeling Language (SysML) standard and is utilizing a software tool called Enterprise Architect, developed by Sparx Systems. Much of the historical use of SysML has focused on the early phases of the project life cycle. Our approach is to extend the advantages of MBSE into later stages of the construction project. This paper details the methodology employed to use the tool to document the verification planning phases, including the extension of the language to accommodate the project's needs. The process includes defining the Verification Plan for each requirement, which in turn consists of a Verification Requirement, Success Criteria, Verification Method(s), Verification Level, and Verification Owner. Each Verification Method for each Requirement is defined as a Verification Activity and mapped into Verification Events, which are collections of activities that can be executed concurrently in an efficient and complementary way. Verification Event dependency and sequences are modeled using Activity Diagrams. The methodology employed also ties in to the Project Management Control System (PMCS), which utilizes Primavera P6 software, mapping each Verification Activity as a step in a planned activity. This approach leads to full traceability from initial Requirement to scheduled, costed, and resource loaded PMCS task-based activities, ensuring all requirements will be verified.
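A rough sketch of the requirement-to-verification-plan structure described above, in plain Python rather than SysML (field names and the example requirement are illustrative, not LSST data):

```python
# Rough sketch of the requirement-to-verification-plan structure (illustrative
# field names; not the LSST model or the Enterprise Architect representation).
from dataclasses import dataclass, field
from typing import List

@dataclass
class VerificationActivity:
    method: str                 # e.g. "Test", "Analysis", "Inspection", "Demonstration"
    level: str                  # e.g. "Subsystem", "System"
    owner: str

@dataclass
class VerificationPlan:
    requirement_id: str
    verification_requirement: str
    success_criteria: str
    activities: List[VerificationActivity] = field(default_factory=list)

plan = VerificationPlan(
    requirement_id="LSST-REQ-0042",          # hypothetical identifier
    verification_requirement="Verify tracking jitter under nominal wind load.",
    success_criteria="Jitter below the specified bound in 95% of trials.",
    activities=[VerificationActivity("Test", "System", "Telescope & Site")],
)
print(plan.requirement_id, "->", [a.method for a in plan.activities])
```

Activities grouped this way can then be mapped onto Verification Events and scheduled tasks, mirroring the traceability chain the paper describes.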
Muller, Jennifer K.; Sepulveda, Maria S.; Borgert, Christopher J.; Gross, Timothy S.
2005-01-01
This work describes the uptake of two organochlorine pesticides from slow-release pellets by largemouth bass and the utility of a blood plasma enzyme-linked immunosorbent assay (ELISA) method for exposure verification. We measured blood and tissue levels by gas chromatography/mass spectrometry and by a novel ELISA method, and present a critical comparison of the results.
Satellite Communication Hardware Emulation System (SCHES)
NASA Technical Reports Server (NTRS)
Kaplan, Ted
1993-01-01
Satellite Communication Hardware Emulator System (SCHES) is a powerful simulator that emulates the hardware used in TDRSS links. SCHES is a true bit-by-bit simulator that models communications hardware accurately enough to be used as a verification mechanism for actual hardware tests on user spacecraft. As a credit to its modular design, SCHES is easily configurable to model any user satellite communication link, though some development may be required to tailor existing software to user specific hardware.
Glove-based approach to online signature verification.
Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A
2008-06-01
Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel online signature verification system using the Singular Value Decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors that capture the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-dimensional principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove as an effective high-bandwidth data entry device for signature verification is presented. This SVD-based signature verification technique was tested and shown to detect forged signatures with a false acceptance rate of less than 1.2%.
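The abstract gives only the outline of the SVD-based method, so the following is a minimal numpy sketch of the idea under our own assumptions (the subspace dimension r, the acceptance threshold, and the synthetic data are illustrative, not the paper's values): build the r-dimensional principal subspace of a glove data matrix and compare signatures by the principal angles between subspaces.

```python
import numpy as np

def principal_subspace(A, r):
    """Left singular vectors spanning the r-dimensional principal subspace of A."""
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :r]

def subspace_angles(U1, U2):
    """Principal angles (radians) between subspaces with orthonormal bases U1, U2."""
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Hypothetical usage: rows = glove channels, columns = time samples.
rng = np.random.default_rng(0)
enrolled = rng.standard_normal((22, 200))                      # reference capture
candidate = enrolled + 0.05 * rng.standard_normal((22, 200))   # claimed signature

r = 3                                           # illustrative subspace dimension
angles = subspace_angles(principal_subspace(enrolled, r),
                         principal_subspace(candidate, r))
accept = angles.max() < 0.2                     # illustrative threshold, not the paper's
print(angles, accept)
```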
An Extended Proof-Carrying Code Framework for Security Enforcement
NASA Astrophysics Data System (ADS)
Pirzadeh, Heidar; Dubé, Danny; Hamou-Lhadj, Abdelwahab
The rapid growth of the Internet has resulted in increased attention to security to protect users from being victims of security threats. In this paper, we focus on security mechanisms that are based on Proof-Carrying Code (PCC) techniques. In a PCC system, a code producer sends code along with its safety proof to the consumer. The consumer executes the code only if the proof is valid. Although PCC has been shown to be a useful security framework, it suffers from the sheer size of typical proofs: proofs of even small programs can be considerably large. In this paper, we propose an extended PCC framework (EPCC) in which, instead of the proof, a proof generator for the program in question is transmitted. This framework enables the execution of the proof generator and the recovery of the proof on the consumer's side in a secure manner using a newly created virtual machine called the VEP (Virtual Machine for Extended PCC).
Jin, Long; Liao, Bolin; Liu, Mei; Xiao, Lin; Guo, Dongsheng; Yan, Xiaogang
2017-01-01
By incorporating the physical constraints in joint space, a different-level simultaneous minimization scheme, which takes both the robot kinematics and robot dynamics into account, is presented and investigated for fault-tolerant motion planning of a redundant manipulator in this paper. The scheme is reformulated as a quadratic program (QP) with equality and bound constraints, which is then solved by a discrete-time recurrent neural network. Simulative verifications based on a six-link planar redundant robot manipulator substantiate the efficacy and accuracy of the presented acceleration fault-tolerant scheme, the resultant QP, and the corresponding discrete-time recurrent neural network. PMID:28955217
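The abstract does not give the network equations, so the following is only a simplified sketch of a discrete-time recurrent (projected-gradient) iteration for a bound-constrained QP; it omits the equality constraints and the manipulator-specific formulation, and the matrices are illustrative.

```python
import numpy as np

def box_qp_recurrent(Q, c, lo, hi, h=0.05, iters=2000):
    """Discrete-time projected dynamics x <- P_box(x - h*(Qx + c)) for
    minimize 0.5 x'Qx + c'x subject to lo <= x <= hi (Q symmetric PSD)."""
    x = np.clip(np.zeros_like(c), lo, hi)
    for _ in range(iters):
        x = np.clip(x - h * (Q @ x + c), lo, hi)
    return x

# Hypothetical 3-variable example.
Q = np.array([[4.0, 1.0, 0.0],
              [1.0, 3.0, 0.5],
              [0.0, 0.5, 2.0]])
c = np.array([-1.0, 0.5, -2.0])
lo, hi = -np.ones(3), np.ones(3)
print(box_qp_recurrent(Q, c, lo, hi))
```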
Linking Simulation with Formal Verification and Modeling of Wireless Sensor Network in TLA+
NASA Astrophysics Data System (ADS)
Martyna, Jerzy
In this paper, we present the results of the simulation of a wireless sensor network based on the flooding technique and SPIN protocols. The wireless sensor network was specified and verified by means of the TLA+ specification language [1]. For the model of the wireless sensor network built this way, simulation was carried out with the help of specially constructed software tools. The obtained results allow us to predict the behaviour of the wireless sensor network in various topologies and spatial densities. Visualization of the output data enables precise examination of phenomena in wireless sensor networks, such as the hidden terminal problem.
ERIC Educational Resources Information Center
MONTAGU, ASHLEY
A DISCUSSION ON THE VARIOUS RACES WAS PRESENTED. STATISTICS SHOWED THAT LIKENESSES AMONG GROUPS WERE ABOUT 95 PERCENT, WHILE DIFFERENCES WERE ONLY 5 PERCENT. FROM THE BIOLOGICAL STANDPOINT, THERE WAS NO PHYSICALLY INFERIOR OR PHYSICALLY SUPERIOR RACIAL TRAITS. THERE WAS NO PROOF THAT "RACE" AND INTELLIGENCE WERE LINKED. RATHER EVIDENCE…
ERIC Educational Resources Information Center
Fiallo, Jorge; Gutiérrez, Angel
2017-01-01
We present results from a classroom-based intervention designed to help a class of grade 10 students (14-15 years old) learn proof while studying trigonometry in a dynamic geometry software environment. We analysed some students' solutions to conjecture-and-proof problems that let them gain experience in stating conjectures and developing proofs.…
Putting time into proof outlines
NASA Technical Reports Server (NTRS)
Schneider, Fred B.; Bloom, Bard; Marzullo, Keith
1993-01-01
A logic for reasoning about timing properties of concurrent programs is presented. The logic is based on Hoare-style proof outlines and can handle maximal parallelism as well as certain resource-constrained execution environments. The correctness proof for a mutual exclusion protocol that uses execution timings in a subtle way illustrates the logic in action. A soundness proof using structural operational semantics is outlined in the appendix.
Data-oriented scheduling for PROOF
NASA Astrophysics Data System (ADS)
Xu, Neng; Guan, Wen; Wu, Sau Lan; Ganis, Gerardo
2011-12-01
The Parallel ROOT Facility - PROOF - is a distributed analysis system optimized for I/O intensive analysis tasks of HEP data. With the LHC entering the analysis phase, PROOF has become a natural ingredient for computing farms at the Tier3 level. These analysis facilities will typically be used by a few tens of users, and can also be federated into a sort of analysis cloud corresponding to the Virtual Organization of the experiment. Proper scheduling is required to guarantee fair resource usage, to enforce priority policies and to optimize the throughput. In this paper we discuss an advanced priority system that we are developing for PROOF. The system has been designed to automatically adapt to the unknown length of the tasks, to take into account data location and availability (including distribution across geographically separated sites), and the {group, user} default priorities. In this system, every element - user, group, dataset, job slot and storage - gets its own priority, and those priorities are dynamically linked with each other. In order to tune the interplay between the various components, we have designed and started implementing a simulation application that can model various types and sizes of PROOF clusters. In this application a monitoring package records all changes to these priorities so that we can easily understand and tune the performance. We will discuss the status of our simulation and show examples of the results we are expecting from it.
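The abstract does not specify how the linked priorities are combined, so the following is a purely illustrative sketch of one way an effective job priority could be composed from group and user defaults, data locality, and waiting time; the weights and names are assumptions, not PROOF's actual scheduler.

```python
# Purely illustrative: one way the linked element priorities mentioned above
# (group, user, dataset locality, aging) could be combined. The weights and
# names are assumptions, not PROOF's scheduling algorithm.
def effective_priority(group_prio, user_prio, local_fraction, waiting_time_s,
                       aging_per_min=0.01):
    locality_boost = 0.5 + 0.5 * local_fraction      # favour jobs whose data is local
    aging = 1.0 + aging_per_min * (waiting_time_s / 60.0)
    return group_prio * user_prio * locality_boost * aging

print(effective_priority(group_prio=1.0, user_prio=0.8,
                         local_fraction=0.9, waiting_time_s=300))
```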
Assurance Cases for Proofs as Evidence
NASA Technical Reports Server (NTRS)
Chaki, Sagar; Gurfinkel, Arie; Wallnau, Kurt; Weinstock, Charles
2009-01-01
Proof-carrying code (PCC) provides a 'gold standard' for establishing formal and objective confidence in program behavior. However, in order to extend the benefits of PCC - and other formal certification techniques - to realistic systems, we must establish the correspondence of a mathematical proof of a program's semantics and its actual behavior. In this paper, we argue that assurance cases are an effective means of establishing such a correspondence. To this end, we present an assurance case pattern for arguing that a proof is free from various proof hazards. We also instantiate this pattern for a proof-based mechanism to provide evidence about generic medical device software.
Verification testing of the SeptiTech Model 400 System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing was u...
NASA Astrophysics Data System (ADS)
Sindiy, Oleg V.
This dissertation presents a model-based system-of-systems engineering (SoSE) approach as a design philosophy for architecting in system-of-systems (SoS) problems. SoS refers to a special class of systems in which numerous systems with operational and managerial independence interact to generate new capabilities that satisfy societal needs. Design decisions are more complicated in a SoS setting. A revised Process Model for SoSE is presented to support three phases in SoS architecting: defining the scope of the design problem, abstracting key descriptors and their interrelations in a conceptual model, and implementing computer-based simulations for architectural analyses. The Process Model enables improved decision support considering multiple SoS features and develops computational models capable of highlighting configurations of organizational, policy, financial, operational, and/or technical features. Further, processes for verification and validation of SoS models and simulations are also important due to potential impact on critical decision-making and, thus, are addressed. Two research questions frame the research efforts described in this dissertation. The first concerns how the four key sources of SoS complexity---heterogeneity of systems, connectivity structure, multi-layer interactions, and the evolutionary nature---influence the formulation of SoS models and simulations, trade space, and solution performance and structure evaluation metrics. The second question pertains to the implementation of SoSE architecting processes to inform decision-making for a subset of SoS problems concerning the design of information exchange services in space-based operations domain. These questions motivate and guide the dissertation's contributions. A formal methodology for drawing relationships within a multi-dimensional trade space, forming simulation case studies from applications of candidate architecture solutions to a campaign of notional mission use cases, and executing multi-purpose analysis studies is presented. These efforts are coupled to the generation of aggregate and time-dependent solution performance metrics via the hierarchical decomposition of objectives and the analytical recomposition of multi-attribute qualitative program drivers from quantifiable measures. This methodology was also applied to generate problem-specific solution structure evaluation metrics that facilitate the comparison of alternate solutions at a high level of aggregation, at lower levels of abstraction, and to relate options for design variables with associated performance values. For proof-of-capability demonstration, the selected application problem concerns the design of command, control, communication, and information (C3I) architecture services for a notional campaign of crewed and robotic lunar surface missions. The impetus for the work was the demonstration of using model-based SoSE for design of sustainable interoperability capabilities between all data and communication assets in extended lunar campaigns. A comprehensive Lunar C3I simulation tool was developed by a team of researchers at Purdue University in support of NASA's Constellation Program; the author of this dissertation was a key contributor to the creation of this tool and made modifications and extensions to key components relevant to the methodological concepts presented in this dissertation. The dissertation concludes with a presentation of example results based on the interrogation of the constructed Lunar C3I computational model. 
The results are based on a family of studies, structured around a trade-tree of architecture options, which were conducted to test the hypothesis that the SoSE approach is efficacious for information-exchange architecture design in the space exploration domain. Included in the family of proof-of-capability studies is a simulation of the Apollo 17 mission, which not only allows for partial verification and validation of the model, but also provides insights for prioritizing future model design iterations to make it a more realistic representation of the "real world." A caveat within the results presented is that they serve within the capacity of a proof-of-capability demonstration, and as such, they are a product of models and analyses that need further development before the tool's results can be employed for decision-making. Additional discussion is provided for how to further develop and validate the Lunar C3I tool and also to make it extensible to other SoS design problems of similar nature in space exploration and other problem application domains.
Z-2 Architecture Description and Requirements Verification Results
NASA Technical Reports Server (NTRS)
Graziosi, Dave; Jones, Bobby; Ferl, Jinny; Scarborough, Steve; Hewes, Linda; Ross, Amy; Rhodes, Richard
2016-01-01
The Z-2 Prototype Planetary Extravehicular Space Suit Assembly is a continuation of NASA's Z series of spacesuits. The Z-2 is another step in NASA's technology development roadmap leading to human exploration of the Martian surface. The suit was designed for maximum mobility at 8.3 psid, reduced mass, and to have high fidelity life support interfaces. As Z-2 will be man-tested at full vacuum in NASA JSC's Chamber B, it was manufactured as Class II, making it the most flight-like planetary walking suit produced to date. The Z-2 suit architecture is an evolution of previous EVA suits, namely the ISS EMU, Mark III, Rear Entry I-Suit and Z-1 spacesuits. The suit is a hybrid hard and soft multi-bearing, rear entry spacesuit. The hard upper torso (HUT) is an all-composite structure and includes a 2-bearing rolling convolute shoulder with Vernier sizing mechanism, removable suit port interface plate (SIP), elliptical hemispherical helmet and self-don/doff shoulder harness. The hatch is a hybrid aluminum and composite construction with Apollo style gas connectors, custom water pass-thru, removable hatch cage and interfaces to primary and auxiliary life support feed water bags. The suit includes Z-1 style lower arms with cam brackets for Vernier sizing and government furnished equipment (GFE) Phase VI gloves. The lower torso includes a telescopic waist sizing system, waist bearing, rolling convolute waist joint, hard brief, 2 bearing soft hip thigh, Z-1 style legs with ISS EMU style cam brackets for sizing, and conformal walking boots with ankle bearings. The Z-2 Requirements Verification Plan includes the verification of more than 200 individual requirements. The verification methods include test, analysis, inspection, demonstration or a combination of methods. Examples of unmanned requirements include suit leakage, proof pressure testing, operational life, mass, isometric man-loads, sizing adjustment ranges, internal and external interfaces such as in-suit drink bag, partial pressure relief valve, purge valve, donning stand and ISS Body Restraint Tether (BRT). Examples of manned requirements include verification of anthropometric range, suit self-don/doff, secondary suit exit method, donning stand self-ingress/egress and manned mobility covering eight functional tasks. The eight functional tasks include kneeling with object pick-up, standing toe touch, cross-body reach, walking, reach to the SIP and helmet visor. This paper will provide an overview of the Z-2 design. Z-2 requirements verification testing was performed with NASA at the ILC Houston test facility. This paper will also discuss pre-delivery manned and unmanned test results as well as analysis performed in support of requirements verification.
Real-Time Peer Review: An Innovative Feature to an Evidence-Based Practice Conference
Eldredge, Jonathan D.; Phillips, Holly E.; Kroth, Philip J.
2013-01-01
Many health sciences librarians as well as other professionals attend conferences on a regular basis. This study sought to link an innovative peer review process of presented research papers to long-term conference outcomes in the peer-reviewed professional journal literature. An evidence-based conference included a proof-of-concept study to gauge the long-term outcomes from research papers presented during the program. Real-time peer review recommendations from the conference were linked to final versions of articles published in the peer-reviewed literature. The real-time peer review feedback served as the basis for further mentoring to guide prospective authors toward publishing their research results. These efforts resulted in the publication of two of the four research papers in the peer-reviewed literature. A third presented paper appeared in a blog because the authors wanted to disseminate their findings more quickly than through the journal literature. The presenters of the fourth paper never published their study. Real-time peer review from this study can be adapted to other professional conferences that include presented research papers. PMID:24180649
IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS
The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...
Gender Verification of Female Olympic Athletes.
ERIC Educational Resources Information Center
Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.
2002-01-01
Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…
Formation of target-specific binding sites in enzymes: solid-phase molecular imprinting of HRP
NASA Astrophysics Data System (ADS)
Czulak, J.; Guerreiro, A.; Metran, K.; Canfarotta, F.; Goddard, A.; Cowan, R. H.; Trochimczuk, A. W.; Piletsky, S.
2016-05-01
Here we introduce a new concept for synthesising molecularly imprinted nanoparticles by using proteins as macro-functional monomers. For a proof-of-concept, a model enzyme (HRP) was cross-linked using glutaraldehyde in the presence of glass beads (solid-phase) bearing immobilized templates such as vancomycin and ampicillin. The cross-linking process links together proteins and protein chains, which in the presence of templates leads to the formation of permanent target-specific recognition sites without adverse effects on the enzymatic activity. Unlike complex protein engineering approaches commonly employed to generate affinity proteins, the method proposed can be used to produce protein-based ligands in a short time period using native protein molecules. These affinity materials are potentially useful tools especially for assays since they combine the catalytic properties of enzymes (for signaling) and molecular recognition properties of antibodies. We demonstrate this concept in an ELISA-format assay where HRP imprinted with vancomycin and ampicillin replaced traditional enzyme-antibody conjugates for selective detection of templates at micromolar concentrations. This approach can potentially provide a fast alternative to raising antibodies for targets that do not require high assay sensitivities; it can also find uses as a biochemical research tool, as a possible replacement for immunoperoxidase-conjugates.
Research on Quantum Algorithms at the Institute for Quantum Information
2009-10-17
accuracy threshold theorem for the one-way quantum computer. Their proof is based on a novel scheme, in which a noisy cluster state in three spatial...detected. The proof applies to independent stochastic noise but (in contrast to proofs of the quantum accuracy threshold theorem based on concatenated...proved quantum threshold theorems for long-range correlated non-Markovian noise, for leakage faults, for the one-way quantum computer, for postselected
Researchers who perform air quality modeling studies usually do so on a regional scale. Typically, the boundary conditions are generated by another model which might have a different chemical mechanism, spatial resolution, and/or map projection. Hence, a necessary conversion/inte...
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-01
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer’s forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices. PMID:28075375
An Approach to Biometric Verification Based on Human Body Communication in Wearable Devices.
Li, Jingzhen; Liu, Yuhang; Nie, Zedong; Qin, Wenjian; Pang, Zengyao; Wang, Lei
2017-01-10
In this paper, an approach to biometric verification based on human body communication (HBC) is presented for wearable devices. For this purpose, the transmission gain S21 of volunteer's forearm is measured by vector network analyzer (VNA). Specifically, in order to determine the chosen frequency for biometric verification, 1800 groups of data are acquired from 10 volunteers in the frequency range 0.3 MHz to 1500 MHz, and each group includes 1601 sample data. In addition, to achieve the rapid verification, 30 groups of data for each volunteer are acquired at the chosen frequency, and each group contains only 21 sample data. Furthermore, a threshold-adaptive template matching (TATM) algorithm based on weighted Euclidean distance is proposed for rapid verification in this work. The results indicate that the chosen frequency for biometric verification is from 650 MHz to 750 MHz. The false acceptance rate (FAR) and false rejection rate (FRR) based on TATM are approximately 5.79% and 6.74%, respectively. In contrast, the FAR and FRR were 4.17% and 37.5%, 3.37% and 33.33%, and 3.80% and 34.17% using K-nearest neighbor (KNN) classification, support vector machines (SVM), and naive Bayesian method (NBM) classification, respectively. In addition, the running time of TATM is 0.019 s, whereas the running times of KNN, SVM and NBM are 0.310 s, 0.0385 s, and 0.168 s, respectively. Therefore, TATM is suggested to be appropriate for rapid verification use in wearable devices.
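The abstract names the threshold-adaptive template matching (TATM) algorithm but not its details, so the sketch below shows one plausible reading under stated assumptions: a weighted Euclidean distance to a mean template, with an acceptance threshold adapted from the spread of the enrolled templates. The weights, margin, and synthetic data are illustrative, not the paper's.

```python
import numpy as np

def weighted_euclidean(x, template, w):
    return np.sqrt(np.sum(w * (x - template) ** 2))

def tatm_verify(sample, templates, w, margin=1.2):
    """Threshold-adaptive template matching sketch: the acceptance threshold is
    derived from the enrolled templates themselves (their spread around the mean
    template), scaled by a margin, rather than being a fixed global constant."""
    mean_t = templates.mean(axis=0)
    intra = np.array([weighted_euclidean(t, mean_t, w) for t in templates])
    threshold = margin * intra.max()          # adaptive, per-user threshold
    return weighted_euclidean(sample, mean_t, w) <= threshold

# Hypothetical usage with 21 S21 samples per group, as in the abstract.
rng = np.random.default_rng(1)
enrolled = rng.normal(0.0, 1.0, size=(30, 21))      # 30 enrolled groups
weights = np.ones(21) / 21                           # illustrative equal weights
probe = enrolled[0] + 0.1 * rng.standard_normal(21)
print(tatm_verify(probe, enrolled, weights))
```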
Verification testing of the F.R. Mahoney Amphidrome System was conducted over a twelve month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at the Otis Air National Guard Base in Bourne, MA. Sanitary Sewerage from the base residential housing w...
An Agent-Based Dynamic Model for Analysis of Distributed Space Exploration Architectures
NASA Astrophysics Data System (ADS)
Sindiy, Oleg V.; DeLaurentis, Daniel A.; Stein, William B.
2009-07-01
A range of complex challenges, but also potentially unique rewards, underlie the development of exploration architectures that use a distributed, dynamic network of resources across the solar system. From a methodological perspective, the prime challenge is to systematically model the evolution (and quantify comparative performance) of such architectures, under uncertainty, to effectively direct further study of specialized trajectories, spacecraft technologies, concept of operations, and resource allocation. A process model for System-of-Systems Engineering is used to define time-varying performance measures for comparative architecture analysis and identification of distinguishing patterns among interoperating systems. Agent-based modeling serves as the means to create a discrete-time simulation that generates dynamics for the study of architecture evolution. A Solar System Mobility Network proof-of-concept problem is introduced representing a set of longer-term, distributed exploration architectures. Options within this set revolve around deployment of human and robotic exploration and infrastructure assets, their organization, interoperability, and evolution, i.e., a system-of-systems. Agent-based simulations quantify relative payoffs for a fully distributed architecture (which can be significant over the long term), the latency period before they are manifest, and the up-front investment (which can be substantial compared to alternatives). Verification and sensitivity results provide further insight on development paths and indicate that the framework and simulation modeling approach may be useful in architectural design of other space exploration mass, energy, and information exchange settings.
An evaluation of the pressure proof test concept for 2024-T3 aluminium alloy sheet
NASA Technical Reports Server (NTRS)
Dawicke, D. S.; Poe, C. C., Jr.; Newman, J. C.; Harris, C. E.
1991-01-01
The concept of pressure proof testing of fuselage structures with fatigue cracks to insure structural integrity was evaluated from a fracture mechanics viewpoint. A generic analytical and experimental investigation was conducted on uniaxially loaded flat panels with crack configurations and stress levels typical of longitudinal lap splice joints in commercial transport aircraft fuselages. The results revealed that the remaining fatigue life after a proof cycle was longer than that without the proof cycle because of crack growth retardation due to increased crack closure. However, based on a crack length that is slightly less than the critical value at the maximum proof stress, the minimum assured life or proof test interval must be no more than 550 pressure cycles for a 1.33 proof factor and 1530 pressure cycles for a 1.5 proof factor to prevent in-flight failures.
An evaluation of the pressure proof test concept for thin sheet 2024-T3
NASA Technical Reports Server (NTRS)
Dawicke, D. S.; Poe, C. C., Jr.; Newman, J. C., Jr.; Harris, C. E.
1990-01-01
The concept of pressure proof testing of fuselage structures with fatigue cracks to insure structural integrity was evaluated from a fracture mechanics viewpoint. A generic analytical and experimental investigation was conducted on uniaxially loaded flat panels with crack configurations and stress levels typical of longitudinal lap-splice joints in commercial transport aircraft fuselage. The results revealed that the remaining fatigue life after a proof test was longer than that without the proof test because of crack growth retardation due to increased crack closure. However, based on a crack length that is slightly less than the critical value at the maximum proof test stress, the minimum assured life or proof test interval must be no more than 550 pressure cycles for a 1.33 proof factor and 1530 pressure cycles for a 1.5 proof factor to prevent in-flight failures.
An evaluation of the pressure proof test concept for thin sheet 2024-T3
NASA Technical Reports Server (NTRS)
Dawicke, D. S.; Poe, C. C., Jr.; Newman, James C., Jr.; Harris, Charles E.
1990-01-01
The concept of pressure proof testing of fuselage structures with fatigue cracks to insure structural integrity was evaluated from a fracture mechanics viewpoint. A generic analytical and experimental investigation was conducted on uniaxially loaded flat panels with crack configurations and stress levels typical of longitudinal lap splice joints in commercial transport aircraft fuselages. The results revealed that the remaining fatigue life after a proof test was longer than that without the proof test because of crack growth retardation due to increased crack closure. However, based on a crack length that is slightly less than the critical value at the maximum proof test stress, the minimum assured life or proof test interval must be no more than 550 pressure cycles for a 1.33 proof factor and 1530 pressure cycles for a 1.5 proof factor to prevent in-flight failures.
[Understanding mistake-proofing].
de Saint Maurice, G; Giraud, N; Ausset, S; Auroy, Y; Lenoir, B; Amalberti, R
2011-01-01
The mistake-proofing concept often refers to physical devices that prevent actors from making a wrong action. In anaesthesiology, one immediately thinks of the specific design of outlets for medical gases. More generally, the principle of mistake-proofing is to avoid an error by placing knowledge in the world rather than knowledge in the head. As often happens in risk management, healthcare has received knowledge transferred from industry. The computer is changing the concept of mistake-proofing, which was initially based on physical design, as in the aerospace and automotive industries. The mistake-proofing concept may be applied to the prevention, detection, and mitigation of errors. Forcing functions are a specific part of mistake-proofing: they prevent a wrong action or they force a virtuous one. Grout proposes a little shortcut to identify mistake-proofing devices: "If it is not possible to picture it in action, it is probably not a mistake-proofing device". Copyright © 2010 Elsevier Masson SAS. All rights reserved.
Optimal periodic proof test based on cost-effective and reliability criteria
NASA Technical Reports Server (NTRS)
Yang, J.-N.
1976-01-01
An exploratory study for the optimization of periodic proof tests for fatigue-critical structures is presented. The optimal proof load level and the optimal number of periodic proof tests are determined by minimizing the total expected (statistical average) cost, while the constraint on the allowable level of structural reliability is satisfied. The total expected cost consists of the expected cost of proof tests, the expected cost of structures destroyed by proof tests, and the expected cost of structural failure in service. It is demonstrated by numerical examples that significant cost saving and reliability improvement for fatigue-critical structures can be achieved by the application of the optimal periodic proof test. The present study is relevant to the establishment of optimal maintenance procedures for fatigue-critical structures.
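A notional statement of the optimization described above may help fix ideas; the symbols below are ours, not the paper's.

```latex
% Notional formulation; symbols are ours, not the paper's.
\min_{n,\;\sigma_p}\ \mathbb{E}[C]
  = n\,C_t
  + C_d\,\mathbb{E}\!\left[N_d(n,\sigma_p)\right]
  + C_f\,P_f(n,\sigma_p)
\qquad \text{subject to } 1 - P_f(n,\sigma_p) \ge R_{\min},
```

where n is the number of periodic proof tests, \sigma_p the proof load level, C_t the cost of one proof test, C_d the cost of a structure destroyed during proof testing, C_f the cost of an in-service structural failure, N_d the number of structures destroyed by proof tests, P_f the probability of in-service failure, and R_min the allowable reliability level.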
A Semantic Basis for Proof Queries and Transformations
NASA Technical Reports Server (NTRS)
Aspinall, David; Denney, Ewen W.; Luth, Christoph
2013-01-01
We extend the query language PrQL, designed for inspecting machine representations of proofs, to also allow transformation of proofs. PrQL natively supports hiproofs which express proof structure using hierarchically nested labelled trees, which we claim is a natural way of taming the complexity of huge proofs. Query-driven transformations enable manipulation of this structure, in particular, to transform proofs produced by interactive theorem provers into forms that assist their understanding, or that could be consumed by other tools. In this paper we motivate and define basic transformation operations, using an abstract denotational semantics of hiproofs and queries. This extends our previous semantics for queries based on syntactic tree representations. We define update operations that add and remove sub-proofs, and manipulate the hierarchy to group and ungroup nodes. We show that
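As a rough illustration of hiproofs as hierarchically nested labelled trees and of the group/ungroup transformations mentioned above, here is a small Python sketch; the type and function names are ours, not PrQL's syntax or semantics.

```python
# Illustrative sketch of hiproofs as nested labelled trees with group/ungroup
# transformations. Names are ours, not PrQL's.
from dataclasses import dataclass, field
from typing import List

@dataclass
class HiNode:
    label: str
    children: List["HiNode"] = field(default_factory=list)

def group(parent: HiNode, indices: List[int], label: str) -> None:
    """Collapse the children at `indices` under a new labelled node."""
    picked = [parent.children[i] for i in indices]
    rest = [c for i, c in enumerate(parent.children) if i not in indices]
    parent.children = rest
    parent.children.insert(min(indices), HiNode(label, picked))

def ungroup(parent: HiNode, index: int) -> None:
    """Splice a child's children back into the parent (inverse of group)."""
    node = parent.children.pop(index)
    parent.children[index:index] = node.children

root = HiNode("theorem", [HiNode("simp"), HiNode("rewrite"), HiNode("auto")])
group(root, [0, 1], "normalise")   # hide two steps under one labelled node
ungroup(root, 0)                   # and expose them again
print(root)
```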
NASA Technical Reports Server (NTRS)
Saito, Jim
1987-01-01
The user guide of verification and validation (V&V) tools for the Automated Engineering Design (AED) language is specifically written to update the information found in several documents pertaining to the automated verification of flight software tools. The intent is to provide, in one document, all the information necessary to adequately prepare a run to use the AED V&V tools. No attempt is made to discuss the FORTRAN V&V tools since they were not updated and are not currently active. Additionally, the current descriptions of the AED V&V tools are included and provide information to augment NASA TM 84276. The AED V&V tools are accessed from the digital flight control systems verification laboratory (DFCSVL) via a PDP-11/60 digital computer. The AED V&V tool interface handlers on the PDP-11/60 generate a Univac run stream which is transmitted to the Univac via a Remote Job Entry (RJE) link. Job execution takes place on the Univac 1100 and the job output is transmitted back to the DFCSVL and stored as a PDP-11/60 printfile.
Use of Unstructured Event-Based Reports for Global Infectious Disease Surveillance
Blench, Michael; Tolentino, Herman; Freifeld, Clark C.; Mandl, Kenneth D.; Mawudeku, Abla; Eysenbach, Gunther; Brownstein, John S.
2009-01-01
Free or low-cost sources of unstructured information, such as Internet news and online discussion sites, provide detailed local and near real-time data on disease outbreaks, even in countries that lack traditional public health surveillance. To improve public health surveillance and, ultimately, interventions, we examined 3 primary systems that process event-based outbreak information: Global Public Health Intelligence Network, HealthMap, and EpiSPIDER. Despite similarities among them, these systems are highly complementary because they monitor different data types, rely on varying levels of automation and human analysis, and distribute distinct information. Future development should focus on linking these systems more closely to public health practitioners in the field and establishing collaborative networks for alert verification and dissemination. Such development would further establish event-based monitoring as an invaluable public health resource that provides critical context and an alternative to traditional indicator-based outbreak reporting. PMID:19402953
Update: Advancement of Contact Dynamics Modeling for Human Spaceflight Simulation Applications
NASA Technical Reports Server (NTRS)
Brain, Thomas A.; Kovel, Erik B.; MacLean, John R.; Quiocho, Leslie J.
2017-01-01
Pong is a new software tool developed at the NASA Johnson Space Center that advances interference-based geometric contact dynamics based on 3D graphics models. The Pong software consists of three parts: a set of scripts to extract geometric data from 3D graphics models, a contact dynamics engine that provides collision detection and force calculations based on the extracted geometric data, and a set of scripts for visualizing the dynamics response with the 3D graphics models. The contact dynamics engine can be linked with an external multibody dynamics engine to provide an integrated multibody contact dynamics simulation. This paper provides a detailed overview of Pong including the overall approach and modeling capabilities, which encompasses force generation from contact primitives and friction to computational performance. Two specific Pong-based examples of International Space Station applications are discussed, and the related verification and validation using this new tool are also addressed.
Simulation verification techniques study
NASA Technical Reports Server (NTRS)
Schoonmaker, P. B.; Wenglinski, T. H.
1975-01-01
Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.
Practical proof of CP element based design for 14nm node and beyond
NASA Astrophysics Data System (ADS)
Maruyama, Takashi; Takita, Hiroshi; Ikeno, Rimon; Osawa, Morimi; Kojima, Yoshinori; Sugatani, Shinji; Hoshino, Hiromi; Hino, Toshio; Ito, Masaru; Iizuka, Tetsuya; Komatsu, Satoshi; Ikeda, Makoto; Asada, Kunihiro
2013-03-01
To realize HVM (High Volume Manufacturing) with CP (Character Projection) based EBDW, shot count reduction is the essential key. All device circuits should be composed of predefined character parts; we call this methodology "CP element based design". In our previous work, we presented the following three concepts [2]. 1) Memory: we reported the prospects of affordability for the CP-stencil resource. 2) Logic cell: we adopted a multi-cell clustering approach in the physical synthesis. 3) Random interconnect: we proposed an ultra-regular layout scheme using fixed-size wiring tiles containing repeated tracks and cutting points at the tile edges. In this paper, we report the experimental proofs of these methodologies. In full chip layout, CP stencil resource management is the critical key. From the MCC-POC (Proof of Concept) result [1], we assumed the total available CP stencil resource to be 9000um2. All circuit macros must be laid out within this restriction. The assignment of CP-stencil resources for the memory macros is the most important issue, since they consume a considerable share of the resource because of the various line-ups such as 1RW-, 2RW-SRAMs, Register Files and ROM, which require several varieties of large peripheral circuits. Furthermore, memory macros typically occupy more than 40% of die area in leading-edge logic LSI products, so the impact of any shot count increase is serious. To save CP-stencil resources, we constructed an automatic CP analyzing system. We developed two extraction modes: simple division by block and layout repeatability recognition. By properly controlling these modes based upon the characteristics of each peripheral circuit, we could minimize the consumption of CP stencil resources. An estimation for the 14nm technology node was performed based on the analysis of a practical memory compiler. The required resource for memory macros proved to be an affordable value of 60% of the full CP stencil resource, and the wafer-level converted shot count proved to be at a level which meets 100WPH throughput. In logic cell design, the circuit performance after cell clustering was verified. Cell clustering based on physical distance proved to incur a large penalty, mainly in wiring length. To reduce this design penalty, we proposed CP cell clustering based on logical distance. For shot-count reduction in random interconnect area design, we proposed a more structural routing architecture which consists of track exchange and via position arrangement. Putting these design approaches together, we can design CP stencils to hit the target throughput within the area constraint. The analysis of other macros such as analog, I/O, and DUMMY showed that no special CP design approach is needed beyond legacy pattern-matching CP extraction. From all these experimental results we obtain good prospects for the reality of full CP element based layout.
Deductive Verification of Cryptographic Software
NASA Technical Reports Server (NTRS)
Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara
2009-01-01
We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.
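Self-composition reduces a two-run property (such as non-interference or absence of error propagation) to a property of a single program composed with a renamed copy of itself. The following toy sketch, unrelated to the actual RC4 development, illustrates the idea by exhaustively checking the self-composed property on a tiny domain; the function and variable names are illustrative assumptions.

```python
# Minimal illustration of the self-composition idea (not the RC4 development):
# a program P is non-interferent if running it twice on inputs that agree on the
# public part yields outputs that agree on the public part. Here the "proof" is
# replaced by an exhaustive check over a tiny domain.
from itertools import product

def program(public: int, secret: int) -> int:
    # candidate program: its output must not depend on `secret`
    return (public * 3 + 1) % 8

def self_composed_check(domain):
    return all(program(pub, s1) == program(pub, s2)
               for pub, s1, s2 in product(domain, domain, domain))

print(self_composed_check(range(8)))   # True: no flow from secret to output
```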
Deriving Safety Cases from Machine-Generated Proofs
NASA Technical Reports Server (NTRS)
Basir, Nurlida; Fischer, Bernd; Denney, Ewen
2009-01-01
Proofs provide detailed justification for the validity of claims and are widely used in formal software development methods. However, they are often complex and difficult to understand, because they use machine-oriented formalisms; they may also be based on assumptions that are not justified. This causes concerns about the trustworthiness of using formal proofs as arguments in safety-critical applications. Here, we present an approach to develop safety cases that correspond to formal proofs found by automated theorem provers and reveal the underlying argumentation structure and top-level assumptions. We concentrate on natural deduction proofs and show how to construct the safety cases by covering the proof tree with corresponding safety case fragments.
Phenol red-silk tyrosine cross-linked hydrogels.
Sundarakrishnan, Aswin; Herrero Acero, Enrique; Coburn, Jeannine; Chwalek, Karolina; Partlow, Benjamin; Kaplan, David L
2016-09-15
Phenol red is a cytocompatible pH sensing dye that is commonly added to cell culture media, but removed from some media formulations due to its structural mimicry of estrogen. Phenol red free media is also used during live cell imaging, to avoid absorbance and fluorescence quenching of fluorophores. To overcome these complications, we developed cytocompatible and degradable phenol red-silk tyrosine cross-linked hydrogels using horseradish peroxidase (HRP) enzyme and hydrogen peroxide (H2O2). Phenol red added to silk during tyrosine crosslinking accelerated di-tyrosine formation in a concentration-dependent reaction. Phenol red diffusion studies and UV-Vis spectra of phenol red-silk tyrosine hydrogels at different pHs showed altered absorption bands, confirming entrapment of dye within the hydrogel network. LC-MS of HRP-reacted phenol red and N-acetyl-l-tyrosine reaction products confirmed covalent bonds between the phenolic hydroxyl group of phenol red and tyrosine on the silk. At lower phenol red concentrations, leak-proof hydrogels which did not release phenol red were fabricated and found to be cytocompatible based on live-dead staining and alamar blue assessments of encapsulated fibroblasts. Due to the spectral overlap between phenol red absorbance at 415nm and di-tyrosine fluorescence at 417nm, phenol red-silk hydrogels provide both absorbance and fluorescence-based pH sensing. With an average pKa of 6.8 and good cytocompatibility, phenol red-silk hydrogels are useful for pH sensing in phenol red free systems, cellular microenvironments and bioreactors. Phenol red entrapped within hydrogels facilitates pH sensing in phenol red free environments. Leak-proof phenol red based pH sensors require covalent binding techniques, but are complicated due to the lack of amino or carboxyl groups on phenol red. Currently, there is no simple, reliable technique to covalently link phenol red to hydrogel matrices, for real-time pH sensing in cell culture environments. Herein, we take advantage of phenolic groups for covalent linkage of phenol red to silk tyrosine in the presence of HRP and H2O2. The novelty of the current system stems from its simplicity and the use of silk protein to create a cytocompatible, degradable sensor capable of real-time pH sensing in cell culture microenvironments. Copyright © 2016 Acta Materialia Inc. Published by Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Feriyanto
2018-01-01
This research aims to describe students' ability in mathematical proof when determining the validity of arguments, viewed in terms of gender differences. The subjects of this research were one male and one female student in the fifth semester of a Mathematics Education study program. The subjects were selected based on the highest mathematics ability, which was assessed from their previous assignments and tests. In addition, the communication ability of the subjects was also considered in order to facilitate the researcher in conducting interviews. Based on the results of the test with direct and indirect proof, it could be concluded that the subjects were able to: 1) mention all facts/premises and write what should be shown (the conclusion) in direct proof, and write the additional premise in indirect proof; 2) connect facts/premises to the concepts which must be mastered; 3) use equivalent concepts to manipulate and organize the proof; 4) use the concept of syllogism and modus tollens to obtain the desired conclusion; 5) construct mathematical evidence systematically and logically; 6) complement the reason for each step appropriately. The difference was that the male subject wrote the final conclusion, while the female subject did not write the final conclusion of the proof.
Advanced in-production hotspot prediction and monitoring with micro-topography
NASA Astrophysics Data System (ADS)
Fanton, P.; Hasan, T.; Lakcher, A.; Le-Gratiet, B.; Prentice, C.; Simiz, J.-G.; La Greca, R.; Depre, L.; Hunsche, S.
2017-03-01
At the 28nm technology node and below, hot spot prediction and process window control across production wafers have become increasingly critical to prevent hotspots from becoming yield-limiting defects. We previously established proof of concept for a systematic approach to identify the most critical pattern locations, i.e. hotspots, in a reticle layout by computational lithography and combining process window characteristics of these patterns with across-wafer process variation data to predict where hotspots may become yield impacting defects [1,2]. The current paper establishes the impact of micro-topography on a 28nm metal layer, and its correlation with hotspot best focus variations across a production chip layout. Detailed topography measurements are obtained from an offline tool, and pattern-dependent best focus (BF) shifts are determined from litho simulations that include mask-3D effects. We also establish hotspot metrology and defect verification by SEM image contour extraction and contour analysis. This enables detection of catastrophic defects as well as quantitative characterization of pattern variability, i.e. local and global CD uniformity, across a wafer to establish hotspot defect and variability maps. Finally, we combine defect prediction and verification capabilities for process monitoring by on-product, guided hotspot metrology, i.e. with sampling locations being determined from the defect prediction model, and achieve a prediction accuracy (capture rate) of around 75%.
Epidemiology: second-rate science?
Parascandola, M
1998-01-01
In recent years epidemiology has come under increasing criticism in regulatory and public arenas for being "unscientific." The tobacco industry has taken advantage of this, insisting for decades that evidence linking cigarettes and lung cancer falls short of proof. Moreover, many epidemiologists remain unduly skeptical and self-conscious about the status of their own causal claims. This situation persists in part because of a widespread belief that only the laboratory can provide evidence sufficient for scientific proof. Adherents of this view erroneously believe that there is no element of uncertainty or inductive inference in the "direct observation" of the laboratory researcher and that epidemiology provides mere "circumstantial" evidence. The historical roots of this attitude can be traced to philosopher John Stuart Mill and physiologist Claude Bernard and their influence on modern experimental thinking. The author uses the debate over cigarettes and lung cancer to examine ideas of proof in medical science and public health, concluding that inductive inference from a limited sample to a larger population is an element in all empirical science.
Empowering Students' Proof Learning through Communal Engagement
ERIC Educational Resources Information Center
Ko, Yi-Yin; Yee, Sean P.; Bleiler-Baxter, Sarah K.; Boyle, Justin D.
2016-01-01
This article describes the authors' three-component instructional sequence--a before-class activity, a during-class activity, and an after-class activity--which supports students in becoming self-regulated proof learners by actively developing class-based criteria for proof. All four authors implemented this sequence in their classrooms, and the…
Putting time into proof outlines
NASA Technical Reports Server (NTRS)
Schneider, Fred B.; Bloom, Bard; Marzullo, Keith
1991-01-01
A logic for reasoning about timing of concurrent programs is presented. The logic is based on proof outlines and can handle maximal parallelism as well as resource-constrained execution environments. The correctness proof for a mutual exclusion protocol that uses execution timings in a subtle way illustrates the logic in action.
Power System Test and Verification at Satellite Level
NASA Astrophysics Data System (ADS)
Simonelli, Giulio; Mourra, Olivier; Tonicello, Ferdinando
2008-09-01
Most of the articles on Power Systems deal with the architecture and technical solutions related to the functionalities of the power system and their performances. Very few articles, if any, address integration and verification aspects of the Power System at satellite level and the related issues with the Power EGSE (Electrical Ground Support Equipment), which also have to support the AIT/AIV (Assembly Integration Test and Verification) program of the satellite and, eventually, the launch campaign. In recent years a more complex development and testing concept based on MDVE (Model Based Development and Verification Environment) has been introduced. In the MDVE approach the simulation software is used to simulate the satellite environment and, in the early stages, the satellite's units. This approach changed the Power EGSE requirements significantly. Power EGSEs or, better, Power SCOEs (Special Check Out Equipment) are now requested to provide the instantaneous power generated by the solar array throughout the orbit. To achieve that, the Power SCOE interfaces to the RTS (Real Time Simulator) of the MDVE. The RTS provides the instantaneous settings, which belong to that point along the orbit, to the Power SCOE so that the Power SCOE generates the instantaneous {I,V} curve of the SA (Solar Array). That means a real-time test for the power system, which is even more valuable for EO (Earth Observation) satellites where the Solar Array aspect angle to the sun is rarely fixed and the power load profile can be particularly complex (for example, in radar applications). In this article the major issues related to integration and testing of Power Systems will be discussed, taking into account different power system topologies (i.e. regulated bus, unregulated bus, battery bus, based on MPPT or S3R…). Aspects of Power System AIT I/Fs (interfaces) and Umbilical I/Fs with the launcher and the Power SCOE I/Fs will also be addressed. Last but not least, the protection strategy of the Power System during the AIT/AIV program will also be discussed. The objective of this discussion is also to provide the Power System Engineer with a checklist of key aspects linked to the satellite AIT/AIV program that have to be considered in the early phases of a new power system development.
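To make the {I,V}-curve generation concrete, the sketch below uses a simplified single-diode solar-array model whose photocurrent scales with the illumination and sun aspect angle supplied for an orbit point; the parameters and scaling are illustrative assumptions, not a real Power SCOE algorithm.

```python
import numpy as np

def sa_iv_curve(v, sun_intensity, aspect_angle_deg,
                i_sc=2.0, i_0=1e-9, n=1.3, v_t=0.025, cells_in_series=60):
    """Simplified single-diode model of one solar-array string: the photocurrent
    scales with illumination and the cosine of the sun aspect angle."""
    i_ph = i_sc * sun_intensity * max(np.cos(np.radians(aspect_angle_deg)), 0.0)
    v_cell = v / cells_in_series
    return i_ph - i_0 * (np.exp(v_cell / (n * v_t)) - 1.0)

# Hypothetical orbit-point settings as they might be sent by the real-time simulator.
v = np.linspace(0.0, 40.0, 9)
print(np.clip(sa_iv_curve(v, sun_intensity=0.95, aspect_angle_deg=20.0), 0.0, None))
```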
Tang, G.; Andre, B.; Hoffman, F. M.; Painter, S. L.; Thornton, P. E.; Yuan, F.; Bisht, G.; Hammond, G. E.; Lichtner, P. C.; Kumar, J.; Mills, R. T.; Xu, X.
2016-04-19
This Modeling Archive is in support of an NGEE Arctic discussion paper under review and available at doi:10.5194/gmd-9-927-2016. The purpose is to document the simulations to allow verification, reproducibility, and follow-up studies. This dataset contains shell scripts to create the CLM-PFLOTRAN cases, specific input files for PFLOTRAN and CLM, outputs, and python scripts to make the figures using the outputs in the publication. Through these results, we demonstrate that CLM-PFLOTRAN can approximately reproduce CLM results in selected cases for the Arctic, temperate and tropic sites. In addition, the new framework facilitates mechanistic representations of soil biogeochemistry processes in the land surface model.
Study on perception and control layer of mine CPS with mixed logic dynamic approach
NASA Astrophysics Data System (ADS)
Li, Jingzhao; Ren, Ping; Yang, Dayu
2017-01-01
The mine inclined roadway transportation system within a mine cyber physical system is a hybrid system consisting of a continuous-time system and a discrete-time system, which can be divided into an inclined roadway signal subsystem, error-proofing channel subsystems, anti-car subsystems, and frequency control subsystems. First, to ensure stable operation and improve efficiency and production safety, this hybrid system model with n inputs and m outputs is constructed and analyzed in detail, and its steady schedule state is then solved. Second, on the basis of the formal modeling for real-time systems, we use a hybrid toolbox for system security verification. Third, the practical application to a mine cyber physical system shows that the method for real-time simulation of the mine cyber physical system is effective.
Electrical isolation and characteristics of permanent magnet-actuated valves for PDMS microfluidics.
Chen, Chang-Yu; Chen, Chang-Hung; Tu, Ting-Yuan; Lin, Cheng-Ming; Wo, Andrew M
2011-02-21
This paper presents a magnetically driven valve via a permanent magnet pressing a spacer against deformable polydimethylsiloxane (PDMS) to fully close a microchannel. Its ability for electrical isolation, time response, and resistance to backpressure are interrogated. Simulation of the valve closing process was commenced along with experimental verification. Effects of PDMS thickness, and dimension and aspect ratio of microchannels were characterized. Up to 10 GΩ electrical isolation was demonstrated, as well as 50-70 ms valve response and ∼200 kPa resistible pressure. On-demand actuation for arbitrary flow patterns further quantifies its utility. With advantages of simple fabrication, flexible valving location, and no external power requirement, the on/off valve could be leveraged for proof-of-concept microfluidic devices and other applications.
Automatic Review of Abstract State Machines by Meta Property Verification
NASA Technical Reports Server (NTRS)
Arcaini, Paolo; Gargantini, Angelo; Riccobene, Elvinia
2010-01-01
A model review is a validation technique aimed at determining whether a model is of sufficient quality; it allows defects to be identified early in system development, reducing the cost of fixing them. In this paper we propose a technique to perform automatic review of Abstract State Machine (ASM) formal specifications. We first identify a family of typical vulnerabilities and defects that a developer can introduce during modeling with ASMs, and we express such faults as violations of meta-properties that guarantee certain quality attributes of the specification. These meta-properties are then mapped to temporal logic formulas and model checked for their violation. As a proof of concept, we also report the results of applying this ASM review process to several specifications.
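As a toy illustration (not the authors' tool chain, which maps meta-properties to temporal logic and runs a model checker), the Python sketch below checks one typical meta-property, "every rule is enabled in at least one reachable state", by explicit state exploration of a small hypothetical machine; a rule whose guard never holds is reported as a specification defect.

    from collections import deque

    # Toy state machine: each rule is a (guard, update) pair over a dictionary-valued state.
    RULES = {
        "start":  (lambda s: s["mode"] == "idle",  lambda s: {**s, "mode": "run"}),
        "finish": (lambda s: s["mode"] == "run",   lambda s: {**s, "mode": "idle"}),
        "abort":  (lambda s: s["mode"] == "error", lambda s: {**s, "mode": "idle"}),  # never enabled
    }

    def never_enabled_rules(initial):
        """Meta-property check: rules whose guard holds in no reachable state."""
        seen, queue, enabled = set(), deque([initial]), set()
        while queue:
            state = queue.popleft()
            key = tuple(sorted(state.items()))
            if key in seen:
                continue
            seen.add(key)
            for name, (guard, update) in RULES.items():
                if guard(state):
                    enabled.add(name)
                    queue.append(update(state))
        return set(RULES) - enabled

    print(never_enabled_rules({"mode": "idle"}))   # {'abort'} -> flagged as a defect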
Authoring and verification of clinical guidelines: a model driven approach.
Pérez, Beatriz; Porres, Ivan
2010-08-01
The goal of this research is to provide a framework to enable authoring and verification of clinical guidelines. The framework is part of a larger research project aimed at improving the representation, quality and application of clinical guidelines in daily clinical practice. The verification process of a guideline is based on (1) model checking techniques to verify guidelines against semantic errors and inconsistencies in their definition, (2) combined with Model Driven Development (MDD) techniques, which enable us to automatically process manually created guideline specifications and the temporal-logic statements to be checked and verified against these specifications, making the verification process faster and cost-effective. In particular, we use UML statecharts to represent the dynamics of guidelines and, based on these manually defined guideline specifications, we use an MDD-based tool chain to automatically process them to generate the input model of a model checker. The model checker takes the resulting model together with the specific guideline requirements and verifies whether the guideline fulfils such properties. The overall framework has been implemented as an Eclipse plug-in named GBDSSGenerator which, starting from the UML statechart representing a guideline, allows the verification of the guideline against specific requirements. Additionally, we have established a pattern-based approach for defining commonly occurring types of requirements in guidelines. We have successfully validated our overall approach by verifying properties in different clinical guidelines, resulting in the detection of some inconsistencies in their definition. The proposed framework allows (1) the authoring and (2) the verification of clinical guidelines against specific requirements defined based on a set of property specification patterns, enabling non-experts to easily write formal specifications and thus easing the verification process. Copyright 2010 Elsevier Inc. All rights reserved.
A study of applications scribe frame data verifications using design rule check
NASA Astrophysics Data System (ADS)
Saito, Shoko; Miyazaki, Masaru; Sakurai, Mitsuo; Itoh, Takahisa; Doi, Kazumasa; Sakurai, Norioko; Okada, Tomoyuki
2013-06-01
In semiconductor manufacturing, scribe frame data is generally generated for each LSI product according to its specific process design. Scribe frame data is designed based on definition tables of scanner alignment, wafer inspection and customer-specified marks. At the end, we check that the scribe frame design conforms to the specifications of the alignment and inspection marks. Recently, in COT (customer owned tooling) business or new technology development, there is no effective verification method for the scribe frame data, and verification takes a lot of time. Therefore, we tried to establish a new verification method for scribe frame data by applying pattern matching and DRC (Design Rule Check), which is used in device verification. We present the scheme of scribe frame data verification using DRC that we applied. First, verification rules are created based on the specifications of the scanner, inspection and others, and a mark library is also created for pattern matching. Next, DRC verification is performed on the scribe frame data; this DRC verification includes pattern matching using the mark library. As a result, our experiments demonstrated that by use of pattern matching and DRC verification our new method can yield speed improvements of more than 12 percent compared to the conventional mark checks by visual inspection, and the inspection time can be reduced to less than 5 percent if multi-CPU processing is used. Our method delivers both short processing time and excellent accuracy when checking many marks. It is easy to maintain and provides an easy way for COT customers to use original marks. We believe that our new DRC verification method for scribe frame data is indispensable and mutually beneficial.
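As a schematic illustration of the kind of rule such a DRC deck encodes, the Python sketch below checks a minimum spacing rule between marks in a scribe lane; the rectangle representation of the marks, the mark names and the 40 µm rule value are hypothetical, and a production flow would run equivalent checks inside the DRC tool rather than in a script.

    def min_spacing_violations(marks, min_spacing):
        """marks: list of (name, x_min, y_min, x_max, y_max) rectangles in micrometres.
        Returns pairs of marks whose edge-to-edge gap is below min_spacing."""
        def gap(a, b):
            dx = max(b[1] - a[3], a[1] - b[3], 0.0)   # horizontal gap (0 if overlapping)
            dy = max(b[2] - a[4], a[2] - b[4], 0.0)   # vertical gap
            return (dx * dx + dy * dy) ** 0.5

        violations = []
        for i in range(len(marks)):
            for j in range(i + 1, len(marks)):
                if gap(marks[i], marks[j]) < min_spacing:
                    violations.append((marks[i][0], marks[j][0]))
        return violations

    # Hypothetical alignment/inspection marks and a 40 micron spacing rule.
    marks = [("ALN1", 0, 0, 30, 30), ("INSP1", 50, 0, 80, 30), ("ALN2", 300, 0, 330, 30)]
    print(min_spacing_violations(marks, min_spacing=40.0))   # [('ALN1', 'INSP1')]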
The challenge of computer mathematics.
Barendregt, Henk; Wiedijk, Freek
2005-10-15
Progress in the foundations of mathematics has made it possible to formulate all thinkable mathematical concepts, algorithms and proofs in one language and in an impeccable way. This is not in spite of, but partially based on the famous results of Gödel and Turing. In this way statements are about mathematical objects and algorithms, proofs show the correctness of statements and computations, and computations are dealing with objects and proofs. Interactive computer systems for a full integration of defining, computing and proving are based on this. The human defines concepts, constructs algorithms and provides proofs, while the machine checks that the definitions are well formed and the proofs and computations are correct. Results formalized so far demonstrate the feasibility of this 'computer mathematics'. Also there are very good applications. The challenge is to make the systems more mathematician-friendly, by building libraries and tools. The eventual goal is to help humans to learn, develop, communicate, referee and apply mathematics.
Insights from Smart Meters: The Potential for Peak-Hour Savings from Behavior-Based Programs
DOE Office of Scientific and Technical Information (OSTI.GOV)
Todd, Annika; Perry, Michael; Smith, Brian
The rollout of smart meters in the last several years has opened up new forms of previously unavailable energy data. Many utilities are now able to capture, in real time, granular, household-level interval usage data at very high frequency for a large proportion of their residential and small commercial customer population. This can be linked to other time- and location-specific information, providing vast, constantly growing streams of rich data (sometimes referred to by the recently popular buzzword, “big data”). Within the energy industry there is increasing interest in tapping into the opportunities that these data can provide. What can we do with all of these data? The richness and granularity of these data enable many types of creative and cutting-edge analytics. Technically sophisticated and rigorous statistical techniques can be used to pull interesting insights out of this high-frequency, human-focused data. We at LBNL are calling this “behavior analytics”. This kind of analytics has the potential to provide tremendous value to a wide range of energy programs. For example, highly disaggregated and heterogeneous information about actual energy use would allow energy efficiency (EE) and/or demand response (DR) program implementers to target specific programs to specific households; would enable evaluation, measurement and verification (EM&V) of energy efficiency programs to be performed on a much shorter time horizon than was previously possible; and would provide better insights into the energy and peak-hour savings associated with specific types of EE and DR programs (e.g., behavior-based (BB) programs). In this series, “Insights from Smart Meters”, we will present concrete, illustrative examples of the type of value that insights from behavior analytics of these data can provide (as well as pointing out its limitations). We will supply several types of key findings, including: • Novel results, which answer questions the industry previously was unable to answer; • Proof-of-concept analytics tools that can be adapted and used by others; and • Guidelines and protocols that summarize analytical best practices. This report focuses on one example of the kind of value that analysis of this data can provide: insights into whether behavior-based (BB) efficiency programs have the potential to provide peak-hour energy savings.
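A minimal sketch of one of the simplest behavior-analytics computations such interval data supports: estimating average peak-hour load for treatment and control customers and taking the difference as the savings estimate. The column names, the 4-7 pm peak window and the CSV file name are hypothetical, and a real EM&V analysis would add weather normalization and a proper experimental design.

    import pandas as pd

    def peak_hour_savings(df, peak_hours=range(16, 19)):
        """df columns: 'customer_id', 'timestamp', 'kwh', 'group' ('treatment'/'control').
        Returns the estimated average peak-hour savings in kWh per interval."""
        df = df.copy()
        df["timestamp"] = pd.to_datetime(df["timestamp"])
        peak = df[df["timestamp"].dt.hour.isin(peak_hours)]
        means = peak.groupby("group")["kwh"].mean()
        return means["control"] - means["treatment"]

    # Hypothetical usage:
    # savings = peak_hour_savings(pd.read_csv("ami_intervals.csv"))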
Microgravity Acceleration Measurement System (MAMS) Flight Configuration Verification and Status
NASA Technical Reports Server (NTRS)
Wagar, William
2000-01-01
The Microgravity Acceleration Measurement System (MAMS) is a precision spaceflight instrument designed to measure and characterize the microgravity environment existing in the US Lab Module of the International Space Station. Both vibratory and quasi-steady triaxial acceleration data are acquired and provided to an Ethernet data link. The MAMS Double Mid-Deck Locker (DMDL) EXPRESS Rack payload meets all the ISS IDD and ICD interface requirements as discussed in the paper which also presents flight configuration illustrations. The overall MAMS sensor and data acquisition performance and verification data are presented in addition to a discussion of the Command and Data Handling features implemented via the ISS, downlink and the GRC Telescience Center displays.
Property-driven functional verification technique for high-speed vision system-on-chip processor
NASA Astrophysics Data System (ADS)
Nshunguyimfura, Victor; Yang, Jie; Liu, Liyuan; Wu, Nanjian
2017-04-01
The implementation of functional verification in a fast, reliable, and effective manner is a challenging task in a vision chip verification process. The main reason for this challenge is the stepwise nature of existing functional verification techniques. The verification complexity is also related to the fact that in most vision chip design cycles, extensive efforts are focused on optimizing chip metrics such as performance, power, and area, while design functional verification is not explicitly considered at the earlier stages, at which the most sound decisions are made. In this paper, we propose a semi-automatic property-driven verification technique. The implementation of all verification components is based on design properties. We introduce a low-dimension property space between the specification space and the implementation space. The aim of this technique is to speed up the verification process for high-performance parallel processing vision chips. Our experimental results show that the proposed technique can improve verification efficiency by up to 20% for a complex vision chip design while reducing the simulation and debugging overheads.
NASA Astrophysics Data System (ADS)
Zhafirah Muhammad, Nurul; Harun, A.; Hambali, N. A. M. A.; Murad, S. A. Z.; Mohyar, S. N.; Isa, M. N.; Jambek, AB
2017-11-01
Increased demand for internet of things (IoT) applications has forced a move towards higher-complexity integrated circuits supporting SoC designs. Such an increase in complexity calls for correspondingly sophisticated validation strategies, and researchers have proposed various methodologies to address this problem. This has in essence brought about dynamic verification, formal verification and hybrid techniques. Moreover, it is very important to discover bugs early in the SoC verification process in order to reduce time consumption and achieve a fast time to market for the system. In this paper we therefore focus on verification methodology that can be applied at the Register Transfer Level of an SoC based on the AMBA bus design. In addition, the Open Verification Methodology (OVM) offers an easier approach to RTL validation, not as a replacement for the traditional method but as an effort towards a fast time to market for the system. Thus, OVM is proposed in this paper as the verification method for larger designs, averting bottlenecks in the validation platform.
Proof of a new colour decomposition for QCD amplitudes
Melia, Tom
2015-12-16
Recently, Johansson and Ochirov conjectured the form of a new colour decomposition for QCD tree-level amplitudes. This note provides a proof of that conjecture. The proof is based on ‘Mario World’ Feynman diagrams, which exhibit the hierarchical Dyck structure previously found to be very useful when dealing with multi-quark amplitudes.
Derivative, Maxima and Minima in a Graphical Context
ERIC Educational Resources Information Center
Rivera-Figueroa, Antonio; Ponce-Campuzano, Juan Carlos
2013-01-01
A deeper learning of the properties and applications of the derivative for the study of functions may be achieved when teachers present lessons within a highly graphic context, linking the geometric illustrations to formal proofs. Each concept is better understood and more easily retained when it is presented and explained visually using graphs.…
Cognitive Models: The Missing Link to Learning Fraction Multiplication and Division
ERIC Educational Resources Information Center
de Castro, Belinda V.
2008-01-01
This quasi-experimental study aims to streamline cognitive models on fraction multiplication and division that contain the most worthwhile features of other existing models. Its exploratory nature and its approach to proof elicitation can be used to help establish its effectiveness in building students' understanding of fractions as compared to…
Using Concept Space to Verify Hyponymy in Building a Hyponymy Lexicon
NASA Astrophysics Data System (ADS)
Liu, Lei; Zhang, Sen; Diao, Lu Hong; Yan, Shu Ying; Cao, Cun Gen
Verification of hyponymy relations is a basic problem in knowledge acquisition. We present a method of hyponymy verification based on concept space. First, we define the concept space for a group of candidate hyponymy relations. Second, we analyze the concept space and define a set of hyponymy features based on the space structure. Then we use these features to verify candidate hyponymy relations. Experimental results show that the method can provide adequate verification of hyponymy.
FORMED: Bringing Formal Methods to the Engineering Desktop
2016-02-01
FORMED integrates formal verification into software design and development by precisely defining semantics for a restricted subset of the Unified Modeling Language (UML), supporting checks such as input-output contract satisfaction and absence of null pointer dereferences. Domain specific languages (DSLs) drive both implementation and formal verification. Subject terms: Formal Methods, Software Verification, Model-Based...
This report is a product of the U.S. EPA's Environmental Technology Verification (ETV) Program and is focused on the Smart Sonics Ultrasonic Aqueous Cleaning Systems. The verification is based on three main objectives. (1) The Smart Sonic Aqueous Cleaning Systems, Model 2000 and...
NASA Astrophysics Data System (ADS)
Stefaneas, Petros; Vandoulakis, Ioannis M.
2015-12-01
This paper outlines a logical representation of certain aspects of the process of mathematical proving that are important from the point of view of Artificial Intelligence. Our starting point is the concept of proof-event or proving, introduced by Goguen, instead of the traditional concept of mathematical proof. The reason behind this choice is that, in contrast to the traditional static concept of mathematical proof, proof-events are understood as processes, which enables their use in Artificial Intelligence in contexts in which problem-solving procedures and strategies are studied. We represent proof-events as problem-centered spatio-temporal processes by means of the language of the calculus of events, which captures adequately certain temporal aspects of proof-events (i.e. that they have history and form sequences of proof-events evolving in time). Further, we suggest a "loose" semantics for the proof-events by means of Kolmogorov's calculus of problems. Finally, we present the intended interpretations of our logical model from the fields of automated theorem-proving and Web-based collective proving.
Deriving Safety Cases from Automatically Constructed Proofs
NASA Technical Reports Server (NTRS)
Basir, Nurlida; Denney, Ewen; Fischer, Bernd
2009-01-01
Formal proofs provide detailed justification for the validity of claims and are widely used in formal software development methods. However, they are often complex and difficult to understand, because the formalism in which they are constructed and encoded is usually machine-oriented, and they may also be based on assumptions that are not justified. This causes concerns about the trustworthiness of using formal proofs as arguments in safety-critical applications. Here, we present an approach to develop safety cases that correspond to formal proofs found by automated theorem provers and reveal the underlying argumentation structure and top-level assumptions. We concentrate on natural deduction style proofs, which are closer to human reasoning than resolution proofs, and show how to construct the safety cases by covering the natural deduction proof tree with corresponding safety case fragments. We also abstract away logical book-keeping steps, which reduces the size of the constructed safety cases. We show how the approach can be applied to the proofs found by the Muscadet prover.
Marsolo, Keith; Margolis, Peter A; Forrest, Christopher B; Colletti, Richard B; Hutton, John J
2015-01-01
We collaborated with the ImproveCareNow Network to create a proof-of-concept architecture for a network-based Learning Health System. This collaboration involved transitioning an existing registry to one that is linked to the electronic health record (EHR), enabling a "data in once" strategy. We sought to automate a series of reports that support care improvement while also demonstrating the use of observational registry data for comparative effectiveness research. We worked with three leading EHR vendors to create EHR-based data collection forms. We automated many of ImproveCareNow's analytic reports and developed an application for storing protected health information and tracking patient consent. Finally, we deployed a cohort identification tool to support feasibility studies and hypothesis generation. There is ongoing uptake of the system. To date, 31 centers have adopted the EHR-based forms and 21 centers are uploading data to the registry. Usage of the automated reports remains high and investigators have used the cohort identification tools to respond to several clinical trial requests. The current process for creating EHR-based data collection forms requires groups to work individually with each vendor. A vendor-agnostic model would allow for more rapid uptake. We believe that interfacing network-based registries with the EHR would allow them to serve as a source of decision support. Additional standards are needed in order for this vision to be achieved, however. We have successfully implemented a proof-of-concept Learning Health System while providing a foundation on which others can build. We have also highlighted opportunities where sponsors could help accelerate progress.
Artificial Intelligence and Information Management
NASA Astrophysics Data System (ADS)
Fukumura, Teruo
After reviewing the recent popularization of information transmission and processing technologies, which are supported by the progress of electronics, the authors describe how, with the introduction of opto-electronics into information technology, the possibility of applying artificial intelligence (AI) techniques to the mechanization of information management has emerged. It is pointed out that although AI deals with problems in the mental world, its basic methodology relies upon verification by evidence, so experiments on computers become indispensable for the study of AI. The authors also describe that, as computers operate by program, the basic intelligence with which AI is concerned is that expressed by languages. Consequently, the main tool of AI is logical proof, which involves an intrinsic limitation. To answer the question "Why do you employ AI in your problem solving", one must have ill-structured problems and intend to conduct deep studies on thinking and inference, and on memory and knowledge representation. Finally, the authors discuss the application of AI techniques to information management. The possibility of expert systems, the processing of queries, and the necessity of a document knowledge base are discussed.
NASA Astrophysics Data System (ADS)
Lee, Seung Yup; Pakela, Julia M.; Helton, Michael C.; Vishwanath, Karthik; Chung, Yooree G.; Kolodziejski, Noah J.; Stapels, Christopher J.; McAdams, Daniel R.; Fernandez, Daniel E.; Christian, James F.; O'Reilly, Jameson; Farkas, Dana; Ward, Brent B.; Feinberg, Stephen E.; Mycek, Mary-Ann
2017-12-01
In reconstructive surgery, the ability to detect blood flow interruptions to grafted tissue represents a critical step in preventing postsurgical complications. We have developed and pilot tested a compact, fiber-based device that combines two complementary modalities, diffuse correlation spectroscopy (DCS) and diffuse reflectance spectroscopy, to quantitatively monitor blood perfusion. We present a proof-of-concept study on an in vivo porcine model (n=8). With a controllable arterial blood flow supply, occlusion studies (n=4) were performed on surgically isolated free flaps while the device simultaneously monitored blood flow through the supplying artery as well as flap perfusion from three orientations: the distal side of the flap and two transdermal channels. Further studies featuring long-term monitoring, arterial failure simulations, and venous failure simulations were performed on flaps that had undergone an anastomosis procedure (n=4). Additionally, benchtop verification of the DCS system was performed on liquid flow phantoms. Data revealed relationships between diffuse optical measures and state of occlusion as well as the ability to detect arterial and venous compromise. The compact construction of the device, along with its noninvasive and quantitative nature, would make this technology suitable for clinical translation.
Neronov, Andrii
2017-11-10
Cosmic rays could be produced via shock acceleration powered by supernovae. The supernova hypothesis implies that each supernova injects, on average, some 10^{50} erg in cosmic rays, while the shock acceleration model predicts a power law cosmic ray spectrum with the slope close to 2. Verification of these predictions requires measurement of the spectrum and power of cosmic ray injection from supernova population(s). Here, we obtain such measurements based on γ-ray observation of the Constellation III region of the Large Magellanic Cloud. We show that γ-ray emission from this young star formation region originates from cosmic rays injected by approximately two thousand supernovae, rather than by a massive star wind powered by a superbubble predating supernova activity. Cosmic ray injection power is found to be (1.1_{-0.2}^{+0.5})×10^{50} erg/supernova (for the estimated interstellar medium density 0.3 cm^{-3}). The spectrum is a power law with slope 2.09_{-0.07}^{+0.06}. This agrees with the model of particle acceleration at supernova shocks and provides a direct proof of the supernova origin of cosmic rays.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bojechko, Casey; Phillps, Mark; Kalet, Alan
Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of incidents that are detectable divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, and it is complemented by rules-based and Bayesian network plan checking.
Review of the nutritional benefits and risks related to intense sweeteners.
Olivier, Bruyère; Serge, Ahmed H; Catherine, Atlan; Jacques, Belegaud; Murielle, Bortolotti; Marie-Chantal, Canivenc-Lavier; Sybil, Charrière; Jean-Philippe, Girardet; Sabine, Houdart; Esther, Kalonji; Perrine, Nadaud; Fabienne, Rajas; Gérard, Slama; Irène, Margaritis
2015-01-01
The intense sweeteners currently authorised in Europe comprise ten compounds of various chemical natures. Their overall use has sharply risen in the last 20 years. These compounds are mainly used to formulate reduced-calorie products while maintaining sweetness. This extensive analysis of the literature reviews the data currently available on the potential nutritional benefits and risks related to the consumption of products containing intense sweeteners. Regarding nutritional benefits, the available studies, while numerous, do not provide proof that the consumption of artificial sweeteners as sugar substitutes is beneficial in terms of weight management, blood glucose regulation in diabetic subjects or the incidence of type 2 diabetes. Regarding nutritional risks (incidence of type 2 diabetes, habituation to sweetness in adults, cancers, etc.), it is not possible based on the available data to establish a link between the occurrence of these risks and the consumption of artificial sweeteners. However, some studies underline the need to improve knowledge of the links between intense sweeteners consumption and certain risks.
Multiplexing of spatial modes in the mid-IR region
NASA Astrophysics Data System (ADS)
Gailele, Lucas; Maweza, Loyiso; Dudley, Angela; Ndagano, Bienvenu; Rosales-Guzman, Carmelo; Forbes, Andrew
2017-02-01
Traditional optical communication systems optimize multiplexing in polarization and wavelength, both transmitted in fiber and in free space, to attain high-bandwidth data communication. Yet despite these technologies, we are expected to reach a bandwidth ceiling in the near future. Communication using orbital angular momentum (OAM) carrying modes offers infinite dimensional states, providing a means to increase link capacity by multiplexing spatially overlapping modes in both the azimuthal and radial degrees of freedom. OAM modes are multiplexed and de-multiplexed by the use of spatial light modulators (SLM). Complex amplitude modulation of the laser beam's phase and amplitude is employed to generate Laguerre-Gaussian (LG) modes. Modal decomposition is employed to detect these modes, exploiting their orthogonality as they propagate in space. We demonstrate data transfer by sending images as a proof-of-concept in a lab-based scheme. We demonstrate the creation and detection of OAM modes in the mid-IR region as a precursor to a mid-IR free-space communication link.
24-26 GHz radio-over-fiber and free-space optics for fifth-generation systems.
Bohata, Jan; Komanec, Matěj; Spáčil, Jan; Ghassemlooy, Zabih; Zvánovec, Stanislav; Slavík, Radan
2018-03-01
This Letter outlines radio-over-fiber combined with radio-over-free-space optics (RoFSO) and radio frequency free-space transmission, which is of particular relevance for fifth-generation networks. Here, the frequency band of 24-26 GHz is adopted to demonstrate a low-cost, compact, and high-energy-efficient solution based on the direct intensity modulation and direct detection scheme. For our proof-of-concept demonstration, we use 64 quadrature amplitude modulation with a 100 MHz bandwidth. We assess the link performance by exposing the RoFSO section to atmospheric turbulence conditions. Further, we show that the measured minimum error vector magnitude (EVM) is 4.7% and also verify that the proposed system with the free-space-optics link span of 100 m under strong turbulence can deliver an acceptable EVM of <9% with signal-to-noise ratio levels of 22 dB and 10 dB with and without turbulence, respectively.
Foo Kune, Denis [Saint Paul, MN]; Mahadevan, Karthikeyan [Mountain View, CA]
2011-01-25
A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
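A schematic sketch (Python) of the chained verification described above: a trusted verifier attests one of its neighbours, the verified neighbour becomes the next checker, and when a node fails the check the chain halts on that branch and continues along paths that exclude the failed node, so each node is checked at most once and always by a one-hop neighbour. The topology, the attest function and all names are hypothetical.

    def verify_network(graph, verifier, attest):
        """graph: {node: set_of_neighbours}, verifier: trusted start node,
        attest(checker, subject) -> bool: one-hop software-signature check.
        Returns (verified_nodes, failed_nodes); unreachable nodes stay unverified."""
        verified, failed = {verifier}, set()

        def extend(checker):
            for subject in sorted(graph[checker]):
                if subject in verified or subject in failed:
                    continue                              # check every node at most once
                if attest(checker, subject):              # subject is one hop from its checker
                    verified.add(subject)
                    extend(subject)                       # subject becomes the next checker
                else:
                    failed.add(subject)                   # halt this branch; other paths continue

        extend(verifier)
        return verified, failed

    # Hypothetical 5-node sensor network; node 'd' runs tampered software.
    topology = {"v": {"a", "b"}, "a": {"v", "c"}, "b": {"v", "d"}, "c": {"a", "d"}, "d": {"b", "c"}}
    print(verify_network(topology, "v", lambda checker, subject: subject != "d"))
    # ({'a', 'b', 'c', 'v'}, {'d'})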
The RUSTIC program links three subordinate models--PRZM, VADOFT, and SAFTMOD--in order to predict pesticide transport and transformation through the crop root zone, the unsaturated zone, and the saturated zone to drinking water wells. PRZM is a one-dimensional finite-difference m...
Self-prioritization processes in action and perception.
Frings, Christian; Wentura, Dirk
2014-10-01
Recently, Sui, He, and Humphreys (2012) introduced a new paradigm to investigate prioritized processing of self-related information. In a balanced design, they arbitrarily assigned simple geometric shapes to the participant and 2 others. Subsequently, the task was to judge whether label-shape pairings matched. The authors found a remarkable self-prioritization effect, that is, for matching self-related trials verification was very fast and accurate in comparison to the other conditions. We tested the hypothesis that the self-prioritization effect extends from perception-self links to action-self links. In particular, we assigned simple movements (i.e., up, down, left, right) to the participant, 2 others (i.e., the mother; a stranger), and a neutral label, respectively. In each trial participants executed a movement (triggered by a cue), which was followed by a briefly presented label. Participants had to judge whether label-movement pairings matched. In accordance with Sui et al. (2012) we found a remarkable self-prioritization effect, that is, for matching self-related trials verification was very fast and accurate in comparison to the other conditions.
Signature Verification Based on Handwritten Text Recognition
NASA Astrophysics Data System (ADS)
Viriri, Serestina; Tapamo, Jules-R.
Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm which verifies signatures, even when they are composed of special unconstrained cursive characters which are superimposed and embellished. This algorithm extends the character-based signature verification technique. The experiments carried out on the GPDS signature database and an additional database created from signatures captured using the ePadInk tablet show that the approach is effective and efficient, with a positive verification rate of 94.95%.
A framework of multitemplate ensemble for fingerprint verification
NASA Astrophysics Data System (ADS)
Yin, Yilong; Ning, Yanbin; Ren, Chunxiao; Liu, Li
2012-12-01
How to improve the performance of an automatic fingerprint verification system (AFVS) is always a big challenge in the biometric verification field. Recently, it has become popular to improve the performance of AFVS using ensemble learning approaches to fuse related information of fingerprints. In this article, we propose a novel framework of fingerprint verification based on the multitemplate ensemble method. The framework consists of three stages. In the first stage, the enrollment stage, we adopt an effective template selection method to select those fingerprints which best represent a finger; a polyhedron is then created from the matching results of the multiple template fingerprints, and a virtual centroid of the polyhedron is computed. In the second stage, the verification stage, we measure the distance between the centroid of the polyhedron and a query image. In the final stage, a fusion rule is used to choose a proper distance from a distance set. The experimental results on the FVC2004 database demonstrate the improved effectiveness of the new framework in fingerprint verification. With a minutiae-based matching method, the average EER of four databases in FVC2004 drops from 10.85 to 0.88, and with a ridge-based matching method, the average EER of these four databases also decreases from 14.58 to 2.51.
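A much-simplified sketch (Python) of the three stages, under the assumption of a generic match_score(a, b) similarity function standing in for the minutiae- or ridge-based matcher; the "centroid" here is reduced to a vector of mean template-to-template scores rather than the article's polyhedron construction, and the threshold value is hypothetical.

    def enroll(templates, match_score):
        """Stage 1: characterize the enrolled finger by the mean score of each
        selected template against the other templates (a crude centroid stand-in)."""
        centroid = []
        for i, t in enumerate(templates):
            others = [match_score(t, u) for j, u in enumerate(templates) if j != i]
            centroid.append(sum(others) / len(others))
        return centroid

    def verify(query, templates, centroid, match_score, threshold, fuse=min):
        """Stages 2-3: score the query against every template, measure how far each
        score is from the corresponding centroid component, and fuse the distances."""
        distances = [abs(match_score(query, t) - c) for t, c in zip(templates, centroid)]
        return fuse(distances) <= threshold

    # Hypothetical usage with some matcher m(a, b) -> similarity in [0, 1]:
    # centroid = enroll(enrolled_templates, m)
    # accepted = verify(query_image, enrolled_templates, centroid, m, threshold=0.15)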
Use of metaknowledge in the verification of knowledge-based systems
NASA Technical Reports Server (NTRS)
Morell, Larry J.
1989-01-01
Knowledge-based systems are modeled as deductive systems. The model indicates that the two primary areas of concern in verification are demonstrating consistency and completeness. A system is inconsistent if it asserts something that is not true of the modeled domain. A system is incomplete if it lacks deductive capability. Two forms of consistency are discussed along with appropriate verification methods. Three forms of incompleteness are discussed. The use of metaknowledge, knowledge about knowledge, is explored in connection to each form of incompleteness.
Simple Proof of Jury Test for Complex Polynomials
NASA Astrophysics Data System (ADS)
Choo, Younseok; Kim, Dongmin
Recently some attempts have been made in the literature to give simple proofs of the Jury test for real polynomials. This letter presents a similar result for complex polynomials. A simple proof of the Jury test for complex polynomials is provided, based on Rouché's theorem and a single-parameter characterization of the Schur stability property for complex polynomials.
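For reference, the classical statement of Rouché's theorem on which such proofs rest (standard form, not quoted from the letter): if f and g are analytic inside and on a simple closed contour C and

    |f(z) - g(z)| < |g(z)| \quad \text{for all } z \in C,

then f and g have the same number of zeros, counted with multiplicity, inside C. Taking C to be the unit circle lets one compare the zero count of a complex polynomial with that of a simpler comparison polynomial, which is the mechanism behind Schur-stability (all zeros inside the unit disk) criteria of Jury type.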
Adaptive Power Control for Space Communications
NASA Technical Reports Server (NTRS)
Thompson, Willie L., II; Israel, David J.
2008-01-01
This paper investigates the implementation of power control techniques for crosslink communications during a rendezvous scenario of the Crew Exploration Vehicle (CEV) and the Lunar Surface Access Module (LSAM). During the rendezvous, NASA requires that the CEV support two communication links simultaneously: space-to-ground and crosslink. The crosslink will generate excess interference to the space-to-ground link as the distance between the two vehicles decreases, if the output power is fixed and optimized for the worst-case link analysis at the maximum distance range. As a result, power control is required to maintain the optimal power level for the crosslink without interfering with the space-to-ground link. A proof-of-concept will be described and implemented with the Goddard Space Flight Center (GSFC) Communications, Standard, and Technology Lab (CSTL).
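A rough sketch (Python) of the range-based adjustment involved: the crosslink transmit power is scaled with free-space path loss so that the received level stays at the minimum needed plus a margin, which bounds the interference seen by the space-to-ground link as the vehicles close. All link parameters are illustrative assumptions, not CEV/LSAM values.

    import math

    def fspl_db(distance_km, freq_ghz):
        """Free-space path loss in dB (distance in km, frequency in GHz)."""
        return 20 * math.log10(distance_km) + 20 * math.log10(freq_ghz) + 92.45

    def crosslink_tx_power_dbm(distance_km, freq_ghz, rx_sensitivity_dbm,
                               margin_db=3.0, ant_gains_db=6.0, p_max_dbm=40.0):
        """Transmit power needed to just close the crosslink at the current range,
        clamped to the hardware maximum (all parameter values are illustrative)."""
        p_tx = rx_sensitivity_dbm + margin_db + fspl_db(distance_km, freq_ghz) - ant_gains_db
        return min(p_tx, p_max_dbm)

    # As the vehicles close from 100 km to 1 km the required power drops by 40 dB,
    # which is the headroom an adaptive scheme can give back to the space-to-ground link.
    for d in (100.0, 10.0, 1.0):
        print(d, round(crosslink_tx_power_dbm(d, 2.2, rx_sensitivity_dbm=-110.0), 1))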
EPA Facility Registry Service (FRS): CAMDBS
This web feature service contains location and facility identification information from EPA's Facility Registry Service (FRS) for the subset of facilities that link to the Clean Air Markets Division Business System (CAMDBS). Administered by the EPA Clean Air Markets Division, within the Office of Air and Radiation, CAMDBS supports the implementation of market-based air pollution control programs, including the Acid Rain Program and regional programs designed to reduce the transport of ozone. FRS identifies and geospatially locates facilities, sites or places subject to environmental regulations or of environmental interest. Using vigorous verification and data management procedures, FRS integrates facility data from EPA's national program systems, other federal agencies, and State and tribal master facility records and provides EPA with a centrally managed, single source of comprehensive and authoritative information on facilities. This data set contains the subset of FRS integrated facilities that link to CAMDBS facilities once the CAMDBS data has been integrated into the FRS database. Additional information on FRS is available at the EPA website https://www.epa.gov/enviro/facility-registry-service-frs.
Antony, Matthieu; Bertone, Maria Paola; Barthes, Olivier
2017-03-14
Results-based financing (RBF) has been introduced in many countries across Africa and a growing literature is building around the assessment of its impact. These studies are usually quantitative and often silent on the paths and processes through which results are achieved and on the wider health system effects of RBF. To address this gap, our study aims at exploring the implementation of an RBF pilot in Benin, focusing on the verification of results. The study is based on action research carried out by authors involved in the pilot as part of the agency supporting the RBF implementation in Benin. While our participant observation and operational collaboration with the project's stakeholders informed the study, the analysis is mostly based on quantitative and qualitative secondary data, collected throughout the project's implementation and documentation processes. Data include project documents, reports and budgets, RBF data on service outputs and on the outcome of the verification, daily activity timesheets of the technical assistants in the districts, as well as focus groups with Community-based Organizations and informal interviews with technical assistants and district medical officers. Our analysis focuses on the actual practices of quantitative, qualitative and community verification. Results show that the verification processes are complex, costly and time-consuming, and in practice they end up differing from what was designed originally. We explore the consequences of this for the operation of the scheme and for its potential to generate the envisaged change. We find, for example, that the time taken up by verification procedures limits the time available for data analysis and feedback to facility staff, thus limiting the potential to improve service delivery. Verification challenges also result in delays in bonus payment, which delink effort and reward. Additionally, the limited integration of the verification activities of district teams with their routine tasks causes a further verticalization of the health system. Our results highlight the potential disconnect between the theory of change behind RBF and the actual scheme's implementation. The implications are relevant at the methodological level, stressing the importance of analyzing implementation processes to fully understand results, as well as at the operational level, pointing to the need to carefully adapt the design of RBF schemes (including verification and other key functions) to the context and to allow room to iteratively modify it during implementation. They also question whether the rationale for thorough and costly verification is justified, or whether adaptations are possible.
Liang, Yun; Kim, Gwe-Ya; Pawlicki, Todd; Mundt, Arno J; Mell, Loren K
2013-03-04
The purpose of this study was to develop dosimetry verification procedures for volumetric-modulated arc therapy (VMAT)-based total marrow irradiation (TMI). VMAT-based TMI plans were generated for three patients: one child and two adults. The planning target volume (PTV) was defined as the bony skeleton, from head to mid-femur, with a 3 mm margin. A planning strategy similar to published studies was adopted. The PTV was divided into head and neck, chest, and pelvic regions, with separate plans, each of which is composed of 2-3 arcs/fields. Multiple isocenters were evenly distributed along the patient's axial direction. The focus of this study is to establish a dosimetry quality assurance procedure involving both two-dimensional (2D) and three-dimensional (3D) volumetric verification, which is desirable for a large PTV treated with multiple isocenters. The 2D dose verification was performed with film for gamma evaluation, and absolute point dose was measured with an ion chamber, with attention to hot/cold spots at the junctions between neighboring plans. The 3D volumetric dose verification used commercial dose reconstruction software to reconstruct dose from electronic portal imaging device (EPID) images. The gamma evaluation criteria in both 2D and 3D verification were 5% absolute point dose difference and 3 mm distance to agreement. With film dosimetry, the overall average gamma passing rate was 98.2% and the absolute dose difference was 3.9% in junction areas among the test patients; with volumetric portal dosimetry, the corresponding numbers were 90.7% and 2.4%. A dosimetry verification procedure involving both 2D and 3D was developed for VMAT-based TMI. The initial results are encouraging and warrant further investigation in clinical trials.
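A brute-force sketch (Python/NumPy) of the global 2D gamma evaluation used for the film comparison, with the 5% / 3 mm criteria quoted above; the grid spacing, the low-dose cut-off and the global normalization to the reference maximum are illustrative choices rather than the clinic's exact protocol.

    import numpy as np

    def gamma_pass_rate(ref, meas, spacing_mm, dose_crit=0.05, dta_mm=3.0, low_cut=0.1):
        """Global 2D gamma: ref and meas are dose arrays on the same grid
        (spacing_mm per pixel); points below low_cut * max(ref) are excluded."""
        dose_tol = dose_crit * ref.max()
        ny, nx = ref.shape
        search = int(np.ceil(dta_mm / spacing_mm)) + 1
        passed = total = 0
        for iy in range(ny):
            for ix in range(nx):
                if ref[iy, ix] < low_cut * ref.max():
                    continue
                total += 1
                best = np.inf
                for dy in range(-search, search + 1):
                    for dx in range(-search, search + 1):
                        jy, jx = iy + dy, ix + dx
                        if 0 <= jy < ny and 0 <= jx < nx:
                            dist2 = (dy * spacing_mm) ** 2 + (dx * spacing_mm) ** 2
                            dd2 = (meas[jy, jx] - ref[iy, ix]) ** 2
                            best = min(best, dist2 / dta_mm ** 2 + dd2 / dose_tol ** 2)
                passed += best <= 1.0
        return passed / total if total else float("nan")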
Directional antenna array (DAA) for communications, control, and data link protection
NASA Astrophysics Data System (ADS)
Molchanov, Pavlo A.; Contarino, Vincent M.
2013-06-01
A next generation of smart antennas with point-to-point communication and jam/spoof protection capability, based on verification of spatial position, is offered. A directional antenna array (DAA) with a narrow irradiation beam provides counter-terrorism protection for communications, data link, control and GPS. Communications are "invisible" to guided missiles because of 20 dB smaller irradiation outside the beam and spatial separation. This solution can be implemented with current technology. Directional antennas have higher gain and can be multi-frequency or have a wide frequency band, in contrast to phased antenna arrays. This multi-directional antenna array provides a multi-functional communication network and can simultaneously be used for command and control, data link and GPS.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marleau, Peter; Brubaker, Erik; Deland, Sharon M.
This report summarizes the discussion and conclusions reached during a table top exercise held at Sandia National Laboratories, Albuquerque on September 3, 2014 regarding a recently described approach for nuclear warhead verification based on the cryptographic concept of a zero-knowledge protocol (ZKP) presented in a recent paper authored by Glaser, Barak, and Goldston. A panel of Sandia National Laboratories researchers, whose expertise includes radiation instrumentation design and development, cryptography, and arms control verification implementation, jointly reviewed the paper and identified specific challenges to implementing the approach as well as some opportunities. It was noted that ZKP as used in cryptography is a useful model for the arms control verification problem, but the direct analogy to arms control breaks down quickly. The ZKP methodology for warhead verification fits within the general class of template-based verification techniques, where a reference measurement is used to confirm that a given object is like another object that has already been accepted as a warhead by some other means. This can be a powerful verification approach, but it requires independent means to trust the authenticity of the reference warhead, a standard that may be difficult to achieve, which the ZKP authors do not directly address. Despite some technical challenges, the concept of last-minute selection of the pre-loads and equipment could be a valuable component of a verification regime.
Survey of Verification and Validation Techniques for Small Satellite Software Development
NASA Technical Reports Server (NTRS)
Jacklin, Stephen A.
2015-01-01
The purpose of this paper is to provide an overview of the current trends and practices in small-satellite software verification and validation. This document is not intended to promote a specific software assurance method. Rather, it seeks to present an unbiased survey of software assurance methods used to verify and validate small satellite software and to make mention of the benefits and value of each approach. These methods include simulation and testing, verification and validation with model-based design, formal methods, and fault-tolerant software design with run-time monitoring. Although the literature reveals that simulation and testing has by far the longest legacy, model-based design methods are proving to be useful for software verification and validation. Some work in formal methods, though not widely used for any satellites, may offer new ways to improve small satellite software verification and validation. These methods need to be further advanced to deal with the state explosion problem and to make them more usable by small-satellite software engineers to be regularly applied to software verification. Last, it is explained how run-time monitoring, combined with fault-tolerant software design methods, provides an important means to detect and correct software errors that escape the verification process or those errors that are produced after launch through the effects of ionizing radiation.
Living Grace and Courtesy in the Primary
ERIC Educational Resources Information Center
Soholt, Polli
2015-01-01
Polli Soholt looks at grace and courtesy from the 3-6 classroom perspective with clear theory explanations as they pertain to the larger classroom culture. She discusses the link between older and young children and the presence of the teacher as a model for grace and takes a brief look to neural science for proof of the existence of social…
Using business intelligence to improve performance.
Wadsworth, Tom; Graves, Brian; Glass, Steve; Harrison, A Marc; Donovan, Chris; Proctor, Andrew
2009-10-01
Cleveland Clinic's enterprise performance management program offers proof that comparisons of actual performance against strategic objectives can enable healthcare organization to achieve rapid organizational change. Here are four lessons Cleveland Clinic learned from this initiative: Align performance metrics with strategic initiatives. Structure dashboards for the CEO. Link performance to annual reviews. Customize dashboard views to the specific user.
Cheng, Xu-Dong; Jia, Xiao-Bin; Feng, Liang; Jiang, Jun
2013-12-01
The secondary development of major traditional Chinese medicine varieties is one of important links during the modernization, scientification and standardization of traditional Chinese medicines. How to accurately and effectively identify the pharmacodynamic material basis of original formulae becomes the primary problem in the secondary development, as well as the bottleneck in the modernization development of traditional Chinese medicines. On the basis of the existing experimental methods, and according to the study thought that the multi-component and complex effects of traditional Chinese medicine components need to combine multi-disciplinary methods and technologies, we propose the study thought of the material basis of secondary development of major traditional Chinese medicine varieties based on the combination of in vivo and in vitro experiments. It is believed that studies on material basis needs three links, namely identification, screening and verification, and in vivo and in vitro study method corresponding to each link is mutually complemented and verified. Finally, the accurate and reliable material basis is selected. This thought provides reference for the secondary development of major traditional Chinese medicine varieties and studies on compound material basis.
Using Toulmin analysis to analyse an instructor's proof presentation in abstract algebra
NASA Astrophysics Data System (ADS)
Fukawa-connelly, Timothy
2014-01-01
This paper provides a method for analysing undergraduate teaching of proof-based courses using Toulmin's model (1969) of argumentation. It presents a case study of one instructor's presentation of proofs. The analysis shows that the instructor presents different levels of detail in different proofs; thus, the students have an inconsistent set of written models for their work. Similarly, the analysis shows that the details the instructor says aloud differ from what she writes down. Although her verbal commentary provides additional detail and appears to have pedagogical value, for instance, by modelling thinking that supports proof writing, this value might be better realized if she were to change her teaching practices.
The Vice: Some Historically Inspired and Proof-Generated Steps to Limits of Sequences
ERIC Educational Resources Information Center
Burn, Bob
2005-01-01
This paper proposes a genetic development of the concept of limit of a sequence leading to a definition, through a succession of proofs rather than through a succession of sequences or a succession of epsilons. The major ideas on which it is based are historical and depend on Euclid, Archimedes, Fermat, Wallis and Newton. Proofs of equality by…
The Failure to Construct Proof Based on Assimilation and Accommodation Framework from Piaget
ERIC Educational Resources Information Center
Netti, Syukma; Nusantara, Toto; Subanji; Abadyo; Anwar, Lathiful
2016-01-01
The purpose of this article is to describe the process of a proof construction. It is more specific on the failure of the process. Piaget's frameworks, assimilation and accommodation, were used to analyze it. Method of this research was qualitative method. Data were collected by asking five students working on problems of proof using think aloud…
Generalized Dandelin’s Theorem
NASA Astrophysics Data System (ADS)
Kheyfets, A. L.
2017-11-01
The paper gives a geometric proof of the theorem which states that a plane section of a second-order surface of rotation (quadric of rotation, QR) is a conic section: an ellipse, a hyperbola or a parabola. The theorem supplements the well-known Dandelin's theorem, which gives the geometric proof only for a circular cone, and extends the proof to all QR, namely an ellipsoid, a hyperboloid, a paraboloid and a cylinder. For this reason the considered theorem is known as the generalized Dandelin's theorem (GDT). The GDT proof is based on a relatively little-known generalized directrix definition (GDD) of conics. The work outlines the proof of the GDD for all types of conics as their necessary and sufficient condition. Based on the GDD, the author proves the GDT for all QR for an arbitrary position of the cutting plane. The graphical stereometric constructions necessary for the proof are given. The implementation of the constructions by 3D computer methods is considered. The article shows examples of the constructions made in the AutoCAD package. The theorem is intended for the theoretical training course of elite student groups of architectural and construction specialties.
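For reference, the directrix characterization underlying the GDD can be stated in its classical focus-directrix form (a standard statement, not quoted from the paper): a non-circular conic is the locus of points P whose distance to a fixed point F (the focus) and distance to a fixed line d (the directrix) are in a constant ratio,

    \frac{|PF|}{\mathrm{dist}(P, d)} = e,

with 0 < e < 1 giving an ellipse, e = 1 a parabola and e > 1 a hyperbola; the generalized directrix definition referred to above builds on this property.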
Proof-test-based life prediction of high-toughness pressure vessels
DOE Office of Scientific and Technical Information (OSTI.GOV)
Panontin, T.L.; Hill, M.R.
1996-02-01
The paper examines the problems associated with applying proof-test-based life prediction to vessels made of high-toughness metals. Two A106 Gr B pipe specimens containing long, through-wall circumferential flaws were tested. One failed during hydrostatic testing and the other during tension-tension cycling following a hydrostatic test. Quantitative fractography was used to verify experimentally obtained fatigue crack growth rates, and a variety of LEFM and EPFM techniques were used to analyze the experimental results. The results show that: plastic collapse analysis provides accurate predictions of screened (initial) crack size when the flow stress is determined experimentally; LEFM analysis underestimates the crack size screened by the proof test and overpredicts the subsequent fatigue life of the vessel when retardation effects are small (i.e., low proof levels); and, at a high proof-test level (2.4 times operating pressure), the large retardation effect on fatigue crack growth due to the overload overwhelmed the deleterious effect on fatigue life from stable tearing during the proof test and alleviated the problem of screening only long cracks due to the high toughness of the metal.
Simulation-Based Verification of Autonomous Controllers via Livingstone PathFinder
NASA Technical Reports Server (NTRS)
Lindsey, A. E.; Pecheur, Charles
2004-01-01
AI software is often used as a means for providing greater autonomy to automated systems, capable of coping with harsh and unpredictable environments. Due in part to the enormous space of possible situations that they aim to address, autonomous systems pose a serious challenge to traditional test-based verification approaches. Efficient verification approaches need to be perfected before these systems can reliably control critical applications. This publication describes Livingstone PathFinder (LPF), a verification tool for autonomous control software. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the controller embedded in a simulated operating environment. Although LPF has focused on NASA's Livingstone model-based diagnosis system applications, the architecture is modular and adaptable to other systems. This article presents different facets of LPF and experimental results from applying the software to a Livingstone model of the main propulsion feed subsystem for a prototype space vehicle.
A Framework for Evidence-Based Licensure of Adaptive Autonomous Systems
2016-03-01
...insights gleaned to DoD. The autonomy community has identified significant challenges associated with test, evaluation, verification and validation of... licensure as a test, evaluation, verification, and validation (TEVV) framework that can address these challenges. IDA found that traditional... language requirements to testable (preferably machine-testable) specifications; design of architectures that treat development and verification of...
Verification of an on line in vivo semiconductor dosimetry system for TBI with two TLD procedures.
Sánchez-Doblado, F; Terrón, J A; Sánchez-Nieto, B; Arráns, R; Errazquin, L; Biggs, D; Lee, C; Núñez, L; Delgado, A; Muñiz, J L
1995-01-01
This work presents the verification of an on line in vivo dosimetry system based on semiconductors. Software and hardware have been designed to convert the diode signal into absorbed dose. Final verification was made in the form of an intercomparison with two independent thermoluminescent (TLD) dosimetry systems, under TBI conditions.
NASA. Lewis Research Center Advanced Modulation and Coding Project: Introduction and overview
NASA Technical Reports Server (NTRS)
Budinger, James M.
1992-01-01
The Advanced Modulation and Coding Project at LeRC is sponsored by the Office of Space Science and Applications, Communications Division, Code EC, at NASA Headquarters and conducted by the Digital Systems Technology Branch of the Space Electronics Division. Advanced Modulation and Coding is one of three focused technology development projects within the branch's overall Processing and Switching Program. The program consists of industry contracts for developing proof-of-concept (POC) and demonstration model hardware, university grants for analyzing advanced techniques, and in-house integration and testing for performance verification and systems evaluation. The Advanced Modulation and Coding Project is broken into five elements: (1) bandwidth- and power-efficient modems; (2) high-speed codecs; (3) digital modems; (4) multichannel demodulators; and (5) very high-data-rate modems. At least one contract and one grant were awarded for each element.
Software Certification for Temporal Properties With Affordable Tool Qualification
NASA Technical Reports Server (NTRS)
Xia, Songtao; DiVito, Benedetto L.
2005-01-01
It has been recognized that a framework based on proof-carrying code (also called semantic-based software certification in its community) could be used as a candidate software certification process for the avionics industry. To meet this goal, tools in the "trust base" of a proof-carrying code system must be qualified by regulatory authorities. A family of semantic-based software certification approaches is described, each different in expressive power, level of automation and trust base. Of particular interest is the so-called abstraction-carrying code, which can certify temporal properties. When a pure abstraction-carrying code method is used in the context of industrial software certification, the fact that the trust base includes a model checker would incur a high qualification cost. This position paper proposes a hybrid of abstraction-based and proof-based certification methods so that the model checker used by a client can be significantly simplified, thereby leading to lower cost in tool qualification.
Eronen, Lauri; Toivonen, Hannu
2012-06-06
Biological databases contain large amounts of data concerning the functions and associations of genes and proteins. Integration of data from several such databases into a single repository can aid the discovery of previously unknown connections spanning multiple types of relationships and databases. Biomine is a system that integrates cross-references from several biological databases into a graph model with multiple types of edges, such as protein interactions, gene-disease associations and gene ontology annotations. Edges are weighted based on their type, reliability, and informativeness. We present Biomine and evaluate its performance in link prediction, where the goal is to predict pairs of nodes that will be connected in the future, based on current data. In particular, we formulate protein interaction prediction and disease gene prioritization tasks as instances of link prediction. The predictions are based on a proximity measure computed on the integrated graph. We consider and experiment with several such measures, and perform a parameter optimization procedure where different edge types are weighted to optimize link prediction accuracy. We also propose a novel method for disease-gene prioritization, defined as finding a subset of candidate genes that cluster together in the graph. We experimentally evaluate Biomine by predicting future annotations in the source databases and prioritizing lists of putative disease genes. The experimental results show that Biomine has strong potential for predicting links when a set of selected candidate links is available. The predictions obtained using the entire Biomine dataset are shown to clearly outperform ones obtained using any single source of data alone, when different types of links are suitably weighted. In the gene prioritization task, an established reference set of disease-associated genes is useful, but the results show that under favorable conditions, Biomine can also perform well when no such information is available. The Biomine system is a proof of concept. Its current version contains 1.1 million entities and 8.1 million relations between them, with focus on human genetics. Some of its functionalities are available in a public query interface at http://biomine.cs.helsinki.fi, allowing searching for and visualizing connections between given biological entities.
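The abstract above describes scoring candidate links by a proximity measure on an integrated, edge-weighted graph. The sketch below, which is not the Biomine implementation and uses hypothetical node names and weights, illustrates one simple such measure: the probability of the most reliable path between two entities.

```python
# Minimal sketch of one simple proximity measure for link prediction on an
# integrated, weighted biological graph: interpret each edge weight as a
# reliability in (0, 1] and score a candidate link by the probability of the
# single best path connecting its endpoints. Names and weights are hypothetical.
import math
import networkx as nx

G = nx.Graph()
edges = [
    ("GENE_A", "PROT_A", 0.95),    # gene-protein mapping
    ("PROT_A", "PROT_B", 0.60),    # protein-protein interaction
    ("PROT_B", "GENE_B", 0.95),
    ("GENE_B", "DISEASE_X", 0.40), # gene-disease association
]
for u, v, reliability in edges:
    # Convert reliability to an additive cost so that the cheapest path
    # corresponds to the most probable chain of edges.
    G.add_edge(u, v, reliability=reliability, cost=-math.log(reliability))

def best_path_proximity(graph, source, target):
    """Probability of the most reliable path between source and target."""
    cost = nx.shortest_path_length(graph, source, target, weight="cost")
    return math.exp(-cost)

# Score a candidate link, e.g. "is GENE_A associated with DISEASE_X?"
print(best_path_proximity(G, "GENE_A", "DISEASE_X"))
```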
NASA Astrophysics Data System (ADS)
Rhodes, Edgar A.; Peters, Charles W.
1993-02-01
A recently developed neutron diagnostic probe system has the potential to satisfy a significant number of van-mobile and fixed-portal requirements for nondestructive detection, including monitoring of contraband explosives, drugs, and weapon materials, and treaty verification of sealed munitions. The probe is based on a unique associated-particle sealed-tube neutron generator (APSTNG) that interrogates the object of interest with a low-intensity beam of 14- MeV neutrons generated from the deuterium-tritium reaction and that detects the alpha-particle associated with each neutron. Gamma-ray spectra of resulting neutron reactions identify nuclides associated with all major chemicals in explosives, drugs, and chemical warfare agents, as well as many pollutants and fissile and fertile special nuclear material. Flight times determined from detection times of the gamma-rays and alpha-particles yield a separate coarse tomographic image of each identified nuclide. The APSTNG also forms the basis for a compact fast-neutron transmission imaging system that can be used along with or instead of the emission imaging system. Proof-of-concept experiments have been performed under laboratory conditions for simulated nuclear and chemical warfare munitions and for explosives and drugs. The small and relatively inexpensive APSTNG exhibits high reliability and can be quickly replaced. Surveillance systems based on APSTNG technology can avoid the large physical size, high capital and operating expenses, and reliability problems associated with complex accelerators.
Initial Characterization of Optical Communications with Disruption-Tolerant Network Protocols
NASA Technical Reports Server (NTRS)
Schoolcraft, Joshua; Wilson, Keith
2011-01-01
Disruption-tolerant networks (DTNs) are groups of network assets connected with a suite of communication protocol technologies designed to mitigate the effects of link delay and disruption. Application of DTN protocols to diverse groups of network resources in multiple sub-networks results in an overlay network-of-networks with autonomous data routing capability. In space environments where delay or disruption is expected, performance of this type of architecture (such as an interplanetary internet) can increase with the inclusion of new communications media and techniques. Space-based optical communication links are therefore an excellent building block of space DTN architectures. When compared to traditional radio frequency (RF) communications, optical systems can provide extremely power-efficient and high bandwidth links bridging sub-networks. Because optical links are more susceptible to link disruption and experience the same light-speed delays as RF, optical-enabled DTN architectures can lessen potential drawbacks and maintain the benefits of autonomous optical communications over deep space distances. These environment-driven expectations (link delay and interruption, along with asymmetric data rates) motivate the proof-of-concept experiment outlined herein. In recognizing the potential of these two technologies, we report an initial experiment and characterization of the performance of a DTN-enabled space optical link. The experiment design employs a point-to-point free-space optical link configured to have asymmetric bandwidth. This link connects two networked systems running a DTN protocol implementation designed and written at JPL for use on spacecraft, and further configured for higher bandwidth performance. Comparing baseline data transmission metrics with and without periodic optical link interruptions, the experiment confirmed the DTN protocols' ability to handle real-world unexpected link outages while maintaining the capability to reliably deliver data at relatively high rates. Finally, performance characterizations from this data suggest performance optimizations to configuration and protocols for future optical-specific DTN space link scenarios.
Epidemiology: second-rate science?
Parascandola, M
1998-01-01
In recent years epidemiology has come under increasing criticism in regulatory and public arenas for being "unscientific." The tobacco industry has taken advantage of this, insisting for decades that evidence linking cigarettes and lung cancer falls short of proof. Moreover, many epidemiologists remain unduly skeptical and self-conscious about the status of their own causal claims. This situation persists in part because of a widespread belief that only the laboratory can provide evidence sufficient for scientific proof. Adherents of this view erroneously believe that there is no element of uncertainty or inductive inference in the "direct observation" of the laboratory researcher and that epidemiology provides mere "circumstantial" evidence. The historical roots of this attitude can be traced to philosopher John Stuart Mill and physiologist Claude Bernard and their influence on modern experimental thinking. The author uses the debate over cigarettes and lung cancer to examine ideas of proof in medical science and public health, concluding that inductive inference from a limited sample to a larger population is an element in all empirical science. PMID:9672568
Sabet, Mahsheed; O'Connor, Daryl J.; Greer, Peter B.
2011-01-01
There have been several manual, semi‐automatic and fully‐automatic methods proposed for verification of the position of mechanical isocenter as part of comprehensive quality assurance programs required for linear accelerator‐based stereotactic radiosurgery/radiotherapy (SRS/SRT) treatments. In this paper, a systematic review has been carried out to discuss the present methods for isocenter verification and compare their characteristics, to help physicists in making a decision on selection of their quality assurance routine. PACS numbers: 87.53.Ly, 87.56.Fc, 87.56.‐v PMID:22089022
Current status of 3D EPID-based in vivo dosimetry in The Netherlands Cancer Institute
NASA Astrophysics Data System (ADS)
Mijnheer, B.; Olaciregui-Ruiz, I.; Rozendaal, R.; Spreeuw, H.; van Herk, M.; Mans, A.
2015-01-01
3D in vivo dose verification using a-Si EPIDs is performed routinely in our institution for almost all RT treatments. The EPID-based 3D dose distribution is reconstructed using a back-projection algorithm and compared with the planned dose distribution using 3D gamma evaluation. Dose-reconstruction and gamma-evaluation software runs automatically, and deviations outside the alert criteria are immediately available and investigated, in combination with inspection of cone-beam CT scans. The implementation of our 3D EPID-based in vivo dosimetry approach was able to replace pre-treatment verification for more than 90% of the patient treatments. Clinically relevant deviations could be detected for approximately 1 out of 300 patient treatments (IMRT and VMAT). Most of these errors were patient-related anatomical changes or deviations from the routine clinical procedure, and would not have been detected by pre-treatment verification. Moreover, 3D EPID-based in vivo dose verification is a fast and accurate tool to assure the safe delivery of RT treatments. It provides clinically more useful information and is less time consuming than pre-treatment verification measurements. Automated 3D in vivo dosimetry is therefore a prerequisite for large-scale implementation of patient-specific quality assurance of RT treatments.
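The comparison step described above relies on a gamma evaluation of the reconstructed dose against the planned dose. The following is a generic, textbook-style 1D illustration of the gamma index, with hypothetical 3%/3 mm criteria and profiles; it is not the clinical software referred to in the abstract.

```python
# Minimal 1D illustration of the gamma index used to compare a measured
# (EPID-reconstructed) dose profile against a planned one. Generic textbook
# formulation; the 3%/3 mm criteria and the profiles are hypothetical.
import numpy as np

def gamma_1d(x_ref, d_ref, x_eval, d_eval, dd=0.03, dta=3.0):
    """Global gamma: dd is the dose criterion (fraction of the maximum
    reference dose), dta the distance-to-agreement criterion in mm."""
    d_norm = dd * d_ref.max()
    gammas = np.empty_like(d_ref)
    for i, (xr, dr) in enumerate(zip(x_ref, d_ref)):
        dist2 = ((x_eval - xr) / dta) ** 2
        dose2 = ((d_eval - dr) / d_norm) ** 2
        gammas[i] = np.sqrt(np.min(dist2 + dose2))
    return gammas

x = np.arange(0.0, 100.0, 1.0)                       # positions in mm
planned = np.exp(-((x - 50.0) / 20.0) ** 2)          # hypothetical profile
measured = 1.02 * np.exp(-((x - 51.0) / 20.0) ** 2)  # ~2% / 1 mm deviation
g = gamma_1d(x, planned, x, measured)
print(f"gamma pass rate: {np.mean(g <= 1.0):.1%}")
```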
31 CFR 363.6 - What special terms do I need to know to understand this part?
Code of Federal Regulations, 2012 CFR
2012-07-01
... TreasuryDirect account at account establishment using an online verification service or offline... the online process by which all securities contained within the minor linked account are moved to the... transaction to convert a definitive savings bond to a book-entry bond. Online means use of the Internet. Owner...
31 CFR 363.6 - What special terms do I need to know to understand this part?
Code of Federal Regulations, 2013 CFR
2013-07-01
... TreasuryDirect account at account establishment using an online verification service or offline... the online process by which all securities contained within the minor linked account are moved to the... transaction to convert a definitive savings bond to a book-entry bond. Online means use of the Internet. Owner...
31 CFR 363.6 - What special terms do I need to know to understand this part?
Code of Federal Regulations, 2014 CFR
2014-07-01
... TreasuryDirect account at account establishment using an online verification service or offline... the online process by which all securities contained within the minor linked account are moved to the... transaction to convert a definitive savings bond to a book-entry bond. Online means use of the Internet. Owner...
Radio-frequency low-coherence interferometry.
Fernández-Pousa, Carlos R; Mora, José; Maestre, Haroldo; Corral, Pablo
2014-06-15
A method for retrieving low-coherence interferograms, based on the use of a microwave photonics filter, is proposed and demonstrated. The method is equivalent to the double-interferometer technique, with the scanning interferometer replaced by an analog fiber-optics link and the visibility recorded as the amplitude of its radio-frequency (RF) response. As a low-coherence interferometry system, it shows a decrease of resolution induced by the fiber's third-order dispersion (β3). As a displacement sensor, it provides highly linear and slope-scalable readouts of the interferometer's optical path difference in terms of RF, even in the presence of third-order dispersion. In a proof-of-concept experiment, we demonstrate 20-μm displacement readouts using C-band EDFA sources and standard single-mode fiber.
NASA Astrophysics Data System (ADS)
Claver, C. F.; Selvy, Brian M.; Angeli, George; Delgado, Francisco; Dubois-Felsmann, Gregory; Hascall, Patrick; Lotz, Paul; Marshall, Stuart; Schumacher, German; Sebag, Jacques
2014-08-01
The Large Synoptic Survey Telescope project was an early adopter of SysML and Model Based Systems Engineering practices. The LSST project began using MBSE for requirements engineering beginning in 2006 shortly after the initial release of the first SysML standard. Out of this early work the LSST's MBSE effort has grown to include system requirements, operational use cases, physical system definition, interfaces, and system states along with behavior sequences and activities. In this paper we describe our approach and methodology for cross-linking these system elements over the three classical systems engineering domains - requirement, functional and physical - into the LSST System Architecture model. We also show how this model is used as the central element to the overall project systems engineering effort. More recently we have begun to use the cross-linked modeled system architecture to develop and plan the system verification and test process. In presenting this work we also describe "lessons learned" from several missteps the project has had with MBSE. Lastly, we conclude by summarizing the overall status of the LSST's System Architecture model and our plans for the future as the LSST heads toward construction.
Ihlow, Alexander; Schweizer, Patrick; Seiffert, Udo
2008-01-23
To find candidate genes that potentially influence the susceptibility or resistance of crop plants to powdery mildew fungi, an assay system based on transient-induced gene silencing (TIGS) as well as transient over-expression in single epidermal cells of barley has been developed. However, this system relies on quantitative microscopic analysis of the barley/powdery mildew interaction and will only become a high-throughput tool of phenomics upon automation of the most time-consuming steps. We have developed a high-throughput screening system based on a motorized microscope which evaluates the specimens fully automatically. A large-scale double-blind verification of the system showed an excellent agreement of manual and automated analysis and proved the system to work dependably. Furthermore, in a series of bombardment experiments an RNAi construct targeting the Mlo gene was included, which is expected to phenocopy resistance mediated by recessive loss-of-function alleles such as mlo5. In most cases, the automated analysis system recorded a shift towards resistance upon RNAi of Mlo, thus providing proof of concept for its usefulness in detecting gene-target effects. Besides saving labor and enabling a screening of thousands of candidate genes, this system offers continuous operation of expensive laboratory equipment and provides a less subjective analysis as well as a complete and enduring documentation of the experimental raw data in terms of digital images. In general, it proves the concept of enabling available microscope hardware to handle challenging screening tasks fully automatically.
Stathakarou, Natalia; Zary, Nabil; Kononowicz, Andrzej A
2014-01-01
Background. Massive Open Online Courses (MOOCs) are an emerging trend in online learning. However, their technology is not yet completely adjusted to the needs of healthcare education. Integration of Virtual Patients within MOOCs to increase interactivity and foster clinical reasoning skills training, has been discussed in the past, but not verified by a practical implementation. Objective. To investigate the technical feasibility of integrating MOOCs with Virtual Patients for the purpose of enabling further research into the potential pedagogical benefits of this approach. Methods. We selected OpenEdx and Open Labyrinth as representative constituents of a MOOC platform and Virtual Patient system integration. Based upon our prior experience we selected the most fundamental technical requirement to address. Grounded in the available literature we identified an e-learning standard to guide the integration. We attempted to demonstrate the feasibility of the integration by designing a "proof-of-concept" prototype. The resulting pilot implementation was subject of verification by two test cases. Results. A Single Sign-On mechanism connecting Open Labyrinth with OpenEdx and based on the IMS LTI standard was successfully implemented and verified. Conclusion. We investigated the technical perspective of integrating Virtual Patients with MOOCs. By addressing this crucial technical requirement we set a base for future research on the educational benefits of using virtual patients in MOOCs. This provides new opportunities for integrating specialized software in healthcare education at massive scale.
Zary, Nabil; Kononowicz, Andrzej A.
2014-01-01
Background. Massive Open Online Courses (MOOCs) are an emerging trend in online learning. However, their technology is not yet completely adjusted to the needs of healthcare education. Integration of Virtual Patients within MOOCs to increase interactivity and foster clinical reasoning skills training, has been discussed in the past, but not verified by a practical implementation. Objective. To investigate the technical feasibility of integrating MOOCs with Virtual Patients for the purpose of enabling further research into the potential pedagogical benefits of this approach. Methods. We selected OpenEdx and Open Labyrinth as representative constituents of a MOOC platform and Virtual Patient system integration. Based upon our prior experience we selected the most fundamental technical requirement to address. Grounded in the available literature we identified an e-learning standard to guide the integration. We attempted to demonstrate the feasibility of the integration by designing a “proof-of-concept” prototype. The resulting pilot implementation was subject of verification by two test cases. Results. A Single Sign-On mechanism connecting Open Labyrinth with OpenEdx and based on the IMS LTI standard was successfully implemented and verified. Conclusion. We investigated the technical perspective of integrating Virtual Patients with MOOCs. By addressing this crucial technical requirement we set a base for future research on the educational benefits of using virtual patients in MOOCs. This provides new opportunities for integrating specialized software in healthcare education at massive scale. PMID:25405078
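The Single Sign-On described in the two records above is based on the IMS LTI standard; in LTI 1.x the launch is an OAuth 1.0a-signed form POST from the platform to the tool. The sketch below shows only the HMAC-SHA1 signing step, with a hypothetical launch URL, consumer key and secret; a production integration would normally rely on a maintained LTI/OAuth library rather than hand-rolled signing.

```python
# Minimal sketch of how an LTI 1.1-style launch (the kind of IMS LTI Single
# Sign-On described above) is signed with OAuth 1.0a HMAC-SHA1. Not the
# OpenEdx/Open Labyrinth code; URL, key and secret are hypothetical, and a
# real implementation should handle token secrets, nonce storage and
# encoding corner cases via a maintained OAuth library.
import base64, hashlib, hmac, time, uuid
from urllib.parse import quote

def pct(s):                       # RFC 3986 percent-encoding
    return quote(str(s), safe="")

def lti_signature(method, url, params, consumer_secret):
    # Normalized parameter string: sorted, percent-encoded key=value pairs.
    norm = "&".join(f"{pct(k)}={pct(v)}" for k, v in sorted(params.items()))
    base = "&".join([method.upper(), pct(url), pct(norm)])
    key = pct(consumer_secret) + "&"          # empty token secret
    digest = hmac.new(key.encode(), base.encode(), hashlib.sha1).digest()
    return base64.b64encode(digest).decode()

launch_url = "https://vp.example.org/lti/launch"    # hypothetical tool URL
params = {
    "lti_message_type": "basic-lti-launch-request",
    "lti_version": "LTI-1p0",
    "resource_link_id": "case-42",
    "user_id": "student-123",
    "oauth_consumer_key": "edx-demo-key",            # hypothetical
    "oauth_nonce": uuid.uuid4().hex,
    "oauth_timestamp": str(int(time.time())),
    "oauth_signature_method": "HMAC-SHA1",
    "oauth_version": "1.0",
}
params["oauth_signature"] = lti_signature("POST", launch_url, params,
                                          "edx-demo-secret")
print(params["oauth_signature"])
```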
A penalty-based nodal discontinuous Galerkin method for spontaneous rupture dynamics
NASA Astrophysics Data System (ADS)
Ye, R.; De Hoop, M. V.; Kumar, K.
2017-12-01
Numerical simulation of the dynamic rupture processes with slip is critical to understand the earthquake source process and the generation of ground motions. However, it can be challenging due to the nonlinear friction laws interacting with seismicity, coupled with the discontinuous boundary conditions across the rupture plane. In practice, the inhomogeneities in topography, fault geometry, elastic parameters and permeability add extra complexity. We develop a nodal discontinuous Galerkin method to simulate seismic wave phenomena with slipping boundary conditions, including the fluid-solid boundaries and ruptures. By introducing a novel penalty flux, we avoid solving Riemann problems on interfaces, which makes our method applicable to general anisotropic and poro-elastic materials. Based on unstructured tetrahedral meshes in 3D, the code can capture various geometries in geological models, and uses polynomial expansion to achieve high-order accuracy. We consider the rate and state friction law, in the spontaneous rupture dynamics, as part of a nonlinear transmitting boundary condition, which is weakly enforced across the fault surface as numerical flux. An iterative coupling scheme is developed based on implicit time stepping, containing a constrained optimization process that accounts for the nonlinear part. To validate the method, we prove the convergence of the coupled system with error estimates. We test our algorithm on a well-established numerical example (TPV102) of the SCEC/USGS Spontaneous Rupture Code Verification Project, and benchmark against simulations with PyLith and SPECFEM3D, with good agreement.
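For reference, the rate-and-state friction law mentioned above is commonly written in the Dieterich-Ruina form with the aging law for the state evolution; the exact variant and regularization used in the cited code may differ:

```latex
% Standard rate-and-state friction (Dieterich-Ruina form) with the aging law;
% the cited work may use a different variant or regularization.
\begin{align}
  \tau &= \sigma_n \left[\mu_0
          + a \ln\!\frac{V}{V_0}
          + b \ln\!\frac{V_0\,\theta}{D_c}\right], \\
  \frac{d\theta}{dt} &= 1 - \frac{V\theta}{D_c},
\end{align}
```

where tau is the shear traction on the fault, sigma_n the effective normal stress, V the slip rate, theta the state variable, and a, b, D_c, mu_0, V_0 constitutive parameters.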
NASA Astrophysics Data System (ADS)
Tatli, Hamza; Yucel, Derya; Yilmaz, Sercan; Fayda, Merdan
2018-02-01
The aim of this study is to develop an algorithm for independent MU/treatment time (TT) verification for non-IMRT treatment plans, as part of a QA program to ensure treatment delivery accuracy. Two radiotherapy delivery units and their treatment planning systems (TPS) were commissioned at the Liv Hospital Radiation Medicine Center, Tbilisi, Georgia. Beam data were collected according to the vendors' collection guidelines and AAPM report recommendations, and processed in Microsoft Excel during in-house algorithm development. The algorithm is designed and optimized for calculating SSD and SAD treatment plans, based on the AAPM TG-114 dose calculation recommendations, and is coded and embedded in an MS Excel spreadsheet as a preliminary verification algorithm (VA). Treatment verification plans were created by the TPSs based on IAEA TRS-430 recommendations and also calculated by the VA, and point measurements were collected in a solid water phantom and compared. The study showed that the in-house VA can be used for MU/TT verification of non-IMRT plans.
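A factor-based independent point-dose check in the spirit of AAPM TG-114 multiplies the reference output by field-size, depth and geometry factors. The sketch below is not the in-house VA described above; all numerical factors are hypothetical placeholders that would normally be interpolated from commissioned beam data.

```python
# Minimal sketch of a factor-based independent MU check in the spirit of
# AAPM TG-114, for a simple SAD (isocentric) photon field. This is NOT the
# in-house VA described in the abstract; the numbers below are hypothetical.
def mu_for_sad_field(dose_cGy, dose_rate_ref=1.0, Sc=1.0, Sp=1.0,
                     tpr=1.0, wedge=1.0, oar=1.0, inv_sq=1.0):
    """MU = D / (D'_ref * Sc * Sp * TPR * WF * OAR * ISF)."""
    return dose_cGy / (dose_rate_ref * Sc * Sp * tpr * wedge * oar * inv_sq)

# Hypothetical example: 200 cGy per fraction, 10x10 cm2 field, depth 10 cm.
mu = mu_for_sad_field(
    dose_cGy=200.0,
    dose_rate_ref=1.0,   # cGy/MU at the reference calibration point
    Sc=1.000,            # collimator scatter factor
    Sp=1.000,            # phantom scatter factor
    tpr=0.738,           # tissue-phantom ratio at depth
    inv_sq=1.0,          # inverse-square factor (point at isocenter)
)
print(f"independent MU estimate: {mu:.1f}")
tps_mu = 268.0           # hypothetical TPS value for comparison
print(f"deviation vs TPS: {(mu - tps_mu) / tps_mu:+.1%}")
```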
NASA Astrophysics Data System (ADS)
McKellip, Rodney; Yuan, Ding; Graham, William; Holland, Donald E.; Stone, David; Walser, William E.; Mao, Chengye
1997-06-01
The number of available spaceborne and airborne systems will dramatically increase over the next few years. A common systematic approach toward verification of these systems will become important for comparing the systems' operational performance. The Commercial Remote Sensing Program at the John C. Stennis Space Center (SSC) in Mississippi has developed design requirements for a remote sensing verification target range to provide a means to evaluate spatial, spectral, and radiometric performance of optical digital remote sensing systems. The verification target range consists of spatial, spectral, and radiometric targets painted on a 150- by 150-meter concrete pad located at SSC. The design criteria for this target range are based upon work over a smaller, prototypical target range at SSC during 1996. This paper outlines the purpose and design of the verification target range based upon an understanding of the systems to be evaluated as well as data analysis results from the prototypical target range.
Proceedings of the First NASA Formal Methods Symposium
NASA Technical Reports Server (NTRS)
Denney, Ewen (Editor); Giannakopoulou, Dimitra (Editor); Pasareanu, Corina S. (Editor)
2009-01-01
Topics covered include: Model Checking - My 27-Year Quest to Overcome the State Explosion Problem; Applying Formal Methods to NASA Projects: Transition from Research to Practice; TLA+: Whence, Wherefore, and Whither; Formal Methods Applications in Air Transportation; Theorem Proving in Intel Hardware Design; Building a Formal Model of a Human-Interactive System: Insights into the Integration of Formal Methods and Human Factors Engineering; Model Checking for Autonomic Systems Specified with ASSL; A Game-Theoretic Approach to Branching Time Abstract-Check-Refine Process; Software Model Checking Without Source Code; Generalized Abstract Symbolic Summaries; A Comparative Study of Randomized Constraint Solvers for Random-Symbolic Testing; Component-Oriented Behavior Extraction for Autonomic System Design; Automated Verification of Design Patterns with LePUS3; A Module Language for Typing by Contracts; From Goal-Oriented Requirements to Event-B Specifications; Introduction of Virtualization Technology to Multi-Process Model Checking; Comparing Techniques for Certified Static Analysis; Towards a Framework for Generating Tests to Satisfy Complex Code Coverage in Java Pathfinder; jFuzz: A Concolic Whitebox Fuzzer for Java; Machine-Checkable Timed CSP; Stochastic Formal Correctness of Numerical Algorithms; Deductive Verification of Cryptographic Software; Coloured Petri Net Refinement Specification and Correctness Proof with Coq; Modeling Guidelines for Code Generation in the Railway Signaling Context; Tactical Synthesis Of Efficient Global Search Algorithms; Towards Co-Engineering Communicating Autonomous Cyber-Physical Systems; and Formal Methods for Automated Diagnosis of Autosub 6000.
Verification testing of the ReCip® RTS-500 System was conducted over a 12-month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located on Otis Air National Guard Base in Bourne, Massachusetts. A nine-week startup period preceded the verification test t...
He, Hua; McDermott, Michael P.
2012-01-01
Sensitivity and specificity are common measures of the accuracy of a diagnostic test. The usual estimators of these quantities are unbiased if data on the diagnostic test result and the true disease status are obtained from all subjects in an appropriately selected sample. In some studies, verification of the true disease status is performed only for a subset of subjects, possibly depending on the result of the diagnostic test and other characteristics of the subjects. Estimators of sensitivity and specificity based on this subset of subjects are typically biased; this is known as verification bias. Methods have been proposed to correct verification bias under the assumption that the missing data on disease status are missing at random (MAR), that is, the probability of missingness depends on the true (missing) disease status only through the test result and observed covariate information. When some of the covariates are continuous, or the number of covariates is relatively large, the existing methods require parametric models for the probability of disease or the probability of verification (given the test result and covariates), and hence are subject to model misspecification. We propose a new method for correcting verification bias based on the propensity score, defined as the predicted probability of verification given the test result and observed covariates. This is estimated separately for those with positive and negative test results. The new method classifies the verified sample into several subsamples that have homogeneous propensity scores and allows correction for verification bias. Simulation studies demonstrate that the new estimators are more robust to model misspecification than existing methods, but still perform well when the models for the probability of disease and probability of verification are correctly specified. PMID:21856650
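As an illustration of the idea, the sketch below simulates verification that is missing at random given the test result and a covariate, fits the verification propensity separately for test-positive and test-negative subjects, and corrects the naive sensitivity estimate. For brevity it applies a simple inverse-probability-weighted correction rather than the propensity-stratified estimator the authors propose; the data and model are hypothetical.

```python
# Minimal sketch of correcting verification bias with a propensity score for
# verification, on simulated data. Uses inverse-probability weighting, a
# close relative of the stratified estimator described in the abstract.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 20000
x = rng.normal(size=n)                                  # covariate
disease = rng.random(n) < 1 / (1 + np.exp(-(x - 1)))    # true status
test = np.where(disease, rng.random(n) < 0.85, rng.random(n) < 0.10)
# Verification depends on the test result and the covariate (MAR given T, X).
p_verify = 1 / (1 + np.exp(-(2 * test + 0.5 * x - 1)))
verified = rng.random(n) < p_verify

# Fit the propensity of verification separately for test-positives/negatives.
w = np.empty(n)
for t in (0, 1):
    idx = test == t
    model = LogisticRegression().fit(x[idx, None], verified[idx])
    w[idx] = 1.0 / model.predict_proba(x[idx, None])[:, 1]

v = verified
naive_sens = test[v & disease].mean()                   # biased estimate
ipw_sens = np.sum(w[v & disease] * test[v & disease]) / np.sum(w[v & disease])
true_sens = test[disease].mean()
print(f"naive {naive_sens:.3f}  corrected {ipw_sens:.3f}  truth {true_sens:.3f}")
```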
Gaia challenging performances verification: combination of spacecraft models and test results
NASA Astrophysics Data System (ADS)
Ecale, Eric; Faye, Frédéric; Chassat, François
2016-08-01
To achieve the ambitious scientific objectives of the Gaia mission, extremely stringent performance requirements have been given to the spacecraft contractor (Airbus Defence and Space). For a set of those key-performance requirements (e.g. end-of-mission parallax, maximum detectable magnitude, maximum sky density or attitude control system stability), this paper describes how they are engineered during the whole spacecraft development process, with a focus on the end-to-end performance verification. As far as possible, performances are usually verified by end-to-end tests on ground (i.e. before launch). However, the challenging Gaia requirements are not verifiable by such a strategy, principally because no test facility exists to reproduce the expected flight conditions. The Gaia performance verification strategy is therefore based on a mix of analyses (based on spacecraft models) and tests (used to directly feed the models or to correlate them). Emphasis is placed on how to maximize the test contribution to performance verification while keeping the test feasible within an affordable effort. In particular, the paper highlights the contribution of the Gaia Payload Module Thermal Vacuum test to the performance verification before launch. Finally, an overview of the in-flight payload calibration and in-flight performance verification is provided.
Integrity Verification for Multiple Data Copies in Cloud Storage Based on Spatiotemporal Chaos
NASA Astrophysics Data System (ADS)
Long, Min; Li, You; Peng, Fei
Aiming to strike for a balance between the security, efficiency and availability of the data verification in cloud storage, a novel integrity verification scheme based on spatiotemporal chaos is proposed for multiple data copies. Spatiotemporal chaos is implemented for node calculation of the binary tree, and the location of the data in the cloud is verified. Meanwhile, dynamic operation can be made to the data. Furthermore, blind information is used to prevent a third-party auditor (TPA) leakage of the users’ data privacy in a public auditing process. Performance analysis and discussion indicate that it is secure and efficient, and it supports dynamic operation and the integrity verification of multiple copies of data. It has a great potential to be implemented in cloud storage services.
The Verification-based Analysis of Reliable Multicast Protocol
NASA Technical Reports Server (NTRS)
Wu, Yunqing
1996-01-01
Reliable Multicast Protocol (RMP) is a communication protocol that provides an atomic, totally ordered, reliable multicast service on top of unreliable IP Multicasting. In this paper, we develop formal models for RMP using existing automatic verification systems, and perform verification-based analysis on the formal RMP specifications. We also use the formal models of the RMP specifications to generate a test suite for conformance testing of the RMP implementation. Throughout the process of RMP development, we follow an iterative, interactive approach that emphasizes concurrent and parallel progress between the implementation and verification processes. Through this approach, we incorporate formal techniques into our development process, promote a common understanding for the protocol, increase the reliability of our software, and maintain high fidelity between the specifications of RMP and its implementation.
UIVerify: A Web-Based Tool for Verification and Automatic Generation of User Interfaces
NASA Technical Reports Server (NTRS)
Shiffman, Smadar; Degani, Asaf; Heymann, Michael
2004-01-01
In this poster, we describe a web-based tool for verification and automatic generation of user interfaces. The verification component of the tool accepts as input a model of a machine and a model of its interface, and checks that the interface is adequate (correct). The generation component of the tool accepts a model of a given machine and the user's task, and then generates a correct and succinct interface. This write-up will demonstrate the usefulness of the tool by verifying the correctness of a user interface to a flight-control system. The poster will include two more examples of using the tool: verification of the interface to an espresso machine, and automatic generation of a succinct interface to a large hypothetical machine.
NASA Astrophysics Data System (ADS)
Zhong, Shenlu; Li, Mengjiao; Tang, Xiajie; He, Weiqing; Wang, Xiaogang
2017-01-01
A novel optical information verification and encryption method is proposed based on an interference principle and phase retrieval with sparsity constraints. In this method, a target image is encrypted into two phase-only masks (POMs), which comprise sparse phase data used for verification. Both of the two POMs need to be authenticated before being applied for decrypting. The target image can be optically reconstructed when the two authenticated POMs are Fourier transformed and convolved by the correct decryption key, which is also generated in the encryption process. No holographic scheme is involved in the proposed optical verification and encryption system and there is also no problem of information disclosure in the two authenticable POMs. Numerical simulation results demonstrate the validity and good performance of this new proposed method.
A Program Certification Assistant Based on Fully Automated Theorem Provers
NASA Technical Reports Server (NTRS)
Denney, Ewen; Fischer, Bernd
2005-01-01
We describe a certification assistant to support formal safety proofs for programs. It is based on a graphical user interface that hides the low-level details of first-order automated theorem provers while supporting limited interactivity: it allows users to customize and control the proof process on a high level, manages the auxiliary artifacts produced during this process, and provides traceability between the proof obligations and the relevant parts of the program. The certification assistant is part of a larger program synthesis system and is intended to support the deployment of automatically generated code in safety-critical applications.
78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service
Federal Register 2010, 2011, 2012, 2013, 2014
2013-09-12
... developed CBSV as a user-friendly, internet-based application with safeguards that protect the public's information. In addition to the benefit of providing high volume, centralized SSN verification services to users in a secure manner, CBSV provides us with cost and workload management benefits. New Information...
Model-Based Building Verification in Aerial Photographs.
1987-09-01
...paper, we have proposed an experimental knowledge-based verification system; the organization for change detection is outlined. Knowledge rules and
Formal Verification for a Next-Generation Space Shuttle
NASA Technical Reports Server (NTRS)
Nelson, Stacy D.; Pecheur, Charles; Koga, Dennis (Technical Monitor)
2002-01-01
This paper discusses the verification and validation (V&V) of advanced software used for integrated vehicle health monitoring (IVHM), in the context of NASA's next-generation space shuttle. We survey the current V&V practice and standards used in selected NASA projects, review applicable formal verification techniques, and discuss their integration into existing development practice and standards. We also describe two verification tools, JMPL2SMV and Livingstone PathFinder, that can be used to thoroughly verify diagnosis applications that use model-based reasoning, such as the Livingstone system.
Addendum to the User Manual for NASGRO Elastic-Plastic Fracture Mechanics Software Module
NASA Technical Reports Server (NTRS)
Gregg, M. Wayne (Technical Monitor); Chell, Graham; Gardner, Brian
2003-01-01
The elastic-plastic fracture mechanics modules in NASGRO have been enhanced by the addition of the following: new J-integral solutions based on the reference stress method and finite element solutions; the extension of the critical crack and critical load modules to cracks with two degrees of freedom that tear and fail by ductile instability; the addition of a proof test analysis module that includes safe life analysis, calculates proof loads, and determines the flaw screening capability for a given proof load; the addition of a tear-fatigue module for ductile materials that simultaneously tear and extend by fatigue; and a multiple cycle proof test module for estimating service reliability following a proof test.
29 CFR 1919.29 - Limitations on safe working loads and proof loads.
Code of Federal Regulations, 2013 CFR
2013-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) GEAR CERTIFICATION Certification of Vessels: Tests and Proof... pertinent limitations based on stability and/or on structural competence at particular radii. Safe working...
29 CFR 1919.29 - Limitations on safe working loads and proof loads.
Code of Federal Regulations, 2012 CFR
2012-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) GEAR CERTIFICATION Certification of Vessels: Tests and Proof... pertinent limitations based on stability and/or on structural competence at particular radii. Safe working...
29 CFR 1919.29 - Limitations on safe working loads and proof loads.
Code of Federal Regulations, 2011 CFR
2011-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) GEAR CERTIFICATION Certification of Vessels: Tests and Proof... pertinent limitations based on stability and/or on structural competence at particular radii. Safe working...
29 CFR 1919.29 - Limitations on safe working loads and proof loads.
Code of Federal Regulations, 2010 CFR
2010-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) GEAR CERTIFICATION Certification of Vessels: Tests and Proof... pertinent limitations based on stability and/or on structural competence at particular radii. Safe working...
29 CFR 1919.29 - Limitations on safe working loads and proof loads.
Code of Federal Regulations, 2014 CFR
2014-07-01
... ADMINISTRATION, DEPARTMENT OF LABOR (CONTINUED) GEAR CERTIFICATION Certification of Vessels: Tests and Proof... pertinent limitations based on stability and/or on structural competence at particular radii. Safe working...
Can Link Analysis Be Applied to Identify Behavioral Patterns in Train Recorder Data?
Strathie, Ailsa; Walker, Guy H
2016-03-01
A proof-of-concept analysis was conducted to establish whether link analysis could be applied to data from on-train recorders to detect patterns of behavior that could act as leading indicators of potential safety issues. On-train data recorders capture data about driving behavior on thousands of routine journeys every day and offer a source of untapped data that could be used to offer insights into human behavior. Data from 17 journeys undertaken by six drivers on the same route over a 16-hr period were analyzed using link analysis, and four key metrics were examined: number of links, network density, diameter, and sociometric status. The results established that link analysis can be usefully applied to data captured from on-vehicle recorders. The four metrics revealed key differences in normal driver behavior. These differences have promising construct validity as leading indicators. Link analysis is one method that could be usefully applied to exploit data routinely gathered by on-vehicle data recorders. It facilitates a proactive approach to safety based on leading indicators, offers a clearer understanding of what constitutes normal driving behavior, and identifies trends at the interface of people and systems, which is currently a key area of strategic risk. These research findings have direct applications in the field of transport data monitoring. They offer a means of automatically detecting patterns in driver behavior that could act as leading indicators of problems during operation and that could be used in the proactive monitoring of driver competence, risk management, and even infrastructure design. © 2015, Human Factors and Ergonomics Society.
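The four metrics named above can be computed directly from a transition network built from an event sequence. The sketch below uses a hypothetical sequence of driver actions; the exact metric definitions, in particular sociometric status, may differ in detail from those used in the study.

```python
# Minimal sketch of the kind of link analysis described above, applied to a
# hypothetical sequence of driver actions from an on-train data recorder.
import networkx as nx

# Hypothetical event sequence for one journey.
events = ["power_on", "accelerate", "coast", "brake", "coast",
          "accelerate", "brake", "warning_ack", "brake", "stop"]

G = nx.DiGraph()
for a, b in zip(events, events[1:]):     # link consecutive events
    if G.has_edge(a, b):
        G[a][b]["weight"] += 1
    else:
        G.add_edge(a, b, weight=1)

n = G.number_of_nodes()
print("number of links :", G.number_of_edges())
print("network density :", nx.density(G))
print("diameter        :", nx.diameter(G.to_undirected()))
# Sociometric status approximated as total (in + out) links per node,
# normalised by the number of other nodes.
status = {v: (G.in_degree(v) + G.out_degree(v)) / (n - 1) for v in G}
print("sociometric status:", status)
```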
Achieving Agreement in Three Rounds with Bounded-Byzantine Faults
NASA Technical Reports Server (NTRS)
Malekpour, Mahyar, R.
2017-01-01
A three-round algorithm is presented that guarantees agreement in a system of K greater than or equal to 3F+1 nodes provided each faulty node induces no more than F faults and each good node experiences no more than F faults, where F is the maximum number of simultaneous faults in the network. The algorithm is based on the Oral Message algorithm of Lamport, Shostak, and Pease, is scalable with respect to the number of nodes in the system, and applies equally to the traditional node-fault model and the link-fault model. We also present a mechanical verification of the algorithm, focusing on verifying the correctness of a bounded model of the algorithm as well as confirming claims of determinism.
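For context, the classic Oral Messages algorithm OM(m) of Lamport, Shostak and Pease, on which the abstract's algorithm builds, can be simulated in a few lines. The sketch below is not the three-round bounded-fault variant described above; traitors are modeled simply as inverting every value they originate or relay.

```python
# Minimal simulation of the classic Oral Messages algorithm OM(m) of
# Lamport, Shostak and Pease. Traitors invert every value they send.
def majority(values):
    # Binary values and an odd vote count here, so no tie-breaking is needed.
    return max(set(values), key=values.count)

def om(commander, lieutenants, value, m, traitors):
    """Return the value each lieutenant finally decides on."""
    received = {p: (1 - value if commander in traitors else value)
                for p in lieutenants}
    if m == 0:
        return received
    # Each lieutenant relays its received value to the others via OM(m-1).
    relayed = {p: om(p, [q for q in lieutenants if q != p],
                     received[p], m - 1, traitors)
               for p in lieutenants}
    return {p: majority([received[p]] +
                        [relayed[q][p] for q in lieutenants if q != p])
            for p in lieutenants}

# 4 nodes, one traitorous lieutenant, one round of relaying (m = 1):
print(om("C", ["L1", "L2", "L3"], 1, 1, traitors={"L3"}))
# Loyal lieutenants L1 and L2 agree on the commander's value 1.
```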
Experimental implant communication of high data rate video using an ultra wideband radio link.
Chávez-Santiago, Raúl; Balasingham, Ilangko; Bergsland, Jacob; Zahid, Wasim; Takizawa, Kenichi; Miura, Ryu; Li, Huan-Bang
2013-01-01
Ultra wideband (UWB) is one of the radio technologies adopted by the IEEE 802.15.6™-2012 standard for on-body communication in body area networks (BANs). However, a number of simulation-based studies suggest the feasibility of using UWB for high data rate implant communication too. This paper presents an experimental verification of said predictions. We carried out radio transmissions of H.264/1280×720 pixels video at 80 Mbps through a UWB multiband orthogonal frequency division multiplexing (MB-OFDM) interface in a porcine surgical model. The results demonstrated successful transmission up to a maximum depth of 30 mm in the abdomen and 33 mm in the thorax within the 4.2-4.8 GHz frequency band.
Sorgho, Raissa; Franke, Jonas; Simboro, Seraphin; Phalkey, Revati; Saeurborn, Rainer
Malnutrition remains a leading cause of death in children in low- and middle-income countries; this will be aggravated by climate change. Annually, 6.9 million deaths of children under 5 were attributable directly or indirectly to malnutrition. Although these figures have recently decreased, evidence shows that a world with a medium climate (local warming up to 3-4 °C) will create an additional 25.2 million malnourished children. This proof of concept study explores the relationships between childhood malnutrition (more specifically stunting), regional agricultural yields, and climate variables through the use of remote sensing (RS) satellite imaging along with algorithms to predict the effect of climate variability on agricultural yields and on malnutrition of children under 5. The success of this proof-of-concept study, NUTRItion and CLIMate (NUTRICLIM), should encourage researchers to apply both concept and tools to the study of the link between weather variability, crop yield, and malnutrition on a larger scale. It would also allow for linking such micro-level data to climate models and address the challenge of projecting the additional impact of childhood malnutrition from climate change to various policy-relevant time horizons.
Proof-of-Concept of a Millimeter-Wave Integrated Heterogeneous Network for 5G Cellular
Okasaka, Shozo; Weiler, Richard J.; Keusgen, Wilhelm; Pudeyev, Andrey; Maltsev, Alexander; Karls, Ingolf; Sakaguchi, Kei
2016-01-01
The fifth-generation mobile networks (5G) will not only enhance mobile broadband services, but also enable connectivity for a massive number of Internet-of-Things devices, such as wireless sensors, meters or actuators. Thus, 5G is expected to achieve a 1000-fold or more increase in capacity over 4G. The use of the millimeter-wave (mmWave) spectrum is a key enabler to allowing 5G to achieve such enhancement in capacity. To fully utilize the mmWave spectrum, 5G is expected to adopt a heterogeneous network (HetNet) architecture, wherein mmWave small cells are overlaid onto a conventional macro-cellular network. In the mmWave-integrated HetNet, splitting of the control plane (CP) and user plane (UP) will allow continuous connectivity and increase the capacity of the mmWave small cells. mmWave communication can be used not only for access linking, but also for wireless backhaul linking, which will facilitate the installation of mmWave small cells. In this study, a proof-of-concept (PoC) was conducted to demonstrate the practicality of a prototype mmWave-integrated HetNet, using mmWave technologies for both backhaul and access. PMID:27571074
Proof-of-Concept of a Millimeter-Wave Integrated Heterogeneous Network for 5G Cellular.
Okasaka, Shozo; Weiler, Richard J; Keusgen, Wilhelm; Pudeyev, Andrey; Maltsev, Alexander; Karls, Ingolf; Sakaguchi, Kei
2016-08-25
The fifth-generation mobile networks (5G) will not only enhance mobile broadband services, but also enable connectivity for a massive number of Internet-of-Things devices, such as wireless sensors, meters or actuators. Thus, 5G is expected to achieve a 1000-fold or more increase in capacity over 4G. The use of the millimeter-wave (mmWave) spectrum is a key enabler to allowing 5G to achieve such enhancement in capacity. To fully utilize the mmWave spectrum, 5G is expected to adopt a heterogeneous network (HetNet) architecture, wherein mmWave small cells are overlaid onto a conventional macro-cellular network. In the mmWave-integrated HetNet, splitting of the control plane (CP) and user plane (UP) will allow continuous connectivity and increase the capacity of the mmWave small cells. mmWave communication can be used not only for access linking, but also for wireless backhaul linking, which will facilitate the installation of mmWave small cells. In this study, a proof-of-concept (PoC) was conducted to demonstrate the practicality of a prototype mmWave-integrated HetNet, using mmWave technologies for both backhaul and access.
MR Imaging Based Treatment Planning for Radiotherapy of Prostate Cancer
2007-02-01
developed practical methods for heterogeneity correction for MRI-based dose calculations (Chen et al 2007). 6) We will use existing Monte Carlo ... Monte Carlo verification of IMRT dose distributions from a commercial treatment planning optimization system, Phys. Med. Biol., 45:2483-95 (2000) Ma...accuracy and consistency for MR-based IMRT treatment planning for prostate cancer. A short paper entitled “Monte Carlo dose verification of MR image based
Hybrid sol-gel optical materials
Zeigler, J.M.
1993-04-20
Hybrid sol-gel materials comprise silicate sols cross-linked with linear polysilane, polygermane, or poly(silane-germane). The sol-gel materials are useful as optical identifiers in tagging and verification applications and, in a different aspect, as stable, visible light transparent non-linear optical materials. Methyl or phenyl silicones, polyaryl sulfides, polyaryl ethers, and rubbery polysilanes may be used in addition to the linear polysilane. The linear polymers cross-link with the sol to form a matrix having high optical transparency, resistance to thermooxidative aging, adherence to a variety of substrates, brittleness, and a resistance to cracking during thermal cycling.
Hybrid sol-gel optical materials
Zeigler, John M.
1993-01-01
Hybrid sol-gel materials comprise silicate sols cross-linked with linear polysilane, polygermane, or poly(silane-germane). The sol-gel materials are useful as optical identifiers in tagging and verification applications and, in a different aspect, as stable, visible light transparent non-linear optical materials. Methyl or phenyl silicones, polyaryl sulfides, polyaryl ethers, and rubbery polysilanes may be used in addition to the linear polysilane. The linear polymers cross-link with the sol to form a matrix having high optical transparency, resistance to thermooxidative aging, adherence to a variety of substrates, brittleness, and a resistance to cracking during thermal cycling.
Hybrid sol-gel optical materials
Zeigler, John M.
1992-01-01
Hybrid sol-gel materials comprise silicate sols cross-linked with linear polysilane, polygermane, or poly(silane-germane). The sol-gel materials are useful as optical identifiers in tagging and verification applications and, in a different aspect, as stable, visible light transparent non-linear optical materials. Methyl or phenyl silicones, polyaryl sulfides, polyaryl ethers, and rubbery polysilanes may be used in addition to the linear polysilane. The linear polymers cross-link with the sol to form a matrix having high optical transparency, resistance to thermooxidative aging, adherence to a variety of substrates, brittleness, and a resistance to cracking during thermal cycling.
Mutation Testing for Effective Verification of Digital Components of Physical Systems
NASA Astrophysics Data System (ADS)
Kushik, N. G.; Evtushenko, N. V.; Torgaev, S. N.
2015-12-01
Digital components of modern physical systems are often designed using circuitry solutions based on field-programmable gate array (FPGA) technology. Such (embedded) digital components should be carefully tested. In this paper, an approach to the verification of digital physical system components based on mutation testing is proposed. The reference description of the behavior of a digital component in a hardware description language (HDL) is mutated by introducing into it the most probable errors and, unlike mutants in high-level programming languages, the corresponding test case is effectively derived based on a comparison of special scalable representations of the specification and the constructed mutant using various logic synthesis and verification systems.
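The following sketch illustrates the mutation-testing idea on a tiny Python model of a combinational circuit rather than on HDL with logic synthesis tools: a mutant replaces one operator in the reference description, and any input vector on which the mutant disagrees with the specification becomes a test case.

```python
# Minimal illustration of the mutation-testing idea from the abstract, using
# a tiny Python model of a combinational circuit instead of HDL. Exhaustively
# comparing mutant and specification yields input vectors that detect the
# injected fault; these vectors form the derived test case.
from itertools import product

def spec(a, b, c):
    # Reference behaviour: a 1-bit majority function.
    return (a & b) | (a & c) | (b & c)

def mutant(a, b, c):
    # Probable design error: one AND replaced by OR.
    return (a | b) | (a & c) | (b & c)

def distinguishing_tests(ref, mut, n_inputs):
    """All input vectors on which the mutant disagrees with the spec."""
    return [bits for bits in product((0, 1), repeat=n_inputs)
            if ref(*bits) != mut(*bits)]

tests = distinguishing_tests(spec, mutant, 3)
print("test vectors that kill the mutant:", tests)
# e.g. (0, 1, 0): spec -> 0, mutant -> 1, so this vector detects the fault.
```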
Sequential Data Assimilation for Seismicity: a Proof of Concept
NASA Astrophysics Data System (ADS)
van Dinther, Y.; Fichtner, A.; Kuensch, H. R.
2015-12-01
Our physical understanding and probabilistic forecasting ability of earthquakes are significantly hampered by limited indications of the state of stress and strength on faults and their governing parameters. Using the sequential data assimilation framework developed in meteorology and oceanography (e.g., Evensen, JGR, 1994) and a seismic cycle forward model based on Navier-Stokes Partial Differential Equations (van Dinther et al., JGR, 2013), we show that such information with its uncertainties is within reach, at least for laboratory setups. We aim to provide the first, thorough proof of concept for seismicity-related PDE applications via a perfect model test of seismic cycles in a simplified wedge-like subduction setup. By evaluating the performance with respect to known numerical input and output, we aim to answer whether there is any probabilistic forecast value for this laboratory-like setup, which and how many parameters can be constrained, and how much data in both space and time would be needed to do so. Thus far our implementation of an Ensemble Kalman Filter demonstrated that probabilistic estimates of both the state of stress and strength on a megathrust fault can be obtained and utilized even when assimilating surface velocity data at a single point in time and space. An ensemble-based error covariance matrix containing velocities, stresses and pressure links surface velocity observations to fault stresses and strengths well enough to update fault coupling accordingly. Depending on what synthetic data show, coseismic events can then be triggered or inhibited.
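A minimal stochastic Ensemble Kalman Filter analysis step of the kind referred to above can be written in a few lines of numpy: a single observed surface velocity updates an ensemble of states (surface velocity, fault stress, fault strength) through the ensemble covariance. The dimensions and numbers are hypothetical and this is not the authors' implementation.

```python
# Minimal sketch of a stochastic Ensemble Kalman Filter analysis step: one
# surface-velocity observation updates unobserved fault quantities through
# the ensemble covariance. All values are hypothetical.
import numpy as np

rng = np.random.default_rng(1)
n_ens, n_state = 50, 3   # state = [surface velocity, fault stress, strength]
X = rng.normal([1.0, 5.0, 6.0], [0.3, 1.0, 0.5], size=(n_ens, n_state)).T
# Impose a correlation between surface velocity and fault stress so the
# observation can actually update the unobserved components.
X[1] += 2.0 * (X[0] - X[0].mean())

H = np.array([[1.0, 0.0, 0.0]])   # we observe only the surface velocity
obs, obs_var = 1.4, 0.05**2

# Ensemble perturbations and Kalman gain K = P H^T (H P H^T + R)^-1.
A = X - X.mean(axis=1, keepdims=True)
P = A @ A.T / (n_ens - 1)
K = P @ H.T @ np.linalg.inv(H @ P @ H.T + np.array([[obs_var]]))

# Perturbed-observation update of every ensemble member.
d = obs + rng.normal(0.0, np.sqrt(obs_var), size=(1, n_ens))
X_a = X + K @ (d - H @ X)

print("prior mean state    :", X.mean(axis=1))
print("posterior mean state:", X_a.mean(axis=1))
```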
Analytic Methods in Investigative Geometry.
ERIC Educational Resources Information Center
Dobbs, David E.
2001-01-01
Suggests an alternative proof by analytic methods, which is more accessible than rigorous proof based on Euclid's Elements, in which students need only apply standard methods of trigonometry to the data without introducing new points or lines. (KHR)
COBE navigation with one-way return-link Doppler in the post-helium-venting phase
NASA Technical Reports Server (NTRS)
Dunham, Joan; Nemesure, M.; Samii, M. V.; Maher, M.; Teles, Jerome; Jackson, J.
1991-01-01
The results of a navigation experiment with one way return link Doppler tracking measurements for operational orbit determination of the Cosmic Background Explorer (COBE) spacecraft are presented. The frequency of the tracking signal for the one way measurements was stabilized with an Ultrastable Oscillator (USO), and the signal was relayed by the Tracking and Data Relay Satellite System (TDRSS). The study achieved three objectives: space qualification of TDRSS noncoherent one way return link Doppler tracking; determination of flight performance of the USO coupled to the second generation TDRSS compatible user transponder; and verification of algorithms for navigation using actual one way tracking data. Orbit determination and the inflight USO performance evaluation results are presented.
Verification testing of the Waterloo Biofilter Systems (WBS), Inc. Waterloo Biofilter® Model 4-Bedroom system was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC) located at Otis Air National Guard Base in Bourne, Mas...
NASA Technical Reports Server (NTRS)
1975-01-01
The findings are presented of investigations on concepts and techniques in automated performance verification. The investigations were conducted to provide additional insight into the design methodology and to develop a consolidated technology base from which to analyze performance verification design approaches. Other topics discussed include data smoothing, function selection, flow diagrams, data storage, and shuttle hydraulic systems.
Verification testing of the Aquapoint, Inc. (AQP) BioclereTM Model 16/12 was conducted over a thirteen month period at the Massachusetts Alternative Septic System Test Center (MASSTC), located at Otis Air National Guard Base in Bourne, Massachusetts. Sanitary sewerage from the ba...
Dynamic testing for shuttle design verification
NASA Technical Reports Server (NTRS)
Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.
1972-01-01
Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.
SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements
DOE Office of Scientific and Technical Information (OSTI.GOV)
Thoelking, J; Yuvaraj, S; Jens, F
Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and ideally independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and TPS. To investigate the sensitivity of the new device, different types of systematic and random errors for leaf positions and linac output were introduced in IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector-based dose reconstruction showed an excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan verification. Funding Support, Disclosures, and Conflict of Interest: COIs: Frank Lohr: Elekta: research grant, travel grants, teaching honoraria IBA: research grant, travel grants, teaching honoraria, advisory board C-Rad: board honoraria, travel grants Frederik Wenz: Elekta: research grant, teaching honoraria, consultant, advisory board Zeiss: research grant, teaching honoraria, patent Hansjoerg Wertz: Elekta: research grant, teaching honoraria IBA: research grant.
Efficient multiuser quantum cryptography network based on entanglement.
Xue, Peng; Wang, Kunkun; Wang, Xiaoping
2017-04-04
We present an efficient quantum key distribution protocol with a certain entangled state to solve a special cryptographic task. Also, we provide a proof of security of this protocol by generalizing the proof of the modified Lo-Chau scheme. Based on this two-user scheme, a quantum cryptography network protocol is proposed without any quantum memory.
Specifications-Based Grading in an Introduction to Proofs Course
ERIC Educational Resources Information Center
Williams, Kristopher
2018-01-01
This article describes a system of specifications-based grading used in an introduction to proofs course. The system was introduced to address two issues that arose in the course: how to spend less time grading and how to encourage use of feedback. We describe the implementation of the system and the results on grading and on students.
NASA Astrophysics Data System (ADS)
Capiński, Maciej J.; Gidea, Marian; de la Llave, Rafael
2017-01-01
We present a diffusion mechanism for time-dependent perturbations of autonomous Hamiltonian systems introduced in Gidea (2014 arXiv:1405.0866). This mechanism is based on shadowing of pseudo-orbits generated by two dynamics: an ‘outer dynamics’, given by homoclinic trajectories to a normally hyperbolic invariant manifold, and an ‘inner dynamics’, given by the restriction to that manifold. On the inner dynamics the only assumption is that it preserves area. Unlike other approaches, Gidea (2014 arXiv:1405.0866) does not rely on KAM theory and/or Aubry-Mather theory to establish the existence of diffusion. Moreover, it does not require checking twist conditions or non-degeneracy conditions near resonances. The conditions are explicit and can be checked by finite-precision calculations in concrete systems (roughly, they amount to checking that Melnikov-type integrals do not vanish and that some manifolds are transversal). As an application, we study the planar elliptic restricted three-body problem. We present a rigorous theorem showing that if some concrete calculations yield a non-zero value, then for any sufficiently small, positive value of the eccentricity of the orbits of the main bodies, there are orbits of the infinitesimal body that exhibit a change of energy bigger than some fixed number, which is independent of the eccentricity. We verify these calculations numerically for values of the masses close to that of the Jupiter/Sun system. The numerical calculations are not completely rigorous, because we ignore issues of round-off error and do not estimate the truncations, but they are not delicate at all by the standards of numerical analysis. (Standard tests indicate that we get 7 or 8 figures of accuracy where 1 would be enough.) The code of these verifications is available. We hope that full computer-assisted proofs will be obtained in the near future since there are packages (CAPD) designed for problems of this type.
Improved Hip-Based Individual Recognition Using Wearable Motion Recording Sensor
NASA Astrophysics Data System (ADS)
Gafurov, Davrondzhon; Bours, Patrick
In today's society the demand for reliable verification of a user's identity is increasing. Although biometric technologies based on fingerprint or iris can provide accurate and reliable recognition performance, they are inconvenient for periodic or frequent re-verification. In this paper we propose a hip-based user recognition method which can be suitable for implicit and periodic re-verification of identity. In our approach we use a wearable accelerometer sensor attached to the hip of the person, and the measured hip motion signal is then analysed for identity verification purposes. The main analysis steps consist of detecting gait cycles in the signal and matching two sets of detected gait cycles. Evaluating the approach on a hip data set consisting of 400 gait sequences (samples) from 100 subjects, we obtained an equal error rate (EER) of 7.5% and a rank-1 identification rate of 81.4%. These numbers are improvements of 37.5% and 11.2%, respectively, over the previous study using the same data set.
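The abstract does not spell out the cycle-detection or matching algorithm, so the sketch below is only a minimal, hedged illustration of the general idea: segment the acceleration signal into gait cycles at prominent peaks, resample each cycle to a fixed length, and score a probe against enrolled cycles by a nearest-cycle distance. The peak rule, window sizes and distance measure are assumptions, not the paper's method.

```python
import numpy as np

def detect_cycles(accel, min_gap=30):
    """Split an acceleration signal into gait cycles at local maxima.

    A cycle boundary is assumed at peaks separated by at least `min_gap`
    samples; the paper's actual detection is more elaborate.
    """
    peaks = []
    for i in range(1, len(accel) - 1):
        if accel[i] > accel[i - 1] and accel[i] >= accel[i + 1]:
            if not peaks or i - peaks[-1] >= min_gap:
                peaks.append(i)
    return [accel[a:b] for a, b in zip(peaks[:-1], peaks[1:])]

def resample(cycle, length=100):
    """Linearly resample a cycle to a fixed length for comparison."""
    x_old = np.linspace(0.0, 1.0, len(cycle))
    x_new = np.linspace(0.0, 1.0, length)
    return np.interp(x_new, x_old, cycle)

def match_score(probe_cycles, enrolled_cycles):
    """Average minimum distance between probe and enrolled cycles
    (lower means a more likely genuine identity claim)."""
    enrolled = [resample(c) for c in enrolled_cycles]
    dists = []
    for c in probe_cycles:
        p = resample(c)
        dists.append(min(np.linalg.norm(p - e) for e in enrolled))
    return float(np.mean(dists))

# Toy usage with synthetic hip-acceleration-like signals.
t = np.arange(0, 10, 0.01)
enrol = np.sin(2 * np.pi * t) + 0.05 * np.random.default_rng(1).standard_normal(len(t))
probe = np.sin(2 * np.pi * t + 0.1) + 0.05 * np.random.default_rng(2).standard_normal(len(t))
score = match_score(detect_cycles(probe), detect_cycles(enrol))
print(f"dissimilarity score: {score:.2f}")  # accept if below a tuned threshold
```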
Formal Verification of a Power Controller Using the Real-Time Model Checker UPPAAL
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Larsen, Kim Guldstrand; Skou, Arne
1999-01-01
A real-time system for power-down control in audio/video components is modeled and verified using the real-time model checker UPPAAL. The system is supposed to reside in an audio/video component and control (read from and write to) links to neighbor audio/video components such as TV, VCR and remote-control. In particular, the system is responsible for the powering up and down of the component in between the arrival of data, and in order to do so in a safe way without loss of data, it is essential that no link interrupts are lost. Hence, a component system is a multitasking system with hard real-time requirements, and we present techniques for modeling time consumption in such a multitasked, prioritized system. The work has been carried out in a collaboration between Aalborg University and the audio/video company B&O. By modeling the system, 3 design errors were identified and corrected, and the following verification confirmed the validity of the design but also revealed the necessity for an upper limit of the interrupt frequency. The resulting design has been implemented and it is going to be incorporated as part of a new product line.
An Efficient Location Verification Scheme for Static Wireless Sensor Networks.
Kim, In-Hwan; Kim, Bo-Sung; Song, JooSeok
2017-01-24
In wireless sensor networks (WSNs), the accuracy of location information is vital to support many interesting applications. Unfortunately, sensors have difficulty in estimating their location when malicious sensors attack the location estimation process. Even though secure localization schemes have been proposed to protect the location estimation process from attacks, they are not enough to eliminate wrong location estimations in some situations. Location verification can be the solution in such situations, or serve as a second line of defense. The problem with most location verification schemes is the explicit involvement of many sensors in the verification process and requirements such as special hardware, a dedicated verifier and a trusted third party, which cause more communication and computation overhead. In this paper, we propose an efficient location verification scheme for static WSNs called mutually-shared region-based location verification (MSRLV), which reduces those overheads by utilizing the implicit involvement of sensors and eliminating several requirements. In order to achieve this, we use the mutually-shared region between location claimant and verifier for the location verification. The analysis shows that MSRLV reduces communication overhead by 77% and computation overhead by 92% on average, when compared with other location verification schemes, in a single sensor verification. In addition, simulation results for the verification of the whole network show that MSRLV can detect over 90% of malicious sensors when sensors in the network have five or more neighbors.
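MSRLV's exact verification test is not given in the abstract, so the following is only a hedged geometric illustration of the underlying idea: a position claim is plausible only if it falls inside the region covered by both the claimant's and the verifier's radio range. The circular range model and the common radius are assumptions.

```python
import math

def in_shared_region(claim, claimant, verifier, radio_range):
    """Return True if the claimed position lies in the region covered by
    both the claimant's and the verifier's radio range (circle intersection).

    claim, claimant, verifier : (x, y) coordinates in metres.
    radio_range               : common sensing/communication radius in metres.
    """
    d_claimant = math.dist(claim, claimant)
    d_verifier = math.dist(claim, verifier)
    return d_claimant <= radio_range and d_verifier <= radio_range

# Toy usage: verifier at the origin, claimant claims to be 40 m away.
print(in_shared_region(claim=(40.0, 0.0), claimant=(42.0, 1.0),
                       verifier=(0.0, 0.0), radio_range=50.0))   # True
print(in_shared_region(claim=(120.0, 0.0), claimant=(42.0, 1.0),
                       verifier=(0.0, 0.0), radio_range=50.0))   # False
```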
Simulation-based MDP verification for leading-edge masks
NASA Astrophysics Data System (ADS)
Su, Bo; Syrel, Oleg; Pomerantsev, Michael; Hagiwara, Kazuyuki; Pearman, Ryan; Pang, Leo; Fujimara, Aki
2017-07-01
For IC design starts below the 20nm technology node, the assist features on photomasks shrink well below 60nm and the printed patterns of those features on masks written by VSB eBeam writers start to show a large deviation from the mask designs. Traditional geometry-based fracturing starts to show large errors for those small features. As a result, other mask data preparation (MDP) methods have become available and adopted, such as rule-based Mask Process Correction (MPC), model-based MPC and eventually model-based MDP. The new MDP methods may place shot edges slightly differently from target to compensate for mask process effects, so that the final patterns on a mask are much closer to the design (which can be viewed as the ideal mask), especially for those assist features. Such an alteration generally produces better masks that are closer to the intended mask design. Traditional XOR-based MDP verification cannot detect problems caused by eBeam effects. Much like model-based OPC verification which became a necessity for OPC a decade ago, we see the same trend in MDP today. Simulation-based MDP verification solution requires a GPU-accelerated computational geometry engine with simulation capabilities. To have a meaningful simulation-based mask check, a good mask process model is needed. The TrueModel® system is a field tested physical mask model developed by D2S. The GPU-accelerated D2S Computational Design Platform (CDP) is used to run simulation-based mask check, as well as model-based MDP. In addition to simulation-based checks such as mask EPE or dose margin, geometry-based rules are also available to detect quality issues such as slivers or CD splits. Dose margin related hotspots can also be detected by setting a correct detection threshold. In this paper, we will demonstrate GPU-acceleration for geometry processing, and give examples of mask check results and performance data. GPU-acceleration is necessary to make simulation-based mask MDP verification acceptable.
Shafi, Shahid; Barnes, Sunni; Ahn, Chul; Hemilla, Mark R; Cryer, H Gill; Nathens, Avery; Neal, Melanie; Fildes, John
2016-10-01
The Trauma Quality Improvement Project of the American College of Surgeons (ACS) has demonstrated variations in trauma center outcomes despite similar verification status. The purpose of this study was to identify structural characteristics of trauma centers that affect patient outcomes. Trauma registry data on 361,187 patients treated at 222 ACS-verified Level I and Level II trauma centers were obtained from the National Trauma Data Bank of the ACS. These data were used to estimate each center's observed-to-expected (O-E) mortality ratio with 95% confidence intervals using multivariate logistic regression analysis. De-identified data on structural characteristics of these trauma centers were obtained from the ACS Verification Review Committee. Centers in the lowest quartile of mortality based on O-E ratio (n = 56) were compared to the rest (n = 166) using Classification and Regression Tree (CART) analysis to identify institutional characteristics independently associated with high-performing centers. Of the 72 structural characteristics explored, only 3 were independently associated with high-performing centers: annual patient visits to the emergency department of fewer than 61,000; proportion of patients on Medicare greater than 20%; and continuing medical education for the emergency department physician liaison to the trauma program ranging from 55 to 113 hours annually. Each 5% increase in O-E mortality ratio was associated with an increase in total length of stay of one day (r = 0.25; p < 0.001). Very few structural characteristics of ACS-verified trauma centers are associated with risk-adjusted mortality. Thus, variations in patient outcomes across trauma centers are likely related to variations in clinical practices. Therapeutic study, level III.
Method for secure electronic voting system: face recognition based approach
NASA Astrophysics Data System (ADS)
Alim, M. Affan; Baig, Misbah M.; Mehboob, Shahzain; Naseem, Imran
2017-06-01
In this paper, we propose a framework for a low-cost secure electronic voting system based on face recognition. Essentially, Local Binary Patterns (LBP) are used for face feature characterization in texture format, followed by a chi-square measure for image classification. Two parallel systems are developed, based on smart phone and web applications, for the face learning and verification modules. The proposed system has two-tier security, using a person ID followed by face verification. A class-specific threshold controls the security level of face verification. Our system is evaluated on three standard databases and one real home-based database and achieves satisfactory recognition accuracies. Consequently, our proposed system provides a secure, hassle-free voting system that is less intrusive compared with other biometrics.
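As a hedged, minimal illustration of the kind of pipeline the abstract describes (basic 8-neighbour LBP codes compared with a chi-square distance), the sketch below omits everything the paper adds on top (spatial blocks, class-specific thresholds, the mobile/web split); the image sizes and the acceptance rule are assumptions.

```python
import numpy as np

def lbp_histogram(gray):
    """Basic 8-neighbour LBP codes over a grayscale image, returned as a
    normalised 256-bin histogram (no 'uniform' mapping, no spatial blocks)."""
    g = gray.astype(np.float32)
    center = g[1:-1, 1:-1]
    neighbours = [g[0:-2, 0:-2], g[0:-2, 1:-1], g[0:-2, 2:],
                  g[1:-1, 2:],   g[2:,   2:],   g[2:,   1:-1],
                  g[2:,   0:-2], g[1:-1, 0:-2]]
    codes = np.zeros_like(center, dtype=np.int32)
    for bit, n in enumerate(neighbours):
        codes |= ((n >= center).astype(np.int32) << bit)
    hist = np.bincount(codes.ravel(), minlength=256).astype(np.float64)
    return hist / hist.sum()

def chi_square(h1, h2, eps=1e-10):
    """Chi-square distance between two histograms (smaller = more similar)."""
    return float(0.5 * np.sum((h1 - h2) ** 2 / (h1 + h2 + eps)))

# Toy usage with random "face" crops; a real system would compare against
# an enrolled template and accept when the distance is below a threshold.
rng = np.random.default_rng(0)
enrolled = lbp_histogram(rng.integers(0, 256, size=(64, 64)))
probe = lbp_histogram(rng.integers(0, 256, size=(64, 64)))
print(f"chi-square distance: {chi_square(enrolled, probe):.4f}")
```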
Zimmerman, Lindsay P; Goel, Satyender; Sathar, Shazia; Gladfelter, Charon E; Onate, Alejandra; Kane, Lindsey L; Sital, Shelly; Phua, Jasmin; Davis, Paris; Margellos-Anast, Helen; Meltzer, David O; Polonsky, Tamar S; Shah, Raj C; Trick, William E; Ahmad, Faraz S; Kho, Abel N
2018-01-01
This article presents and describes our methods in developing a novel strategy for recruitment of underrepresented, community-based participants, for pragmatic research studies leveraging routinely collected electronic health record (EHR) data. We designed a new approach for recruiting eligible patients from the community, while also leveraging affiliated health systems to extract clinical data for community participants. The strategy involves methods for data collection, linkage, and tracking. In this workflow, potential participants are identified in the community and surveyed regarding eligibility. These data are then encrypted and deidentified via a hashing algorithm for linkage of the community participant back to a record at a clinical site. The linkage allows for eligibility verification and automated follow-up. Longitudinal data are collected by querying the EHR data and surveying the community participant directly. We discuss this strategy within the context of two national research projects, a clinical trial and an observational cohort study. The community-based recruitment strategy is a novel, low-touch, clinical trial enrollment method to engage a diverse set of participants. Direct outreach to community participants, while utilizing EHR data for clinical information and follow-up, allows for efficient recruitment and follow-up strategies. This new strategy for recruitment links data reported from community participants to clinical data in the EHR and allows for eligibility verification and automated follow-up. The workflow has the potential to improve recruitment efficiency and engage traditionally underrepresented individuals in research. Schattauer GmbH Stuttgart.
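The linkage step described above hashes encrypted, deidentified identifiers so that a community participant can be matched to a clinical record without exchanging raw identifiers. The sketch below is only an illustration of that general pattern using a salted SHA-256 token over normalized fields; the actual fields, hashing algorithm and matching rules of the project are not stated in the article, so everything named here is an assumption.

```python
import hashlib

def link_token(first, last, dob, site_salt):
    """Deterministic, deidentified linkage token from normalised identifiers.

    Both the community site and the health system would compute this from
    their own records, then match on the token instead of raw identifiers.
    """
    normalised = f"{first.strip().lower()}|{last.strip().lower()}|{dob}"
    return hashlib.sha256((site_salt + normalised).encode("utf-8")).hexdigest()

SALT = "shared-secret-salt"   # hypothetical salt agreed between the two sites

community_record = link_token("Maria", "Lopez ", "1958-03-14", SALT)
ehr_record = link_token(" maria", "lopez", "1958-03-14", SALT)
print(community_record == ehr_record)  # True: same person links despite formatting
```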
Yu, Shihui; Kielt, Matthew; Stegner, Andrew L; Kibiryeva, Nataliya; Bittel, Douglas C; Cooley, Linda D
2009-12-01
The American College of Medical Genetics guidelines for microarray analysis for constitutional cytogenetic abnormalities require abnormal or ambiguous results from microarray-based comparative genomic hybridization (aCGH) analysis be confirmed by an alternative method. We employed quantitative real-time polymerase chain reaction (qPCR) technology using SYBR Green I reagents for confirmation of 93 abnormal aCGH results (50 deletions and 43 duplications) and 54 parental samples. A novel qPCR protocol using DNA sequences coding for X-linked lethal diseases in males for designing reference primers was established. Of the 81 sets of test primers used for confirmation of 93 abnormal copy number variants (CNVs) in 80 patients, 71 sets worked after the initial primer design (88%), 9 sets were redesigned once, and 1 set twice because of poor amplification. Fifty-four parental samples were tested using 33 sets of test primers to follow up 34 CNVs in 30 patients. Nineteen CNVs were confirmed as inherited, 13 were negative in both parents, and 2 were inconclusive due to a negative result in a single parent. The qPCR assessment clarified aCGH results in two cases and corrected a fluorescence in situ hybridization result in one case. Our data illustrate that qPCR methodology using SYBR Green I reagents is accurate, highly sensitive, specific, rapid, and cost-effective for verification of chromosomal imbalances detected by aCGH in the clinical setting.
Cognitive and neural mechanisms of decision biases in recognition memory.
Windmann, Sabine; Urbach, Thomas P; Kutas, Marta
2002-08-01
In recognition memory tasks, stimuli can be classified as "old" either on the basis of accurate memory or a bias to respond "old", yet bias has received little attention in the cognitive neuroscience literature. Here we examined the pattern and timing of bias-related effects in event-related brain potentials (ERPs) to determine whether the bias is linked more to memory retrieval or to response verification processes. Participants were divided into a High Bias and a Low Bias group according to their bias to respond "old". These groups did not differ in recognition accuracy or in the ERP pattern to items that actually were old versus new (Objective Old/New Effect). However, when the old/new distinction was based on each subject's perspective, i.e. when items judged "old" were compared with those judged "new" (Subjective Old/New Effect), significant group differences were observed over prefrontal sites with a timing (300-500 ms poststimulus) more consistent with bias acting early on memory retrieval processes than on post-retrieval response verification processes. In the standard old/new effect (Hits vs Correct Rejections), these group differences were intermediate to those for the Objective and the Subjective comparisons, indicating that such comparisons are confounded by response bias. We propose that these biases are top-down controlled processes mediated by prefrontal cortex areas.
A New Proof of the Expected Frequency Spectrum under the Standard Neutral Model.
Hudson, Richard R
2015-01-01
The sample frequency spectrum is an informative and frequently employed approach for summarizing DNA variation data. Under the standard neutral model the expectation of the sample frequency spectrum has been derived by at least two distinct approaches. One relies on using results from diffusion approximations to the Wright-Fisher Model. The other is based on Pólya urn models that correspond to the standard coalescent model. A new proof of the expected frequency spectrum is presented here. It is a proof by induction and does not require diffusion results and does not require the somewhat complex sums and combinatorics of the derivations based on urn models.
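For reference, the result whose new proof is announced here is the classical expected frequency spectrum under the standard neutral model; the form below is the textbook statement (with θ the population-scaled mutation rate) and is quoted from the standard coalescent literature rather than from the abstract itself.

```latex
% Expected number of segregating sites at which the derived allele is
% carried by exactly i of the n sampled sequences (1 <= i <= n-1):
\[
  \mathbb{E}[\xi_i] = \frac{\theta}{i}, \qquad \theta = 4N\mu,
\]
% so that the expected total number of segregating sites is
\[
  \mathbb{E}[S] = \theta \sum_{i=1}^{n-1} \frac{1}{i}.
\]
```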
Reachability analysis of real-time systems using time Petri nets.
Wang, J; Deng, Y; Xu, G
2000-01-01
Time Petri nets (TPNs) are a popular Petri net model for specification and verification of real-time systems. A fundamental and most widely applied method for analyzing Petri nets is reachability analysis. The existing technique for reachability analysis of TPNs, however, is not suitable for timing property verification because one cannot derive end-to-end delay in task execution, an important issue for time-critical systems, from the reachability tree constructed using the technique. In this paper, we present a new reachability based analysis technique for TPNs for timing property analysis and verification that effectively addresses the problem. Our technique is based on a concept called clock-stamped state class (CS-class). With the reachability tree generated based on CS-classes, we can directly compute the end-to-end time delay in task execution. Moreover, a CS-class can be uniquely mapped to a traditional state class based on which the conventional reachability tree is constructed. Therefore, our CS-class-based analysis technique is more general than the existing technique. We show how to apply this technique to timing property verification of the TPN model of a command and control (C2) system.
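To make the clock-stamp idea concrete, here is a toy, hedged sketch (not the paper's algorithm): each reachable class carries the earliest and latest global times at which it can be entered, so the end-to-end delay of a task chain can be read directly from the stamps of the final class. The sketch only handles a purely sequential firing sequence with static firing intervals; full TPN state-class enumeration with concurrency is considerably more involved.

```python
from dataclasses import dataclass

@dataclass
class CSClass:
    """A (heavily reduced) clock-stamped class: a marking plus the earliest
    and latest global times at which that marking can be reached."""
    marking: tuple
    earliest: float
    latest: float

def fire_sequence(initial_marking, sequence):
    """Accumulate clock stamps along one sequential firing sequence.

    `sequence` is a list of (new_marking, (eft, lft)) pairs, where eft/lft are
    a transition's static earliest/latest firing times.  The end-to-end delay
    of the sequence is bounded by the final class's clock stamps.
    """
    cls = CSClass(tuple(initial_marking), 0.0, 0.0)
    trace = [cls]
    for new_marking, (eft, lft) in sequence:
        cls = CSClass(tuple(new_marking), cls.earliest + eft, cls.latest + lft)
        trace.append(cls)
    return trace

# Toy task chain: sense -> compute -> actuate, each with a firing interval.
trace = fire_sequence((1, 0, 0, 0), [((0, 1, 0, 0), (1.0, 2.0)),
                                     ((0, 0, 1, 0), (3.0, 5.0)),
                                     ((0, 0, 0, 1), (0.5, 1.0))])
print(f"end-to-end delay in [{trace[-1].earliest}, {trace[-1].latest}] time units")
```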
Jin, Peng; van der Horst, Astrid; de Jong, Rianne; van Hooft, Jeanin E; Kamphuis, Martijn; van Wieringen, Niek; Machiels, Melanie; Bel, Arjan; Hulshof, Maarten C C M; Alderliesten, Tanja
2015-12-01
The aim of this study was to quantify interfractional esophageal tumor position variation using markers and investigate the use of markers for setup verification. Sixty-five markers placed in the tumor volumes of 24 esophageal cancer patients were identified in computed tomography (CT) and follow-up cone-beam CT. For each patient we calculated pairwise distances between markers over time to evaluate geometric tumor volume variation. We then quantified marker displacements relative to bony anatomy and estimated the variation of systematic (Σ) and random errors (σ). During bony anatomy-based setup verification, we visually inspected whether the markers were inside the planning target volume (PTV) and attempted marker-based registration. Minor time trends with substantial fluctuations in pairwise distances implied tissue deformation. Overall, Σ(σ) in the left-right/cranial-caudal/anterior-posterior direction was 2.9(2.4)/4.1(2.4)/2.2(1.8) mm; for the proximal stomach, it was 5.4(4.3)/4.9(3.2)/1.9(2.4) mm. After bony anatomy-based setup correction, all markers were inside the PTV. However, due to large tissue deformation, marker-based registration was not feasible. Generally, the interfractional position variation of esophageal tumors is more pronounced in the cranial-caudal direction and in the proximal stomach. Currently, marker-based setup verification is not feasible for clinical routine use, but markers can facilitate the setup verification by inspecting whether the PTV covers the tumor volume adequately. Copyright © 2015 Elsevier Ireland Ltd. All rights reserved.
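The abstract quotes group systematic (Σ) and random (σ) errors without giving formulas; a common convention (assumed here, not confirmed by the paper) is to take Σ as the standard deviation of the per-patient mean displacements and σ as the root mean square of the per-patient standard deviations. The sketch below follows that convention for a single direction.

```python
import numpy as np

def population_errors(displacements_per_patient):
    """Estimate group systematic error (Sigma) and random error (sigma)
    from per-fraction marker displacements, one array per patient, in mm.

    Sigma = SD of the per-patient mean displacements.
    sigma = root mean square of the per-patient SDs.
    """
    means = np.array([np.mean(d) for d in displacements_per_patient])
    sds = np.array([np.std(d, ddof=1) for d in displacements_per_patient])
    sigma_sys = float(np.std(means, ddof=1))
    sigma_rand = float(np.sqrt(np.mean(sds ** 2)))
    return sigma_sys, sigma_rand

# Toy usage: cranial-caudal displacements (mm) for three patients.
patients = [np.array([2.0, 4.5, 3.0, 5.0]),
            np.array([-1.0, 0.5, -2.0, 0.0]),
            np.array([6.0, 4.0, 7.5, 5.5])]
sigma_sys, sigma_rand = population_errors(patients)
print(f"Sigma = {sigma_sys:.1f} mm, sigma = {sigma_rand:.1f} mm")
```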
Offline signature verification using convolution Siamese network
NASA Astrophysics Data System (ADS)
Xing, Zi-Jian; Yin, Fei; Wu, Yi-Chao; Liu, Cheng-Lin
2018-04-01
This paper presents an offline signature verification approach using a convolutional Siamese neural network. Unlike existing methods, which consider feature extraction and metric learning as two independent stages, we adopt a deep-learning-based framework which combines the two stages and can be trained end-to-end. The experimental results on two public offline databases (GPDSsynthetic and CEDAR) demonstrate the superiority of our method on the offline signature verification problem.
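The abstract gives no architecture or training details, so the following PyTorch sketch is only a hedged illustration of the general recipe: a shared convolutional embedding trained end-to-end with a contrastive loss over signature pairs. Layer sizes, the margin, the image resolution and the optimizer settings are all assumptions, not the paper's configuration.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class SignatureEmbedding(nn.Module):
    """Small shared CNN mapping a grayscale signature image to an embedding."""
    def __init__(self, dim=64):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.Conv2d(16, 32, 3, padding=1), nn.ReLU(), nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(4),
        )
        self.fc = nn.Linear(32 * 4 * 4, dim)

    def forward(self, x):
        return self.fc(self.features(x).flatten(1))

def contrastive_loss(e1, e2, label, margin=1.0):
    """label = 1 for a genuine pair, 0 for a forged/mismatched pair."""
    d = F.pairwise_distance(e1, e2)
    return (label * d.pow(2) + (1 - label) * F.relu(margin - d).pow(2)).mean()

# Toy training step on random "signature" pairs (1 x 64 x 128 images).
net = SignatureEmbedding()
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
x1, x2 = torch.randn(8, 1, 64, 128), torch.randn(8, 1, 64, 128)
labels = torch.randint(0, 2, (8,)).float()
loss = contrastive_loss(net(x1), net(x2), labels)
opt.zero_grad(); loss.backward(); opt.step()
print(f"contrastive loss: {loss.item():.3f}")
# At verification time, a questioned signature would be accepted if its
# embedding distance to the reference is below a tuned threshold.
```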
QPF verification using different radar-based analyses: a case study
NASA Astrophysics Data System (ADS)
Moré, J.; Sairouni, A.; Rigo, T.; Bravo, M.; Mercader, J.
2009-09-01
Verification of QPF in NWP models has always been challenging, not only in knowing which scores best quantify a particular skill of a model but also in choosing the most appropriate methodology when comparing forecasts with observations. On the one hand, an objective verification technique can provide conclusions that are not in agreement with those obtained by the "eyeball" method. Consequently, QPF can provide valuable information to forecasters in spite of having poor scores. On the other hand, there are difficulties in knowing the "truth", so different results can be achieved depending on the procedures used to obtain the precipitation analysis. The aim of this study is to show the importance of combining different precipitation analyses and verification methodologies to obtain a better knowledge of the skills of a forecasting system. In particular, a short-range precipitation forecasting system based on MM5 at 12 km coupled with LAPS is studied in a local convective precipitation event that took place in the NE Iberian Peninsula on 3 October 2008. For this purpose, a variety of verification methods (dichotomous, recalibration and object-oriented methods) are used to verify this case study. At the same time, different precipitation analyses, obtained by interpolating radar data using different techniques, are used in the verification process.
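The dichotomous verification mentioned here reduces each forecast-observation pair to a 2x2 contingency table for an exceedance event and summarizes it with standard scores. The sketch below implements those textbook definitions (POD, FAR, CSI, frequency bias); the 1 mm threshold and the toy data are assumptions, not values from the study.

```python
import numpy as np

def categorical_scores(forecast_mm, observed_mm, threshold=1.0):
    """Standard dichotomous verification scores from a 2x2 contingency table
    for the event 'precipitation >= threshold' at matched verification points."""
    f = np.asarray(forecast_mm) >= threshold
    o = np.asarray(observed_mm) >= threshold
    a = np.sum(f & o)          # hits
    b = np.sum(f & ~o)         # false alarms
    c = np.sum(~f & o)         # misses
    d = np.sum(~f & ~o)        # correct negatives (unused by these scores)
    pod = a / (a + c) if a + c else np.nan          # probability of detection
    far = b / (a + b) if a + b else np.nan          # false alarm ratio
    csi = a / (a + b + c) if a + b + c else np.nan  # critical success index
    bias = (a + b) / (a + c) if a + c else np.nan   # frequency bias
    return {"POD": pod, "FAR": far, "CSI": csi, "BIAS": bias}

# Toy usage with 24 h accumulations (mm) at ten verification points.
fcst = [0.0, 3.2, 10.5, 0.4, 7.0, 0.0, 1.2, 0.0, 25.0, 2.0]
obs  = [0.0, 1.5,  8.0, 2.0, 0.2, 0.0, 3.0, 0.0, 30.0, 0.0]
print(categorical_scores(fcst, obs, threshold=1.0))
```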
Non-disturbing optical power monitor for links in the visible spectrum using a polymer optical fibre
NASA Astrophysics Data System (ADS)
Ribeiro, Ricardo M.; Freitas, Taiane A. M. G.; Barbero, Andrés P. L.; Silva, Vinicius N. H.
2015-08-01
We describe a simple and inexpensive inline optical power monitor (OPMo) for polymer optical fibre (POF) links that are transmitting visible light carriers. The OPMo is non-invasive in the sense that it does not tap any guided light from the fibre core; rather, it collects and detects the spontaneous side-scattered light. Indeed, the OPMo indicates whether a POF transmission link has dark or live status and measures the average optical power level of the propagating signals without disconnecting the fibre link. This paper demonstrates the proof-of-principle of the device for one wavelength at a time, selected from a set of previously calibrated wavelength channels which have been found in the 45 dB dynamic range, with 50 dBm sensitivity or insensitivity by the use or non-use of a mode scrambler. Our findings are very promising milestones for further OPMo development towards the marketplace.
A new paper-based platform technology for point-of-care diagnostics.
Gerbers, Roman; Foellscher, Wilke; Chen, Hong; Anagnostopoulos, Constantine; Faghri, Mohammad
2014-10-21
Currently, the Lateral flow Immunoassays (LFIAs) are not able to perform complex multi-step immunodetection tests because of their inability to introduce multiple reagents in a controlled manner to the detection area autonomously. In this research, a point-of-care (POC) paper-based lateral flow immunosensor was developed incorporating a novel microfluidic valve technology. Layers of paper and tape were used to create a three-dimensional structure to form the fluidic network. Unlike the existing LFIAs, multiple directional valves are embedded in the test strip layers to control the order and the timing of mixing for the sample and multiple reagents. In this paper, we report a four-valve device which autonomously directs three different fluids to flow sequentially over the detection area. As proof of concept, a three-step alkaline phosphatase based Enzyme-Linked ImmunoSorbent Assay (ELISA) protocol with Rabbit IgG as the model analyte was conducted to prove the suitability of the device for immunoassays. The detection limit of about 4.8 fm was obtained.
NASA Astrophysics Data System (ADS)
Han, Ke-Zhen; Feng, Jian; Cui, Xiaohong
2017-10-01
This paper considers the fault-tolerant optimised tracking control (FTOTC) problem for an unknown discrete-time linear system. A research scheme is proposed on the basis of data-based parity space identification, reinforcement learning and residual compensation techniques. The main characteristic of this research scheme lies in the parity-space-identification-based simultaneous tracking control and residual compensation. The specific technical approach consists of four main parts: applying a subspace-aided method to design an observer-based residual generator; using a reinforcement Q-learning approach to solve for the optimised tracking control policy; relying on robust H∞ theory to achieve noise attenuation; and adopting fault estimation triggered by the residual generator to perform fault compensation. To clarify the design and implementation procedures, an integrated algorithm is further constructed to link these four functional units. Detailed analysis and proofs are subsequently given to explain the guaranteed FTOTC performance of the proposed conclusions. Finally, a case simulation is provided to verify its effectiveness.
Kandel, Benjamin M; Wang, Danny J J; Gee, James C; Avants, Brian B
2014-01-01
Although much attention has recently been focused on single-subject functional networks, using methods such as resting-state functional MRI, methods for constructing single-subject structural networks are in their infancy. Single-subject cortical networks aim to describe the self-similarity across the cortical structure, possibly signifying convergent developmental pathways. Previous methods for constructing single-subject cortical networks have used patch-based correlations and distance metrics based on curvature and thickness. We present here a method for constructing similarity-based cortical structural networks that utilizes a rotation-invariant representation of structure. The resulting graph metrics are closely linked to age and indicate an increasing degree of closeness throughout development in nearly all brain regions, perhaps corresponding to a more regular structure as the brain matures. The derived graph metrics demonstrate a four-fold increase in power for detecting age as compared to cortical thickness. This proof of concept study indicates that the proposed metric may be useful in identifying biologically relevant cortical patterns.
Bayesian Estimation of Combined Accuracy for Tests with Verification Bias
Broemeling, Lyle D.
2011-01-01
This presentation will emphasize the estimation of the combined accuracy of two or more tests when verification bias is present. Verification bias occurs when some of the subjects are not subject to the gold standard. The approach is Bayesian where the estimation of test accuracy is based on the posterior distribution of the relevant parameter. Accuracy of two combined binary tests is estimated employing either “believe the positive” or “believe the negative” rule, then the true and false positive fractions for each rule are computed for two tests. In order to perform the analysis, the missing at random assumption is imposed, and an interesting example is provided by estimating the combined accuracy of CT and MRI to diagnose lung cancer. The Bayesian approach is extended to two ordinal tests when verification bias is present, and the accuracy of the combined tests is based on the ROC area of the risk function. An example involving mammography with two readers with extreme verification bias illustrates the estimation of the combined test accuracy for ordinal tests. PMID:26859487
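As a concrete reminder of the two combination rules named in the abstract, the sketch below computes the combined true- and false-positive fractions under the "believe the positive" and "believe the negative" rules, assuming the two tests are conditionally independent given disease status; that independence assumption and the example operating points are illustrative only and are not part of the Bayesian analysis described.

```python
def believe_the_positive(tpf1, fpf1, tpf2, fpf2):
    """Combined accuracy when a subject is called positive if EITHER test is
    positive, assuming the two tests are conditionally independent."""
    tpf = tpf1 + tpf2 - tpf1 * tpf2
    fpf = fpf1 + fpf2 - fpf1 * fpf2
    return tpf, fpf

def believe_the_negative(tpf1, fpf1, tpf2, fpf2):
    """Combined accuracy when a subject is called positive only if BOTH tests
    are positive, under the same independence assumption."""
    return tpf1 * tpf2, fpf1 * fpf2

# Toy usage with hypothetical CT and MRI operating points.
print(believe_the_positive(0.85, 0.10, 0.80, 0.15))  # higher TPF, higher FPF
print(believe_the_negative(0.85, 0.10, 0.80, 0.15))  # lower TPF, lower FPF
```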
Tichit, Paul-Henri; Burokur, Shah Nawaz; Qiu, Cheng-Wei; de Lustrac, André
2013-09-27
It has long been conjectured that isotropic radiation by a simple coherent source is impossible due to changes in polarization. Though hypothetical, the isotropic source is usually taken as the reference for determining a radiator's gain and directivity. Here, we demonstrate both theoretically and experimentally that an isotropic radiator can be made of a simple and finite source surrounded by electric-field-driven LC resonator metamaterials designed by space manipulation. As a proof-of-concept demonstration, we show the first isotropic source with omnidirectional radiation from a dipole source (applicable to all distributed sources), which can open up several possibilities in axion electrodynamics, optical illusion, novel transformation-optic devices, wireless communication, and antenna engineering. Owing to the electric-field-driven LC resonator realization scheme, this principle can be readily applied to higher frequency regimes where magnetism is usually not present.
Large-Scale Cryogen Systems and Test Facilities
NASA Technical Reports Server (NTRS)
Johnson, R. G.; Sass, J. P.; Hatfield, W. H.
2007-01-01
NASA has completed initial construction and verification testing of the Integrated Systems Test Facility (ISTF) Cryogenic Testbed. The ISTF is located at Complex 20 at Cape Canaveral Air Force Station, Florida. The remote and secure location is ideally suited for the following functions: (1) development testing of advanced cryogenic component technologies, (2) development testing of concepts and processes for entire ground support systems designed for servicing large launch vehicles, and (3) commercial sector testing of cryogenic- and energy-related products and systems. The ISTF Cryogenic Testbed consists of modular fluid distribution piping and storage tanks for liquid oxygen/nitrogen (56,000 gal) and liquid hydrogen (66,000 gal). Storage tanks for liquid methane (41,000 gal) and Rocket Propellant 1 (37,000 gal) are also specified for the facility. A state-of-the-art blast proof test command and control center provides capability for remote operation, video surveillance, and data recording for all test areas.
A Framework for RFID Survivability Requirement Analysis and Specification
NASA Astrophysics Data System (ADS)
Zuo, Yanjun; Pimple, Malvika; Lande, Suhas
Many industries are becoming dependent on Radio Frequency Identification (RFID) technology for inventory management and asset tracking. The data collected about tagged objects through RFID are used in various high-level business operations. The RFID system should hence be highly available, reliable, dependable and secure. In addition, this system should be able to resist attacks and perform recovery in case of security incidents. Together these requirements give rise to the notion of a survivable RFID system. The main goal of this paper is to analyze and specify the requirements for an RFID system to become survivable. These requirements, if utilized, can assist the system in resisting devastating attacks and recovering quickly from damage. This paper proposes techniques and approaches for RFID survivability requirements analysis and specification. From the perspective of system acquisition and engineering, survivability requirements analysis is the important first step in survivability specification, compliance formulation, and proof verification.
Verification of road databases using multiple road models
NASA Astrophysics Data System (ADS)
Ziems, Marcel; Rottensteiner, Franz; Heipke, Christian
2017-08-01
In this paper a new approach for automatic road database verification based on remote sensing images is presented. In contrast to existing methods, the applicability of the new approach is not restricted to specific road types, context areas or geographic regions. This is achieved by combining several state-of-the-art road detection and road verification approaches that work well under different circumstances. Each one serves as an independent module representing a unique road model and a specific processing strategy. All modules provide independent solutions for the verification problem of each road object stored in the database in the form of two probability distributions, the first one for the state of a database object (correct or incorrect), and a second one for the state of the underlying road model (applicable or not applicable). In accordance with the Dempster-Shafer theory, both distributions are mapped to a new state space comprising the classes correct, incorrect and unknown. Statistical reasoning is applied to obtain the optimal state of a road object. A comparison with state-of-the-art road detection approaches using benchmark datasets shows that in general the proposed approach provides results with higher completeness. Additional experiments reveal that, based on the proposed method, a highly reliable semi-automatic approach for road database verification can be designed.
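To illustrate the Dempster-Shafer combination step, the sketch below applies Dempster's rule over the frame {correct, incorrect}, with "unknown" represented as mass assigned to the whole frame, and fuses two module outputs into a combined belief assignment. The specific mass values are invented for illustration; the actual system combines more modules and treats model applicability explicitly.

```python
def dempster_combine(m1, m2):
    """Dempster's rule of combination on the frame {'correct', 'incorrect'}.

    Each mass function is a dict over the focal sets 'correct', 'incorrect'
    and 'unknown' (the whole frame); the masses must sum to 1.
    """
    C, I, U = "correct", "incorrect", "unknown"
    # Conflict: one module says correct while the other says incorrect.
    k = m1[C] * m2[I] + m1[I] * m2[C]
    norm = 1.0 - k
    return {
        C: (m1[C] * m2[C] + m1[C] * m2[U] + m1[U] * m2[C]) / norm,
        I: (m1[I] * m2[I] + m1[I] * m2[U] + m1[U] * m2[I]) / norm,
        U: (m1[U] * m2[U]) / norm,
    }

# Toy usage: two road-verification modules assessing one database object.
module_a = {"correct": 0.6, "incorrect": 0.1, "unknown": 0.3}  # model applicable
module_b = {"correct": 0.2, "incorrect": 0.2, "unknown": 0.6}  # model barely applicable
print(dempster_combine(module_a, module_b))
```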
Building Knowledge Graphs for NASA's Earth Science Enterprise
NASA Astrophysics Data System (ADS)
Zhang, J.; Lee, T. J.; Ramachandran, R.; Shi, R.; Bao, Q.; Gatlin, P. N.; Weigel, A. M.; Maskey, M.; Miller, J. J.
2016-12-01
Inspired by Google Knowledge Graph, we have been building a prototype Knowledge Graph for Earth scientists, connecting information and data in NASA's Earth science enterprise. Our primary goal is to advance the state-of-the-art NASA knowledge extraction capability by going beyond traditional catalog search and linking different distributed information (such as data, publications, services, tools and people). This will enable a more efficient pathway to knowledge discovery. While Google Knowledge Graph provides impressive semantic-search and aggregation capabilities, it is limited to search topics for the general public. We use a similar knowledge graph approach to semantically link information gathered from a wide variety of sources within the NASA Earth Science enterprise. Our prototype serves as a proof of concept on the viability of building an operational "knowledge base" system for NASA Earth science. Information is pulled from structured sources (such as the NASA CMR catalog, GCMD, and Climate and Forecast Conventions) and unstructured sources (such as research papers). Leveraging modern techniques of machine learning, information retrieval, and deep learning, we provide an integrated data mining and information discovery environment to help Earth scientists use the best data, tools, methodologies, and models available to answer a hypothesis. Our knowledge graph would be able to answer questions like: Which articles discuss topics investigating similar hypotheses? How have these methods been tested for accuracy? Which approaches have been highly cited within the scientific community? What variables were used for this method and what datasets were used to represent them? What processing was necessary to use this data? These questions then lead researchers and citizen scientists to investigate the sources where data can be found, available user guides, information on how the data was acquired, and available tools and models to use with this data. As a proof of concept, we focus on a well-defined domain - Hurricane Science - linking research articles and their findings, data, people and tools/services. Modern information retrieval, natural language processing, machine learning and deep learning techniques are applied to build the knowledge network.
Formation of target-specific binding sites in enzymes: solid-phase molecular imprinting of HRP.
Czulak, J; Guerreiro, A; Metran, K; Canfarotta, F; Goddard, A; Cowan, R H; Trochimczuk, A W; Piletsky, S
2016-06-07
Here we introduce a new concept for synthesising molecularly imprinted nanoparticles by using proteins as macro-functional monomers. For a proof-of-concept, a model enzyme (HRP) was cross-linked using glutaraldehyde in the presence of glass beads (solid-phase) bearing immobilized templates such as vancomycin and ampicillin. The cross-linking process links together proteins and protein chains, which in the presence of templates leads to the formation of permanent target-specific recognition sites without adverse effects on the enzymatic activity. Unlike complex protein engineering approaches commonly employed to generate affinity proteins, the method proposed can be used to produce protein-based ligands in a short time period using native protein molecules. These affinity materials are potentially useful tools especially for assays since they combine the catalytic properties of enzymes (for signaling) and molecular recognition properties of antibodies. We demonstrate this concept in an ELISA-format assay where HRP imprinted with vancomycin and ampicillin replaced traditional enzyme-antibody conjugates for selective detection of templates at micromolar concentrations. This approach can potentially provide a fast alternative to raising antibodies for targets that do not require high assay sensitivities; it can also find uses as a biochemical research tool, as a possible replacement for immunoperoxidase-conjugates.
Control, communication and monitoring of intravaginal drug delivery in dairy cows.
Cross, Peter S; Künnemeyer, Rainer; Bunt, Craig R; Carnegie, Dale A; Rathbone, Michael J
2004-09-10
We present the design of an electronically controlled drug delivery system. The intravaginally located device is a low-invasive platform that can measure and react inside the cow vagina while providing external control and monitoring ability. The electronics, manufactured from off-the-shelf components, occupy 16 mL of a Theratron syringe. A microcontroller reads and logs sensor data and controls a gas cell. The generated gas pressure propels the syringe piston and releases the formulation. A two-way radio link allows communication between other devices or a base station. Proof-of-principle experiments confirm variable-rate, arbitrary-profile drug delivery qualified by internal sensors. A total volume of 30 mL was dispensed over a 7-day period with a volume error of +/- 1 mL, or +/- 7% for larger volumes. Delivery was controlled or overridden via the wireless link, and proximity to other devices was detected and recorded. The results suggest that temperature and activity sensing, or social grouping determined via proximity, can be used to detect oestrus and trigger appropriate responses.
Removal of bisphenol A in canned liquid food by enzyme-based nanocomposites
NASA Astrophysics Data System (ADS)
Tapia-Orozco, Natalia; Meléndez-Saavedra, Fanny; Figueroa, Mario; Gimeno, Miquel; García-Arrazola, Roeb
2018-02-01
Laccase from Trametes versicolor was immobilized on TiO2 nanoparticles; the nanocomposites obtained were used for the removal of bisphenol A (BPA) in a liquid food matrix. To achieve high enzymatic stability over a wide pH range and at temperatures above 50 °C, the nanocomposite structures were prepared by both physical adsorption and covalent linking of the enzyme onto the nanometric support. All the nanocomposite structures retained 40% of their enzymatic activity after 60 days of storage. Proof-of-concept experiments in aqueous media using the nanocomposites resulted in >60% BPA removal after 48 h and showed that BPA was depleted within 5 days. The nanocomposites were tested in canned liquid food samples; the removal reached 93.3% within 24 h using the physically adsorbed laccase. For the covalently linked enzyme, the maximum BPA removal was 91.3%. The formation of BPA dimers and trimers was observed in all the assays. Food samples with sugar and protein contents above 3 and 4 mg mL-1 showed an inhibitory effect on the enzymatic activity.
Lightweight Biometric Sensing for Walker Classification Using Narrowband RF Links.
Liu, Tong; Liang, Zhuo-Qian
2017-12-05
This article proposes a lightweight biometric sensing system using ubiquitous narrowband radio frequency (RF) links for path-dependent walker classification. The fluctuating received signal strength (RSS) sequence generated by human motion is used for feature representation. To capture the most discriminative characteristics of individuals, a three-layer RF sensing network is organized for building multiple sampling links at the most common heights of the upper limbs, thighs, and lower legs. The optimal parameters of the sensing configuration, such as the height of link location and the number of fused links, are investigated to improve sensory data distinctions among subjects, and the experimental results suggest that synergistic sensing using multiple links can contribute to better performance. This is a new consideration in using RF links to build a biometric sensing system. In addition, two types of classification methods involving vector quantization (VQ) and hidden Markov models (HMMs) are developed and compared for closed-set walker recognition and verification. Experimental studies in indoor line-of-sight (LOS) and non-line-of-sight (NLOS) scenarios are conducted to validate the proposed method.
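The VQ branch of the classification can be illustrated with a minimal, hedged sketch: learn a per-subject codebook with plain k-means over fixed-length RSS windows, then classify a probe trace by whichever codebook yields the lowest average quantization distortion. The window length, codebook size and synthetic single-link traces below are assumptions; the paper fuses multiple links and also evaluates HMMs.

```python
import numpy as np

def kmeans_codebook(vectors, k=8, iters=20, seed=0):
    """Plain k-means codebook over RSS feature vectors for one subject."""
    rng = np.random.default_rng(seed)
    codebook = vectors[rng.choice(len(vectors), size=k, replace=False)]
    for _ in range(iters):
        d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
        labels = d.argmin(axis=1)
        for j in range(k):
            if np.any(labels == j):
                codebook[j] = vectors[labels == j].mean(axis=0)
    return codebook

def distortion(vectors, codebook):
    """Average distance of each vector to its nearest codeword."""
    d = np.linalg.norm(vectors[:, None, :] - codebook[None, :, :], axis=2)
    return float(d.min(axis=1).mean())

def windows(rss_sequence, width=10):
    """Slice an RSS sequence (per-link readings over time) into fixed windows."""
    seq = np.asarray(rss_sequence, dtype=float)
    n = (len(seq) // width) * width
    return seq[:n].reshape(-1, width)

# Toy usage: enrol two walkers from synthetic single-link RSS traces, then
# classify a probe trace by minimum quantization distortion.
rng = np.random.default_rng(1)
walker_a = -60 + 4 * np.sin(np.linspace(0, 40, 400)) + rng.normal(0, 1, 400)
walker_b = -55 + 2 * np.sin(np.linspace(0, 80, 400)) + rng.normal(0, 1, 400)
books = {"A": kmeans_codebook(windows(walker_a)), "B": kmeans_codebook(windows(walker_b))}
probe = -60 + 4 * np.sin(np.linspace(0, 40, 200)) + rng.normal(0, 1, 200)
scores = {who: distortion(windows(probe), book) for who, book in books.items()}
print(min(scores, key=scores.get), scores)
```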
2010-11-01
November 2010. Context: the power of today's computers allows us to study problems for which no analytical solution exists... [Fragmentary extraction from DRDC CORA TM 2010-249; recoverable headings: 4.8 Proof of Corollary, 4.9 Example; surviving fragments mention optimal capacities for links and that Figure 4 shows the probability of achieving the optimal...]
ERIC Educational Resources Information Center
Scripp, Lawrence; Paradis, Laura
2014-01-01
This article provides a window into Chicago Arts Partnerships in Education's (CAPE) Partnerships in Arts Integration Research (PAIR) project conducted in Chicago public schools (CPS) (pairresults.org), which statistically demonstrates how a three-year arts integration project can impact treatment versus control students in both academic and arts…
Nett Technologies’ BlueMAX 100 version A Urea-Based SCR System utilizes a zeolite catalyst coating on a cordierite honeycomb substrate for heavy-duty diesel nonroad engines for use with commercial ultra-low–sulfur diesel fuel. This environmental technology verification (ETV) repo...
Face verification system for Android mobile devices using histogram based features
NASA Astrophysics Data System (ADS)
Sato, Sho; Kobayashi, Kazuhiro; Chen, Qiu
2016-07-01
This paper proposes a face verification system that runs on Android mobile devices. In this system, a facial image is first captured by a built-in camera on the Android device, and then face detection is implemented using Haar-like features and the AdaBoost learning algorithm. The proposed system verifies the detected face using histogram-based features, which are generated by a binary Vector Quantization (VQ) histogram using DCT coefficients in low-frequency domains, as well as an Improved Local Binary Pattern (Improved LBP) histogram in the spatial domain. Verification results with the different types of histogram-based features are first obtained separately and then combined by weighted averaging. We evaluate our proposed algorithm by using the publicly available ORL database and facial images captured by an Android tablet.
Proof of Concept for an Approach to a Finer Resolution Inventory
Chris J. Cieszewski; Kim Iles; Roger C. Lowe; Michal Zasada
2005-01-01
This report presents a proof of concept for a statistical framework to develop a timely, accurate, and unbiased fiber supply assessment in the State of Georgia, U.S.A. The proposed approach is based on using various data sources and modeling techniques to calibrate satellite image-based statewide stand lists, which provide initial estimates for a State inventory on a...