Science.gov

Sample records for online signature verification

  1. Online Signature Verification Using Fourier Descriptors

    NASA Astrophysics Data System (ADS)

    Yanikoglu, Berrin; Kholmatov, Alisher

    2009-12-01

We present a novel online signature verification system based on the Fast Fourier Transform. The advantage of using the Fourier domain is the ability to represent an online signature compactly, using a fixed number of coefficients. The fixed-length representation leads to fast matching algorithms and is essential in certain applications. The challenge, on the other hand, is to find the right preprocessing steps and matching algorithm for this representation. We report on the effectiveness of the proposed method, along with the effects of the individual preprocessing and normalization steps, based on comprehensive tests over two public signature databases. We also propose using the pen-up duration information in identifying forgeries. The best results obtained on the SUSIG-Visual subcorpus and the MCYT-100 database are 6.2% and 12.1% error rates on skilled forgeries, respectively. Fusion of the proposed system with our state-of-the-art Dynamic Time Warping (DTW) system lowers the error rate of the DTW system by up to about 25%. While the current error rates are higher than state-of-the-art results for these databases, the system, as an approach using global features, possesses many advantages. Considering also the suggested improvements, the FFT system shows promise both as a stand-alone system and especially in combination with approaches based on local features.
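The fixed-length idea can be sketched in a few lines of NumPy. This is a generic Fourier-descriptor illustration under simplifying assumptions (complex pen trajectory, magnitude-only coefficients), not the authors' exact preprocessing or matching pipeline:

```python
import numpy as np

def fourier_descriptor(x, y, n_coeffs=20):
    # Treat the pen trajectory as a complex signal z = x + iy and map it to
    # a fixed number of low-frequency FFT magnitudes, so signatures of any
    # length become vectors of identical size.
    z = np.asarray(x, float) + 1j * np.asarray(y, float)
    z = z - z.mean()                              # translation invariance
    mags = np.abs(np.fft.fft(z))[1:n_coeffs + 1]  # drop DC, keep low freqs
    return mags / (mags[0] + 1e-12)               # scale invariance

# Two signatures of different lengths yield same-size descriptors,
# so a plain Euclidean distance gives a fast match score.
t1 = np.linspace(0, 2 * np.pi, 150, endpoint=False)
t2 = np.linspace(0, 2 * np.pi, 230, endpoint=False)
d1 = fourier_descriptor(np.cos(t1), np.sin(t1))
d2 = fourier_descriptor(2 * np.cos(t2), 2 * np.sin(t2))  # scaled copy
print(round(float(np.linalg.norm(d1 - d2)), 3))  # → 0.0
```

The fixed length is what makes the representation attractive for fast matching: no time alignment is needed at comparison time.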

  2. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel on-line signature verification system using the Singular Value Decomposition (SVD) for signature classification and verification is presented. The technique uses the SVD to find the r singular vectors that sense the maximal energy of the glove data matrix A; these vectors span the principal subspace, so the effective dimensionality of A can be reduced. Having modeled the data-glove signature through its r-dimensional principal subspace, signature authentication is performed by finding the angles between the different subspaces. The data glove is demonstrated to be an effective high-bandwidth data entry device for signature verification. This SVD-based signature verification technique is tested, and its performance shows it can recognize forged signatures with a false acceptance rate of less than 1.2%. PMID:18421114
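The subspace-angle matching described above can be illustrated as follows. The glove data here are synthetic stand-ins, and the r = 3 subspace dimension is an assumption made for the demo, not the paper's setting:

```python
import numpy as np

def principal_subspace(A, r=3):
    # r left singular vectors sensing the maximal energy of data matrix A
    # (rows = glove sensor channels, columns = time samples).
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :r]

def largest_principal_angle(U1, U2):
    # cos(theta_i) are the singular values of U1^T U2; the smallest cosine
    # gives the largest principal angle between the two subspaces.
    c = np.clip(np.linalg.svd(U1.T @ U2, compute_uv=False), -1.0, 1.0)
    return float(np.arccos(c.min()))

# Synthetic stand-in for glove data: genuine signatures share the same
# 3-dimensional "signing style" subspace, a forgery does not.
rng = np.random.default_rng(0)
style = rng.normal(size=(10, 3))
genuine1 = style @ rng.normal(size=(3, 200)) + 0.1 * rng.normal(size=(10, 200))
genuine2 = style @ rng.normal(size=(3, 200)) + 0.1 * rng.normal(size=(10, 200))
forged = rng.normal(size=(10, 3)) @ rng.normal(size=(3, 200))
Ur, Ug, Uf = map(principal_subspace, (genuine1, genuine2, forged))
print(largest_principal_angle(Ur, Ug) < largest_principal_angle(Ur, Uf))  # → True
```

A small angle between the reference and questioned subspaces indicates a genuine signature; a threshold on that angle gives the accept/reject decision.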

  3. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, cancelable approaches apply the same verification algorithm to transformed data as to raw (non-transformed) data, and in our previous work a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the results show that this modification improved performance. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.
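The score-combination step might look like the following minimal sketch; the strategy names and the 0.5 threshold are illustrative placeholders, not the paper's actual fusion rules or operating point:

```python
# Three classic fixed fusion rules for two similarity scores in [0, 1].
STRATEGIES = {
    "sum": lambda s1, s2: 0.5 * (s1 + s2),  # average (sum rule)
    "max": max,                             # most optimistic matcher wins
    "min": min,                             # most pessimistic matcher wins
}

def verify(s1, s2, strategy="sum", threshold=0.5):
    # Accept the signature if the fused similarity reaches the threshold.
    return STRATEGIES[strategy](s1, s2) >= threshold

print(verify(0.7, 0.4, "sum"))  # → True   (fused score 0.55)
print(verify(0.7, 0.4, "min"))  # → False  (fused score 0.40)
```

The choice of rule matters because the two scores come from differently transformed datasets and need not be equally reliable.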

  4. On-line signature verification method by Laplacian spectral analysis and dynamic time warping

    NASA Astrophysics Data System (ADS)

    Li, Changting; Peng, Liangrui; Liu, Changsong; Ding, Xiaoqing

    2013-12-01

As smartphones and touch screens become increasingly popular, on-line signature verification can serve as a means of personal identification for mobile computing. In this paper, a novel on-line signature verification method based on Laplacian Spectral Analysis (LSA) is presented, and an integration framework combining LSA and Dynamic Time Warping (DTW) based methods is proposed for practical application. In the LSA-based method, a Laplacian matrix is constructed by regarding the on-line signature as a graph, with the signature's writing-speed information encoded in the matrix. The eigenvalue spectrum of the Laplacian matrix is analyzed and used for signature verification. DTW is integrated at two stages: first, it provides stroke-matching results that help the LSA method construct a better corresponding graph; second, the on-line signature verification results of DTW are fused with those of the LSA method. Experimental results on a public signature database and on practical signature data from mobile phones prove the effectiveness of the proposed method.
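A minimal sketch of the LSA idea, assuming a path graph over consecutive pen samples with speed-weighted edges; the paper's graph construction (especially with DTW-guided stroke matching) is more elaborate:

```python
import numpy as np

def laplacian_spectrum(xs, ys, k=8):
    # Consecutive pen samples form a path graph whose edge weights encode
    # local writing speed; the k smallest eigenvalues of the graph
    # Laplacian L = D - W serve as the verification feature.
    pts = np.column_stack([xs, ys]).astype(float)
    n = len(pts)
    W = np.zeros((n, n))
    speeds = np.linalg.norm(np.diff(pts, axis=0), axis=1)  # unit time steps
    for i, v in enumerate(speeds):
        W[i, i + 1] = W[i + 1, i] = v
    lap = np.diag(W.sum(axis=1)) - W
    return np.linalg.eigvalsh(lap)[:k]   # ascending; first eigenvalue ~ 0

t = np.linspace(0, 4 * np.pi, 60)
spec = laplacian_spectrum(t, np.sin(t))  # toy "signature" trajectory
print(len(spec), abs(spec[0]) < 1e-9)  # → 8 True
```

Two signatures can then be compared by a distance between their eigenvalue spectra, which is invariant to where on the tablet the signature was written.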

  5. Online handwritten signature verification using neural network classifier based on principal component analysis.

    PubMed

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Yussof, Salman; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

One of the main difficulties in designing an online signature verification (OSV) system is finding the most distinctive features with high discriminating capabilities, particularly given the high variability inherent in genuine handwritten signatures, coupled with the possibility of skilled forgeries bearing close resemblance to their original counterparts. In this paper, we propose a systematic approach to online signature verification through the use of a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique applied to the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database, yielding a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%. PMID:25133227
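The PCA stage can be sketched as follows; the component indices chosen are purely illustrative of mixing leading and "usually discarded" trailing components, and the MLP classifier itself is omitted:

```python
import numpy as np

def pca_scores(X):
    # Project centred data onto ALL principal directions (not only the
    # top-variance ones) and also return the per-component variances.
    Xc = X - X.mean(axis=0)
    _, s, Vt = np.linalg.svd(Xc, full_matrices=False)
    return Xc @ Vt.T, s ** 2 / (len(X) - 1)

# The point the abstract makes: low-variance components can still carry
# discriminative information, so a feature subset may mix leading and
# trailing components before training the MLP (indices illustrative).
rng = np.random.default_rng(1)
X = rng.normal(size=(100, 8))        # stand-in for signature features
scores, variances = pca_scores(X)
subset = scores[:, [0, 1, 6, 7]]     # high- and low-variance mix
print(subset.shape, bool(np.all(np.diff(variances) <= 0)))  # → (100, 4) True
```

The selected columns would then be the MLP's input vector, one row per signature sample.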

  7. Signature verification by only single genuine sample in offline and online systems

    NASA Astrophysics Data System (ADS)

    Adamski, Marcin; Saeed, Khalid

    2016-06-01

The paper presents innovative methods and algorithms, with experimental results, for signature verification. It focuses mainly on applications where only one reference signature is available for comparison. Such a restriction is often encountered in practice and requires the selection of specific methods. In this context, both offline and online approaches are investigated. In the offline approach, the binary image of the signature is first thinned to obtain a one-pixel-wide line. A sampling technique is then applied to form the signature feature vector. The identification and verification processes are based on comparing the reference feature vector with the questioned samples using the Shape Context algorithm. In the case of online data, the system makes use of dynamic information such as trajectory, pen pressure, pen azimuth and pen altitude collected at the time of signing. After further preprocessing, these functional features are verified by means of the Dynamic Time Warping method.
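The online branch relies on Dynamic Time Warping. A textbook DTW distance for 1-D feature sequences looks like this; the paper's preprocessing and multi-feature handling are not reproduced:

```python
import math

def dtw(a, b):
    # Dynamic Time Warping distance between two 1-D feature sequences
    # (e.g. pen pressure over time): the minimum cumulative cost of an
    # order-preserving alignment between the two sequences.
    n, m = len(a), len(b)
    D = [[math.inf] * (m + 1) for _ in range(n + 1)]
    D[0][0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            D[i][j] = cost + min(D[i - 1][j], D[i][j - 1], D[i - 1][j - 1])
    return D[n][m]

# A time-stretched copy of a pressure profile aligns with zero cost:
print(dtw([0, 1, 2, 1, 0], [0, 0, 1, 2, 2, 1, 0]))  # → 0.0
```

This tolerance to local time stretching is exactly why DTW suits signing, where genuine repetitions vary in pace but not in shape.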

  8. Camera-Based Online Signature Verification with Sequential Marginal Likelihood Change Detector

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Yasuda, Kumiko; Shirato, Satoshi; Matsumoto, Takashi

    Several online signature verification systems that use cameras have been proposed. These systems obtain online signature data from video images by tracking the pen tip. Such systems are very useful because special devices such as pen-operated digital tablets are not necessary. One drawback, however, is that if the captured images are blurred, pen tip tracking may fail, which causes performance degradation. To solve this problem, here we propose a scheme to detect such images and re-estimate the pen tip position associated with the blurred images. Our pen tracking algorithm is implemented by using the sequential Monte Carlo method, and a sequential marginal likelihood is used for blurred image detection. Preliminary experiments were performed using private data consisting of 390 genuine signatures and 1560 forged signatures. The experimental results show that the proposed algorithm improved performance in terms of verification accuracy.

  9. Hill-Climbing Attacks and Robust Online Signature Verification Algorithm against Hill-Climbing Attacks

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo

    Attacks using hill-climbing methods have been reported as a vulnerability of biometric authentication systems. In this paper, we propose a robust online signature verification algorithm against such attacks. Specifically, the attack considered in this paper is a hill-climbing forged data attack. Artificial forgeries are generated offline by using the hill-climbing method, and the forgeries are input to a target system to be attacked. In this paper, we analyze the menace of hill-climbing forged data attacks using six types of hill-climbing forged data and propose a robust algorithm by incorporating the hill-climbing method into an online signature verification algorithm. Experiments to evaluate the proposed system were performed using a public online signature database. The proposed algorithm showed improved performance against this kind of attack.
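The attack model can be sketched generically: the attacker observes only a similarity score and greedily keeps perturbations that raise it. The matcher and template below are toys, not the paper's verification algorithm:

```python
import random

def hill_climb(matcher, start, steps=2000, eps=0.05, seed=0):
    # Generic hill-climbing forgery sketch: perturb a candidate feature
    # vector at random and keep any change that raises the matcher's
    # score. Only the score is observed, never the enrolled template.
    rng = random.Random(seed)
    best, best_score = list(start), matcher(start)
    for _ in range(steps):
        cand = [v + rng.uniform(-eps, eps) for v in best]
        score = matcher(cand)
        if score > best_score:
            best, best_score = cand, score
    return best, best_score

# Toy matcher with a hidden template (names and values are illustrative):
template = [0.3, -0.7, 1.2]
matcher = lambda f: -sum((a - b) ** 2 for a, b in zip(f, template))
_, final = hill_climb(matcher, [0.0, 0.0, 0.0])
print(final > matcher([0.0, 0.0, 0.0]))  # → True
```

A robust verifier, as proposed in the paper, must keep such score-guided searches from converging on an accepting region.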

  10. Personal authentication in video surveillance systems using an on-line signature verification approach

    NASA Astrophysics Data System (ADS)

    Lien, Cheng-Chang; Han, Chin-Chuan; Lin, Su-Ming

    2005-03-01

In this paper, a novel on-line signature verification approach is proposed for personal authentication in video surveillance systems. Digit-password-based authentication is the most popular mechanism in many network-based applications; however, if a password is leaked, the monitoring data can easily be falsified. Biometric authentication using signature features is a natural and user-friendly way to remedy this problem. In this study, a signature-based authentication method is proposed that identifies individuals using a template-matching strategy. Experiments were conducted to show the effectiveness of the proposed methods.

  11. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on the pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
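Equal error rate, the figure of merit quoted above, can be computed from genuine and forgery score lists with a simple threshold sweep; the scores below are made up for illustration:

```python
def equal_error_rate(genuine, forged):
    # Sweep thresholds over the observed scores (higher = more similar)
    # and return the operating point where the false rejection rate and
    # false acceptance rate are closest, i.e. the equal error rate.
    best_gap, eer = 1.0, 1.0
    for t in sorted(set(genuine) | set(forged)):
        frr = sum(g < t for g in genuine) / len(genuine)
        far = sum(f >= t for f in forged) / len(forged)
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2
    return eer

print(equal_error_rate([0.9, 0.8, 0.7, 0.6], [0.65, 0.5, 0.4, 0.3]))  # → 0.25
```

With many scores, interpolation between thresholds gives a smoother EER estimate, but the sweep conveys the definition.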

  12. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  13. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  14. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With a growing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. Its main drawback is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented, and the overall performance of the developed system is tested. The results show that the system can be used effectively for secure online verification applications. PMID:17365425

  15. New online signature acquisition system

    NASA Astrophysics Data System (ADS)

    Oulefki, Adel; Mostefai, Messaoud; Abbadi, Belkacem; Djebrani, Samira; Bouziane, Abderraouf; Chahir, Youssef

    2013-01-01

We present a nonconstraining and low-cost online signature acquisition system developed to enhance the performance of an existing multimodal biometric authentication system (based initially on voice and image modalities). A laboratory prototype has been developed and validated for online signature acquisition.

  16. An individuality model for online signatures using global Fourier descriptors

    NASA Astrophysics Data System (ADS)

    Kholmatov, Alisher; Yanikoglu, Berrin

    2008-03-01

The discriminative capability of a biometric is based on its individuality/uniqueness and is an important factor in choosing a biometric for a large-scale deployment. Individuality studies have been carried out rigorously for only certain biometrics, in particular fingerprint and iris, while work on establishing handwriting and signature individuality has been mainly at the feature level. In this study, we present a preliminary individuality model for online signatures using the Fourier domain representation of the signature. Using the normalized Fourier coefficients as global features describing the signature, we derive a formula for the probability of coincidentally matching a given signature. Estimating model parameters from a large database and making certain simplifying assumptions, the probability of two arbitrary signatures matching in 13 of the coefficients is calculated as 4.7x10^-4. When compared with the results of a verification algorithm that parallels the theoretical model, the results show that the theoretical model fits the random forgery test results fairly well. While online signatures are sometimes dismissed as not very secure, our results show that the probability of successfully guessing an online signature is very low. Combined with the fact that the signature is a behavioral biometric with adjustable complexity, these results support the use of online signatures for biometric authentication.
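The flavor of such an individuality calculation can be conveyed with a toy binomial model, assuming coefficients match independently with some probability p; the parameters below are illustrative, not the paper's estimates:

```python
from math import comb

def p_match_at_least(n, k, p):
    # Probability that two unrelated signatures agree (within tolerance)
    # on at least k of n independent coefficients, each agreeing with
    # probability p: the upper tail of a Binomial(n, p) distribution.
    return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

# Coincidental matches in many coefficients are rare even for moderate p:
print(p_match_at_least(20, 13, 0.25) < 1e-3)  # → True
```

Estimating p from a database and checking the predicted tail probability against observed random-forgery match rates is the essence of validating such a model.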

  17. Offline signature verification using local binary pattern and octave pattern

    NASA Astrophysics Data System (ADS)

    Ahlawat, Sahil; Goel, Anubhav; Prasad, Surabhi; Singh, Preety

    2014-01-01

Signature verification holds a significant place in today's world, as most bank transactions, stock trades, etc. are validated via signatures. Signatures are considered one of the most effective biometric identifiers, but unfortunately signature forgery attempts are rampant. To prevent this, a robust signature verification mechanism is essential. In this paper, a new method is proposed that uses Local Binary Patterns and geometrical features. A new geometric property, the Octave Pattern, has been devised. Performance is analyzed by comparing random, semi-skilled and skilled forgeries with the genuine signature.

  18. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signatures and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) from the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval and dispute protocols, analyzing their requisites, advantages and disadvantages in relation to security requirements.
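The AR construction and expiry check can be sketched as follows; for brevity an HMAC stands in for the TCPA's asymmetric cryptographic signature, and all field sizes and keys are assumptions of this demo:

```python
import hashlib
import hmac

def make_auth_record(bio_hash: bytes, doc_hash: bytes, valid_until: int,
                     key: bytes) -> bytes:
    # Authentication record (AR): biometric hash + validity timestamp +
    # document hash, sealed by the central authority. An HMAC stands in
    # for the real protocol's private-key signature.
    assert len(bio_hash) == 32 and len(doc_hash) == 32
    payload = bio_hash + valid_until.to_bytes(8, "big") + doc_hash
    return payload + hmac.new(key, payload, hashlib.sha256).digest()

def verify_auth_record(record: bytes, key: bytes, now: int) -> bool:
    # Check the seal first, then the validity timestamp: an expired AR
    # forces re-enrollment even when the seal itself is intact.
    payload, tag = record[:-32], record[-32:]
    expected = hmac.new(key, payload, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        return False
    return now <= int.from_bytes(payload[32:40], "big")

key = b"tcpa-demo-key"
ar = make_auth_record(hashlib.sha256(b"signature features").digest(),
                      hashlib.sha256(b"id document").digest(),
                      valid_until=2_000_000_000, key=key)
print(verify_auth_record(ar, key, now=1_700_000_000))  # → True
print(verify_auth_record(ar, key, now=2_100_000_000))  # → False (expired)
```

In the actual protocol the sealed AR is carried inside the hologram watermark on the card rather than transmitted directly.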

  19. Online adaptation and verification of VMAT

    SciTech Connect

    Crijns, Wouter; Defraene, Gilles; Depuydt, Tom; Haustermans, Karin; Van Herck, Hans; Maes, Frederik; Van den Heuvel, Frank

    2015-07-15

Purpose: This work presents a method for fast volumetric modulated arc therapy (VMAT) adaptation in response to interfraction anatomical variations. Additionally, plan parameters extracted from the adapted plans are used to verify the quality of these plans. The methods were tested as a prostate class solution and compared to replanning and to their current clinical practice. Methods: The proposed VMAT adaptation is an extension of their previous intensity modulated radiotherapy (IMRT) adaptation. It follows a direct (forward) planning approach: the multileaf collimator (MLC) apertures are corrected in the beam’s eye view (BEV) and the monitor units (MUs) are corrected using point dose calculations. All MLC and MU corrections are driven by the positions of four fiducial points only, without need for a full contour set. Quality assurance (QA) of the adapted plans is performed using plan parameters that can be calculated online and that have a relation to the delivered dose or the plan quality. Five potential parameters are studied for this purpose: the number of MU, the equivalent field size (EqFS), the modulation complexity score (MCS), and the components of the MCS: the aperture area variability (AAV) and the leaf sequence variability (LSV). The full adaptation and its separate steps were evaluated in simulation experiments involving a prostate phantom subjected to various interfraction transformations. The efficacy of the current VMAT adaptation was scored by target mean dose (CTV_mean), conformity (CI_95%), tumor control probability (TCP), and normal tissue complication probability (NTCP). The impact of the adaptation on the plan parameters (QA) was assessed by comparison with prediction intervals (PI) derived from a statistical model of the typical variation of these parameters in a population of VMAT prostate plans (n = 63). These prediction intervals are the adaptation equivalent of the tolerance tables for couch shifts in the current clinical

  20. Digital video system for on-line portal verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Konrad W.; Shalev, Shlomo; Cosby, N. Scott

    1990-07-01

    A digital system has been developed for on-line acquisition, processing and display of portal images during radiation therapy treatment. A metal/phosphor screen combination is the primary detector, where the conversion from high-energy photons to visible light takes place. A mirror angled at 45 degrees reflects the primary image to a low-light-level camera, which is removed from the direct radiation beam. The image registered by the camera is digitized, processed and displayed on a CRT monitor. Advanced digital techniques for processing of on-line images have been developed and implemented to enhance image contrast and suppress the noise. Some elements of automated radiotherapy treatment verification have been introduced.

  1. Age and gender-invariant features of handwritten signatures for verification systems

    NASA Astrophysics Data System (ADS)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

The handwritten signature is one of the most natural biometrics. Behavioral biometrics includes signatures, which may differ with the owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments in this research were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal forgery feature idea, where a global classifier is able to classify a signature as genuine or forged without actual knowledge of the signature template and its owner. Additionally, the reduction of dimensionality with the MRMR method is discussed.

  2. On Hunting Animals of the Biometric Menagerie for Online Signature

    PubMed Central

    Houmani, Nesma; Garcia-Salicetti, Sonia

    2016-01-01

Individuals behave differently with respect to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names in order to reflect their characteristics with respect to biometric systems. This concept has been illustrated for the face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined to automatically retrieve writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database. PMID:27054836

  4. Offline signature verification and skilled forgery detection using HMM and sum graph features with ANN and knowledge based classifier

    NASA Astrophysics Data System (ADS)

    Mehta, Mohit; Choudhary, Vijay; Das, Rupam; Khan, Ilyas

    2010-02-01

Signature verification is one of the most widely researched areas in document analysis and signature biometrics. Various methodologies have been proposed in this area for accurate signature verification and forgery detection. In this paper we propose a unique two-stage model for detecting skilled forgery in signatures by combining two feature types, namely sum graph features and HMM-based signature modeling, and classifying them with a knowledge-based classifier and a probabilistic neural network. We propose the distinctive technique of using the HMM as a feature rather than as a classifier, the role widely proposed by most authors in signature recognition. Results show a higher false rejection than false acceptance rate. The system detects forgeries with an accuracy of 80% and can verify genuine signatures with 91% accuracy. The two-stage model can be used in realistic signature biometric applications, such as banking, where the authenticity of a signature must be checked before processing documents like checks.

  5. Gated Treatment Delivery Verification With On-Line Megavoltage Fluoroscopy

    SciTech Connect

Tai, An; Christensen, James D.; Gore, Elizabeth; Khamene, Ali; Boettger, Thomas; Li, X. Allen

    2010-04-15

    Purpose: To develop and clinically demonstrate the use of on-line real-time megavoltage (MV) fluoroscopy for gated treatment delivery verification. Methods and Materials: Megavoltage fluoroscopy (MVF) image sequences were acquired using a flat panel equipped for MV cone-beam CT in synchrony with the respiratory signal obtained from the Anzai gating device. The MVF images can be obtained immediately before or during gated treatment delivery. A prototype software tool (named RTReg4D) was developed to register MVF images with phase-sequenced digitally reconstructed radiograph images generated from the treatment planning system based on four-dimensional CT. The image registration can be used to reposition the patient before or during treatment delivery. To demonstrate the reliability and clinical usefulness, the system was first tested using a thoracic phantom and then prospectively in actual patient treatments under an institutional review board-approved protocol. Results: The quality of the MVF images for lung tumors is adequate for image registration with phase-sequenced digitally reconstructed radiographs. The MVF was found to be useful for monitoring inter- and intrafractional variations of tumor positions. With the planning target volume contour displayed on the MVF images, the system can verify whether the moving target stays within the planning target volume margin during gated delivery. Conclusions: The use of MVF images was found to be clinically effective in detecting discrepancies in tumor location before and during respiration-gated treatment delivery. The tools and process developed can be useful for gated treatment delivery verification.

  6. On the pinned field image binarization for signature generation in image ownership verification method

    NASA Astrophysics Data System (ADS)

    Lee, Mn-Ta; Chang, Hsuan Ting

    2011-12-01

    The issue of pinned field image binarization for signature generation in the ownership verification of the protected image is investigated. The pinned field explores the texture information of the protected image and can be employed to enhance the watermark robustness. In the proposed method, four optimization schemes are utilized to determine the threshold values for transforming the pinned field into a binary feature image, which is then utilized to generate an effective signature image. Experimental results show that the utilization of optimization schemes can significantly improve the signature robustness from the previous method (Lee and Chang, Opt. Eng. 49(9), 097005, 2010). While considering both the watermark retrieval rate and the computation speed, the genetic algorithm is strongly recommended. In addition, compared with Chang and Lin's scheme (J. Syst. Softw. 81(7), 1118-1129, 2008), the proposed scheme also has better performance.

  7. GRAZING-ANGLE FOURIER TRANSFORM INFRARED SPECTROSCOPY FOR ONLINE SURFACE CLEANLINESS VERIFICATION. YEAR 1

    EPA Science Inventory

    As part of the Online Surface Cleanliness Project, the Naval Facilities Engineering Service Center (NFESC) conducted a study of grazing-angle reflectance Fourier Transform Infrared (FTIR) Spectroscopy as a tool for online cleanliness verification at Department of Defense (DoD) cl...

  8. On-line failure detection and damping measurement of aerospace structures by random decrement signatures

    NASA Technical Reports Server (NTRS)

    Cole, H. A., Jr.

    1973-01-01

Random decrement signatures of structures vibrating in a random environment are studied through the use of computer-generated and experimental data. The statistical properties obtained indicate that these signatures are stable in form and scale and hence should have wide application in on-line failure detection and damping measurement. On-line procedures are described, and equations for estimating the record-length requirements to obtain signatures of a prescribed precision are given.

  9. Optical security verification by synthesizing thin films with unique polarimetric signatures.

    PubMed

    Carnicer, Artur; Arteaga, Oriol; Pascual, Esther; Canillas, Adolf; Vallmitjana, Santiago; Javidi, Bahram; Bertran, Enric

    2015-11-15

This Letter reports the production and optical polarimetric verification of codes based on thin-film technology for security applications. Because thin-film structures display distinctive polarization signatures, these data are used to authenticate the encoded message. Samples are analyzed using an imaging ellipsometer able to measure the 16 components of the Mueller matrix, so the behavior of the thin film under polarized light becomes completely characterized. This information is utilized to distinguish between true and false codes by means of correlation. Without the imaging optics, the components of the Mueller matrix become noise-like distributions and, consequently, the encoded message is no longer available. A set of Stokes vectors can then be generated numerically for any polarization state of the illuminating beam, so machine learning techniques can be used to perform classification. We show that successful authentication is possible using the k-nearest neighbors algorithm on thin-film codes that have been anisotropically phase-encoded with a pseudorandom phase code. PMID:26565884
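The classification step can be illustrated with a plain k-nearest-neighbors vote; the two-dimensional feature vectors below are synthetic stand-ins for the Stokes-vector features:

```python
from collections import Counter

def knn_predict(train, labels, query, k=3):
    # Plain k-nearest-neighbours vote with squared Euclidean distance:
    # rank training points by distance to the query and take the
    # majority label among the k closest.
    ranked = sorted(
        (sum((a - b) ** 2 for a, b in zip(x, query)), lbl)
        for x, lbl in zip(train, labels)
    )
    return Counter(lbl for _, lbl in ranked[:k]).most_common(1)[0][0]

train = [(0.0, 0.1), (0.1, 0.0), (1.0, 0.9), (0.9, 1.0)]
labels = ["genuine", "genuine", "false", "false"]
print(knn_predict(train, labels, (0.05, 0.05)))  # → genuine
```

In the Letter's setting the training points would be Stokes vectors computed for known true and false codes across illumination polarization states.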

  10. A method for online verification of adapted fields using an independent dose monitor

    SciTech Connect

    Chang Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-07-15

Purpose: Clinical implementation of online adaptive radiotherapy requires the generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system. Methods: The fields are modified by translating each multileaf collimator (MLC)-defined aperture in the direction of the patient setup error, and magnifying it to account for the variation in distance to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of the adapted fields. The system consists of a large-area ion chamber with a spatial gradient in electrode separation, mounted below the MLC, to provide a spatially sensitive signal for each beam segment, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients were modified in response to six randomly chosen setup errors in three orthogonal directions. Results: A total of approximately 49 beams for the modified fields were verified by the IQM system, of which 97% of the measured IQM signals agree with the predicted values to within 2%. Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields.
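The field-modification step can be sketched as a rigid translation of each leaf edge by the setup error plus a divergent-beam magnification for the change in source-to-target distance. This is a simplified reading of the method; the function name and the first-order scaling model are assumptions of this sketch:

```python
import numpy as np

def adapt_aperture(leaf_edges_mm, setup_error_mm, sad_mm=1000.0, depth_change_mm=0.0):
    """Shift every MLC-defined leaf edge by the in-plane setup error, then
    magnify by the change in source-to-target distance (divergent beam)."""
    mag = (sad_mm + depth_change_mm) / sad_mm
    return (np.asarray(leaf_edges_mm) + setup_error_mm) * mag

# Three leaf pairs (left/right edges in mm at isocentre); 3 mm lateral setup
# error, target 10 mm farther from the source than planned
leaves = np.array([[-20.0, 20.0], [-25.0, 25.0], [-30.0, 30.0]])
adapted = adapt_aperture(leaves, setup_error_mm=3.0, depth_change_mm=10.0)
```

Each adapted edge is the planned edge shifted by 3 mm and scaled by 1.01, i.e. the aperture follows the target and grows slightly with the projection distance.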

  11. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    SciTech Connect

Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H

    2015-06-15

Purpose: Intensity-modulated radiotherapy requires a comprehensive quality assurance program in general and, ideally, independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstruction based on TD measurements was compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose-volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector-based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could clearly be identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan

  12. Laboratory verification of on-line lithium analysis using ultraviolet absorption spectrometry

    SciTech Connect

    Beemster, B.J.; Schlager, K.J.; Schloegel, K.M.; Kahle, S.J.; Fredrichs, T.L.

    1992-12-31

    Several laboratory experiments were performed to evaluate the capability of absorption spectrometry in the ultraviolet-visible wavelength range with the objective of developing methods for on-line analysis of lithium directly in the primary coolant of Pressurized Water Reactors using optical probes. Although initial laboratory tests seemed to indicate that lithium could be detected using primary absorption (detection of natural spectra unassisted by reagents), subsequent field tests demonstrated that no primary absorption spectra existed for lithium in the ultraviolet-visible wavelength range. A second series of tests that were recently conducted did, however, confirm results reported in the literature to the effect that reagents were available that will react with lithium to form chelates that possess detectable absorption and fluorescent signatures. These results point to the possible use of secondary techniques for on-line analysis of lithium.

  13. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... I-9. Documents Acceptable for Employment Eligibility Verification, 73 FR 76505 (Dec. 17, 2008). C... storage of the Form I-9. 71 FR 34510 (June 15, 2006). The interim rule implemented Public Law 108-390, 118... if DHS would provide additional guidance concerning the use of contract services for the...

  14. Is Your Avatar Ethical? On-Line Course Tools that Are Methods for Student Identity and Verification

    ERIC Educational Resources Information Center

    Semple, Mid; Hatala, Jeffrey; Franks, Patricia; Rossi, Margherita A.

    2011-01-01

    On-line college courses present a mandate for student identity verification for accreditation and funding sources. Student authentication requires course modification to detect fraud and misrepresentation of authorship in assignment submissions. The reality is that some college students cheat in face-to-face classrooms; however, the potential for…

  15. Efficient cost-sensitive human-machine collaboration for offline signature verification

    NASA Astrophysics Data System (ADS)

    Coetzer, Johannes; Swanepoel, Jacques; Sabourin, Robert

    2012-01-01

    We propose a novel strategy for the optimal combination of human and machine decisions in a cost-sensitive environment. The proposed algorithm should be especially beneficial to financial institutions where off-line signatures, each associated with a specific transaction value, require authentication. When presented with a collection of genuine and fraudulent training signatures, produced by so-called guinea pig writers, the proficiency of a workforce of human employees and a score-generating machine can be estimated and represented in receiver operating characteristic (ROC) space. Using a set of Boolean fusion functions, the majority vote decision of the human workforce is combined with each threshold-specific machine-generated decision. The performance of the candidate ensembles is estimated and represented in ROC space, after which only the optimal ensembles and associated decision trees are retained. When presented with a questioned signature linked to an arbitrary writer, the system first uses the ROC-based cost gradient associated with the transaction value to select the ensemble that minimises the expected cost, and then uses the corresponding decision tree to authenticate the signature in question. We show that, when utilising the entire human workforce, the incorporation of a machine streamlines the authentication process and decreases the expected cost for all operating conditions.
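The ROC-based selection of a cost-minimising operating point can be sketched as follows. All numbers are hypothetical: a transaction value makes false acceptance of a forgery expensive, while false rejection of a genuine signature carries a small handling cost:

```python
import numpy as np

def min_cost_operating_point(far, frr, cost_fa, cost_fr, p_forgery):
    """Expected-cost minimisation over ROC operating points:
    E[cost] = cost_fa*FAR*P(forgery) + cost_fr*FRR*P(genuine)."""
    cost = (cost_fa * np.asarray(far) * p_forgery
            + cost_fr * np.asarray(frr) * (1.0 - p_forgery))
    i = int(np.argmin(cost))
    return i, float(cost[i])

# Hypothetical ROC sweep: four thresholds trading false acceptance (FAR)
# against false rejection (FRR); a high-value transaction makes FA costly
far = np.array([0.30, 0.10, 0.02, 0.001])
frr = np.array([0.01, 0.05, 0.15, 0.60])
idx, expected = min_cost_operating_point(far, frr, cost_fa=100.0, cost_fr=1.0,
                                         p_forgery=0.05)
```

As the transaction value (here `cost_fa`) grows, the minimum shifts toward stricter operating points; this is the cost gradient the authors use to pick an ensemble per questioned signature.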

  16. Investigation of the spatial resolution of an online dose verification device

    SciTech Connect

    Asuni, G.; Rickey, D. W.; McCurdy, B. M. C.

    2012-02-15

Purpose: The aim of this work is to characterize a new online dose verification device, the COMPASS transmission detector array (IBA Dosimetry, Schwarzenbruck, Germany). The array is composed of 1600 cylindrical ionization chambers of 3.8 mm diameter, separated by 6.5 mm center-to-center spacing, in a 40 × 40 arrangement. Methods: The line spread function (LSF) of a single ion chamber in the detector was measured with a narrow slit collimator for a 6 MV photon beam. The 0.25 × 10 mm² slit was formed by two machined lead blocks. The LSF was obtained by laterally translating the detector in 0.25 mm steps underneath the slit over a range of 24 mm and taking a measurement at each step. This measurement was validated with Monte Carlo simulation using BEAMnrc and DOSXYZnrc. The presampling modulation transfer function (MTF), the Fourier transform of the line spread function, was determined and compared to calculated (Monte Carlo and analytical) MTFs. Two head-and-neck intensity modulated radiation therapy (IMRT) fields were measured using the device and were used to validate the LSF measurement. These fields were simulated with the BEAMnrc Monte Carlo model, and the Monte Carlo-generated incident fluence was convolved with the 2D detector response function (derived from the measured LSF) to obtain calculated dose. The measured and calculated dose distributions were then quantitatively compared using χ-comparison criteria of 3% dose difference and 3 mm distance-to-agreement for in-field points (defined as those above the 10% maximum dose threshold). Results: The full width at half-maximum (FWHM) of the measured detector response for a single chamber is 4.3 mm, which is comparable to the chamber diameter of 3.8 mm. The presampling MTF was calculated, and the resolution of one chamber was estimated as 0.25 lp/mm from the first zero crossing. For both examined IMRT fields, the χ-comparison between measured and calculated data show good agreement with 95.1% and 96
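The presampling MTF computation described above (Fourier transform of the LSF, normalised to unity at zero frequency) can be sketched as follows. The Gaussian LSF with the measured 4.3 mm FWHM is a stand-in assumption; the real chamber response is closer to a rect-like aperture, which is what produces the first zero crossing near 0.25 lp/mm:

```python
import numpy as np

def presampling_mtf(lsf, step_mm):
    """Presampling MTF: magnitude of the Fourier transform of the sampled
    LSF, normalised to unity at zero spatial frequency (cycles/mm = lp/mm)."""
    mtf = np.abs(np.fft.rfft(lsf))
    freqs = np.fft.rfftfreq(len(lsf), d=step_mm)
    return freqs, mtf / mtf[0]

# Stand-in LSF: Gaussian with the measured 4.3 mm FWHM, sampled at the
# 0.25 mm translation step used in the experiment
step = 0.25
x = np.arange(-12.0, 12.0, step)
sigma = 4.3 / (2.0 * np.sqrt(2.0 * np.log(2.0)))
lsf = np.exp(-x ** 2 / (2.0 * sigma ** 2))
freqs, mtf = presampling_mtf(lsf, step)
```

With 0.25 mm sampling the frequency axis extends to 2 cycles/mm, far beyond the chamber's ~0.25 lp/mm resolution, so the sampled MTF is effectively presampling (aliasing-free).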

  17. Workload and Interaction: Unisa's Signature Courses--A Design Template for Transitioning to Online DE?

    ERIC Educational Resources Information Center

    Hülsmann, Thomas; Shabalala, Lindiwe

    2016-01-01

    The principal contradiction of online distance education is the disparity that exists between economies of scale and the new interactive capabilities of digital technologies. This is particularly felt where mega-universities in developing countries seek to make better use of these affordances while at the same time protecting their economies of…

  18. Sequential utilization of substrates by Pseudomonas putida CSV86: signatures of intermediate metabolites and online measurements.

    PubMed

    Basu, Aditya; Das, Debasish; Bapat, Prashant; Wangikar, Pramod P; Phale, Prashant S

    2009-01-01

Pseudomonas putida CSV86 preferentially utilizes aromatics over glucose and co-metabolizes them with organic acids. On aromatics plus glucose, CSV86 utilized aromatics first, with the concomitant appearance of transient metabolites such as salicylate, benzaldehyde, and benzoate. Citrate was the main extracellular metabolite observed during glucose uptake. The strain showed simultaneous utilization of organic acids and aromatic compounds. Based on the metabolite analysis and growth profiles, we hypothesize that the repression of glucose utilization could be due to organic acid intermediates generated from aromatic compound metabolism. The online measurements indicate the instantaneous metabolic state of the culture: for example, the CO₂ evolution and agitation speed show peak values during the two growth phases of the diauxic growth, while dissolved oxygen values show a decrease over the corresponding periods. These measurements correlated well with the offline measurements but provided a better time resolution of the process. PMID:17467253

  19. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

With the increase of communication and financial transactions over the internet, online signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Fast and precise signature verification algorithms are therefore very attractive. The goal of this paper is to model the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from the strokes. Using linear, Parzen-window, and support vector machine classifiers, the verification technique was tested on a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results on the Persian, SVC2004, and SUSIG databases show that the method achieves equal error rates of 5.91%, 5.62%, and 3.91% on skilled forgeries, respectively. PMID:24696797
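The feature-extraction idea, compressing the pen-velocity signal into a few transform-domain coefficients, can be sketched with a plain DCT. This illustrates the general approach only; it is not the authors' pole-zero model, and the trajectory and coefficient count are invented:

```python
import numpy as np

def dct2_ortho(x):
    """Orthonormal DCT-II computed from its definition (no SciPy dependency)."""
    N = len(x)
    n = np.arange(N)
    basis = np.cos(np.pi * (n[None, :] + 0.5) * n[:, None] / N)
    c = basis @ x * np.sqrt(2.0 / N)
    c[0] /= np.sqrt(2.0)
    return c

def velocity_features(x, y, n_coeffs=10):
    """Fixed-length stroke descriptor: leading DCT coefficients of pen speed."""
    speed = np.hypot(np.diff(x), np.diff(y))
    return dct2_ortho(speed)[:n_coeffs]

# Toy pen trajectory: two loops traced at constant speed
t = np.linspace(0.0, 1.0, 200)
feats = velocity_features(np.cos(4 * np.pi * t), np.sin(4 * np.pi * t))
```

For this constant-speed toy stroke nearly all energy lands in the zeroth coefficient; real signatures spread energy over low-order coefficients, which is what makes a short coefficient vector a usable per-writer descriptor.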

  1. A Comparison of the Use of Bony Anatomy and Internal Markers for Offline Verification and an Evaluation of the Potential Benefit of Online and Offline Verification Protocols for Prostate Radiotherapy

    SciTech Connect

McNair, Helen A.; Hansen, Vibeke N.; Parker, Christopher; Evans, Phil M.; Norman, Andrew; Miles, Elizabeth; Harris, Emma J.; Del-Acroix, Louise; Smith, Elizabeth; Keane, Richard; Khoo, Vincent S.; Thompson, Alan C.; Dearnaley, David P.

    2008-05-01

Purpose: To evaluate the utility of intraprostatic markers in the treatment verification of prostate cancer radiotherapy. Specific aims were: to compare the effectiveness of offline correction protocols, either using gold markers or bony anatomy; to estimate the potential benefit of online correction protocols using gold markers; to determine the presence and effect of intrafraction motion. Methods and Materials: Thirty patients with three gold markers inserted had pretreatment and posttreatment images acquired and were treated using an offline correction protocol and gold markers. Retrospectively, an offline protocol was applied using bony anatomy and an online protocol using gold markers. Results: The systematic errors were reduced from 1.3, 1.9, and 2.5 mm to 1.1, 1.1, and 1.5 mm in the right-left (RL), superoinferior (SI), and anteroposterior (AP) directions, respectively, using the offline correction protocol and gold markers instead of bony anatomy. The subsequent decrease in margins was 1.7, 3.3, and 4 mm in the RL, SI, and AP directions, respectively. An offline correction protocol combined with an online correction protocol in the first four fractions reduced random errors further to 0.9, 1.1, and 1.0 mm in the RL, SI, and AP directions, respectively. A daily online protocol reduced all errors to <1 mm. Intrafraction motion had a greater impact on the effectiveness of the online protocol than on the offline protocols. Conclusions: An offline protocol using gold markers is effective in reducing the systematic error. The value of online protocols is reduced by intrafraction motion.
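The link between reduced setup errors and reduced margins can be illustrated with the widely used van Herk population margin recipe, M = 2.5Σ + 0.7σ (Σ systematic, σ random). The abstract does not state which recipe the authors used, so this is an assumption, and the random-error value below is hypothetical:

```python
def ctv_ptv_margin(systematic_mm, random_mm):
    """van Herk population margin recipe: M = 2.5*Sigma + 0.7*sigma."""
    return 2.5 * systematic_mm + 0.7 * random_mm

# SI-direction systematic error before (1.9 mm) and after (1.1 mm) the
# marker-based offline protocol; a 2.0 mm random error is assumed unchanged
margin_before = ctv_ptv_margin(1.9, 2.0)
margin_after = ctv_ptv_margin(1.1, 2.0)
```

Under this recipe the 0.8 mm systematic-error reduction alone buys 2.0 mm of margin; the larger 3.3 mm SI reduction reported presumably also reflects changes in the random component.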

  2. Patient-Specific 3D Pretreatment and Potential 3D Online Dose Verification of Monte Carlo-Calculated IMRT Prostate Treatment Plans

    SciTech Connect

    Boggula, Ramesh; Jahnke, Lennart; Wertz, Hansjoerg; Lohr, Frank; Wenz, Frederik

    2011-11-15

Purpose: Fast and reliable comprehensive quality assurance tools are required to validate the safety and accuracy of complex intensity-modulated radiotherapy (IMRT) plans for prostate treatment. In this study, we evaluated the performance of the COMPASS system for both off-line and potential online procedures for the verification of IMRT treatment plans. Methods and Materials: COMPASS has a dedicated beam model and dose engine; it can reconstruct three-dimensional dose distributions on the patient anatomy based on measured fluences using either the MatriXX two-dimensional (2D) array (offline) or a 2D transmission detector (T2D) (online). For benchmarking the COMPASS dose calculation, various dose-volume indices were compared against Monte Carlo-calculated dose distributions for five prostate patient treatment plans. Gamma index evaluation and absolute point dose measurements were also performed in an inhomogeneous pelvis phantom using extended dose range films and an ion chamber for five additional treatment plans. Results: MatriXX-based dose reconstruction showed excellent agreement with the ion chamber (<0.5%, except for one treatment plan, which showed 1.5%), film (≈100% of pixels passing gamma criteria 3%/3 mm), and mean dose-volume indices (<2%). The T2D-based dose reconstruction showed good agreement as well with the ion chamber (<2%), film (≈99% of pixels passing gamma criteria 3%/3 mm), and mean dose-volume indices (<5.5%). Conclusion: The COMPASS system qualifies for routine prostate IMRT pretreatment verification with the MatriXX detector and has the potential for online verification of treatment delivery using the T2D.
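A dose-volume index of the kind compared in this study (e.g. D95, the minimum dose received by the hottest 95% of a structure) can be read off a flattened array of voxel doses. The toy dose distribution below is invented for illustration:

```python
import numpy as np

def dose_index(voxel_doses_gy, percent):
    """D_p: minimum dose received by the hottest p% of the structure,
    computed as the (100 - p)th percentile of the voxel doses."""
    return float(np.percentile(voxel_doses_gy, 100.0 - percent))

# Toy PTV voxel doses: ~78 Gy prescription with 1 Gy spread (invented numbers)
rng = np.random.default_rng(2)
ptv = rng.normal(78.0, 1.0, size=10_000)
d95 = dose_index(ptv, 95)
```

Comparing such indices between a reconstructed and a reference dose grid (as COMPASS vs. Monte Carlo above) condenses two full 3D distributions into a few clinically meaningful numbers.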

  3. Online Kidney Position Verification Using Non-Contrast Radiographs on a Linear Accelerator with on Board KV X-Ray Imaging Capability

    SciTech Connect

Willis, David J.; Kron, Tomas; Hubbard, Patricia; Haworth, Annette; Wheeler, Greg; Duchesne, Gillian M.

    2009-01-01

    The kidneys are dose-limiting organs in abdominal radiotherapy. Kilovoltage (kV) radiographs can be acquired using on-board imager (OBI)-equipped linear accelerators with better soft tissue contrast and lower radiation doses than conventional portal imaging. A feasibility study was conducted to test the suitability of anterior-posterior (AP) non-contrast kV radiographs acquired at treatment time for online kidney position verification. Anthropomorphic phantoms were used to evaluate image quality and radiation dose. Institutional Review Board approval was given for a pilot study that enrolled 5 adults and 5 children. Customized digitally reconstructed radiographs (DRRs) were generated to provide a priori information on kidney shape and position. Radiotherapy treatment staff performed online evaluation of kidney visibility on OBI radiographs. Kidney dose measured in a pediatric anthropomorphic phantom was 0.1 cGy for kV imaging and 1.7 cGy for MV imaging. Kidneys were rated as well visualized in 60% of patients (90% confidence interval, 34-81%). The likelihood of visualization appears to be influenced by the relative AP separation of the abdomen and kidneys, the axial profile of the kidneys, and their relative contrast with surrounding structures. Online verification of kidney position using AP non-contrast kV radiographs on an OBI-equipped linear accelerator appears feasible for patients with suitable abdominal anatomy. Kidney position information provided is limited to 2-dimensional 'snapshots,' but this is adequate in some clinical situations and potentially advantageous in respiratory-correlated treatments. Successful clinical implementation requires customized partial DRRs, appropriate imaging parameters, and credentialing of treatment staff.

  4. SU-E-J-146: A Research of PET-CT SUV Range for the Online Dose Verification in Carbon Ion Radiation Therapy

    SciTech Connect

    Sun, L; Hu, W; Moyers, M; Zhao, J; Hsi, W

    2015-06-15

Purpose: Positron-emitting isotope distributions can be used to fuse the carbon-ion planning CT with the online target-verification PET-CT. After irradiation, within the same decay period, the relationship between a given target volume and the SUV for each single-fraction dose can be found, and a SUV range for the irradiated target can be established. This online range can also serve as a reference for the correlation and consistency of planning target dose verification and evaluation in the clinical trial. Methods: A Rando head phantom was used as a patient surrogate, and a 10 cc cubic target volume was contoured; the beam isocenter depth was 7.6 cm, and fixed 90-degree carbon-ion beams were delivered at single-fraction effective doses of 2.5 GyE, 5 GyE, and 8 GyE. Starting 390 seconds after irradiation, a 30-minute PET-CT scan was performed, with parameters set to a 50 kg virtual weight and 0.05 mCi activity. MIM Maestro was used for image processing and fusion, and five 16 mm diameter SUV spheres were placed in different directions within the target. The mean SUV in the target for each fraction dose was obtained in software. Results: For a 10 cc target volume and a 390-second decay period, at a single-fraction effective dose of 2.5 GyE the mean SUV was 3.42 (range 1.72 to 6.83); at 5 GyE the mean SUV was 9.946 (range 7.016 to 12.54); at 8 GyE and above the mean SUV was 20.496 (range 11.16 to 34.73). Conclusion: Evaluating the accuracy of the dose distribution using the SUV range obtained by fusing the planning CT with the post-treatment online PET-CT is feasible for a normal single-fraction carbon-ion treatment. Even for plans with single-fraction doses above 2 GyE, with all other parameters the same, the SUV range is linearly dependent on the single-fraction dose, so the method can also be used for hyperfractionated treatment plans.

  5. Applying dynamic methods in off-line signature recognition

    NASA Astrophysics Data System (ADS)

    Igarza, Juan Jose; Hernaez, Inmaculada; Goirizelaia, Inaki; Espinosa, Koldo

    2004-08-01

    In this paper we present the work developed on off-line signature verification using Hidden Markov Models (HMM). HMM is a well-known technique used by other biometric features, for instance, in speaker recognition and dynamic or on-line signature verification. Our goal here is to extend Left-to-Right (LR)-HMM to the field of static or off-line signature processing using results provided by image connectivity analysis. The chain encoding of perimeter points for each blob obtained by this analysis is an ordered set of points in the space, clockwise around the perimeter of the blob. We discuss two different ways of generating the models depending on the way the blobs obtained from the connectivity analysis are ordered. In the first proposed method, blobs are ordered according to their perimeter length. In the second proposal, blobs are ordered in their natural reading order, i.e. from the top to the bottom and left to right. Finally, two LR-HMM models are trained using the parameters obtained by the mentioned techniques. Verification results of the two techniques are compared and some improvements are proposed.
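The chain encoding of perimeter points can be sketched directly: each step between successive boundary pixels is mapped to one of eight neighbour-direction indices, and that ordered index sequence is the kind of observation stream an LR-HMM can be trained on. The direction convention below is an assumption of this sketch:

```python
# 8-neighbour direction convention (assumed): 0 = East, then clockwise in 45° steps
DIRS = [(0, 1), (1, 1), (1, 0), (1, -1), (0, -1), (-1, -1), (-1, 0), (-1, 1)]

def chain_code(boundary):
    """Encode an ordered perimeter-pixel list (row, col) as successive
    direction indices between consecutive pixels."""
    return [DIRS.index((r1 - r0, c1 - c0))
            for (r0, c0), (r1, c1) in zip(boundary, boundary[1:])]

# Clockwise around a 2x2-pixel blob
square = [(0, 0), (0, 1), (1, 1), (1, 0), (0, 0)]
codes = chain_code(square)  # → [0, 2, 4, 6]
```

The two model variants in the paper differ only in how the blobs' chain codes are concatenated: by perimeter length, or in natural reading order.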

  6. Cone-Beam Computed Tomography for On-Line Image Guidance of Lung Stereotactic Radiotherapy: Localization, Verification, and Intrafraction Tumor Position

    SciTech Connect

Purdie, Thomas G. (E-mail: Tom.Purdie@rmp.uhn.on.ca); Bissonnette, Jean-Pierre; Franks, Kevin; Bezjak, Andrea; Payne, David; Sie, Fanny; Sharpe, Michael B.; Jaffray, David A.

    2007-05-01

    Purpose: Cone-beam computed tomography (CBCT) in-room imaging allows accurate inter- and intrafraction target localization in stereotactic body radiotherapy of lung tumors. Methods and Materials: Image-guided stereotactic body radiotherapy was performed in 28 patients (89 fractions) with medically inoperable Stage T1-T2 non-small-cell lung carcinoma. The targets from the CBCT and planning data set (helical or four-dimensional CT) were matched on-line to determine the couch shift required for target localization. Matching based on the bony anatomy was also performed retrospectively. Verification of target localization was done using either megavoltage portal imaging or CBCT imaging; repeat CBCT imaging was used to assess the intrafraction tumor position. Results: The mean three-dimensional tumor motion for patients with upper lesions (n = 21) and mid-lobe or lower lobe lesions (n = 7) was 4.2 and 6.7 mm, respectively. The mean difference between the target and bony anatomy matching using CBCT was 6.8 mm (SD, 4.9, maximum, 30.3); the difference exceeded 13.9 mm in 10% of the treatment fractions. The mean residual error after target localization using CBCT imaging was 1.9 mm (SD, 1.1, maximum, 4.4). The mean intrafraction tumor deviation was significantly greater (5.3 mm vs. 2.2 mm) when the interval between localization and repeat CBCT imaging (n = 8) exceeded 34 min. Conclusion: In-room volumetric imaging, such as CBCT, is essential for target localization accuracy in lung stereotactic body radiotherapy. Imaging that relies on bony anatomy as a surrogate of the target may provide erroneous results in both localization and verification.

  7. SU-E-T-582: On-Line Dosimetric Verification of Respiratory Gated Volumetric Modulated Arc Therapy Using the Electronic Portal Imaging Device

    SciTech Connect

    Schaly, B; Gaede, S; Xhaferllari, I

    2015-06-15

    Purpose: To investigate the clinical utility of on-line verification of respiratory gated VMAT dosimetry during treatment. Methods: Portal dose images were acquired during treatment in integrated mode on a Varian TrueBeam (v. 1.6) linear accelerator for gated lung and liver patients that used flattening filtered beams. The source to imager distance (SID) was set to 160 cm to ensure imager clearance in case the isocenter was off midline. Note that acquisition of integrated images resulted in no extra dose to the patient. Fraction 1 was taken as baseline and all portal dose images were compared to that of the baseline, where the gamma comparison and dose difference were used to measure day-to-day exit dose variation. All images were analyzed in the Portal Dosimetry module of Aria (v. 10). The portal imager on the TrueBeam was calibrated by following the instructions for dosimetry calibration in service mode, where we define 1 calibrated unit (CU) equal to 1 Gy for 10×10 cm field size at 100 cm SID. This reference condition was measured frequently to verify imager calibration. Results: The gamma value (3%, 3 mm, 5% threshold) ranged between 92% and 100% for the lung and liver cases studied. The exit dose can vary by as much as 10% of the maximum dose for an individual fraction. The integrated images combined with the information given by the corresponding on-line soft tissue matched cone-beam computed tomography (CBCT) images were useful in explaining dose variation. For gated lung treatment, dose variation was mainly due to the diaphragm position. For gated liver treatment, the dose variation was due to both diaphragm position and weight loss. Conclusion: Integrated images can be useful in verifying dose delivery consistency during respiratory gated VMAT, although the CBCT information is needed to explain dose differences due to anatomical changes.

  8. Creation of a Reference Image with Monte Carlo Simulations for Online EPID Verification of Daily Patient Setup

    SciTech Connect

    Descalle, M-A; Chuang, C; Pouliot, J

    2002-01-30

Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as a reference by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study investigates the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo-simulated images exhibit qualities similar to portal images; the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify setup errors during treatment.
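A correlation algorithm for quantifying the setup deviation between a simulated reference and a portal image can be sketched with FFT-based phase correlation; the abstract does not specify the authors' algorithm, so this is one plausible choice, demonstrated on an invented image pair with a known shift:

```python
import numpy as np

def register_shift(ref, img):
    """Estimate the integer (row, col) displacement between two images from
    the peak of their phase-correlation surface."""
    cross = np.conj(np.fft.fft2(ref)) * np.fft.fft2(img)
    corr = np.fft.ifft2(cross / (np.abs(cross) + 1e-12)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap peak coordinates past the midpoint to negative shifts
    return tuple(int(p) if p <= s // 2 else int(p) - s
                 for p, s in zip(peak, corr.shape))

# Synthetic "portal image" displaced by a known setup error of (3, -5) pixels
rng = np.random.default_rng(3)
reference = rng.random((64, 64))
portal = np.roll(reference, (3, -5), axis=(0, 1))
shift = register_shift(reference, portal)  # → (3, -5)
```

Converting the recovered pixel shift through the imaging geometry yields the couch correction, which is how a reference/portal comparison quantifies a setup error.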

  9. SU-E-J-46: Development of a Compton Camera Prototype for Online Range Verification of Laser-Accelerated Proton Beams

    SciTech Connect

    Thirolf, PG; Bortfeldt, J; Lang, C; Parodi, K; Aldawood, S; Boehmer, M; Gernhaeuser, R; Maier, L; Castelhano, I; Kolff, H van der; Schaart, DR

    2014-06-01

Purpose: Development of a photon detection system designed for online range verification of laser-accelerated proton beams via prompt-gamma imaging of nuclear reactions. Methods: We develop a Compton camera for the position-sensitive detection of prompt photons emitted from nuclear reactions between the proton beam and biological samples. The detector is designed to be capable of reconstructing the photon source origin not only from the Compton scattering kinematics of the primary photon, but also to allow for tracking of the Compton-scattered electrons. Results: Simulation studies resulted in the design of the Compton camera based on a LaBr₃(Ce) scintillation crystal acting as absorber, preceded by a stacked array of 6 double-sided silicon strip detectors as scatterers. From the design simulations, an angular resolution of ≤ 2° and an image reconstruction efficiency of 10⁻³–10⁻⁵ (at 2–6 MeV) can be expected. The LaBr₃ crystal has been characterized with calibration sources, resulting in a time resolution of 273 ps (FWHM) and an energy resolution of about 3.8% (FWHM). Using a collimated (1 mm diameter) ¹³⁷Cs calibration source, the light distribution was measured for each of 64 pixels (6×6 mm²). Data were also taken with 0.5 mm collimation and 0.5 mm step size to generate a reference library of light distributions that allows for reconstructing the interaction position of the initial photon using a k-nearest neighbor (k-NN) algorithm developed by the Delft group. Conclusion: The Compton-camera approach for prompt-gamma detection offers promising perspectives for ion beam range verification. A Compton camera prototype is presently being developed and characterized in Garching. Furthermore, an arrangement of, e.g., 4 camera modules could even be used in a 'gamma-PET' mode to detect delayed annihilation radiation from positron emitters in the irradiation interrupts (with improved performance in the presence of an
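Reconstruction from Compton scattering kinematics rests on the Compton energy-angle relation, which constrains the source to a cone around the scatter direction. A minimal sketch of the angle computation (the 4.44 MeV line is a well-known carbon prompt-gamma energy; the scattered-photon energy below is hypothetical):

```python
import math

ME_C2_MEV = 0.511  # electron rest energy in MeV

def compton_angle_deg(e_in_mev, e_out_mev):
    """Photon scattering angle from the Compton relation
    cos(theta) = 1 - me*c^2 * (1/E' - 1/E)."""
    cos_t = 1.0 - ME_C2_MEV * (1.0 / e_out_mev - 1.0 / e_in_mev)
    return math.degrees(math.acos(cos_t))

# A 4.44 MeV prompt gamma scattering down to a hypothetical 3.0 MeV
angle = compton_angle_deg(4.44, 3.0)
```

Intersecting many such event cones localises the prompt-gamma emission region; tracking the scattered electron, as this design intends, reduces each cone to an arc and sharpens the reconstruction.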

  10. Actively Promoting Student Engagement within an Online Environment: Developing and Implementing a Signature Subject on "Contemporary Issues in Sex and Sexuality"

    ERIC Educational Resources Information Center

    Fletcher, Gillian; Dowsett, Gary W.; Austin, Lilian

    2012-01-01

    La Trobe University is committed to improving the first year experience, and to developing its online teaching portfolio in response to increasing student demand. This article will acknowledge that these two objectives will remain contradictory if online learning systems are used predominantly as repositories of information with little thought…

  11. Signature control

    NASA Astrophysics Data System (ADS)

    Pyati, Vittal P.

    The reduction of vehicle radar signature is accomplished by means of vehicle shaping, the use of microwave-absorbent materials, and either passive or active cancellation techniques; such techniques are also useful in reducing propulsion-system-associated IR emissions. In some anticipated scenarios, the objective is not signature reduction but signature control, for deception, via decoy vehicles that mimic the signature characteristics of actual weapons systems. As the stealthiness of airframes and missiles increases, their propulsion systems' exhaust plumes assume a more important role in detection by an adversary.

  12. Reliability-Based Decision Fusion in Multimodal Biometric Verification Systems

    NASA Astrophysics Data System (ADS)

    Kryszczuk, Krzysztof; Richiardi, Jonas; Prodanov, Plamen; Drygajlo, Andrzej

    2007-12-01

    We present a methodology of reliability estimation in the multimodal biometric verification scenario. Reliability estimation has been shown to be an efficient and accurate way of predicting and correcting erroneous classification decisions in both unimodal (speech, face, online signature) and multimodal (speech and face) systems. While the initial research results indicate the high potential of the proposed methodology, the performance of the reliability estimation in a multimodal setting has not been sufficiently studied or evaluated. In this paper, we demonstrate the advantages of using the unimodal reliability information in order to perform an efficient biometric fusion of two modalities. We further show the presented method to be superior to state-of-the-art multimodal decision-level fusion schemes. The experimental evaluation presented in this paper is based on the popular benchmarking bimodal BANCA database.

  13. Signature-based store checking buffer

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify that the content is error-free.
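The mechanism can be sketched in software as follows; the address/value store stream, the 64-bit field widths, and SHA-256 are illustrative assumptions standing in for whatever compression the hardware buffer actually implements:

```python
import hashlib

def fingerprint(stores):
    """Collapse a stream of (address, value) stores into a fixed-size signature."""
    h = hashlib.sha256()
    for addr, value in stores:
        h.update(addr.to_bytes(8, "little"))
        h.update(value.to_bytes(8, "little"))
    return h.digest()

def verify_at_barrier(stores_a, stores_b):
    """At a barrier, the two redundant instances agree iff their signatures match.

    Only the fixed-size digests are compared, not the full output streams,
    which is the point of a fingerprint buffer.
    """
    return fingerprint(stores_a) == fingerprint(stores_b)
```

A single flipped value in either stream changes the digest, so the comparison at the barrier detects the divergence without buffering every store.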

  14. 78 FR 23743 - Proposed Information Collection; Comment Request; Delivery Verification Procedure for Imports

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF COMMERCE Bureau of Industry and Security Proposed Information Collection; Comment Request; Delivery Verification... furnish their foreign supplier with a U.S. Delivery Verification Certificate validating that...

  15. 75 FR 28550 - Proposed Information Collection; Comment Request; Delivery Verification Procedure

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-05-21

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF COMMERCE Bureau of Industry and Security Proposed Information Collection; Comment Request; Delivery Verification... their foreign supplier with a U.S. Delivery Verification Certificate validating that the...

  16. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention

    PubMed Central

    Proyer, René T.; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research on positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study compares an intervention using SS with one using individual low-scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-month follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) participants who reported generally higher levels of strengths benefitted more from working on LS rather than SS, while those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual to promote their effectiveness. PMID:25954221

  17. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, geometric location accuracy of P-tape products depends on the absolute accuracy of the model and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. The verification of system performance with regard to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in 2 or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.
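The correlation process used with ground control points can be illustrated, in simplified one-dimensional form, by normalized cross-correlation of an image patch against a reference patch; the operational system of course works on 2-D image windows, so this is only a sketch of the matching criterion:

```python
def ncc(a, b):
    """Normalized cross-correlation of two equal-length patches.

    Returns a value in [-1, 1]; a peak near 1 over candidate offsets marks
    the best-matching pixel position of a control point.
    """
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    den = (sum((x - ma) ** 2 for x in a) * sum((y - mb) ** 2 for y in b)) ** 0.5
    return num / den if den else 0.0
```

Sliding one patch over the other and taking the offset with the highest score gives the sub-array registration that the verification techniques described above rely on.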

  18. Redactable signatures for signed CDA Documents.

    PubMed

    Wu, Zhen-Yu; Hsueh, Chih-Wen; Tsai, Cheng-Yu; Lai, Feipei; Lee, Hung-Chang; Chung, Yufang

    2012-06-01

    The Clinical Document Architecture (CDA), introduced by Health Level Seven, is an XML-based standard specifying the encoding, structure, and semantics of clinical documents for exchange. Since a clinical document is in XML form, its authenticity and integrity can be guaranteed by the XML Signature standard published by the W3C. When a clinical document must conceal some personal or private information, however, it needs to be redacted, and redaction invalidates the signature over the original document. Redactable signatures have been proposed to enable verification of redacted documents, but little work has addressed their implementation, and no suitable scheme yet exists for clinical documents. This paper investigates existing web technologies and finds a compact, applicable model with which to implement a suitable redactable signature for the clinical document viewer. PMID:21181244
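A common construction for redactable signatures signs the list of per-element hashes rather than the document itself, so replacing an element with its hash preserves verifiability. A toy sketch, using an HMAC as a stand-in for a real public-key signature (a deployed scheme would sign with a private key and verify with a public key, and the element granularity here is illustrative):

```python
import hashlib
import hmac

SECRET = b"demo-key"  # stand-in for a real signing key

def h(element: bytes) -> bytes:
    return hashlib.sha256(element).digest()

def sign(elements):
    """Sign the concatenated element hashes; the signature survives redaction."""
    digests = [h(e) for e in elements]
    return hmac.new(SECRET, b"".join(digests), hashlib.sha256).digest()

def redact(elements, index):
    """Replace one element by its hash; anyone can do this without the key."""
    out = list(elements)
    out[index] = ("REDACTED", h(out[index]))
    return out

def verify(doc, sig):
    """Verify a document in which any subset of elements may be redacted."""
    digests = [item[1] if isinstance(item, tuple) else h(item) for item in doc]
    expected = hmac.new(SECRET, b"".join(digests), hashlib.sha256).digest()
    return hmac.compare_digest(sig, expected)
```

The signature binds only the hash list, so a viewer can hide a field (e.g. a patient name) while the remaining content still verifies against the original signature.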

  19. Characterization of the lignin signature in Lake Mead, NV, sediment: comparison of on-line flash chemopyrolysis (600 degrees C) and off-line chemolysis (250 degrees C).

    PubMed

    Steinberg, Spencer M; Nemr, Elkas L; Rudin, Mark

    2009-06-01

    The distribution of lignin in sediment is a useful tool for tracing the transport of land-derived organic matter in an aquatic environment. Tetramethylammonium hydroxide (TMAH) flash chemopyrolysis or chemolysis, followed by GC-MS analysis, can be used for evaluating the origin of organic carbon in sediments. TMAH chemopyrolysis or chemolysis of organic matter produces a myriad of semi-volatile products. Among these products are methylated phenols, which provide an indirect measure of lignin in sediment. In this study, total organic carbon, elemental carbon, and lignin were measured in Lake Mead sediments. This study indicates that terrestrial runoff makes a contribution to Lake Mead sediments, and that this contribution is most apparent in sediment that is close to the Las Vegas Wash. Two chemolysis methods (on-line and off-line) were examined and compared for detection of lignin phenols. The results from these sediment cores indicate that comparable results can be obtained from the two approaches, although detection levels are significantly lower for the off-line approach. PMID:18427933

  20. A Quantum Multi-proxy Blind Signature Scheme Based on Genuine Four-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping

    2016-02-01

    In this paper, we propose a multi-proxy blind signature scheme based on controlled teleportation, in which a genuine four-qubit entangled state functions as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signing, and verification. The security analysis shows the scheme satisfies the security features of a multi-proxy signature: unforgeability, undeniability, blindness, and unconditional security.

  1. Characterization of a fiber-coupled Al₂O₃:C luminescence dosimetry system for online in vivo dose verification during ¹⁹²Ir brachytherapy

    SciTech Connect

    Andersen, Claus E.; Nielsen, Soeren Kynde; Greilich, Steffen; Helt-Hansen, Jakob; Lindegaard, Jacob Christian; Tanderup, Kari

    2009-03-15

    A prototype of a new dose-verification system has been developed to facilitate prevention and identification of dose delivery errors in remotely afterloaded brachytherapy. The system allows for automatic online in vivo dosimetry directly in the tumor region using small passive detector probes that fit into applicators such as standard needles or catheters. The system measures the absorbed dose rate (0.1 s time resolution) and total absorbed dose on the basis of radioluminescence (RL) and optically stimulated luminescence (OSL) from aluminum oxide crystals attached to optical fiber cables (1 mm outer diameter). The system was tested in the range from 0 to 4 Gy using a solid-water phantom, a Varian GammaMed Plus ¹⁹²Ir PDR afterloader, and dosimetry probes inserted into stainless-steel brachytherapy needles. The calibrated system was found to be linear in the tested dose range. The reproducibility (one standard deviation) for RL and OSL measurements was 1.3%. The measured depth-dose profiles agreed well with the theoretical expectations computed with the EGSnrc Monte Carlo code, suggesting that the energy dependence for the dosimeter probes (relative to water) is less than 6% for source-to-probe distances in the range of 2-50 mm. Under certain conditions, the RL signal could be greatly disturbed by the so-called stem signal (i.e., unwanted light generated in the fiber cable upon irradiation). The OSL signal is not subject to this source of error. The tested system appears to be adequate for in vivo brachytherapy dosimetry.
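The linearity reported for the calibrated system is the kind of property one checks with an ordinary least-squares fit of measured signal against delivered dose; the sketch below is generic, not the authors' analysis code, and the residual tolerance would in practice be set by the quoted 1.3% reproducibility:

```python
def linear_fit(doses, signals):
    """Ordinary least-squares line: signal = a * dose + b.

    For a linear dosimetry system the fitted line should reproduce the
    calibration points to within the system's reproducibility.
    """
    n = len(doses)
    mx = sum(doses) / n
    my = sum(signals) / n
    sxx = sum((x - mx) ** 2 for x in doses)
    sxy = sum((x - mx) * (y - my) for x, y in zip(doses, signals))
    a = sxy / sxx
    return a, my - a * mx
```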

  2. ETV - VERIFICATION TESTING (ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM)

    EPA Science Inventory

    Verification testing is a major component of the Environmental Technology Verification (ETV) program. The ETV Program was instituted to verify the performance of innovative technical solutions to problems that threaten human health or the environment and was created to substantia...

  3. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the work of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
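The swarm idea (many small, differently-seeded searches whose combined coverage exceeds any single bounded run) can be sketched on a toy transition system; the successor function, budget, and state encoding below are illustrative assumptions, not Spin's actual algorithms:

```python
import random

def successors(state):
    # toy transition system: each integer state has two successor states
    return [(state * 3 + 1) % 10007, (state + 17) % 10007]

def explore(seed, start=0, budget=300):
    """One swarm member: a randomized DFS with its own seed and search budget.

    Members differ only in exploration order, so within the same fixed budget
    they tend to cover different slices of the state space; the swarm finds
    the "needle" if any single member happens to reach it.
    """
    rng = random.Random(seed)
    seen, stack = {start}, [start]
    while stack and len(seen) < budget:
        nxt = successors(stack.pop())
        rng.shuffle(nxt)  # seed-dependent exploration order
        for s in nxt:
            if s not in seen:
                seen.add(s)
                stack.append(s)
    return seen

def swarm_coverage(n_members):
    """States covered by the whole swarm under the same per-member budget."""
    return set().union(*(explore(seed) for seed in range(n_members)))
```

Because the members share nothing, this parallelizes trivially, which is exactly what the usual divide-and-conquer decompositions of a model checker's search cannot do.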

  4. Modeling the Lexical Morphology of Western Handwritten Signatures

    PubMed Central

    Diaz-Cabrera, Moises; Ferrer, Miguel A.; Morales, Aythami

    2015-01-01

    A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a Generalized Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwritten signatures. PMID:25860942
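The Generalized Extreme Value distribution used for this modeling has a standard three-parameter CDF (location μ, scale σ > 0, shape ξ); a direct transcription of the ξ ≠ 0 branch, for reference rather than as the authors' fitting code:

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """CDF of the Generalized Extreme Value distribution for shape xi != 0.

    F(x) = exp(-t(x)^(-1/xi)) with t(x) = 1 + xi * (x - mu) / sigma;
    as xi -> 0 this tends to the Gumbel form exp(-exp(-(x - mu) / sigma)).
    """
    t = 1.0 + xi * (x - mu) / sigma
    if t <= 0.0:
        # outside the support: below it when xi > 0, above it when xi < 0
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1.0 / xi))
```

Fitting μ, σ, and ξ to an observed morphological characteristic (e.g. by maximum likelihood) then yields the kind of per-feature model the abstract describes.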

  5. Signatures of Reputation

    NASA Astrophysics Data System (ADS)

    Bethencourt, John; Shi, Elaine; Song, Dawn

    Reputation systems have become an increasingly important tool for highlighting quality information and filtering spam within online forums. However, the dependence of a user's reputation on their history of activities seems to preclude any possibility of anonymity. We show that useful reputation information can, in fact, coexist with strong privacy guarantees. We introduce and formalize a novel cryptographic primitive we call signatures of reputation which supports monotonic measures of reputation in a completely anonymous setting. In our system, a user can express trust in others by voting for them, collect votes to build up her own reputation, and attach a proof of her reputation to any data she publishes, all while maintaining the unlinkability of her actions.

  6. Verification Tools Secure Online Shopping, Banking

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Just like rover or rocket technology sent into space, the software that controls these technologies must be extensively tested to ensure reliability and effectiveness. Ames Research Center invented the open-source Java Pathfinder (JPF) toolset for the deep testing of Java-based programs. Fujitsu Labs of America Inc., based in Sunnyvale, California, improved the capabilities of the JPF Symbolic Pathfinder tool, establishing the tool as a means of thoroughly testing the functionality and security of Web-based Java applications such as those used for Internet shopping and banking.

  7. A hybrid digital-signature and zero-watermarking approach for authentication and protection of sensitive electronic documents.

    PubMed

    Tayan, Omar; Kabir, Muhammad N; Alginahi, Yasser M

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
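The zero-watermarking idea (deriving verification data from the text rather than embedding anything in it) can be sketched with per-paragraph digests kept in a separately stored certificate; the certificate layout and paragraph granularity below are illustrative assumptions, not the paper's algorithm:

```python
import hashlib

def block_digests(text):
    """Per-paragraph digests let verification locate where tampering occurred."""
    return [hashlib.sha256(p.encode()).hexdigest() for p in text.split("\n\n")]

def make_certificate(text):
    """Zero-watermark: nothing is embedded; the certificate is kept separately."""
    blocks = block_digests(text)
    return {"blocks": blocks,
            "root": hashlib.sha256("".join(blocks).encode()).hexdigest()}

def verify_and_locate(text, cert):
    """Return (intact?, indices of modified paragraphs)."""
    blocks = block_digests(text)
    tampered = [i for i, (a, b) in enumerate(zip(blocks, cert["blocks"])) if a != b]
    intact = not tampered and len(blocks) == len(cert["blocks"])
    return intact, tampered
```

Because the cover text is never touched, there is no capacity/imperceptibility trade-off; integrity rests entirely on protecting the certificate (e.g. by signing it).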

  8. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    PubMed Central

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247

  9. Quantum blind dual-signature scheme without arbitrator

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

    Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., an initial phase, a signing phase, and a verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. It does not rely heavily on an arbitrator in the verification phase, as previous quantum signature schemes usually do. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. It provides a potential application for e-commerce or e-payment systems with current technology.

  10. Security Problems in the Quantum Signature Scheme with a Weak Arbitrator

    NASA Astrophysics Data System (ADS)

    Zou, Xiangfu; Qiu, Daowen; Yu, Fang; Mateus, Paulo

    2013-10-01

    Very recently, a quantum signature scheme with a weak arbitrator was presented (Luo et al. in Int. J. Theor. Phys. 51:2135-2142, 2012). A weak arbitrator is involved only in the disagreement case, which keeps the scheme low-cost. In this paper, the security of the quantum signature scheme with weak arbitrator is analyzed. We show that attackers can counterfeit a signature for any message, which will pass the verification for the signer. In addition, by employing the known-message attack they can counterfeit a signature for any one of 4^L messages (where L is the length of the intercepted quantum message), which will pass the verification for the signed message. In particular, by employing the Z-transform attack, the attackers can forge a signature for any one of 2^L messages, which will pass the verifications for both the signer and the signed message successfully.

  11. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  12. Developing composite signatures

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.; Carpenter, Tom; Cappelaere, Patrice G.; Frye, Stu; Lemoigne-Stewart, Jacqueline J.; Mandle, Dan; Montgomery, Sarah; Williams-Bess, Autumn

    2011-06-01

    A composite signature is a group of signatures that are related in such a way as to define a target or operational endeavor more completely, at higher fidelity. This paper explores the merits of using composite signatures, in lieu of waiting for opportunities to collect the more elusive diagnostic signatures, to satisfy key essential elements of information (EEI) associated with civil disaster-related problems. It discusses efforts to refine composite-signature development methodology and quantify the relative value of composite vs. diagnostic signatures. The objectives are to: 1) investigate and develop innovative composite signatures associated with civil disasters, including physical, chemical, and pattern/behavioral; 2) explore the feasibility of collecting representative composite signatures using current and emerging intelligence, surveillance, and reconnaissance (ISR) collection architectures, leveraging civilian and commercial architectures; and 3) collaborate extensively with scientists and engineers from U.S. government organizations and laboratories, the defense industry, and academic institutions. Keywords: signature, composite signature, civil disaster.

  13. Electronic Signatures for Public Procurement across Europe

    NASA Astrophysics Data System (ADS)

    Ølnes, Jon; Andresen, Anette; Arbia, Stefano; Ernst, Markus; Hagen, Martin; Klein, Stephan; Manca, Giovanni; Rossi, Adriano; Schipplick, Frank; Tatti, Daniele; Wessolowski, Gesa; Windheuser, Jan

    The PEPPOL (Pan-European Public Procurement On-Line) project is a large scale pilot under the CIP programme of the EU, exploring electronic public procurement in a unified European market. An important element is interoperability of electronic signatures across borders, identified today as a major obstacle to cross-border procurement. PEPPOL will address use of signatures in procurement processes, in particular tendering but also post-award processes like orders and invoices. Signature policies, i.e. quality requirements and requirements on information captured in the signing process, will be developed. This as well as technical interoperability of e-signatures across Europe will finally be piloted in demonstrators starting late 2009 or early 2010.

  14. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  15. Anonymous Signatures Revisited

    NASA Astrophysics Data System (ADS)

    Saraswat, Vishal; Yun, Aaram

    We revisit the notion of the anonymous signature, first formalized by Yang, Wong, Deng and Wang [10], and then further developed by Fischlin [4] and Zhang and Imai [11]. We present a new formalism of anonymous signature, where instead of the message, a part of the signature is withheld to maintain anonymity. We introduce the notion of unpretendability to guarantee infeasibility for someone other than the correct signer to pretend authorship of the message and signature. Our definition retains applicability for all previous applications of the anonymous signature, provides stronger security, and is conceptually simpler. We give a generic construction from any ordinary signature scheme, and also show that the short signature scheme by Boneh and Boyen [2] can be naturally regarded as such a secure anonymous signature scheme according to our formalism.

  16. Signatures support program

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.

    2009-05-01

    The Signatures Support Program (SSP) leverages the full spectrum of signature-related activities (collections, processing, development, storage, maintenance, and dissemination) within the Department of Defense (DOD), the intelligence community (IC), other Federal agencies, and civil institutions. The Enterprise encompasses acoustic, seismic, radio frequency, infrared, radar, nuclear radiation, and electro-optical signatures. The SSP serves the war fighter, the IC, and civil institutions by supporting military operations, intelligence operations, homeland defense, disaster relief, acquisitions, and research and development. Data centers host and maintain signature holdings, collectively forming the national signatures pool. The geographically distributed organizations are the authoritative sources and repositories for signature data; the centers are responsible for data content and quality. The SSP proactively engages DOD, IC, other Federal entities, academia, and industry to locate signatures for inclusion in the distributed national signatures pool and provides world-wide 24/7 access via the SSP application.

  17. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  18. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  19. 75 FR 4101 - Enterprise Income Verification (EIV) System User Access Authorization Form and Rules of Behavior...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT Enterprise Income Verification (EIV) System User Access Authorization Form and Rules.... This notice also lists the following information: Title of Proposal: Enterprise Income...

  20. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... From the Federal Register Online via the Government Publishing Office SOCIAL SECURITY ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social Security Administration. ACTION: Notice of Revised Transaction Fee for Consent Based Social Security Number...

  1. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... From the Federal Register Online via the Government Publishing Office SOCIAL SECURITY ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social Security Administration. ACTION: Notice of Revised Transaction Fee for CBSV Service. SUMMARY: We provide fee-based...

  2. 75 FR 76080 - Agency Information Collection (VetBiz Vendor Information Pages Verification Program) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF VETERANS AFFAIRS Agency Information Collection (VetBiz Vendor Information Pages Verification Program) Activity... . Please refer to ``OMB Control No. 2900- 0675.'' SUPPLEMENTAL INFORMATION: Title: VetBiz...

  3. 76 FR 39070 - Proposed Information Collection; Comment Request; Import, End-User, and Delivery Verification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF COMMERCE Bureau of Industry and Security Proposed Information Collection; Comment Request; Import, End- User, and Delivery Verification Certificates AGENCY: Bureau of Industry and Security, Commerce. ACTION:...

  4. 76 FR 81991 - National Spectrum Sharing Research Experimentation, Validation, Verification, Demonstration and...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-12-29

    ... From the Federal Register Online via the Government Publishing Office NATIONAL SCIENCE FOUNDATION National Spectrum Sharing Research Experimentation, Validation, Verification, Demonstration and Trials: Technical Workshop II on Coordinating Federal Government/Private Sector Spectrum Innovation Testing...

  5. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  6. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  7. 75 FR 4100 - Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-01-26

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT Enterprise Income Verification (EIV) System-Debts Owed to PHAs and Terminations AGENCY.... This Notice Also Lists the Following Information Title of Proposal: Enterprise Income Verification...

  8. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
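The Monte Carlo approach described in this abstract can be sketched in a few lines: perturb the observed series with an error model, recompute the signature per realization, and report a percentile interval. The synthetic flow series and the simple multiplicative error model below are illustrative assumptions, not the paper's actual error models.

```python
import random
import statistics

def signature_q_mean(flow):
    """Basic descriptive signature: mean flow."""
    return statistics.mean(flow)

def monte_carlo_signature(flow, signature, n=2000, rel_error=0.1, seed=42):
    """Propagate multiplicative observation error to a signature value.

    Each realization perturbs the series by a rating-curve-like systematic
    factor plus per-point noise, then recomputes the signature.
    """
    rng = random.Random(seed)
    values = []
    for _ in range(n):
        bias = rng.gauss(1.0, rel_error)  # systematic rating-curve error
        perturbed = [q * bias * rng.gauss(1.0, rel_error / 2) for q in flow]
        values.append(signature(perturbed))
    values.sort()
    lo, hi = values[int(0.05 * n)], values[int(0.95 * n)]
    return lo, hi  # 90% uncertainty interval for the signature

# Synthetic daily flow series (illustrative only)
flow = [2.0 + 1.5 * ((d * 7) % 10) / 10 for d in range(365)]
low, high = monte_carlo_signature(flow, signature_q_mean)
```

The same loop works for any signature function (recession constants, runoff thresholds), which is what makes the method generally applicable.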

  9. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-04-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, including for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40% relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.

  10. An Arbitrated Quantum Signature with Bell States

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Qin, Su-Juan; Huang, Wei

    2014-05-01

    Entanglement is the main resource in quantum communication. The main aims of the arbitrated quantum signature (AQS) scheme are to present an application of entanglement in cryptology and to prove the possibility of the quantum signature. More specifically, the main function of quantum entangled states in the existing AQS schemes is to assist the signatory in transferring quantum states to the receiver. However, teleportation and the Leung quantum one-time pad (L-QOTP) algorithm are not enough to design a secure AQS scheme. For example, Pauli operations commute or anticommute with each other, which makes forgery and disavowal attacks easy to implement. To overcome this shortcoming, we construct an improved AQS scheme using a new QOTP algorithm. This scheme has three advantages: it randomly uses the Hadamard operation in the new QOTP to resist attacks that exploit the anticommutativity of nontrivial Pauli operators, while preserving almost all merits of the existing AQS schemes; even in the process of handling disputes, no party has a chance to change the message and its signature without being discovered; and the receiver can verify the integrity of the signature and detect disavowal by the signatory even in the last step of verification.
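The algebraic fact this abstract relies on, that nontrivial Pauli operators commute or anticommute with each other, and the way a Hadamard disrupts that structure (conjugation by H swaps X and Z), can be checked directly with small matrices. This is only a check of the underlying algebra, not an implementation of the AQS scheme.

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
Y = 1j * X @ Z  # Y = iXZ
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

# Any two distinct nontrivial Paulis anticommute: XZ = -ZX, etc.
assert np.allclose(X @ Z, -Z @ X)
assert np.allclose(X @ Y, -Y @ X)

# Hadamard conjugation swaps X and Z, so randomly inserted Hadamards
# destroy the fixed (anti)commutation pattern an attacker exploits.
assert np.allclose(H @ X @ H.conj().T, Z)
assert np.allclose(H @ Z @ H.conj().T, X)
```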

  11. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    SciTech Connect

    Paul, J. N.; Chin, M. R.; Sjoden, G. E.

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used, with transport theory, to determine the expected reaction rates in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection, to evaluate moving-source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)
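The trade-off between vehicle speed, time gating, and probability of detection described above can be illustrated with a simple Poisson counting model: dwell time per container follows from the speed, the background rate sets an alarm threshold for a given false-alarm rate, and the source-plus-background rate gives the detection probability. All count rates and geometry below are hypothetical stand-ins, not values from the MPVS design.

```python
import math

def poisson_sf(k, lam):
    """P(N >= k) for N ~ Poisson(lam)."""
    cdf = sum(math.exp(-lam) * lam**n / math.factorial(n) for n in range(k))
    return 1.0 - cdf

def drive_by_detection(src_cps, bkg_cps, window_m, speed_mps, far=1e-3):
    """Detection probability for one pass at a given speed.

    src_cps, bkg_cps : hypothetical net source / background count rates (counts/s)
    window_m         : length of the collimated view along the drive path (m)
    far              : allowed false-alarm rate per time gate
    """
    dwell = window_m / speed_mps  # time gate set by vehicle speed
    lam_b = bkg_cps * dwell
    # Smallest threshold whose background-only alarm rate is below `far`
    k = 0
    while poisson_sf(k, lam_b) > far:
        k += 1
    return poisson_sf(k, (src_cps + bkg_cps) * dwell)

# ~2 mph is about 0.9 m/s; the rates below are illustrative, not measured.
p = drive_by_detection(src_cps=50.0, bkg_cps=20.0, window_m=0.5, speed_mps=0.9)
```

Raising the speed shrinks the dwell time and hence the counts per gate, which is why the design settles on a nominal speed where the passive signatures remain statistically separable from background.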

  12. Digital Signature Management.

    ERIC Educational Resources Information Center

    Hassler, Vesna; Biely, Helmut

    1999-01-01

    Describes the Digital Signature Project that was developed in Austria to establish an infrastructure for applying smart card-based digital signatures in banking and electronic-commerce applications. Discusses the need to conform to international standards, an international certification infrastructure, and security features for a public directory…

  13. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  14. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  15. Electronic health records: what does your signature signify?

    PubMed

    Victoroff Md, Michael S

    2012-01-01

    Electronic health records serve multiple purposes, including clinical communication, legal documentation, financial transaction capture, research and analytics. Electronic signatures attached to entries in EHRs have different logical and legal meanings for different users. Some of these are vestiges from historic paper formats that require reconsideration. Traditionally accepted functions of signatures, such as identity verification, attestation, consent, authorization and non-repudiation can become ambiguous in the context of computer-assisted workflow processes that incorporate functions like logins, auto-fill and audit trails. This article exposes the incompatibility of expectations among typical users of electronically signed information. PMID:22888846

  16. Signature detection and matching for document image retrieval.

    PubMed

    Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan

    2009-11-01

    As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from clustered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches. PMID:19762928

  17. Traceable Ring Signature

    NASA Astrophysics Data System (ADS)

    Fujisaki, Eiichiro; Suzuki, Koutarou

    The ring signature allows a signer to leak secrets anonymously, without the risk of identity escrow. At the same time, the ring signature provides great flexibility: no group manager, no special setup, and dynamic group choice. The ring signature is, however, vulnerable to malicious or irresponsible signers in some applications because of its anonymity. In this paper, we propose a traceable ring signature scheme. A traceable ring signature is a ring signature, except that it can restrict “excessive” anonymity. The traceable ring signature has a tag that consists of a list of ring members and an issue that refers to, for instance, a social affair or an election. A ring member can issue any signed but anonymous opinion regarding the issue, but only once (per tag). If the member submits another signed opinion, possibly pretending to be another person who supports the first opinion, the identity of the member is immediately revealed. If the member submits the same opinion, for instance voting “yes” on the same issue twice, everyone can see that the two signatures are linked. The traceable ring signature suits many applications, such as anonymous voting on a BBS. We formalize the security definitions for this primitive and show an efficient and simple construction in the random oracle model.

  18. UV Signature Mutations

    PubMed Central

    2014-01-01

    Sequencing complete tumor genomes and exomes has sparked the cancer field's interest in mutation signatures for identifying the tumor's carcinogen. This review and meta-analysis discusses signatures and their proper use. We first distinguish between a mutagen's canonical mutations – deviations from a random distribution of base changes to create a pattern typical of that mutagen – and the subset of signature mutations, which are unique to that mutagen and permit inference backward from mutations to mutagen. To verify UV signature mutations, we assembled literature datasets on cells exposed to UVC, UVB, UVA, or solar simulator light (SSL) and tested canonical UV mutation features as criteria for clustering datasets. A confirmed UV signature was: ≥60% of mutations are C→T at a dipyrimidine site, with ≥5% CC→TT. Other canonical features such as a bias for mutations on the non-transcribed strand or at the 3' pyrimidine had limited application. The most robust classifier combined these features with criteria for the rarity of non-UV canonical mutations. In addition, several signatures proposed for specific UV wavelengths were limited to specific genes or species; non-signature mutations induced by UV may cause melanoma BRAF mutations; and the mutagen for sunlight-related skin neoplasms may vary between continents. PMID:25354245

  19. An archaeal genomic signature

    NASA Technical Reports Server (NTRS)

    Graham, D. E.; Overbeek, R.; Olsen, G. J.; Woese, C. R.

    2000-01-01

    Comparisons of complete genome sequences allow the most objective and comprehensive descriptions possible of a lineage's evolution. This communication uses the completed genomes from four major euryarchaeal taxa to define a genomic signature for the Euryarchaeota and, by extension, the Archaea as a whole. The signature is defined in terms of the set of protein-encoding genes found in at least two diverse members of the euryarchaeal taxa that function uniquely within the Archaea; most signature proteins have no recognizable bacterial or eukaryal homologs. By this definition, 351 clusters of signature proteins have been identified. Functions of most proteins in this signature set are currently unknown. At least 70% of the clusters that contain proteins from all the euryarchaeal genomes also have crenarchaeal homologs. This conservative set, which appears refractory to horizontal gene transfer to the Bacteria or the Eukarya, would seem to reflect the significant innovations that were unique and fundamental to the archaeal "design fabric." Genomic protein signature analysis methods may be extended to characterize the evolution of any phylogenetically defined lineage. The complete set of protein clusters for the archaeal genomic signature is presented as supplementary material (see the PNAS web site, www.pnas.org).
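The selection rule described in this abstract (protein clusters found in at least two diverse euryarchaeal genomes and lacking recognizable bacterial or eukaryal homologs) reduces to simple set operations over presence/absence data. The genome labels and occurrence records below are hypothetical, for illustration only.

```python
# Hypothetical presence/absence data: cluster id -> genomes containing it.
occurrences = {
    "cluster_A": {"Mj", "Mt", "Af", "Ph", "crenarchaeon"},
    "cluster_B": {"Mj", "Af"},
    "cluster_C": {"Mj", "E.coli"},  # has a bacterial homolog: excluded
    "cluster_D": {"Ph"},            # only one euryarchaeote: excluded
}

EURYARCHAEOTA = {"Mj", "Mt", "Af", "Ph"}
NON_ARCHAEA = {"E.coli", "yeast"}

def archaeal_signature(occ):
    """Clusters in >= 2 euryarchaeal genomes and in no bacterium/eukaryote."""
    return {
        cluster for cluster, genomes in occ.items()
        if len(genomes & EURYARCHAEOTA) >= 2 and not (genomes & NON_ARCHAEA)
    }

signature = archaeal_signature(occurrences)  # {"cluster_A", "cluster_B"}
```

The same filter, extended with a "present in all euryarchaeal genomes" condition, would isolate the conservative subset whose crenarchaeal homologs the abstract mentions.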

  20. A General Quality Classification System for eIDs and e-Signatures

    NASA Astrophysics Data System (ADS)

    Ølnes, Jon; Buene, Leif; Andresen, Anette; Grindheim, Håvard; Apitzsch, Jörg; Rossi, Adriano

    The PEPPOL (Pan-European Public Procurement On-Line) project is a large scale pilot under the CIP programme of the EU, exploring electronic public procurement in a unified European market. Interoperability of electronic signatures across borders is identified as a major obstacle to cross-border procurement. PEPPOL suggests specify-ing signature acceptance criteria in the form of signature policies that must be transparent and non-discriminatory. Validation solutions must then not only assess signature correctness but also signature policy adherence. This paper addresses perhaps the most important topic of a signature policy: Quality of eIDs and e-signatures. Discrete levels are suggested for: eID quality, assurance level for this quality, and for cryptographic quality of signatures.

  1. Signature extension studies

    NASA Technical Reports Server (NTRS)

    Vincent, R. K.; Thomas, G. S.; Nalepka, R. F.

    1974-01-01

    The importance of specific spectral regions to signature extension is explored. In the recent past, the signature extension task was focused on the development of new techniques. Tested techniques are now used to investigate this spectral aspect of the large area survey. Sets of channels were sought which, for a given technique, were the least affected by several sources of variation over four data sets and yet provided good object class separation on each individual data set. Using sets of channels determined as part of this study, signature extension was accomplished between data sets collected over a six-day period and over a range of about 400 kilometers.

  2. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow, and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
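The method of adjoint-weighted residuals can be illustrated on a small linear system, where the estimate is in fact exact: if A u = f and the output is J(u) = g·u, then for any approximate u_h the output error equals ψ·r, with r the residual and ψ the adjoint solution of Aᵀψ = g. The random system below is only a stand-in for the discretized flow equations.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 20
A = np.eye(n) * 4 + rng.normal(size=(n, n)) * 0.1  # well-conditioned system
f = rng.normal(size=n)
g = rng.normal(size=n)                             # output functional J(u) = g . u

u_exact = np.linalg.solve(A, f)

# "Coarse" solution: a few Jacobi sweeps, standing in for a coarse-mesh solve.
u_h = np.zeros(n)
D = np.diag(A)
for _ in range(5):
    u_h = u_h + (f - A @ u_h) / D

residual = f - A @ u_h
psi = np.linalg.solve(A.T, g)     # adjoint solution for the output g

error_estimate = psi @ residual   # adjoint-weighted residual
true_error = g @ (u_exact - u_h)  # exact output error
```

For nonlinear problems like the Euler equations the correspondence holds only to leading order, which is why the paper pairs the estimate with adaptive refinement rather than using it as a one-shot correction.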

  3. Are there molecular signatures?

    SciTech Connect

    Bennett, W.P.

    1995-10-01

    This report describes molecular signatures and mutational spectrum analysis. The mutation spectrum is defined as the type and location of DNA base change. There are currently about five well documented cases. Mutations and radon-associated tumors are discussed.

  4. Meteor signature interpretation

    SciTech Connect

    Canavan, G.H.

    1997-01-01

    Meteor signatures contain information about the constituents of space debris and present potential false alarms to early warning systems. Better models could both extract the maximum scientific information possible and reduce this danger. Accurate predictions can be produced by models of modest complexity, which can be inverted to predict the sizes, compositions, and trajectories of objects from their signatures for most objects of interest and concern.

  5. UV signature mutations.

    PubMed

    Brash, Douglas E

    2015-01-01

    Sequencing complete tumor genomes and exomes has sparked the cancer field's interest in mutation signatures for identifying the tumor's carcinogen. This review and meta-analysis discusses signatures and their proper use. We first distinguish between a mutagen's canonical mutations—deviations from a random distribution of base changes to create a pattern typical of that mutagen—and the subset of signature mutations, which are unique to that mutagen and permit inference backward from mutations to mutagen. To verify UV signature mutations, we assembled literature datasets on cells exposed to UVC, UVB, UVA, or solar simulator light (SSL) and tested canonical UV mutation features as criteria for clustering datasets. A confirmed UV signature was: ≥60% of mutations are C→T at a dipyrimidine site, with ≥5% CC→TT. Other canonical features such as a bias for mutations on the nontranscribed strand or at the 3' pyrimidine had limited application. The most robust classifier combined these features with criteria for the rarity of non-UV canonical mutations. In addition, several signatures proposed for specific UV wavelengths were limited to specific genes or species; UV's nonsignature mutations may cause melanoma BRAF mutations; and the mutagen for sunlight-related skin neoplasms may vary between continents. PMID:25354245
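The confirmed UV-signature criteria quoted in this abstract (at least 60% of mutations are C→T at a dipyrimidine site, with at least 5% CC→TT) translate directly into a classifier. The mutation records below are synthetic, for illustration only.

```python
def is_uv_signature(mutations):
    """Apply the confirmed UV-signature criteria from the meta-analysis:
    >= 60% of mutations are C->T at a dipyrimidine site, and >= 5% are CC->TT.

    `mutations` is a list of dicts with keys:
      'change'        e.g. 'C>T' or 'CC>TT'
      'dipyrimidine'  True if the site is a dipyrimidine
    """
    total = len(mutations)
    if total == 0:
        return False
    c_to_t_dipy = sum(
        1 for m in mutations if m["change"] == "C>T" and m["dipyrimidine"]
    )
    cc_to_tt = sum(1 for m in mutations if m["change"] == "CC>TT")
    return c_to_t_dipy / total >= 0.60 and cc_to_tt / total >= 0.05

# Synthetic dataset: 70 C->T at dipyrimidines, 6 CC->TT, 24 other mutations.
data = (
    [{"change": "C>T", "dipyrimidine": True}] * 70
    + [{"change": "CC>TT", "dipyrimidine": True}] * 6
    + [{"change": "A>G", "dipyrimidine": False}] * 24
)
```

The review's "most robust classifier" additionally requires that non-UV canonical mutations be rare; that extra criterion would add one more count-and-threshold test to the function above.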

  6. Online Degrees.

    ERIC Educational Resources Information Center

    Dolezalek, Holly

    2003-01-01

    Discusses the trend of trainers who are getting degrees through online courses delivered via the Internet. Addresses accreditation issues and what to ask before enrolling in online degree programs. (JOW)

  7. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  8. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to warn the public about extreme weather situations that might occur leading to damages and losses. In forecasting these extreme events, meteorological centres help their potential users in preventing the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that they happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are later applied to a real life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.

  9. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Denney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.

  10. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
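A code verification benchmark based on a manufactured solution, as recommended above, works like this: pick an exact solution, derive the forcing term analytically, solve on two grids, and confirm the scheme's theoretical convergence order. The minimal 1D Poisson example below is illustrative, not one of the paper's benchmarks.

```python
import math
import numpy as np

def solve_poisson(n):
    """Second-order central-difference solve of -u'' = f on (0,1), u(0)=u(1)=0,
    with the manufactured solution u(x) = sin(pi x), hence f = pi^2 sin(pi x).
    Returns the max-norm error against the manufactured solution."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1 - h, n)  # interior grid points
    f = math.pi**2 * np.sin(math.pi * x)
    A = (np.diag(np.full(n, 2.0))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(math.pi * x)))

# n = 31 and n = 63 interior points halve the mesh spacing exactly.
e1, e2 = solve_poisson(31), solve_poisson(63)
observed_order = math.log2(e1 / e2)  # should be close to 2 for this scheme
```

An observed order matching the scheme's theoretical order is the pass criterion; a mismatch indicates a coding error, which is exactly the kind of defect code verification benchmarks are designed to expose.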

  11. TFE verification program

    NASA Astrophysics Data System (ADS)

    1994-01-01

    This is the final semiannual progress report for the Thermionic Fuel Element (TFE) verification program. A decision was made in August 1993 to begin a Close Out Program on October 1, 1993. Final reports summarizing the design analyses and test activities of the TFE Verification Program will be written as stand-alone documents for each task. The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein includes evaluated test data, design evaluations, the results of analyses, and the significance of those results.

  12. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented, along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  13. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  14. Invisibly Sanitizable Signature without Pairings

    NASA Astrophysics Data System (ADS)

    Yum, Dae Hyun; Lee, Pil Joong

Sanitizable signatures allow sanitizers to delete some pre-determined parts of a signed document without invalidating the signature. While ordinary sanitizable signatures allow verifiers to know how many subdocuments have been sanitized, invisibly sanitizable signatures leave no clue to the sanitized subdocuments; verifiers do not know whether or not sanitizing has been performed. The previous invisibly sanitizable signature scheme was constructed from an aggregate signature with pairings. In this article, we present the first invisibly sanitizable signature scheme that does not use pairings. Our proposed scheme is secure under the RSA assumption.

  15. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  16. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  17. Context Effects in Sentence Verification.

    ERIC Educational Resources Information Center

    Kiger, John I.; Glass, Arnold L.

    1981-01-01

Three experiments examined what happens to the reaction time to verify easy items when they are mixed with difficult items in a verification task. Subjects' verification of simple arithmetic equations and sentences took longer when these were placed in a difficult list. Difficult sentences also slowed the verification of easy arithmetic equations. (Author/RD)

  18. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  19. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

Video processing creates technical animation sequences using studio quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer systems Co-op Tim Weatherford performing computer graphics verification. Part of Co-op brochure.

  20. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No, this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  1. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  2. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. A first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples, and investigating them for signs of past and present life with exobiological experiments, as well as investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  3. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

A summary of a ten-week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects was also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focuses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  4. Practical quantum digital signature

    NASA Astrophysics Data System (ADS)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

Guaranteeing nonrepudiation, unforgeability, and transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. However, to date, the previously proposed QDS protocols are impractical due to various challenging problems, most importantly the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km under current mature technology as used in quantum key distribution.

  5. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
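The Monte Carlo idea described above can be sketched in a few lines. This is illustrative only: the multiplicative error band, the percentile-based signature, and the synthetic flow record below are assumptions, not the authors' implementation.

```python
import random
import statistics

def q95_signature(flows):
    """Example hydrological signature: the 95th-percentile (high) flow."""
    s = sorted(flows)
    return s[int(0.95 * (len(s) - 1))]

def monte_carlo_signature(flows, rel_error=0.1, n_samples=1000, seed=42):
    """Propagate a rating-curve-style multiplicative flow error into the signature.

    Each realization perturbs every flow value by a factor drawn uniformly
    from [1 - rel_error, 1 + rel_error], then recomputes the signature.
    """
    rng = random.Random(seed)
    samples = []
    for _ in range(n_samples):
        perturbed = [q * (1 + rng.uniform(-rel_error, rel_error)) for q in flows]
        samples.append(q95_signature(perturbed))
    return statistics.median(samples), statistics.stdev(samples)

daily_flows = [1.0 + 0.5 * (day % 20) for day in range(365)]  # synthetic record
median_sig, spread = monte_carlo_signature(daily_flows)
```

The spread of the sampled signature values is the data-driven uncertainty estimate that the study compares across signatures and catchments.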

  6. Current signature sensor

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Inventor); Lucena, Angel (Inventor); Ihlefeld, Curtis (Inventor); Burns, Bradley (Inventor); Bassignani, Karin E. (Inventor)

    2005-01-01

A solenoid health monitoring system uses a signal conditioner and controller assembly that, in one embodiment, includes analog circuitry and a DSP controller. The analog circuitry provides signal conditioning to the low-level raw signal coming from a signal acquisition assembly. Software running in the DSP analyzes the incoming data (the recorded current signature) and determines the state of the solenoid: whether it is energized, de-energized, or transitioning. In one embodiment, the software identifies key features in the current signature during the transition phase and is able to determine the health of the solenoid.
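As a toy illustration of the state-determination step, a steady current near the energized level can be distinguished from a steady current near zero and from a signature that is still changing. The thresholds, window length, and rule below are hypothetical, not taken from the patent's DSP algorithm.

```python
def solenoid_state(current_samples, on_level=0.8, off_level=0.05, tol=0.1):
    """Classify solenoid state from the tail of a recorded current signature.

    Hypothetical rule: a steady current near on_level means energized, a
    steady current near off_level means de-energized, and anything still
    changing (or between the two levels) is treated as transitioning.
    """
    recent = current_samples[-5:]            # latest samples in the signature
    mean = sum(recent) / len(recent)
    if max(recent) - min(recent) > tol:      # still settling
        return "transitioning"
    if abs(mean - on_level) <= tol:
        return "energized"
    if abs(mean - off_level) <= tol:
        return "de-energized"
    return "transitioning"

states = [solenoid_state(sig) for sig in (
    [0.05, 0.05, 0.05, 0.05, 0.05],          # quiescent
    [0.8, 0.8, 0.8, 0.8, 0.8],               # holding current
    [0.1, 0.3, 0.5, 0.7, 0.8],               # pull-in transient
)]
```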

  7. Current Signature Sensor

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Inventor); Lucena, Angel (Inventor); Ihlefeld, Curtis (Inventor); Burns, Bradley (Inventor); Bassignani, Mario (Inventor); Bassignani, Karin E. (Inventor)

    2005-01-01

    A solenoid health monitoring system uses a signal conditioner and controller assembly in one embodiment that includes analog circuitry and a DSP controller. The analog circuitry provides signal conditioning to the low-level raw signal coming from a signal acquisition assembly. Software running in a DSP analyzes the incoming data (recorded current signature) and determines the state of the solenoid whether it is energized, de-energized, or in a transitioning state. In one embodiment, the software identifies key features in the current signature during the transition phase and is able to determine the health of the solenoid.

  8. Estimating the pen trajectories of static signatures using hidden Markov models.

    PubMed

    Nel, Emli-Mari; du Preez, Johan A; Herbst, B M

    2005-11-01

    Static signatures originate as handwritten images on documents and by definition do not contain any dynamic information. This lack of information makes static signature verification systems significantly less reliable than their dynamic counterparts. This study involves extracting dynamic information from static images, specifically the pen trajectory while the signature was created. We assume that a dynamic version of the static image is available (typically obtained during an earlier registration process). We then derive a hidden Markov model from the static image and match it to the dynamic version of the image. This match results in the estimated pen trajectory of the static image. PMID:16285373
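The core of matching a dynamic sequence against a hidden Markov model is Viterbi decoding. Below is a minimal sketch with toy pen-down/pen-up states standing in for the paper's image-derived model; all states and probabilities are invented for illustration.

```python
def viterbi(obs, states, start_p, trans_p, emit_p):
    """Most likely hidden-state sequence for an observation sequence."""
    V = [{s: start_p[s] * emit_p[s][obs[0]] for s in states}]
    back = [{}]
    for t in range(1, len(obs)):
        V.append({})
        back.append({})
        for s in states:
            # best predecessor state for s at time t
            prob, prev = max((V[t - 1][p] * trans_p[p][s] * emit_p[s][obs[t]], p)
                             for p in states)
            V[t][s] = prob
            back[t][s] = prev
    path = [max(V[-1], key=V[-1].get)]       # best final state, then backtrack
    for t in range(len(obs) - 1, 0, -1):
        path.append(back[t][path[-1]])
    return list(reversed(path))

# Toy model: is the pen on the paper ("down") or lifted ("up")?
states = ["down", "up"]
start_p = {"down": 0.5, "up": 0.5}
trans_p = {"down": {"down": 0.7, "up": 0.3}, "up": {"down": 0.3, "up": 0.7}}
emit_p = {"down": {"ink": 0.9, "gap": 0.1}, "up": {"ink": 0.1, "gap": 0.9}}
path = viterbi(["ink", "ink", "gap"], states, start_p, trans_p, emit_p)
```

The decoded state path is the analogue of the estimated pen trajectory: the hidden sequence that best explains the observed evidence.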

  9. Fingerprint verification on medical image reporting system.

    PubMed

    Chen, Yen-Cheng; Chen, Liang-Kuang; Tsai, Ming-Dar; Chiu, Hou-Chang; Chiu, Jainn-Shiun; Chong, Chee-Fah

    2008-03-01

The healthcare industry is currently going through extensive changes through the adoption of robust, interoperable healthcare information technology in the form of electronic medical records (EMR). A major concern with EMR, however, is adequate confidentiality of the individual records being managed electronically. Multiple access points over an open network like the Internet increase the possibility of patient data interception. The obligation is on healthcare providers to procure information security solutions that do not hamper patient care while still providing the confidentiality of patient information. Medical images are also part of the EMR and need to be protected from unauthorized users. This study integrates the techniques of fingerprint verification, DICOM objects, digital signatures and digital envelopes to ensure that access to the hospital Picture Archiving and Communication System (PACS) or radiology information system (RIS) is restricted to certified parties. PMID:18178287

  10. A Signature Style

    ERIC Educational Resources Information Center

    Smiles, Robin V.

    2005-01-01

This article discusses Dr. Amalia Amaki and her approach to art, turning everyday items into fine art as her signature style. Amaki is an assistant professor of art, art history, and Black American studies at the University of Delaware. She loves taking an unexpected object and redefining it in the context of art--like a button, a fan, a faded…

  11. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.
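The step-by-step workflow amounts to comparing each code result against the independent spreadsheet value within a tolerance. A minimal sketch of that check; the quantity names, values, and tolerance below are placeholders, not figures from the report:

```python
def verify_against_hand_calc(code_value, hand_value, rel_tol=1e-3):
    """Flag any code result deviating from the hand calculation beyond rel_tol."""
    if hand_value == 0:
        return abs(code_value) <= rel_tol
    return abs(code_value - hand_value) / abs(hand_value) <= rel_tol

# (code result, independent spreadsheet result) per verified quantity
checks = {
    "source_injection_rate": (1.2340e-3, 1.2338e-3),   # agrees within tolerance
    "room_air_concentration": (5.60e-7, 5.61e-7),      # deviates by ~0.18%
}
results = {name: verify_against_hand_calc(c, h) for name, (c, h) in checks.items()}
```

Any quantity flagged False would then be investigated as a possible code error, exactly the kind of discrepancy that led to the corrections noted above.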

  12. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high quality results for the well-behaved cases, relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
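One ingredient of this kind of analysis can be sketched directly: the observed order of convergence from solutions on three mesh levels, with the median taken over several error models so that one anomalous model does not dominate. The solution values below are invented for illustration; the full constrained-optimization framework is not reproduced here.

```python
import math
import statistics

def observed_order(f_coarse, f_medium, f_fine, refinement=2.0):
    """Observed order of convergence from solutions on three mesh levels."""
    return (math.log(abs(f_coarse - f_medium) / abs(f_medium - f_fine))
            / math.log(refinement))

# Solution triples from several error models (e.g. different analysis
# assumptions); the last triple is deliberately anomalous.
triples = [(1.125, 1.031, 1.008),
           (1.130, 1.030, 1.007),
           (1.500, 1.020, 1.006)]
orders = [observed_order(*t) for t in triples]
robust_estimate = statistics.median(orders)   # median resists the outlier
```

The mean of these orders would be dragged far from 2 by the anomalous triple; the median stays near the well-behaved estimates.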

  13. TFE verification program

    NASA Astrophysics Data System (ADS)

    1990-03-01

The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a Thermionic Fuel Element (TFE) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-88; and (5) Thermionic Technology Program in 1986 and 1987.

  14. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) AEC/NASA program of the 1960s and early 1970s; (2) SP-100 concept development program; (3) SP-100 thermionic technology program; (4) Thermionic irradiations program in TRIGA in FY-86; and (5) Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  15. Implementation framework for digital signatures for electronic data interchange in healthcare.

    PubMed

    De Moor, Georges; Claerhout, Brecht; De Meyer, Filip

    2004-01-01

This paper aims to propose an action plan for the deployment of digital signatures in Belgian healthcare. This action plan is the result of a number of technical, legal and organisational requirements. It starts by establishing the functional components that are needed to set up a framework for the deployment of digital signatures. The main components should implement an infrastructure for: --the creation of digital signatures; --the verification of digital signatures; --the certification of signature keys; --the certification of attributes; --the handling of revocation. The tasks in the action plan are the logical consequence of all the functions that need to be addressed. The objective of this report is to list what has to be done and how it can be done in the context of healthcare, rather than to state who will perform the functions required. PMID:15853257

  16. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

    The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification. PMID:15484907
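A minimal sketch of computing such statistical signatures from a subband of coefficients follows; the threshold and the exact definitions below are illustrative assumptions, not the paper's formulas.

```python
def subband_signatures(coeffs, threshold=0.5):
    """Statistical signatures of a wavelet subband (illustrative definitions).

    Selects the predominant (above-threshold) coefficients, then returns
    their energy, their density within the subband, and their gravity
    center (mean row/column position).
    """
    rows, cols = len(coeffs), len(coeffs[0])
    selected = [(i, j, c) for i, row in enumerate(coeffs)
                for j, c in enumerate(row) if abs(c) > threshold]
    energy = sum(c * c for _, _, c in selected)
    density = len(selected) / (rows * cols)
    if selected:
        center = (sum(i for i, _, _ in selected) / len(selected),
                  sum(j for _, j, _ in selected) / len(selected))
    else:
        center = None
    return {"energy": energy, "density": density, "gravity_center": center}

# Tiny 2x2 subband; in practice the coefficients come from a wavelet
# transform of the palmprint image.
sig = subband_signatures([[0.1, 2.0],
                          [0.2, 1.0]])
```

A palmprint would then be represented by a short vector of such numbers per subband, which is why this approach needs far less data than point- or line-matching schemes.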

  17. Online Pricing.

    ERIC Educational Resources Information Center

    Garman, Nancy; And Others

    1990-01-01

    The first of four articles describes the move by the European Space Agency to eliminate connect time charges on its online retrieval system. The remaining articles describe the pricing structure of DIALOG, compare the two pricing schemes, and discuss online pricing from the user's point of view. (CLB)

  18. Web-based interrogation of gene expression signatures using EXALT

    PubMed Central

    2009-01-01

Background Widespread use of high-throughput techniques such as microarrays to monitor gene expression levels has resulted in an explosive growth of data sets in public domains. Integration and exploration of these complex and heterogeneous data have become a major challenge. Results The EXALT (EXpression signature AnaLysis Tool) online program enables meta-analysis of gene expression profiles derived from publicly accessible sources. Searches can be executed online against two large databases currently containing more than 28,000 gene expression signatures derived from GEO (Gene Expression Omnibus) and published expression profiles of human cancer. Comparisons among gene expression signatures can be performed with homology analysis and co-expression analysis. Results can be visualized instantly in a plot or a heat map. Three typical use cases are illustrated. Conclusions The EXALT online program is uniquely suited for discovering relationships among transcriptional profiles and searching gene expression patterns derived from diverse physiological and pathological settings. The EXALT online program is freely available for non-commercial users from http://seq.mc.vanderbilt.edu/exalt/. PMID:20003458

  19. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225
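The idea that trust must decay between verifications can be sketched with a simple time-attenuated fusion of modality scores; the exponential decay, half-life, and max-fusion rule below are illustrative assumptions, not the paper's fusion model.

```python
import math

def fused_trust(observations, now, half_life=30.0):
    """Time-decayed trust from multimodal verification scores (a sketch).

    observations: (timestamp, score) pairs from any modality (face,
    fingerprint, ...), score in [0, 1]. Each score is attenuated by its
    age with the given half-life, and the best attenuated evidence wins;
    trust therefore falls unless the user keeps re-verifying, which is
    the essential difference from one-shot verification.
    """
    decay = math.log(2) / half_life
    return max((s * math.exp(-decay * (now - t)) for t, s in observations),
               default=0.0)

fresh = fused_trust([(0.0, 0.9)], now=0.0)     # just verified
stale = fused_trust([(0.0, 0.9)], now=30.0)    # one half-life later
```

A system built this way would lock the session once fused trust falls below a threshold, forcing re-verification by any available modality.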

  20. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  1. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  2. Identification of host response signatures of infection.

    SciTech Connect

    Branda, Steven S.; Sinha, Anupama; Bent, Zachary

    2013-02-01

Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficiently and reliably distinguishing infected from healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged, engineered), and readily accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones composed of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sophisticated sample preparation pipelines, and developed new SGS data analysis tools and strategies, in order to pioneer use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). This work provides a strong foundation for

  3. Wake Signature Detection

    NASA Astrophysics Data System (ADS)

    Spedding, Geoffrey R.

    2014-01-01

    An accumulated body of quantitative evidence shows that bluff-body wakes in stably stratified environments have an unusual degree of coherence and organization, so characteristic geometries such as arrays of alternating-signed vortices have very long lifetimes, as measured in units of buoyancy timescales, or in the downstream distance scaled by a body length. The combination of pattern geometry and persistence renders the detection of these wakes possible in principle. It now appears that identifiable signatures can be found from many disparate sources: Islands, fish, and plankton all have been noted to generate features that can be detected by climate modelers, hopeful navigators in open oceans, or hungry predators. The various types of wakes are reviewed with notes on why their signatures are important and to whom. A general theory of wake pattern formation is lacking and would have to span many orders of magnitude in Reynolds number.

  4. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis; Mahadevan, Karthikeyan

    2011-01-25

    A recursive verification protocol that reduces the time variance due to network delays, by placing the subject node at most one hop from the verifier node, provides an efficient way to test wireless sensor nodes. Since the software signatures are time based, recursive testing gives a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, which in turn checks its neighbor, and the process continues until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, software verification downstream of it is halted until an alternative path (one not including the failed node) is found. Using techniques well known in the art, testing a node twice, or not at all, can be avoided.
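
    The hop-by-hop scheme described above can be sketched in a few lines. This is an illustrative reconstruction, not the patented implementation; the topology, node names, and the `check_signature` callback are assumptions for the example.

```python
def verify_network(topology, verifier, check_signature):
    """Breadth-first, one-hop-at-a-time verification starting at `verifier`.

    Each verified node checks its unverified neighbors, so every individual
    check spans at most one hop. A node is tested at most once. In this
    simplified sketch a failed node simply blocks verification of nodes
    reachable only through it (the patent instead searches for an alternative
    path around the failure).
    """
    verified, failed = set(), set()
    frontier = [verifier]
    seen = {verifier}
    while frontier:
        node = frontier.pop(0)
        if check_signature(node):
            verified.add(node)
            for neighbor in topology.get(node, []):
                if neighbor not in seen:   # avoid testing a node twice
                    seen.add(neighbor)
                    frontier.append(neighbor)
        else:
            failed.add(node)               # halt verification past this node
    return verified, failed

# Toy network: A -- B -- C, with D reachable only through C.
topology = {"A": ["B"], "B": ["A", "C"], "C": ["B", "D"], "D": ["C"]}
verified, failed = verify_network(topology, "A", lambda n: n != "C")
print(verified, failed)                    # C fails, so D is never reached
```
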

  5. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values generated by LHS are distributed according to the desired distribution types. Distribution correctness is tested by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
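
    One such check can be sketched end to end: generate a one-dimensional Latin Hypercube sample of a uniform distribution and test it with a one-sample Kolmogorov-Smirnov statistic. This is an illustrative re-implementation, not Sandia's LHS code; the critical value uses the standard large-sample approximation.

```python
import math
import random

def lhs_uniform(n, rng):
    """1-D Latin Hypercube sample of U(0, 1): one draw per equal stratum."""
    samples = [(i + rng.random()) / n for i in range(n)]
    rng.shuffle(samples)
    return samples

def ks_statistic_uniform(samples):
    """Max distance between the empirical CDF and the U(0, 1) CDF."""
    xs = sorted(samples)
    n = len(xs)
    return max(max((i + 1) / n - x, x - i / n) for i, x in enumerate(xs))

rng = random.Random(42)
n = 1000
d = ks_statistic_uniform(lhs_uniform(n, rng))
critical = 1.36 / math.sqrt(n)   # ~5% significance, large-sample approximation
print(f"D = {d:.4f}, 5% critical value = {critical:.4f}")
assert d < critical              # stratified samples pass easily: D <= 1/n
```

    Because each stratum contributes exactly one point, the KS statistic of an LHS sample is bounded by 1/n, far below the critical value; an ordinary random sample would pass too, but with a much larger D.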

  6. Infrasonic signature of the 2009 major sudden stratospheric warming

    NASA Astrophysics Data System (ADS)

    Evers, L. G.; Siegmund, P.

    2009-12-01

    The study of infrasound is experiencing a renaissance since it was chosen as a verification technique for the Comprehensive Nuclear-Test-Ban Treaty. The success of the verification technique strongly depends on knowledge of upper atmospheric processes. The ability of infrasound to probe the upper atmosphere is beginning to be exploited, taking the field beyond its monitoring application. Processes in the stratosphere couple to the troposphere and influence our daily weather and climate. Infrasound delivers actual observations on the state of the stratosphere with high spatial and temporal resolution. Here we show the passively obtained infrasonic signature of a drastic change in the stratosphere due to the major sudden stratospheric warming (SSW) of January 2009. With this study, we demonstrate the enormous capacity of infrasound for acoustic remote sensing of stratospheric processes on a global scale with surface-based instruments.

  7. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  8. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the OpenSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  9. HDL to verification logic translator

    NASA Astrophysics Data System (ADS)

    Gambles, J. W.; Windley, P. J.

    The increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  10. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  11. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasing number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  12. Data requirements for verification of ram glow chemistry

    NASA Technical Reports Server (NTRS)

    Swenson, G. R.; Mende, S. B.

    1985-01-01

    A set of questions is posed regarding the surface chemistry producing the ram glow on the space shuttle. The questions surround verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of the measurements required for most answers, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV through the visible to the IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the shuttle Orbiter cabin is also discussed.

  13. Signatures of nonthermal melting.

    PubMed

    Zier, Tobias; Zijlstra, Eeuwe S; Kalitsov, Alan; Theodonis, Ioannis; Garcia, Martin E

    2015-09-01

    Intense ultrashort laser pulses can melt crystals in less than a picosecond but, in spite of over thirty years of active research, for many materials it is not known to what extent thermal and nonthermal microscopic processes cause this ultrafast phenomenon. Here, we perform ab-initio molecular-dynamics simulations of silicon on a laser-excited potential-energy surface, exclusively revealing nonthermal signatures of laser-induced melting. From our simulated atomic trajectories, we compute the decay of five structure factors and the time-dependent structure function. We demonstrate how these quantities provide criteria to distinguish predominantly nonthermal from thermal melting. PMID:26798822

  14. Signature CERN-URSS

    ScienceCinema

    None

    2011-04-25

    Director-General W. Jentschke welcomes the assembly and the guests to the signing of the protocol between CERN and the USSR, an important event. It was in 1955 that 55 Soviet visitors came to CERN for the first time. CERN's first Director-General, F. Bloch, and Mr. Amaldi are also present. While W. Jentschke's speech, in English, is translated into Russian, Mr. Morozov's speech, in Russian, is translated into English.

  15. Signatures of nonthermal melting

    PubMed Central

    Zier, Tobias; Zijlstra, Eeuwe S.; Kalitsov, Alan; Theodonis, Ioannis; Garcia, Martin E.

    2015-01-01

    Intense ultrashort laser pulses can melt crystals in less than a picosecond but, in spite of over thirty years of active research, for many materials it is not known to what extent thermal and nonthermal microscopic processes cause this ultrafast phenomenon. Here, we perform ab-initio molecular-dynamics simulations of silicon on a laser-excited potential-energy surface, exclusively revealing nonthermal signatures of laser-induced melting. From our simulated atomic trajectories, we compute the decay of five structure factors and the time-dependent structure function. We demonstrate how these quantities provide criteria to distinguish predominantly nonthermal from thermal melting. PMID:26798822

  16. Advanced spectral signature discrimination algorithm

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Cao, Wenjie; Samat, Alim

    2013-05-01

    This paper presents a novel approach to hyperspectral signature analysis. Hyperspectral signature analysis has been studied extensively in the literature, and many different algorithms have been developed to discriminate between hyperspectral signatures. Binary coding approaches such as SPAM and SFBC use basic statistical thresholding operations to binarize a signature, and the resulting codes are compared using Hamming distance. This framework has been extended in techniques such as SDFC, wherein a set of primitive structures is used to characterize local variations in a signature together with overall statistical measures such as the mean. Such structures harness only local variations and do not exploit any covariation between spectrally distinct parts of the signature. The approach of this research is to harvest that information using a technique similar to circular convolution. We treat the signature as cyclic by joining its two ends, then create two copies of the spectral signature. These three signatures can be placed next to each other like the rotating discs of a combination lock, and local structures are found at different circular shifts between the three cyclic spectral signatures. Texture features as in SDFC can be used to study the local structural variation at each circular shift. Different measures can then be created by building histograms over the shifts and applying different techniques for information extraction from the histograms; depending on the technique used, different variants of the proposed algorithm are obtained. Experiments using the proposed technique show the viability of the proposed methods and their performance compared to current binary signature coding techniques.
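
    The baseline the paper compares against can be sketched directly: mean-threshold binarization (in the spirit of SPAM) compared via Hamming distance, plus a minimum-over-circular-shifts variant loosely in the spirit of the cyclic approach. Function names, signatures, and values are illustrative assumptions, not the paper's code.

```python
def binarize(signature):
    """Threshold each band at the signature mean (SPAM-style binary code)."""
    mean = sum(signature) / len(signature)
    return [1 if band >= mean else 0 for band in signature]

def hamming(code_a, code_b):
    """Number of positions where two equal-length binary codes differ."""
    return sum(a != b for a, b in zip(code_a, code_b))

def min_cyclic_hamming(sig_a, sig_b):
    """Smallest Hamming distance over all circular shifts of the second code,
    treating each signature as cyclic as in the approach described above."""
    a, b = binarize(sig_a), binarize(sig_b)
    n = len(b)
    return min(hamming(a, b[k:] + b[:k]) for k in range(n))

# Two toy 6-band spectral signatures.
grass = [0.1, 0.2, 0.6, 0.8, 0.7, 0.3]
soil  = [0.4, 0.5, 0.5, 0.6, 0.6, 0.5]
print(hamming(binarize(grass), binarize(soil)))   # → 1
```
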

  17. Clustering signatures classify directed networks

    NASA Astrophysics Data System (ADS)

    Ahnert, S. E.; Fink, T. M. A.

    2008-09-01

    We use a clustering signature, based on a recently introduced generalization of the clustering coefficient to directed networks, to analyze 16 directed real-world networks of five different types: social networks, genetic transcription networks, word adjacency networks, food webs, and electric circuits. We show that these five classes of networks are cleanly separated in the space of clustering signatures due to the statistical properties of their local neighborhoods, demonstrating the usefulness of clustering signatures as a classifier of directed networks.
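
    The idea of fingerprinting a directed network by its local triangle structure can be illustrated with a much cruder census than the paper's generalized clustering coefficient: counting the two directed triangle types (3-cycles vs. feed-forward loops) and normalizing. The function and variable names below are assumptions for the sketch.

```python
from itertools import combinations, permutations

def triad_signature(edges, nodes):
    """Return (cycle fraction, feed-forward fraction) over directed triangles.

    Brute-force O(n^3) census; fine for small illustrative graphs.
    """
    adj = set(edges)
    cycles = ffls = 0
    for triple in combinations(nodes, 3):
        for i, j, k in permutations(triple):
            if (i, j) in adj and (j, k) in adj and (k, i) in adj:
                cycles += 1   # directed 3-cycle; matched by its 3 rotations
            if (i, j) in adj and (j, k) in adj and (i, k) in adj:
                ffls += 1     # feed-forward loop; matched exactly once
    cycles //= 3              # undo the triple-counting of each cycle
    total = cycles + ffls
    return (cycles / total, ffls / total) if total else (0.0, 0.0)

# A feed-forward loop (common in transcription networks) vs. a 3-cycle.
ffl_edges = [("x", "y"), ("y", "z"), ("x", "z")]
cycle_edges = [("x", "y"), ("y", "z"), ("z", "x")]
print(triad_signature(ffl_edges, "xyz"))     # → (0.0, 1.0)
print(triad_signature(cycle_edges, "xyz"))   # → (1.0, 0.0)
```

    Networks of different types concentrate in different regions of such signature spaces, which is the intuition behind using clustering signatures as a classifier.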

  18. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.

  19. Multimodal signature modeling of humans

    NASA Astrophysics Data System (ADS)

    Cathcart, J. Michael; Kocher, Brian; Prussing, Keith; Lane, Sarah; Thomas, Alan

    2010-04-01

    Georgia Tech has been investigating methods for the detection of covert personnel in traditionally difficult environments (e.g., urban, caves). This program focuses on a detailed phenomenological analysis of human physiology and signatures, with the subsequent identification and characterization of potential observables. Both aspects are needed to support the development of personnel detection and tracking algorithms. The difficult nature of these personnel-related problems dictates a multimodal sensing approach. Human signature data of sufficient quality and quantity do not exist, so an accurate signature model for a human is needed. This model should also simulate various human activities, allowing motion-based observables to be exploited. This paper describes a multimodal signature modeling approach that incorporates human physiological aspects, thermoregulation, and dynamics into the signature calculation, permitting both passive and active signatures to be modeled. The focus of the current effort was the computation of signatures in urban environments. The paper discusses the development of a human motion model for use in simulating both electro-optical and radar-based signatures. Video sequences of humans in a simulated urban environment are also presented, along with results of using these sequences for personnel tracking.

  20. Online Monitoring of Induction Motors

    SciTech Connect

    McJunkin, Timothy R.; Agarwal, Vivek; Lybeck, Nancy Jean

    2016-01-01

    The online monitoring of active components project, under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program, researched diagnostic and prognostic models for alternating-current induction motors (IMs). Idaho National Laboratory (INL) worked with the Electric Power Research Institute (EPRI) to augment and revise the fault signatures previously implemented in the Asset Fault Signature Database of EPRI's Fleet-Wide Prognostic and Health Management (FW PHM) Suite software. Induction motor diagnostic models were researched using the experimental data collected by Idaho State University. Prognostic models were explored in the literature and through a limited experiment with a 40 HP motor, to seed the Remaining Useful Life Database of the FW PHM Suite.

  1. 77 FR 28572 - Notice of Submission for OMB Review; Federal Student Aid; Loan Verification Certificate for...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-05-15

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF EDUCATION Notice of Submission for OMB Review; Federal Student Aid; Loan Verification Certificate for Special... which the U.S. Department of Education (the Department) collects certain information from...

  2. 78 FR 52085 - VA Veteran-Owned Small Business Verification Guidelines

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-22

    ... (77 FR 38181) an interim final rule that revised the requirement for re-verification of SDVOSB/VOSB... rule amending 38 CFR part 74, which was published on June 27, 2012, at 77 FR 38181, is adopted without... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF...

  3. 76 FR 20536 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 75 RIN 2060-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing Correction In rule document 2011-6216 appearing on pages 17288-17325 in...

  4. 75 FR 77959 - VetBiz Vendor Information Pages Verification Program; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-14

    ...-461-7485. Correction In FR Doc. 2010-30550, published on December 7, 2010, at 75 FR 76080, make the... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF VETERANS AFFAIRS VetBiz Vendor Information Pages Verification Program; Correction AGENCY: Center for...

  5. 76 FR 60004 - Proposed Information Collection; Comment Request; Data Collection and Verification for the Marine...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... Collection and Verification for the Marine Protected Areas Inventory AGENCY: National Oceanic and Atmospheric... well as other stakeholders, to identify and inventory the nation's existing MPAs. Toward this end, the... Areas Inventory, an online spatial database that provides detailed information on MPAs nationwide....

  6. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. Neural signatures of autism

    PubMed Central

    Kaiser, Martha D.; Hudac, Caitlin M.; Shultz, Sarah; Lee, Su Mei; Cheung, Celeste; Berken, Allison M.; Deen, Ben; Pitskel, Naomi B.; Sugrue, Daniel R.; Voos, Avery C.; Saulnier, Celine A.; Ventola, Pamela; Wolf, Julie M.; Klin, Ami; Vander Wyk, Brent C.; Pelphrey, Kevin A.

    2010-01-01

    Functional magnetic resonance imaging of brain responses to biological motion in children with autism spectrum disorder (ASD), unaffected siblings (US) of children with ASD, and typically developing (TD) children has revealed three types of neural signatures: (i) state activity, related to the state of having ASD that characterizes the nature of disruption in brain circuitry; (ii) trait activity, reflecting shared areas of dysfunction in US and children with ASD, thereby providing a promising neuroendophenotype to facilitate efforts to bridge genomic complexity and disorder heterogeneity; and (iii) compensatory activity, unique to US, suggesting a neural system–level mechanism by which US might compensate for an increased genetic risk for developing ASD. The distinct brain responses to biological motion exhibited by TD children and US are striking given the identical behavioral profile of these two groups. These findings offer far-reaching implications for our understanding of the neural systems underlying autism. PMID:21078973

  12. Signatures of aging revisited

    SciTech Connect

    Drell, S.; Jeanloz, R.; Cornwall, J.; Dyson, F.; Eardley, D.

    1998-03-18

    This study is a follow-on to the review made by JASON during its 1997 Summer Study of what is known about the aging of critical constituents, particularly the high explosives, metals (Pu, U), and polymers in the enduring stockpile. The JASON report (JSR-97-320) that summarized the findings was based on briefings by the three weapons labs (LANL, LLNL, SNL). They presented excellent technical analyses covering a broad range of scientific and engineering problems pertaining to determining signatures of aging. But the report also noted: "Missing, however, from the briefings and the written documents made available to us by the labs and DOE, was evidence of an adequately sharp focus and high priorities on a number of essential near-term needs of maintaining weapons in the stockpile."

  13. Electromagnetic Signature Technique as a Promising Tool to Verify Nuclear Weapons Storage and Dismantlement under a Nuclear Arms Control Regime

    SciTech Connect

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.; Ramuhalli, Pradeep

    2012-08-01

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.

  14. Multisensors signature prediction workbench

    NASA Astrophysics Data System (ADS)

    Latger, Jean; Cathala, Thierry

    2015-10-01

    Guidance of weapon systems relies on sensors to analyze target signatures; defense weapon systems also need to detect and then identify threats, again using sensors. Sensor performance is highly dependent on conditions, e.g., time of day, atmospheric propagation, and background. Visible cameras are very efficient in fine diurnal weather conditions, long-wave infrared sensors for night vision, and radar systems for seeing through the atmosphere and/or foliage. Moreover, multi-sensor systems, combining several collocated sensors with associated fusion algorithms, provide better efficiency (typically for Enhanced Vision Systems), but these sophisticated systems are all the more difficult to conceive, assess, and qualify. In that frame, multi-sensor simulation is highly desirable. This paper focuses on multi-sensor simulation tools. A first part surveys the state of the art of such simulation workbenches, with a special focus on SE-Workbench, which is described with regard to infrared/EO sensors, millimeter-wave sensors, active EO sensors, and GNSS sensors. A general overview of the objectives of simulating target and background signatures is then presented, depending on the type of simulation required (parametric studies, open-loop simulation, closed-loop simulation, hybridization of SW simulation and HW, etc.). After this review, the paper presents some basic requirements for simulation implementation, such as deterministic behavior, which is mandatory for repeating a simulation many times in parametric studies. Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU), and the tradeoff between physical accuracy and computational performance. Examples of results using SE-Workbench are shown and commented on.

  15. Signatures of dark matter

    NASA Astrophysics Data System (ADS)

    Baltz, Edward Anthony

    It is well known that most of the mass in the universe remains unobserved save for its gravitational effect on luminous matter. The nature of this "dark matter" remains a mystery. From measurements of the primordial deuterium abundance, the theory of big bang nucleosynthesis predicts that there are not enough baryons to account for the amount of dark matter observed, thus the missing mass must take an exotic form. Several promising candidates have been proposed. In this work I will describe my research along two main lines of inquiry into the dark matter puzzle. The first possibility is that the dark matter is exotic massive particles, such as those predicted by supersymmetric extensions to the standard model of particle physics. Such particles are generically called WIMPs, for weakly interacting massive particles. Focusing on the so-called neutralino in supersymmetric models, I discuss the possible signatures of such particles, including their direct detection via nuclear recoil experiments and their indirect detection via annihilations in the halos of galaxies, producing high energy antiprotons, positrons and gamma rays. I also discuss signatures of the possible slow decays of such particles. The second possibility is that there is a population of black holes formed in the early universe. Any dark objects in galactic halos, black holes included, are called MACHOs, for massive compact halo objects. Such objects can be detected by their gravitational microlensing effects. Several possibilities for sources of baryonic dark matter are also interesting for gravitational microlensing. These include brown dwarf stars and old, cool white dwarf stars. I discuss the theory of gravitational microlensing, focusing on the technique of pixel microlensing. I make predictions for several planned microlensing experiments with ground based and space based telescopes. Furthermore, I discuss binary lenses in the context of pixel microlensing. Finally, I develop a new technique for

  16. Signatures of AGN feedback

    NASA Astrophysics Data System (ADS)

    Wylezalek, D.; Zakamska, N.

    2016-06-01

    Feedback from active galactic nuclei (AGN) is widely considered to be the main driver in regulating the growth of massive galaxies. It operates by either heating or driving the gas that would otherwise be available for star formation out of the galaxy, preventing further increase in stellar mass. Observational proof for this scenario has, however, been hard to come by. We have assembled a large sample of 133 radio-quiet type-2 and red AGN at 0.1signatures are hosted in galaxies that are more `quenched' considering their stellar mass than galaxies with weaker outflow signatures. This correlation is only seen in AGN host galaxies with SFR >100 M_{⊙} yr^{-1} where presumably the coupling of the AGN-driven wind to the gas is strongest. This observation is consistent with the AGN having a net suppression, or `negative' impact, through feedback on the galaxies' star formation history.

  17. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS (Test Program Set) verification, or first-article acceptance testing, commonly depends on fault-insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and injection is almost infeasible when the UUT is still in development or is distributed. To resolve this problem, a TPS verification method based on UUT interface signal simulation is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is essential to realizing automatic TPS verification. After analyzing the ATS software architecture, an approach to realizing interoperability between the ATS software and the UUT simulation platform is proposed, and the UUT simulation platform software architecture is then derived from the ATS software architecture. The hardware composition and software architecture of the UUT simulation platform are described in detail. The platform has been applied to avionics equipment TPS development, debugging, and verification.

  18. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
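
    The matching step described in this abstract (correlating an input biometric with a stored template and scoring the sharpness of the correlation peak) can be sketched as below. The peak-to-sidelobe ratio is one common figure of merit for correlation-filter verification; the array sizes, noise model, and exclusion window here are illustrative assumptions, not the authors' synthetic discriminant function filter design:

```python
import numpy as np

def correlation_output(template, query):
    # Correlation filters operate in the Fourier plane: multiply the
    # conjugate template spectrum by the query spectrum, then invert.
    T = np.fft.fft2(template)
    Q = np.fft.fft2(query)
    return np.real(np.fft.ifft2(np.conj(T) * Q))

def peak_to_sidelobe_ratio(corr, exclude=2):
    # PSR: peak height relative to the sidelobe statistics; authentic
    # inputs give a sharp, high peak, impostors give a flat plane.
    r, c = np.unravel_index(np.argmax(corr), corr.shape)
    peak = corr[r, c]
    mask = np.ones_like(corr, dtype=bool)
    mask[max(r - exclude, 0):r + exclude + 1,
         max(c - exclude, 0):c + exclude + 1] = False
    sidelobe = corr[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

rng = np.random.default_rng(0)
enrolled = rng.standard_normal((32, 32))            # stored template
genuine = enrolled + 0.1 * rng.standard_normal((32, 32))
impostor = rng.standard_normal((32, 32))
psr_genuine = peak_to_sidelobe_ratio(correlation_output(enrolled, genuine))
psr_impostor = peak_to_sidelobe_ratio(correlation_output(enrolled, impostor))
```

    Verification then reduces to thresholding the PSR.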

  19. Biometric verification with correlation filters

    NASA Astrophysics Data System (ADS)

    Vijaya Kumar, B. V. K.; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-01

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification.

  20. Being Online

    ERIC Educational Resources Information Center

    Hale, Sharon Joy Ng

    2007-01-01

    Online education is particularly well suited to the needs of community college students. Although community colleges have lower per-unit fees than four-year colleges and universities, many community college students still experience economic hardship. Even with fee waivers, students may have problems finding the money for textbooks,…

  1. Online 1990.

    ERIC Educational Resources Information Center

    Goldstein, Morris

    This paper examines the co-existence of online and CD-ROM technologies in terms of their existing pricing structures, marketing strategies, functionality, and future roles. "Fixed Price Unlimited Usage" (FPUU) pricing and flat-rate pricing are discussed as viable alternatives to current pricing practices. In addition, it is argued that the…

  2. Online inspection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An online line-scan imaging system capable of both hyperspectral and multispectral visible/near-infrared reflectance imaging was developed to inspect freshly slaughtered chickens on a processing line for wholesomeness. In-plant testing results indicated that the imaging inspection system achieved o...

  3. Online Learning

    ERIC Educational Resources Information Center

    Perry, Edward H.; Pilati, Michelle L.

    2011-01-01

    Distance education, which began as correspondence courses in the nineteenth century and grew into educational television during the twentieth century, evolved into learning on the Web by the mid-1990s. Accompanying the rise in online learning has been a similar rise in organizations and publications dedicated to serving the needs of online…

  4. Off-line signature recognition based on dynamic methods

    NASA Astrophysics Data System (ADS)

    Igarza, Juan J.; Hernaez, Inmaculada; Goirizelaia, Inaki; Espinosa, Koldo; Escolar, Jon

    2005-03-01

    In this paper we present the work developed on off-line signature verification as a continuation of a previous work using Left-to-Right Hidden Markov Models (LR-HMM) in order to extend those models to the field of static or off-line signature processing using results provided by image connectivity analysis. The chain encoding of perimeter points for each blob obtained by this analysis is an ordered set of points in the space, clockwise around the perimeter of the blob. Two models are generated depending on the way the blobs obtained from the connectivity analysis are ordered. In the first one, blobs are ordered according to their perimeter length. In the second proposal, blobs are ordered in their natural reading order, i.e. from the top to the bottom and left to right. Finally, two LR-HMM models are trained using the (x,y) coordinates of the chain codes obtained by the two mentioned techniques and a set of geometrical local features obtained from them such as polar coordinates referred to the center of ink, local radii, segment lengths and local tangent angle. Verification results of the two techniques are compared over a biometrical database containing skilled forgeries.
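
    The local geometric features named above (polar coordinates about the center of ink, local radii, segment lengths, and local tangent angle) can be computed from an ordered perimeter chain as in this sketch. The contour is a toy example; blob extraction and HMM training are omitted:

```python
import numpy as np

def perimeter_features(points):
    # `points` is an ordered (clockwise) list of (x, y) perimeter
    # coordinates of one blob, as produced by chain encoding.
    pts = np.asarray(points, dtype=float)
    center = pts.mean(axis=0)                    # center of ink
    rel = pts - center
    radii = np.hypot(rel[:, 0], rel[:, 1])       # local radii
    angles = np.arctan2(rel[:, 1], rel[:, 0])    # polar angle about center
    seg = np.diff(pts, axis=0, append=pts[:1])   # close the contour
    seg_len = np.hypot(seg[:, 0], seg[:, 1])     # segment lengths
    tangent = np.arctan2(seg[:, 1], seg[:, 0])   # local tangent angle
    return radii, angles, seg_len, tangent

# A unit square contour as a minimal example.
square = [(0, 0), (0, 1), (1, 1), (1, 0)]
radii, angles, seg_len, tangent = perimeter_features(square)
```

    Sequences of such features, one vector per perimeter point, are the kind of observation streams a left-to-right HMM can be trained on.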

  5. A Blind Quantum Signature Scheme with χ-type Entangled States

    NASA Astrophysics Data System (ADS)

    Yin, Xun-Ru; Ma, Wen-Ping; Liu, Wei-Yan

    2012-02-01

    A blind quantum signature scheme with χ-type entangled states is proposed, which can be applied to an e-voting system. In this scheme, the particles in the χ-type state sequence are used first for quantum key distribution and then for quantum signature. Our scheme is characterized by its blindness, impossibility of forgery, and impossibility of disavowal. In addition, the scheme can run an audit of the validity of the verification process in light of actual requirements. The security of the scheme is also analyzed.

  6. On the signature of LINCOS

    NASA Astrophysics Data System (ADS)

    Ollongren, Alexander

    2010-12-01

    Suppose the international SETI effort yields the discovery of some signal of evidently non-natural origin. Could it contain linguistic information formulated in some kind of Lingua Cosmica? One way to gain insight into this matter is to consider what specific (bio)linguistic signature(s) could be attached to a cosmic language for interstellar communication, designed by humans or by an alien society having reached a level of intelligence (and technology) comparable to or surpassing ours. For this purpose, we consider in the present paper the logico-linguistic system LINCOS for (A)CETI, developed over a number of years by the author in several papers and a monograph [1]. The system has a two-fold signature, which distinguishes it significantly from natural languages: abstract and concrete signatures can be distinguished. The abstract kind arises from the manner in which abstractions of reality are represented in LINCOS texts; these can take compound forms because the system is multi-expressive, partly due to the availability of inductive (recursive) entities. The concrete signature of LINCOS, on the other hand, is related to the distribution of delimiters and predefined tokens in texts. Assigning measures to concrete signatures will be discussed elsewhere; the present contribution concentrates on the abstract signature of the language. At the same time, it is realized that an alien Lingua Cosmica might, but need not, have this kind of signature.

  7. Infrasound Rocket Signatures

    NASA Astrophysics Data System (ADS)

    Olson, J.

    2012-09-01

    This presentation reviews the work performed by our research group at the Geophysical Institute as we have applied the tools of infrasound research to rocket studies. This report represents one aspect of the effort associated with work done for the National Consortium for MASINT Research (NCMR) program operated by the National MASINT Office (NMO) of the Defense Intelligence Agency (DIA). Infrasound, the study of acoustic signals and their propagation in a frequency band below 15 Hz, enables an investigator to collect and diagnose acoustic signals from distant sources. Absorption of acoustic energy in the atmosphere decreases as the frequency is reduced; in the infrasound band, signals can propagate hundreds and thousands of kilometers with little degradation. We will present an overview of signatures from rockets ranging from small sounding rockets such as the Black Brant and Orion series to larger rockets such as the Delta 2 and 4 and the Atlas V. Analysis of the ignition transients provides information that can uniquely identify the motor type. After the rocket ascends, infrasound signals can be used to characterize the rocket and identify the various events that take place along a trajectory, such as staging and maneuvering. We have also collected information on atmospheric shocks and sonic booms from the passage of supersonic vehicles such as the Shuttle. This review is intended to show the richness of the unique signal set that occurs in the low-frequency infrasound band.

  8. Statistical clumped isotope signatures

    PubMed Central

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
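
    The core effect follows from elementary probability: combining atoms from two pools with different heavy-isotope fractions always yields fewer doubly-substituted molecules than the stochastic reference computed from the bulk composition, since p1·p2 ≤ ((p1+p2)/2)². A minimal numerical illustration (the isotope fractions are invented):

```python
# Fraction of the heavy isotope in each of the two indistinguishable
# atom pools that combine into the molecule (illustrative values).
p1, p2 = 0.010, 0.020
bulk = (p1 + p2) / 2        # bulk composition measured on the mixture

# Probability of a doubly-substituted ("clumped") molecule:
actual = p1 * p2            # atoms drawn from the two distinct pools
stochastic = bulk ** 2      # conventional stochastic reference

# Clumped anomaly relative to the stochastic reference, in per mil.
# It is negative whenever p1 != p2: apparent anti-clumping.
delta = (actual / stochastic - 1) * 1000
```

    For these values delta is about -111 per mil, and its magnitude grows with the difference between the two pools' isotope ratios, which is what makes the anomaly a probe of pool heterogeneity.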

  9. UHECR: Signatures and models

    NASA Astrophysics Data System (ADS)

    Berezinsky, V.

    2013-06-01

    The signatures of Ultra High Energy (E ≳ 1 EeV) proton propagation through the CMB radiation are the pair-production dip and the GZK cutoff. The visible characteristics of these two spectral features are the ankle, which is an intrinsic part of the dip, the beginning of the GZK cutoff in the differential spectrum, and E1/2 in the integral spectrum. As measured by HiRes and the Telescope Array (TA), these characteristics agree with theoretical predictions. However, the directly measured mass composition remains a puzzle. While the HiRes and TA detectors observe a proton-dominated mass composition, the Auger data strongly evidence a nuclear mass composition becoming progressively heavier at energies above 4 EeV and reaching iron at about 35 EeV. Models based on the Auger and HiRes/TA data are considered independently and classified by the transition from galactic to extragalactic cosmic rays. The ankle cannot provide this transition, since the data of all three detectors at (1-3) EeV agree with a pure proton composition (or at least one no heavier than helium). If produced in the Galaxy, these particles would result in too high an anisotropy. This argument excludes or strongly disfavours all ankle models with ankle energy Ea > 3 EeV. Calculation of elongation curves, Xmax(E), for different ankle models further strengthens this conclusion. The status of the other models, the dip, mixed-composition, and Auger-based models, is discussed.

  10. Statistical clumped isotope signatures.

    PubMed

    Röckmann, T; Popa, M E; Krol, M C; Hofmann, M E G

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168

  11. Verification hybrid control of a wheeled mobile robot and manipulator

    NASA Astrophysics Data System (ADS)

    Muszynska, Magdalena; Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz

    2016-04-01

    In this article, innovative approaches to tracking control of wheeled mobile robots and a manipulator are presented. The concepts include applying neuro-fuzzy systems to compensate for the controlled system's nonlinearities in the tracking control task. The proposed control algorithms work on-line, contain structures that adapt to the changing working conditions of the controlled systems, and do not require preliminary learning. The algorithms were verified on real objects: a Scorbot-ER 4pc robotic manipulator and a Pioneer 2DX mobile robot.

  12. A proposed neutral line signature

    NASA Technical Reports Server (NTRS)

    Doxas, I.; Speiser, T. W.; Dusenbery, P. B.; Horton, W.

    1992-01-01

    An identifying signature is proposed for the existence and location of the neutral line in the magnetotail. The signature, abrupt density and temperature changes in the Earth-tail direction, was first discovered in test particle simulations. Such temperature variations have been observed in ISEE data (Huang et al. 1992), but their connection to the possible existence of a neutral line in the tail has not yet been established. The proposed signature develops earlier than the ion velocity space ridge of Martin and Speiser (1988), but can only be seen by spacecraft in the vicinity of the neutral line, whereas the latter can locate a neutral line remotely.

  13. Signature surveillance of nuclear fuel

    SciTech Connect

    Bernatowicz, H.; Schoenig, F.C.

    1982-08-31

    Typical nuclear fuel material contains tramp ferromagnetic particles of random size and distribution. Also, selected amounts of paramagnetic or ferromagnetic material can be added at random or at known positions in the fuel material. The fuel material in its nonmagnetic container can be scanned by magnetic susceptibility change detecting apparatus to provide a unique signal waveform of the container of fuel material as a signature thereof. At subsequent times in its life, the container similarly can be scanned to provide subsequent signatures. Comparison of the signatures reveals any alteration or tampering with the fuel material.

  14. Nondestructive verification with minimal movement of irradiated light-water-reactor fuel assemblies

    SciTech Connect

    Phillips, J.R.; Bosler, G.E.; Halbig, J.K.; Klosterbuer, S.F.; Menlove, H.O.

    1982-10-01

    Nondestructive verification of irradiated light-water reactor fuel assemblies can be performed rapidly and precisely by measuring their gross gamma-ray and neutron signatures. A portable system measured fuel assemblies with exposures ranging from 18.4 to 40.6 GWd/tU and with cooling times ranging from 1575 to 2638 days. Differences in the measured results for side or corner measurements are discussed. 25 figures, 20 tables.

  15. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum that demonstrates how a safety control is enacted; an example is relief valve testing. A soft safety verification is something usually described as nice to have but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examining the casings and nozzles for erosion or wear. Loss of the SRBs and the associated data did not delay the launch of the next Shuttle flight.

  16. Intrusion detection using secure signatures

    DOEpatents

    Nelson, Trent Darnel; Haile, Jedediah

    2014-09-30

    A method and device for intrusion detection using secure signatures, comprising capturing network data. A search hash value, employing at least one one-way function, is generated from the captured network data using a first hash function. The presence of a matching search hash value in a secure signature table comprising search hash values and encrypted rules is determined. After a search hash value match is found, a decryption key is generated from the captured network data using a second hash function, different from the first hash function. One or more encrypted rules of the secure signature table having a hash value equal to the generated search hash value are then decrypted using the generated decryption key. The decrypted secure signature rules are then processed for a match, and one or more user notifications are deployed if a match is identified.
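
    The claimed flow (capture, first hash, table lookup, second hash, rule decryption) can be sketched as below. The XOR cipher and the salt-prefixed SHA-256 functions are stand-ins chosen for brevity; the patent does not specify these primitives:

```python
import hashlib

def h1(data: bytes) -> bytes:
    # First one-way function: produces the search hash.
    return hashlib.sha256(b"search:" + data).digest()

def h2(data: bytes) -> bytes:
    # Second, different one-way function: produces the decryption key.
    return hashlib.sha256(b"key:" + data).digest()

def xor_bytes(data: bytes, key: bytes) -> bytes:
    # Stand-in symmetric cipher for this sketch; a real system would
    # use an authenticated cipher.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

# Build the secure signature table: each rule is stored encrypted under
# a key derived from the very traffic that should trigger it, so the
# rule text is unreadable without matching traffic.
trigger = b"GET /etc/passwd"
rule = b"alert: path traversal attempt"
table = {h1(trigger): xor_bytes(rule, h2(trigger))}

def inspect(packet: bytes):
    # capture -> search hash -> table lookup -> derive key -> decrypt rule
    entry = table.get(h1(packet))
    if entry is None:
        return None
    return xor_bytes(entry, h2(packet))
```

    Non-matching traffic yields no table hit and reveals nothing about the rules, which is the point of the construction.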

  17. Signature-based image identification

    NASA Astrophysics Data System (ADS)

    Abdel-Mottaleb, Mohamed; Vaithilingam, Gandhimathi; Krishnamachari, Santhana

    1999-11-01

    The use of digital images and video is growing on the Internet and on consumer devices. Digital images and video are easy to manipulate, but this ease of manipulation makes tampering with digital content possible. Examples of the misuse of digital content include violating copyrights of the content and tampering with important material such as contents of video surveillance. In this paper we present an algorithm that extracts a binary signature from an image. This approach can be used to search for possible copyright violations by finding images with signatures close to that of a given image. The experimental results show that the algorithm can be very effective in helping users to retrieve sets of almost identical images from large collections of images. The signature can also be used for tamper detection. We will show that the signatures we extract are immune to quantization errors that result from compression and decompression.
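
    A signature with the properties described (binary, compact, and tolerant of compression artifacts) can be illustrated with a simple block-mean hash compared by Hamming distance. This is a generic average-hash sketch under invented parameters, not the paper's algorithm:

```python
import numpy as np

def binary_signature(image: np.ndarray, grid=(8, 8)) -> np.ndarray:
    # Reduce the image to a coarse grid of block means, then threshold
    # each block against the global mean: one bit per block. Coarse
    # block statistics survive quantization from lossy compression,
    # which is what makes the signature immune to compress/decompress.
    h, w = image.shape
    gh, gw = grid
    blocks = image[:h - h % gh, :w - w % gw].reshape(gh, h // gh, gw, w // gw)
    means = blocks.mean(axis=(1, 3))
    return (means > means.mean()).astype(np.uint8).ravel()

def hamming(a, b):
    return int(np.count_nonzero(a != b))

rng = np.random.default_rng(1)
original = rng.random((64, 64))
quantized = np.round(original * 16) / 16   # crude stand-in for lossy compression
unrelated = rng.random((64, 64))
sig_o = binary_signature(original)
sig_q = binary_signature(quantized)
sig_u = binary_signature(unrelated)
```

    The quantized copy stays within a few bits of the original signature, while an unrelated image differs in roughly half of the bits, so a Hamming-distance threshold separates near-duplicates from distinct images.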

  18. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    SciTech Connect

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.; Kreyling, Sean J.; Henry, Michael J.; Corley, Courtney D.; Whattam, Kevin M.

    2013-07-11

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  19. Diagnostic marker signature for esophageal cancer from transcriptome analysis.

    PubMed

    Warnecke-Eberz, Ute; Metzger, Ralf; Hölscher, Arnulf H; Drebber, Uta; Bollschweiler, Elfriede

    2016-05-01

    Esophageal cancer is often diagnosed at an advanced stage; diagnostic markers are needed to detect and treat tumor cells earlier and achieve a cure. In patients with locally advanced squamous cell carcinoma of the esophagus (ESCC), we profiled the gene expression of ESCC against corresponding normal biopsies for diagnostic markers using genome microarrays. Expression profiling identified 4844 differentially expressed genes, 2122 upregulated and 2722 downregulated in ESCC. Twenty-three overexpressed candidates with the best scores from significance analysis were selected for further analysis by the TaqMan low-density array technique using a validation cohort of 40 patients. The verification rate was 100% for ESCC. Twenty-two markers were additionally overexpressed in adenocarcinoma of the esophagus (EAC). The markers significantly overexpressed already in earlier tumor stages (pT1-2) of both histological subtypes (n = 19) were clustered into a "diagnostic signature": PLA2G7, PRAME, MMP1, MMP3, MMP12, LILRB2, TREM2, CHST2, IGFBP2, IGFBP7, KCNJ8, EMILIN2, CTHRC1, EMR2, WDR72, LPCAT1, COL4A2, CCL4, and SNX10. The marker signature will be translated to clinical practice to prove its diagnostic impact. This diagnostic signature may contribute to earlier detection of tumor cells, complementing clinical techniques, with the aim of developing better detection concepts for esophageal cancer toward earlier therapy and a more favorable prognosis. PMID:26631031

  20. Improving multispectral mapping by spectral modeling with hyperspectral signatures

    NASA Astrophysics Data System (ADS)

    Kruse, Fred A.; Perry, Sandra L.

    2009-01-01

    Hyperspectral imaging (HSI) data in the 0.4 - 2.5 micrometer spectral range allow direct identification of materials using their spectral signatures; however, spatial coverage is limited. Multispectral imaging (MSI) data are spectrally undersampled and may not allow unique identification, but they do provide synoptic spatial coverage. We have developed an approach that uses coincident HSI/MSI data to extend mineral mapping to larger areas. Hyperspectral data are used to model and extend signatures to multispectral Advanced Spaceborne Thermal Emission and Reflection Radiometer (ASTER) data. Analysis consists of 1. atmospheric correction of both the hyperspectral and multispectral data, 2. analysis of the hyperspectral data to determine spectral endmembers and their spatial distributions, 3. spectral modeling to convert the hyperspectral signatures to the multispectral response, and 4. analysis of the MSI data to extend mapping to the larger spatial coverage of the multispectral data. Comparison over the overlapping area, with extensive field verification, shows that ASTER mineral mapping using these methods approaches 70% accuracy relative to HSI for selected minerals. Spot checking of the extended ASTER mapping results also shows good correspondence. While the examples shown are specific to ASTER Short Wave Infrared (SWIR) data, the approach could also be used for other multispectral sensors and spectral ranges.
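
    Step 3 of the workflow, converting a hyperspectral signature to the multispectral response, amounts to integrating the signature against each band's relative spectral response. A sketch with invented band centers, widths, and absorption feature (not the published ASTER band definitions):

```python
import numpy as np

# Dense hyperspectral wavelength grid (micrometers) and a toy mineral
# signature with an absorption feature near 2.2 um (illustrative only).
wl = np.linspace(0.4, 2.5, 211)
hsi_signature = 0.5 - 0.2 * np.exp(-((wl - 2.2) / 0.05) ** 2)

def band_response(center, fwhm):
    # Gaussian approximation of a multispectral band's relative
    # spectral response (real instruments publish tabulated responses),
    # normalized so the weighted sum acts as a weighted average.
    sigma = fwhm / 2.355
    r = np.exp(-0.5 * ((wl - center) / sigma) ** 2)
    return r / r.sum()

# Illustrative SWIR band centers and widths in micrometers.
bands = [(1.65, 0.10), (2.17, 0.04), (2.21, 0.04), (2.26, 0.05)]

# Project the hyperspectral signature onto each band response to get
# the signature as the multispectral sensor would see it.
msi_signature = np.array([(band_response(c, f) * hsi_signature).sum()
                          for c, f in bands])
```

    The band straddling the 2.2 um feature records a depressed value while the 1.65 um band stays near the continuum, which is the residual spectral contrast the MSI mapping step exploits.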

  1. Ballistic signature identification systems study

    NASA Technical Reports Server (NTRS)

    Reich, A.; Hine, T. L.

    1976-01-01

    The results are described of an attempt to establish a uniform procedure for documenting (recording) expended bullet signatures as effortlessly as possible and to build a comprehensive library of these signatures in a form that will permit the automated comparison of a new suspect bullet with the prestored library. The ultimate objective is to achieve a standardized format that will permit nationwide interaction between police departments, crime laboratories, and other interested law enforcement agencies.

  2. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  3. Space Telescope performance and verification

    NASA Technical Reports Server (NTRS)

    Wright, W. F.

    1980-01-01

    The verification philosophy for the Space Telescope (ST) has evolved from years of experience with multispacecraft programs modified by the new factors introduced by the Space Transportation System. At the systems level of test, the ST will undergo joint qualification/acceptance tests with environment simulation using Lockheed's large spacecraft test facilities. These tests continue the process of detecting workmanship defects and module interface incompatibilities. The test program culminates in an 'all up' ST environmental test verification program resulting in a 'ready to launch' ST.

  4. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  5. 1 CFR 18.7 - Signature.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 1 General Provisions 1 2010-01-01 2010-01-01 false Signature. 18.7 Section 18.7 General Provisions... PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.7 Signature. The original and each duplicate original... stamped beneath the signature. Initialed or impressed signatures will not be accepted. Documents...

  6. 1 CFR 18.7 - Signature.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 1 General Provisions 1 2011-01-01 2011-01-01 false Signature. 18.7 Section 18.7 General Provisions... PREPARATION AND TRANSMITTAL OF DOCUMENTS GENERALLY § 18.7 Signature. The original and each duplicate original... stamped beneath the signature. Initialed or impressed signatures will not be accepted. Documents...

  7. Quantum messages with signatures forgeable in arbitrated quantum signature schemes

    NASA Astrophysics Data System (ADS)

    Kim, Taewan; Choi, Jeong Woon; Jho, Nam-Su; Lee, Soojoon

    2015-02-01

    Even though no method to perfectly sign quantum messages is known, the arbitrated quantum signature scheme has been considered one of the good candidates. However, its forgery problem has been an obstacle to the scheme becoming a successful method. In this paper, we consider a situation slightly different from the forgery problem: we check whether at least one quantum message with signature can be forged in a given scheme, even if not all messages can be. If there are only a finite number of forgeable quantum messages in the scheme, then the scheme can be secured against the forgery attack by not sending forgeable quantum messages, so our situation does not directly imply a check of whether the scheme is secure against the attack. However, if users run a given scheme without any consideration of forgeable quantum messages, then a sender might transmit such messages to a receiver, and an attacker who knows them can forge them. Thus it is important and necessary to look into forgeable quantum messages. We show here that such a forgeable quantum message-signature pair always exists for every known scheme with quantum encryption and rotation, and we show numerically that no forgeable quantum message-signature pairs exist in an arbitrated quantum signature scheme.

  8. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
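The pseudo-predator construction this record describes can be sketched as follows. This is an illustrative outline only, not Bromaghin's algorithm for choosing bootstrap sample sizes; the function and variable names are hypothetical. Signatures are taken to be vectors of fatty acid proportions, and a pseudo-predator signature is formed by bootstrap-sampling each prey type's signatures and mixing the bootstrap means according to a known diet:

```python
import random

def pseudo_predator(prey_sigs, diet, n_boot, rng=None):
    """Construct one pseudo-predator fatty acid signature.

    prey_sigs: dict prey_type -> list of signatures (each a list of
               fatty acid proportions summing to 1)
    diet:      dict prey_type -> known diet proportion (sums to 1)
    n_boot:    bootstrap sample size per prey type
    """
    rng = rng or random.Random(0)
    n_fa = len(next(iter(prey_sigs.values()))[0])
    signature = [0.0] * n_fa
    for prey, proportion in diet.items():
        # Bootstrap: draw n_boot signatures with replacement, then average
        sample = [rng.choice(prey_sigs[prey]) for _ in range(n_boot)]
        mean = [sum(s[i] for s in sample) / n_boot for i in range(n_fa)]
        # Mix the prey-type means according to the known diet composition
        signature = [signature[i] + proportion * mean[i] for i in range(n_fa)]
    return signature
```

The known diet proportions are what make the resulting pseudo-predator useful for evaluating a QFASA estimator: the estimator's output can be compared against the diet that generated the signature.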

  9. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

The problem of functional verification of IP blocks implementing an RMAP protocol controller is considered. The use of a verification method based on fully functional models of the processor and the internal bus of a system-on-chip is justified. Principles for constructing a verification system based on this approach are proposed, and practical results of creating a verification system for an RMAP protocol controller IP block are presented.

  10. Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2014-06-01

Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on development of asset fault signatures to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features, based on technical examinations, that can be used to detect a specific fault type. At the most basic level, fault signatures are comprised of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
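The record describes a fault signature as a triple of asset type, fault type, and fault features. A minimal sketch of such a record is given below; the class and field names are hypothetical and do not reflect the actual FW-PHM database schema:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FaultFeature:
    name: str        # e.g. "stator winding temperature"
    behavior: str    # e.g. "trending high"

@dataclass
class AssetFaultSignature:
    asset_type: str  # e.g. "emergency diesel generator"
    fault_type: str  # e.g. "bearing degradation"
    features: list = field(default_factory=list)

    def matches(self, observed):
        """A fault is indicated when every feature (symptom) in the
        signature is present among the observed symptoms."""
        return all(f in observed for f in self.features)
```

A diagnostic advisor in this style would compare observed symptoms against every signature stored for the asset type and report the fault types whose signatures match.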

  11. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  12. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  13. Signature molecular descriptor : advanced applications.

    SciTech Connect

    Visco, Donald Patrick, Jr.

    2010-04-01

In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area, with serendipity as the mechanism for solution. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge into a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem as well as how to go from an output of the algorithm to an actual chemical structure. This report

  14. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels, 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  15. Visual Attention During Sentence Verification.

    ERIC Educational Resources Information Center

    Lucas, Peter A.

    Eye movement data were collected for 28 college students reading 32 sentences with sentence verification questions. The factors observed were target sentence voice (active/passive), probe voice, and correct response (true/false). Pairs of subjects received the same set of stimuli, but with agents and objects in the sentences reversed. As expected,…

  16. Improved method for coliform verification.

    PubMed

    Diehl, J D

    1991-02-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  17. Improved method for coliform verification.

    PubMed Central

    Diehl, J D

    1991-01-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  18. A scheme for symmetrization verification

    NASA Astrophysics Data System (ADS)

    Sancho, Pedro

    2011-08-01

    We propose a scheme for symmetrization verification in two-particle systems, based on one-particle detection and state determination. In contrast to previous proposals, it does not follow a Hong-Ou-Mandel-type approach. Moreover, the technique can be used to generate superposition states of single particles.

  19. VERIFICATION OF WATER QUALITY MODELS

    EPA Science Inventory

    The basic concepts of water quality models are reviewed and the need to recognize calibration and verification of models with observed data is stressed. Post auditing of models after environmental control procedures are implemented is necessary to determine true model prediction ...

  20. Online Learning Grows Up.

    ERIC Educational Resources Information Center

    Vail, Kathleen

    2001-01-01

    Describes various American efforts to develop online schools and classes. Discusses attributes of successful online teachers and students. Lists 17 online learning support companies and their Web sites. (PKP)

  1. Multidimensional signatures in antimicrobial peptides

    PubMed Central

    Yount, Nannette Y.; Yeaman, Michael R.

    2004-01-01

    Conventional analyses distinguish between antimicrobial peptides by differences in amino acid sequence. Yet structural paradigms common to broader classes of these molecules have not been established. The current analyses examined the potential conservation of structural themes in antimicrobial peptides from evolutionarily diverse organisms. Using proteomics, an antimicrobial peptide signature was discovered to integrate stereospecific sequence patterns and a hallmark three-dimensional motif. This striking multidimensional signature is conserved among disulfide-containing antimicrobial peptides spanning biological kingdoms, and it transcends motifs previously limited to defined peptide subclasses. Experimental data validating this model enabled the identification of previously unrecognized antimicrobial activity in peptides of known identity. The multidimensional signature model provides a unifying structural theme in broad classes of antimicrobial peptides, will facilitate discovery of antimicrobial peptides as yet unknown, and offers insights into the evolution of molecular determinants in these and related host defense effector molecules. PMID:15118082

  2. Graph Analytics for Signature Discovery

    SciTech Connect

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh; Lo, Chaomei

    2013-06-01

Within large amounts of seemingly unstructured data it can be difficult to find signatures of events. In our work we transform unstructured data into a graph representation. By doing this we expose underlying structure in the data and can take advantage of existing graph analytics capabilities, as well as develop new capabilities. Currently we focus on applications in the cybersecurity and communication domains. Within cybersecurity we aim to find signatures of perpetrators using the pass-the-hash attack, and in communications we look for emails or phone calls going up or down a chain of command. In both of these areas, and in many others, the signature we look for is a path with certain temporal properties. In this paper we discuss our methodology for finding these temporal paths within large graphs.
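A temporal path of the kind described (a sequence of edges whose timestamps increase hop to hop, as in a pass-the-hash chain or an escalating email thread) can be enumerated with a simple depth-first search. This is an illustrative sketch under assumed data shapes, not the authors' graph-analytics implementation:

```python
from collections import defaultdict

def temporal_paths(edges, src, dst, max_gap=None):
    """Enumerate temporal paths from src to dst.

    edges: iterable of (u, v, t) triples. Along a valid path the
    timestamps strictly increase, and max_gap (if given) bounds the
    time allowed between consecutive hops.
    """
    adj = defaultdict(list)
    for u, v, t in edges:
        adj[u].append((v, t))

    results = []

    def dfs(node, last_t, path):
        if node == dst and path:
            results.append(list(path))
        for nxt, t in adj[node]:
            if last_t is not None and t <= last_t:
                continue  # timestamps must strictly increase
            if max_gap is not None and last_t is not None and t - last_t > max_gap:
                continue  # hop arrived too long after the previous one
            path.append((node, nxt, t))
            dfs(nxt, t, path)
            path.pop()

    dfs(src, None, [])
    return results
```

Because timestamps must strictly increase, no edge can repeat along a path, so the search terminates even on cyclic graphs.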

  3. Signature Visualization of Software Binaries

    SciTech Connect

    Panas, T

    2008-07-01

In this paper we present work on the visualization of software binaries. In particular, we utilize ROSE, an open source compiler infrastructure, to pre-process software binaries, and we apply a landscape metaphor to visualize the signature of each binary (malware). We define the signature of a binary as a metric-based layout of the functions contained in the binary. In our initial experiment, we visualize the signatures of a series of computer worms that all originate from the same lineage. These visualizations are useful for a number of reasons. First, the images reveal how the archetype has evolved over a series of versions of one worm. Second, one can see the distinct changes between versions. This allows the viewer to form conclusions about the development cycle of a particular worm.

  4. catRAPID signature: identification of ribonucleoproteins and RNA-binding regions

    PubMed Central

    Livi, Carmen Maria; Klus, Petr; Delli Ponti, Riccardo; Tartaglia, Gian Gaetano

    2016-01-01

Motivation: Recent technological advances revealed that an unexpectedly large number of proteins interact with transcripts even if the RNA-binding domains are not annotated. We introduce catRAPID signature to identify ribonucleoproteins based on physico-chemical features instead of sequence similarity searches. The algorithm, trained on human proteins and tested on model organisms, calculates the overall RNA-binding propensity followed by the prediction of RNA-binding regions. catRAPID signature outperforms other algorithms in the identification of RNA-binding proteins and detection of non-classical RNA-binding regions. Results are visualized on a webpage and can be downloaded or forwarded to catRAPID omics for predictions of RNA targets. Availability and implementation: catRAPID signature can be accessed at http://s.tartaglialab.com/new_submission/signature. Contact: gian.tartaglia@crg.es or gian@tartaglialab.com. Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26520853

  5. Hybrid Enrichment Assay Methods for a UF6 Cylinder Verification Station: FY10 Progress Report

    SciTech Connect

    Smith, Leon E.; Jordan, David V.; Orton, Christopher R.; Misner, Alex C.; Mace, Emily K.

    2010-08-01

Pacific Northwest National Laboratory (PNNL) is developing the concept of an automated UF6 cylinder verification station that would be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until the arrival of International Atomic Energy Agency (IAEA) inspectors. At the center of this unattended system is a hybrid enrichment assay technique that combines the traditional enrichment-meter method (based on the 186 keV peak from 235U) with non-traditional neutron-induced high-energy gamma-ray signatures (spawned primarily by 234U alpha emissions and 19F(alpha, neutron) reactions). Previous work by PNNL provided proof-of-principle for the non-traditional signatures to support accurate, full-volume interrogation of the cylinder enrichment, thereby reducing the systematic uncertainties in enrichment assay due to UF6 heterogeneity and providing greater sensitivity to material substitution scenarios. The work described here builds on that preliminary evaluation of the non-traditional signatures, but focuses on a prototype field system utilizing NaI(Tl) and LaBr3(Ce) spectrometers, and enrichment analysis algorithms that integrate the traditional and non-traditional signatures. Results for the assay of Type-30B cylinders ranging from 0.2 to 4.95 wt% 235U, at an AREVA fuel fabrication plant in Richland, WA, are described for the following enrichment analysis methods: 1) traditional enrichment meter signature (186 keV peak) as calculated using a square-wave convolute (SWC) algorithm; 2) non-traditional high-energy gamma-ray signature that provides neutron detection without neutron detectors and 3) hybrid algorithm that merges the traditional and non-traditional signatures. Uncertainties for each method, relative to the declared enrichment for each cylinder, are calculated and compared to the uncertainties from an attended
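The enrichment-meter step above extracts the net area of the 186 keV peak with a square-wave convolute. A minimal sketch of such a zero-area filter is given below; it is illustrative only, not PNNL's SWC algorithm, and the window widths are arbitrary choices:

```python
def swc_net_response(counts, half_width):
    """Square-wave convolute of a spectrum.

    The filter weights +1 over a central window of 2*half_width
    channels and -1 over two flanking windows of half_width channels
    each, so the filter has zero total area: any constant or linear
    background cancels, leaving only the net peak contribution.
    """
    w = half_width
    out = [0.0] * len(counts)
    for i in range(2 * w, len(counts) - 2 * w):
        center = sum(counts[i - w:i + w])
        flanks = sum(counts[i - 2 * w:i - w]) + sum(counts[i + w:i + 2 * w])
        out[i] = center - flanks
    return out
```

On a spectrum with a linear continuum, the response is zero everywhere except near a peak, where it approximates the net (background-subtracted) peak counts.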

  6. Ballistic Signature Identification System Study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The first phase of a research project directed toward development of a high speed automatic process to be used to match gun barrel signatures imparted to fired bullets was documented. An optical projection technique has been devised to produce and photograph a planar image of the entire signature, and the phototransparency produced is subjected to analysis using digital Fourier transform techniques. The success of this approach appears to be limited primarily by the accuracy of the photographic step since no significant processing limitations have been encountered.

  7. Topological Signatures for Population Admixture

    Technology Transfer Automated Retrieval System (TEKTRAN)

Topological Signatures for Population Admixture. Deniz Yorukoglu, Filippo Utro, David Kuhn, Saugata Basu and Laxmi Parida. Abstract. Background: As populations with multi-linear transmission (i.e., mixing of genetic material from two parents, say) evolve over generations, the genetic transmission...

  8. Invisibly Sanitizable Digital Signature Scheme

    NASA Astrophysics Data System (ADS)

    Miyazaki, Kunihiko; Hanaoka, Goichiro; Imai, Hideki

    A digital signature does not allow any alteration of the document to which it is attached. Appropriate alteration of some signed documents, however, should be allowed because there are security requirements other than the integrity of the document. In the disclosure of official information, for example, sensitive information such as personal information or national secrets is masked when an official document is sanitized so that its nonsensitive information can be disclosed when it is requested by a citizen. If this disclosure is done digitally by using the current digital signature schemes, the citizen cannot verify the disclosed information because it has been altered to prevent the leakage of sensitive information. The confidentiality of official information is thus incompatible with the integrity of that information, and this is called the digital document sanitizing problem. Conventional solutions such as content extraction signatures and digitally signed document sanitizing schemes with disclosure condition control can either let the sanitizer assign disclosure conditions or hide the number of sanitized portions. The digitally signed document sanitizing scheme we propose here is based on the aggregate signature derived from bilinear maps and can do both. Moreover, the proposed scheme can sanitize a signed document invisibly, that is, no one can distinguish whether the signed document has been sanitized or not.
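The proposed scheme itself rests on an aggregate signature derived from bilinear maps. As a much simpler illustration of the basic sanitizing idea only (without the invisibility property the paper achieves, and with hypothetical names throughout), one can sign per-block hashes so that a sensitive block may later be replaced by its hash without breaking verification; here an HMAC stands in for a real digital signature:

```python
import hashlib
import hmac

KEY = b"demo-signing-key"  # stand-in for a real private signing key

def _h(block: bytes) -> bytes:
    return hashlib.sha256(block).digest()

def sign(blocks):
    """Sign the concatenated per-block hashes, not the blocks themselves."""
    return hmac.new(KEY, b"".join(_h(b) for b in blocks),
                    hashlib.sha256).hexdigest()

def sanitize(blocks, index):
    """Replace one sensitive block by its hash; the signature still verifies."""
    out = list(blocks)
    out[index] = ("SANITIZED", _h(blocks[index]))
    return out

def verify(blocks, signature):
    """Recompute hashes for plaintext blocks, reuse them for sanitized ones."""
    hashes = [b[1] if isinstance(b, tuple) else _h(b) for b in blocks]
    expected = hmac.new(KEY, b"".join(hashes), hashlib.sha256).hexdigest()
    return hmac.compare_digest(expected, signature)
```

Note that in this toy version the sanitized portions are explicitly marked, so anyone can count them; hiding the number of sanitized portions, and hiding whether sanitizing occurred at all, is precisely what the paper's bilinear-map construction adds.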

  9. Disaster relief through composite signatures

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.; Hyde, Brian; Carpenter, Tom; Nichols, Steve

    2012-06-01

A composite signature is a group of signatures that are related in such a way as to more completely or further define a target or operational endeavor at a higher fidelity. This paper builds on previous work developing innovative composite signatures associated with civil disasters, including physical, chemical, and pattern/behavioral signatures. For the composite signature approach to be successful, it requires effective data fusion and visualization, which play a key role in both preparedness and in the response and recovery efforts critical to saving lives. Visualization tools enhance the overall understanding of the crisis by pulling together and analyzing the data and providing a clear and complete analysis of the information to the organizations and agencies dependent on it for a successful operation. An example of this, Freedom Web, is an easy-to-use data visualization and collaboration solution for use in homeland security, emergency preparedness, situational awareness, and event management. The solution provides a nationwide common operating picture for all levels of government through a web-based map interface. The tool was designed to be utilized by non-geospatial experts and is easily tailored to the specific needs of the users. Built from standard COTS and open-source databases and a web server, it lets users view, edit, share, and highlight information easily and quickly through a standard internet browser.

  10. Online Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Meyer Jordan, Bradley, IV; The, Lih-Sin; Robbins, Stuart

    2004-05-01

Nuclear-reaction network codes are important to astronomers seeking to explore nucleosynthetic implications of astrophysical models and to nuclear physicists seeking to understand the role of nuclear properties or reaction rates in element formation. However, many users do not have the time or inclination to download and compile the codes, to manage the requisite input files, or to explore the often complex output with their own graphics programs. To help make nucleosynthesis calculations more readily available, we have placed the Clemson Nucleosynthesis code on the world-wide web at http://www.ces.clemson.edu/physics/nucleo/nuclearNetwork. At this web site, any Internet user may set his or her own reaction network, nuclear properties and reaction rates, and thermodynamic trajectories. The user then submits the nucleosynthesis calculation, which runs on a dedicated server professionally maintained at Clemson University. Once the calculation is completed, the user may explore the results through dynamically produced and downloadable tables and graphs. Online help guides the user through the necessary steps. We hope this web site will prove a user-friendly and helpful tool for professional scientists as well as for students seeking to explore element formation.

  11. COS Internal NUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM2 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS14 {program 11474 - COS NUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS NUV ERO observations and NUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each NUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  12. COS Internal FUV Wavelength Verification

    NASA Astrophysics Data System (ADS)

    Keyes, Charles

    2009-07-01

    This program will be executed after the uplink of the OSM1 position updates derived from the determination of the wavelength-scale zero points and desired spectral ranges for each grating in activity COS29 {program 11487 - COS FUV Internal/External Wavelength Scales}. This program will verify that the operational spectral ranges for each grating, central wavelength, and FP-POS are those desired. Subsequent to a successful verification, COS FUV ERO observations that require accurate wavelength scales {if any} and FUV science can be enabled. An internal wavelength calibration spectrum using the default PtNe lamp {lamp 1} with each FUV grating at each central wavelength setting and each FP-POS position will be obtained for the verification. Additional exposures and waits between certain exposures will be required to avoid - and to evaluate - mechanism drifts.

  13. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  14. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  15. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  16. University Student Online Plagiarism

    ERIC Educational Resources Information Center

    Wang, Yu-mei

    2008-01-01

    This article reports a study investigating university student online plagiarism. The following questions are investigated: (a) What is the incidence of student online plagiarism? (b) What are student perceptions regarding online plagiarism? (c) Are there any differences in terms of student perceptions of online plagiarism and print plagiarism? (d)…

  17. Strategies for Online Educators

    ERIC Educational Resources Information Center

    Motte, Kristy

    2013-01-01

    For a variety of reasons, online education is an increasingly viable option for many students seeking to further their education. Because of this, the demand for online instructors continues to increase. Instructors transitioning to the online environment from the traditional classroom may find teaching online overwhelming. While some practices…

  18. Online Organic Chemistry

    ERIC Educational Resources Information Center

    Janowicz, Philip A.

    2010-01-01

    This is a comprehensive study of the many facets of an entirely online organic chemistry course. Online homework with structure-drawing capabilities was found to be more effective than written homework. Online lecture was found to be just as effective as in-person lecture, and students prefer an online lecture format with shorter Webcasts. Online…

  19. Online Search Optimization.

    ERIC Educational Resources Information Center

    Homan, Michael; Worley, Penny

    This course syllabus describes methods for optimizing online searching, using as an example searching on the National Library of Medicine (NLM) online system. Four major activities considered are the online interview, query analysis and search planning, online interaction, and post-search analysis. Within the context of these activities, concepts…

  20. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels, 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  1. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs, which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards, including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  2. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce the analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related programs.
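The equivalence-checking problem described above can be illustrated with a bounded, concrete check. This is a toy stand-in for the paper's approach (the function name `check_equivalence` and the exhaustive enumeration are illustrative assumptions; the actual technique uses symbolic execution to summarize only the *impacted* behaviors and discharges them to a decision procedure):

```python
from itertools import product

def check_equivalence(f, g, domains):
    """Bounded exhaustive equivalence check of two program versions
    over small finite input domains. Returns (True, None) if f and g
    agree everywhere, or (False, counterexample) at the first input
    where their behaviors diverge."""
    for args in product(*domains):
        if f(*args) != g(*args):
            return False, args  # impacted behavior: versions disagree
    return True, None

# Two syntactically different but semantically equivalent versions.
ok, cex = check_equivalence(lambda x: x + x, lambda x: 2 * x, [range(-5, 6)])
```

In the paper's setting, exhaustive enumeration is replaced by symbolic execution up to a depth bound, which is why the soundness and completeness guarantees are stated relative to that bound.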

  3. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
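The abstract describes a fault signature as a structured representation of the information an expert would use to detect and verify a specific fault type. A minimal sketch of such a record, assuming a hypothetical schema (the actual AFS Database schema is defined by the FW-PHM Suite and is not given in the abstract):

```python
from dataclasses import dataclass, field

@dataclass
class FaultSignature:
    """Hypothetical, minimal fault signature record: an asset type,
    a fault type, and the observable symptoms that indicate it."""
    asset_type: str                  # e.g. "generator step-up transformer"
    fault_type: str                  # e.g. "winding overheating"
    symptoms: list = field(default_factory=list)

    def matches(self, observed):
        """Fraction of catalogued symptoms present in the observations,
        a crude stand-in for a diagnostic advisor's matching logic."""
        if not self.symptoms:
            return 0.0
        hits = sum(1 for s in self.symptoms if s in observed)
        return hits / len(self.symptoms)

sig = FaultSignature(
    "generator step-up transformer", "winding overheating",
    ["high dissolved gas", "oil temperature rise"],
)
```

A diagnostic module could then rank catalogued signatures by `matches()` against live sensor observations, which is the kind of standardized, automatable lookup the abstract motivates.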

  4. CAMPR3: a database on sequences, structures and signatures of antimicrobial peptides

    PubMed Central

    Waghu, Faiza Hanif; Barai, Ram Shankar; Gurung, Pratima; Idicula-Thomas, Susan

    2016-01-01

    Antimicrobial peptides (AMPs) are known to have family-specific sequence composition, which can be mined for discovery and design of AMPs. Here, we present CAMPR3, an update to the existing CAMP database available online at www.camp3.bicnirrh.res.in. It is a database of sequences, structures and family-specific signatures of prokaryotic and eukaryotic AMPs. Family-specific sequence signatures comprising patterns and Hidden Markov Models were generated for 45 AMP families by analysing 1386 experimentally studied AMPs. These were further used to retrieve AMPs from online sequence databases. More than 4000 AMPs could be identified using these signatures. AMP family signatures provided in CAMPR3 can thus be used to accelerate and expand the discovery of AMPs. CAMPR3 presently holds 10247 sequences, 757 structures and 114 family-specific signatures of AMPs. Users can avail the sequence optimization algorithm for rational design of AMPs. The database, integrated with tools for AMP sequence and structure analysis, will be a valuable resource for family-based studies on AMPs. PMID:26467475
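Sequence-pattern signatures of the kind the abstract describes can be scanned with ordinary regular expressions. The motif below is entirely hypothetical, for illustration only (real CAMP family signatures are curated patterns and Hidden Markov Models, not this regex):

```python
import re

# Hypothetical PROSITE-style motif: a cysteine-spacing pattern
# (illustrative only; not an actual CAMP family signature).
SIGNATURE = re.compile(r"C.{2,4}C.{3}[GASV].C")

def scan(sequence):
    """Return the (start, end) spans where the family signature
    matches a protein sequence (one-letter amino acid codes)."""
    return [m.span() for m in SIGNATURE.finditer(sequence)]
```

Scanning a large sequence database with such a pattern is the "retrieve AMPs from online sequence databases" step in miniature; HMM-based signatures generalize this by scoring matches probabilistically rather than yes/no.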

  5. Attitudes toward buying online.

    PubMed

    Yang, Bijou; Lester, David

    2004-02-01

    A survey of 11 positive features and 10 discouraging features of online shopping was carried out on 180 students and identified certain behavioral patterns for online shoppers versus non-shoppers. It was found that online shoppers have consistently stronger positive feelings about online shopping than do non-shoppers. On the other hand, non-shoppers have more negative feelings about online shopping than do shoppers, but not consistently so. Online shoppers are aware of some of the discouraging features of online shopping, but these features do not deter them from shopping online. The implication for marketers is that they should focus on making the experience of online shopping more accommodating and more user-friendly since the positive features of online shopping ("convenience" and "efficiency") appear to be more important than the negative features ("effort/impersonality"). PMID:15006173

  6. Block truncation signature coding for hyperspectral analysis

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Chang, Chein-I.

    2008-08-01

    This paper introduces a new signature coding which is designed based on the well-known Block Truncation Coding (BTC). It comprises bit-maps of the signature blocks generated by different threshold criteria. Two new BTC-based algorithms are developed for signature coding, to be called Block Truncation Signature Coding (BTSC) and 2-level BTSC (2BTSC). In order to compare the developed BTC-based algorithms with current binary signature coding schemes such as Spectral Program Analysis Manager (SPAM) developed by Mazer et al. and Spectral Feature-based Binary Coding (SFBC) by Qian et al., three different thresholding functions, local block mean, local block gradient, and local block correlation, are derived to improve the BTSC performance, where the combined bit-maps generated by these thresholds can provide better spectral signature characterization. Experimental results reveal that the new BTC-based signature coding performs more effectively in characterizing spectral variations than currently available binary signature coding methods.
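The core idea of block-truncation-style signature coding, thresholding each block of a spectral signature against a local statistic to produce a bit-map, can be sketched as follows. This shows only the local-block-mean threshold; it is an illustrative sketch, not the paper's BTSC/2BTSC algorithms, which also derive gradient and correlation thresholds and combine the resulting bit-maps:

```python
import numpy as np

def btsc_bitmap(signature, block_size=4):
    """Encode a spectral signature as a binary bit-map: within each
    block, bands above the local block mean map to 1, others to 0."""
    sig = np.asarray(signature, dtype=float)
    n = len(sig) - len(sig) % block_size           # drop any ragged tail
    blocks = sig[:n].reshape(-1, block_size)
    means = blocks.mean(axis=1, keepdims=True)     # local block mean threshold
    return (blocks > means).astype(np.uint8).ravel()

def hamming_distance(a, b):
    """Compare two binary signature codes bit by bit; small distance
    means spectrally similar signatures under this coding."""
    return int(np.count_nonzero(a != b))
```

Because the code is binary, signature matching reduces to cheap bitwise comparisons, which is the same motivation behind the SPAM and SFBC schemes the paper compares against.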

  7. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'.

  8. Microcalibrator system for chemical signature and reagent delivery.

    SciTech Connect

    Staton, Alan W.; Simonson, Robert Joseph; Adkins, Douglas Ray; Rawlinson, Kim Scott; Robinson, Alex Lockwood; Hance, Bradley G.; Manginell, Ronald Paul; Sanchez, Lawrence James; Ellison, Jennifer Anne; Sokolowski, Sara Suzette

    2005-03-01

    Networked systems of low-cost, small, integrable chemical sensors will enable monitoring of Nonproliferation and Materials Control targets and chemical weapons threats. Sandia-designed prototype chemical sensor systems are undergoing extended field testing supported by DOE and other government agencies. A required surety component will be verification of microanalytical system performance, which can be achieved by providing a programmable source of chemical signature(s) for autonomous calibration of analytical systems. In addition, such a controlled chemical source could be used to dispense microaliquots of derivatization reagents, extending the analysis capability of chemical sensors to a wider range of targets. We have developed a microfabricated system for controlled release of selected compounds (calibrants) into the analytical stream of microsensor systems. To minimize pumping and valve requirements of microfluidic systems, and to avoid degradation issues associated with storage of dilute solutions, we have utilized thermally labile organic salts as solid-phase reservoir materials. Reproducible deposition of tetrapropyl ammonium hydroxide onto arrays of microfabricated heating elements can provide a pair of calibration marker compounds (one fast and one slow-eluting compound) for GC analyses. The use of this microaliquot gas source array for hydrogen generation is currently under further development. The goal of the latter effort will be to provide a source of high-pressure, low viscosity GC carrier gas for Sandia's next-generation microfabricated gas-phase chemical analysis systems.

  9. Application of computer vision to automatic prescription verification in pharmaceutical mail order

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.

    2005-05-01

    In large volume pharmaceutical mail order, before shipping out prescriptions, licensed pharmacists ensure that the drug in the bottle matches the information provided in the patient prescription. Typically, the pharmacist has about 2 sec to complete the verification process for one prescription. Performing about 1800 prescription verifications per hour is tedious and can generate human errors as a result of visual and brain fatigue. Available automatic drug verification systems are limited to a single pill at a time. This is not suitable for large volume pharmaceutical mail order, where a prescription can have as many as 60 pills and where thousands of prescriptions are filled every day. In an attempt to reduce human fatigue, cost, and human error, the automatic prescription verification system (APVS) was invented to meet the needs of large scale pharmaceutical mail order. This paper deals with the design and implementation of the first prototype online automatic prescription verification machine to perform the same task currently done by a pharmacist. The emphasis here is on the visual aspects of the machine. The system has been successfully tested on 43,000 prescriptions.

  10. The Global Diffusion of Societal Verification Tools: A Quantitative Assessment of the Public’s Ability to Engage Nonproliferation Treaty Monitoring

    SciTech Connect

    Sayre, Amanda M.; Kreyling, Sean J.; West, Curtis L.

    2015-07-11

    The spread of nuclear and dual-use technologies and the need for more robust, effective and efficient nonproliferation and arms control treaties has led to an increasing need for innovative verification approaches and technologies. This need, paired with advancements in online computing, mobile devices, commercially available satellite imagery and the evolution of online social networks, has led to a resurgence of the concept of societal verification for arms control and nonproliferation treaties. In the event a country accepts its citizens’ assistance in supporting transparency, confidence-building and societal verification, the host government will need a population that is willing and able to participate. While scholarly interest in societal verification continues to grow, social scientific research on the topic is lacking. The aim of this paper is to begin the process of understanding public societal verification capabilities, extend the availability of quantitative research on societal verification and set in motion complementary research to increase the breadth and depth of knowledge on this topic. This paper presents a potential framework and outlines a research roadmap for the development of such a societal verification capability index.