Science.gov

Sample records for online signature verification

  1. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses them for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that our proposed scheme is promising.
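
    As an illustrative sketch only (not the authors' algorithm), the following Python snippet shows the general pattern of verifying in two independently transformed, cancelable domains and fusing the resulting distance scores; the key-seeded random-rotation transform, the feature dimensionality, and the threshold are hypothetical stand-ins for details the abstract does not give.

        import numpy as np

        def cancelable_transform(features, key):
            """Project features with a key-seeded random rotation (stand-in cancelable transform)."""
            rng = np.random.default_rng(key)
            q, _ = np.linalg.qr(rng.standard_normal((features.size, features.size)))
            return q @ features

        def verify(enrolled, probe, keys, threshold):
            """Fuse distances computed in two independently transformed domains."""
            distances = [np.linalg.norm(cancelable_transform(enrolled, k)
                                        - cancelable_transform(probe, k)) for k in keys]
            return float(np.mean(distances)) < threshold  # accept if the fused distance is small

        # toy usage with hypothetical 16-dimensional signature feature vectors
        enrolled = np.random.randn(16)
        probe = enrolled + 0.05 * np.random.randn(16)
        print(verify(enrolled, probe, keys=(1, 2), threshold=1.0))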

  2. Online Signature Verification Based on DCT and Sparse Representation.

    PubMed

    Liu, Yishu; Yang, Zhihua; Yang, Lihua

    2015-11-01

    In this paper, a novel online signature verification technique based on the discrete cosine transform (DCT) and sparse representation is proposed. We find a new property of the DCT, which can be used to obtain a compact representation of an online signature using a fixed number of coefficients, leading to simple matching procedures and providing an effective alternative for dealing with time series of different lengths. The property is also used to extract energy features. Furthermore, a new attempt to apply sparse representation to online signature verification is made, and a novel task-specific method for building overcomplete dictionaries is proposed; sparsity features are then extracted. Finally, energy features and sparsity features are concatenated to form a feature vector. Experiments are conducted on the Sabanci University Signature Database (SUSIG)-Visual and SVC2004 databases, and the results show that our proposed method authenticates persons very reliably, with a verification performance better than that of state-of-the-art methods on the same databases.
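
    The generic step of truncating the DCT of the pen trajectory to obtain a fixed-length representation of a variable-length signature can be sketched as follows. This is an illustration only, not the paper's method (its specific DCT property, energy features, and dictionary construction are not reproduced), and the coefficient count is an assumed parameter.

        import numpy as np
        from scipy.fft import dct

        def dct_feature(signature_xy, n_coeffs=20):
            """Map a variable-length online signature (N x 2 array of x,y samples)
            to a fixed-length vector by keeping the first DCT coefficients of each channel."""
            feats = []
            for channel in signature_xy.T:               # x(t) and y(t) treated separately
                c = dct(channel, type=2, norm='ortho')   # orthonormal DCT-II
                c = np.pad(c, (0, max(0, n_coeffs - c.size)))[:n_coeffs]
                feats.append(c)
            return np.concatenate(feats)                 # length 2 * n_coeffs, regardless of N

        # two signatures of different lengths map to vectors of identical size
        a = dct_feature(np.random.randn(310, 2))
        b = dct_feature(np.random.randn(545, 2))
        print(a.shape, b.shape)   # (40,) (40,)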

  3. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

    Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. As one solution, we proposed a scheme to enhance the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated using two keys. Generally, in cancelable approaches the same verification algorithm is used for transformed data as for raw (non-transformed) data, and in our previous work a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the experimental results show that the modification of the verification system improved performance. Our cancelable system combines two scores to make a decision. Several fusion strategies are also considered, and the experimental results are reported here.

  4. Online Handwritten Signature Verification Using Neural Network Classifier Based on Principal Component Analysis

    PubMed Central

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

    One of the main difficulties in designing an online signature verification (OSV) system is to find the most distinctive features with high discriminating capability for verification, particularly with regard to the high variability inherent in genuine handwritten signatures, coupled with the possibility of skilled forgeries closely resembling the original counterparts. In this paper, we propose a systematic approach to online signature verification through the use of a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique on the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database, which yielded a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%. PMID:25133227
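
    A minimal sketch of a PCA-plus-MLP pipeline of this kind, using scikit-learn's PCA and MLPClassifier as stand-ins; the toy data, the choice of which components to retain, and the network size are assumptions, and the paper's specific selection of usually discarded PCA information is not reproduced.

        import numpy as np
        from sklearn.decomposition import PCA
        from sklearn.neural_network import MLPClassifier

        # toy data: rows are raw signature feature vectors, labels 1 = genuine, 0 = forgery
        rng = np.random.default_rng(0)
        X = rng.standard_normal((400, 60))
        y = rng.integers(0, 2, 400)

        pca = PCA(n_components=40).fit(X)
        Z = pca.transform(X)[:, 20:]        # keep a subset of components (here the trailing ones)

        clf = MLPClassifier(hidden_layer_sizes=(30,), max_iter=500).fit(Z, y)
        print(clf.score(Z, y))              # training accuracy on the toy data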

  5. Online handwritten signature verification using neural network classifier based on principal component analysis.

    PubMed

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Yussof, Salman; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

    One of the main difficulties in designing an online signature verification (OSV) system is to find the most distinctive features with high discriminating capability for verification, particularly with regard to the high variability inherent in genuine handwritten signatures, coupled with the possibility of skilled forgeries closely resembling the original counterparts. In this paper, we propose a systematic approach to online signature verification through the use of a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique on the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database, which yielded a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%. PMID:25133227

  7. Signature verification by only single genuine sample in offline and online systems

    NASA Astrophysics Data System (ADS)

    Adamski, Marcin; Saeed, Khalid

    2016-06-01

    The paper presents innovative methods and algorithms, with experimental results, for signature verification. It is mainly focused on applications where only one reference signature is available for comparison. Such a restriction is often present in practice and requires the selection of specific methods. In this context, both offline and online approaches are investigated. In the offline approach, the binary image of the signature is initially thinned to obtain a one-pixel-wide line. Then, a sampling technique is applied in order to form the signature feature vector. The identification and verification processes are based on comparing the reference feature vector with the questioned samples using the Shape Context algorithm. In the case of online data, the system makes use of dynamic information such as trajectory, pen pressure, pen azimuth, and pen altitude collected at the time of signing. After further preprocessing, these functional features are verified by means of the Dynamic Time Warping method.
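
    The online branch relies on Dynamic Time Warping; a standard DTW distance between two variable-length sequences of per-sample features can be sketched as follows. The feature layout (x, y, pressure, azimuth) and the acceptance threshold are assumptions, not the paper's configuration.

        import numpy as np

        def dtw_distance(a, b):
            """Dynamic Time Warping distance between two multivariate time series
            (rows are time samples, columns are features such as x, y, pressure)."""
            n, m = len(a), len(b)
            cost = np.full((n + 1, m + 1), np.inf)
            cost[0, 0] = 0.0
            for i in range(1, n + 1):
                for j in range(1, m + 1):
                    d = np.linalg.norm(a[i - 1] - b[j - 1])
                    cost[i, j] = d + min(cost[i - 1, j], cost[i, j - 1], cost[i - 1, j - 1])
            return cost[n, m]

        reference = np.random.randn(120, 4)    # e.g. x, y, pressure, azimuth over time
        questioned = np.random.randn(150, 4)
        print(dtw_distance(reference, questioned))   # accept if below a tuned threshold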

  8. Hill-Climbing Attacks and Robust Online Signature Verification Algorithm against Hill-Climbing Attacks

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo

    Attacks using hill-climbing methods have been reported as a vulnerability of biometric authentication systems. In this paper, we propose a robust online signature verification algorithm against such attacks. Specifically, the attack considered in this paper is a hill-climbing forged data attack. Artificial forgeries are generated offline by using the hill-climbing method, and the forgeries are input to a target system to be attacked. In this paper, we analyze the menace of hill-climbing forged data attacks using six types of hill-climbing forged data and propose a robust algorithm by incorporating the hill-climbing method into an online signature verification algorithm. Experiments to evaluate the proposed system were performed using a public online signature database. The proposed algorithm showed improved performance against this kind of attack.
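
    A generic hill-climbing forgery attack of the kind analysed here can be sketched as follows; the matcher, feature dimensionality, and step size are hypothetical, and this is not one of the paper's six forged-data variants.

        import numpy as np

        def hill_climb(match_score, n_dims=20, iters=2000, step=0.05, seed=0):
            """Iteratively perturb a candidate feature vector, keeping changes that
            raise the matcher score (the generic attack pattern)."""
            rng = np.random.default_rng(seed)
            candidate = rng.standard_normal(n_dims)
            best = match_score(candidate)
            for _ in range(iters):
                trial = candidate + step * rng.standard_normal(n_dims)
                s = match_score(trial)
                if s > best:
                    candidate, best = trial, s
            return candidate, best

        # hypothetical target matcher: similarity to a secret enrolled template
        secret = np.random.randn(20)
        score = lambda v: -np.linalg.norm(v - secret)
        forged, final_score = hill_climb(score)
        print(final_score)   # approaches 0 as the forgery converges on the template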

  9. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenience in checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  10. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  11. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  12. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints. Local ridge structures cannot be completely characterized by minutiae. Also, it is difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results have shown that this system can be used effectively for secure online verification applications. PMID:17365425

  13. New online signature acquisition system

    NASA Astrophysics Data System (ADS)

    Oulefki, Adel; Mostefai, Messaoud; Abbadi, Belkacem; Djebrani, Samira; Bouziane, Abderraouf; Chahir, Youssef

    2013-01-01

    We present a nonconstraining and low-cost online signature acquisition system that has been developed to enhance the performance of an existing multimodal biometric authentication system (based initially on both voice and image modalities). A laboratory prototype has been developed and validated for online signature acquisition.

  14. An individuality model for online signatures using global Fourier descriptors

    NASA Astrophysics Data System (ADS)

    Kholmatov, Alisher; Yanikoglu, Berrin

    2008-03-01

    The discriminative capability of a biometric is based on its individuality/uniqueness and is an important factor in choosing a biometric for a large-scale deployment. Individuality studies have been carried out rigorously for only certain biometrics, in particular fingerprint and iris, while work on establishing handwriting and signature individuality has been mainly at the feature level. In this study, we present a preliminary individuality model for online signatures using the Fourier domain representation of the signature. Using the normalized Fourier coefficients as global features describing the signature, we derive a formula for the probability of coincidentally matching a given signature. Estimating model parameters from a large database and making certain simplifying assumptions, the probability of two arbitrary signatures matching in 13 of the coefficients is calculated as 4.7×10⁻⁴. When compared with the results of a verification algorithm that parallels the theoretical model, the results show that the theoretical model fits the random forgery test results fairly well. While online signatures are sometimes dismissed as not very secure, our results show that the probability of successfully guessing an online signature is very low. Combined with the fact that the signature is a behavioral biometric with adjustable complexity, these results support the use of online signatures for biometric authentication.
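
    How a "match in at least 13 coefficients" probability can follow from a per-coefficient match probability is sketched below under an independence assumption; the per-coefficient probability and coefficient count used here are hypothetical illustrations, not the paper's estimated parameters.

        from math import comb

        def prob_match_at_least(k, n, p):
            """P(at least k of n independent coefficients fall within tolerance),
            assuming each coefficient matches independently with probability p."""
            return sum(comb(n, i) * p**i * (1 - p)**(n - i) for i in range(k, n + 1))

        # hypothetical numbers: 20 Fourier coefficients, 0.35 per-coefficient match probability
        print(prob_match_at_least(13, 20, 0.35))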

  15. Applications of a hologram watermarking protocol: aging-aware biometric signature verification and time validity check with personal documents

    NASA Astrophysics Data System (ADS)

    Vielhauer, Claus; Croce Ferri, Lucilla

    2003-06-01

    Our paper addresses two issues of a previously presented biometric authentication algorithm for ID cardholders, namely the security of the embedded reference data and the aging of the biometric data. We describe a protocol that allows two levels of verification, combining a biometric hash technique based on handwritten signatures and hologram watermarks with cryptographic signatures in a verification infrastructure. This infrastructure consists of a Trusted Central Public Authority (TCPA), which serves numerous Enrollment Stations (ES) in a secure environment. Each individual performs an enrollment at an ES, which provides the TCPA with the full biometric reference data and a document hash. The TCPA then calculates the authentication record (AR) with the biometric hash, a validity timestamp, and the document hash provided by the ES. The AR is then signed with a cryptographic signature function, initialized with the TCPA's private key, and embedded in the ID card as a watermark. Authentication is performed at Verification Stations (VS), where the ID card is scanned and the signed AR is retrieved from the watermark. Due to the timestamp mechanism and a two-level biometric verification technique based on offline and online features, the AR can deal with the aging of the biometric feature by forcing a re-enrollment of the user after expiry, making use of the ES infrastructure. We describe some attack scenarios and illustrate the watermark embedding, retrieval, and dispute protocols, analyzing their requirements, advantages, and disadvantages in relation to security requirements.

  16. Analyzing personalized policies for online biometric verification.

    PubMed

    Sadhwani, Apaar; Yang, Yan; Wein, Lawrence M

    2014-01-01

    Motivated by India's nationwide biometric program for social inclusion, we analyze verification (i.e., one-to-one matching) in the case where we possess similarity scores for 10 fingerprints and two irises between a resident's biometric images at enrollment and his biometric images during his first verification. At subsequent verifications, we allow individualized strategies based on these 12 scores: we acquire a subset of the 12 images, get new scores for this subset that quantify the similarity to the corresponding enrollment images, and use the likelihood ratio (i.e., the likelihood of observing these scores if the resident is genuine divided by the corresponding likelihood if the resident is an imposter) to decide whether a resident is genuine or an imposter. We also consider two-stage policies, where additional images are acquired in a second stage if the first-stage results are inconclusive. Using performance data from India's program, we develop a new probabilistic model for the joint distribution of the 12 similarity scores and find near-optimal individualized strategies that minimize the false reject rate (FRR) subject to constraints on the false accept rate (FAR) and mean verification delay for each resident. Our individualized policies achieve the same FRR as a policy that acquires (and optimally fuses) 12 biometrics for each resident, which represents a five (four, respectively) log reduction in FRR relative to fingerprint (iris, respectively) policies previously proposed for India's biometric program. The mean delay is [Formula: see text] sec for our proposed policy, compared to 30 sec for a policy that acquires one fingerprint and 107 sec for a policy that acquires all 12 biometrics. This policy acquires iris scans from 32-41% of residents (depending on the FAR) and acquires an average of 1.3 fingerprints per resident.
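
    A minimal sketch of the likelihood-ratio decision rule described above, assuming (unlike the paper's richer joint-distribution model) independent Gaussian score distributions per modality; all means, standard deviations, and the decision threshold are hypothetical.

        import numpy as np
        from scipy.stats import norm

        def likelihood_ratio(scores, genuine_params, imposter_params):
            """Likelihood ratio of observed similarity scores under independent Gaussian
            models for genuine and imposter comparisons."""
            lr = 1.0
            for s, (mg, sg), (mi, si) in zip(scores, genuine_params, imposter_params):
                lr *= norm.pdf(s, mg, sg) / norm.pdf(s, mi, si)
            return lr

        scores = [0.82, 0.77]                      # e.g. two acquired fingerprint scores
        genuine = [(0.8, 0.1), (0.8, 0.1)]         # hypothetical (mean, std) per modality
        imposter = [(0.3, 0.15), (0.3, 0.15)]
        print("genuine" if likelihood_ratio(scores, genuine, imposter) > 100 else "imposter")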

  17. Online adaptation and verification of VMAT

    SciTech Connect

    Crijns, Wouter; Defraene, Gilles; Depuydt, Tom; Haustermans, Karin; Van Herck, Hans; Maes, Frederik; Van den Heuvel, Frank

    2015-07-15

    Purpose: This work presents a method for fast volumetric modulated arc therapy (VMAT) adaptation in response to interfraction anatomical variations. Additionally, plan parameters extracted from the adapted plans are used to verify the quality of these plans. The methods were tested as a prostate class solution and compared to replanning and to their current clinical practice. Methods: The proposed VMAT adaptation is an extension of their previous intensity modulated radiotherapy (IMRT) adaptation. It follows a direct (forward) planning approach: the multileaf collimator (MLC) apertures are corrected in the beam’s eye view (BEV) and the monitor units (MUs) are corrected using point dose calculations. All MLC and MU corrections are driven by the positions of four fiducial points only, without need for a full contour set. Quality assurance (QA) of the adapted plans is performed using plan parameters that can be calculated online and that have a relation to the delivered dose or the plan quality. Five potential parameters are studied for this purpose: the number of MU, the equivalent field size (EqFS), the modulation complexity score (MCS), and the components of the MCS: the aperture area variability (AAV) and the leaf sequence variability (LSV). The full adaptation and its separate steps were evaluated in simulation experiments involving a prostate phantom subjected to various interfraction transformations. The efficacy of the current VMAT adaptation was scored by target mean dose (CTV_mean), conformity (CI_95%), tumor control probability (TCP), and normal tissue complication probability (NTCP). The impact of the adaptation on the plan parameters (QA) was assessed by comparison with prediction intervals (PI) derived from a statistical model of the typical variation of these parameters in a population of VMAT prostate plans (n = 63). These prediction intervals are the adaptation equivalent of the tolerance tables for couch shifts in the current clinical

  18. Spectral signature verification using statistical analysis and text mining

    NASA Astrophysics Data System (ADS)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

    In the spectral science community, numerous spectral signatures are stored in databases representative of many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature to arrive at a final qualitative assessment: the textual meta-data and the numerical spectral data. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum are validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to provide local learning patterns and trends within the spectral data, indicative of the test spectrum's quality. Text mining applications have successfully been implemented for security (text encryption/decryption), biomedical, and marketing applications. The text mining lexical analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra, in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum existing in a database without the need of an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
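
    The spectral angle mapper (SAM) comparison against the population mean spectrum can be sketched as follows; the spectra and band count here are synthetic placeholders, not data from SigDB.

        import numpy as np

        def spectral_angle(test_spectrum, mean_spectrum):
            """Spectral Angle Mapper: angle (radians) between a test spectrum and the
            mean spectrum of the reference population; smaller angles rank higher."""
            t = np.asarray(test_spectrum, float)
            m = np.asarray(mean_spectrum, float)
            cos = np.dot(t, m) / (np.linalg.norm(t) * np.linalg.norm(m))
            return float(np.arccos(np.clip(cos, -1.0, 1.0)))

        population = np.abs(np.random.randn(50, 200))     # 50 reference spectra, 200 bands
        test = np.abs(np.random.randn(200))
        print(spectral_angle(test, population.mean(axis=0)))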

  19. Age and gender-invariant features of handwritten signatures for verification systems

    NASA Astrophysics Data System (ADS)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

    The handwritten signature is one of the most natural biometrics, biometrics being the study of human physiological and behavioral patterns. Behavioral biometrics include signatures, which may differ according to the owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments in this research were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal forgery feature idea, where a global classifier is able to classify a signature as genuine or as a forgery without actual knowledge of the signature template and its owner. Additionally, dimensionality reduction with the MRMR method is discussed.

  20. Offline signature verification and skilled forgery detection using HMM and sum graph features with ANN and knowledge based classifier

    NASA Astrophysics Data System (ADS)

    Mehta, Mohit; Choudhary, Vijay; Das, Rupam; Khan, Ilyas

    2010-02-01

    Signature verification is one of the most widely researched areas in document analysis and signature biometrics. Various methodologies have been proposed in this area for accurate signature verification and forgery detection. In this paper we propose a unique two-stage model for detecting skilled forgery in signatures by combining two feature types, namely sum graph and HMM features, and classifying them with a knowledge-based classifier and a probabilistic neural network. We propose a unique technique of using the HMM as a feature extractor rather than as a classifier, as is widely done in signature recognition. Results show a higher false rejection than false acceptance rate. The system detects forgeries with an accuracy of 80% and verifies signatures with 91% accuracy. The two-stage model can be used in realistic signature biometric applications such as banking, where the authenticity of a signature must be verified before processing documents like checks.

  2. On Hunting Animals of the Biometric Menagerie for Online Signature.

    PubMed

    Houmani, Nesma; Garcia-Salicetti, Sonia

    2016-01-01

    Individuals behave differently with respect to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names in order to reflect their characteristics with respect to biometric systems. This concept was illustrated for the face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined to automatically retrieve writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database. PMID:27054836

  3. On Hunting Animals of the Biometric Menagerie for Online Signature

    PubMed Central

    Houmani, Nesma; Garcia-Salicetti, Sonia

    2016-01-01

    Individuals behave differently with respect to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names in order to reflect their characteristics with respect to biometric systems. This concept was illustrated for the face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined to automatically retrieve writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database. PMID:27054836

  5. Gated Treatment Delivery Verification With On-Line Megavoltage Fluoroscopy

    SciTech Connect

    Tai An; Christensen, James D.; Gore, Elizabeth; Khamene, Ali; Boettger, Thomas; Li, X. Allen

    2010-04-15

    Purpose: To develop and clinically demonstrate the use of on-line real-time megavoltage (MV) fluoroscopy for gated treatment delivery verification. Methods and Materials: Megavoltage fluoroscopy (MVF) image sequences were acquired using a flat panel equipped for MV cone-beam CT in synchrony with the respiratory signal obtained from the Anzai gating device. The MVF images can be obtained immediately before or during gated treatment delivery. A prototype software tool (named RTReg4D) was developed to register MVF images with phase-sequenced digitally reconstructed radiograph images generated from the treatment planning system based on four-dimensional CT. The image registration can be used to reposition the patient before or during treatment delivery. To demonstrate the reliability and clinical usefulness, the system was first tested using a thoracic phantom and then prospectively in actual patient treatments under an institutional review board-approved protocol. Results: The quality of the MVF images for lung tumors is adequate for image registration with phase-sequenced digitally reconstructed radiographs. The MVF was found to be useful for monitoring inter- and intrafractional variations of tumor positions. With the planning target volume contour displayed on the MVF images, the system can verify whether the moving target stays within the planning target volume margin during gated delivery. Conclusions: The use of MVF images was found to be clinically effective in detecting discrepancies in tumor location before and during respiration-gated treatment delivery. The tools and process developed can be useful for gated treatment delivery verification.

  6. On the pinned field image binarization for signature generation in image ownership verification method

    NASA Astrophysics Data System (ADS)

    Lee, Mn-Ta; Chang, Hsuan Ting

    2011-12-01

    The issue of pinned field image binarization for signature generation in the ownership verification of the protected image is investigated. The pinned field explores the texture information of the protected image and can be employed to enhance the watermark robustness. In the proposed method, four optimization schemes are utilized to determine the threshold values for transforming the pinned field into a binary feature image, which is then utilized to generate an effective signature image. Experimental results show that the utilization of optimization schemes can significantly improve the signature robustness from the previous method (Lee and Chang, Opt. Eng. 49(9), 097005, 2010). While considering both the watermark retrieval rate and the computation speed, the genetic algorithm is strongly recommended. In addition, compared with Chang and Lin's scheme (J. Syst. Softw. 81(7), 1118-1129, 2008), the proposed scheme also has better performance.

  7. On-line failure detection and damping measurement of aerospace structures by random decrement signatures

    NASA Technical Reports Server (NTRS)

    Cole, H. A., Jr.

    1973-01-01

    Random decrement signatures of structures vibrating in a random environment are studied through the use of computer-generated and experimental data. The statistical properties obtained indicate that these signatures are stable in form and scale and, hence, should have wide application in on-line failure detection and damping measurement. On-line procedures are described, and equations for estimating record-length requirements to obtain signatures of a prescribed precision are given.
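
    The random decrement (randomdec) signature itself is a simple ensemble average; a sketch follows, with a synthetic response signal, trigger level, and segment length standing in for measured structural data.

        import numpy as np

        def random_decrement(response, trigger_level, segment_length):
            """Average the segments that follow each upward crossing of the trigger level;
            this ensemble average (the randomdec signature) approximates the free-decay response."""
            x = np.asarray(response, float)
            starts = [i for i in range(1, x.size - segment_length)
                      if x[i - 1] < trigger_level <= x[i]]
            if not starts:
                raise ValueError("no trigger crossings found")
            return np.mean([x[i:i + segment_length] for i in starts], axis=0)

        # synthetic stand-in for a randomly excited structural response
        rng = np.random.default_rng(0)
        t = np.arange(20000) * 0.01
        x = np.sin(2 * np.pi * 0.5 * t) + 0.3 * rng.standard_normal(t.size)
        signature = random_decrement(x, trigger_level=0.5, segment_length=400)
        print(signature.shape)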

  8. Optical security verification by synthesizing thin films with unique polarimetric signatures.

    PubMed

    Carnicer, Artur; Arteaga, Oriol; Pascual, Esther; Canillas, Adolf; Vallmitjana, Santiago; Javidi, Bahram; Bertran, Enric

    2015-11-15

    This Letter reports the production and optical polarimetric verification of codes based on thin-film technology for security applications. Because thin-film structures display distinctive polarization signatures, these data are used to authenticate the encoded message. Samples are analyzed using an imaging ellipsometer able to measure the 16 components of the Mueller matrix. As a result, the behavior of the thin film under polarized light becomes completely characterized. This information is utilized to distinguish between true and false codes by means of correlation. Without the imaging optics, the components of the Mueller matrix become noise-like distributions and, consequently, the encoded message is no longer available. Then, a set of Stokes vectors is generated numerically for any polarization state of the illuminating beam and, thus, machine learning techniques can be used to perform classification. We show that successful authentication is possible using the k-nearest neighbors algorithm on thin-film codes that have been anisotropically phase-encoded with a pseudorandom phase code. PMID:26565884

  9. A method for online verification of adapted fields using an independent dose monitor

    SciTech Connect

    Chang Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-07-15

    Purpose: Clinical implementation of online adaptive radiotherapy requires generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system. Methods: The fields are modified by translating each multileaf collimator (MLC) defined aperture in the direction of the patient setup error, and magnifying to account for distance variation to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of adapted fields. The system consists of a large-area ion chamber with a spatial gradient in electrode separation to provide a spatially sensitive signal for each beam segment, mounted below the MLC, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients were modified in response to six randomly chosen setup errors in three orthogonal directions. Results: A total of approximately 49 beams for the modified fields were verified by the IQM system, of which 97% of the measured IQM signals agreed with the predicted values to within 2%. Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields.
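
    A schematic of the two correction steps described (translate and magnify each MLC aperture, then rescale MU so a reference point dose is preserved); the toy point-dose model, magnification factor, and leaf geometry are purely illustrative assumptions, not the paper's point-dose calculation or the IQM prediction.

        import numpy as np

        def adapt_field(leaf_edges_mm, mu, setup_shift_mm, magnification, point_dose):
            """Translate each MLC aperture edge by the in-plane setup shift, magnify it
            for the changed distance to the marked isocentre, then rescale MU so that a
            reference point dose is preserved. point_dose(edges, mu) is a hypothetical
            stand-in for an independent point-dose calculation."""
            new_edges = (leaf_edges_mm + setup_shift_mm) * magnification
            reference = point_dose(leaf_edges_mm, mu)
            new_mu = mu * reference / point_dose(new_edges, mu)
            return new_edges, new_mu

        # toy point-dose model proportional to MU and mean aperture width (illustrative only)
        toy_dose = lambda edges, mu: mu * np.mean(edges[:, 1] - edges[:, 0])
        edges = np.array([[-20.0, 15.0], [-18.0, 17.0]])   # [left, right] edge per leaf pair, mm
        print(adapt_field(edges, mu=120.0, setup_shift_mm=3.0,
                          magnification=1.01, point_dose=toy_dose))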

  10. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    SciTech Connect

    Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H

    2015-06-15

    Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and, ideally, independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose-volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors in leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and the TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan

  11. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ...); Documents Acceptable for Employment Eligibility Verification, 74 FR 2838 (Jan. 16, 2009) (correction... form. http://uscis.gov/files/form/i-9.pdf . This final rule permits employers to complete, sign, scan... I-9. Documents Acceptable for Employment Eligibility Verification, 73 FR 76505 (Dec. 17, 2008)....

  12. An Optimized Online Verification Imaging Procedure for External Beam Partial Breast Irradiation

    SciTech Connect

    Willis, David J.; Kron, Tomas; Chua, Boon

    2011-07-01

    The purpose of this study was to evaluate the capabilities of a kilovoltage (kV) on-board imager (OBI)-equipped linear accelerator in the setting of on-line verification imaging for external-beam partial breast irradiation. Available imaging techniques were optimized and assessed for image quality using a modified anthropomorphic phantom. Imaging dose was also assessed. Imaging techniques were assessed for physical clearance between patient and treatment machine using a volunteer. Nonorthogonal kV image pairs were identified as optimal in terms of image quality, clearance, and dose. After institutional review board approval, this approach was used for 17 patients receiving accelerated partial breast irradiation. Imaging was performed before every fraction, with online correction of setup deviations >5 mm (total image sessions = 170). Treatment staff rated risk of collision and visibility of tumor bed surgical clips where present. Image session duration and detected setup deviations were recorded. For all cases, both image projections (n = 34) had low collision risk. Surgical clips were rated as well visualized in all cases where they were present (n = 5). The average imaging session time was 6 min, 16 sec, and a reduction in duration was observed as staff became familiar with the technique. Setup deviations of up to 1.3 cm were detected before treatment and subsequently confirmed offline. Nonorthogonal kV image pairs allowed effective and efficient online verification for partial breast irradiation. It has yet to be tested in a multicenter study to determine whether it is dependent on skilled treatment staff.

  13. Laboratory verification of on-line lithium analysis using ultraviolet absorption spectrometry

    SciTech Connect

    Beemster, B.J.; Schlager, K.J.; Schloegel, K.M.; Kahle, S.J.; Fredrichs, T.L.

    1992-12-31

    Several laboratory experiments were performed to evaluate the capability of absorption spectrometry in the ultraviolet-visible wavelength range with the objective of developing methods for on-line analysis of lithium directly in the primary coolant of Pressurized Water Reactors using optical probes. Although initial laboratory tests seemed to indicate that lithium could be detected using primary absorption (detection of natural spectra unassisted by reagents), subsequent field tests demonstrated that no primary absorption spectra existed for lithium in the ultraviolet-visible wavelength range. A second series of tests that were recently conducted did, however, confirm results reported in the literature to the effect that reagents were available that will react with lithium to form chelates that possess detectable absorption and fluorescent signatures. These results point to the possible use of secondary techniques for on-line analysis of lithium.

  14. Is Your Avatar Ethical? On-Line Course Tools that Are Methods for Student Identity and Verification

    ERIC Educational Resources Information Center

    Semple, Mid; Hatala, Jeffrey; Franks, Patricia; Rossi, Margherita A.

    2011-01-01

    On-line college courses present a mandate for student identity verification for accreditation and funding sources. Student authentication requires course modification to detect fraud and misrepresentation of authorship in assignment submissions. The reality is that some college students cheat in face-to-face classrooms; however, the potential for…

  15. Efficient cost-sensitive human-machine collaboration for offline signature verification

    NASA Astrophysics Data System (ADS)

    Coetzer, Johannes; Swanepoel, Jacques; Sabourin, Robert

    2012-01-01

    We propose a novel strategy for the optimal combination of human and machine decisions in a cost-sensitive environment. The proposed algorithm should be especially beneficial to financial institutions where off-line signatures, each associated with a specific transaction value, require authentication. When presented with a collection of genuine and fraudulent training signatures, produced by so-called guinea pig writers, the proficiency of a workforce of human employees and a score-generating machine can be estimated and represented in receiver operating characteristic (ROC) space. Using a set of Boolean fusion functions, the majority vote decision of the human workforce is combined with each threshold-specific machine-generated decision. The performance of the candidate ensembles is estimated and represented in ROC space, after which only the optimal ensembles and associated decision trees are retained. When presented with a questioned signature linked to an arbitrary writer, the system first uses the ROC-based cost gradient associated with the transaction value to select the ensemble that minimises the expected cost, and then uses the corresponding decision tree to authenticate the signature in question. We show that, when utilising the entire human workforce, the incorporation of a machine streamlines the authentication process and decreases the expected cost for all operating conditions.
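
    Choosing the operating point that minimises expected cost for a given transaction value can be sketched as follows; the cost model, prior forgery rate, review fee, and candidate ROC points are hypothetical, and the paper's Boolean fusion of human and machine decisions is not reproduced.

        import numpy as np

        def expected_cost(far, frr, value, p_forgery=0.01, review_cost=5.0):
            """Expected cost of an operating point: accepted forgeries cost the transaction
            value, rejected genuine signatures cost a manual-review fee (hypothetical model)."""
            return p_forgery * far * value + (1 - p_forgery) * frr * review_cost

        def pick_operating_point(roc_points, value):
            """roc_points: list of (FAR, FRR) pairs for the candidate ensembles/thresholds."""
            costs = [expected_cost(far, frr, value) for far, frr in roc_points]
            return int(np.argmin(costs))

        roc = [(0.20, 0.01), (0.05, 0.05), (0.01, 0.20)]   # hypothetical candidate points
        print(pick_operating_point(roc, value=50))      # low-value cheque: tolerate more FAR
        print(pick_operating_point(roc, value=50000))   # high-value cheque: clamp FAR down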

  16. Hybrid Enrichment Verification Array: Investigations of the High-Energy Gamma-Ray Signature Origin and Use for Partial Defect Detection

    SciTech Connect

    Kulisek, Jonathan A.; Jordan, David V.; Mace, Emily K.; McDonald, Benjamin S.; Smith, Leon E.

    2014-06-11

    The International Atomic Energy Agency (IAEA) is exploring the use of an Unattended Cylinder Verification Station (UCVS) to provide independent verification of the declared relative 235U enrichment, 235U mass and total uranium mass of the declared UF6 cylinders moving through modern centrifuge enrichment plants. The Hybrid Enrichment Verification Array (HEVA) method is a candidate nondestructive assay method for inclusion in a UCVS. Modeling and measured data from several field campaigns have demonstrated the potential of the HEVA method to assay relative cylinder enrichment with a precision comparable to or substantially better than today’s high-resolution handheld devices. The HEVA instrument is comprised of an array of sodium iodide gamma-ray detectors that measure two primary spectral components. One of these components is the traditional, direct, weakly-penetrating 235U gamma-ray signature, which contains the 186 keV photopeak. The other spectral component is a non-traditional, high-energy (above ~3 MeV) gamma-ray signature, which is generated indirectly from neutrons emitted from within the UF6 cylinder. These neutrons are more penetrating and create high-energy gamma rays through neutron capture reactions in the steel collimators surrounding the detectors and within the detector crystals. This paper will present results from Monte Carlo simulations and analyses of the HEVA method with a focus on the origins of this high-energy signature, the optimization of instrument design to enhance the signature, and the ability of this non-traditional signature to reveal partial defect scenarios wherein material is missing or substituted in the interior of the cylinder.

  17. Investigation of the spatial resolution of an online dose verification device

    SciTech Connect

    Asuni, G.; Rickey, D. W.; McCurdy, B. M. C.

    2012-02-15

    Purpose: The aim of this work is to characterize a new online dose verification device, COMPASS transmission detector array (IBA Dosimetry, Schwarzenbruck, Germany). The array is composed of 1600 cylindrical ionization chambers of 3.8 mm diameter, separated by 6.5 mm center-to-center spacing, in a 40 x 40 arrangement. Methods: The line spread function (LSF) of a single ion chamber in the detector was measured with a narrow slit collimator for a 6 MV photon beam. The 0.25 × 10 mm² slit was formed by two machined lead blocks. The LSF was obtained by laterally translating the detector in 0.25 mm steps underneath the slit over a range of 24 mm and taking a measurement at each step. This measurement was validated with Monte Carlo simulation using BEAMnrc and DOSXYZnrc. The presampling modulation transfer function (MTF), the Fourier transform of the line spread function, was determined and compared to calculated (Monte Carlo and analytical) MTFs. Two head-and-neck intensity modulated radiation therapy (IMRT) fields were measured using the device and were used to validate the LSF measurement. These fields were simulated with the BEAMnrc Monte Carlo model, and the Monte Carlo generated incident fluence was convolved with the 2D detector response function (derived from the measured LSF) to obtain calculated dose. The measured and calculated dose distributions were then quantitatively compared using χ-comparison criteria of 3% dose difference and 3 mm distance-to-agreement for in-field points (defined as those above the 10% maximum dose threshold). Results: The full width at half-maximum (FWHM) of the measured detector response for a single chamber is 4.3 mm, which is comparable to the chamber diameter of 3.8 mm. The pre-sampling MTF was calculated, and the resolution of one chamber was estimated as 0.25 lp/mm from the first zero crossing. For both examined IMRT fields, the χ-comparison between measured and calculated data show good agreement with 95.1% and 96
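
    Computing a presampling MTF from a measured LSF is a standard Fourier step; a sketch follows, using a synthetic Gaussian LSF with roughly the reported 4.3 mm FWHM as a stand-in for the real measured data.

        import numpy as np

        def mtf_from_lsf(lsf, spacing_mm):
            """Presampling MTF as the magnitude of the Fourier transform of the line
            spread function, normalised to unity at zero frequency."""
            lsf = np.asarray(lsf, float) / np.sum(lsf)
            mtf = np.abs(np.fft.rfft(lsf))
            freqs = np.fft.rfftfreq(lsf.size, d=spacing_mm)   # cycles per mm
            return freqs, mtf / mtf[0]

        # synthetic LSF: Gaussian with ~4.3 mm FWHM sampled every 0.25 mm
        x = np.arange(-12, 12.25, 0.25)
        sigma = 4.3 / 2.355
        freqs, mtf = mtf_from_lsf(np.exp(-x**2 / (2 * sigma**2)), spacing_mm=0.25)
        print(freqs[np.argmax(mtf < 0.1)])   # rough estimate of the resolution limit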

  18. Aging in biometrics: an experimental analysis on on-line signature.

    PubMed

    Galbally, Javier; Martinez-Diaz, Marcos; Fierrez, Julian

    2013-01-01

    The first consistent and reproducible evaluation of the effect of aging on dynamic signature is reported. Experiments are carried out on a database generated from two previous datasets which were acquired, under very similar conditions, in 6 sessions distributed in a 15-month time span. Three different systems, representing the current most popular approaches in signature recognition, are used in the experiments, proving the degradation suffered by this trait with the passing of time. Several template update strategies are also studied as possible measures to reduce the impact of aging on the system's performance. Different results regarding the way in which signatures tend to change with time, and their most and least stable features, are also given.
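
    One simple template-update strategy of the kind studied here (a rolling average toward newly accepted samples) can be sketched as follows; the blending weight and the feature vectors are illustrative assumptions, not the strategies evaluated in the paper.

        import numpy as np

        def update_template(template, new_sample, alpha=0.1):
            """After each accepted verification, blend the stored reference toward the newly
            accepted signature features to track gradual drift over time."""
            return (1.0 - alpha) * template + alpha * new_sample

        template = np.random.randn(30)                        # enrolled feature vector
        for _ in range(6):                                    # sessions spread over time
            accepted = template + 0.2 * np.random.randn(30)   # drifting but accepted samples
            template = update_template(template, accepted)
        print(template[:5])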

  19. Aging in Biometrics: An Experimental Analysis on On-Line Signature

    PubMed Central

    Galbally, Javier; Martinez-Diaz, Marcos; Fierrez, Julian

    2013-01-01

    The first consistent and reproducible evaluation of the effect of aging on dynamic signature is reported. Experiments are carried out on a database generated from two previous datasets which were acquired, under very similar conditions, in 6 sessions distributed in a 15-month time span. Three different systems, representing the current most popular approaches in signature recognition, are used in the experiments, proving the degradation suffered by this trait with the passing of time. Several template update strategies are also studied as possible measures to reduce the impact of aging on the system’s performance. Different results regarding the way in which signatures tend to change with time, and their most and least stable features, are also given. PMID:23894557

  20. Workload and Interaction: Unisa's Signature Courses--A Design Template for Transitioning to Online DE?

    ERIC Educational Resources Information Center

    Hülsmann, Thomas; Shabalala, Lindiwe

    2016-01-01

    The principal contradiction of online distance education is the disparity that exists between economies of scale and the new interactive capabilities of digital technologies. This is particularly felt where mega-universities in developing countries seek to make better use of these affordances while at the same time protecting their economies of…

  1. MARQ: an online tool to mine GEO for experiments with similar or opposite gene expression signatures.

    PubMed

    Vazquez, Miguel; Nogales-Cadenas, Ruben; Arroyo, Javier; Botías, Pedro; García, Raul; Carazo, Jose M; Tirado, Francisco; Pascual-Montano, Alberto; Carmona-Saez, Pedro

    2010-07-01

    The enormous amount of data available in public gene expression repositories such as Gene Expression Omnibus (GEO) offers an inestimable resource to explore gene expression programs across several organisms and conditions. This information can be used to discover experiments that induce similar or opposite gene expression patterns to a given query, which in turn may lead to the discovery of new relationships among diseases, drugs or pathways, as well as the generation of new hypotheses. In this work, we present MARQ, a web-based application that allows researchers to compare a query set of genes, e.g. a set of over- and under-expressed genes, against a signature database built from GEO datasets for different organisms and platforms. MARQ offers an easy-to-use and integrated environment to mine GEO, in order to identify conditions that induce similar or opposite gene expression patterns to a given experimental condition. MARQ also includes additional functionalities for the exploration of the results, including a meta-analysis pipeline to find genes that are differentially expressed across different experiments. The application is freely available at http://marq.dacya.ucm.es.

  2. Authentication Based on Pole-zero Models of Signature Velocity.

    PubMed

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-10-01

    With the increase of communication and financial transactions through the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Therefore, fast and precise algorithms for signature verification are very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for a given person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and demonstrated good potential. The signatures are collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004, and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62%, and 3.91% on skilled forgeries, respectively. PMID:24696797

  3. Patient-Specific 3D Pretreatment and Potential 3D Online Dose Verification of Monte Carlo-Calculated IMRT Prostate Treatment Plans

    SciTech Connect

    Boggula, Ramesh; Jahnke, Lennart; Wertz, Hansjoerg; Lohr, Frank; Wenz, Frederik

    2011-11-15

    Purpose: Fast and reliable comprehensive quality assurance tools are required to validate the safety and accuracy of complex intensity-modulated radiotherapy (IMRT) plans for prostate treatment. In this study, we evaluated the performance of the COMPASS system for both off-line and potential online procedures for the verification of IMRT treatment plans. Methods and Materials: COMPASS has a dedicated beam model and dose engine; it can reconstruct three-dimensional dose distributions on the patient anatomy based on measured fluences using either the MatriXX two-dimensional (2D) array (offline) or a 2D transmission detector (T2D) (online). For benchmarking the COMPASS dose calculation, various dose-volume indices were compared against Monte Carlo-calculated dose distributions for five prostate patient treatment plans. Gamma index evaluation and absolute point dose measurements were also performed in an inhomogeneous pelvis phantom using extended dose range films and an ion chamber for five additional treatment plans. Results: MatriXX-based dose reconstruction showed excellent agreement with the ion chamber (<0.5%, except for one treatment plan, which showed 1.5%), film (≈100% of pixels passing gamma criteria 3%/3 mm), and mean dose-volume indices (<2%). The T2D-based dose reconstruction showed good agreement as well with the ion chamber (<2%), film (≈99% of pixels passing gamma criteria 3%/3 mm), and mean dose-volume indices (<5.5%). Conclusion: The COMPASS system qualifies for routine prostate IMRT pretreatment verification with the MatriXX detector and has the potential for online verification of treatment delivery using the T2D.

  4. Online Kidney Position Verification Using Non-Contrast Radiographs on a Linear Accelerator with on Board KV X-Ray Imaging Capability

    SciTech Connect

    Willis, David J. Kron, Tomas; Hubbard, Patricia; Haworth, Annette; Wheeler, Greg; Duchesne, Gillian M.

    2009-01-01

    The kidneys are dose-limiting organs in abdominal radiotherapy. Kilovoltage (kV) radiographs can be acquired using on-board imager (OBI)-equipped linear accelerators with better soft tissue contrast and lower radiation doses than conventional portal imaging. A feasibility study was conducted to test the suitability of anterior-posterior (AP) non-contrast kV radiographs acquired at treatment time for online kidney position verification. Anthropomorphic phantoms were used to evaluate image quality and radiation dose. Institutional Review Board approval was given for a pilot study that enrolled 5 adults and 5 children. Customized digitally reconstructed radiographs (DRRs) were generated to provide a priori information on kidney shape and position. Radiotherapy treatment staff performed online evaluation of kidney visibility on OBI radiographs. Kidney dose measured in a pediatric anthropomorphic phantom was 0.1 cGy for kV imaging and 1.7 cGy for MV imaging. Kidneys were rated as well visualized in 60% of patients (90% confidence interval, 34-81%). The likelihood of visualization appears to be influenced by the relative AP separation of the abdomen and kidneys, the axial profile of the kidneys, and their relative contrast with surrounding structures. Online verification of kidney position using AP non-contrast kV radiographs on an OBI-equipped linear accelerator appears feasible for patients with suitable abdominal anatomy. Kidney position information provided is limited to 2-dimensional 'snapshots,' but this is adequate in some clinical situations and potentially advantageous in respiratory-correlated treatments. Successful clinical implementation requires customized partial DRRs, appropriate imaging parameters, and credentialing of treatment staff.

  5. SU-E-J-146: A Research of PET-CT SUV Range for the Online Dose Verification in Carbon Ion Radiation Therapy

    SciTech Connect

    Sun, L; Hu, W; Moyers, M; Zhao, J; Hsi, W

    2015-06-15

    Purpose: Positron-emitting isotope distributions can be used to fuse the carbon ion planning CT with an online target-verification PET-CT acquired after irradiation. For a given decay period, the relationship between a fixed target volume and the SUV produced by each single-fraction dose can be established, yielding an SUV range for the irradiated target. This online range can then serve as a reference for correlating and checking the planned target dose during verification and evaluation in the clinical trial. Methods: A Rando head phantom was used as a surrogate patient. A 10 cc cubic target volume was contoured with the beam isocenter at 7.6 cm depth, and a fixed 90-degree carbon ion beam delivered single-fraction effective doses of 2.5 GyE, 5 GyE and 8 GyE. Starting 390 seconds after irradiation, a 30-minute PET-CT scan was performed with the parameters set to a virtual weight of 50 kg and an activity of 0.05 mCi. MIM Maestro was used for image processing and fusion, and five 16-mm-diameter SUV spheres were placed in different directions within the target; the software then reported the average target SUV for each fraction dose. Results: For the 10 cc target and the 390-second decay period, a single-fraction effective dose of 2.5 GyE gave a mean SUV of 3.42 (range 1.72 to 6.83); 5 GyE gave a mean SUV of 9.946 (range 7.016 to 12.54); 8 GyE or above gave a mean SUV of 20.496 (range 11.16 to 34.73). Conclusion: Evaluating the accuracy of the dose distribution by comparing the SUV range from the planning CT with the post-treatment online PET-CT fusion is feasible for standard single-fraction carbon ion treatments. Even for plans with a single-fraction dose above 2 GyE, with all other parameters held constant the SUV range depends linearly on the single-fraction dose, so the method can also be applied to hyperfractionated treatment plans.
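
    The analysis described above reduces to sampling the mean SUV inside small spherical ROIs placed in the target and relating it to the delivered fraction dose. The snippet below is a minimal sketch of that step, not the MIM Maestro workflow; the function name, array layout and voxel-index center are illustrative assumptions.

```python
import numpy as np

def mean_suv_in_sphere(suv_volume, voxel_size_mm, center_vox, diameter_mm=16.0):
    """Mean SUV inside a sphere of the given diameter centered on a voxel index.

    suv_volume: 3D array of SUVs (z, y, x); voxel_size_mm and center_vox follow
    the same (z, y, x) ordering.
    """
    zz, yy, xx = np.indices(suv_volume.shape)
    dz, dy, dx = [(idx - c) * s for idx, c, s in
                  zip((zz, yy, xx), center_vox, voxel_size_mm)]
    mask = dz ** 2 + dy ** 2 + dx ** 2 <= (diameter_mm / 2.0) ** 2
    return float(suv_volume[mask].mean())
```

    Averaging several such spheres per dose level and fitting, for example, np.polyfit(doses, mean_suvs, 1) would express the linear SUV-dose dependence noted in the conclusion.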

  6. Applying dynamic methods in off-line signature recognition

    NASA Astrophysics Data System (ADS)

    Igarza, Juan Jose; Hernaez, Inmaculada; Goirizelaia, Inaki; Espinosa, Koldo

    2004-08-01

    In this paper we present the work developed on off-line signature verification using Hidden Markov Models (HMM). HMM is a well-known technique used with other biometric features, for instance, in speaker recognition and dynamic or on-line signature verification. Our goal here is to extend Left-to-Right (LR)-HMM to the field of static or off-line signature processing using results provided by image connectivity analysis. The chain encoding of perimeter points for each blob obtained by this analysis is an ordered set of points in space, traced clockwise around the perimeter of the blob. We discuss two different ways of generating the models depending on the way the blobs obtained from the connectivity analysis are ordered. In the first proposed method, blobs are ordered according to their perimeter length. In the second proposal, blobs are ordered in their natural reading order, i.e., from top to bottom and left to right. Finally, two LR-HMM models are trained using the parameters obtained by the mentioned techniques. Verification results of the two techniques are compared and some improvements are proposed.
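
    The two blob-ordering strategies described above can be sketched as follows. This is not the authors' code; the blob representation (a perimeter chain plus a bounding box) and the function names are assumptions chosen to make the example self-contained.

```python
def order_blobs(blobs, mode="reading"):
    """Order connected-component blobs before concatenating their chain codes.

    Each blob is a dict with 'perimeter' (list of (x, y) points, clockwise) and
    'bbox' = (x_min, y_min, x_max, y_max).  'perimeter' mode sorts by perimeter
    length (longest first); 'reading' mode sorts top-to-bottom, then left-to-right.
    """
    if mode == "perimeter":
        return sorted(blobs, key=lambda b: len(b["perimeter"]), reverse=True)
    return sorted(blobs, key=lambda b: (b["bbox"][1], b["bbox"][0]))

def observation_sequence(blobs, mode="reading"):
    # Concatenate the perimeter points of the ordered blobs into the single
    # sequence that would be fed to a left-to-right HMM.
    seq = []
    for blob in order_blobs(blobs, mode):
        seq.extend(blob["perimeter"])
    return seq
```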

  7. Cone-Beam Computed Tomography for On-Line Image Guidance of Lung Stereotactic Radiotherapy: Localization, Verification, and Intrafraction Tumor Position

    SciTech Connect

    Purdie, Thomas G. . E-mail: Tom.Purdie@rmp.uhn.on.ca; Bissonnette, Jean-Pierre; Franks, Kevin; Bezjak, Andrea; Payne, David; Sie, Fanny; Sharpe, Michael B.; Jaffray, David A.

    2007-05-01

    Purpose: Cone-beam computed tomography (CBCT) in-room imaging allows accurate inter- and intrafraction target localization in stereotactic body radiotherapy of lung tumors. Methods and Materials: Image-guided stereotactic body radiotherapy was performed in 28 patients (89 fractions) with medically inoperable Stage T1-T2 non-small-cell lung carcinoma. The targets from the CBCT and planning data set (helical or four-dimensional CT) were matched on-line to determine the couch shift required for target localization. Matching based on the bony anatomy was also performed retrospectively. Verification of target localization was done using either megavoltage portal imaging or CBCT imaging; repeat CBCT imaging was used to assess the intrafraction tumor position. Results: The mean three-dimensional tumor motion for patients with upper lesions (n = 21) and mid-lobe or lower lobe lesions (n = 7) was 4.2 and 6.7 mm, respectively. The mean difference between the target and bony anatomy matching using CBCT was 6.8 mm (SD, 4.9, maximum, 30.3); the difference exceeded 13.9 mm in 10% of the treatment fractions. The mean residual error after target localization using CBCT imaging was 1.9 mm (SD, 1.1, maximum, 4.4). The mean intrafraction tumor deviation was significantly greater (5.3 mm vs. 2.2 mm) when the interval between localization and repeat CBCT imaging (n = 8) exceeded 34 min. Conclusion: In-room volumetric imaging, such as CBCT, is essential for target localization accuracy in lung stereotactic body radiotherapy. Imaging that relies on bony anatomy as a surrogate of the target may provide erroneous results in both localization and verification.

  8. Creation of a Reference Image with Monte Carlo Simulations for Online EPID Verification of Daily Patient Setup

    SciTech Connect

    Descalle, M-A; Chuang, C; Pouliot, J

    2002-01-30

    Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as reference by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study investigates the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo-simulated images exhibit qualities similar to portal images; the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify set-up errors during treatment.
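
    The abstract does not specify the correlation algorithm, so the following is only a plausible sketch: estimating the in-plane setup shift between a simulated reference image and a portal image from the peak of their FFT-based cross-correlation. The function name and sign convention are assumptions and would need to be checked against the actual imaging geometry.

```python
import numpy as np

def setup_shift(reference, portal):
    """Estimate the (row, column) translation between two same-sized images
    from the peak of their circular cross-correlation (mean-subtracted, FFT-based)."""
    ref = reference - reference.mean()
    por = portal - portal.mean()
    corr = np.fft.ifft2(np.conj(np.fft.fft2(ref)) * np.fft.fft2(por)).real
    peak = np.unravel_index(np.argmax(corr), corr.shape)
    # Unwrap the periodic peak position into a signed shift in pixels.
    return tuple(p if p <= s // 2 else p - s for p, s in zip(peak, corr.shape))
```

    Multiplying the returned pixel shift by the detector pixel size gives the setup deviation in millimetres.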

  9. SU-E-T-582: On-Line Dosimetric Verification of Respiratory Gated Volumetric Modulated Arc Therapy Using the Electronic Portal Imaging Device

    SciTech Connect

    Schaly, B; Gaede, S; Xhaferllari, I

    2015-06-15

    Purpose: To investigate the clinical utility of on-line verification of respiratory gated VMAT dosimetry during treatment. Methods: Portal dose images were acquired during treatment in integrated mode on a Varian TrueBeam (v. 1.6) linear accelerator for gated lung and liver patients that used flattening filtered beams. The source to imager distance (SID) was set to 160 cm to ensure imager clearance in case the isocenter was off midline. Note that acquisition of integrated images resulted in no extra dose to the patient. Fraction 1 was taken as baseline and all portal dose images were compared to that of the baseline, where the gamma comparison and dose difference were used to measure day-to-day exit dose variation. All images were analyzed in the Portal Dosimetry module of Aria (v. 10). The portal imager on the TrueBeam was calibrated by following the instructions for dosimetry calibration in service mode, where we define 1 calibrated unit (CU) equal to 1 Gy for 10×10 cm field size at 100 cm SID. This reference condition was measured frequently to verify imager calibration. Results: The gamma value (3%, 3 mm, 5% threshold) ranged between 92% and 100% for the lung and liver cases studied. The exit dose can vary by as much as 10% of the maximum dose for an individual fraction. The integrated images combined with the information given by the corresponding on-line soft tissue matched cone-beam computed tomography (CBCT) images were useful in explaining dose variation. For gated lung treatment, dose variation was mainly due to the diaphragm position. For gated liver treatment, the dose variation was due to both diaphragm position and weight loss. Conclusion: Integrated images can be useful in verifying dose delivery consistency during respiratory gated VMAT, although the CBCT information is needed to explain dose differences due to anatomical changes.
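
    As a minimal sketch of the day-to-day comparison described above (fraction 1 as baseline, later fractions expressed relative to the baseline maximum), one might write the following; the function name and the use of plain NumPy arrays of calibrated units (CU) are assumptions, not the Portal Dosimetry implementation.

```python
import numpy as np

def exit_dose_variation(baseline_cu, fraction_cu):
    """Per-pixel dose difference of a fraction relative to the fraction-1 baseline,
    expressed as a percentage of the baseline maximum, plus its largest magnitude."""
    pct_of_max = 100.0 * (fraction_cu - baseline_cu) / baseline_cu.max()
    return pct_of_max, float(np.abs(pct_of_max).max())
```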

  10. Actively Promoting Student Engagement within an Online Environment: Developing and Implementing a Signature Subject on "Contemporary Issues in Sex and Sexuality"

    ERIC Educational Resources Information Center

    Fletcher, Gillian; Dowsett, Gary W.; Austin, Lilian

    2012-01-01

    La Trobe University is committed to improving the first year experience, and to developing its online teaching portfolio in response to increasing student demand. This article will acknowledge that these two objectives will remain contradictory if online learning systems are used predominantly as repositories of information with little thought…

  11. Geometric verification

    NASA Technical Reports Server (NTRS)

    Grebowsky, G. J.

    1982-01-01

    Present LANDSAT data formats are reviewed to clarify how the geodetic location and registration capabilities were defined for P-tape products and RBV data. Since there is only one geometric model used in the master data processor, geometric location accuracy of P-tape products depends on the absolute accuracy of the model and registration accuracy is determined by the stability of the model. Due primarily to inaccuracies in data provided by the LANDSAT attitude management system, desired accuracies are obtained only by using ground control points and a correlation process. The verification of system performance with regards to geodetic location requires the capability to determine pixel positions of map points in a P-tape array. Verification of registration performance requires the capability to determine pixel positions of common points (not necessarily map points) in 2 or more P-tape arrays for a given world reference system scene. Techniques for registration verification can be more varied and automated since map data are not required. The verification of LACIE extractions is used as an example.
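
    Registration verification as described above amounts to comparing pixel positions of common points measured in two P-tape arrays. The following is a minimal sketch of that bookkeeping (mean offset plus RMS residual); it is illustrative only and not the master data processor's algorithm.

```python
import numpy as np

def registration_error(points_a, points_b):
    """points_a, points_b: (N, 2) arrays of (line, sample) pixel positions of the
    same control points in two P-tape arrays for one scene.  Returns the mean
    offset between the arrays and the RMS residual after removing that offset."""
    a = np.asarray(points_a, dtype=float)
    b = np.asarray(points_b, dtype=float)
    offsets = b - a
    mean_offset = offsets.mean(axis=0)
    rms = float(np.sqrt(((offsets - mean_offset) ** 2).sum(axis=1).mean()))
    return mean_offset, rms
```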

  12. Signature-based store checking buffer

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify the content is error-free.
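
    The idea can be sketched in software as follows: each redundant instance folds its stores into a running hash, and only the resulting signatures are compared at the barrier. This is a purely illustrative model of the patented hardware buffer; the class and function names are invented for the example.

```python
import hashlib

class StoreFingerprintBuffer:
    """Software model of a signature-based store checking buffer."""

    def __init__(self):
        self._hash = hashlib.sha256()

    def record_store(self, address: int, value: int):
        # Fold each (address, value) store into the running signature
        # (64-bit unsigned fields assumed for the sketch).
        self._hash.update(address.to_bytes(8, "little"))
        self._hash.update(value.to_bytes(8, "little"))

    def signature(self) -> str:
        return self._hash.hexdigest()

def verify_at_barrier(buffers) -> bool:
    # Identical signatures across all redundant instances imply the outputs agree.
    return len({buf.signature() for buf in buffers}) == 1
```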

  13. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    PubMed

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-01

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C(18) stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm and (1)H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs in which high concentrations of interfering background chemicals had been removed by the preceding sample preparation. The reported standard deviation for quantification relates to the UV detector, which showed relative standard deviations (RSDs) within +/-1.1%, while the lower limit of detection for the NMR detector was up to 16 µg (absolute). Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.

  14. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention.

    PubMed

    Proyer, René T; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths, which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using SS with one on using individual low scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on LS rather than SS and those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness.

  15. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention

    PubMed Central

    Proyer, René T.; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths, which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using SS with one on using individual low scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on LS rather than SS and those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness. PMID:25954221

  16. Redactable signatures for signed CDA Documents.

    PubMed

    Wu, Zhen-Yu; Hsueh, Chih-Wen; Tsai, Cheng-Yu; Lai, Feipei; Lee, Hung-Chang; Chung, Yufang

    2012-06-01

    The Clinical Document Architecture, introduced by Health Level Seven, is an XML-based standard that specifies the encoding, structure, and semantics of clinical documents for exchange. Since the clinical document is in XML form, its authenticity and integrity can be guaranteed by the XML signature published by the W3C. When a clinical document must conceal some personal or private information, the document needs to be redacted, which prevents the signature over the original clinical document from being verified. Redactable signatures have therefore been proposed to enable verification of the redacted document. Little research has addressed the implementation of redactable signatures, and no appropriate scheme yet exists for clinical documents. This paper investigates existing web technologies and identifies a compact and applicable model for implementing a suitable redactable signature for the clinical document viewer. PMID:21181244
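
    A common construction behind redactable signatures (not necessarily the scheme proposed in this paper) is to sign the concatenation of salted hashes of the document's segments, so that a segment can later be replaced by its hash without invalidating the signature. A minimal sketch, with invented helper names:

```python
import hashlib, os

def _digest(segment: bytes, salt: bytes) -> bytes:
    return hashlib.sha256(salt + segment).digest()

def prepare(segments):
    """Attach a random salt to each segment and return the byte string that an
    ordinary signature scheme would sign: the concatenated salted hashes."""
    items = [("plain", seg, os.urandom(16)) for seg in segments]
    to_sign = b"".join(_digest(seg, salt) for _, seg, salt in items)
    return items, to_sign

def redact(items, index):
    """Replace one segment with its digest; the value that was signed is unchanged."""
    kind, seg, salt = items[index]
    items = list(items)
    items[index] = ("redacted", _digest(seg, salt), None)
    return items

def signed_value(items):
    """Recompute the signed byte string from a (possibly redacted) document."""
    return b"".join(_digest(p, s) if kind == "plain" else p
                    for kind, p, s in items)
```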

  17. A Quantum Multi-proxy Blind Signature Scheme Based on Genuine Four-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping

    2016-02-01

    In this paper, we propose a multi-proxy blind signature scheme based on controlled teleportation. Genuine four-qubit entangled state functions as quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security analysis shows the scheme satisfies the security features of multi-proxy signature, unforgeability, undeniability, blindness and unconditional security.

  18. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.

  19. Characterization of a fiber-coupled Al₂O₃:C luminescence dosimetry system for online in vivo dose verification during ¹⁹²Ir brachytherapy

    SciTech Connect

    Andersen, Claus E.; Nielsen, Soeren Kynde; Greilich, Steffen; Helt-Hansen, Jakob; Lindegaard, Jacob Christian; Tanderup, Kari

    2009-03-15

    A prototype of a new dose-verification system has been developed to facilitate prevention and identification of dose delivery errors in remotely afterloaded brachytherapy. The system allows for automatic online in vivo dosimetry directly in the tumor region using small passive detector probes that fit into applicators such as standard needles or catheters. The system measures the absorbed dose rate (0.1 s time resolution) and total absorbed dose on the basis of radioluminescence (RL) and optically stimulated luminescence (OSL) from aluminum oxide crystals attached to optical fiber cables (1 mm outer diameter). The system was tested in the range from 0 to 4 Gy using a solid-water phantom, a Varian GammaMed Plus ¹⁹²Ir PDR afterloader, and dosimetry probes inserted into stainless-steel brachytherapy needles. The calibrated system was found to be linear in the tested dose range. The reproducibility (one standard deviation) for RL and OSL measurements was 1.3%. The measured depth-dose profiles agreed well with the theoretical expectations computed with the EGSNRC Monte Carlo code, suggesting that the energy dependence for the dosimeter probes (relative to water) is less than 6% for source-to-probe distances in the range of 2-50 mm. Under certain conditions, the RL signal could be greatly disturbed by the so-called stem signal (i.e., unwanted light generated in the fiber cable upon irradiation). The OSL signal is not subject to this source of error. The tested system appears to be adequate for in vivo brachytherapy dosimetry.

  20. Realizing digital signatures for medical imaging and reporting in a PACS environment.

    PubMed

    Lien, Chung-Yueh; Yang, Tsung-Lung; Hsiao, Chia-Hung; Kao, Tsair

    2013-02-01

    According to Taiwan's legislation pertaining to the protection of electronic data, the creators of electronic medical records (EMR) are solely responsible for the security of EMR. However, actual implementations that fulfill the security standards and requirements for electronic medical record systems are still lacking. Most EMRs created from picture archiving and communication systems are not considered secure, as security protection mechanisms have not yet been granted legal status. This paper describes the details of establishing a digital signature system using Taiwan health professional cards. A digital signature system has been included to ensure quality assurance (QA) operations are controlled by technicians, and reporting capabilities have been provided for radiologists. Six imaging modalities and eight types of radiology reports have also been included in the system. Results indicate that the process of creating QA signatures does not have an adverse effect on the workflow of the facility, requiring less time for the signing and verification of radiology reports. This system has already been used routinely online in a real clinical setting for more than 2 years.
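
    The abstract does not give implementation details, so the following is only a minimal sketch of signing and verifying a report with a private key, using the Python cryptography package; in the described system the private key would reside on the Taiwan health professional card rather than being generated in software.

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Throwaway key for illustration only; a real deployment would use the key on
# the health professional smart card.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_key = private_key.public_key()

report = b"<ClinicalDocument>...example radiology report...</ClinicalDocument>"

# Sign the report (PKCS#1 v1.5 padding with SHA-256).
signature = private_key.sign(report, padding.PKCS1v15(), hashes.SHA256())

# Verification raises InvalidSignature if the report or signature was altered.
public_key.verify(signature, report, padding.PKCS1v15(), hashes.SHA256())
```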

  1. Modeling the lexical morphology of Western handwritten signatures.

    PubMed

    Diaz-Cabrera, Moises; Ferrer, Miguel A; Morales, Aythami

    2015-01-01

    A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a General Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwriting signatures.
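
    A minimal sketch of the modelling step described above (fitting a General Extreme Value distribution to one morphological measurement) with SciPy; the sample values are invented for illustration and do not come from the paper's datasets.

```python
import numpy as np
from scipy.stats import genextreme

# Hypothetical sample: one morphological measurement (e.g. signature width in mm) per signer.
widths = np.array([41.2, 55.0, 38.7, 62.3, 47.5, 51.1, 44.8, 58.9, 49.6, 53.4])

shape, loc, scale = genextreme.fit(widths)
print(f"GEV fit: shape={shape:.3f}, loc={loc:.2f}, scale={scale:.2f}")

# Probability of observing a signature wider than 60 mm under the fitted model.
print("P(width > 60 mm) =", genextreme.sf(60.0, shape, loc=loc, scale=scale))
```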

  2. Modeling the Lexical Morphology of Western Handwritten Signatures

    PubMed Central

    Diaz-Cabrera, Moises; Ferrer, Miguel A.; Morales, Aythami

    2015-01-01

    A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a General Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwriting signatures. PMID:25860942

  3. Towards a Better Understanding of the Oxygen Isotope Signature of Atmospheric CO2: Determining the 18O-Exchange Between CO2 and H2O in Leaves and Soil On-line with Laser-Based Spectroscopy

    NASA Astrophysics Data System (ADS)

    Gangi, L.; Rothfuss, Y.; Vereecken, H.; Brueggemann, N.

    2013-12-01

    The oxygen isotope signature of carbon dioxide (δ18O-CO2) is a powerful tool to disentangle CO2 fluxes in terrestrial ecosystems, as CO2 attains a contrasting 18O signature by the interaction with isotopically different soil and leaf water pools during soil respiration and photosynthesis, respectively. However, using the δ18O-CO2 signal to quantify plant-soil-atmosphere CO2 fluxes is still challenging due to a lack of knowledge concerning the magnitude and effect of individual fractionation processes during CO2 and H2O diffusion and during CO2-H2O isotopic exchange in soils and leaves, especially related to short-term changes in environmental conditions (non-steady state). This study addresses this research gap by combined on-line monitoring of the oxygen isotopic signature of CO2 and water vapor during gas exchange in soil and plant leaves with laser-based spectroscopy, using soil columns and plant chambers. In both experimental setups, the measured δ18O of water vapor was used to infer the δ18O of liquid water, and, together with the δ18O-CO2, the degree of oxygen isotopic equilibrium between the two species (θ). Gas exchange experiments with different functional plant types (C3 coniferous, C3 monocotyledonous, C3 dicotyledonous, C4) revealed that θ and the influence of the plant on the ambient δ18O-CO2 (CO18O-isoforcing) not only varied on a diurnal timescale but also when plants were exposed to limited water availability, elevated air temperature, and abrupt changes in light intensity (sunflecks). Maximum θ before treatments ranged between 0.7 and 0.8 for the C3 dicotyledonous (poplar) and C3 monocotyledonous (wheat) plants, and between 0.5 and 0.6 for the conifer (spruce) and C4 plant (maize) while maximum CO18O-isoforcing was highest in wheat (0.03 m s-1 ‰), similar in poplar and maize (0.02 m s-1 ‰), and lowest in spruce (0.01 m s-1 ‰). Multiple regression analysis showed that up to 97 % of temporal dynamics in CO18O-isoforcing could be

  4. Signatures of Reputation

    NASA Astrophysics Data System (ADS)

    Bethencourt, John; Shi, Elaine; Song, Dawn

    Reputation systems have become an increasingly important tool for highlighting quality information and filtering spam within online forums. However, the dependence of a user's reputation on their history of activities seems to preclude any possibility of anonymity. We show that useful reputation information can, in fact, coexist with strong privacy guarantees. We introduce and formalize a novel cryptographic primitive we call signatures of reputation which supports monotonic measures of reputation in a completely anonymous setting. In our system, a user can express trust in others by voting for them, collect votes to build up her own reputation, and attach a proof of her reputation to any data she publishes, all while maintaining the unlinkability of her actions.

  5. Verification Tools Secure Online Shopping, Banking

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Just like rover or rocket technology sent into space, the software that controls these technologies must be extensively tested to ensure reliability and effectiveness. Ames Research Center invented the open-source Java Pathfinder (JPF) toolset for the deep testing of Java-based programs. Fujitsu Labs of America Inc., based in Sunnyvale, California, improved the capabilities of the JPF Symbolic Pathfinder tool, establishing the tool as a means of thoroughly testing the functionality and security of Web-based Java applications such as those used for Internet shopping and banking.

  6. Biometric verification in dynamic writing

    NASA Astrophysics Data System (ADS)

    George, Susan E.

    2002-03-01

    Pen-tablet devices capable of capturing the dynamics of writing record temporal and pressure information as well as the spatial pattern. This paper explores biometric verification based upon the dynamics of writing where writers are distinguished not on the basis of what they write (ie the signature), but how they write. We have collected samples of dynamic writing from 38 Chinese writers. Each writer was asked to provide 10 copies of a paragraph of text and the same number of signature samples. From the data we have extracted stroke-based primitives from the sentence data utilizing pen-up/down information and heuristic rules about the shape of the character. The x, y and pressure values of each primitive were interpolated into an even temporal range based upon a 20 msec sampling rate. We applied the Daubechies 1 wavelet transform to the x signal, y signal and pressure signal using the coefficients as inputs to a multi-layer perceptron trained with back-propagation on the sentence data. We found a sensitivity of 0.977 and specificity of 0.990 recognizing writers based on test primitives extracted from sentence data and measures of 0.916 and 0.961 respectively, from test primitives extracted from signature data.
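
    A minimal sketch of the feature pipeline described above: resample one stroke primitive's x, y and pressure channels onto an even time grid and take single-level Daubechies-1 wavelet coefficients with PyWavelets. The function name, the 32-sample grid and the single decomposition level are assumptions made for the example; the abstract does not specify them.

```python
import numpy as np
import pywt

def primitive_features(t, x, y, p, n_samples=32):
    """Concatenated db1 wavelet coefficients of the resampled x, y and pressure
    signals of one stroke primitive (inputs are 1D arrays over the same times t)."""
    t_even = np.linspace(t[0], t[-1], n_samples)
    feats = []
    for channel in (x, y, p):
        resampled = np.interp(t_even, t, channel)
        approx, detail = pywt.dwt(resampled, "db1")  # single-level Daubechies-1 transform
        feats.append(np.concatenate([approx, detail]))
    return np.concatenate(feats)  # feature vector for the multi-layer perceptron
```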

  7. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  8. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  9. A hybrid digital-signature and zero-watermarking approach for authentication and protection of sensitive electronic documents.

    PubMed

    Tayan, Omar; Kabir, Muhammad N; Alginahi, Yasser M

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints.
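
    The abstract does not describe the exact feature extraction, so the following is only a generic zero-watermarking-style sketch: a signature is derived from statistics of the text itself (nothing is embedded, so the cover text is never modified) and recomputed later to detect tampering. The names and the choice of statistics are illustrative assumptions.

```python
import hashlib
from collections import Counter

def text_fingerprint(text: str) -> str:
    """Derive a 'zero-watermark' from the text itself: the word-length profile and
    letter frequencies are hashed into a compact signature."""
    profile = " ".join(str(len(word)) for word in text.split())
    freqs = Counter(c for c in text.lower() if c.isalpha())
    profile += "|" + ",".join(f"{c}:{n}" for c, n in sorted(freqs.items()))
    return hashlib.sha256(profile.encode("utf-8")).hexdigest()

def verify_fingerprint(text: str, registered: str) -> bool:
    # Any modification of the content changes the recomputed fingerprint.
    return text_fingerprint(text) == registered
```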

  10. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    PubMed Central

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247

  12. Quantum blind dual-signature scheme without arbitrator

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

    Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., initial phase, signing phase and verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. Unlike previous quantum signature schemes, it does not rely heavily on an arbitrator in the verification phase. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. It provides a potential application for e-commerce or e-payment systems with current technology.

  13. Verification of Adaptive Systems

    SciTech Connect

    Pullum, Laura L; Cui, Xiaohui; Vassev, Emil; Hinchey, Mike; Rouff, Christopher; Buskens, Richard

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance for them. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  14. Security Problems in the Quantum Signature Scheme with a Weak Arbitrator

    NASA Astrophysics Data System (ADS)

    Zou, Xiangfu; Qiu, Daowen; Yu, Fang; Mateus, Paulo

    2013-10-01

    Very recently, a quantum signature scheme with weak arbitrator was presented (Luo et al. in Int. J. Theor. Phys. 51:2135-2142, 2012). A weak arbitrator is only involved in the disagreement case, which means that the scheme is costless. In this paper, the security of the quantum signature scheme with weak arbitrator is analyzed. We show that attackers can counterfeit a signature for any message, which will pass the verification for the signer. In addition, they can counterfeit a signature for any one of the 4^L (L is the length of the intercepted quantum message) messages by employing the known message attack, which will pass the verification for the signed message. In particular, by employing the Z-transform attack, the attackers can forge a signature for any one of the 2^L messages, which will pass the verifications for both the signer and the signed message successfully.

  15. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  16. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  17. 75 FR 76080 - Agency Information Collection (VetBiz Vendor Information Pages Verification Program) Activity...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-07

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF VETERANS AFFAIRS Agency Information Collection (VetBiz Vendor Information Pages Verification Program) Activity... . Please refer to ``OMB Control No. 2900- 0675.'' SUPPLEMENTAL INFORMATION: Title: VetBiz...

  18. Signatures support program

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.

    2009-05-01

    The Signatures Support Program (SSP) leverages the full spectrum of signature-related activities (collections, processing, development, storage, maintenance, and dissemination) within the Department of Defense (DOD), the intelligence community (IC), other Federal agencies, and civil institutions. The Enterprise encompasses acoustic, seismic, radio frequency, infrared, radar, nuclear radiation, and electro-optical signatures. The SSP serves the war fighter, the IC, and civil institutions by supporting military operations, intelligence operations, homeland defense, disaster relief, acquisitions, and research and development. Data centers host and maintain signature holdings, collectively forming the national signatures pool. The geographically distributed organizations are the authoritative sources and repositories for signature data; the centers are responsible for data content and quality. The SSP proactively engages DOD, IC, other Federal entities, academia, and industry to locate signatures for inclusion in the distributed national signatures pool and provides world-wide 24/7 access via the SSP application.

  19. 78 FR 47804 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-06

    ... COMMISSION Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety Systems...), revision 2 of RG 1.168, ``Verification, Validation, Reviews, and Audits for Digital Computer Software Used... available documents online in the NRC Library at http://www.nrc.gov/reading-rm/adams.html . To begin...

  20. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  1. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  2. Verification and arms control

    SciTech Connect

    Potter, W.C.

    1985-01-01

    Recent years have witnessed an increased stress upon the verification of arms control agreements, both as a technical problem and as a political issue. As one contribution here points out, the middle ground has shrunk between those who are persuaded that the Soviets are ''cheating'' and those who are willing to take some verification risks for the sake of achieving arms control. One angle, according to a Lawrence Livermore physicist who served as a member of the delegation to the various test-ban treaty negotiations, is the limited effectiveness of on-site inspection as compared to other means of verification.

  3. Verification of RADTRAN

    SciTech Connect

    Kanipe, F.L.; Neuhauser, K.S.

    1995-12-31

    This document presents details of the verification process of the RADTRAN computer code which was established for the calculation of risk estimates for radioactive materials transportation by highway, rail, air, and waterborne modes.

  4. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol: a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  5. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    SciTech Connect

    Paul, J. N.; Chin, M. R.; Sjoden, G. E.

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications to include creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine the expected reaction rates using transport theory in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)

  6. An Arbitrated Quantum Signature with Bell States

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Qin, Su-Juan; Huang, Wei

    2014-05-01

    Entanglement is the main resource in quantum communication. The main aims of the arbitrated quantum signature (AQS) scheme are to present an application of entanglement in cryptology and to prove the possibility of the quantum signature. More specifically, the main function of quantum entangled states in the existing AQS schemes is to assist the signatory to transfer quantum states to the receiver. However, teleportation and the Leung quantum one-time pad (L-QOTP) algorithm are not enough to design a secure AQS scheme. For example, Pauli operations commute or anticommute with each other, which makes forgery and disavowal attacks easy to implement. To conquer this shortcoming, we construct an improved AQS scheme using a new QOTP algorithm. This scheme has three advantages: it randomly uses the Hadamard operation in the new QOTP to resist attacks that exploit the anticommutativity of nontrivial Pauli operators, while preserving almost all merits of the existing AQS schemes; even in the process of handling disputes, no party has a chance to change the message and its signature without being discovered; and the receiver can verify the integrity of the signature and detect disavowal by the signatory even in the last step of verification.

  7. Electronic health records: what does your signature signify?

    PubMed

    Victoroff Md, Michael S

    2012-01-01

    Electronic health records serve multiple purposes, including clinical communication, legal documentation, financial transaction capture, research and analytics. Electronic signatures attached to entries in EHRs have different logical and legal meanings for different users. Some of these are vestiges from historic paper formats that require reconsideration. Traditionally accepted functions of signatures, such as identity verification, attestation, consent, authorization and non-repudiation can become ambiguous in the context of computer-assisted workflow processes that incorporate functions like logins, auto-fill and audit trails. This article exposes the incompatibility of expectations among typical users of electronically signed information. PMID:22888846

  8. [Vanguard Signature TKR--first experiences].

    PubMed

    Stempin, Radosław; Kotela, Andrzej; Ostrowska, Monika; Kotela, Ireneusz

    2011-01-01

    On 15 May 2010 the first computer-planned Vanguard Signature total knee arthroplasty in Poland was performed, and since then 65 patients have been operated on with this method, including at the Orthopedic Traumatology Department of the Central Clinical Hospital of the Ministry of Interior and Administration in Warsaw and the Orthopedic Surgery Department of the Promienista Clinic in Poznan. The new system involves programming the technical parameters of the operation on the basis of a diagnostic analysis of the lower extremity using CT or MRI scans. The data are transferred to Signature Positioning Guides (SPG), which serve as navigation during surgery. Minimal bone resection and correct implant sizing and placement, with reconstruction of the mechanical axis of the limb, provide proper functioning of the knee joint and reduce the risk of implant loosening. Further benefits include reduced instrumentation, less femoral trauma, and a lower average postoperative blood transfusion volume. The operator using Signature technology is required to have advanced knowledge of conventional TKR and intermediate computer skills. Access to the program and materials, and online communication with the Signature team in the USA, allow the surgeon to modify the parameters of the operation and obtain the necessary expert feedback. The rapid increase in the number of surgeons registered in the Signature system shows considerable interest in this technology.

  9. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
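
    For reference, the adjoint-weighted residual estimate that underlies this kind of output-based error control is commonly written as below; the notation is generic and not quoted from the paper.

```latex
% Discretization error in an output functional J, estimated by weighting the
% residual of the discrete flow solution u_h with the discrete adjoint psi_h:
J(u) - J(u_h) \;\approx\; -\,\psi_h^{\mathsf{T}} R(u_h)
% Local contributions |psi_h^T R(u_h)| are then used to drive mesh adaptation.
```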

  10. Digital Signature Management.

    ERIC Educational Resources Information Center

    Hassler, Vesna; Biely, Helmut

    1999-01-01

    Describes the Digital Signature Project that was developed in Austria to establish an infrastructure for applying smart card-based digital signatures in banking and electronic-commerce applications. Discusses the need to conform to international standards, an international certification infrastructure, and security features for a public directory…

  11. Controlling radar signature

    SciTech Connect

    Foulke, K.W. )

    1992-08-01

    Low observable technologies for military and tactical aircraft are reviewed including signature-reduction techniques and signal detection/jamming. Among the applications considered are low-signature sensors and the reduction of radar cross section in conjunction with radar-absorbing structures and materials. Technologies for reducing radar cross section are shown to present significant technological challenges, although they afford enhanced aircraft survivability.

  12. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  13. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.
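
    As a rough illustration of how labels could be threaded through verification-condition generation, the toy sketch below computes weakest preconditions for two assignments with SymPy and records an explanation string at each step; the language, labels, and trace format are invented for illustration and are not the authors' calculus.

    ```python
    # Toy sketch (assumed tiny imperative language with assignments only),
    # illustrating how labels can be carried through weakest-precondition
    # computation so each verification condition comes with an explanation.
    import sympy as sp

    x, y = sp.symbols("x y")

    def wp_assign(var, expr, post, label, trace):
        """wp(var := expr, post) = post[expr/var]; record why the substitution happened."""
        trace.append(f"{label}: substitute {expr} for {var} in {post}")
        return post.subs(var, expr)

    # Program:  y := x + 1 ; x := 2*y    with postcondition x > 2
    post = sp.Gt(x, 2)
    trace = []
    cond = wp_assign(x, 2 * y, post, "assign-2 (x := 2*y)", trace)
    cond = wp_assign(y, x + 1, cond, "assign-1 (y := x+1)", trace)

    print("verification condition (precondition):", cond)   # 2*x + 2 > 2, i.e. x > 0
    for line in reversed(trace):
        print("  explanation --", line)
    ```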

  14. Wind gust warning verification

    NASA Astrophysics Data System (ADS)

    Primo, Cristina

    2016-07-01

    Operational meteorological centres around the world increasingly include warnings as one of their regular forecast products. Warnings are issued to warn the public about extreme weather situations that might occur leading to damages and losses. In forecasting these extreme events, meteorological centres help their potential users in preventing the damage or losses they might suffer. However, verifying these warnings requires specific methods. This is due not only to the fact that they happen rarely, but also because a new temporal dimension is added when defining a warning, namely the time window of the forecasted event. This paper analyses the issues that might appear when dealing with warning verification. It also proposes some new verification approaches that can be applied to wind warnings. These new techniques are later applied to a real life example, the verification of wind gust warnings at the German Meteorological Centre ("Deutscher Wetterdienst"). Finally, the results obtained from the latter are discussed.

  15. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of

  16. Verification of Ceramic Structures

    NASA Astrophysics Data System (ADS)

    Behar-Lafenetre, Stephanie; Cornillon, Laurence; Rancurel, Michael; De Graaf, Dennis; Hartmann, Peter; Coe, Graham; Laine, Benoit

    2012-07-01

    In the framework of the “Mechanical Design and Verification Methodologies for Ceramic Structures” contract [1] awarded by ESA, Thales Alenia Space has investigated literature and practices in affiliated industries to propose a methodological guideline for verification of ceramic spacecraft and instrument structures. It has been written in order to be applicable to most types of ceramic or glass-ceramic materials - typically Cesic®, HBCesic®, Silicon Nitride, Silicon Carbide and ZERODUR®. The proposed guideline describes the activities to be performed at material level in order to cover all the specific aspects of ceramics (Weibull distribution, brittle behaviour, sub-critical crack growth). Elementary tests and their post-processing methods are described, and recommendations for optimization of the test plan are given in order to have a consistent database. The application of this method is shown on an example in a dedicated article [7]. Then the verification activities to be performed at system level are described. This includes classical verification activities based on relevant standard (ECSS Verification [4]), plus specific analytical, testing and inspection features. The analysis methodology takes into account the specific behaviour of ceramic materials, especially the statistical distribution of failures (Weibull) and the method to transfer it from elementary data to a full-scale structure. The demonstration of the efficiency of this method is described in a dedicated article [8]. The verification is completed by classical full-scale testing activities. Indications about proof testing, case of use and implementation are given and specific inspection and protection measures are described. These additional activities are necessary to ensure the required reliability. The aim of the guideline is to describe how to reach the same reliability level as for structures made of more classical materials (metals, composites).

  17. Study of Dynamic Characteristics of Aeroelastic Systems Utilizing Randomdec Signatures

    NASA Technical Reports Server (NTRS)

    Chang, C. S.

    1975-01-01

    The feasibility of utilizing the random decrement method in conjunction with a signature analysis procedure to determine the dynamic characteristics of an aeroelastic system for the purpose of on-line prediction of the potential onset of flutter was examined. Digital computer programs were developed to simulate sampled response signals of a two-mode aeroelastic system. Simulated response data were used to test the random decrement method. A special curve-fit approach was developed for analyzing the resulting signatures. A number of numerical 'experiments' were conducted on the combined processes. The method is capable of determining frequency and damping values accurately from randomdec signatures of carefully selected lengths.
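
    A minimal sketch of the random decrement idea follows, using an assumed simulated single-mode response (a digital resonator driven by white noise) rather than the report's aeroelastic model: response segments starting at a common trigger level are averaged into a randomdec signature, and a damped cosine is then fitted to estimate frequency and damping.

    ```python
    # Illustrative sketch (simulated data, assumed parameters): the random decrement
    # technique averages response segments that start from a common trigger level,
    # yielding a free-decay-like signature from which frequency and damping follow.
    import numpy as np
    from scipy import signal, optimize

    fs, fn, zeta = 200.0, 5.0, 0.02           # sample rate, natural frequency (Hz), damping ratio
    dt = 1.0 / fs
    wn = 2 * np.pi * fn
    wd = wn * np.sqrt(1 - zeta**2)
    r, th = np.exp(-zeta * wn * dt), wd * dt
    a = [1.0, -2 * r * np.cos(th), r**2]      # digital resonator driven by white noise
    rng = np.random.default_rng(1)
    x = signal.lfilter([1.0], a, rng.standard_normal(200_000))

    level, seg = 1.5 * np.std(x), int(2.0 * fs)              # trigger level, 2 s segments
    starts = np.where((x[:-1] < level) & (x[1:] >= level))[0] + 1
    starts = starts[starts + seg < len(x)]
    randomdec = np.mean([x[s:s + seg] for s in starts], axis=0)   # the randomdec signature

    t = np.arange(seg) * dt
    decay = lambda t, A, z, w, p: A * np.exp(-z * w * t) * np.cos(w * np.sqrt(1 - z**2) * t + p)
    (A, z_est, w_est, p), _ = optimize.curve_fit(decay, t, randomdec,
                                                 p0=[randomdec[0], 0.05, wn, 0.0])
    print(f"estimated frequency: {w_est / (2 * np.pi):.2f} Hz, damping ratio: {z_est:.3f}")
    ```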

  18. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  19. Signature detection and matching for document image retrieval.

    PubMed

    Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan

    2009-11-01

    As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from a cluttered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches. PMID:19762928

  20. UV Signature Mutations

    PubMed Central

    2014-01-01

    Sequencing complete tumor genomes and exomes has sparked the cancer field's interest in mutation signatures for identifying the tumor's carcinogen. This review and meta-analysis discusses signatures and their proper use. We first distinguish between a mutagen's canonical mutations – deviations from a random distribution of base changes to create a pattern typical of that mutagen – and the subset of signature mutations, which are unique to that mutagen and permit inference backward from mutations to mutagen. To verify UV signature mutations, we assembled literature datasets on cells exposed to UVC, UVB, UVA, or solar simulator light (SSL) and tested canonical UV mutation features as criteria for clustering datasets. A confirmed UV signature was: ≥60% of mutations are C→T at a dipyrimidine site, with ≥5% CC→TT. Other canonical features such as a bias for mutations on the non-transcribed strand or at the 3' pyrimidine had limited application. The most robust classifier combined these features with criteria for the rarity of non-UV canonical mutations. In addition, several signatures proposed for specific UV wavelengths were limited to specific genes or species; non-signature mutations induced by UV may cause melanoma BRAF mutations; and the mutagen for sunlight-related skin neoplasms may vary between continents. PMID:25354245
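
    The confirmed UV-signature criterion quoted above lends itself to a simple check; the sketch below is a simplified stand-in (the dipyrimidine test is reduced to the 5' neighbouring base, and the input format is assumed) rather than the full classifier from the review.

    ```python
    # Hedged sketch of the classifier rule quoted above: a mutation set is called
    # "UV signature" when >=60% of changes are C->T at a dipyrimidine site and
    # >=5% are CC->TT tandem changes. Input format and dipyrimidine test are assumed.
    def is_uv_signature(single_muts, tandem_muts):
        """
        single_muts: list of (ref, alt, five_prime_base) tuples for single-base substitutions
        tandem_muts: list of (ref_pair, alt_pair) tuples for tandem substitutions
        """
        pyrimidines = {"C", "T"}
        n_total = len(single_muts) + len(tandem_muts)
        if n_total == 0:
            return False
        c_to_t_dipyr = sum(1 for ref, alt, five in single_muts
                           if ref == "C" and alt == "T" and five in pyrimidines)
        cc_to_tt = sum(1 for ref, alt in tandem_muts if ref == "CC" and alt == "TT")
        return (c_to_t_dipyr / n_total >= 0.60) and (cc_to_tt / n_total >= 0.05)

    # Example: 7 of 10 mutations are C->T at dipyrimidine sites, 1 of 10 is CC->TT.
    singles = [("C", "T", "T")] * 7 + [("A", "G", "G"), ("G", "T", "C")]
    tandems = [("CC", "TT")]
    print(is_uv_signature(singles, tandems))   # True
    ```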

  1. An archaeal genomic signature

    NASA Technical Reports Server (NTRS)

    Graham, D. E.; Overbeek, R.; Olsen, G. J.; Woese, C. R.

    2000-01-01

    Comparisons of complete genome sequences allow the most objective and comprehensive descriptions possible of a lineage's evolution. This communication uses the completed genomes from four major euryarchaeal taxa to define a genomic signature for the Euryarchaeota and, by extension, the Archaea as a whole. The signature is defined in terms of the set of protein-encoding genes found in at least two diverse members of the euryarchaeal taxa that function uniquely within the Archaea; most signature proteins have no recognizable bacterial or eukaryal homologs. By this definition, 351 clusters of signature proteins have been identified. Functions of most proteins in this signature set are currently unknown. At least 70% of the clusters that contain proteins from all the euryarchaeal genomes also have crenarchaeal homologs. This conservative set, which appears refractory to horizontal gene transfer to the Bacteria or the Eukarya, would seem to reflect the significant innovations that were unique and fundamental to the archaeal "design fabric." Genomic protein signature analysis methods may be extended to characterize the evolution of any phylogenetically defined lineage. The complete set of protein clusters for the archaeal genomic signature is presented as supplementary material (see the PNAS web site, www.pnas.org).

  2. An archaeal genomic signature.

    PubMed

    Graham, D E; Overbeek, R; Olsen, G J; Woese, C R

    2000-03-28

    Comparisons of complete genome sequences allow the most objective and comprehensive descriptions possible of a lineage's evolution. This communication uses the completed genomes from four major euryarchaeal taxa to define a genomic signature for the Euryarchaeota and, by extension, the Archaea as a whole. The signature is defined in terms of the set of protein-encoding genes found in at least two diverse members of the euryarchaeal taxa that function uniquely within the Archaea; most signature proteins have no recognizable bacterial or eukaryal homologs. By this definition, 351 clusters of signature proteins have been identified. Functions of most proteins in this signature set are currently unknown. At least 70% of the clusters that contain proteins from all the euryarchaeal genomes also have crenarchaeal homologs. This conservative set, which appears refractory to horizontal gene transfer to the Bacteria or the Eukarya, would seem to reflect the significant innovations that were unique and fundamental to the archaeal "design fabric." Genomic protein signature analysis methods may be extended to characterize the evolution of any phylogenetically defined lineage. The complete set of protein clusters for the archaeal genomic signature is presented as supplementary material (see the PNAS web site, www.pnas.org).

  3. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.

  4. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  5. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  6. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

    Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  7. Is flow verification necessary

    SciTech Connect

    Beetle, T.M.

    1986-01-01

    Safeguards test statistics are used in an attempt to detect diversion of special nuclear material. Under assumptions concerning possible manipulation (falsification) of safeguards accounting data, the effects on the statistics due to diversion and data manipulation are described algebraically. A comprehensive set of statistics that is capable of detecting any diversion of material is defined in terms of the algebraic properties of the effects. When the assumptions exclude collusion between persons in two material balance areas, then three sets of accounting statistics are shown to be comprehensive. Two of the sets contain widely known accountancy statistics. One of them does not require physical flow verification - comparisons of operator and inspector data for receipts and shipments. The third set contains a single statistic which does not require physical flow verification. In addition to not requiring technically difficult and expensive flow verification, this single statistic has several advantages over other comprehensive sets of statistics. This algebraic approach as an alternative to flow verification for safeguards accountancy is discussed in this paper.
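
    As a generic illustration of the accountancy statistics this discussion builds on (not the specific comprehensive statistic sets defined in the paper), a basic material-balance quantity for one material balance area can be computed as follows.

    ```python
    # Generic illustration (not the paper's statistics): a basic materials-accountancy
    # quantity of the kind used in safeguards,
    # MUF = beginning inventory + receipts - shipments - ending inventory,
    # computed from operator-declared data for one material balance area.
    def material_unaccounted_for(beginning, receipts, shipments, ending):
        return beginning + sum(receipts) - sum(shipments) - ending

    muf = material_unaccounted_for(beginning=120.0,
                                   receipts=[40.0, 35.5],
                                   shipments=[38.2, 41.0],
                                   ending=116.0)
    print(f"MUF = {muf:.1f} kg")   # a nonzero value prompts statistical testing for diversion
    ```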

  8. Telescope performance verification

    NASA Astrophysics Data System (ADS)

    Swart, Gerhard P.; Buckley, David A. H.

    2004-09-01

    While Systems Engineering appears to be widely applied on the very large telescopes, it is lacking in the development of many of the medium and small telescopes currently in progress. The latter projects rely heavily on the experience of the project team, verbal requirements and conjecture based on the successes and failures of other telescopes. Furthermore, it is considered an unaffordable luxury to "close-the-loop" by carefully analysing and documenting the requirements and then verifying the telescope's compliance with them. In this paper the authors contend that a Systems Engineering approach is a keystone in the development of any telescope and that verification of the telescope's performance is not only an important management tool but also forms the basis upon which successful telescope operation can be built. The development of the Southern African Large Telescope (SALT) has followed such an approach and is now in the verification phase of its development. Parts of the SALT verification process will be discussed in some detail to illustrate the suitability of this approach, including oversight by the telescope shareholders, recording of requirements and results, design verification and performance testing. Initial test results will be presented where appropriate.

  9. Extraction and analysis of signatures from the Gene Expression Omnibus by the crowd

    NASA Astrophysics Data System (ADS)

    Wang, Zichen; Monteiro, Caroline D.; Jagodnik, Kathleen M.; Fernandez, Nicolas F.; Gundersen, Gregory W.; Rouillard, Andrew D.; Jenkins, Sherry L.; Feldmann, Axel S.; Hu, Kevin S.; McDermott, Michael G.; Duan, Qiaonan; Clark, Neil R.; Jones, Matthew R.; Kou, Yan; Goff, Troy; Woodland, Holly; Amaral, Fabio M. R.; Szeto, Gregory L.; Fuchs, Oliver; Schüssler-Fiorenza Rose, Sophia M.; Sharma, Shvetank; Schwartz, Uwe; Bausela, Xabier Bengoetxea; Szymkiewicz, Maciej; Maroulis, Vasileios; Salykin, Anton; Barra, Carolina M.; Kruth, Candice D.; Bongio, Nicholas J.; Mathur, Vaibhav; Todoric, Radmila D.; Rubin, Udi E.; Malatras, Apostolos; Fulp, Carl T.; Galindo, John A.; Motiejunaite, Ruta; Jüschke, Christoph; Dishuck, Philip C.; Lahl, Katharina; Jafari, Mohieddin; Aibar, Sara; Zaravinos, Apostolos; Steenhuizen, Linda H.; Allison, Lindsey R.; Gamallo, Pablo; de Andres Segura, Fernando; Dae Devlin, Tyler; Pérez-García, Vicente; Ma'Ayan, Avi

    2016-09-01

    Gene expression data are accumulating exponentially in public repositories. Reanalysis and integration of themed collections from these studies may provide new insights, but requires further human curation. Here we report a crowdsourcing project to annotate and reanalyse a large number of gene expression profiles from Gene Expression Omnibus (GEO). Through a massive open online course on Coursera, over 70 participants from over 25 countries identify and annotate 2,460 single-gene perturbation signatures, 839 disease versus normal signatures, and 906 drug perturbation signatures. All these signatures are unique and are manually validated for quality. Global analysis of these signatures confirms known associations and identifies novel associations between genes, diseases and drugs. The manually curated signatures are used as a training set to develop classifiers for extracting similar signatures from the entire GEO repository. We develop a web portal to serve these signatures for query, download and visualization.

  10. Extraction and analysis of signatures from the Gene Expression Omnibus by the crowd

    PubMed Central

    Wang, Zichen; Monteiro, Caroline D.; Jagodnik, Kathleen M.; Fernandez, Nicolas F.; Gundersen, Gregory W.; Rouillard, Andrew D.; Jenkins, Sherry L.; Feldmann, Axel S.; Hu, Kevin S.; McDermott, Michael G.; Duan, Qiaonan; Clark, Neil R.; Jones, Matthew R.; Kou, Yan; Goff, Troy; Woodland, Holly; Amaral, Fabio M R.; Szeto, Gregory L.; Fuchs, Oliver; Schüssler-Fiorenza Rose, Sophia M.; Sharma, Shvetank; Schwartz, Uwe; Bausela, Xabier Bengoetxea; Szymkiewicz, Maciej; Maroulis, Vasileios; Salykin, Anton; Barra, Carolina M.; Kruth, Candice D.; Bongio, Nicholas J.; Mathur, Vaibhav; Todoric, Radmila D; Rubin, Udi E.; Malatras, Apostolos; Fulp, Carl T.; Galindo, John A.; Motiejunaite, Ruta; Jüschke, Christoph; Dishuck, Philip C.; Lahl, Katharina; Jafari, Mohieddin; Aibar, Sara; Zaravinos, Apostolos; Steenhuizen, Linda H.; Allison, Lindsey R.; Gamallo, Pablo; de Andres Segura, Fernando; Dae Devlin, Tyler; Pérez-García, Vicente; Ma'ayan, Avi

    2016-01-01

    Gene expression data are accumulating exponentially in public repositories. Reanalysis and integration of themed collections from these studies may provide new insights, but requires further human curation. Here we report a crowdsourcing project to annotate and reanalyse a large number of gene expression profiles from Gene Expression Omnibus (GEO). Through a massive open online course on Coursera, over 70 participants from over 25 countries identify and annotate 2,460 single-gene perturbation signatures, 839 disease versus normal signatures, and 906 drug perturbation signatures. All these signatures are unique and are manually validated for quality. Global analysis of these signatures confirms known associations and identifies novel associations between genes, diseases and drugs. The manually curated signatures are used as a training set to develop classifiers for extracting similar signatures from the entire GEO repository. We develop a web portal to serve these signatures for query, download and visualization. PMID:27667448

  11. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  12. Description of a Computerized, On-Line Interlibrary Loan System.

    ERIC Educational Resources Information Center

    Kilgour, Frederick G.

    This paper describes the first two months of operation of the OCLC interlibrary loan system, an online system designed to increase speed and effectiveness in obtaining interlibrary loans. This system provides (1) bibliographic verification of interlibrary loan records and location of materials by using online union catalog records, (2) automatic…

  13. Are there molecular signatures?

    SciTech Connect

    Bennett, W.P.

    1995-10-01

    This report describes molecular signatures and mutational spectrum analysis. The mutation spectrum is defined as the type and location of DNA base change. There are currently about five well documented cases. Mutations and radon-associated tumors are discussed.

  14. Meteor signature interpretation

    SciTech Connect

    Canavan, G.H.

    1997-01-01

    Meteor signatures contain information about the constituents of space debris and present potential false alarms to early warning systems. Better models could both extract the maximum scientific information possible and reduce the danger such false alarms pose. Accurate predictions can be produced by models of modest complexity, which can be inverted to predict the sizes, compositions, and trajectories of objects from their signatures for most objects of interest and concern.

  15. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.

  16. Online Degrees.

    ERIC Educational Resources Information Center

    Dolezalek, Holly

    2003-01-01

    Discusses the trend of trainers who are getting degrees through online courses delivered via the Internet. Addresses accreditation issues and what to ask before enrolling in online degree programs. (JOW)

  17. Secure verification by multifactor optical validation

    NASA Astrophysics Data System (ADS)

    Millán, María S.; Pérez-Cabré, Elisabet; Javidi, Bahram

    2006-08-01

    We propose a novel multifactor encryption-authentication technique that reinforces optical security by allowing the simultaneous AND-verification of more than one primary image. We describe a method to obtain four-factor authentication. The authenticators are: two different primary images containing signatures or biometric information and two different white random sequences that act as key codes. So far, optical security techniques have dealt with a single primary image (an object, a signature, or a biometric signal), not combined primary images. Our method involves double random-phase encoding, fully phase-based encryption, and a combined nonlinear JTC and classical 4f-correlator for simultaneous recognition and authentication of multiple images. There is no a priori constraint on the type of primary images to encode. Two reference images, double-phase encoded and encrypted in an ID tag (or card), are compared with the actual input images obtained in situ from the person whose authentication is sought. The two key phase codes are known by the authentication processor. The complex-amplitude encoded image of the ID tag has a dim appearance that does not reveal the content of any primary reference image or the key codes. The encoded image function fulfils the general requirements of invisible content, extreme difficulty in counterfeiting, and real-time automatic verification. The possibility of introducing nonlinearities in the Fourier plane of the optical processor will be exploited to improve the system performance. This optical technique is attractive for high-security purposes that require reliable multifactor authentication.
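
    One building block named above, classical double random-phase encoding, can be sketched in a few lines of NumPy; the image and phase masks below are random stand-ins, and the fully phase-based variant and the nonlinear JTC correlator of the proposed scheme are not reproduced.

    ```python
    # Minimal numpy sketch of classical double random-phase encoding, one building
    # block of the multifactor scheme described above (masks and image are dummies).
    import numpy as np

    rng = np.random.default_rng(42)
    img = rng.random((64, 64))                             # stand-in primary image
    phase1 = np.exp(2j * np.pi * rng.random(img.shape))    # input-plane random phase mask
    phase2 = np.exp(2j * np.pi * rng.random(img.shape))    # Fourier-plane random phase mask (key)

    # Encryption: random phase in the input plane, second random phase in the Fourier plane.
    encrypted = np.fft.ifft2(np.fft.fft2(img * phase1) * phase2)

    # Decryption with the correct Fourier-plane key recovers the image amplitude.
    decrypted = np.abs(np.fft.ifft2(np.fft.fft2(encrypted) * np.conj(phase2)))
    print("max reconstruction error:", np.max(np.abs(decrypted - img)))  # ~1e-15
    ```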

  18. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses, and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the thermionic concept was found attractive, but concern was expressed over the lack of fast-reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown in Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  19. Nonintrusive verification attributes for excess fissile materials

    SciTech Connect

    Nicholas, N.J.; Eccleston, G.W.; Fearey, B.L.

    1997-10-01

    Under US initiatives, over two hundred metric tons of fissile materials have been declared to be excess to national defense needs. These excess materials are in both classified and unclassified forms. The US has expressed the intent to place these materials under international inspections as soon as practicable. To support these commitments, members of the US technical community are examining a variety of nonintrusive approaches (i.e., those that would not reveal classified or sensitive information) for verification of a range of potential declarations for these classified and unclassified materials. The most troublesome and potentially difficult issues involve approaches for international inspection of classified materials. The primary focus of the work to date has been on the measurement of signatures of relevant materials attributes (e.g., element, identification number, isotopic ratios, etc.), especially those related to classified materials and items. The authors are examining potential attributes and related measurement technologies in the context of possible verification approaches. The paper will discuss the current status of these activities, including their development, assessment, and benchmarking status.

  20. Continuous verification using multimodal biometrics.

    PubMed

    Sim, Terence; Zhang, Sheng; Janakiraman, Rajkumar; Kumar, Sandeep

    2007-04-01

    Conventional verification systems, such as those controlling access to a secure room, do not usually require the user to reauthenticate himself for continued access to the protected resource. This may not be sufficient for high-security environments in which the protected resource needs to be continuously monitored for unauthorized use. In such cases, continuous verification is needed. In this paper, we present the theory, architecture, implementation, and performance of a multimodal biometrics verification system that continuously verifies the presence of a logged-in user. Two modalities are currently used--face and fingerprint--but our theory can be readily extended to include more modalities. We show that continuous verification imposes additional requirements on multimodal fusion when compared to conventional verification systems. We also argue that the usual performance metrics of false accept and false reject rates are insufficient yardsticks for continuous verification and propose new metrics against which we benchmark our system. PMID:17299225

  1. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

    Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information types derived as an index value from observed data are known as hydrological signatures, and can include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), the flow variability, flow duration curve, and runoff ratio. Because the hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitude and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty
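
    A minimal version of the Monte Carlo idea described here might look like the following sketch, which uses synthetic flow and rainfall records and an assumed ±10% multiplicative discharge error rather than the study's full uncertainty models for rating curves and spatial rainfall averaging.

    ```python
    # Illustrative Monte Carlo sketch in the spirit of the method described
    # (synthetic data and an assumed +/-10% multiplicative discharge error,
    # not the study's actual uncertainty models).
    import numpy as np

    rng = np.random.default_rng(7)
    q_obs = rng.gamma(shape=2.0, scale=1.5, size=365)    # synthetic daily flow record (m3/s)
    p_obs = rng.gamma(shape=1.5, scale=4.0, size=365)    # synthetic daily catchment rainfall (mm)

    def signatures(q, p):
        q95_low_flow = np.percentile(q, 5)               # flow exceeded 95% of the time
        runoff_ratio = q.sum() / p.sum()                 # unit-free here; illustrative only
        return q95_low_flow, runoff_ratio

    n_mc = 5000
    results = np.array([
        signatures(q_obs * rng.lognormal(mean=0.0, sigma=0.10, size=q_obs.size), p_obs)
        for _ in range(n_mc)
    ])
    for name, col in zip(["Q95 low flow", "runoff ratio"], results.T):
        lo, med, hi = np.percentile(col, [2.5, 50, 97.5])
        print(f"{name}: median {med:.3f}, 95% interval [{lo:.3f}, {hi:.3f}]")
    ```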

  2. Practical quantum digital signature

    NASA Astrophysics Data System (ADS)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

    Guaranteeing nonrepudiation, unforgeability, and transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. However, to date, the previously proposed QDS protocols have been impractical due to various challenging problems and, most importantly, the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km with the mature technology currently used in quantum key distribution.

  3. Using color for face verification

    NASA Astrophysics Data System (ADS)

    Leszczynski, Mariusz

    2009-06-01

    This paper presents research on the importance of color information in a face verification system. The four most popular color spaces were used: RGB, YIQ, YCbCr, and luminance, and they were compared using four types of discriminant classifiers. Experiments conducted on facial databases with complex backgrounds, different poses, and varying lighting conditions show that color information can improve verification accuracy compared to the traditionally used luminance information. To achieve the best performance, we recommend using multi-frame verification encoded in the YIQ color space.

  4. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, yet none of the earlier quantum money constructions is known to possess it.

  5. Variability study of Ka-band HRR polarimetric signatures on 11 T-72 tanks

    NASA Astrophysics Data System (ADS)

    Nixon, William E.; Neilson, H. J.; Szatkowski, G. N.; Giles, Robert H.; Kersey, William T.; Perkins, L. C.; Waldman, Jerry

    1998-09-01

    In an effort to effectively understand signature verification requirements through the variability of a structure's RCS characteristics, the U.S. Army National Ground Intelligence Center (NGIC), with technical support from STL, originated a signature project plan to obtain MMW signatures from multiple similar tanks. In implementing this plan, NGIC/STL directed and sponsored turntable measurements performed by the U.S. Army Research Laboratory Sensors and Electromagnetic Resource Directorate on eleven T-72 tanks using an HRR full-polarimetric Ka-band radar. The physical condition and configuration of these vehicles were documented by careful inspection and then photographed during the acquisition sequence at 45° azimuth intervals. The turntable signature of one vehicle was acquired eight times over the three-day signature acquisition period to establish measurement variability on any single target. At several intervals between target measurements, the turntable signature of a 30 m² trihedral was also acquired as a calibration reference for the signature library. Through an RCS goodness-of-fit correlation and ISAR comparison study, the signature-to-signature variability was evaluated for the eighteen HRR turntable measurements of the T-72 tanks. This signature data is available from NGIC on request for Government Agencies and Government Contractors with an established need-to-know.

  6. Current signature sensor

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Inventor); Lucena, Angel (Inventor); Ihlefeld, Curtis (Inventor); Burns, Bradley (Inventor); Bassignani, Karin E. (Inventor)

    2005-01-01

    A solenoid health monitoring system uses a signal conditioner and controller assembly in one embodiment that includes analog circuitry and a DSP controller. The analog circuitry provides signal conditioning to the low-level raw signal coming from a signal acquisition assembly. Software running in the DSP analyzes the incoming data (the recorded current signature) and determines whether the solenoid is energized, de-energized, or in a transitional state. In one embodiment, the software identifies key features in the current signature during the transition phase and is able to determine the health of the solenoid.

  7. Factor models for cancer signatures

    NASA Astrophysics Data System (ADS)

    Kakushadze, Zura; Yu, Willie

    2016-11-01

    We present a novel method for extracting cancer signatures by applying statistical risk models (http://ssrn.com/abstract=2732453) from quantitative finance to cancer genome data. Using 1389 whole genome sequenced samples from 14 cancers, we identify an "overall" mode of somatic mutational noise. We give a prescription for factoring out this noise and source code for fixing the number of signatures. We apply nonnegative matrix factorization (NMF) to genome data aggregated by cancer subtype and filtered using our method. The resultant signatures have substantially lower variability than those from unfiltered data. Also, the computational cost of signature extraction is cut by about a factor of 10. We find 3 novel cancer signatures, including a liver cancer dominant signature (96% contribution) and a renal cell carcinoma signature (70% contribution). Our method accelerates finding new cancer signatures and improves their overall stability. Reciprocally, the methods for extracting cancer signatures could have interesting applications in quantitative finance.
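
    The basic NMF extraction step referred to above can be sketched as follows on synthetic mutation counts; the statistical-risk-model filtering and noise-mode removal that are the paper's actual contribution are not reproduced here.

    ```python
    # Hedged sketch of the basic NMF step used for signature extraction (synthetic
    # counts; the paper's risk-model filtering and mode removal are not reproduced).
    import numpy as np
    from sklearn.decomposition import NMF

    rng = np.random.default_rng(3)
    n_channels, n_samples, n_sigs = 96, 40, 3            # 96 trinucleotide mutation channels
    true_sigs = rng.dirichlet(np.ones(n_channels), size=n_sigs).T      # 96 x 3, columns sum to 1
    exposures = rng.gamma(shape=2.0, scale=50.0, size=(n_sigs, n_samples))
    counts = rng.poisson(true_sigs @ exposures)          # observed mutation catalogue, 96 x 40

    model = NMF(n_components=n_sigs, init="nndsvda", max_iter=2000, random_state=0)
    W = model.fit_transform(counts)                      # extracted signatures (96 x 3)
    H = model.components_                                # per-sample exposures (3 x 40)
    W = W / W.sum(axis=0, keepdims=True)                 # normalize signatures to probabilities
    print("reconstruction error:", round(model.reconstruction_err_, 1))
    ```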

  8. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for the normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
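
    A minimal example of this kind of distribution check is sketched below: a one-dimensional Latin-hypercube-style sample of a normal variable is generated with NumPy and tested with SciPy's Kolmogorov-Smirnov test (generic code, not Sandia's LHS software or its test suite).

    ```python
    # Minimal sketch of the kind of distribution check described: draw a Latin
    # hypercube style sample of a normal variable and test it with SciPy's
    # Kolmogorov-Smirnov test.
    import numpy as np
    from scipy import stats

    rng = np.random.default_rng(11)
    n = 1000
    # Stratified (Latin hypercube style) uniforms: one draw per equal-probability bin.
    u = (rng.permutation(n) + rng.random(n)) / n
    samples = stats.norm.ppf(u, loc=10.0, scale=2.0)     # map through the target inverse CDF

    print("mean, std:", samples.mean().round(3), samples.std(ddof=1).round(3))
    ks_stat, p_value = stats.kstest(samples, "norm", args=(10.0, 2.0))
    print(f"KS statistic = {ks_stat:.4f}, p-value = {p_value:.3f}")
    ```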

  9. A Signature Style

    ERIC Educational Resources Information Center

    Smiles, Robin V.

    2005-01-01

    This article discusses Dr. Amalia Amaki and her approach to art as her signature style: turning everyday items into fine art. Amaki is an assistant professor of art, art history, and Black American studies at the University of Delaware. She loves taking an unexpected object and redefining it in the context of art--like a button, a fan, a faded…

  10. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis; Mahadevan, Karthikeyan

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
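
    The chained, one-hop structure of the protocol can be abstracted as in the sketch below; node names and the pass/fail check are placeholders, since the actual protocol relies on time-based software signatures measured over the radio link.

    ```python
    # Abstract sketch of the chained, one-hop verification idea described in the
    # patent record (node names and the check itself are placeholders; the real
    # protocol uses time-based software signatures).
    def verify_chain(nodes, check):
        """Each verified node acts as verifier for the next hop; halt on failure."""
        verified, pending = [], list(nodes)
        while pending:
            node = pending.pop(0)
            if check(node):                      # stand-in for the timed attestation
                verified.append(node)
            else:
                print(f"{node} failed verification; downstream nodes deferred:", pending)
                break                            # downstream halted until an alternate path is found
        return verified

    nodes = ["verifier", "n1", "n2", "n3", "n4"]
    print("verified:", verify_chain(nodes, check=lambda n: n != "n3"))
    ```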

  11. Characterization of palmprints by wavelet signatures via directional context modeling.

    PubMed

    Zhang, Lei; Zhang, David

    2004-06-01

    The palmprint is one of the most reliable physiological characteristics that can be used to distinguish between individuals. Current palmprint-based systems are more user friendly, more cost effective, and require fewer data signatures than traditional fingerprint-based identification systems. The principal lines and wrinkles captured in a low-resolution palmprint image provide more than enough information to uniquely identify an individual. This paper presents a palmprint identification scheme that characterizes a palmprint using a set of statistical signatures. The palmprint is first transformed into the wavelet domain, and the directional context of each wavelet subband is defined and computed in order to collect the predominant coefficients of its principal lines and wrinkles. A set of statistical signatures, which includes gravity center, density, spatial dispersivity and energy, is then defined to characterize the palmprint with the selected directional context values. A classification and identification scheme based on these signatures is subsequently developed. This scheme exploits the features of principal lines and prominent wrinkles sufficiently and achieves satisfactory results. Compared with the line-segments-matching or interesting-points-matching based palmprint verification schemes, the proposed scheme uses a much smaller amount of data signatures. It also provides a convenient classification strategy and more accurate identification. PMID:15484907
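
    A simplified stand-in for the wavelet-domain feature extraction described above is sketched below using PyWavelets; subband energy and a coefficient centre of gravity are computed as generic statistical signatures, not the paper's exact gravity-center, density, dispersivity, and energy definitions.

    ```python
    # Simplified sketch of wavelet-domain statistical features for a palmprint-like
    # image (PyWavelets assumed available; energy and a coefficient centre of
    # gravity stand in for the paper's full signature set).
    import numpy as np
    import pywt

    rng = np.random.default_rng(5)
    palm = rng.random((128, 128))                    # stand-in for a low-resolution palmprint image

    features = []
    coeffs = pywt.wavedec2(palm, wavelet="db2", level=3)
    for level_bands in coeffs[1:]:                   # skip the approximation band
        for band in level_bands:                     # horizontal, vertical, diagonal details
            mag = np.abs(band)
            energy = float(np.sum(mag**2))
            rows, cols = np.indices(mag.shape)
            gc_row = float(np.sum(rows * mag) / mag.sum())   # "gravity centre" of coefficient mass
            gc_col = float(np.sum(cols * mag) / mag.sum())
            features.extend([energy, gc_row, gc_col])

    print(f"{len(features)} statistical signature values per palmprint")  # 3 levels x 3 bands x 3 stats
    ```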

  12. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  13. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  14. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  15. An ECDLP-Based Threshold Proxy Signature Scheme Using Self-Certified Public Key System

    NASA Astrophysics Data System (ADS)

    Xue, Qingshui; Li, Fengying; Zhou, Yuan; Zhang, Jiping; Cao, Zhenfu; Qian, Haifeng

    In a (t, n) threshold proxy signature scheme, one original signer delegates a group of n proxy signers to sign messages on behalf of the original signer. When the proxy signature is created, at least t proxy signers must cooperate to generate a valid proxy signature, and any fewer than t proxy signers cannot cooperatively generate a valid proxy signature. So far, all proposed threshold proxy signature schemes have been based on public key systems with certificates, which have disadvantages such as the need to check the certificate list whenever certificates are required. Most threshold proxy signature schemes use Shamir's threshold secret sharing scheme. Identity-based public key systems are not yet mature. Self-certified public key systems have attracted more and more attention because of their advantages. Based on Hsu et al.'s self-certified public key system and Li et al.'s proxy signature scheme, a threshold proxy signature scheme based on ECDLP and a self-certified public key system is proposed. As far as we know, it is the first scheme based on ECDLP and a self-certified public key system. The proposed scheme can provide the security properties of proxy protection, verifiability, strong identifiability, strong unforgeability, strong repudiability, distinguishability, known signers, and prevention of misuse of proxy signing power. That is, internal attacks, external attacks, collusion attacks, equation attacks, and public key substitution attacks can be resisted. In the proxy signature verification phase, the authentication of the original and the proxy signers' public keys and the verification of the threshold proxy signature are executed together. In addition, the computation overhead and communication cost of the proposed scheme are analyzed as well.

  16. Identification of host response signatures of infection.

    SciTech Connect

    Branda, Steven S.; Sinha, Anupama; Bent, Zachary

    2013-02-01

    Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficient and reliable distinguishing between infected vs healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged, engineered), and readily-accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones comprised of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sophisticated sample preparation pipelines, and developed new SGS data analysis tools and strategies, in order to pioneer use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). This work provides a strong foundation for

  17. Online Pricing.

    ERIC Educational Resources Information Center

    Garman, Nancy; And Others

    1990-01-01

    The first of four articles describes the move by the European Space Agency to eliminate connect time charges on its online retrieval system. The remaining articles describe the pricing structure of DIALOG, compare the two pricing schemes, and discuss online pricing from the user's point of view. (CLB)

  18. Data requirements for verification of ram glow chemistry

    NASA Technical Reports Server (NTRS)

    Swenson, G. R.; Mende, S. B.

    1985-01-01

    A set of questions is posed regarding the surface chemistry producing the ram glow on the space shuttle. The questions surround verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of measurements required for most answers, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV to visible to IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the shuttle Orbiter cabin is also discussed.

  19. Wake Signature Detection

    NASA Astrophysics Data System (ADS)

    Spedding, Geoffrey R.

    2014-01-01

    An accumulated body of quantitative evidence shows that bluff-body wakes in stably stratified environments have an unusual degree of coherence and organization, so characteristic geometries such as arrays of alternating-signed vortices have very long lifetimes, as measured in units of buoyancy timescales, or in the downstream distance scaled by a body length. The combination of pattern geometry and persistence renders the detection of these wakes possible in principle. It now appears that identifiable signatures can be found from many disparate sources: Islands, fish, and plankton all have been noted to generate features that can be detected by climate modelers, hopeful navigators in open oceans, or hungry predators. The various types of wakes are reviewed with notes on why their signatures are important and to whom. A general theory of wake pattern formation is lacking and would have to span many orders of magnitude in Reynolds number.

  20. Infrasonic signature of the 2009 major sudden stratospheric warming

    NASA Astrophysics Data System (ADS)

    Evers, L. G.; Siegmund, P.

    2009-12-01

    The study of infrasound is experiencing a renaissance since it was chosen as a verification technique for the Comprehensive Nuclear-Test-Ban Treaty. The success of the verification technique strongly depends on knowledge of upper atmospheric processes. The ability of infrasound to probe the upper atmosphere is beginning to be exploited, taking the field beyond its monitoring application. Processes in the stratosphere couple to the troposphere and influence our daily weather and climate. Infrasound delivers actual observations on the state of the stratosphere with a high spatial and temporal resolution. Here we show the infrasonic signature, passively obtained, of a drastic change in the stratosphere due to the major sudden stratospheric warming (SSW) of January 2009. With this study, we infer the enormous capacity of infrasound for acoustic remote sensing of stratospheric processes on a global scale with surface-based instruments.

  1. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.
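
    The hand calculations mentioned above amount to evaluating a Gaussian plume expression at the user-specified downwind distances. Below is a minimal sketch of such a hand check, assuming a simple ground-level-centerline Gaussian plume with illustrative Briggs-type dispersion coefficients; the release parameters are hypothetical and this is not the VENTSAR building-wake model itself.

```python
import math

def sigma_y_z(x_m):
    """Illustrative Briggs open-country dispersion coefficients (stability class D)."""
    sigma_y = 0.08 * x_m / math.sqrt(1.0 + 0.0001 * x_m)
    sigma_z = 0.06 * x_m / math.sqrt(1.0 + 0.0015 * x_m)
    return sigma_y, sigma_z

def ground_level_concentration(q_ci_per_s, u_m_per_s, release_height_m, x_m):
    """Ground-level centerline concentration (Ci/m^3) from an elevated point release,
    assuming total reflection at the ground."""
    sy, sz = sigma_y_z(x_m)
    return (q_ci_per_s / (math.pi * u_m_per_s * sy * sz)
            * math.exp(-release_height_m**2 / (2.0 * sz**2)))

# Concentrations at user-specified incremental downwind distances (hypothetical inputs).
for x in (50.0, 100.0, 200.0, 400.0):
    print(f"x = {x:6.0f} m  C = {ground_level_concentration(1.0, 3.0, 20.0, x):.3e} Ci/m^3")
```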

  2. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  3. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  4. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  5. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  6. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  7. 75 FR 77959 - VetBiz Vendor Information Pages Verification Program; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-14

    ...-461-7485. Correction In FR Doc. 2010-30550, published on December 7, 2010, at 75 FR 76080, make the... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF VETERANS AFFAIRS VetBiz Vendor Information Pages Verification Program; Correction AGENCY: Center for...

  8. 77 FR 50723 - Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-08-22

    ... COMMISSION Verification, Validation, Reviews, and Audits for Digital Computer Software Used in Safety Systems... Audits for Digital Computer Software used in Safety Systems of Nuclear Power Plants.'' The DG-1210 is... access publicly available documents online in the NRC Library at...

  9. Biometric verification with correlation filters.

    PubMed

    Vijaya Kumar, B V K; Savvides, Marios; Xie, Chunyan; Venkataramani, Krithika; Thornton, Jason; Mahalanobis, Abhijit

    2004-01-10

    Using biometrics for subject verification can significantly improve security over that of approaches based on passwords and personal identification numbers, both of which people tend to lose or forget. In biometric verification the system tries to match an input biometric (such as a fingerprint, face image, or iris image) to a stored biometric template. Thus correlation filter techniques are attractive candidates for the matching precision needed in biometric verification. In particular, advanced correlation filters, such as synthetic discriminant function filters, can offer very good matching performance in the presence of variability in these biometric images (e.g., facial expressions, illumination changes, etc.). We investigate the performance of advanced correlation filters for face, fingerprint, and iris biometric verification. PMID:14735958
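
    A minimal sketch of the matching step is given below: a plain matched-filter correlation computed in the frequency domain, scored with a peak-to-sidelobe ratio. This is only an illustration of correlation-based verification; it does not implement the advanced synthetic discriminant function filters discussed in the paper, and the images and acceptance threshold are hypothetical.

```python
import numpy as np

def correlation_plane(image, template):
    """Circular cross-correlation of an image with a filter template via the FFT."""
    f_img = np.fft.fft2(image)
    f_tmp = np.fft.fft2(template, s=image.shape)
    return np.real(np.fft.ifft2(f_img * np.conj(f_tmp)))

def peak_to_sidelobe_ratio(plane, exclude=5):
    """PSR: sharpness of the correlation peak relative to the surrounding sidelobe region."""
    r0, c0 = np.unravel_index(np.argmax(plane), plane.shape)
    peak = plane[r0, c0]
    mask = np.ones_like(plane, dtype=bool)
    mask[max(r0 - exclude, 0):r0 + exclude + 1, max(c0 - exclude, 0):c0 + exclude + 1] = False
    sidelobe = plane[mask]
    return (peak - sidelobe.mean()) / (sidelobe.std() + 1e-12)

# Toy verification decision: accept if the PSR exceeds a hypothetical threshold.
rng = np.random.default_rng(0)
enrolled = rng.random((64, 64))                 # stands in for a stored biometric template
probe = enrolled + 0.1 * rng.random((64, 64))   # same subject, slightly perturbed
psr = peak_to_sidelobe_ratio(correlation_plane(probe, enrolled))
print("PSR =", round(float(psr), 2), "->", "accept" if psr > 20.0 else "reject")
```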

  10. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  11. TPS verification with UUT simulation

    NASA Astrophysics Data System (ADS)

    Wang, Guohua; Meng, Xiaofeng; Zhao, Ruixian

    2006-11-01

    TPS (Test Program Set) verification, or first-article acceptance testing, commonly depends on fault insertion experiments on the UUT (Unit Under Test). However, the failure modes that can be injected on a UUT are limited, and the approach is almost infeasible when the UUT is still in development or in a distributed state. To resolve this problem, a TPS verification method based on UUT interface signal simulation is put forward. Interoperability between the ATS (automatic test system) and the UUT simulation platform is essential to realize automatic TPS verification. After analyzing the ATS software architecture, an approach to realize interoperability between the ATS software and the UUT simulation platform is proposed, and the UUT simulation platform software architecture is then derived from the ATS software architecture. The hardware composition and software architecture of the UUT simulation are described in detail. The UUT simulation platform has been applied in avionics equipment TPS development, debugging, and verification.

  12. Signatures of nonthermal melting.

    PubMed

    Zier, Tobias; Zijlstra, Eeuwe S; Kalitsov, Alan; Theodonis, Ioannis; Garcia, Martin E

    2015-09-01

    Intense ultrashort laser pulses can melt crystals in less than a picosecond but, in spite of over thirty years of active research, for many materials it is not known to what extent thermal and nonthermal microscopic processes cause this ultrafast phenomenon. Here, we perform ab-initio molecular-dynamics simulations of silicon on a laser-excited potential-energy surface, exclusively revealing nonthermal signatures of laser-induced melting. From our simulated atomic trajectories, we compute the decay of five structure factors and the time-dependent structure function. We demonstrate how these quantities provide criteria to distinguish predominantly nonthermal from thermal melting. PMID:26798822

  13. Signatures of nonthermal melting

    PubMed Central

    Zier, Tobias; Zijlstra, Eeuwe S.; Kalitsov, Alan; Theodonis, Ioannis; Garcia, Martin E.

    2015-01-01

    Intense ultrashort laser pulses can melt crystals in less than a picosecond but, in spite of over thirty years of active research, for many materials it is not known to what extent thermal and nonthermal microscopic processes cause this ultrafast phenomenon. Here, we perform ab-initio molecular-dynamics simulations of silicon on a laser-excited potential-energy surface, exclusively revealing nonthermal signatures of laser-induced melting. From our simulated atomic trajectories, we compute the decay of five structure factors and the time-dependent structure function. We demonstrate how these quantities provide criteria to distinguish predominantly nonthermal from thermal melting. PMID:26798822

  14. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

    Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and to answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made do not require the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment in which photonic entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  15. Multimodal signature modeling of humans

    NASA Astrophysics Data System (ADS)

    Cathcart, J. Michael; Kocher, Brian; Prussing, Keith; Lane, Sarah; Thomas, Alan

    2010-04-01

    Georgia Tech has been investigating methods for the detection of covert personnel in traditionally difficult environments (e.g., urban, caves). This program focuses on a detailed phenomenological analysis of human physiology and signatures with the subsequent identification and characterization of potential observables. Both aspects are needed to support the development of personnel detection and tracking algorithms. The difficult nature of these personnel-related problems dictates a multimodal sensing approach. Human signature data of sufficient and accurate quality and quantity do not exist; thus, the development of an accurate human signature model is needed. This model should also simulate various human activities to allow motion-based observables to be exploited. This paper will describe a multimodal signature modeling approach that incorporates human physiological aspects, thermoregulation, and dynamics into the signature calculation. This approach permits both passive and active signatures to be modeled. The focus of the current effort involved the computation of signatures in urban environments. This paper will discuss the development of a human motion model for use in simulating both electro-optical signatures and radar-based signatures. Video sequences of humans in a simulated urban environment will also be presented; results using these sequences for personnel tracking will be presented.

  16. Electromagnetic Signature Technique as a Promising Tool to Verify Nuclear Weapons Storage and Dismantlement under a Nuclear Arms Control Regime

    SciTech Connect

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.; Ramuhalli, Pradeep

    2012-08-01

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.

  17. Verification of classified fissile material using unclassified attributes

    SciTech Connect

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-12-31

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated.

  18. Online Monitoring of Induction Motors

    SciTech Connect

    McJunkin, Timothy R.; Agarwal, Vivek; Lybeck, Nancy Jean

    2016-01-01

    The online monitoring of active components project, under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program, researched diagnostic and prognostic models for alternating current induction motors (IM). Idaho National Laboratory (INL) worked with the Electric Power Research Institute (EPRI) to augment and revise the fault signatures previously implemented in the Asset Fault Signature Database of EPRI's Fleet Wide Prognostic and Health Management (FW PHM) Suite software. Induction motor diagnostic models were researched using the experimental data collected by Idaho State University. Prognostic models were explored in the literature and through a limited experiment with a 40 HP motor to support the Remaining Useful Life Database of the FW PHM Suite.

  19. Signature CERN-URSS

    ScienceCinema

    None

    2016-07-12

    DG W. Jentschke welcomes the assembly and the guests to the signing of the protocol between CERN and the USSR, which is an important event. It was in 1955 that 55 Soviet visitors came to CERN for the first time. The first DG of CERN, F. Bloch, and Mr. Amaldi are also present. While W. Jentschke's speech in English is translated into Russian, Mr. Morozov's speech in Russian is translated into English.

  20. Verification hybrid control of a wheeled mobile robot and manipulator

    NASA Astrophysics Data System (ADS)

    Muszynska, Magdalena; Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz

    2016-04-01

    In this article, innovative approaches to the realization of wheeled mobile robot and manipulator tracking are presented. The concepts include the application of neural-fuzzy systems to compensate for the controlled system's nonlinearities in the tracking control task. The proposed control algorithms work on-line, contain structures that adapt to the changing working conditions of the controlled systems, and do not require preliminary learning. The algorithms were verified on real objects: a Scorbot-ER 4pc robotic manipulator and a Pioneer 2DX mobile robot.

  1. Off-line signature recognition based on dynamic methods

    NASA Astrophysics Data System (ADS)

    Igarza, Juan J.; Hernaez, Inmaculada; Goirizelaia, Inaki; Espinosa, Koldo; Escolar, Jon

    2005-03-01

    In this paper we present the work developed on off-line signature verification as a continuation of a previous work using Left-to-Right Hidden Markov Models (LR-HMM) in order to extend those models to the field of static or off-line signature processing using results provided by image connectivity analysis. The chain encoding of perimeter points for each blob obtained by this analysis is an ordered set of points in the space, clockwise around the perimeter of the blob. Two models are generated depending on the way the blobs obtained from the connectivity analysis are ordered. In the first one, blobs are ordered according to their perimeter length. In the second proposal, blobs are ordered in their natural reading order, i.e. from the top to the bottom and left to right. Finally, two LR-HMM models are trained using the (x,y) coordinates of the chain codes obtained by the two mentioned techniques and a set of geometrical local features obtained from them such as polar coordinates referred to the center of ink, local radii, segment lengths and local tangent angle. Verification results of the two techniques are compared over a biometrical database containing skilled forgeries.
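
    As a rough illustration of the local features described (polar coordinates about the center of ink, local radii, segment lengths, and local tangent angles), the sketch below computes them for an ordered sequence of perimeter points; the blob extraction and LR-HMM training are not reproduced, and the sample perimeter is hypothetical.

```python
import math

def local_features(perimeter):
    """Geometric local features for an ordered list of (x, y) perimeter points."""
    n = len(perimeter)
    cx = sum(p[0] for p in perimeter) / n   # center of ink (x)
    cy = sum(p[1] for p in perimeter) / n   # center of ink (y)
    feats = []
    for i, (x, y) in enumerate(perimeter):
        nx, ny = perimeter[(i + 1) % n]          # next point, wrapping around the blob
        radius = math.hypot(x - cx, y - cy)      # local radius about the center of ink
        angle = math.atan2(y - cy, x - cx)       # polar angle about the center of ink
        seg_len = math.hypot(nx - x, ny - y)     # length of the perimeter segment
        tangent = math.atan2(ny - y, nx - x)     # local tangent angle
        feats.append((radius, angle, seg_len, tangent))
    return feats

# Toy blob: a square perimeter traced clockwise (hypothetical input).
square = [(0, 0), (0, 2), (2, 2), (2, 0)]
for f in local_features(square):
    print(tuple(round(v, 3) for v in f))
```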

  2. Signatures of dark matter

    NASA Astrophysics Data System (ADS)

    Baltz, Edward Anthony

    It is well known that most of the mass in the universe remains unobserved save for its gravitational effect on luminous matter. The nature of this ``dark matter'' remains a mystery. From measurements of the primordial deuterium abundance, the theory of big bang nucleosynthesis predicts that there are not enough baryons to account for the amount of dark matter observed, thus the missing mass must take an exotic form. Several promising candidates have been proposed. In this work I will describe my research along two main lines of inquiry into the dark matter puzzle. The first possibility is that the dark matter is exotic massive particles, such as those predicted by supersymmetric extensions to the standard model of particle physics. Such particles are generically called WIMPs, for weakly interacting massive particles. Focusing on the so-called neutralino in supersymmetric models, I discuss the possible signatures of such particles, including their direct detection via nuclear recoil experiments and their indirect detection via annihilations in the halos of galaxies, producing high energy antiprotons, positrons and gamma rays. I also discuss signatures of the possible slow decays of such particles. The second possibility is that there is a population of black holes formed in the early universe. Any dark objects in galactic halos, black holes included, are called MACHOs, for massive compact halo objects. Such objects can be detected by their gravitational microlensing effects. Several possibilities for sources of baryonic dark matter are also interesting for gravitational microlensing. These include brown dwarf stars and old, cool white dwarf stars. I discuss the theory of gravitational microlensing, focusing on the technique of pixel microlensing. I make predictions for several planned microlensing experiments with ground based and space based telescopes. Furthermore, I discuss binary lenses in the context of pixel microlensing. Finally, I develop a new technique for

  3. Multisensors signature prediction workbench

    NASA Astrophysics Data System (ADS)

    Latger, Jean; Cathala, Thierry

    2015-10-01

    Guidance of weapon systems relies on sensors to analyze target signatures. Defense weapon systems also need to detect and then identify threats, again using sensors. Sensor performance is highly dependent on conditions, e.g., time of day, atmospheric propagation, and background. Visible cameras are very efficient in diurnal fine-weather conditions, long-wave infrared sensors for night vision, and radar systems for seeing through the atmosphere and/or foliage. Moreover, multi-sensor systems, combining several collocated sensors with associated fusion algorithms, provide better efficiency (typically for Enhanced Vision Systems). But these sophisticated systems are all the more difficult to conceive, assess, and qualify. In that frame, multi-sensor simulation is highly required. This paper focuses on multi-sensor simulation tools. A first part reviews the state of the art of such simulation workbenches, with a special focus on SE-Workbench. SE-Workbench is described with regard to infrared/EO sensors, millimeter-wave sensors, active EO sensors, and GNSS sensors. Then a general overview of the objectives of simulating target and background signatures is presented, depending on the type of simulation required (parametric studies, open-loop simulation, closed-loop simulation, hybridization of SW simulation and HW, etc.). After the objective review, the paper presents some basic requirements for simulation implementation, such as the deterministic behavior of the simulation, which is mandatory to repeat it many times for parametric studies. Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU), and the tradeoff between physical accuracy and computational performance. Examples of results using SE-Workbench are shown and discussed.

  4. Signatures of AGN feedback

    NASA Astrophysics Data System (ADS)

    Wylezalek, D.; Zakamska, N.

    2016-06-01

    Feedback from active galactic nuclei (AGN) is widely considered to be the main driver in regulating the growth of massive galaxies. It operates by either heating the gas that would otherwise be available for star formation or driving it out of the galaxy, preventing further increase in stellar mass. Observational proof for this scenario has, however, been hard to come by. We have assembled a large sample of 133 radio-quiet type-2 and red AGN at z > 0.1. AGN with stronger outflow signatures are hosted in galaxies that are more 'quenched', given their stellar mass, than galaxies with weaker outflow signatures. This correlation is only seen in AGN host galaxies with SFR > 100 M_{⊙} yr^{-1}, where presumably the coupling of the AGN-driven wind to the gas is strongest. This observation is consistent with the AGN having a net suppression, or 'negative' impact, through feedback on the galaxies' star formation history.

  5. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between, and the effects of, hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is a datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something usually described as nice to have, but not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination of the casings and nozzle for erosion or wear. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  6. Face verification with balanced thresholds.

    PubMed

    Yan, Shuicheng; Xu, Dong; Tang, Xiaoou

    2007-01-01

    The process of face verification is guided by a pre-learned global threshold, which, however, is often inconsistent with class-specific optimal thresholds. It is, hence, beneficial to pursue a balance of the class-specific thresholds in the model-learning stage. In this paper, we present a new dimensionality reduction algorithm tailored to the verification task that ensures threshold balance. This is achieved by the following aspects. First, feasibility is guaranteed by employing an affine transformation matrix, instead of the conventional projection matrix, for dimensionality reduction, and, hence, we call the proposed algorithm threshold balanced transformation (TBT). Then, the affine transformation matrix, constrained as the product of an orthogonal matrix and a diagonal matrix, is optimized to improve the threshold balance and classification capability in an iterative manner. Unlike most algorithms for face verification which are directly transplanted from face identification literature, TBT is specifically designed for face verification and clarifies the intrinsic distinction between these two tasks. Experiments on three benchmark face databases demonstrate that TBT significantly outperforms the state-of-the-art subspace techniques for face verification.

  7. Nondestructive verification with minimal movement of irradiated light-water-reactor fuel assemblies

    SciTech Connect

    Phillips, J.R.; Bosler, G.E.; Halbig, J.K.; Klosterbuer, S.F.; Menlove, H.O.

    1982-10-01

    Nondestructive verification of irradiated light-water reactor fuel assemblies can be performed rapidly and precisely by measuring their gross gamma-ray and neutron signatures. A portable system measured fuel assemblies with exposures ranging from 18.4 to 40.6 GWd/tU and with cooling times ranging from 1575 to 2638 days. Differences in the measured results for side or corner measurements are discussed. 25 figures, 20 tables.

  8. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  9. Online 1990.

    ERIC Educational Resources Information Center

    Goldstein, Morris

    This paper examines the co-existence of online and CD-ROM technologies in terms of their existing pricing structures, marketing strategies, functionality, and future roles. "Fixed Price Unlimited Usage" (FPUU) pricing and flat-rate pricing are discussed as viable alternatives to current pricing practices. In addition, it is argued that the…

  10. Online Learning

    ERIC Educational Resources Information Center

    Perry, Edward H.; Pilati, Michelle L.

    2011-01-01

    Distance education, which began as correspondence courses in the nineteenth century and grew into educational television during the twentieth century, evolved into learning on the Web by the mid-1990s. Accompanying the rise in online learning has been a similar rise in organizations and publications dedicated to serving the needs of online…

  11. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    SciTech Connect

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.; Kreyling, Sean J.; Henry, Michael J.; Corley, Courtney D.; Whattam, Kevin M.

    2013-07-11

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  12. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  13. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  14. A verification system of RMAP protocol controller

    NASA Astrophysics Data System (ADS)

    Khanov, V. Kh; Shakhmatov, A. V.; Chekmarev, S. A.

    2015-01-01

    The functional verification problem for IP blocks of an RMAP protocol controller is considered. The application of a verification method using fully functional models of the processor and the internal bus of a system-on-chip is justified. Principles for constructing a verification system based on the given approach are proposed. Practical results of creating a verification system for an IP block of an RMAP protocol controller are presented.

  15. Statistical clumped isotope signatures.

    PubMed

    Röckmann, T; Popa, M E; Krol, M C; Hofmann, M E G

    2016-08-18

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules.
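
    The anti-clumping effect described above can be illustrated with a short numeric sketch: when two indistinguishable atom positions are filled from pools with different heavy-isotope fractions, the doubly-substituted abundance falls below the stochastic reference computed from the bulk composition (a consequence of the AM-GM inequality). The isotope fractions below are hypothetical, and absolute abundances stand in for the isotopologue ratios used in practice.

```python
# Two pools of the same element with different heavy-isotope fractions (hypothetical values).
x1, x2 = 0.010, 0.020           # fraction of the heavy isotope in pool 1 and pool 2

# One atom drawn from each pool combines into a symmetric two-atom site.
p_clumped_actual = x1 * x2      # probability that both positions carry the heavy isotope

# Conventional stochastic reference uses the bulk (average) composition of the two positions.
x_bulk = 0.5 * (x1 + x2)
p_clumped_reference = x_bulk ** 2

# Clumped anomaly relative to the stochastic reference (per mil); AM-GM guarantees it is <= 0.
delta = (p_clumped_actual / p_clumped_reference - 1.0) * 1000.0
print(f"apparent clumped anomaly: {delta:.1f} per mil (anti-clumping)")
```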

  16. Harmonic 'signatures' of microorganisms.

    PubMed

    Blake-Coleman, B C; Hutchings, M J; Silley, P

    1994-01-01

    The frequency/amplitude effect of various microorganisms exposed to periodic (time-varying) electric fields, when proximate to immersed electrodes, has been studied using a novel analytical instrument. The harmonic distribution, in complex signals caused by cells exposed to harmonic-free waveforms and occupying part of the electrode/suspension interface volume, was shown to be almost entirely due to the change in the standing interfacial transfer function by the (dielectrically nonlinear) presence of cells. Thus, the characteristic interfacial non-linearity is viewed as variable, being uniquely modulated by the presence of particular cells in the interfacial region. Little can be attributed to bulk (far field) effects. The tendency for subtle (characteristic) signal distortion to occur as a function of particulate (cell or molecular) occupancy of the near-electrode interfacial region under controlled current conditions leads to the method of sample characterisation by harmonic (Fourier) analysis. We report here, as a sequel to our original studies (Hutchings et al., 1993; Hutchings and Blake-Coleman, 1993), preliminary results of the harmonic analysis of microbial suspensions under controlled signal conditions using a three-electrode configuration. These data provide three-dimensional graphical representations producing harmonic 'surfaces' for various microorganisms. Thus, cell type differences are characterised by their 'harmonic signature'. The visual distinction provided by these 'surface' forming three-dimensional plots is striking and gives a convincing impression of the ability to identify and enumerate specific microorganisms by acquisition of cell-modulated electrode interfacial Fourier spectra. PMID:8060593

  17. Infrasound Rocket Signatures

    NASA Astrophysics Data System (ADS)

    Olson, J.

    2012-09-01

    This presentation reviews the work performed by our research group at the Geophysical Institute as we have applied the tools of infrasound research to rocket studies. This report represents one aspect of the effort associated with work done for the National Consortium for MASINT Research (NCMR) program operated by the National MASINT Office (NMO) of the Defense Intelligence Agency (DIA). Infrasound, the study of acoustic signals and their propagation in a frequency band below 15 Hz, enables an investigator to collect and diagnose acoustic signals from distant sources. Absorption of acoustic energy in the atmosphere decreases as the frequency is reduced. In the infrasound band, signals can propagate hundreds and thousands of kilometers with little degradation. We will present an overview of signatures from rockets ranging from small sounding rockets such as the Black Brant and Orion series to larger rockets such as the Delta 2 and 4 and Atlas V. Analysis of the ignition transients provides information that can uniquely identify the motor type. After the rocket ascends, infrasound signals can be used to characterize the rocket and identify the various events that take place along a trajectory, such as staging and maneuvering. We have also collected information on atmospheric shocks and sonic booms from the passage of supersonic vehicles such as the shuttle. This review is intended to show the richness of the unique signal set that occurs in the low-frequency infrasound band.

  18. Statistical clumped isotope signatures

    PubMed Central

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168

  19. Statistical clumped isotope signatures

    NASA Astrophysics Data System (ADS)

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-08-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules.

  20. Statistical clumped isotope signatures.

    PubMed

    Röckmann, T; Popa, M E; Krol, M C; Hofmann, M E G

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168

  1. UHECR: Signatures and models

    NASA Astrophysics Data System (ADS)

    Berezinsky, V.

    2013-06-01

    The signatures of Ultra High Energy (E ≳ 1 EeV) proton propagation through CMB radiation are the pair-production dip and the GZK cutoff. The visible characteristics of these two spectral features are the ankle, which is an intrinsic part of the dip, the beginning of the GZK cutoff in the differential spectrum, and E1/2 in the integral spectrum. As measured by HiRes and the Telescope Array (TA), these characteristics agree with theoretical predictions. However, the directly measured mass composition remains a puzzle. While the HiRes and TA detectors observe a proton-dominated mass composition, the data of the Auger detector strongly evidence a nuclear mass composition becoming progressively heavier at energies above 4 EeV and reaching iron at energies of about 35 EeV. Models based on the Auger and HiRes/TA data are considered independently and classified according to the transition from galactic to extragalactic cosmic rays. The ankle cannot provide this transition, since the data of all three detectors at energies of (1-3) EeV agree with a pure proton composition (or at least one not heavier than helium). If produced in the Galaxy, these particles would result in too high an anisotropy. This argument excludes or strongly disfavours all ankle models with ankle energy Ea > 3 EeV. The calculation of elongation curves, Xmax(E), for different ankle models further strengthens this conclusion. The status of other models, the dip, mixed-composition, and Auger-based models, is discussed.

  2. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…

  3. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... Organization: MOD-025-2 (Verification and Data Reporting of Generator Real and Reactive Power Capability and Synchronous Condenser Reactive Power Capability), MOD- 026-1 (Verification of Models and Data for...

  4. A proposed neutral line signature

    NASA Technical Reports Server (NTRS)

    Doxas, I.; Speiser, T. W.; Dusenbery, P. B.; Horton, W.

    1992-01-01

    An identifying signature is proposed for the existence and location of the neutral line in the magnetotail. The signature, abrupt density and temperature changes in the Earth-tail direction, was first discovered in test particle simulations. Such temperature variations have been observed in ISEE data (Huang et al. 1992), but their connection to the possible existence of a neutral line in the tail has not yet been established. The proposed signature develops earlier than the ion velocity space ridge of Martin and Speiser (1988), but can only be seen by spacecraft in the vicinity of the neutral line, while the latter can locate a neutral line remotely.

  5. The science verification of FLAMES

    NASA Astrophysics Data System (ADS)

    Primas, Francesca

    2003-06-01

    After a new VLT instrument has been commissioned and thoroughly tested, a series of scientific and technical checkups are scheduled in order to test the front-to-end operations chain before the official start of regular operations. Technically speaking, these are the so-called Dry Runs, part of which are usually devoted to the Science Verification (SV for short) of that specific instrument. A Science Verification programme includes a set of typical scientific observations with the aim of verifying and demonstrating to the community the capabilities of a new instrument in the operational framework of the VLT Paranal Observatory. Though manifold, its goals can be summarised in two main points: from the scientific point of view, by demonstrating the scientific potential of the new instrument, these observations will provide ESO users with first science-grade data, thus fostering an early scientific return. From the technical point of view, by testing the whole operational system (from the preparation of the observations to their execution and analysis), it will provide important feedback to the Instrument Operation Teams (both in Paranal and in Garching), to the Instrument Division, and to the Data Flow groups. More details about the concepts behind a Science Verification can be found in the "Science Verification Policy and Procedures" document (available at http://www.eso.org/science/vltsv/).

  6. Diagnostic marker signature for esophageal cancer from transcriptome analysis.

    PubMed

    Warnecke-Eberz, Ute; Metzger, Ralf; Hölscher, Arnulf H; Drebber, Uta; Bollschweiler, Elfriede

    2016-05-01

    Esophageal cancer is often diagnosed at an advanced stage. Diagnostic markers are needed to detect and treat tumor cells earlier and thereby achieve a cure in esophageal cancer. In patients with locally advanced squamous cell carcinoma of the esophagus (ESCC), we profiled the gene expression of ESCC compared to corresponding normal biopsies for diagnostic markers using genome microarrays. Profiling of gene expression identified 4844 differentially expressed genes, 2122 upregulated and 2722 downregulated in ESCC. Twenty-three overexpressed candidates with the best scores from significance analysis were selected for further analysis by the TaqMan low-density array technique using a validation cohort of 40 patients. The verification rate was 100 % for ESCC. Twenty-two markers were additionally overexpressed in adenocarcinoma of the esophagus (EAC). The markers significantly overexpressed already in earlier tumor stages (pT1-2) of both histological subtypes (n = 19) were clustered in a "diagnostic signature": PLA2G7, PRAME, MMP1, MMP3, MMP12, LILRB2, TREM2, CHST2, IGFBP2, IGFBP7, KCNJ8, EMILIN2, CTHRC1, EMR2, WDR72, LPCAT1, COL4A2, CCL4, and SNX10. The marker signature will be translated to clinical practice to prove its diagnostic impact. This diagnostic signature may contribute to earlier detection of tumor cells, with the aim of complementing clinical techniques, resulting in better detection concepts for esophageal cancer, earlier therapy, and a more favorable prognosis. PMID:26631031

  7. Programmable RET Mask Layout Verification

    NASA Astrophysics Data System (ADS)

    Beale, Daniel F.; Mayhew, Jeffrey P.; Rieger, Michael L.; Tang, Zongwu

    2002-12-01

    Emerging resolution enhancement techniques (RET) and OPC are dramatically increasing the complexity of mask layouts and, in turn, of mask verification. Mask shapes needed to achieve required results on the wafer diverge significantly from corresponding shapes in the physical design, and in some cases a single chip layer may be decomposed into two masks used in multiple exposures. The mask verification challenge is to certify that a RET-synthesized mask layout will produce an acceptable facsimile of the design intent expressed in the design layout. Furthermore, tradeoffs between mask complexity, design intent, targeted process latitude, and other factors are playing a growing role in helping to control rising mask costs. All of these considerations must in turn be incorporated into the mask layout verification strategy needed for data-prep sign-off. In this paper we describe a technique for assessing the lithographic quality of mask layouts for diverse RET methods while effectively accommodating various manufacturing objectives and specifications. It leverages the familiar DRC paradigm for identifying errors and producing DRC-like error shapes in its output layout. It integrates a unique concept of "check figures": layer-based geometries that dictate where and how simulations of shapes on the wafer are to be compared to the original desired layout. We will show how this provides a highly programmable environment that makes it possible to engage in "compound" check strategies that vary based on design intent and adaptive simulation with multiple checks. Verification may be applied at the "go/no-go" level or can be used to build a body of data for quantitative analysis of lithographic behavior at multiple process conditions or for specific user-defined critical features. In addition, we will outline automated methods that guide the selection of input parameters controlling specific verification strategies.

  8. Intrusion detection using secure signatures

    DOEpatents

    Nelson, Trent Darnel; Haile, Jedediah

    2014-09-30

    A method and device for intrusion detection using secure signatures, comprising capturing network data. A search hash value, a value employing at least one one-way function, is generated from the captured network data using a first hash function. The presence of a matching search hash value in a secure signature table comprising search hash values and encrypted rules is determined. After determining a search hash value match, a decryption key is generated from the captured network data using a second hash function, a hash function different from the first hash function. One or more of the encrypted rules of the secure signature table having a hash value equal to the generated search hash value are then decrypted using the generated decryption key. The one or more decrypted secure signature rules are then processed for a match, and one or more user notifications are deployed if a match is identified.
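
    A minimal sketch of the described flow is given below: a first hash function produces the search hash used for table lookup, a second hash function derives the decryption key from the same captured data, and the matching encrypted rule is decrypted and reported. The XOR keystream used here is purely illustrative and is not the cipher specified by the patent; table contents are hypothetical.

```python
import hashlib

def xor_keystream(data: bytes, key: bytes) -> bytes:
    """Illustrative stream cipher: XOR with a repeating key (stands in for a real cipher)."""
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def make_table(network_data: bytes, rule: bytes):
    """Build a one-entry secure signature table from known-bad traffic."""
    search_hash = hashlib.sha256(network_data).hexdigest()              # first hash function
    rule_key = hashlib.blake2b(network_data, digest_size=32).digest()   # second hash function
    return {search_hash: xor_keystream(rule, rule_key)}

def check(network_data: bytes, table):
    """Detection path: hash captured data, look up, derive key, decrypt, report the rule."""
    search_hash = hashlib.sha256(network_data).hexdigest()
    if search_hash not in table:
        return None
    rule_key = hashlib.blake2b(network_data, digest_size=32).digest()
    return xor_keystream(table[search_hash], rule_key)

table = make_table(b"GET /exploit HTTP/1.1", b"ALERT: known exploit pattern")
print(check(b"GET /exploit HTTP/1.1", table))     # -> decrypted rule, triggers a notification
print(check(b"GET /index.html HTTP/1.1", table))  # -> None, no match
```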

  9. Ballistic signature identification systems study

    NASA Technical Reports Server (NTRS)

    Reich, A.; Hine, T. L.

    1976-01-01

    The results are described of an attempt to establish a uniform procedure for documenting (recording) expended bullet signatures as effortlessly as possible and to build a comprehensive library of these signatures in a form that will permit the automated comparison of a new suspect bullet with the prestored library. The ultimate objective is to achieve a standardized format that will permit nationwide interaction between police departments, crime laboratories, and other interested law enforcement agencies.

  10. Color signatures in Amorsolo paintings

    NASA Astrophysics Data System (ADS)

    Soriano, Maricor N.; Palomero, Cherry May; Cruz, Larry; Yambao, Clod Marlan Krister; Dado, Julie Mae; Salvador-Campaner, Janice May

    2010-02-01

    We present the results of a two-year project aimed at capturing quantifiable color signatures of oil paintings by Fernando Amorsolo, the Philippines' first National Artist. Color signatures are found by comparing CIE xy measurements of skin color in portraits and of ground, sky, and foliage in landscapes. The results are compared with the results of visual examination and art-historical data, as well as with works done by Amorsolo's contemporaries and mentors.

  11. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
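
    A minimal sketch of the bootstrap construction described above is given below, assuming a pseudo-predator signature is formed by resampling each prey species' signature library and mixing the bootstrap means according to a known diet; the calibration coefficients and the diet-estimation (distance-minimization) step of full QFASA are omitted, and the prey data are synthetic.

```python
import numpy as np

def pseudo_predator_signature(prey_sigs, diet, n_boot, rng):
    """Bootstrap a pseudo-predator fatty acid signature with a known diet.

    prey_sigs: dict mapping prey species -> array (n_samples, n_fatty_acids) of signatures
    diet:      dict mapping prey species -> diet proportion (sums to 1)
    n_boot:    bootstrap sample size drawn per prey species
    """
    n_fa = next(iter(prey_sigs.values())).shape[1]
    mix = np.zeros(n_fa)
    for species, proportion in diet.items():
        sigs = prey_sigs[species]
        idx = rng.integers(0, sigs.shape[0], size=n_boot)    # bootstrap resampling of prey
        mix += proportion * sigs[idx].mean(axis=0)           # weighted mean bootstrap signature
    return mix / mix.sum()                                   # renormalize to a proportion vector

rng = np.random.default_rng(1)
prey = {"seal": rng.dirichlet(np.ones(5), size=30),          # hypothetical prey signature libraries
        "whale": rng.dirichlet(np.ones(5), size=30)}
print(pseudo_predator_signature(prey, {"seal": 0.7, "whale": 0.3}, n_boot=20, rng=rng))
```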

  12. Quantum messages with signatures forgeable in arbitrated quantum signature schemes

    NASA Astrophysics Data System (ADS)

    Kim, Taewan; Choi, Jeong Woon; Jho, Nam-Su; Lee, Soojoon

    2015-02-01

    Even though a method to perfectly sign quantum messages is not known, the arbitrated quantum signature scheme has been considered one of the good candidates. However, its forgery problem has been an obstacle to the scheme becoming a successful method. In this paper, we consider a situation slightly different from the forgery problem: we check whether at least one quantum message with signature can be forged in a given scheme, even though not all messages can be forged. If there are only a finite number of forgeable quantum messages in the scheme, then the scheme can be secured against the forgery attack by not sending forgeable quantum messages, and so our situation does not directly imply that we check whether the scheme is secure against the attack. However, if users run a given scheme without any consideration of forgeable quantum messages, then a sender might transmit such forgeable messages to a receiver, and in such a case an attacker can forge the messages if the attacker knows them. Thus it is important and necessary to look into forgeable quantum messages. We show here that there always exists such a forgeable quantum message-signature pair for every known scheme with quantum encryption and rotation, and we numerically show that there are no forgeable quantum message-signature pairs in an arbitrated quantum signature scheme.

  13. Hybrid Enrichment Assay Methods for a UF6 Cylinder Verification Station: FY10 Progress Report

    SciTech Connect

    Smith, Leon E.; Jordan, David V.; Orton, Christopher R.; Misner, Alex C.; Mace, Emily K.

    2010-08-01

    Pacific Northwest National Laboratory (PNNL) is developing the concept of an automated UF6 cylinder verification station that would be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until the arrival of International Atomic Energy Agency (IAEA) inspectors. At the center of this unattended system is a hybrid enrichment assay technique that combines the traditional enrichment-meter method (based on the 186 keV peak from 235U) with non-traditional neutron-induced high-energy gamma-ray signatures (spawned primarily by 234U alpha emissions and 19F(alpha, neutron) reactions). Previous work by PNNL provided proof-of-principle for the non-traditional signatures to support accurate, full-volume interrogation of the cylinder enrichment, thereby reducing the systematic uncertainties in enrichment assay due to UF6 heterogeneity and providing greater sensitivity to material substitution scenarios. The work described here builds on that preliminary evaluation of the non-traditional signatures, but focuses on a prototype field system utilizing NaI(Tl) and LaBr3(Ce) spectrometers, and enrichment analysis algorithms that integrate the traditional and non-traditional signatures. Results for the assay of Type-30B cylinders ranging from 0.2 to 4.95 wt% 235U, at an AREVA fuel fabrication plant in Richland, WA, are described for the following enrichment analysis methods: 1) traditional enrichment meter signature (186 keV peak) as calculated using a square-wave convolute (SWC) algorithm; 2) non-traditional high-energy gamma-ray signature that provides neutron detection without neutron detectors and 3) hybrid algorithm that merges the traditional and non-traditional signatures. Uncertainties for each method, relative to the declared enrichment for each cylinder, are calculated and compared to the uncertainties from an attended
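    The report's hybrid algorithm is not reproduced here; the short sketch below only illustrates one common way to merge two independent enrichment estimates (inverse-variance weighting), with all numbers invented for illustration rather than taken from the AREVA measurement campaign.

    ```python
    def fuse_enrichment(e_trad, var_trad, e_nontrad, var_nontrad):
        """Inverse-variance weighted fusion of two independent enrichment estimates.

        e_*   : enrichment estimates (wt% 235U) from the traditional 186 keV method
                and the non-traditional high-energy gamma-ray method.
        var_* : their estimated variances.
        Returns the fused estimate and its variance.
        """
        w_trad = 1.0 / var_trad
        w_nontrad = 1.0 / var_nontrad
        fused = (w_trad * e_trad + w_nontrad * e_nontrad) / (w_trad + w_nontrad)
        return fused, 1.0 / (w_trad + w_nontrad)

    # Hypothetical cylinder measured by both channels
    print(fuse_enrichment(3.25, 0.04, 3.10, 0.09))
    ```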

  14. Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2014-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on development of asset fault signatures to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features based on technical examinations that can be used to detect a specific fault type. At the most basic level, fault signatures are composed of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
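    As a rough illustration of the structure just described (an asset type, a fault type, and a set of fault features), a minimal sketch follows. The field names, example strings, and matching rule are hypothetical and are not the FW-PHM Suite schema.

    ```python
    from dataclasses import dataclass, field
    from typing import List

    @dataclass
    class FaultFeature:
        name: str          # e.g. "bearing temperature"  (hypothetical symptom name)
        behavior: str      # e.g. "trending high relative to baseline"

    @dataclass
    class AssetFaultSignature:
        asset_type: str              # e.g. "emergency diesel generator"
        fault_type: str              # e.g. "bearing degradation"
        features: List[FaultFeature] = field(default_factory=list)

        def matches(self, observed: set) -> float:
            """Fraction of the signature's features seen in the observed symptom set."""
            if not self.features:
                return 0.0
            hits = sum(1 for f in self.features if f.name in observed)
            return hits / len(self.features)

    afs = AssetFaultSignature(
        asset_type="emergency diesel generator",
        fault_type="bearing degradation",
        features=[FaultFeature("bearing temperature", "trending high"),
                  FaultFeature("vibration spectrum", "elevated running-speed peak")])
    print(afs.matches({"bearing temperature", "lube oil pressure"}))   # 0.5
    ```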

  15. Experimental verification of quantum computation

    NASA Astrophysics Data System (ADS)

    Barz, Stefanie; Fitzsimons, Joseph F.; Kashefi, Elham; Walther, Philip

    2013-11-01

    Quantum computers are expected to offer substantial speed-ups over their classical counterparts and to solve problems intractable for classical computers. Beyond such practical significance, the concept of quantum computation opens up fundamental questions, among them the issue of whether quantum computations can be certified by entities that are inherently unable to compute the results themselves. Here we present the first experimental verification of quantum computation. We show, in theory and experiment, how a verifier with minimal quantum resources can test a significantly more powerful quantum computer. The new verification protocol introduced here uses the framework of blind quantum computing and is independent of the experimental quantum-computation platform used. In our scheme, the verifier is required only to generate single qubits and transmit them to the quantum computer. We experimentally demonstrate this protocol using four photonic qubits and show how the verifier can test the computer's ability to perform quantum computation.

  16. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256
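    ASMOV's actual similarity measures and its semantic verification pass are considerably richer than anything shown here; the toy sketch below only illustrates the general idea of iterating a blend of lexical and structural similarity and then reading off an alignment. All names, parent relations, and weights are invented.

    ```python
    from difflib import SequenceMatcher
    import numpy as np

    def iterative_alignment(names1, names2, parents1, parents2, alpha=0.6, iters=10):
        """Toy iterative blend of lexical and structural similarity between two
        ontologies. parents* map a concept index to its parent index (None at the
        root). ASMOV itself uses richer features plus a verification step."""
        lex = np.array([[SequenceMatcher(None, a, b).ratio() for b in names2]
                        for a in names1])
        sim = lex.copy()
        for _ in range(iters):
            struct = np.zeros_like(sim)
            for i, pi in enumerate(parents1):
                for j, pj in enumerate(parents2):
                    if pi is not None and pj is not None:
                        struct[i, j] = sim[pi, pj]       # similarity of the parents
            sim = alpha * lex + (1 - alpha) * struct     # blend lexical and structural
        # read off each concept's best-matching counterpart
        return {i: int(sim[i].argmax()) for i in range(len(names1))}

    names1 = ["person", "student", "teacher"]
    names2 = ["human", "pupil", "instructor"]
    parents1 = [None, 0, 0]       # student and teacher are children of person
    parents2 = [None, 0, 0]
    print(iterative_alignment(names1, names2, parents1, parents2))
    ```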

  17. Ontology Matching with Semantic Verification.

    PubMed

    Jean-Mary, Yves R; Shironoshita, E Patrick; Kabuka, Mansur R

    2009-09-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies.

  18. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  19. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  20. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  1. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. Regression verification addresses the problem of proving equivalence of closely related program
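    A toy illustration of the final equivalence check on a single impacted behavior follows, assuming the Z3 Python bindings are available. The symbolic-execution and impact-classification machinery of the paper is not reproduced, and the path condition and expressions are invented.

    ```python
    # Check whether two program versions agree on one "impacted" symbolic behavior
    # using an off-the-shelf decision procedure (Z3).
    from z3 import Int, Solver, And, Not, sat

    x = Int("x")

    out_v1 = 2 * x + 2          # version 1's output on the impacted path
    out_v2 = 2 * (x + 1)        # version 2's output on the same path
    path_condition = x > 0

    s = Solver()
    s.add(And(path_condition, Not(out_v1 == out_v2)))  # look for a disagreeing input
    print("equivalent on this path" if s.check() != sat else "behaviors differ")
    ```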

  2. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100s of warheads, and then 10s of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100s, and 10s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  3. Signature molecular descriptor : advanced applications.

    SciTech Connect

    Visco, Donald Patrick, Jr.

    2010-04-01

    In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in highthroughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area and serendipity was the mechanism for solution. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge into a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in the academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem as well as how to go from an output of the algorithm to an actual chemical structure. This report

  4. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. All

  5. Measurement of sniper infrared signatures

    NASA Astrophysics Data System (ADS)

    Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.

    2009-09-01

    The paper presents some practical aspects of sniper IR signature measurements. Descriptions of the particular signatures of a sniper and the background in typical scenarios are presented. We take into consideration sniper activities in open areas as well as in urban environments. The measurements were made at a field test ground. High-precision laboratory measurements were also performed. Several infrared cameras were used during the measurements to cover all measurement assumptions. Some of the cameras are measurement-class devices with high accuracy and speed. The others are microbolometer cameras with FPA detectors similar to those used in real commercial counter-sniper systems. The registration was made in the SWIR and LWIR spectral bands simultaneously. An ultra-fast visual camera was also used for visible-spectrum registration. Exemplary sniper IR signatures for typical situations are presented.

  6. Graph Analytics for Signature Discovery

    SciTech Connect

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh; Lo, Chaomei

    2013-06-01

    Within large amounts of seemingly unstructured data it can be difficult to find signatures of events. In our work we transform unstructured data into a graph representation. By doing this we expose underlying structure in the data and can take advantage of existing graph analytics capabilities, as well as develop new capabilities. Currently we focus on applications in cybersecurity and communication domains. Within cybersecurity we aim to find signatures for perpetrators using the pass-the-hash attack, and in communications we look for emails or phone calls going up or down a chain of command. In both of these areas, and in many others, the signature we look for is a path with certain temporal properties. In this paper we discuss our methodology for finding these temporal paths within large graphs.
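    A minimal sketch of searching for time-respecting paths of the kind described above (each successive edge later than the previous one) follows. The event tuples are invented, and a real deployment would operate on much larger graphs with additional temporal constraints.

    ```python
    from collections import defaultdict

    def temporal_paths(edges, max_len=4):
        """Enumerate paths whose edge timestamps strictly increase.

        edges : iterable of (src, dst, time) tuples, e.g. email or logon events.
        Yields lists of (src, dst, time) forming time-respecting paths.
        """
        out = defaultdict(list)
        for src, dst, t in edges:
            out[src].append((dst, t))

        def extend(path):
            last_dst, last_t = path[-1][1], path[-1][2]
            yield path
            if len(path) >= max_len:
                return
            for dst, t in out[last_dst]:
                if t > last_t:                      # must move forward in time
                    yield from extend(path + [(last_dst, dst, t)])

        for src, dst, t in edges:
            yield from extend([(src, dst, t)])

    events = [("a", "b", 1), ("b", "c", 2), ("c", "d", 3), ("b", "a", 5)]
    for p in temporal_paths(events):
        print(p)
    ```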

  7. Materials with controllable signature properties

    NASA Astrophysics Data System (ADS)

    Dickman, O.; Holmberg, B.; Karlsson, T.; Savage, S.

    1995-02-01

    We have in this report considered some types of material with potential for use in signature control of structures. The material types selected for inclusion in this study were electrically conductive polymers, fullerenes, nanostructured materials and Langmuir-Blodgett films. To control the signature of a structure in real time it must be possible to vary the material emissivity, structural transmission, and reflection or absorption of electromagnetic radiation in the relevant wavelength region. This may be achieved by changes in temperature, pressure, electrical or magnetic field or by the concentration of a chemical substance within the material. It is concluded that it is feasible to develop electrically conductive polymeric materials with controllable properties for practical signature control application within 5 to 10 years.

  8. Signature Visualization of Software Binaries

    SciTech Connect

    Panas, T

    2008-07-01

    In this paper we present work on the visualization of software binaries. In particular, we utilize ROSE, an open source compiler infrastructure, to pre-process software binaries, and we apply a landscape metaphor to visualize the signature of each binary (malware). We define the signature of a binary as a metric-based layout of the functions contained in the binary. In our initial experiment, we visualize the signatures of a series of computer worms that all originate from the same line. These visualizations are useful for a number of reasons. First, the images reveal how the archetype has evolved over a series of versions of one worm. Second, one can see the distinct changes between versions. This allows the viewer to form conclusions about the development cycle of a particular worm.

  9. Online Learning Grows Up.

    ERIC Educational Resources Information Center

    Vail, Kathleen

    2001-01-01

    Describes various American efforts to develop online schools and classes. Discusses attributes of successful online teachers and students. Lists 17 online learning support companies and their Web sites. (PKP)

  10. catRAPID signature: identification of ribonucleoproteins and RNA-binding regions

    PubMed Central

    Livi, Carmen Maria; Klus, Petr; Delli Ponti, Riccardo; Tartaglia, Gian Gaetano

    2016-01-01

    Motivation: Recent technological advances revealed that an unexpected large number of proteins interact with transcripts even if the RNA-binding domains are not annotated. We introduce catRAPID signature to identify ribonucleoproteins based on physico-chemical features instead of sequence similarity searches. The algorithm, trained on human proteins and tested on model organisms, calculates the overall RNA-binding propensity followed by the prediction of RNA-binding regions. catRAPID signature outperforms other algorithms in the identification of RNA-binding proteins and detection of non-classical RNA-binding regions. Results are visualized on a webpage and can be downloaded or forwarded to catRAPID omics for predictions of RNA targets. Availability and implementation: catRAPID signature can be accessed at http://s.tartaglialab.com/new_submission/signature. Contact: gian.tartaglia@crg.es or gian@tartaglialab.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26520853

  11. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  12. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all the applicable calibrations and verifications in subpart D of this part, including the linearity verifications...

  13. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  14. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  15. Ballistic Signature Identification System Study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The first phase of a research project directed toward development of a high speed automatic process to be used to match gun barrel signatures imparted to fired bullets was documented. An optical projection technique has been devised to produce and photograph a planar image of the entire signature, and the phototransparency produced is subjected to analysis using digital Fourier transform techniques. The success of this approach appears to be limited primarily by the accuracy of the photographic step since no significant processing limitations have been encountered.
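    The study's optical and photographic pipeline is not reproduced here; the sketch below only illustrates the Fourier-domain matching idea on a simulated one-dimensional striation profile, where circular cross-correlation keeps the score insensitive to a rotational offset between bullets. The profile data are synthetic.

    ```python
    import numpy as np

    def signature_match(sig_a, sig_b):
        """Peak normalized circular cross-correlation of two 1-D barrel signature
        profiles, computed in the Fourier domain so that a rotational offset
        between the bullets does not penalize the match."""
        a = (sig_a - sig_a.mean()) / sig_a.std()
        b = (sig_b - sig_b.mean()) / sig_b.std()
        corr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real / len(a)
        shift = int(corr.argmax())          # best circular alignment
        return corr[shift], shift

    rng = np.random.default_rng(0)
    profile = rng.normal(size=360)                 # simulated striation profile, 1 deg bins
    rotated = np.roll(profile, 37) + 0.1 * rng.normal(size=360)
    print(signature_match(profile, rotated))       # strong match despite the rotation
    ```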

  16. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  17. The Global Diffusion of Societal Verification Tools: A Quantitative Assessment of the Public’s Ability to Engage Nonproliferation Treaty Monitoring

    SciTech Connect

    Sayre, Amanda M.; Kreyling, Sean J.; West, Curtis L.

    2015-07-11

    The spread of nuclear and dual-use technologies and the need for more robust, effective and efficient nonproliferation and arms control treaties has led to an increasing need for innovative verification approaches and technologies. This need, paired with advancements in online computing, mobile devices, commercially available satellite imagery and the evolution of online social networks, has led to a resurgence of the concept of societal verification for arms control and nonproliferation treaties. In the event a country accepts its citizens’ assistance in supporting transparency, confidence-building and societal verification, the host government will need a population that is willing and able to participate. While scholarly interest in societal verification continues to grow, social scientific research on the topic is lacking. The aim of this paper is to begin the process of understanding public societal verification capabilities, extend the availability of quantitative research on societal verification and set in motion complementary research to increase the breadth and depth of knowledge on this topic. This paper presents a potential framework and outlines a research roadmap for the development of such a societal verification capability index.

  18. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines and a screen file containing verification criteria can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
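    A minimal sketch of the screen-file idea follows, assuming verification criteria are stored per station. The station identifier, limits, and checks are hypothetical, and a real system would include many more statistical tests.

    ```python
    # Computerized data-verification subroutines driven by a "screen file" of
    # criteria; the station name and limits below are invented for illustration.
    SCREEN_FILE = {
        # station: minimum, maximum, and maximum allowed change per time step
        "08MH001": {"min": 0.0, "max": 2500.0, "max_step": 300.0},
    }

    def verify_series(station, values):
        """Return a list of (index, message) flags for later manual review."""
        crit = SCREEN_FILE[station]
        flags = []
        for i, v in enumerate(values):
            if not crit["min"] <= v <= crit["max"]:
                flags.append((i, f"value {v} outside [{crit['min']}, {crit['max']}]"))
            if i > 0 and abs(v - values[i - 1]) > crit["max_step"]:
                flags.append((i, f"jump of {abs(v - values[i-1]):.1f} exceeds {crit['max_step']}"))
        return flags

    print(verify_series("08MH001", [120.0, 135.0, 980.0, 990.0, -5.0]))
    ```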

  19. Guide to good practices for independent verification

    SciTech Connect

    1998-12-01

    This Guide to Good Practices is written to enhance understanding of, and provide direction for, Independent Verification, Chapter X of Department of Energy (DOE) Order 5480.19, Conduct of Operations Requirements for DOE Facilities. The practices in this guide should be considered when planning or reviewing independent verification activities. Contractors are advised to adopt procedures that meet the intent of DOE Order 5480.19. Independent Verification is an element of an effective Conduct of Operations program. The complexity and array of activities performed in DOE facilities dictate the necessity for coordinated independent verification activities to promote safe and efficient operations.

  20. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  1. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  2. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  3. Topological Signatures for Population Admixture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Topological Signatures for Population AdmixtureDeniz Yorukoglu1, Filippo Utro1, David Kuhn2, Saugata Basu3 and Laxmi Parida1* Abstract Background: As populations with multi-linear transmission (i.e., mixing of genetic material from two parents, say) evolve over generations, the genetic transmission...

  4. Graph signatures for visual analytics.

    PubMed

    Wong, Pak Chung; Foote, Harlan; Chin, George; Mackey, Patrick; Perrine, Ken

    2006-01-01

    We present a visual analytics technique to explore graphs using the concept of a data signature. A data signature, in our context, is a multidimensional vector that captures the local topology information surrounding each graph node. Signature vectors extracted from a graph are projected onto a low-dimensional scatterplot through the use of scaling. The resultant scatterplot, which reflects the similarities of the vectors, allows analysts to examine the graph structures and their corresponding real-life interpretations through repeated use of brushing and linking between the two visualizations. The interpretation of the graph structures is based on the outcomes of multiple participatory analysis sessions with intelligence analysts conducted by the authors at the Pacific Northwest National Laboratory. The paper first uses three public domain data sets with either well-known or obvious features to explain the rationale of our design and illustrate its results. More advanced examples are then used in a customized usability study to evaluate the effectiveness and efficiency of our approach. The study results reveal not only the limitations and weaknesses of the traditional approach based solely on graph visualization, but also the advantages and strengths of our signature-guided approach presented in the paper.
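    A minimal sketch of the signature-vector idea follows, assuming networkx and scikit-learn are available and using multidimensional scaling for the projection; the paper's actual signature components and scaling method may differ, and the example graph is just a convenient built-in dataset.

    ```python
    import numpy as np
    import networkx as nx
    from sklearn.manifold import MDS

    G = nx.karate_club_graph()                       # any graph will do

    # A simple per-node "data signature": local topology summarized as a vector.
    clust = nx.clustering(G)
    signature = np.array([
        [G.degree(n),                                            # own degree
         clust[n],                                               # local clustering
         np.mean([G.degree(m) for m in G.neighbors(n)])]         # mean neighbor degree
        for n in G.nodes()
    ])

    # Project the signature vectors to 2-D for a scatterplot; nodes with similar
    # local structure land close together and can be brushed/linked to the graph view.
    xy = MDS(n_components=2, random_state=0).fit_transform(signature)
    print(xy[:5])
    ```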

  5. Invisibly Sanitizable Digital Signature Scheme

    NASA Astrophysics Data System (ADS)

    Miyazaki, Kunihiko; Hanaoka, Goichiro; Imai, Hideki

    A digital signature does not allow any alteration of the document to which it is attached. Appropriate alteration of some signed documents, however, should be allowed because there are security requirements other than the integrity of the document. In the disclosure of official information, for example, sensitive information such as personal information or national secrets is masked when an official document is sanitized so that its nonsensitive information can be disclosed when it is requested by a citizen. If this disclosure is done digitally by using the current digital signature schemes, the citizen cannot verify the disclosed information because it has been altered to prevent the leakage of sensitive information. The confidentiality of official information is thus incompatible with the integrity of that information, and this is called the digital document sanitizing problem. Conventional solutions such as content extraction signatures and digitally signed document sanitizing schemes with disclosure condition control can either let the sanitizer assign disclosure conditions or hide the number of sanitized portions. The digitally signed document sanitizing scheme we propose here is based on the aggregate signature derived from bilinear maps and can do both. Moreover, the proposed scheme can sanitize a signed document invisibly, that is, no one can distinguish whether the signed document has been sanitized or not.

  6. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

    Web applications are both the consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify if a policy is satisfied.

  7. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  8. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  9. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  10. Online Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Meyer Jordan, Bradley, IV; The, Lih-Sin; Robbins, Stuart

    2004-05-01

    Nuclear-reaction network codes are important to astronomers seeking to explore nucleosynthetic implications of astrophysical models and to nuclear physicists seeking to understand the role of nuclear properties or reaction rates in element formation. However, many users do not have the time or inclination to download and compile the codes, to manage the requisite input files, or to explore the often complex output with their own graphics programs. To help make nucleosynthesis calculations more readily available, we have placed the Clemson Nucleosynthesis code on the world-wide web at http://www.ces.clemson.edu/physics/nucleo/nuclearNetwork. At this web site, any Internet user may set his or her own reaction network, nuclear properties and reaction rates, and thermodynamic trajectories. The user then submits the nucleosynthesis calculation, which runs on a dedicated server professionally maintained at Clemson University. Once the calculation is completed, the user may explore the results through dynamically produced and downloadable tables and graphs. Online help guides the user through the necessary steps. We hope this web site will prove a user-friendly and helpful tool for professional scientists as well as for students seeking to explore element formation.

  11. Irma 5.2 multi-sensor signature prediction model

    NASA Astrophysics Data System (ADS)

    Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles

    2007-04-01

    The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after an extensive verification and validation of an upgraded and reengineered active channel. Since 2005, the reengineering effort has focused on the Irma passive channel. Field measurements for the validation effort include the unpolarized data collection. Irma 5.2 is scheduled for release in the summer of 2007. This paper will report the validation test results of the Irma passive models and discuss the new features in Irma 5.2.

  12. Irma 5.2 multi-sensor signature prediction model

    NASA Astrophysics Data System (ADS)

    Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Pau, John

    2008-04-01

    The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/RW) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after extensive verification and validation of an upgraded and reengineered ladar channel. The reengineering effort then shifted focus to the Irma passive channel. Field measurements for the validation effort include both polarized and unpolarized data collection. Irma 5.2 was released in 2007 with a reengineered passive channel. This paper summarizes the capabilities of Irma and the progress toward Irma 5.3, which includes a reengineered radar channel.

  13. Microcalibrator system for chemical signature and reagent delivery.

    SciTech Connect

    Staton, Alan W.; Simonson, Robert Joseph; Adkins, Douglas Ray; Rawlinson, Kim Scott; Robinson, Alex Lockwood; Hance, Bradley G.; Manginell, Ronald Paul; Sanchez, Lawrence James; Ellison, Jennifer Anne; Sokolowski, Sara Suzette

    2005-03-01

    Networked systems of low-cost, small, integrable chemical sensors will enable monitoring of Nonproliferation and Materials Control targets and chemical weapons threats. Sandia-designed prototype chemical sensor systems are undergoing extended field testing supported by DOE and other government agencies. A required surety component will be verification of microanalytical system performance, which can be achieved by providing a programmable source of chemical signature(s) for autonomous calibration of analytical systems. In addition, such a controlled chemical source could be used to dispense microaliquots of derivatization reagents, extending the analysis capability of chemical sensors to a wider range of targets. We have developed a microfabricated system for controlled release of selected compounds (calibrants) into the analytical stream of microsensor systems. To minimize pumping and valve requirements of microfluidic systems, and to avoid degradation issues associated with storage of dilute solutions, we have utilized thermally labile organic salts as solid-phase reservoir materials. Reproducible deposition of tetrapropyl ammonium hydroxide onto arrays of microfabricated heating elements can provide a pair of calibration marker compounds (one fast and one slow-eluting compound) for GC analyses. The use of this microaliquot gas source array for hydrogen generation is currently under further development. The goal of the latter effort will be to provide a source of high-pressure, low viscosity GC carrier gas for Sandia's next-generation microfabricated gas-phase chemical analysis systems.

  14. Block truncation signature coding for hyperspectral analysis

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Chang, Chein-I.

    2008-08-01

    This paper introduces a new signature coding scheme designed on the basis of the well-known Block Truncation Coding (BTC). It comprises bit-maps of the signature blocks generated by different threshold criteria. Two new BTC-based algorithms are developed for signature coding, called Block Truncation Signature Coding (BTSC) and 2-level BTSC (2BTSC). In order to compare the developed BTC-based algorithms with current binary signature coding schemes, such as the Spectral Program Analysis Manager (SPAM) developed by Mazer et al. and Spectral Feature-based Binary Coding (SFBC) by Qian et al., three different thresholding functions (local block mean, local block gradient, and local block correlation) are derived to improve BTSC performance, where the combined bit-maps generated by these thresholds can provide better spectral signature characterization. Experimental results reveal that the new BTC-based signature coding characterizes spectral variations more effectively than currently available binary signature coding methods.
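    A minimal sketch of a mean-threshold bit-map code in the spirit of BTSC follows. The block size, band count, and Hamming-distance comparison are placeholders, and the gradient- and correlation-based thresholds mentioned above would simply contribute additional bit-maps.

    ```python
    import numpy as np

    def btsc_bitmap(spectrum, block_size=8):
        """Bit-map signature code: threshold each block at its local mean."""
        n_blocks = len(spectrum) // block_size
        bits = []
        for b in range(n_blocks):
            block = spectrum[b * block_size:(b + 1) * block_size]
            bits.append(block > block.mean())        # 1 where the band exceeds the block mean
        return np.concatenate(bits)

    def code_distance(code_a, code_b):
        """Hamming distance between two bit-map codes."""
        return int(np.count_nonzero(code_a != code_b))

    rng = np.random.default_rng(1)
    sig_a = rng.random(224)                          # e.g. a 224-band hyperspectral pixel
    sig_b = sig_a + 0.02 * rng.normal(size=224)      # a slightly perturbed copy
    print(code_distance(btsc_bitmap(sig_a), btsc_bitmap(sig_b)))
    ```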

  15. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.

  16. CAMPR3: a database on sequences, structures and signatures of antimicrobial peptides

    PubMed Central

    Waghu, Faiza Hanif; Barai, Ram Shankar; Gurung, Pratima; Idicula-Thomas, Susan

    2016-01-01

    Antimicrobial peptides (AMPs) are known to have family-specific sequence composition, which can be mined for discovery and design of AMPs. Here, we present CAMPR3, an update to the existing CAMP database available online at www.camp3.bicnirrh.res.in. It is a database of sequences, structures and family-specific signatures of prokaryotic and eukaryotic AMPs. Family-specific sequence signatures comprising patterns and Hidden Markov Models were generated for 45 AMP families by analysing 1386 experimentally studied AMPs. These were further used to retrieve AMPs from online sequence databases. More than 4000 AMPs could be identified using these signatures. AMP family signatures provided in CAMPR3 can thus be used to accelerate and expand the discovery of AMPs. CAMPR3 presently holds 10247 sequences, 757 structures and 114 family-specific signatures of AMPs. Users can make use of the sequence optimization algorithm for rational design of AMPs. The database, integrated with tools for AMP sequence and structure analysis, will be a valuable resource for family-based studies on AMPs. PMID:26467475
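    As a rough illustration of applying family-specific sequence signatures, the sketch below scans a peptide against regular-expression patterns. The patterns shown are invented stand-ins, not CAMPR3's curated signatures, which also include Hidden Markov Models.

    ```python
    import re

    # Hypothetical family-specific sequence patterns rendered as regular
    # expressions; the real CAMPR3 signatures are curated per family.
    FAMILY_PATTERNS = {
        "cecropin_like": r"^[KR]W[KR].{1,3}[KR]{2}",      # basic, Trp-containing N-terminus (illustrative only)
        "defensin_like": r"C.{2,6}C.{3,8}C.{3,8}C",       # spaced cysteine framework (illustrative only)
    }

    def assign_families(peptide):
        """Return the families whose signature pattern matches the peptide."""
        return [fam for fam, pat in FAMILY_PATTERNS.items()
                if re.search(pat, peptide)]

    print(assign_families("KWKLFKKIEKVGQNIRDGIIKAGPAVAVVGQATQIAK"))  # cecropin A as a toy query
    ```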

  17. Online Search Optimization.

    ERIC Educational Resources Information Center

    Homan, Michael; Worley, Penny

    This course syllabus describes methods for optimizing online searching, using as an example searching on the National Library of Medicine (NLM) online system. Four major activities considered are the online interview, query analysis and search planning, online interaction, and post-search analysis. Within the context of these activities, concepts…

  18. Strategies for Online Educators

    ERIC Educational Resources Information Center

    Motte, Kristy

    2013-01-01

    For a variety of reasons, online education is an increasingly viable option for many students seeking to further their education. Because of this, the demand for online instructors continues to increase. Instructors transitioning to the online environment from the traditional classroom may find teaching online overwhelming. While some practices…

  19. University Student Online Plagiarism

    ERIC Educational Resources Information Center

    Wang, Yu-mei

    2008-01-01

    This article reports a study investigating university student online plagiarism. The following questions are investigated: (a) What is the incidence of student online plagiarism? (b) What are student perceptions regarding online plagiarism? (c) Are there any differences in terms of student perceptions of online plagiarism and print plagiarism? (d)…

  20. Online Organic Chemistry

    ERIC Educational Resources Information Center

    Janowicz, Philip A.

    2010-01-01

    This is a comprehensive study of the many facets of an entirely online organic chemistry course. Online homework with structure-drawing capabilities was found to be more effective than written homework. Online lecture was found to be just as effective as in-person lecture, and students prefer an online lecture format with shorter Webcasts. Online…

  1. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

    This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with those from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  2. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

    Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. The concept of cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways. Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. It is worth considering at key points in the process.

  3. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

    We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
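    As a rough illustration of the estimation idea (a sketch only, not the authors' implementation), the fragment below runs the classic HMM forward algorithm over an observation sequence in which sampling gaps are marked with None, marginalizing over all hidden states wherever an event was missed; the model matrices and the gap marker are hypothetical.

        import numpy as np

        def forward_with_gaps(pi, A, B, obs):
            """HMM forward pass where None entries in obs mark sampling-induced gaps.

            pi : (S,) initial state distribution
            A  : (S, S) transition matrix, A[i, j] = P(next = j | current = i)
            B  : (S, O) emission matrix,  B[i, k] = P(observation = k | state = i)
            Returns P(observation sequence), summing over hidden states across the gaps.
            """
            alpha = pi * (B[:, obs[0]] if obs[0] is not None else 1.0)
            for o in obs[1:]:
                alpha = alpha @ A                  # propagate the state distribution one step
                if o is not None:                  # weight by the emission only when observed
                    alpha = alpha * B[:, o]
            return alpha.sum()

        # Toy two-state model; all numbers are illustrative only.
        pi = np.array([0.9, 0.1])
        A = np.array([[0.95, 0.05], [0.10, 0.90]])
        B = np.array([[0.8, 0.2], [0.3, 0.7]])
        print(forward_with_gaps(pi, A, B, [0, None, 1, 0]))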

  4. A Verification Method for MASOES.

    PubMed

    Perozo, N; Aguilar Perozo, J; Terán, O; Molina, H

    2013-02-01

    MASOES is an agent architecture for designing and modeling self-organizing and emergent systems. This architecture describes the elements, relationships, and mechanisms, both at the individual and the collective levels, that favor the analysis of the self-organizing and emergent phenomenon without mathematically modeling the system. In this paper, a method is proposed for verifying MASOES from the point of view of design in order to study the self-organizing and emergent behaviors of the modeled systems. The verification criteria are set according to what is proposed in MASOES for modeling self-organizing and emergent systems and the principles of the wisdom-of-crowds paradigm and fuzzy cognitive map (FCM) theory. The verification method for MASOES has been implemented in a tool called FCM Designer and has been tested to model a community of free software developers that works under the bazaar style as well as a Wikipedia community, in order to study their behavior and determine their self-organizing and emergent capacities.
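    For readers unfamiliar with fuzzy cognitive maps, the fragment below is a minimal generic FCM iteration (concept activations pushed through a weighted adjacency matrix and a squashing function until they settle); the weights and concept names are invented and it does not reproduce the FCM Designer tool.

        import numpy as np

        def run_fcm(weights, state, steps=50, tol=1e-6):
            """Iterate a fuzzy cognitive map: state' = sigmoid(state + weights.T @ state)."""
            sigmoid = lambda x: 1.0 / (1.0 + np.exp(-x))
            for _ in range(steps):
                new_state = sigmoid(state + weights.T @ state)
                if np.max(np.abs(new_state - state)) < tol:   # reached a fixed point
                    return new_state
                state = new_state
            return state

        # Three illustrative concepts: cooperation, code quality, contributor influx.
        W = np.array([[0.0, 0.6, 0.3],
                      [0.4, 0.0, 0.5],
                      [0.2, 0.3, 0.0]])
        print(run_fcm(W, np.array([0.5, 0.5, 0.5])))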

  5. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
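    The abstract does not give the fusion rule, so the snippet below only illustrates the general pattern of combining an "inside" and an "outside" similarity into one match score and thresholding it; the convex weight and threshold are assumptions, not the authors' formula.

        def match_score(inside_sim, outside_sim, w=0.5):
            """Fuse the two similarities with a simple convex weight (illustrative only)."""
            return w * inside_sim + (1.0 - w) * outside_sim

        def verify(inside_sim, outside_sim, threshold=0.7):
            """Accept the claimed identity when the fused score clears the operating threshold."""
            return match_score(inside_sim, outside_sim) >= threshold

        print(verify(0.82, 0.65))   # True for these illustrative similarities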

  6. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, “inside similarity” and “outside similarity” are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  7. GENERIC VERIFICATION PROTOCOL FOR THE VERIFICATION OF PESTICIDE SPRAY DRIFT REDUCTION TECHNOLOGIES FOR ROW AND FIELD CROPS

    EPA Science Inventory

    This ETV program generic verification protocol was prepared and reviewed for the Verification of Pesticide Drift Reduction Technologies project. The protocol provides a detailed methodology for conducting and reporting results from a verification test of pesticide drift reductio...

  8. Students' Verification Strategies for Combinatorial Problems

    ERIC Educational Resources Information Center

    Mashiach Eizenberg, Michal; Zaslavsky, Orit

    2004-01-01

    We focus on a major difficulty in solving combinatorial problems, namely, on the verification of a solution. Our study aimed at identifying undergraduate students' tendencies to verify their solutions, and the verification strategies that they employ when solving these problems. In addition, an attempt was made to evaluate the level of efficiency…

  9. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  10. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  12. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  14. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  15. 29 CFR 1903.19 - Abatement verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 29 Labor 5 2014-07-01 2014-07-01 false Abatement verification. 1903.19 Section 1903.19 Labor Regulations Relating to Labor (Continued) OCCUPATIONAL SAFETY AND HEALTH ADMINISTRATION, DEPARTMENT OF LABOR INSPECTIONS, CITATIONS AND PROPOSED PENALTIES § 1903.19 Abatement verification. Purpose. OSHA's inspections are intended to result in...

  16. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  17. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

    This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting cleaning/verification media.

  18. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for "conserving quotas" are suggested. 4 refs., 1 fig.

  19. Attitudes toward buying online.

    PubMed

    Yang, Bijou; Lester, David

    2004-02-01

    A survey of 11 positive features and 10 discouraging features of online shopping was carried out on 180 students and identified certain behavioral patterns for online shoppers versus non-shoppers. It was found that online shoppers have consistently stronger positive feelings about online shopping than do non-shoppers. On the other hand, non-shoppers have more negative feelings about online shopping than do shoppers, but not consistently so. Online shoppers are aware of some of the discouraging features of online shopping, but these features do not deter them from shopping online. The implication for marketers is that they should focus on making the experience of online shopping more accommodating and more user-friendly since the positive features of online shopping ("convenience" and "efficiency") appear to be more important than the negative features ("effort/impersonality").

  20. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.
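    To make the rule-based view of runtime verification concrete, here is a tiny (non-Rete) event-matching monitor in Python; it only mimics the flavor of firing rules against a growing set of facts and is in no way the LogFire DSL, whose actual interface is an internal Scala DSL.

        # A toy rule-based trace monitor: each rule is a (predicate, action) pair.
        def monitor(trace, rules):
            facts = set()                          # working memory of observed events
            for event in trace:
                facts.add(event)
                for predicate, action in rules:
                    if predicate(facts):
                        action(event, facts)

        errors = []
        rules = [
            # Rule: a resource must be granted before it is released.
            (lambda facts: ("release", "r1") in facts and ("grant", "r1") not in facts,
             lambda event, facts: errors.append("release before grant at %s" % (event,))),
        ]
        monitor([("request", "r1"), ("release", "r1"), ("grant", "r1")], rules)
        print(errors)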

  1. New method of verificating optical flat flatness

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    Optical flats are commonly used in optical testing instruments, and flatness is the most important of their form errors. As a measurement standard, the optical flat flatness (OFF) index needs good precision. Current measurement practice in China depends heavily on manual visual interpretation, characterizing flatness through discrete points. The efficiency and accuracy of this method cannot meet the demands of industrial development. In order to improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system that can obtain full surface information rapidly and efficiently while remaining in accordance with current national metrological verification procedures. This paper reviews the current optical flat verification method and solves the problems existing in previous tests by using a new method and its supporting software. Final results show that the new system improves verification efficiency and accuracy compared with the JJG 28-2000 metrological verification procedure method.
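    One common way to characterize flatness from a full-surface measurement is the peak-to-valley deviation of the height map from its best-fit plane; the routine below shows that computation generically and is not the algorithm of the verification system described above.

        import numpy as np

        def peak_to_valley_flatness(z):
            """Peak-to-valley deviation of a 2-D height map z from its least-squares plane."""
            ny, nx = z.shape
            x, y = np.meshgrid(np.arange(nx), np.arange(ny))
            A = np.column_stack([x.ravel(), y.ravel(), np.ones(z.size)])
            coeffs, *_ = np.linalg.lstsq(A, z.ravel(), rcond=None)   # fit z ~ a*x + b*y + c
            residual = z.ravel() - A @ coeffs
            return residual.max() - residual.min()

        surface = np.random.default_rng(1).normal(scale=5e-9, size=(64, 64))  # heights in metres
        print(peak_to_valley_flatness(surface))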

  2. Working memory mechanism in proportional quantifier verification.

    PubMed

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-12-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g., "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow dots". The second study reveals that both types of sentences are correlated with memory storage; however, only proportional sentences are associated with cognitive control. This result suggests that the cognitive mechanism underlying the verification of proportional quantifiers is crucially related to the integration process, in which an individual has to compare in memory the cardinalities of two sets. In the third study we find that the numerical distance between the two cardinalities that must be compared significantly influences verification time and accuracy. The results of our studies are discussed in the broader context of processing complex sentences. PMID:24374596
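    As a one-line illustration of what verification means computationally here, checking a proportional quantifier reduces to comparing two cardinalities held in memory; the toy check below (dot colours are invented) is meant only to make that integration step explicit.

        def more_than_half(dots, colour="blue"):
            """True iff the named colour outnumbers all the other colours combined."""
            count = sum(1 for dot in dots if dot == colour)
            return count > len(dots) - count

        print(more_than_half(["blue"] * 8 + ["yellow"] * 7))   # True: 8 blue versus 7 others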

  3. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  4. 17 CFR 232.302 - Signatures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 17 Commodity and Securities Exchanges 2 2013-04-01 2013-04-01 false Signatures. 232.302 Section 232.302 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION REGULATION S-T-GENERAL RULES AND REGULATIONS FOR ELECTRONIC FILINGS Preparation of Electronic Submissions § 232.302 Signatures. (a) Required signatures to, or within,...

  5. 48 CFR 4.102 - Contractor's signature.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Contractor's signature. 4... ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with an... be signed by that individual, and the signature shall be followed by the individual's typed,...

  6. Analyzing Online Behaviors, Roles, and Learning Communities via Online Discussions

    ERIC Educational Resources Information Center

    Yeh, Yu-Chu

    2010-01-01

    Online learning communities are an important means of sharing and creating knowledge. Online behaviors and online roles can reveal how online learning communities function. However, no study has elucidated the relationships among online behaviors, online roles, and online learning communities. In this study, 32 preservice teachers participated in…

  7. Evolutionary Signatures of River Networks

    NASA Astrophysics Data System (ADS)

    Paik, K.

    2014-12-01

    River networks exhibit fractal characteristics, and it has long been wondered how such regular patterns are formed. This subject has been actively investigated mainly by two great schools of thought, i.e., chance and organization. Along this line, several fundamental questions have been partially addressed or remain open. They include whether river networks pursue certain optimal conditions, and if so, what the ultimate optimality signature is. Hydrologists have traditionally perceived this issue from fluvial-oriented perspectives. Nevertheless, geological processes can be more dominant in the formation of river networks in reality. To shed new light on this subject, it is necessary to better understand the complex feedbacks between various processes over different time scales, and eventually the emerging characteristic signature. Here, I will present highlights of earlier studies along this line and some noteworthy approaches being tried recently.

  8. Signatures of topological Josephson junctions

    NASA Astrophysics Data System (ADS)

    Peng, Yang; Pientka, Falko; Berg, Erez; Oreg, Yuval; von Oppen, Felix

    2016-08-01

    Quasiparticle poisoning and diabatic transitions may significantly narrow the window for the experimental observation of the 4 π -periodic dc Josephson effect predicted for topological Josephson junctions. Here, we show that switching-current measurements provide accessible and robust signatures for topological superconductivity which persist in the presence of quasiparticle poisoning processes. Such measurements provide access to the phase-dependent subgap spectrum and Josephson currents of the topological junction when incorporating it into an asymmetric SQUID together with a conventional Josephson junction with large critical current. We also argue that pump-probe experiments with multiple current pulses can be used to measure the quasiparticle poisoning rates of the topological junction. The proposed signatures are particularly robust, even in the presence of Zeeman fields and spin-orbit coupling, when focusing on short Josephson junctions. Finally, we also consider microwave excitations of short topological Josephson junctions which may complement switching-current measurements.

  9. Signatures of a Shadow Biosphere

    NASA Astrophysics Data System (ADS)

    Davies, Paul C. W.; Benner, Steven A.; Cleland, Carol E.; Lineweaver, Charles H.; McKay, Christopher P.; Wolfe-Simon, Felisa

    2009-03-01

    Astrobiologists are aware that extraterrestrial life might differ from known life, and considerable thought has been given to possible signatures associated with weird forms of life on other planets. So far, however, very little attention has been paid to the possibility that our own planet might also host communities of weird life. If life arises readily in Earth-like conditions, as many astrobiologists contend, then it may well have formed many times on Earth itself, which raises the question whether one or more shadow biospheres have existed in the past or still exist today. In this paper, we discuss possible signatures of weird life and outline some simple strategies for seeking evidence of a shadow biosphere.

  10. Signatures of a shadow biosphere.

    PubMed

    Davies, Paul C W; Benner, Steven A; Cleland, Carol E; Lineweaver, Charles H; McKay, Christopher P; Wolfe-Simon, Felisa

    2009-03-01

    Astrobiologists are aware that extraterrestrial life might differ from known life, and considerable thought has been given to possible signatures associated with weird forms of life on other planets. So far, however, very little attention has been paid to the possibility that our own planet might also host communities of weird life. If life arises readily in Earth-like conditions, as many astrobiologists contend, then it may well have formed many times on Earth itself, which raises the question whether one or more shadow biospheres have existed in the past or still exist today. In this paper, we discuss possible signatures of weird life and outline some simple strategies for seeking evidence of a shadow biosphere. PMID:19292603

  11. Polarization signatures of airborne particulates

    NASA Astrophysics Data System (ADS)

    Raman, Prashant; Fuller, Kirk A.; Gregory, Don A.

    2013-07-01

    Exploratory research has been conducted with the aim of completely determining the polarization signatures of selected particulates as a function of wavelength. This may lead to a better understanding of the interaction between electromagnetic radiation and such materials, perhaps leading to the point detection of bio-aerosols present in the atmosphere. To this end, a polarimeter capable of measuring the complete Mueller matrix of highly scattering samples in transmission and reflection (with good spectral resolution from 300 to 1100 nm) has been developed. The polarization properties of Bacillus subtilis (surrogate for anthrax spore) are compared to ambient particulate matter species such as pollen, dust, and soot. Differentiating features in the polarization signatures of these samples have been identified, thus demonstrating the potential applicability of this technique for the detection of bio-aerosol in the ambient atmosphere.

  12. DETECTORS AND EXPERIMENTAL METHODS: Online measurement of the BEPC II background using RadFET dosimeters

    NASA Astrophysics Data System (ADS)

    Gong, Hui; Li, Jin; Gong, Guang-Hua; Li, Yu-Xiong; Hou, Lei; Shao, Bei-Bei

    2009-09-01

    To monitor the integral dose deposited in the BESIII electromagnetic calorimeter whose performance degrades due to exposure to the BEPC II background, a 400 nm IMPL RadFET dosimeter-based integral dose online monitor system is built. After calibration with the 60Co source and verification with TLD in the pulse radiation fields, an experiment was arranged to measure the BEPC II background online. The results are presented.

  13. Nonlinear control of magnetic signatures

    NASA Astrophysics Data System (ADS)

    Niemoczynski, Bogdan

    Magnetic properties of ferrite structures are known to cause fluctuations in Earth's magnetic field around the object. These fluctuations are known as the object's magnetic signature and are unique based on the object's geometry and material. It is common practice to neutralize magnetic signatures periodically at certain time intervals; however, there is growing interest in developing real-time degaussing systems for various applications. Development of a real-time degaussing system is a challenging problem because of magnetic hysteresis and difficulties in measurement or estimation of near-field flux data. The goal of this research is to develop a real-time feedback control system that can be used to minimize magnetic signatures for ferrite structures. Experimental work on controlling the magnetic signature of a cylindrical steel shell structure with a magnetic disturbance provided evidence that the control process substantially increased the interior magnetic flux. This means near-field estimation using interior sensor data is likely to be inaccurate. Follow-up numerical work for rectangular and cylindrical cross sections investigated variations in shell wall flux density under a variety of ambient excitations and applied disturbances. Results showed that magnetic disturbances could corrupt interior sensor data and that magnetic shielding due to the shell walls makes the interior very sensitive to noise. The magnetic flux inside the shell wall showed little variation due to inner disturbances, and its high base value makes it less susceptible to noise. This research proceeds to describe a nonlinear controller that uses the shell wall data as an input. A nonlinear plant model of magnetics is developed using a constant tau to represent domain rotation lag and a gain function k to describe the magnetic hysteresis curve for the shell wall. The model is justified by producing hysteresis curves for multiple materials, matching experimental data using a particle swarm algorithm, and
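    The abstract describes a plant model built from a domain-rotation lag constant tau and a hysteresis gain function k; the sketch below is one plausible discrete-time reading of such a first-order lag model, with an assumed tanh-shaped gain, offered purely as an interpretation rather than the author's actual model.

        import math

        def simulate_magnetization(applied_field, tau=0.5, dt=0.01, k_sat=1.2):
            """First-order lag toward a saturating equilibrium magnetization (illustrative).

            dM/dt = (k(H) - M) / tau, with k(H) an assumed tanh-shaped gain curve.
            """
            M, history = 0.0, []
            for H in applied_field:
                M_eq = k_sat * math.tanh(H)      # assumed saturating gain function k(H)
                M += dt * (M_eq - M) / tau       # domain rotation lags the applied field
                history.append(M)
            return history

        # Sinusoidal excitation; the lag makes M trace a loop against H, i.e. hysteresis-like behaviour.
        field = [math.sin(2 * math.pi * 0.5 * n * 0.01) for n in range(400)]
        print(simulate_magnetization(field)[-1])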

  14. Microbial Lifestyle and Genome Signatures

    PubMed Central

    Dutta, Chitra; Paul, Sandip

    2012-01-01

    Microbes are known for their unique ability to adapt to varying lifestyle and environment, even to the extreme or adverse ones. The genomic architecture of a microbe may bear the signatures not only of its phylogenetic position, but also of the kind of lifestyle to which it is adapted. The present review aims to provide an account of the specific genome signatures observed in microbes acclimatized to distinct lifestyles or ecological niches. Niche-specific signatures identified at different levels of microbial genome organization like base composition, GC-skew, purine-pyrimidine ratio, dinucleotide abundance, codon bias, oligonucleotide composition etc. have been discussed. Among the specific cases highlighted in the review are the phenomena of genome shrinkage in obligatory host-restricted microbes, genome expansion in strictly intra-amoebal pathogens, strand-specific codon usage in intracellular species, acquisition of genome islands in pathogenic or symbiotic organisms, discriminatory genomic traits of marine microbes with distinct trophic strategies, and conspicuous sequence features of certain extremophiles like those adapted to high temperature or high salinity. PMID:23024607
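    Several of the niche-specific signatures mentioned (base composition, GC-skew, dinucleotide abundance) are simple counting statistics; the helper below computes two of them for a raw DNA string, as a generic illustration rather than any published pipeline.

        from collections import Counter

        def genome_signature(seq):
            """Return GC content and GC-skew, (G - C) / (G + C), for a DNA string."""
            counts = Counter(seq.upper())
            g, c = counts["G"], counts["C"]
            gc_content = (g + c) / max(len(seq), 1)
            gc_skew = (g - c) / (g + c) if (g + c) else 0.0
            return gc_content, gc_skew

        print(genome_signature("ATGCGGCCTAGGCG"))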

  15. Selection signatures in Shetland ponies.

    PubMed

    Frischknecht, M; Flury, C; Leeb, T; Rieder, S; Neuditschko, M

    2016-06-01

    Shetland ponies were selected for numerous traits including small stature, strength, hardiness and longevity. Despite the different selection criteria, Shetland ponies are well known for their small stature. We performed a selection signature analysis including genome-wide SNPs of 75 Shetland ponies and 76 large-sized horses. Based upon this dataset, we identified a selection signature on equine chromosome (ECA) 1 between 103.8 Mb and 108.5 Mb. A total of 33 annotated genes are located within this interval including the IGF1R gene at 104.2 Mb and the ADAMTS17 gene at 105.4 Mb. These two genes are well known to have a major impact on body height in numerous species including humans. Homozygosity mapping in the Shetland ponies identified a region with increased homozygosity between 107.4 Mb and 108.5 Mb. None of the annotated genes in this region have so far been associated with height. Thus, we cannot exclude the possibility that the identified selection signature on ECA1 is associated with some trait other than height, for which Shetland ponies were selected. PMID:26857482

  16. Comparison of MMW ground vehicle signatures

    NASA Astrophysics Data System (ADS)

    Saylor, Annie V.; Kissell, Ann

    2006-05-01

    A continuing question asked of MMW target signature and model providers is the applicability of data from one frequency band to another. Recent monopulse Ka-band ground target signature measurements made by US Army programs provide an opportunity to do an in-depth comparison of signatures of several ground vehicles. The vehicles measured correspond to those measured at W-band by another Army program. This paper provides a comparison of vehicle signatures produced by models derived by AMRDEC from the measurements. The results have implications for missile programs that do not have an extensive measurement budget but require target signatures and models for algorithm development.

  17. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions on which to experiment with and test current formal methods for intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  18. MCFC power plant system verification

    SciTech Connect

    Farooque, M.; Bernard, R.; Doyon, J.; Paetsch, L.; Patel, P.; Skok, A.; Yuh, C.

    1993-11-01

    In pursuit of commercialization, efforts are underway to: (1) advance the technology base by enhancing performance and demonstrating endurance, (2) scale up the stack to the full area and height, (3) acquire stack manufacturing capability and experience, (4) establish capability as well as gain experience for power plant system testing of the full-height carbonate fuel cell stack, and (5) define the power plant design and develop critical subsystem components. All the major project objectives have already been attained. Over the last year, significant progress has been achieved in establishing the full-height stack design, gaining stack manufacturing and system integrated testing experience, and verifying the major equipment design in power plant system tests. In this paper, recent progress on stack scaleup, demonstration testing, BOP verification, and stack endurance is presented.

  19. Formal Definition and Construction of Nominative Signature

    NASA Astrophysics Data System (ADS)

    Liu, Dennis Y. W.; Wong, Duncan S.; Huang, Xinyi; Wang, Guilin; Huang, Qiong; Mu, Yi; Susilo, Willy

    Since the introduction of nominative signature in 1996, there are three problems that have still not been solved. First, there is no convincing application proposed; second, there is no formal security model available; and third, there is no proven secure scheme constructed, given that all the previous schemes have already been found flawed. In this paper, we give positive answers to these problems. First, we illustrate that nominative signature is a better tool for building user certification systems which were originally implemented using universal designated-verifier signature. Second, we propose a formal definition and adversarial model for nominative signature. Third, we show that Chaum's undeniable signature can be transformed to an efficient nominative signature by simply using a standard signature. The security of our transformation can be proven under the standard number-theoretic assumption.

  20. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
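    As a sketch of the kind of relational structure such a tool implies (not the actual ISWE database schema, which the paper does not spell out here), the snippet below creates a minimal requirements/verification/compliance layout in SQLite and runs one status query; all table names, columns, and records are invented for illustration.

        import sqlite3

        conn = sqlite3.connect(":memory:")
        conn.executescript("""
        CREATE TABLE requirement (id TEXT PRIMARY KEY, text TEXT, parent_id TEXT);
        CREATE TABLE verification (id INTEGER PRIMARY KEY, req_id TEXT REFERENCES requirement(id),
                                   method TEXT, success_criteria TEXT);
        CREATE TABLE compliance (verif_id INTEGER REFERENCES verification(id), status TEXT);
        """)
        conn.execute("INSERT INTO requirement VALUES ('R-001', 'Weld chamber shall hold vacuum', NULL)")
        conn.execute("INSERT INTO verification VALUES (1, 'R-001', 'Test', 'Leak rate below limit')")
        conn.execute("INSERT INTO compliance VALUES (1, 'Open')")

        # Status report: every requirement with its verification method and compliance state.
        query = ("SELECT r.id, v.method, c.status FROM requirement r "
                 "JOIN verification v ON v.req_id = r.id "
                 "JOIN compliance c ON c.verif_id = v.id")
        for row in conn.execute(query):
            print(row)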

  1. Brain oscillatory signatures of motor tasks.

    PubMed

    Ramos-Murguialday, Ander; Birbaumer, Niels

    2015-06-01

    Noninvasive brain-computer-interfaces (BCI) coupled with prosthetic devices were recently introduced in the rehabilitation of chronic stroke and other disorders of the motor system. These BCI systems and motor rehabilitation in general involve several motor tasks for training. This study investigates the neurophysiological bases of an EEG-oscillation-driven BCI combined with a neuroprosthetic device to define the specific oscillatory signature of the BCI task. Controlling movements of a hand robotic orthosis with motor imagery of the same movement generates sensorimotor rhythm oscillation changes and involves three elements of tasks also used in stroke motor rehabilitation: passive and active movement, motor imagery, and motor intention. We recorded EEG while nine healthy participants performed five different motor tasks consisting of closing and opening of the hand as follows: 1) motor imagery without any external feedback and without overt hand movement, 2) motor imagery that moves the orthosis proportional to the produced brain oscillation change with online proprioceptive and visual feedback of the hand moving through a neuroprosthetic device (BCI condition), 3) passive and 4) active movement of the hand with feedback (seeing and feeling the hand moving), and 5) rest. During the BCI condition, participants received contingent online feedback of the decrease of power of the sensorimotor rhythm, which induced orthosis movement and therefore proprioceptive and visual information from the moving hand. We analyzed brain activity during the five conditions using time-frequency domain bootstrap-based statistical comparisons and Morlet transforms. Activity during rest was used as a reference. Significant contralateral and ipsilateral event-related desynchronization of sensorimotor rhythm was present during all motor tasks, largest in contralateral-postcentral, medio-central, and ipsilateral-precentral areas identifying the ipsilateral precentral cortex as an integral

  2. Brain oscillatory signatures of motor tasks

    PubMed Central

    Birbaumer, Niels

    2015-01-01

    Noninvasive brain-computer-interfaces (BCI) coupled with prosthetic devices were recently introduced in the rehabilitation of chronic stroke and other disorders of the motor system. These BCI systems and motor rehabilitation in general involve several motor tasks for training. This study investigates the neurophysiological bases of an EEG-oscillation-driven BCI combined with a neuroprosthetic device to define the specific oscillatory signature of the BCI task. Controlling movements of a hand robotic orthosis with motor imagery of the same movement generates sensorimotor rhythm oscillation changes and involves three elements of tasks also used in stroke motor rehabilitation: passive and active movement, motor imagery, and motor intention. We recorded EEG while nine healthy participants performed five different motor tasks consisting of closing and opening of the hand as follows: 1) motor imagery without any external feedback and without overt hand movement, 2) motor imagery that moves the orthosis proportional to the produced brain oscillation change with online proprioceptive and visual feedback of the hand moving through a neuroprosthetic device (BCI condition), 3) passive and 4) active movement of the hand with feedback (seeing and feeling the hand moving), and 5) rest. During the BCI condition, participants received contingent online feedback of the decrease of power of the sensorimotor rhythm, which induced orthosis movement and therefore proprioceptive and visual information from the moving hand. We analyzed brain activity during the five conditions using time-frequency domain bootstrap-based statistical comparisons and Morlet transforms. Activity during rest was used as a reference. Significant contralateral and ipsilateral event-related desynchronization of sensorimotor rhythm was present during all motor tasks, largest in contralateral-postcentral, medio-central, and ipsilateral-precentral areas identifying the ipsilateral precentral cortex as an integral

  3. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle, composition, and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Because of the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The verification device is based on the master meter method and verifies LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching the international advanced level. The LNG dispenser verification device will therefore promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacturing.
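    The master meter method amounts to comparing the dispenser's indicated quantity against a reference meter reading and checking the relative error against a tolerance; the check below illustrates only that arithmetic, and the 1 % maximum permissible error is an assumed figure, not taken from the verification regulation.

        def dispenser_error(indicated_kg, master_meter_kg):
            """Relative indication error of the dispenser against the master (reference) meter."""
            return (indicated_kg - master_meter_kg) / master_meter_kg

        def passes_verification(indicated_kg, master_meter_kg, max_error=0.01):
            """Pass when the absolute error is within the assumed maximum permissible error."""
            return abs(dispenser_error(indicated_kg, master_meter_kg)) <= max_error

        print(dispenser_error(50.4, 50.0), passes_verification(50.4, 50.0))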

  4. 77 FR 40612 - Notice to All Interested Parties of the Termination of the Receivership of 10375, Signature Bank...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... From the Federal Register Online via the Government Publishing Office FEDERAL DEPOSIT INSURANCE CORPORATION Notice to All Interested Parties of the Termination of the Receivership of 10375, Signature Bank, Windsor, CO Notice is hereby given that the Federal Deposit Insurance Corporation (``FDIC'') as...

  5. Curricular Innovation and Digitisation at a Mega University in the Developing World--The UNISA "Signature Course" Project

    ERIC Educational Resources Information Center

    Baijnath, Narend

    2014-01-01

    As part of the endeavor to reposition itself in the open distance and e-learning arena, the University of South Africa (UNISA) has designed and developed six modular courses (one module per College) referred to as "Signature Courses". The focus of these modules is on a student-centred online teaching and learning approach; extensive…

  6. Genetic signatures of heroin addiction

    PubMed Central

    Chen, Shaw-Ji; Liao, Ding-Lieh; Shen, Tsu-Wang; Yang, Hsin-Chou; Chen, Kuang-Chi; Chen, Chia-Hsiang

    2016-01-01

    Abstract Heroin addiction is a complex psychiatric disorder with a chronic course and a high relapse rate, which results from the interaction between genetic and environmental factors. Heroin addiction has a substantial heritability in its etiology; hence, identification of individuals with a high genetic propensity to heroin addiction may help prevent the occurrence and relapse of heroin addiction and its complications. The study aimed to identify a small set of genetic signatures that may reliably predict the individuals with a high genetic propensity to heroin addiction. We first measured the transcript level of 13 genes (RASA1, PRKCB, PDK1, JUN, CEBPG, CD74, CEBPB, AUTS2, ENO2, IMPDH2, HAT1, MBD1, and RGS3) in lymphoblastoid cell lines in a sample of 124 male heroin addicts and 124 male control subjects using real-time quantitative PCR. Seven genes (PRKCB, PDK1, JUN, CEBPG, CEBPB, ENO2, and HAT1) showed significant differential expression between the 2 groups. Further analysis using 3 statistical methods including logistic regression analysis, support vector machine learning analysis, and a computer software BIASLESS revealed that a set of 4 genes (JUN, CEBPB, PRKCB, ENO2, or CEBPG) could predict the diagnosis of heroin addiction with the accuracy rate around 85% in our dataset. Our findings support the idea that it is possible to identify genetic signatures of heroin addiction using a small set of expressed genes. However, the study can only be considered as a proof-of-concept study. As the establishment of lymphoblastoid cell line is a laborious and lengthy process, it would be more practical in clinical settings to identify genetic signatures for heroin addiction directly from peripheral blood cells in the future study. PMID:27495086
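    To show how a small gene panel of the kind described can feed a classifier such as logistic regression, here is a generic scikit-learn sketch on synthetic expression values; the data are random, the four-gene panel is only an echo of the abstract, and the study's 85% accuracy figure is not reproduced.

        import numpy as np
        from sklearn.linear_model import LogisticRegression
        from sklearn.model_selection import cross_val_score

        rng = np.random.default_rng(0)
        # Synthetic qPCR-style expression for a 4-gene panel, 124 cases and 124 controls.
        X = rng.normal(size=(248, 4))
        y = np.concatenate([np.ones(124), np.zeros(124)])
        X[y == 1] += 0.8                           # inject an artificial group difference

        clf = LogisticRegression()
        scores = cross_val_score(clf, X, y, cv=5)  # cross-validated classification accuracy
        print(scores.mean())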

  7. Genetic signatures of heroin addiction.

    PubMed

    Chen, Shaw-Ji; Liao, Ding-Lieh; Shen, Tsu-Wang; Yang, Hsin-Chou; Chen, Kuang-Chi; Chen, Chia-Hsiang

    2016-08-01

    Heroin addiction is a complex psychiatric disorder with a chronic course and a high relapse rate, which results from the interaction between genetic and environmental factors. Heroin addiction has a substantial heritability in its etiology; hence, identification of individuals with a high genetic propensity to heroin addiction may help prevent the occurrence and relapse of heroin addiction and its complications. The study aimed to identify a small set of genetic signatures that may reliably predict the individuals with a high genetic propensity to heroin addiction. We first measured the transcript level of 13 genes (RASA1, PRKCB, PDK1, JUN, CEBPG, CD74, CEBPB, AUTS2, ENO2, IMPDH2, HAT1, MBD1, and RGS3) in lymphoblastoid cell lines in a sample of 124 male heroin addicts and 124 male control subjects using real-time quantitative PCR. Seven genes (PRKCB, PDK1, JUN, CEBPG, CEBPB, ENO2, and HAT1) showed significant differential expression between the 2 groups. Further analysis using 3 statistical methods including logistic regression analysis, support vector machine learning analysis, and a computer software BIASLESS revealed that a set of 4 genes (JUN, CEBPB, PRKCB, ENO2, or CEBPG) could predict the diagnosis of heroin addiction with the accuracy rate around 85% in our dataset. Our findings support the idea that it is possible to identify genetic signatures of heroin addiction using a small set of expressed genes. However, the study can only be considered as a proof-of-concept study. As the establishment of lymphoblastoid cell line is a laborious and lengthy process, it would be more practical in clinical settings to identify genetic signatures for heroin addiction directly from peripheral blood cells in the future study. PMID:27495086

  8. The Online Underworld.

    ERIC Educational Resources Information Center

    Scrogan, Len

    1988-01-01

    Discusses some of the misuses of telecommunicating using school computers, including online piracy, hacking, phreaking, online crime, and destruction boards. Suggests ways that schools can deal with these problems. (TW)

  9. Gut microbiota signatures of longevity.

    PubMed

    Kong, Fanli; Hua, Yutong; Zeng, Bo; Ning, Ruihong; Li, Ying; Zhao, Jiangchao

    2016-09-26

    An aging global population poses substantial challenges to society [1]. Centenarians are a model for healthy aging because they have reached the extreme limit of life by escaping, surviving, or delaying chronic diseases [2]. The genetics of centenarians have been extensively examined [3], but less is known about their gut microbiotas. Recently, Biagi et al.[4] characterized the gut microbiota in Italian centenarians and semi-supercentenarians. Here, we compare the gut microbiota of Chinese long-living people with younger age groups, and with the results from the Italian population [4], to identify gut-microbial signatures of healthy aging. PMID:27676296

  10. Quantum signatures of chimera states

    NASA Astrophysics Data System (ADS)

    Bastidas, V. M.; Omelchenko, I.; Zakharova, A.; Schöll, E.; Brandes, T.

    2015-12-01

    Chimera states are complex spatiotemporal patterns in networks of identical oscillators, characterized by the coexistence of synchronized and desynchronized dynamics. Here we propose to extend the phenomenon of chimera states to the quantum regime, and uncover intriguing quantum signatures of these states. We calculate the quantum fluctuations about semiclassical trajectories and demonstrate that chimera states in the quantum regime can be characterized by bosonic squeezing, weighted quantum correlations, and measures of mutual information. Our findings reveal the relation of chimera states to quantum information theory, and give promising directions for experimental realization of chimera states in quantum systems.

  11. Spectroscopic signature for ferroelectric ice

    NASA Astrophysics Data System (ADS)

    Wójcik, Marek J.; Gług, Maciej; Boczar, Marek; Boda, Łukasz

    2014-09-01

    Various forms of ice exist within our galaxy. A particularly intriguing type of ice, 'ferroelectric ice', was discovered experimentally and is stable at temperatures below 72 K. This form of ice can generate enormous electric fields and can play an important role in planetary formation. In this letter we present Car-Parrinello simulations of the infrared spectra of ferroelectric ice and compare them with the spectra of hexagonal ice. The librational region of the spectra can be treated as a spectroscopic signature of ice XI and can help to identify ferroelectric ice in the Universe.

  12. Quantum signatures of chimera states.

    PubMed

    Bastidas, V M; Omelchenko, I; Zakharova, A; Schöll, E; Brandes, T

    2015-12-01

    Chimera states are complex spatiotemporal patterns in networks of identical oscillators, characterized by the coexistence of synchronized and desynchronized dynamics. Here we propose to extend the phenomenon of chimera states to the quantum regime, and uncover intriguing quantum signatures of these states. We calculate the quantum fluctuations about semiclassical trajectories and demonstrate that chimera states in the quantum regime can be characterized by bosonic squeezing, weighted quantum correlations, and measures of mutual information. Our findings reveal the relation of chimera states to quantum information theory, and give promising directions for experimental realization of chimera states in quantum systems.

  13. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  14. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities is dependent in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, the degree of visual observation being performed, and for documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  15. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  16. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  17. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis...

  18. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... plan or system; (f) Direct observation or measurement at a CCP; (g) Sample collection and analysis...

  19. PERFORMANCE VERIFICATION OF WATER SECURITY - RELATED TECHNOLOGIES

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program's Advanced Monitoring Systems (AMS) Center has been charged by EPA to verify the performance of commercially available monitoring technologies for air, water, soil. Four categories of water security technologies (most of whi...

  20. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  1. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  2. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and to maintain situational awareness . The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity. The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation

  3. Quantum broadcasting multiple blind signature with constant size

    NASA Astrophysics Data System (ADS)

    Xiao, Min; Li, Zhenli

    2016-09-01

    Using quantum homomorphic signature in quantum network, we propose a quantum broadcasting multiple blind signature scheme. Different from classical signature and current quantum signature schemes, the multi-signature proposed in our scheme is not generated by simply putting the individual signatures together, but by aggregating the individual signatures based on homomorphic property. Therefore, the size of the multi-signature is constant. Furthermore, based on a wide range of investigation for the security of existing quantum signature protocols, our protocol is designed to resist possible forgery attacks against signature and message from the various attack sources and disavowal attacks from participants.

  4. Quantum broadcasting multiple blind signature with constant size

    NASA Astrophysics Data System (ADS)

    Xiao, Min; Li, Zhenli

    2016-06-01

    Using quantum homomorphic signature in quantum network, we propose a quantum broadcasting multiple blind signature scheme. Different from classical signature and current quantum signature schemes, the multi-signature proposed in our scheme is not generated by simply putting the individual signatures together, but by aggregating the individual signatures based on homomorphic property. Therefore, the size of the multi-signature is constant. Furthermore, based on a wide range of investigation for the security of existing quantum signature protocols, our protocol is designed to resist possible forgery attacks against signature and message from the various attack sources and disavowal attacks from participants.

  5. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.
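
    A minimal sketch of the kind of cross-check such a library automates, assuming hypothetical trajectory arrays from two codes and tolerances chosen here purely for illustration:

    ```python
    # Minimal sketch (not the library's actual interface): cross-check trajectories
    # produced by two multibody codes for a representative test problem.
    import numpy as np

    def cross_check(reference: np.ndarray, candidate: np.ndarray,
                    rel_tol: float = 1e-3, abs_tol: float = 1e-6) -> bool:
        """Return True when the candidate trajectory agrees with the reference
        within the given tolerances at every time step."""
        return np.allclose(candidate, reference, rtol=rel_tol, atol=abs_tol)

    # Hypothetical example: joint angle of a robot-arm test case from two codes.
    t = np.linspace(0.0, 1.0, 101)
    program_a = np.sin(2 * np.pi * t)                       # stand-in for code A
    program_b = program_a + 1e-5 * np.random.randn(t.size)  # stand-in for code B

    print("cross-check passed:", cross_check(program_a, program_b))
    ```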

  6. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  7. Reflectors as Online Extraverts?

    ERIC Educational Resources Information Center

    Downing, Kevin; Chim, Tat Mei

    2004-01-01

    Increasingly, online learning is perceived as an effective method of instruction. Much recent educational research has focused on examining the purposes and situations for which online education is best suited. In this paper, students enrolled in two online courses are compared with their peers enrolled in equivalent classroom-based courses to…

  8. Online Training in Australia

    ERIC Educational Resources Information Center

    Kuzic, Joze

    2013-01-01

    On-line training is becoming an interesting phenomenon in Australia and has attracted a lot of interest across many industries and businesses (Chan and Ngai, 2007). The research reported here looks at the use of online training in corporations in Australia. It focuses on two aspects of online training, the factors that "warrant" its…

  9. Effective Online Teachers

    ERIC Educational Resources Information Center

    Muirhead, Brent

    2006-01-01

    Effective online teaching is a popular topic in today's educational technology journals due to the vital role that educators play in the teaching and learning process. The author will provide insights into effective online teachers and highlight training and mentoring practices for online instructors at the University of Phoenix.

  10. Assessing Online Learning

    ERIC Educational Resources Information Center

    Comeaux, Patricia, Ed.

    2004-01-01

    Students in traditional as well as online classrooms need more than grades from their instructors--they also need meaningful feedback to help bridge their academic knowledge and skills with their daily lives. With the increasing number of online learning classrooms, the question of how to consistently assess online learning has become increasingly…

  11. Perceptions of Online Instruction

    ERIC Educational Resources Information Center

    Fish, Wade W.; Gill, Peggy B.

    2009-01-01

    Online instruction has influenced how higher education redefines teaching as universities understand the significance and move towards the paradigm of online teaching and learning. Despite the benefits of online teaching, many university faculty members tend to gravitate toward instructional practices that are most comfortable to them. The purpose…

  12. Developing Online Doctoral Programmes

    ERIC Educational Resources Information Center

    Chipere, Ngoni

    2015-01-01

    The objectives of the study were to identify best practices in online doctoral programming and to synthesise these practices into a framework for developing online doctoral programmes. The field of online doctoral studies is nascent and presents challenges for conventional forms of literature review. The literature was therefore reviewed using a…

  13. Implementing Online Physical Education

    ERIC Educational Resources Information Center

    Mohnsen, Bonnie

    2012-01-01

    Online physical education, although seemingly an oxymoron, appears to be the wave of the future at least for some students. The purpose of this article is to explore research and options for online learning in physical education and to examine a curriculum, assessment, and instructional model for online learning. The article examines how physical…

  14. Automated UF6 Cylinder Enrichment Assay: Status of the Hybrid Enrichment Verification Array (HEVA) Project: POTAS Phase II

    SciTech Connect

    Jordan, David V.; Orton, Christopher R.; Mace, Emily K.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Smith, Leon E.

    2012-06-01

    Pacific Northwest National Laboratory (PNNL) intends to automate the UF6 cylinder nondestructive assay (NDA) verification currently performed by the International Atomic Energy Agency (IAEA) at enrichment plants. PNNL is proposing the installation of a portal monitor at a key measurement point to positively identify each cylinder, measure its mass and enrichment, store the data along with operator inputs in a secure database, and maintain continuity of knowledge on measured cylinders until inspector arrival. This report summarizes the status of the research and development of an enrichment assay methodology supporting the cylinder verification concept. The enrichment assay approach exploits a hybrid of two passively-detected ionizing-radiation signatures: the traditional enrichment meter signature (186-keV photon peak area) and a non-traditional signature, manifested in the high-energy (3 to 8 MeV) gamma-ray continuum, generated by neutron emission from UF6. PNNL has designed, fabricated, and field-tested several prototype assay sensor packages in an effort to demonstrate proof-of-principle for the hybrid assay approach, quantify the expected assay precision for various categories of cylinder contents, and assess the potential for unsupervised deployment of the technology in a portal-monitor form factor. We refer to recent sensor-package prototypes as the Hybrid Enrichment Verification Array (HEVA). The report provides an overview of the assay signatures and summarizes the results of several HEVA field measurement campaigns on populations of Type 30B UF6 cylinders containing low-enriched uranium (LEU), natural uranium (NU), and depleted uranium (DU). Approaches to performance optimization of the assay technique via radiation transport modeling are briefly described, as are spectroscopic and data-analysis algorithms.
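
    As a rough illustration of how two HEVA-style signatures could be fused, the sketch below fits a simple linear calibration from peak area and continuum rate to enrichment. The calibration points, the linear form, and the function names are invented for illustration and are not PNNL's algorithm:

    ```python
    # Illustrative sketch only: fuse the two signatures (186-keV peak area and
    # 3-8 MeV continuum rate) in a hypothetical linear enrichment calibration.
    import numpy as np

    # Hypothetical calibration cylinders: columns = [peak_area, continuum_rate],
    # target = assumed 235U enrichment (wt%); values are invented.
    X = np.array([[120., 40.], [260., 55.], [410., 72.], [800., 110.]])
    y = np.array([0.3, 0.7, 1.2, 2.5])

    # Least-squares fit of enrichment = a*peak + b*continuum + c
    A = np.column_stack([X, np.ones(len(X))])
    coef, *_ = np.linalg.lstsq(A, y, rcond=None)

    def estimate_enrichment(peak_area: float, continuum_rate: float) -> float:
        return float(np.dot([peak_area, continuum_rate, 1.0], coef))

    print(round(estimate_enrichment(500., 80.), 2), "wt% 235U (illustrative)")
    ```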

  15. Theoretical Characterization of Visual Signatures

    NASA Astrophysics Data System (ADS)

    Kashinski, D. O.; Chase, G. M.; di Nallo, O. E.; Scales, A. N.; Vanderley, D. L.; Byrd, E. F. C.

    2015-05-01

    We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet, and infrared spectra, as well as other properties, of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. Quantum chemistry methods at various levels of sophistication have been employed to optimize molecular geometries, compute unscaled vibrational frequencies, and determine the optical spectra of specific gas-phase species. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). A full statistical analysis and reliability assessment of computational results is currently underway. A comparison of theoretical results to experimental values found in the literature is used to assess any effects of functional choice and basis set on calculation accuracy. The status of this work will be presented at the conference. Work supported by the ARL, DoD HPCMP, and USMA.

  16. Update on PIN or Signature

    NASA Astrophysics Data System (ADS)

    Matyas, Vashek

    We promised a year back some data on the experiment that we ran with chip and PIN. If you recall, it was the first phase that we reported on here last year, where we used the University bookstore, and two PIN pads, one with very solid privacy shielding, the other one without any. We ran 17 people through the first one, 15 people through the second one, and we also had the students do, about half of them forging the signature, half of them signing their own signature, on the back of the card that is used for purchasing books, or whatever. We had a second phase of the experiment, after long negotiations, and very complicated logistics, with a supermarket in Brno where we were able to do anything that we wanted through the experiment for five hours on the floor, with only the supermarket manager, the head of security, and the camera operators knowing about the experiment. So the shop assistants, the ground floor security, everybody basically on the floor, did not know about the experiment. That was one of the reasons why the supermarket, or management, agreed to take part; they wanted to control their own internal security procedures.

  17. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

    The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or especially during a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. In one study, interferometric measurements of the mechanical response of the optical fiber sensors to seal integrity were investigated. In a second study, the optical fiber was integrated into a typical vacuum chamber and feasibility studies on microbend experiments in the vacuum chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber using finite element analysis software by Algor.

  18. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  19. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are not widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  20. (Convertible) Undeniable Signatures Without Random Oracles

    NASA Astrophysics Data System (ADS)

    Yuen, Tsz Hon; Au, Man Ho; Liu, Joseph K.; Susilo, Willy

    We propose a convertible undeniable signature scheme without random oracles. Our construction is based on Waters' and Kurosawa and Heng's schemes that were proposed in Eurocrypt 2005. The security of our scheme is based on the CDH and the decision linear assumption. Comparing only the part of undeniable signatures, our scheme uses more standard assumptions than the existing undeniable signatures without random oracles due to Laguillamie and Vergnaud.

  1. Narrow terahertz attenuation signatures in Bacillus thuringiensis.

    PubMed

    Zhang, Weidong; Brown, Elliott R; Viveros, Leamon; Burris, Kellie P; Stewart, C Neal

    2014-10-01

    Terahertz absorption signatures from culture-cultivated Bacillus thuringiensis were measured with a THz photomixing spectrometer operating from 400 to 1200 GHz. We observe two distinct signatures centered at ∼955 and 1015 GHz, and attribute them to the optically coupled particle vibrational resonance (surface phonon-polariton) of Bacillus spores. This demonstrates the potential of the THz attenuation signatures as "fingerprints" for label-free biomolecular detection.

  2. Cryptanalysis of Quantum Blind Signature Scheme

    NASA Astrophysics Data System (ADS)

    Zuo, Huijuan

    2013-01-01

    In this paper, we study the cryptanalysis of two quantum blind signature schemes and one quantum proxy blind signature protocol. We show that in these protocols the verifier can forge the signature under known message attack. The attack strategies are described in detail respectively. This kind of problem deserves more research attention in the following related study. We further point out that the arbitrator should be involved in the procedure of any dispute and some discussions of these protocols are given.

  3. The infrasonic signature of the 2009 major Sudden Stratospheric Warming

    NASA Astrophysics Data System (ADS)

    Evers, L.; Siegmund, P.

    2009-12-01

    The study of infrasound is experiencing a renaissance since it was chosen as a verification technique for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The success of the verification technique strongly depends on knowledge of upper atmospheric processes. The ability of infrasound to probe the upper atmosphere is starting to be exploited, taking the field beyond its monitoring application. Processes in the stratosphere couple to the troposphere and influence our daily weather and climate. Infrasound delivers actual observations on the state of the stratosphere with a high spatial and temporal resolution. Here we show the infrasonic signature, passively obtained, of a drastic change in the stratosphere due to the major Sudden Stratospheric Warming (SSW) of January 2009. A major SSW started around January 15. At the altitude of 30 km, the average temperature to the north of 65N increased in one week by more than 50 deg C, leading to exceptionally high temperatures of about -20 deg C. Simultaneously, the polar vortex reversed direction from eastward to westward. The warming was accompanied by a split-up of the polar vortex and an increased amplitude of the zonal wavenumber 2 planetary waves. Infrasound recordings in the Northern Hemisphere have been analysed. These arrays are part of the International Monitoring System (IMS) for the CTBT. Interacting oceanic waves are almost continuously emitting infrasound, where the whole atmospheric wind and temperature structure determines the detectability of these so-called microbaroms. Changes in this detectability have been associated with wind and temperature changes around 50 km altitude due to the major SSW. With this study, we demonstrate the enormous capacity of infrasound in passive acoustic remote sensing of stratospheric processes on a global scale with surface-based instruments.

  4. Imaging radar polarization signatures - Theory and observation

    NASA Technical Reports Server (NTRS)

    Van Zyl, Jakob J.; Zebker, Howard A.; Elachi, Charles

    1987-01-01

    Radar polarimetry theory is reviewed, and comparison between theory and experimental results obtained with an imaging radar polarimeter employing two orthogonally polarized antennas is made. Knowledge of the scattering matrix permits calculation of the scattering cross section of a scatterer for any transmit and receive polarization combination, and a new way of displaying the resulting scattering cross section as a function of polarization is introduced. Examples of polarization signatures are presented for several theoretical models of surface scattering, and these signatures are compared with experimentally measured polarization signatures. The coefficient of variation, derived from the polarization signature, may provide information regarding the amount of variation in scattering properties for a given area.
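
    The polarization synthesis step described here can be sketched directly: given a (hypothetical) 2x2 scattering matrix S, the co-polarized cross section |e^T S e|^2 is evaluated over a grid of orientation and ellipticity angles to form the signature; the variability measure printed at the end is a simple illustration rather than the paper's exact statistic.

    ```python
    # Sketch of polarization synthesis from a 2x2 scattering matrix.
    import numpy as np

    def pol_vector(psi, chi):
        """Jones vector of an elliptical polarization state (orientation psi, ellipticity chi)."""
        return np.array([np.cos(psi) * np.cos(chi) - 1j * np.sin(psi) * np.sin(chi),
                         np.sin(psi) * np.cos(chi) + 1j * np.cos(psi) * np.sin(chi)])

    def copol_signature(S, n_psi=91, n_chi=45):
        """Co-polarized cross section over a grid of polarization states."""
        psis = np.linspace(0, np.pi, n_psi)
        chis = np.linspace(-np.pi / 4, np.pi / 4, n_chi)
        sigma = np.empty((n_chi, n_psi))
        for i, chi in enumerate(chis):
            for j, psi in enumerate(psis):
                e = pol_vector(psi, chi)
                sigma[i, j] = abs(e @ S @ e) ** 2   # co-pol: receive = transmit
        return sigma

    S_sphere = np.array([[1.0, 0.0], [0.0, 1.0]])   # sphere/trihedral-like example
    sig = copol_signature(S_sphere)
    print("signature min/max:", sig.min(), sig.max())   # crude variability measure
    ```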

  5. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... card; (iii) Digitized signature; or (iv) Biometrics, such as fingerprints, retinal patterns, and voice recognition; (2) Cryptographic control methods, including— (i) Shared symmetric key cryptography; (ii)...

  6. Novel Quantum Proxy Signature without Entanglement

    NASA Astrophysics Data System (ADS)

    Xu, Guang-bao

    2015-08-01

    Proxy signature is an important research topic in classical cryptography since it has many applications in real life. However, only a few quantum proxy signature schemes have been proposed up to now. In this paper, we propose a quantum proxy signature scheme based on the quantum one-time pad. Our scheme can be realized easily since it only uses single-particle states. Security analysis shows that it is secure and meets all the properties of a proxy signature, such as verifiability, distinguishability, unforgeability and undeniability.

  7. Intrusion signature creation via clustering anomalies

    NASA Astrophysics Data System (ADS)

    Hendry, Gilbert R.; Yang, Shanchieh J.

    2008-03-01

    Current practices for combating cyber attacks typically use Intrusion Detection Systems (IDSs) to detect and block multistage attacks. Because of the speed and impacts of new types of cyber attacks, current IDSs are limited in providing accurate detection while reliably adapting to new attacks. In signature-based IDS systems, this limitation is made apparent by the latency from day zero of an attack to the creation of an appropriate signature. This work hypothesizes that this latency can be shortened by creating signatures via anomaly-based algorithms. A hybrid supervised and unsupervised clustering algorithm is proposed for new signature creation. These new signatures created in real-time would take effect immediately, ideally detecting new attacks. This work first investigates a modified density-based clustering algorithm as an IDS, with its strengths and weaknesses identified. A signature creation algorithm leveraging the summarizing abilities of clustering is investigated. Lessons learned from the supervised signature creation are then leveraged for the development of unsupervised real-time signature classification. Automating signature creation and classification via clustering is demonstrated as satisfactory but with limitations.
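
    A hedged sketch of the signature-from-clusters idea, using scikit-learn's DBSCAN as the density-based stage and a (centroid, radius) summary per cluster as the signature; the features, thresholds, and matching rule are illustrative, not the paper's exact algorithm.

    ```python
    # Cluster anomalous events, summarise each cluster as a signature, and match
    # new events against the signatures. Feature vectors here are synthetic.
    import numpy as np
    from sklearn.cluster import DBSCAN

    rng = np.random.default_rng(0)
    events = np.vstack([rng.normal([0, 0], 0.2, (50, 2)),
                        rng.normal([3, 3], 0.3, (40, 2))])   # hypothetical features

    labels = DBSCAN(eps=0.5, min_samples=5).fit(events).labels_

    signatures = []
    for k in set(labels) - {-1}:               # label -1 marks unclustered noise
        members = events[labels == k]
        centroid = members.mean(axis=0)
        radius = np.linalg.norm(members - centroid, axis=1).max()
        signatures.append((centroid, radius))

    def matches_signature(event):
        """True if the event falls inside any cluster-derived signature region."""
        return any(np.linalg.norm(event - c) <= r for c, r in signatures)

    print(matches_signature(np.array([3.1, 2.9])),
          matches_signature(np.array([10.0, 10.0])))
    ```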

  8. Secure Obfuscation for Encrypted Group Signatures

    PubMed Central

    Fan, Hongfei; Liu, Qin

    2015-01-01

    In recent years, group signature techniques are widely used in constructing privacy-preserving security schemes for various information systems. However, conventional techniques keep the schemes secure only in normal black-box attack contexts. In other words, these schemes suppose that (the implementation of) the group signature generation algorithm is running in a platform that is perfectly protected from various intrusions and attacks. As a complement to existing studies, how to generate group signatures securely in a more austere security context, such as a white-box attack context, is studied in this paper. We use obfuscation as an approach to acquire a higher level of security. Concretely, we introduce a special group signature functionality, an encrypted group signature, and then provide an obfuscator for the proposed functionality. A series of new security notions for both the functionality and its obfuscator has been introduced. The most important one is the average-case secure virtual black-box property w.r.t. dependent oracles and restricted dependent oracles, which captures the requirement of protecting the output of the proposed obfuscator against collision attacks from group members. The security notions also fit many other specialized obfuscators, such as obfuscators for identity-based signatures, threshold signatures and key-insulated signatures. Finally, the correctness and security of the proposed obfuscator have been proven. Thereby, the obfuscated encrypted group signature functionality can be applied to variants of privacy-preserving security schemes and enhance the security level of these schemes. PMID:26167686

  9. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. There are three employees in this department. I am one of them and am in charge of the Product Quality Verification Division in NMC, China. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; and 3) to evaluate the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification
    • Disaster warning verification module
    • Medium and extended period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium and extended period forecast evaluation module
    The further
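
    As a small illustration of the sort of score such verification modules compute, the following sketch evaluates bias, RMSE, and a threat score for a heavy-rain threshold; all values are invented for illustration:

    ```python
    # Toy forecast verification: continuous scores plus a categorical threat score.
    import numpy as np

    forecast = np.array([12.0, 0.0, 3.5, 20.0, 1.0])   # mm / 24 h, hypothetical
    observed = np.array([10.0, 0.2, 5.0, 15.0, 0.0])

    bias = float(np.mean(forecast - observed))
    rmse = float(np.sqrt(np.mean((forecast - observed) ** 2)))

    threshold = 5.0                                     # "heavy rain" threshold
    hits = np.sum((forecast >= threshold) & (observed >= threshold))
    misses = np.sum((forecast < threshold) & (observed >= threshold))
    false_alarms = np.sum((forecast >= threshold) & (observed < threshold))
    csi = hits / (hits + misses + false_alarms)         # critical success index

    print(f"bias={bias:.2f} mm, rmse={rmse:.2f} mm, threat score={csi:.2f}")
    ```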

  10. Signature Product Code for Predicting Protein-Protein Interactions

    SciTech Connect

    Martin, Shawn B.; Brown, William M.

    2004-09-25

    The SigProdV1.0 software consists of four programs which together allow the prediction of protein-protein interactions using only amino acid sequences and experimental data. The software is based on the use of tensor products of amino acid trimers coupled with classifiers known as support vector machines. Essentially the program looks for amino acid trimer pairs which occur more frequently in protein pairs which are known to interact. These trimer pairs are then used to make predictions about unknown protein pairs. A detailed description of the method can be found in the paper: S. Martin, D. Roe, J.L. Faulon. "Predicting protein-protein interactions using signature products," Bioinformatics, available online from Advance Access, Aug. 19, 2004.
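
    A toy version of the signature-product idea, under the assumption that counting co-occurring trimer pairs is an acceptable stand-in for the full tensor product described in the paper; the sequences and interaction labels below are made up:

    ```python
    # Count (trimer from A, trimer from B) pairs for each protein pair and train an SVM.
    from collections import Counter
    from sklearn.feature_extraction import DictVectorizer
    from sklearn.svm import SVC

    def trimers(seq):
        return [seq[i:i + 3] for i in range(len(seq) - 2)]

    def pair_features(seq_a, seq_b):
        """Counts of co-occurring trimer pairs, keyed as 'trimerA|trimerB'."""
        return Counter(ta + "|" + tb for ta in trimers(seq_a) for tb in trimers(seq_b))

    pairs = [("MKVLA", "GHKLM"), ("MKVLA", "PPQRS"), ("AAKVL", "GHKLM"), ("TTTTT", "PPQRS")]
    labels = [1, 0, 1, 0]                       # 1 = interacting (hypothetical)

    vec = DictVectorizer()
    X = vec.fit_transform([pair_features(a, b) for a, b in pairs])
    clf = SVC(kernel="linear").fit(X, labels)

    print(clf.predict(vec.transform([pair_features("MKVLA", "GHKLM")])))
    ```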

  11. Signature Product Code for Predicting Protein-Protein Interactions

    2004-09-25

    The SigProdV1.0 software consists of four programs which together allow the prediction of protein-protein interactions using only amino acid sequences and experimental data. The software is based on the use of tensor products of amino acid trimers coupled with classifiers known as support vector machines. Essentially the program looks for amino acid trimer pairs which occur more frequently in protein pairs which are known to interact. These trimer pairs are then used to make predictions about unknown protein pairs. A detailed description of the method can be found in the paper: S. Martin, D. Roe, J.L. Faulon. "Predicting protein-protein interactions using signature products," Bioinformatics, available online from Advance Access, Aug. 19, 2004.

  12. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong-Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  13. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
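
    To make the FFM idea concrete, here is a hedged sketch (hypothetical model, not NASA's tooling) that represents failure-effect propagation as a directed graph and automatically checks that each modelled fault can reach an observable effect:

    ```python
    # Directed-graph check: every fault root should propagate to a monitored effect.
    from collections import deque

    ffm = {
        "valve_stuck":    ["low_flow"],
        "pump_degraded":  ["low_flow", "high_vibration"],
        "seal_leak":      ["high_vibration"],
        "low_flow":       ["low_thrust_alarm"],
        "high_vibration": [],                      # deliberately unobservable
    }
    faults = ["valve_stuck", "pump_degraded", "seal_leak"]
    observables = {"low_thrust_alarm"}

    def reachable(graph, start):
        """Breadth-first search over the propagation graph."""
        seen, queue = {start}, deque([start])
        while queue:
            for nxt in graph.get(queue.popleft(), []):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(nxt)
        return seen

    for fault in faults:
        detectable = reachable(ffm, fault) & observables
        print(fault, "->", "detectable" if detectable else "NOT detectable")
    ```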

  14. Verification of Internal Dose Calculations.

    NASA Astrophysics Data System (ADS)

    Aissi, Abdelmadjid

    The MIRD internal dose calculations have been in use for more than 15 years, but their accuracy has always been questionable. There have been attempts to verify these calculations; however, these attempts had various shortcomings which kept the question of verification of the MIRD data still unanswered. The purpose of this research was to develop techniques and methods to verify the MIRD calculations in a more systematic and scientific manner. The research consisted of improving a volumetric dosimeter, developing molding techniques, and adapting the Monte Carlo computer code ALGAM to the experimental conditions and vice versa. The organic dosimetric system contained TLD-100 powder and could be shaped to represent human organs. The dosimeter possessed excellent characteristics for the measurement of internal absorbed doses, even in the case of the lungs. The molding techniques are inexpensive and were used in the fabrication of dosimetric and radioactive source organs. The adaptation of the computer program provided useful theoretical data with which the experimental measurements were compared. The experimental data and the theoretical calculations were compared for 6 source organ-7 target organ configurations. The results of the comparison indicated the existence of an agreement between measured and calculated absorbed doses, when taking into consideration the average uncertainty (16%) of the measurements, and the average coefficient of variation (10%) of the Monte Carlo calculations. However, analysis of the data gave also an indication that the Monte Carlo method might overestimate the internal absorbed doses. Even if the overestimate exists, at least it could be said that the use of the MIRD method in internal dosimetry was shown to lead to no unnecessary exposure to radiation that could be caused by underestimating the absorbed dose. The experimental and the theoretical data were also used to test the validity of the Reciprocity Theorem for heterogeneous

  15. Ozone Monitoring Instrument geolocation verification

    NASA Astrophysics Data System (ADS)

    Kroon, M.; Dobber, M. R.; Dirksen, R.; Veefkind, J. P.; van den Oord, G. H. J.; Levelt, P. F.

    2008-08-01

    Verification of the geolocation assigned to individual ground pixels as measured by the Ozone Monitoring Instrument (OMI) aboard the NASA EOS-Aura satellite was performed by comparing geophysical Earth surface details as observed in OMI false color images with the high-resolution continental outline vector map as provided by the Interactive Data Language (IDL) software tool from ITT Visual Information Solutions. The OMI false color images are generated from the OMI visible channel by integration over 20-nm-wide spectral bands of the Earth radiance intensity around 484 nm, 420 nm, and 360 nm wavelength per ground pixel. Proportional to the integrated intensity, we assign color values composed of CRT standard red, green, and blue to the OMI ground pixels. Earth surface details studied are mostly high-contrast coast lines where arid land or desert meets deep blue ocean. The IDL high-resolution vector map is based on the 1993 CIA World Database II Map with a 1-km accuracy. Our results indicate that the average OMI geolocation offset over the years 2005-2006 is 0.79 km in latitude and 0.29 km in longitude, with a standard deviation of 1.64 km in latitude and 2.04 km in longitude, respectively. Relative to the OMI nadir pixel size, one obtains mean displacements of ˜6.1% in latitude and ˜1.2% in longitude, with standard deviations of 12.6% and 7.9%, respectively. We conclude that the geolocation assigned to individual OMI ground pixels is sufficiently accurate to support scientific studies of atmospheric features as observed in OMI level 2 satellite data products, such as air quality issues on urban scales or volcanic eruptions and its plumes, that occur on spatial scales comparable to or smaller than OMI nadir pixels.
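
    The bookkeeping behind such numbers is simple to sketch; the offsets below are invented samples, and the nominal nadir pixel dimensions are used only to express offsets as percentages:

    ```python
    # Mean/std of geolocation offsets, also expressed relative to a nominal pixel size.
    import numpy as np

    lat_offset_km = np.array([0.6, 1.2, -0.4, 0.9, 1.1])   # hypothetical samples
    lon_offset_km = np.array([0.1, 0.5, -0.3, 0.4, 0.7])
    pixel_lat_km, pixel_lon_km = 13.0, 24.0                 # assumed nominal nadir pixel

    for name, off, size in [("lat", lat_offset_km, pixel_lat_km),
                            ("lon", lon_offset_km, pixel_lon_km)]:
        print(f"{name}: mean={off.mean():.2f} km ({100 * off.mean() / size:.1f}% of pixel), "
              f"std={off.std(ddof=1):.2f} km")
    ```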

  16. Automated verification of system configuration

    NASA Astrophysics Data System (ADS)

    Andrews, W. H., Jr.; Baker, S. P.; Blalock, A. V.

    1991-05-01

    Errors in field wiring can result in significant correction costs (if the errors are discovered prior to use), in erroneous or unusable data (if the errors are not discovered in time), or in serious accidents (if the errors corrupt critical data). Detailed field wiring checkout and rework are tedious and expensive, but they are essential steps in the quality assurance process for large, complex instrumentation and control systems. A recent Oak Ridge National Laboratory (ORNL) development, the CONFiguration IDEnification System (CONFIDES), automates verification of field wiring. In CONFIDES, an identifier module is installed on or integrated into each component (e.g., sensor, actuator, cable, distribution panel) to be verified. Interrogator modules, controlled by a personal computer (PC), are installed at the connections of the field wiring to the inputs of the data acquisition and control system (DACS). Interrogator modules poll the components connected to each channel of the DACS and can determine the path taken by each channel's signal to or from the end device for that channel. The system will provide not only the identification (ID) codes for the cables and patch panels in the path to a particular sensor or actuator but individual cable conductor IDs as well. One version of the system uses existing signal wires for communications between CONFIDES modules. Another, more powerful version requires a single dedicated conductor in each cable. Both versions can operate with or without instrument power applied, and neither interferes with the normal operation of the DACS. Identifier modules can provide a variety of information including status and calibration data.
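
    The comparison step (as opposed to the hardware polling itself) can be sketched as follows, with invented channel names and component IDs:

    ```python
    # Compare the polled chain of component IDs per channel with the as-designed path.
    expected_paths = {
        "CH01": ["SENSOR-117", "CABLE-042", "PANEL-A3", "CABLE-101"],
        "CH02": ["SENSOR-118", "CABLE-043", "PANEL-A3", "CABLE-102"],
    }
    polled_paths = {
        "CH01": ["SENSOR-117", "CABLE-042", "PANEL-A3", "CABLE-101"],
        "CH02": ["SENSOR-118", "CABLE-077", "PANEL-A3", "CABLE-102"],   # wrong cable
    }

    for channel, expected in expected_paths.items():
        actual = polled_paths.get(channel, [])
        if actual == expected:
            print(channel, "OK")
        else:
            diffs = [(e, a) for e, a in zip(expected, actual) if e != a]
            print(channel, "MISMATCH:", diffs)
    ```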

  17. Holographic signatures of cosmological singularities.

    PubMed

    Engelhardt, Netta; Hertog, Thomas; Horowitz, Gary T

    2014-09-19

    To gain insight into the quantum nature of cosmological singularities, we study anisotropic Kasner solutions in gauge-gravity duality. The dual description of the bulk evolution towards the singularity involves N=4 super Yang-Mills theory on the expanding branch of deformed de Sitter space and is well defined. We compute two-point correlators of Yang-Mills operators of large dimensions using spacelike geodesics anchored on the boundary. The correlators show a strong signature of the singularity around horizon scales and decay at large boundary separation at different rates in different directions. More generally, the boundary evolution exhibits a process of particle creation similar to that in inflation. This leads us to conjecture that information on the quantum nature of cosmological singularities is encoded in long-wavelength features of the boundary wave function.

  18. Metabolic Signatures of Bacterial Vaginosis

    PubMed Central

    Morgan, Martin T.; Fiedler, Tina L.; Djukovic, Danijel; Hoffman, Noah G.; Raftery, Daniel; Marrazzo, Jeanne M.

    2015-01-01

    ABSTRACT Bacterial vaginosis (BV) is characterized by shifts in the vaginal microbiota from Lactobacillus dominant to a microbiota with diverse anaerobic bacteria. Few studies have linked specific metabolites with bacteria found in the human vagina. Here, we report dramatic differences in metabolite compositions and concentrations associated with BV using a global metabolomics approach. We further validated important metabolites using samples from a second cohort of women and a different platform to measure metabolites. In the primary study, we compared metabolite profiles in cervicovaginal lavage fluid from 40 women with BV and 20 women without BV. Vaginal bacterial representation was determined using broad-range PCR with pyrosequencing and concentrations of bacteria by quantitative PCR. We detected 279 named biochemicals; levels of 62% of metabolites were significantly different in women with BV. Unsupervised clustering of metabolites separated women with and without BV. Women with BV have metabolite profiles marked by lower concentrations of amino acids and dipeptides, concomitant with higher levels of amino acid catabolites and polyamines. Higher levels of the signaling eicosanoid 12-hydroxyeicosatetraenoic acid (12-HETE), a biomarker for inflammation, were noted in BV. Lactobacillus crispatus and Lactobacillus jensenii exhibited similar metabolite correlation patterns, which were distinct from correlation patterns exhibited by BV-associated bacteria. Several metabolites were significantly associated with clinical signs and symptoms (Amsel criteria) used to diagnose BV, and no metabolite was associated with all four clinical criteria. BV has strong metabolic signatures across multiple metabolic pathways, and these signatures are associated with the presence and concentrations of particular bacteria. PMID:25873373

  19. Online Monitoring of Plant Assets in the Nuclear Industry

    SciTech Connect

    Nancy Lybeck; Vivek Agarwal; Binh Pham; Richard Rusaw; Randy Bickford

    2013-10-01

    Today’s online monitoring technologies provide opportunities to perform predictive and proactive health management of assets within many different industries, in particular the defense and aerospace industries. The nuclear industry can leverage these technologies to enhance safety, productivity, and reliability of the aging fleet of existing nuclear power plants. The U.S. Department of Energy’s Light Water Reactor Sustainability Program is collaborating with the Electric Power Research Institute’s (EPRI’s) Long-Term Operations program to implement online monitoring in existing nuclear power plants. Proactive online monitoring in the nuclear industry is being explored using EPRI’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software, a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. This paper focuses on development of asset fault signatures used to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features based on technical examinations that can be used to detect a specific fault type. Fault signatures are developed based on the results of detailed technical research and on the knowledge and experience of technical experts. The Diagnostic Advisor of the FW-PHM Suite software matches developed fault signatures with operational data to provide early identification of critical faults and troubleshooting advice that could be used to distinguish between faults with similar symptoms. This research is important as it will support the automation of predictive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
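
    A hedged sketch of the matching idea, with invented fault types and symptom names rather than the FW-PHM Suite's actual fault signature database or logic:

    ```python
    # Rank candidate faults by how many of their signature features appear in the data.
    fault_signatures = {
        "GSU transformer winding fault": {"dissolved_gas_high", "winding_temp_high"},
        "GSU transformer bushing fault": {"partial_discharge", "power_factor_drift"},
        "EDG fuel system fault":         {"fuel_pressure_low", "start_time_long"},
    }

    observed = {"winding_temp_high", "dissolved_gas_high", "start_time_long"}

    ranked = sorted(((len(sig & observed) / len(sig), name)
                     for name, sig in fault_signatures.items()), reverse=True)
    for score, name in ranked:
        print(f"{score:.2f}  {name}")
    ```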

  20. A Nucleotide Signature for the Identification of Angelicae Sinensis Radix (Danggui) and Its Products

    PubMed Central

    Wang, Xiaoyue; Liu, Yang; Wang, Lili; Han, Jianping; Chen, Shilin

    2016-01-01

    It is very difficult to identify Angelicae sinensis radix (Danggui) when it is processed into Chinese patent medicines. The proposed internal transcribed spacer 2 (ITS2) is not sufficient to resolve heavily processed materials. Therefore, a short barcode for the identification of processed materials is urgently needed. In this study, 265 samples of Angelicae sinensis radix and adulterants were collected. The ITS2 region was sequenced, and based on one single nucleotide polymorphism (SNP) site unique to Angelica sinensis, a nucleotide signature consisting of 37 bp (5′-aatccgcgtc atcttagtga gctcaaggac ccttagg-3′) was developed. It is highly conserved and specific within Angelica sinensis while divergent among other species. Then, we designed primers (DG01F/DG01R) to amplify the nucleotide signature region from processed materials. Fifteen samples procured online were analysed. By seeking the signature, we found that 7 of them were counterfeits. Twenty-eight batches of Chinese patent medicines containing Danggui were amplified. Nineteen of them were found to contain the signature, and adulterants such as Ligusticum sinense, Notopterygium incisum, Angelica decursiva and Angelica gigas were detected in other batches. Thus, this nucleotide signature, with only 37 bp, will broaden the application of DNA barcoding to identify the components in decoctions, Chinese patent medicines and other products with degraded DNA. PMID:27713564
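
    Once sequences are in hand, screening for the published 37-bp signature reduces to a substring search; the sketch below uses the signature quoted above and invented sample sequences, and leaves primer design and PCR entirely aside:

    ```python
    # Look for the 37-bp signature (or its reverse complement) in amplified sequences.
    SIGNATURE = "aatccgcgtcatcttagtgagctcaaggacccttagg".upper()

    def revcomp(seq):
        return seq.translate(str.maketrans("ACGT", "TGCA"))[::-1]

    def contains_signature(seq):
        seq = seq.upper()
        return SIGNATURE in seq or revcomp(SIGNATURE) in seq

    samples = {                                   # invented test sequences
        "patent_medicine_batch_1": "GGCC" + SIGNATURE + "TTAA",
        "patent_medicine_batch_2": "GGCCATTGCCTTAAGGACCAAT",
    }
    for name, seq in samples.items():
        print(name, "contains Danggui signature:", contains_signature(seq))
    ```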

  1. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Electronic signature components and controls. 11... SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Signatures § 11.200 Electronic signature components and controls. (a) Electronic signatures that are not based upon biometrics shall:...

  2. Rhythmic TMS Causes Local Entrainment of Natural Oscillatory Signatures

    PubMed Central

    Thut, Gregor; Veniero, Domenica; Romei, Vincenzo; Miniussi, Carlo; Schyns, Philippe; Gross, Joachim

    2011-01-01

    Summary Background Neuronal elements underlying perception, cognition, and action exhibit distinct oscillatory phenomena, measured in humans by electro- or magnetoencephalography (EEG/MEG). So far, the correlative or causal nature of the link between brain oscillations and functions has remained elusive. A compelling demonstration of causality would primarily generate oscillatory signatures that are known to correlate with particular cognitive functions and then assess the behavioral consequences. Here, we provide the first direct evidence for causal entrainment of brain oscillations by transcranial magnetic stimulation (TMS) using concurrent EEG. Results We used rhythmic TMS bursts to directly interact with an MEG-identified parietal α-oscillator, activated by attention and linked to perception. With TMS bursts tuned to its preferred α-frequency (α-TMS), we confirmed the three main predictions of entrainment of a natural oscillator: (1) that α-oscillations are induced during α-TMS (reproducing an oscillatory signature of the stimulated parietal cortex), (2) that there is progressive enhancement of this α-activity (synchronizing the targeted, α-generator to the α-TMS train), and (3) that this depends on the pre-TMS phase of the background α-rhythm (entrainment of natural, ongoing α-oscillations). Control conditions testing different TMS burst profiles and TMS-EEG in a phantom head confirmed specificity of α-boosting to the case of synchronization between TMS train and neural oscillator. Conclusions The periodic electromagnetic force that is generated during rhythmic TMS can cause local entrainment of natural brain oscillations, emulating oscillatory signatures activated by cognitive tasks. This reveals a new mechanism of online TMS action on brain activity and can account for frequency-specific behavioral TMS effects at the level of biologically relevant rhythms. PMID:21723129

  3. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Electronic signatures. 850.106 Section 850.106 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS... password; (ii) Smart card; (iii) Digitized signature; or (iv) Biometrics, such as fingerprints,...

  4. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Electronic signatures. 850.106 Section 850.106 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS... password; (ii) Smart card; (iii) Digitized signature; or (iv) Biometrics, such as fingerprints,...

  5. Signature Genes as a Phylogenomic Tool

    PubMed Central

    Snel, Berend; Ettema, Thijs J. G.; Huynen, Martijn A.

    2008-01-01

    Gene content has been shown to contain a strong phylogenetic signal, yet its usage for phylogenetic questions is hampered by horizontal gene transfer and parallel gene loss and until now required completely sequenced genomes. Here, we introduce an approach that allows the phylogenetic signal in gene content to be applied to any set of sequences, using signature genes for phylogenetic classification. The hundreds of publicly available genomes allow us to identify signature genes at various taxonomic depths, and we show how the presence of signature genes in an unspecified sample can be used to characterize its taxonomic composition. We identify 8,362 signature genes specific for 112 prokaryotic taxa. We show that these signature genes can be used to address phylogenetic questions on the basis of gene content in cases where classic gene content or sequence analyses provide an ambiguous answer, such as for Nanoarchaeum equitans, and even in cases where complete genomes are not available, such as for metagenomics data. Cross-validation experiments leaving out up to 30% of the species show that ∼92% of the signature genes correctly place the species in a related clade. Analyses of metagenomics data sets with the signature gene approach are in good agreement with the previously reported species distributions based on phylogenetic analysis of marker genes. Summarizing, signature genes can complement traditional sequence-based methods in addressing taxonomic questions. PMID:18492663
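
    A toy illustration of classification by signature-gene presence; the gene identifiers are made up, and the scoring rule (fraction of a clade's signature genes observed) is just one plausible choice:

    ```python
    # Attribute a sample to the clade whose signature genes it contains most of.
    signature_genes = {
        "Cyanobacteria":           {"sg_photoA", "sg_photoB", "sg_carbox1"},
        "Epsilonproteobacteria":   {"sg_chemo1", "sg_flag7"},
        "Crenarchaeota":           {"sg_arch3", "sg_arch9", "sg_lipidX"},
    }

    sample_genes = {"sg_photoA", "sg_carbox1", "sg_flag7", "unrelated_gene"}

    scores = {clade: len(genes & sample_genes) / len(genes)
              for clade, genes in signature_genes.items()}
    best = max(scores, key=scores.get)
    print(scores, "->", best)
    ```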

  6. Does Social Work Have a Signature Pedagogy?

    ERIC Educational Resources Information Center

    Earls Larrison, Tara; Korr, Wynne S.

    2013-01-01

    This article contributes to discourse on signature pedagogy by reconceptualizing how our pedagogies are understood and defined for social work education. We critique the view that field education is social work's signature pedagogy and consider what pedagogies are distinct about the teaching and learning of social work. Using Shulman's…

  7. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Electronic signatures. 850.106 Section 850.106 Administrative Personnel OFFICE OF PERSONNEL MANAGEMENT (CONTINUED) CIVIL SERVICE REGULATIONS (CONTINUED) RETIREMENT SYSTEMS MODERNIZATION General Provisions § 850.106 Electronic signatures. (a) Subject to any provisions prescribed by...

  8. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-09-01

    The effectiveness of most quantum signature schemes reported in the literature can be verified by a designated person; however, those schemes are not true designated verifier signature schemes in the traditional sense, because the designated person does not have the capability to efficiently simulate a signature that is indistinguishable from one produced by the signer, a capability required in some special environments such as e-voting, calls for tenders and software licensing. To solve this problem, a real quantum designated verifier signature scheme is proposed in this paper. According to the property of unitary transformation and quantum one-way function, only a verifier designated by a signer can verify the "validity of a signature" and the designated verifier cannot prove to a third party that the signature was produced by the signer or by himself through a transcript simulation algorithm. Moreover, the quantum key distribution and quantum encryption algorithm guarantee the unconditional security of this scheme. Analysis results show that this new scheme satisfies the main security requirements of a designated verifier signature scheme and resists the major attack strategies.

  9. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training. PMID:26660699
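
    As a much-reduced sketch of the pipeline shape only: per-region features from the two faces are extracted, concatenated across region pairs, and a binary classifier decides same/different identity. Random projections stand in for the ConvNets and logistic regression for the RBM; nothing here reproduces the paper's model:

    ```python
    # Hypothetical stand-in for a region-based face-pair verification pipeline.
    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(1)
    N_REGIONS, FEAT_DIM, IMG_DIM = 4, 32, 64 * 64
    projections = [rng.normal(size=(IMG_DIM, FEAT_DIM)) for _ in range(N_REGIONS)]

    def pair_features(face_a, face_b):
        """Concatenate per-region relational features for a face pair."""
        feats = [np.abs(face_a @ P - face_b @ P) for P in projections]
        return np.concatenate(feats)

    # Hypothetical training pairs: label 1 = same person, 0 = different.
    faces = rng.normal(size=(20, IMG_DIM))
    X, y = [], []
    for i in range(20):
        same = faces[i] + 0.05 * rng.normal(size=IMG_DIM)     # noisy second image
        X.append(pair_features(faces[i], same)); y.append(1)
        X.append(pair_features(faces[i], faces[(i + 1) % 20])); y.append(0)

    clf = LogisticRegression(max_iter=1000).fit(np.array(X), y)
    print("training accuracy:", clf.score(np.array(X), y))
    ```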

  10. Verification against perturbed analyses and observations

    NASA Astrophysics Data System (ADS)

    Bowler, N. E.; Cullen, M. J. P.; Piccolo, C.

    2015-07-01

    It has long been known that verification of a forecast against the sequence of analyses used to produce those forecasts can underestimate the magnitude of forecast errors. Here we show that under certain conditions the verification of a short-range forecast against a perturbed analysis coming from an ensemble data assimilation scheme can give the same root-mean-square error as verification against the truth. This means that a perturbed analysis can be used as a reliable proxy for the truth. However, the conditions required for this result to hold are rather restrictive: the analysis must be optimal, the ensemble spread must be equal to the error in the mean, the ensemble size must be large and the forecast being verified must be the background forecast used in the data assimilation. Although these criteria are unlikely to be met exactly, it becomes clear that for most cases verification against a perturbed analysis gives better results than verification against an unperturbed analysis. We demonstrate the application of these results in an idealised model framework and a numerical weather prediction context. In deriving this result we recall that an optimal (Kalman) analysis is one for which the analysis increments are uncorrelated with the analysis errors.
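
    The scalar toy below illustrates the central claim under the stated idealisations (an optimal analysis, a perturbation whose variance equals the analysis error variance, and verification of the background forecast); it is a sketch, not the paper's experiment:

    ```python
    # Verifying the background forecast against a perturbed optimal analysis
    # recovers the RMSE against truth; the unperturbed analysis underestimates it.
    import numpy as np

    rng = np.random.default_rng(42)
    n = 200_000
    sigma_b, sigma_o = 1.0, 0.8             # background and observation error std

    truth = np.zeros(n)
    background = truth + sigma_b * rng.normal(size=n)
    obs = truth + sigma_o * rng.normal(size=n)

    K = sigma_b**2 / (sigma_b**2 + sigma_o**2)            # optimal (Kalman) gain
    analysis = background + K * (obs - background)
    sigma_a = np.sqrt((1 - K) * sigma_b**2)               # analysis error std
    perturbed = analysis + sigma_a * rng.normal(size=n)   # ensemble-like perturbation

    rmse = lambda x, y: np.sqrt(np.mean((x - y) ** 2))
    print("vs truth:             ", rmse(background, truth))
    print("vs analysis:          ", rmse(background, analysis))    # too small
    print("vs perturbed analysis:", rmse(background, perturbed))   # ~ vs truth
    ```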

  11. Complementary technologies for verification of excess plutonium

    SciTech Connect

    Langner, D.G.; Nicholas, N.J.; Ensslin, N.; Fearey, B.L.; Mitchell, D.J.; Marlow, K.W.; Luke, S.J.; Gosnell, T.B.

    1998-12-31

    Three complementary measurement technologies have been identified as candidates for use in the verification of excess plutonium of weapons origin. These technologies (high-resolution gamma-ray spectroscopy, neutron multiplicity counting, and low-resolution gamma-ray spectroscopy) are mature, robust technologies. The high-resolution gamma-ray system, Pu-600, uses the 630-670 keV region of the emitted gamma-ray spectrum to determine the ratio of 240Pu to 239Pu. It is useful in verifying the presence of plutonium and the presence of weapons-grade plutonium. Neutron multiplicity counting is well suited for verifying that the plutonium is of a safeguardable quantity and is weapons-quality material, as opposed to residue or waste. In addition, multiplicity counting can independently verify the presence of plutonium by virtue of a measured neutron self-multiplication and can detect the presence of non-plutonium neutron sources. The low-resolution gamma-ray spectroscopic technique is a template method that can provide continuity of knowledge that an item that enters the verification regime remains under the regime. In the initial verification of an item, multiple regions of the measured low-resolution spectrum form a unique, gamma-radiation-based template for the item that can be used for comparison in subsequent verifications. In this paper the authors discuss these technologies as they relate to the different attributes that could be used in a verification regime.

  12. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training.

  13. Verification of COSMO model over Poland

    NASA Astrophysics Data System (ADS)

    Linkowska, Joanna; Mazur, Andrzej; Wyszogrodzki, Andrzej

    2014-05-01

    The Polish National Weather Service and Institute of Meteorology and Water Management - National Research Institute (IMWM-NRI, Warsaw, Poland) joined the Consortium for Small-Scale Modeling (COSMO) in 2002. Thanks to cooperation within the consortium, the COSMO meteorological model is run operationally at IMWM-NRI at both 2.8km and 7km horizontal resolutions. In research mode, data assimilation tests have been carried out using a 6-hourly cycle nudging scheme. We would like to present verification results of the COSMO model, comparing model-generated surface temperature, wind and rainfall rates with the Synop measurements. In addition, verification results of vertical profiles for chosen variables will also be analyzed and presented. The verification is divided into the following areas: i) assessing the impact of data assimilation on the quality of 2.8km resolution model forecasts by switching data assimilation on and off, ii) spatio-temporal verification of model results at 7km resolution and iii) conditional verification of selected parameters against chosen meteorological condition(s).

  14. Signatures of mutational processes in human cancer

    PubMed Central

    Alexandrov, Ludmil B.; Nik-Zainal, Serena; Wedge, David C.; Aparicio, Samuel A.J.R.; Behjati, Sam; Biankin, Andrew V.; Bignell, Graham R.; Bolli, Niccolo; Borg, Ake; Børresen-Dale, Anne-Lise; Boyault, Sandrine; Burkhardt, Birgit; Butler, Adam P.; Caldas, Carlos; Davies, Helen R.; Desmedt, Christine; Eils, Roland; Eyfjörd, Jórunn Erla; Foekens, John A.; Greaves, Mel; Hosoda, Fumie; Hutter, Barbara; Ilicic, Tomislav; Imbeaud, Sandrine; Imielinsk, Marcin; Jäger, Natalie; Jones, David T.W.; Jones, David; Knappskog, Stian; Kool, Marcel; Lakhani, Sunil R.; López-Otín, Carlos; Martin, Sancha; Munshi, Nikhil C.; Nakamura, Hiromi; Northcott, Paul A.; Pajic, Marina; Papaemmanuil, Elli; Paradiso, Angelo; Pearson, John V.; Puente, Xose S.; Raine, Keiran; Ramakrishna, Manasa; Richardson, Andrea L.; Richter, Julia; Rosenstiel, Philip; Schlesner, Matthias; Schumacher, Ton N.; Span, Paul N.; Teague, Jon W.; Totoki, Yasushi; Tutt, Andrew N.J.; Valdés-Mas, Rafael; van Buuren, Marit M.; van ’t Veer, Laura; Vincent-Salomon, Anne; Waddell, Nicola; Yates, Lucy R.; Zucman-Rossi, Jessica; Futreal, P. Andrew; McDermott, Ultan; Lichter, Peter; Meyerson, Matthew; Grimmond, Sean M.; Siebert, Reiner; Campo, Elías; Shibata, Tatsuhiro; Pfister, Stefan M.; Campbell, Peter J.; Stratton, Michael R.

    2013-01-01

    All cancers are caused by somatic mutations. However, understanding of the biological processes generating these mutations is limited. The catalogue of somatic mutations from a cancer genome bears the signatures of the mutational processes that have been operative. Here, we analysed 4,938,362 mutations from 7,042 cancers and extracted more than 20 distinct mutational signatures. Some are present in many cancer types, notably a signature attributed to the APOBEC family of cytidine deaminases, whereas others are confined to a single class. Certain signatures are associated with age of the patient at cancer diagnosis, known mutagenic exposures or defects in DNA maintenance, but many are of cryptic origin. In addition to these genome-wide mutational signatures, hypermutation localized to small genomic regions, kataegis, is found in many cancer types. The results reveal the diversity of mutational processes underlying the development of cancer with potential implications for understanding of cancer etiology, prevention and therapy. PMID:23945592

  15. Real time gamma-ray signature identifier

    DOEpatents

    Rowland, Mark; Gosnell, Tom B.; Ham, Cheryl; Perkins, Dwight; Wong, James

    2012-05-15

    A real time gamma-ray signature/source identification method and system using principal components analysis (PCA) for transforming and substantially reducing one or more comprehensive spectral libraries of nuclear materials types and configurations into a corresponding concise representation/signature(s) representing and indexing each individual predetermined spectrum in principal component (PC) space, wherein an unknown gamma-ray signature may be compared against the representative signature to find a match or at least characterize the unknown signature from among all the entries in the library with a single regression or simple projection into the PC space, so as to substantially reduce processing time and computing resources and enable real-time characterization and/or identification.
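
    The core idea (transform a spectral library into PC space once, then identify an unknown spectrum with a single projection and a nearest-signature search) can be sketched in a few lines of numpy. The function names, component count and synthetic spectra below are assumptions for illustration.

```python
import numpy as np

def build_pc_library(library_spectra, n_components=10):
    """library_spectra: (n_items, n_channels) array of reference gamma-ray spectra."""
    mean = library_spectra.mean(axis=0)
    X = library_spectra - mean
    # principal components from the SVD of the centred library
    _, _, Vt = np.linalg.svd(X, full_matrices=False)
    pcs = Vt[:n_components]                  # (k, n_channels)
    signatures = X @ pcs.T                   # concise per-item signatures in PC space
    return mean, pcs, signatures

def identify(unknown_spectrum, mean, pcs, signatures):
    """Single projection into PC space, then nearest library signature."""
    z = (unknown_spectrum - mean) @ pcs.T
    dists = np.linalg.norm(signatures - z, axis=1)
    return int(np.argmin(dists)), float(dists.min())

# toy usage with synthetic spectra (illustrative only)
rng = np.random.default_rng(1)
library = rng.poisson(50, size=(40, 1024)).astype(float)
mean, pcs, sigs = build_pc_library(library)
idx, dist = identify(library[7] + rng.normal(0, 3, 1024), mean, pcs, sigs)
print(idx)   # expected: 7
```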

  16. Signatures of mutational processes in human cancer.

    PubMed

    Alexandrov, Ludmil B; Nik-Zainal, Serena; Wedge, David C; Aparicio, Samuel A J R; Behjati, Sam; Biankin, Andrew V; Bignell, Graham R; Bolli, Niccolò; Borg, Ake; Børresen-Dale, Anne-Lise; Boyault, Sandrine; Burkhardt, Birgit; Butler, Adam P; Caldas, Carlos; Davies, Helen R; Desmedt, Christine; Eils, Roland; Eyfjörd, Jórunn Erla; Foekens, John A; Greaves, Mel; Hosoda, Fumie; Hutter, Barbara; Ilicic, Tomislav; Imbeaud, Sandrine; Imielinski, Marcin; Imielinsk, Marcin; Jäger, Natalie; Jones, David T W; Jones, David; Knappskog, Stian; Kool, Marcel; Lakhani, Sunil R; López-Otín, Carlos; Martin, Sancha; Munshi, Nikhil C; Nakamura, Hiromi; Northcott, Paul A; Pajic, Marina; Papaemmanuil, Elli; Paradiso, Angelo; Pearson, John V; Puente, Xose S; Raine, Keiran; Ramakrishna, Manasa; Richardson, Andrea L; Richter, Julia; Rosenstiel, Philip; Schlesner, Matthias; Schumacher, Ton N; Span, Paul N; Teague, Jon W; Totoki, Yasushi; Tutt, Andrew N J; Valdés-Mas, Rafael; van Buuren, Marit M; van 't Veer, Laura; Vincent-Salomon, Anne; Waddell, Nicola; Yates, Lucy R; Zucman-Rossi, Jessica; Futreal, P Andrew; McDermott, Ultan; Lichter, Peter; Meyerson, Matthew; Grimmond, Sean M; Siebert, Reiner; Campo, Elías; Shibata, Tatsuhiro; Pfister, Stefan M; Campbell, Peter J; Stratton, Michael R

    2013-08-22

    All cancers are caused by somatic mutations; however, understanding of the biological processes generating these mutations is limited. The catalogue of somatic mutations from a cancer genome bears the signatures of the mutational processes that have been operative. Here we analysed 4,938,362 mutations from 7,042 cancers and extracted more than 20 distinct mutational signatures. Some are present in many cancer types, notably a signature attributed to the APOBEC family of cytidine deaminases, whereas others are confined to a single cancer class. Certain signatures are associated with age of the patient at cancer diagnosis, known mutagenic exposures or defects in DNA maintenance, but many are of cryptic origin. In addition to these genome-wide mutational signatures, hypermutation localized to small genomic regions, 'kataegis', is found in many cancer types. The results reveal the diversity of mutational processes underlying the development of cancer, with potential implications for understanding of cancer aetiology, prevention and therapy.

  17. Security Weaknesses in Arbitrated Quantum Signature Protocols

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Zhang, Kejia; Cao, Tianqing

    2014-01-01

    Arbitrated quantum signature (AQS) is a cryptographic scenario in which the sender (signer), Alice, generates the signature of a message and then a receiver (verifier), Bob, can verify the signature with the help of a trusted arbitrator, Trent. In this paper, we point out that there exist security weaknesses in two AQS protocols. Our analysis shows that Alice can successfully disavow any of her signatures by a simple attack in the first protocol. Furthermore, we study the security weaknesses of the second protocol from the aspects of forgery and disavowal. Some potential improvements of this kind of protocol are given. We also design a new method to authenticate a signature or a message, which makes AQS protocols immune to Alice's disavowal attack and Bob's forgery attack effectively.

  18. Data verification in the residue laboratory.

    PubMed

    Ault, J A; Cassidy, P S; Crawford, C J; Jablonski, J E; Kenyon, R G

    1994-12-01

    Residue analysis frequently presents a challenge to the quality assurance (QA) auditor due to the sheer volume of data to be audited. In the face of multiple boxes of raw data, some process must be defined that assures the scientist and the QA auditor of the quality and integrity of the data. A program that ensures complete and appropriate verification of data before they reach the Quality Assurance Unit (QAU) is presented. The "Guidelines for Peer Review of Data" were formulated by the Residue Analysis Business Center at Ricerca, Inc. to accommodate efficient use of review time and to define any uncertainties concerning what are acceptable data. The core of this program centers on five elements: study initiation (definitional) meetings, calculations, verification, approval, and the use of a verification checklist.

  19. Fuel Retrieval System (FRS) Design Verification

    SciTech Connect

    YANOCHKO, R.M.

    2000-01-27

    This document was prepared as part of an independent review to explain design verification activities already completed, and to define the remaining design verification actions for the Fuel Retrieval System. The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR).

  20. Heavy water physical verification in power plants

    SciTech Connect

    Morsy, S.; Schuricht, V.; Beetle, T.; Szabo, E.

    1986-01-01

    This paper is a report on the Agency's experience in verifying heavy water inventories in power plants. The safeguards objectives and goals for such activities are defined in the paper. The heavy water is stratified according to the flow within the power plant, including upgraders. A safeguards scheme based on a combination of records auditing, comparing records and reports, and physical verification has been developed. This scheme has elevated the status of heavy water safeguards to a level comparable to nuclear material safeguards in bulk facilities. It leads to attribute and variable verification of the heavy water inventory in the different system components and in the store. The verification methods include volume and weight determination, sampling and analysis, non-destructive assay (NDA), and criticality check. The different measurement methods and their limits of accuracy are analysed and discussed in the paper.

  1. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  2. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two extremely small, high density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  3. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  4. Land Ice Verification and Validation Kit

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  5. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-4-bit evaluation, and plots of tests where differences occur.

  6. Online medication history retrieval.

    PubMed

    Herdman, Bruce W; Varghese, Sandy R; Domer-Shank, Reed Bosch

    2015-01-01

    The difficulty of obtaining accurate medication history from inmates at the time of incarceration is daunting. This article summarizes the success of a large urban jail in the use of online data to identify medication history upon incarceration. This article describes the scope of available prescription data, the implementation of online retrieval, system limitations, planned improvements, and suggestions of additional applications of online retrieval services.

  7. Online use statistics.

    PubMed

    Tannery, Nancy Hrinya; Silverman, Deborah L; Epstein, Barbara A

    2002-01-01

    Online use statistics can provide libraries with a tool to be used when developing an online collection of resources. Statistics can provide information on overall use of a collection, individual print and electronic journal use, and collection use by specific user populations. They can also be used to determine the number of user licenses to purchase. This paper focuses on the issue of use statistics made available for one collection of online resources.

  8. Applying Causal Discovery to the Output of Climate Models - What Can We Learn from the Causal Signatures?

    NASA Astrophysics Data System (ADS)

    Ebert-Uphoff, I.; Hammerling, D.; Samarasinghe, S.; Baker, A. H.

    2015-12-01

    The framework of causal discovery provides algorithms that seek to identify potential cause-effect relationships from observational data. The output of such algorithms is a graph structure that indicates the potential causal connections between the observed variables. Originally developed for applications in the social sciences and economics, causal discovery has been used with great success in bioinformatics and, most recently, in climate science, primarily to identify interaction patterns between compound climate variables and to track pathways of interactions between different locations around the globe. Here we apply causal discovery to the output data of climate models to learn so-called causal signatures from the data that indicate interactions between the different atmospheric variables. These causal signatures can act like fingerprints for the underlying dynamics and thus serve a variety of diagnostic purposes. We study the use of the causal signatures for three applications: 1) For climate model software verification we suggest using causal signatures as a means of detecting statistical differences between model runs, thus identifying potential errors and supplementing the Community Earth System Model Ensemble Consistency Testing (CESM-ECT) tool recently developed at NCAR for CESM verification. 2) In the context of data compression of model runs, we will test how much the causal signatures of the model outputs change after different compression algorithms have been applied. This may result in additional means to determine which type and amount of compression is acceptable. 3) This is the first study applying causal discovery simultaneously to a large number of different atmospheric variables, and in the process of studying the resulting interaction patterns for the two aforementioned applications, we expect to gain some new insights into their relationships from this approach. We will present first results obtained for Applications 1 and 2 above.
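
    As a rough illustration of comparing interaction signatures between model runs, the sketch below uses partial correlations as a simplified stand-in for a full constraint-based causal discovery algorithm; the threshold, variable count and synthetic data are assumptions and the approach is not the one used in the study.

```python
import numpy as np

def interaction_signature(run, threshold=0.1):
    """run: (n_timesteps, n_variables) output of one model run.
    Returns a boolean adjacency matrix of interaction edges, computed from
    partial correlations (a simplified stand-in for causal discovery)."""
    corr = np.corrcoef(run, rowvar=False)
    prec = np.linalg.pinv(corr)                      # precision matrix
    d = np.sqrt(np.diag(prec))
    partial = -prec / np.outer(d, d)                 # partial correlations
    np.fill_diagonal(partial, 0.0)
    return np.abs(partial) > threshold

def signature_distance(sig_a, sig_b):
    """Number of edges present in one signature but not the other."""
    return int(np.sum(sig_a != sig_b)) // 2

# Toy comparison: an original run versus a crudely 'compressed' copy
rng = np.random.default_rng(2)
run_a = rng.normal(size=(5000, 8))
run_a[:, 1] += 0.8 * run_a[:, 0]                     # give the variables some structure
run_b = np.round(run_a, 1)                           # stand-in for lossy compression
print(signature_distance(interaction_signature(run_a), interaction_signature(run_b)))
```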

  9. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  10. Signatures of time reversal symmetry breaking in multiband superconductors

    NASA Astrophysics Data System (ADS)

    Maiti, Saurabh

    Multiband superconductors serve as natural hosts to several possible ground states that compete with each other. At the boundaries of such competing phases, the system usually compromises and settles for `mixed' phases that can show intriguing properties like co-existence of magnetism and superconductivity or even co-existence of different superconducting phases. The latter is particularly interesting as it can lead to non-magnetic ground states that spontaneously break time-reversal symmetry. While the experimental verification of such states has proved to be challenging, the theoretical investigations have provided exciting new insights into the nature of the ground state and its excitations, all of which have experimental consequences of some sort. These include extrinsic properties like spontaneous currents around impurity sites, and intrinsic properties in the form of collective excitations. These collective modes bear a unique signature and should provide clear evidence for a time-reversal-symmetry-broken state. While the results are general, in light of recent Raman scattering experiments, their direct relevance to extremely hole-doped Ba(1-x)K(FeAs)2 will be presented, where a strong competition of s-wave and d-wave ground states is expected.

  11. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    NASA Astrophysics Data System (ADS)

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-10-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to study plasma metabolite profiles from 57 HAPE and 57 control subjects. Fourteen differential plasma metabolites responsible for the discrimination between the two groups from the discovery set (35 HAPE subjects and 35 healthy controls) were identified. Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE was evaluated, where the area under the receiver operating characteristic curve (AUC) was 0.981 and 0.942 in the discovery set and the validation set (22 HAPE subjects and 22 healthy controls), respectively. Taken together, these results suggested that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples.
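
    A hedged sketch of the biomarker-combination step: a logistic score over three candidate metabolites evaluated with ROC AUC on separate discovery and validation sets. The data below are synthetic stand-ins for the three metabolite levels, not the study's measurements.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

# Synthetic discovery / validation sets: three metabolite levels per subject
# (stand-ins for C8-ceramide, sphingosine and glutamine; not real data).
def make_set(n_per_group, shift):
    controls = rng.normal(0.0, 1.0, size=(n_per_group, 3))
    cases = rng.normal(shift, 1.0, size=(n_per_group, 3))
    X = np.vstack([controls, cases])
    y = np.r_[np.zeros(n_per_group), np.ones(n_per_group)]
    return X, y

X_disc, y_disc = make_set(35, shift=1.2)
X_val, y_val = make_set(22, shift=1.2)

# Combine the three candidate biomarkers into one composite score
clf = LogisticRegression().fit(X_disc, y_disc)
print("discovery AUC :", roc_auc_score(y_disc, clf.predict_proba(X_disc)[:, 1]))
print("validation AUC:", roc_auc_score(y_val, clf.predict_proba(X_val)[:, 1]))
```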

  12. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... signature components and controls. (a) Electronic signatures that are not based upon biometrics shall: (1... signatures based upon biometrics shall be designed to ensure that they cannot be used by anyone other...

  13. DNA Methylation Signature of Childhood Chronic Physical Aggression in T Cells of Both Men and Women

    PubMed Central

    Guillemin, Claire; Provençal, Nadine; Suderman, Matthew; Côté, Sylvana M.; Vitaro, Frank; Hallett, Michael; Tremblay, Richard E.; Szyf, Moshe

    2014-01-01

    Background High frequency of physical aggression is the central feature of severe conduct disorder and is associated with a wide range of social, mental and physical health problems. We have previously tested the hypothesis that differential DNA methylation signatures in peripheral T cells are associated with a chronic aggression trajectory in males. Despite the fact that sex differences appear to play a pivotal role in determining the development, magnitude and frequency of aggression, most previous studies have focused on males, so little is known about female chronic physical aggression. We therefore tested here whether or not there is a signature of physical aggression in female DNA methylation and, if there is, how it relates to the signature observed in males. Methodology/Principal Findings Methylation profiles were created using the method of methylated DNA immunoprecipitation (MeDIP) followed by microarray hybridization and statistical and bioinformatic analyses on T cell DNA obtained from adult women who were found to be on a chronic physical aggression trajectory (CPA) between 6 and 12 years of age compared to women who followed a normal physical aggression trajectory. We confirmed the existence of a well-defined, genome-wide signature of DNA methylation associated with chronic physical aggression in the peripheral T cells of adult females that includes many of the genes similarly associated with physical aggression in the same cell types of adult males. Conclusions This study in a small number of women presents preliminary evidence for a genome-wide variation in promoter DNA methylation that associates with CPA in women and warrants larger studies for further verification. A significant proportion of these associations were previously observed in men with CPA, supporting the hypothesis that the epigenetic signature of early life aggression in females is composed of a component specific to females and another common to both males and females. PMID:24475181

  14. A watermarking-based medical image integrity control system and an image moment signature for tampering characterization.

    PubMed

    Coatrieux, Gouenou; Huang, Hui; Shu, Huazhong; Luo, Limin; Roux, Christian

    2013-11-01

    In this paper, we present a medical image integrity verification system to detect and approximate local malevolent image alterations (e.g., removal or addition of lesions) as well as identify the nature of a global processing operation an image may have undergone (e.g., lossy compression, filtering, etc.). The proposed integrity analysis process is based on nonsignificant region watermarking with signatures extracted from different pixel blocks of interest, which are compared with the recomputed ones at the verification stage. A set of three signatures is proposed. The first two, devoted to detection and modification location, are cryptographic hashes and checksums, while the last one is derived from image moment theory. In this paper, we first show how geometric moments can be used to approximate any local modification by its nearest generalized 2-D Gaussian. We then demonstrate how ratios between original and recomputed geometric moments can be used as image features in a classifier-based strategy in order to determine the nature of a global image processing operation. Experimental results considering both local and global modifications in MRI and retina images illustrate the overall performance of our approach. With a pixel block signature about 200 bits long, it is possible to detect, roughly localize, and get an idea of the nature of the image tampering.
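
    A small sketch of the moment-based part of such a signature: raw geometric moments of a pixel block and the ratios between original and recomputed moments used as tampering features. Function names, block size and the toy tampering below are assumptions, not the paper's implementation.

```python
import numpy as np

def geometric_moments(block, order=2):
    """Raw geometric moments m_pq = sum_x sum_y x^p y^q I(x, y) up to the given order."""
    h, w = block.shape
    y, x = np.mgrid[0:h, 0:w].astype(float)
    return np.array([(block * x**p * y**q).sum()
                     for p in range(order + 1) for q in range(order + 1)])

def moment_ratios(original_block, received_block):
    """Ratios used as features to characterise the processing a block underwent."""
    m0 = geometric_moments(original_block)
    m1 = geometric_moments(received_block)
    return m1 / np.where(m0 == 0, 1.0, m0)

# Toy usage: ratios stay near 1 for an untouched block, deviate for a tampered one
rng = np.random.default_rng(4)
block = rng.integers(0, 256, size=(32, 32)).astype(float)
tampered = block.copy()
tampered[8:16, 8:16] = 255.0
print(moment_ratios(block, block))
print(moment_ratios(block, tampered))
```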

  15. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  16. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.

  17. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  18. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  19. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  20. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    NASA Astrophysics Data System (ADS)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

    Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. The recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. The biometric data, being derived from human bodies (and especially when used to identify or verify those bodies) is considered personally identifiable information (PII). The collection, use and disclosure of biometric data — image or template — invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases by others can also be stolen, copied, or otherwise misused in ways that can materially affect the individual involved. As biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before they are massively deployed in security systems. Along with security, the privacy of users is also an important factor, as the construction of lines in palmprints contains personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully studying the signature images available in the database. We propose a cryptographic approach to encrypt the images of palmprints, faces, and signatures by an advanced Hill cipher technique for hiding the information in the images. It also protects these images from the above-mentioned attacks. So, during the feature extraction, the
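
    As an illustration of the general idea, the sketch below applies a classical Hill cipher over Z_256 to pairs of pixels; the "advanced" variant mentioned in the abstract is not reproduced, and the key matrix is an arbitrary invertible example rather than anything from the paper.

```python
import numpy as np

# Classical Hill cipher over Z_256 acting on pairs of pixels.
KEY = np.array([[3, 3], [2, 5]])            # det = 9, invertible mod 256
KEY_INV = np.array([[29, 85], [142, 171]])  # precomputed inverse of KEY mod 256
assert np.array_equal((KEY @ KEY_INV) % 256, np.eye(2, dtype=int))

def hill_apply(image, key):
    """Apply the key matrix to every pair of consecutive pixels, modulo 256."""
    h, w = image.shape                      # h * w must be even
    pairs = image.reshape(-1, 2).T          # shape (2, n_pairs)
    out = (key @ pairs) % 256
    return out.T.reshape(h, w).astype(np.uint8)

rng = np.random.default_rng(5)
img = rng.integers(0, 256, size=(64, 64), dtype=np.uint8)    # toy "biometric" image
cipher = hill_apply(img, KEY)
assert np.array_equal(hill_apply(cipher, KEY_INV), img)      # round trip recovers the image
print("encrypt/decrypt round trip OK")
```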

  1. Going Online the MI Way.

    ERIC Educational Resources Information Center

    Feldt, Jill

    This booklet describes online searching using Materials Information, a metallurgy and metals science information service of the Institute of Metals in London and ASM International in Cleveland, Ohio, which is available through the major online vendors. Described in detail are online searching, online databases, costs, online hosts or vendors,…

  2. Genomic signatures in microbes -- properties and applications.

    PubMed

    Bohlin, Jon

    2011-03-22

    The ratio of genomic oligonucleotide frequencies relative to the mean genomic AT/GC content has been shown to be similar for closely related species and, therefore, said to reflect a "genomic signature". The genomic signature has been found to be more similar within genomes than between closely related genomes. Furthermore, genomic signatures of closely related organisms are, in turn, more similar than those of more distantly related organisms. Since the genomic signature is remarkably stable within a genome, it can be extracted from only a fraction of the genomic DNA sequence. Genomic signatures, therefore, have many applications. The most notable examples include recognition of pathogenicity islands in microbial genomes and identification of hosts from arbitrary DNA sequences, the latter being of great importance in metagenomics. What shapes the genomic signature in microbial DNA has been widely discussed, but is difficult to pinpoint exactly. Most attempts so far have mainly focused on correlations from in silico data. This mini-review seeks to summarize possible influences shaping the genomic signature and to survey a set of applications.
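
    A minimal sketch of one way to compute and compare such signatures: k-mer frequencies normalised by the frequencies expected from base composition alone. The normalisation, choice of k and toy sequences are assumptions rather than the exact definition used in the review.

```python
from collections import Counter
from itertools import product

def genomic_signature(sequence, k=4):
    """Relative k-mer frequencies: observed frequency divided by the frequency
    expected from the genomic base composition alone (zero-order model)."""
    sequence = sequence.upper()
    base_freq = {b: max(sequence.count(b), 1) / len(sequence) for b in "ACGT"}
    counts = Counter(sequence[i:i + k] for i in range(len(sequence) - k + 1))
    total = sum(counts.values())
    signature = {}
    for kmer in map("".join, product("ACGT", repeat=k)):
        expected = 1.0
        for base in kmer:
            expected *= base_freq[base]
        signature[kmer] = (counts.get(kmer, 0) / total) / expected
    return signature

def signature_distance(sig_a, sig_b):
    """Mean absolute difference over all k-mers (smaller means more similar)."""
    return sum(abs(sig_a[k] - sig_b[k]) for k in sig_a) / len(sig_a)

# Toy comparison of two short, artificial sequences
a = "ATGCGTACGTTAGCGCGTATATCGCGAATTCGCGTACGATCG" * 50
b = "ATGCAAATTTGCGCATATATTTAAACGCGCGTATAGCATTGC" * 50
print(signature_distance(genomic_signature(a), genomic_signature(b)))
```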

  3. SIRUS spectral signature analysis code

    NASA Astrophysics Data System (ADS)

    Bishop, Gary J.; Caola, Mike J.; Geatches, Rachel M.; Roberts, Nick C.

    2003-09-01

    The Advanced Technology Centre (ATC) is responsible for developing IR signature prediction capabilities for its parent body, BAE SYSTEMS. To achieve this, the SIRUS code has been developed and used on a variety of projects for well over a decade. SIRUS is capable of providing accurate IR predictions for air breathing and rocket motor propelled vehicles. SIRUS models various physical components to derive its predictions. A key component is the radiance reflected from the surface of the modeled vehicle. This is modeled by fitting parameters to the measured Bi-Directional Reflectance Function (BDRF) of the surface material(s). The ATC have successfully implemented a parameterization scheme based on the published OPTASM model, and this is described. However, inconsistencies between reflectance measurements and values calculated from the parameterized fit have led to an elliptical parameter enhancement. The implementation of this is also described. Finally, an end-to-end measurement-parameterization capability is described, based on measurements taken with SOC600 instrumentation.

  4. Experimental signatures of quantum annealing

    NASA Astrophysics Data System (ADS)

    Boixo, Sergio

    2013-03-01

    Quantum annealing is a general strategy for solving optimization problems with the aid of quantum adiabatic evolution. How effective is rapid decoherence in precluding quantum effects in a quantum annealing experiment, and will engineered quantum annealing devices effectively perform classical thermalization when coupled to a decohering thermal environment? Using the D-Wave machine, we report experimental results for a simple problem which takes advantage of the fact that for quantum annealing the measurement statistics are determined by the energy spectrum along the quantum evolution, while in classical thermalization they are determined by the spectrum of the final Hamiltonian only. We establish an experimental signature which is consistent with quantum annealing, and at the same time inconsistent with classical thermalization, in spite of a decoherence timescale which is orders of magnitude shorter than the adiabatic evolution time. For larger and more difficult problems, we compare the measurement statistics of the D-Wave machine to large-scale numerical simulations of simulated annealing and simulated quantum annealing, implemented through classical and quantum Monte Carlo simulations. For our test cases, the statistics of the machine are - within calibration uncertainties - indistinguishable from a simulated quantum annealer with suitably chosen parameters, but significantly different from a classical annealer. Work in collaboration with T. Albash, N. Chancellor, S. Isakov, D. Lidar, T. Roennow, F. Spedalieri, M. Troyer and Z. Wang.

  5. (abstract) Topographic Signatures in Geology

    NASA Technical Reports Server (NTRS)

    Farr, Tom G.; Evans, Diane L.

    1996-01-01

    Topographic information is required for many Earth Science investigations. For example, topography is an important element in regional and global geomorphic studies because it reflects the interplay between the climate-driven processes of erosion and the tectonic processes of uplift. A number of techniques have been developed to analyze digital topographic data, including Fourier texture analysis. A Fourier transform of the topography of an area allows the spatial frequency content of the topography to be analyzed. Band-pass filtering of the transform produces images representing the amplitude of different spatial wavelengths. These are then used in a multi-band classification to map units based on their spatial frequency content. The results using a radar image instead of digital topography showed good correspondence to a geologic map; however, brightness variations in the image unrelated to topography caused errors. An additional benefit of the use of Fourier band-pass images for the classification is that the textural signatures of the units are quantitative measures of the spatial characteristics of the units that may be used to map similar units in similar environments.
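
    The band-pass texture idea can be sketched with a 2-D FFT: filter the transform into spatial-wavelength bands and use the resulting amplitude images as classification features. The band edges, image size and names below are illustrative assumptions, not the study's parameters.

```python
import numpy as np

def bandpass_amplitude(dem, low, high):
    """Amplitude image for one spatial-frequency band of a DEM via the FFT."""
    F = np.fft.fftshift(np.fft.fft2(dem))
    ny, nx = dem.shape
    ky, kx = np.mgrid[-(ny // 2):ny - ny // 2, -(nx // 2):nx - nx // 2]
    radius = np.hypot(kx / nx, ky / ny)          # normalised spatial frequency
    mask = (radius >= low) & (radius < high)
    return np.abs(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def texture_features(dem, bands=((0.0, 0.05), (0.05, 0.15), (0.15, 0.5))):
    """Stack of band-pass amplitude images, usable as inputs to a classifier."""
    return np.stack([bandpass_amplitude(dem, lo, hi) for lo, hi in bands], axis=-1)

# Toy usage: a random surface standing in for a DEM or radar image
rng = np.random.default_rng(6)
dem = rng.normal(size=(128, 128))
print(texture_features(dem).shape)   # (128, 128, 3)
```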

  6. Signature geometry and quantum engineering

    NASA Astrophysics Data System (ADS)

    Samociuk, Stefan

    2013-09-01

    As the operating frequency of electromagnetic-based devices increases, physical design geometry is playing an ever more important role. Evidence is considered in support of a relationship between the dimensionality of primitive geometric forms, such as transistors, and corresponding electromagnetic coupling efficiency. The industry of electronics is defined as the construction of devices by the patterning of primitive forms onto physical materials. Examples are given to show that the evolution of these primitives, down to nano scales, requires exacting geometry and three dimensional content. Consideration of microwave monolithic integrated circuits (MMIC), photonics and metamaterials (MM) supports this trend and also adds new requirements of strict geometric periodicity and multiplicity. Signature geometries (SG) are characterized by distinctive attributes and examples are given. The transcendent form transcode algorithm (TTA) is introduced as a multi dimensional SG and its use in designing photonic integrated circuits and metamaterials is discussed. A creative commons licensed research database, TRANSFORM, containing TTA geometries in OASIS file formats is described. An experimental methodology for using the database is given. Multidimensional SG and the extraction of three dimensional cross sections as primitive forms are discussed as a foundation for quantum engineering and the exploitation of phenomena other than the electromagnetic.

  7. Molecular signatures of vaccine adjuvants.

    PubMed

    Olafsdottir, Thorunn; Lindqvist, Madelene; Harandi, Ali M

    2015-09-29

    Mass vaccination has saved millions of human lives and improved the quality of life in both developing and developed countries. The emergence of new pathogens and the inadequate protection conferred by some of the existing vaccines, such as vaccines for tuberculosis, influenza and pertussis, especially in certain age groups, have resulted in a move from empirically developed vaccines toward more pathogen-tailored and rationally engineered vaccines. A deeper understanding of the interaction of innate and adaptive immunity at the molecular level enables the development of vaccines that selectively target certain types of immune responses without excessive reactogenicity. Adjuvants constitute an imperative element of modern vaccines. Although a variety of candidate adjuvants have been evaluated in the past few decades, only a limited number of vaccine adjuvants are currently available for human use. A better understanding of the mode of action of adjuvants is pivotal to harness the potential of existing and new adjuvants in shaping a desired immune response. Recent advancement in systems biology, powered by emerging cutting-edge omics technology, has led to the identification of molecular signatures in the blood, rapidly induced after vaccination, that correlate with and predict a later protective immune response or vaccine safety. This can pave the way to prospectively determining the potency and safety of vaccines and adjuvants. This review is intended to highlight the importance of big data analysis in advancing our understanding of the mechanisms of action of adjuvants to inform rational development of future human vaccines. PMID:25989447

  8. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  9. Timing signatures of large scale solar eruptions

    NASA Astrophysics Data System (ADS)

    Balasubramaniam, K. S.; Hock-Mysliwiec, Rachel; Henry, Timothy; Kirk, Michael S.

    2016-05-01

    We examine the timing signatures of large solar eruptions resulting in flares, CMEs and Solar Energetic Particle events. We probe solar active regions from the chromosphere through the corona, using data from space and ground-based observations, including ISOON, SDO, GONG, and GOES. Our studies include a number of flares and CMEs of mostly the M- and X-strengths as categorized by GOES. We find that the chromospheric signatures of these large eruptions occur 5-30 minutes in advance of coronal high temperature signatures. These timing measurements are then used as inputs to models and reconstruct the eruptive nature of these systems, and explore their utility in forecasts.

  10. Arbitrated quantum signature with an untrusted arbitrator

    NASA Astrophysics Data System (ADS)

    Yang, Yu-Guang; Zhou, Zheng; Teng, Yi-Wei; Wen, Qiao-Yan

    2011-02-01

    In an arbitrated signature scheme, all communications involve a so-called arbitrator who has access to the contents of the messages. The security of most arbitrated signature schemes depends heavily on the trustworthiness of the arbitrators. In this paper we show how to construct an arbitrated quantum signature protocol of classical messages with an untrusted arbitrator. Its security is analyzed and it is proved to be secure even if the arbitrator is compromised. In addition, the proposed protocol does not require a direct quantum link between any two communicating users, which is an appealing advantage in the implementation of a practical quantum distributed communication network.

  11. Improved Quantum Signature Scheme with Weak Arbitrator

    NASA Astrophysics Data System (ADS)

    Su, Qi; Li, Wen-Min

    2013-09-01

    In this paper, we present a man-in-the-middle attack on the quantum signature scheme with a weak arbitrator (Luo et al., Int. J. Theor. Phys., 51:2135, 2012). In that scheme, the authors proposed a quantum signature based on a quantum one-way function which contains both a phase for verifying the signer and a phase for verifying the signed message. However, our analysis shows that Eve can adopt different strategies in the respective phases to forge the signature without being detected. We then present an improved scheme to increase the security.

  12. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  13. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  14. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  15. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  16. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  17. 45 CFR 1626.6 - Verification of citizenship.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 45 Public Welfare 4 2014-10-01 2014-10-01 false Verification of citizenship. 1626.6 Section 1626.6... ON LEGAL ASSISTANCE TO ALIENS § 1626.6 Verification of citizenship. (a) A recipient shall require all... require verification of citizenship. A recipient shall not consider factors such as a person's...

  18. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  19. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  20. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  1. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  2. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  3. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 8 Aliens and Nationality 1 2012-01-01 2012-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  4. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 8 Aliens and Nationality 1 2011-01-01 2011-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  5. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 8 Aliens and Nationality 1 2014-01-01 2014-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  6. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 8 Aliens and Nationality 1 2013-01-01 2013-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  7. 8 CFR 343b.5 - Verification of naturalization.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 8 Aliens and Nationality 1 2010-01-01 2010-01-01 false Verification of naturalization. 343b.5... CERTIFICATE OF NATURALIZATION FOR RECOGNITION BY A FOREIGN STATE § 343b.5 Verification of naturalization. The application shall not be granted without first obtaining verification of the applicant's naturalization....

  8. 37 CFR 261.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    .... This section prescribes general rules pertaining to the verification by any Copyright Owner or... section shall apply to situations where a Copyright Owner or a Performer and a Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner or a...

  9. 40 CFR 1065.920 - PEMS calibrations and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... test conditions. As provided in 40 CFR 1068.5, we will deem your system to not meet the requirements of... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PEMS calibrations and verifications....920 PEMS calibrations and verifications. (a) Subsystem calibrations and verifications. Use all...

  10. Quantification, Prediction, and the Online Impact of Sentence Truth-Value: Evidence from Event-Related Potentials

    ERIC Educational Resources Information Center

    Nieuwland, Mante S.

    2016-01-01

    Do negative quantifiers like "few" reduce people's ability to rapidly evaluate incoming language with respect to world knowledge? Previous research has addressed this question by examining whether online measures of quantifier comprehension match the "final" interpretation reflected in verification judgments. However, these…

  11. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

    We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision. It is essential to effectively monitor the target to ensure that the tumor is within the beam aperture. We modeled the treatment verification problem as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during the treatment into corresponding classes—with the tumor inside or outside of the beam aperture. Training samples were generated for the ANN using digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location—to simulate cine EPID images with different tumor locations. Principal component analysis (PCA) was used to reduce the dimensionality of the training samples and cine EPID images acquired during the treatment. The proposed treatment verification algorithm was tested on five hypofractionated lung patients in a retrospective fashion. On average, our proposed algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.
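
    A rough sketch of the classification pipeline described above (PCA for dimensionality reduction, then a small neural network classifying frames as tumour inside or outside the aperture), using scikit-learn and synthetic stand-in data; all sizes, parameters and the crude class separation are assumptions.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(7)

# Stand-in training data: DRR-like images with the tumour shifted in or out of
# the aperture (labels 1 = inside, 0 = outside); all sizes are illustrative.
n_train, n_pix = 400, 64 * 64
X_train = rng.normal(size=(n_train, n_pix))
y_train = rng.integers(0, 2, size=n_train)
X_train[y_train == 1, :200] += 2.0             # crude class separation for the toy data

# Reduce dimensionality with PCA, then train the ANN classifier
pca = PCA(n_components=20).fit(X_train)
clf = MLPClassifier(hidden_layer_sizes=(32,), max_iter=500, random_state=0)
clf.fit(pca.transform(X_train), y_train)

# Classify incoming cine EPID frames (here: more synthetic frames)
X_cine = rng.normal(size=(10, n_pix))
X_cine[:, :200] += 2.0                          # simulate "tumour inside" frames
print(clf.predict(pca.transform(X_cine)))       # mostly 1s expected
```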

  12. Online Collaboration: Curriculum Unbound!

    ERIC Educational Resources Information Center

    Waters, John K.

    2007-01-01

    Freed from the nuisances of paper-based methods, districts are making creative use of digital tools to move their curricular documents online, where educators can collaborate on course development and lesson planning. Back in 2003, Amarillo Independent School District (Texas) had begun using the Blackboard Content System to provide lessons online.…

  13. Online Database Searching Workbook.

    ERIC Educational Resources Information Center

    Littlejohn, Alice C.; Parker, Joan M.

    Designed primarily for use by first-time searchers, this workbook provides an overview of online searching. Following a brief introduction which defines online searching, databases, and database producers, five steps in carrying out a successful search are described: (1) identifying the main concepts of the search statement; (2) selecting a…

  14. Online Videoconferencing Products: Update

    ERIC Educational Resources Information Center

    Burton, Douglas; Kitchen, Tim

    2011-01-01

    Software allowing real-time online video connectivity is rapidly evolving. The ability to connect students, staff, and guest speakers instantaneously carries great benefits for the online distance education classroom. This evaluation report compares four software applications at opposite ends of the cost spectrum: "DimDim", "Elluminate VCS",…

  15. ALICAT. The Online Catalog.

    ERIC Educational Resources Information Center

    Ok Park, Hye, Ed.; And Others

    This guide for users of the Adelphi University Libraries provides instructions for accessing the bibliographic records of the libraries' holdings, which have been stored online since 1968. The steps necessary to search the Adelphi Libraries Catalog Online (ALICAT) by author, title, subject, or call number are explained using text and…

  16. Serving the Online Learner

    ERIC Educational Resources Information Center

    Boettcher, Judith V.

    2007-01-01

    Systems and services for recruiting, advising, and support of online students have seldom been at the top of the list when planning online and distance learning programs. That is now changing: Forces pushing advising and support services into the foreground include recognition of the student learner as "customer" and the increasing expectations…

  17. Why Teach Online.

    ERIC Educational Resources Information Center

    Kilian, Crawford

    1997-01-01

    Lists seven characteristics of online instruction, and discusses related questions. Topics include costs, increased usage of educational technology, access to information/communication for isolated groups, the present text-based system and technological change, the pattern of hypertext, effectiveness of online instruction, and the student/teacher…

  18. Teaching Astronomy Online

    NASA Astrophysics Data System (ADS)

    Maddison, Sarah T.; Mazzolini, Margaret M.

    Swinburne Astronomy Online (SAO) is a fully online graduate astronomy program with students and instructors located in over 30 countries around the globe. SAO uses a hybrid online form of delivery, with image- and animation-rich course content provided on CD-ROMs and the Internet used for communication, research and assessment purposes. Now in its ninth semester and continuing to grow, SAO can be considered a 'success story' in new teaching methods and used as an example for online education programs. One of the key distinguishing features of online education as compared to other forms of distance education is the opportunity for instructors and students to interact via online asynchronous discussion forums. Asynchronous discussion forums are used to varying degrees in different online academic programs and in widely different ways. In this paper we give a historical overview of SAO and how it operates, what astronomy we teach online and why, focusing specifically on the use of asynchronous discussion forums, which are a central feature of the SAO program, as a learning and teaching tool.

  19. Authoritative Online Editions

    ERIC Educational Resources Information Center

    Benton, Thomas H.

    2007-01-01

    In this article, the author discusses how it is now very easy for anyone to find formerly hard-to-find books such as the works of Walt Whitman with the help of online booksellers. The author also describes the efforts made by various institutions to produce online editions of the works of major writers. One such prominent project is the archive…

  20. Nonbibliographic Databases Online.

    ERIC Educational Resources Information Center

    Online Review, 1978

    1978-01-01

    A directory of 246 nonbibliographic data bases, which are also known as data banks and numeric data bases. Entries are arranged alphabetically by name of data base, followed by name of data base producer, subject content, and online vendor. This directory updates the listing published in Vol. 1, No. 4 of On-Line Review. (JPF)

  1. Classroom versus Online Assessment

    ERIC Educational Resources Information Center

    Spivey, Michael F.; McMillan, Jeffrey J.

    2014-01-01

    The authors examined students' effort and performance using online versus traditional classroom testing procedures. The instructor and instructional methodology were the same in different sections of an introductory finance class. Only the procedure in which students were tested--online versus in the classroom--differed. The authors measured…

  2. Adolescent Online Victimization

    ERIC Educational Resources Information Center

    Young, Adena; Young, Atrisha; Fullwood, Harry

    2007-01-01

    Online victimization is a concern among many who work with youth. This article reviews the latest research on online victimization and then promotes honest dialogue, personal responsibility of the youth, and proper reporting actions as strategies to reduce this type of victimization. (Contains 1 figure and 1 table.)

  3. Taking Information Literacy Online.

    ERIC Educational Resources Information Center

    Levesque, Carla

    2003-01-01

    Explores the process of designing, teaching, and revising an online information literacy course at St. Petersburg College (SPC) (Florida). Shares methods for encouraging participation in online courses and ways of tracking students' progress. Reports that basic computer information and literacy is now a graduation requirement at SPC. Contains…

  4. Online Higher Education Commodity

    ERIC Educational Resources Information Center

    Chau, Paule

    2010-01-01

    This article analyzes the current trend towards online education. It examines some of the reasons for the trend and the ramifications it may have on students, faculty and institutions of higher learning. The success and profitability of online programs and institutions such as the University of Phoenix has helped to make the move towards online…

  5. Online Advertising in Social Networks

    NASA Astrophysics Data System (ADS)

    Bagherjeiran, Abraham; Bhatt, Rushi P.; Parekh, Rajesh; Chaoji, Vineet

    Online social networks offer opportunities to analyze user behavior and social connectivity and leverage resulting insights for effective online advertising. This chapter focuses on the role of social network information in online display advertising.

  6. A feature based comparison of pen and swipe based signature characteristics.

    PubMed

    Robertson, Joshua; Guest, Richard

    2015-10-01

    Dynamic Signature Verification (DSV) is a biometric modality that identifies anatomical and behavioral characteristics when an individual signs their name. Conventionally, signature data have been captured using pen/tablet apparatus. However, the use of other devices such as touch-screen tablets has expanded in recent years, affording the possibility of assessing biometric interaction on this new technology. To explore the potential of employing DSV techniques when a user signs or swipes with their finger, we report a study correlating pen-generated and finger-generated features. Investigating the stability and correlation between a set of characteristic features recorded in participants' signatures and touch-based swipe gestures, a statistical analysis was conducted to assess consistency between capture scenarios. The results indicate that a range of static and dynamic features, such as the rate of jerk, size, duration and the distance the pen traveled, can support interoperability between these two input methods within a potential biometric context. The data suggest, as a general principle, that the same underlying constructional mechanisms are evident in both capture scenarios. PMID:26097008
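    For orientation, the kinds of dynamic features compared in this study (duration, size, path length, rate of jerk) can be computed directly from a time-stamped pen or finger trajectory. The sketch below is a generic illustration over (x, y, t) samples, not the feature definitions used by the authors.

        # Illustrative dynamic-feature extraction from a sampled signature
        # trajectory; x, y are coordinates and t the timestamps in seconds.
        import numpy as np

        def signature_features(x, y, t):
            x, y, t = map(np.asarray, (x, y, t))
            vx, vy = np.gradient(x, t), np.gradient(y, t)    # velocity components
            ax, ay = np.gradient(vx, t), np.gradient(vy, t)  # acceleration components
            jx, jy = np.gradient(ax, t), np.gradient(ay, t)  # jerk components
            speed = np.hypot(vx, vy)
            return {
                "duration": t[-1] - t[0],                    # total signing time
                "width": x.max() - x.min(),                  # size proxies
                "height": y.max() - y.min(),
                "path_length": np.sum(np.hypot(np.diff(x), np.diff(y))),  # distance traveled
                "mean_speed": speed.mean(),
                "mean_jerk": np.hypot(jx, jy).mean(),        # rate-of-jerk proxy
            }

    Feature vectors of this kind, computed once from pen data and once from finger-swipe data, are what a correlation analysis of the sort reported here would compare.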

  7. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES... this chapter, the State must obtain the information through that service. (h) Interaction with...

  8. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES...) Interaction with program integrity requirements. Nothing in this section should be construed as limiting...

  9. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  10. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  11. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  12. Synthesis, verification, and optimization of systolic arrays

    SciTech Connect

    Rajopadhye, S.V.

    1986-01-01

    This dissertation provides a sound theoretical basis for three important aspects of systolic arrays, namely synthesis, verification, and optimization. Former research has concentrated on analysis of the dependency structure of the computation, and there have been numerous approaches to map this dependency structure onto a locally interconnected network. This study pursues a similar approach, but with a major generalization of the class of problems analyzed. In earlier research, it was essential that the dependencies were expressible as constant vectors (from a point in the domain to the points that it depended on); here they are permitted to be arbitrary linear functions of the point. Theory for synthesizing systolic architectures from such generalized specifications is developed. A systematic (mechanizable) approach to the synthesis of systolic architectures that have control signals is also presented. In the areas of verification and optimization, a rigorous mathematical framework is presented that permits reasoning about the behavior of systolic arrays as functions on streams of data. Using this approach, the verification of such architectures reduces to the problem of verifying functional programs.
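    As a concrete illustration of the dependency-based synthesis this dissertation generalizes, consider the uniform recurrence for matrix multiplication; the formulation below is the standard space-time mapping view, written in my own notation rather than the dissertation's.

        \[
        \begin{aligned}
        C(i,j,k) &= C(i,j,k-1) + A(i,j,k)\,B(i,j,k), & d_C &= (0,0,1),\\
        A(i,j,k) &= A(i,j-1,k),                      & d_A &= (0,1,0),\\
        B(i,j,k) &= B(i-1,j,k),                      & d_B &= (1,0,0).
        \end{aligned}
        \]
        A linear schedule $\tau(p)=\lambda^{T}p$ and an allocation (projection) $\sigma(p)$
        give a valid systolic mapping when $\lambda^{T}d \ge 1$ for every dependency $d$ and
        the combined map $(\tau,\sigma)$ is injective on the domain; e.g. $\lambda=(1,1,1)$
        with $\sigma(i,j,k)=(i,j)$ yields the familiar 2-D array in which each $C(i,j)$ stays
        resident in its cell while $A$ and $B$ values flow between neighboring cells.

    In the generalized setting studied in the dissertation, the dependency d may itself be an affine function of the point p rather than a constant vector, which is what the new synthesis theory accommodates.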

  13. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of their behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.
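    To make the contrast with testing concrete, the sketch below shows the core of explicit-state model checking: an exhaustive breadth-first exploration of a small transition system that either proves a safety property over all reachable states or returns a counterexample trace. It is a toy illustration under my own assumptions, not the V&V tooling used in this work.

        # Toy explicit-state safety model checker: exhaustively explores all
        # reachable states (unlike testing, which samples a few executions).
        from collections import deque

        def check_safety(initial, successors, is_bad):
            """Return None if no reachable state satisfies is_bad, else a
            counterexample trace from the initial state to a bad state."""
            parent = {initial: None}
            queue = deque([initial])
            while queue:
                state = queue.popleft()
                if is_bad(state):
                    trace = []
                    while state is not None:   # rebuild the counterexample
                        trace.append(state)
                        state = parent[state]
                    return list(reversed(trace))
                for nxt in successors(state):
                    if nxt not in parent:      # visit each state exactly once
                        parent[nxt] = state
                        queue.append(nxt)
            return None                        # property holds on all reachable states

        # Example: a 2-bit counter that must never reach the value 3.
        trace = check_safety(0, lambda s: [(s + 1) % 4], lambda s: s == 3)
        print(trace)  # [0, 1, 2, 3] -- a counterexample, so the property fails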

  14. The ALMA Commissioning and Science Verification Team

    NASA Astrophysics Data System (ADS)

    Hales, A.; Sheth, K.; Wilson, T. L.

    2010-04-01

    The goal of Commissioning is to take ALMA from the stage reached at the end of AIV, that is, a system that functions at an engineering level, to an instrument that meets the science/astronomy requirements. Science Verification is the quantitative confirmation that the data produced by the instrument are valid and have the required characteristics in terms of sensitivity, image quality and accuracy.

  15. The Assembly, Integration, and Verification (AIV) team

    NASA Astrophysics Data System (ADS)

    2009-06-01

    Assembly, Integration, and Verification (AIV) is the process by which the software and hardware deliveries from the distributed ALMA partners (North America, South America, Europe, and East Asia) are assembled and integrated into a working system, and the initial technical capabilities are tested to ensure that they will meet the observatory's exacting requirements for science.

  16. An Interactive System for Graduation Verification.

    ERIC Educational Resources Information Center

    Wang, Y.; Dasarathy, B.

    1981-01-01

    This description of a computerized graduation verification system developed and implemented at the University of South Carolina at Columbia discusses the "table-driven" feature of the programs and details the implementation of the system, including examples of the Extended Backus Naur Form (EBNF) notation used to represent the system language…

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  18. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  19. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  20. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  1. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  2. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  3. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  4. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  5. Needs Assessment Project: FY 82 Verification Study.

    ERIC Educational Resources Information Center

    Shively, Joe E.; O'Donnell, Phyllis

    As part of a continuing assessment of educational needs in a seven-state region, researchers conducted a verification study to check the validity of educational needs first identified in fiscal year (FY) 1980. The seven states comprise Alabama, Kentucky, Ohio, Pennsylvania, Tennessee, Virginia, and West Virginia. This report describes assessment…

  6. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  7. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  8. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

    The goal of prognostics and health management (PHM) systems is to ensure system safety, and reduce downtime and maintenance costs. It is important that a PHM system is verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process of verification of such prognostics algorithms. To this end, first, this paper distinguishes between technology maturation and product development. Then, the paper describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be an iterative process where verification activities are interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts this prognostics algorithm. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the prognostics algorithm moves up TRLs.

  9. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT...

  10. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17 Aeronautics and Space COMMERCIAL SPACE TRANSPORTATION, FEDERAL AVIATION ADMINISTRATION, DEPARTMENT...

  11. Environmental Technology Verification (ETV) Quality Program (Poster)

    EPA Science Inventory

    This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...

  12. Hardware verification at Computational Logic, Inc.

    NASA Technical Reports Server (NTRS)

    Brock, Bishop C.; Hunt, Warren A., Jr.

    1990-01-01

    The following topics are covered in viewgraph form: (1) hardware verification; (2) Boyer-Moore logic; (3) core RISC; (4) the FM8502 fabrication, implementation specification, and pinout; (5) hardware description language; (6) arithmetic logic generator; (7) near term expected results; (8) present trends; (9) future directions; (10) collaborations and technology transfer; and (11) technology enablers.

  13. 24 CFR 257.112 - Mortgagee verifications.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... URBAN DEVELOPMENT MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER AUTHORITIES... income. (b) Mortgage fraud verification. The mortgagor shall provide a certification to the mortgagee that the mortgagor has not been convicted under federal or state law for fraud during the...

  14. 24 CFR 257.112 - Mortgagee verifications.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... URBAN DEVELOPMENT MORTGAGE AND LOAN INSURANCE PROGRAMS UNDER NATIONAL HOUSING ACT AND OTHER AUTHORITIES... income. (b) Mortgage fraud verification. The mortgagor shall provide a certification to the mortgagee that the mortgagor has not been convicted under federal or state law for fraud during the...

  15. Gender, Legitimation, and Identity Verification in Groups

    ERIC Educational Resources Information Center

    Burke, Peter J.; Stets, Jan E.; Cerven, Christine

    2007-01-01

    Drawing upon identity theory, expectation states theory, and legitimation theory, we examine how the task leader identity in task-oriented groups is more likely to be verified for persons with high status characteristics. We hypothesize that identity verification will be accomplished more readily for male group members and legitimated task leaders…

  16. Environmental Technology Verification Program Fact Sheet

    EPA Science Inventory

    This is a Fact Sheet for the ETV Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The program ...

  17. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  18. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  19. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  20. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  1. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  2. Teaching Astrobiology Online

    NASA Astrophysics Data System (ADS)

    Maddison, S. T.

    2004-06-01

    As part of Swinburne Astronomy Online http://astronomy.swin.edu.au/sao/ (SAO), we run an online short course entitled Searching for Extrasolar Planets and Extraterrestrial Life. The main aim of the short course is to act as a ``feeder'' into our graduate programs and allow students to trial online education while exploring one of the new hot topics of astronomy -- astrobiology. I will present a brief overview of how SAO works, followed by an outline of our short course which has been running for four semesters. In particular, I will focus on why astrobiology is a good choice of topics for an online short course, and look at the successes (and failures) of the course in attracting students to both online education and astronomy -- and astrobiology in particular.

  3. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    NASA Technical Reports Server (NTRS)

    Javidi, Bahram

    1997-01-01

    Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development and publications in security technology. Some ID cards, credit cards and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially-available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card, photographed or captured with a CCD camera, and a new hologram synthesized using commercially-available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern, which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature. The proposed optical processing device is designed to identify both the random phase mask and the
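    The core idea, that a random phase mask is invisible to an intensity-sensitive detector yet still verifiable by a phase-sensitive correlator, can be checked numerically. The sketch below is a simplified numerical illustration under my own assumptions, not the optical processor described in this record.

        # A random phase mask leaves the intensity that a CCD records unchanged,
        # but a phase-sensitive matched filter still distinguishes the authentic
        # mask from a forged one.
        import numpy as np

        rng = np.random.default_rng(0)
        amplitude = rng.random((64, 64))          # primary ID pattern (e.g., fingerprint)
        phase_mask = np.exp(1j * 2 * np.pi * rng.random((64, 64)))  # bonded random phase mask
        encoded = amplitude * phase_mask          # phase/amplitude card pattern

        # An intensity detector sees |encoded|^2, identical to |amplitude|^2:
        print(np.allclose(np.abs(encoded) ** 2, amplitude ** 2))    # True

        # Phase-sensitive verification correlates against the stored conjugate mask:
        def score(pattern, mask):
            return np.abs(np.vdot(pattern * np.conj(mask), amplitude))

        forged_mask = np.exp(1j * 2 * np.pi * rng.random((64, 64)))
        print(score(encoded, phase_mask) > score(encoded, forged_mask))  # True

    The first check shows why copying from a CCD image loses the mask; the second shows why only the holder of the correct conjugate phase reference obtains a high correlation score.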

  4. Isotopic signatures by bulk analyses

    SciTech Connect

    Efurd, D.W.; Rokop, D.J.

    1997-12-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally.

  5. ACCRETING CIRCUMPLANETARY DISKS: OBSERVATIONAL SIGNATURES

    SciTech Connect

    Zhu, Zhaohuan

    2015-01-20

    I calculate the spectral energy distributions of accreting circumplanetary disks using atmospheric radiative transfer models. Circumplanetary disks only accreting at 10^-10 M_⊙ yr^-1 around a 1 M_J planet can be brighter than the planet itself. A moderately accreting circumplanetary disk (Ṁ ~ 10^-8 M_⊙ yr^-1; enough to form a 10 M_J planet within 1 Myr) around a 1 M_J planet has a maximum temperature of ~2000 K, and at near-infrared wavelengths (J, H, K bands), this disk is as bright as a late-M-type brown dwarf or a 10 M_J planet with a "hot start". To use direct imaging to find the accretion disks around low-mass planets (e.g., 1 M_J) and distinguish them from brown dwarfs or hot high-mass planets, it is crucial to obtain photometry at mid-infrared bands (L', M, N bands) because the emission from circumplanetary disks falls off more slowly toward longer wavelengths than those of brown dwarfs or planets. If young planets have strong magnetic fields (≳100 G), fields may truncate slowly accreting circumplanetary disks (Ṁ ≲ 10^-9 M_⊙ yr^-1) and lead to magnetospheric accretion, which can provide additional accretion signatures, such as UV/optical excess from the accretion shock and line emission.

  6. Analysis of multispectral signatures of the shot

    NASA Astrophysics Data System (ADS)

    Kastek, Mariusz; Dulski, Rafał; Piątkowski, Tadeusz; Madura, Henryk; Bareła, Jarosław; Polakowski, Henryk

    2011-06-01

    The paper presents some practical aspects of sniper IR signature measurements. Signatures of sniper shots in typical scenarios are described, taking into consideration sniper activities in open areas as well as in urban environments. The measurements were made at a field test ground, and high-precision laboratory measurements were also performed. Several infrared cameras were used during the measurements to cover all measurement assumptions; some of the cameras are measurement-class devices with high accuracy and frame rates. Registrations were made simultaneously in the UV, NWIR, SWIR and LWIR spectral bands, and the infrared cameras can be fitted with optical filters for multispectral measurement. An ultra-fast visual camera was also used for visible-spectrum registration. Exemplary sniper IR signatures for typical situations are presented. The LWIR imaging spectroradiometer HyperCam was also used during the laboratory measurements and field experiments; the signatures it collected were useful for determining the spectral characteristics of the shot.

  7. Secure quantum signatures using insecure quantum channels

    NASA Astrophysics Data System (ADS)

    Amiri, Ryan; Wallden, Petros; Kent, Adrian; Andersson, Erika

    2016-03-01

    Digital signatures are widely used in modern communication to guarantee authenticity and transferability of messages. The security of currently used classical schemes relies on computational assumptions. We present a quantum signature scheme that does not require trusted quantum channels. We prove that it is unconditionally secure against the most general coherent attacks, and show that it requires the transmission of significantly fewer quantum states than previous schemes. We also show that the quantum channel noise threshold for our scheme is less strict than for distilling a secure key using quantum key distribution. This shows that "direct" quantum signature schemes can be preferable to signature schemes relying on secret shared keys generated using quantum key distribution.

  8. 42 CFR 424.36 - Signature requirements.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... beneficiary's legal guardian. (2) A relative or other person who receives social security or other... Part B may be signed by the entity on the beneficiary's behalf. (e) Acceptance of other signatures...

  9. 42 CFR 424.36 - Signature requirements.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... beneficiary's legal guardian. (2) A relative or other person who receives social security or other... Part B may be signed by the entity on the beneficiary's behalf. (e) Acceptance of other signatures...

  10. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  11. Experimental demonstration of photonic quantum digital signatures

    NASA Astrophysics Data System (ADS)

    Collins, Robert J.; Clarke, Patrick J.; Dunjko, Vedran; Andersson, Erika; Jeffers, John; Buller, Gerald S.

    2012-09-01

    Digital signature schemes are often used in interconnected computer networks to verify the origin and authenticity of messages. Current classical digital signature schemes based on so-called "one-way functions" rely on computational complexity to provide security over sufficiently long timescales. However, there are currently no mathematical proofs that such functions will always be computationally complex. Quantum digital signatures offer a means of confirming both the origin and authenticity of a message, with security guaranteed by information-theoretic limits. The message cannot be forged or repudiated. We have constructed, tested and analyzed the security of what is, to the best of our knowledge, the first example of an experimental quantum digital signature system.

  12. Microbial Signatures In Sulfate-Rich Playas

    NASA Astrophysics Data System (ADS)

    Glamoclija, M.; Steele, A.; Starke, V.; Zeidan, M.; Potochniak, S.; Sirisena, K.; Widanagamage, I. H.

    2016-05-01

    Microbes that live in playas represent organisms able to cope with transient environments, ranging from fresh to hyper-saline water settings and from wet to dry. We will try to identify mineral and chemical signatures of their presence.

  13. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  14. Connecting to On-line Data, a Progress Report

    NASA Astrophysics Data System (ADS)

    Eichhorn, G.; Astrophysics Datacenter Executive Committee (ADEC)

    2004-12-01

    The Astrophysics Datacenter Executive Committee (ADEC) has worked with the American Astronomical Society (AAS) and the University of Chicago Press (UChP) to implement links from the on-line literature to on-line data and vice versa. A first demonstration of this system is on-line in the Astrophysical Journal Supplement, Volume 154, Issue 1, a special issue about first results from Spitzer. Several of these on-line articles have links to on-line data. This linking system requires the collaboration of the data centers (marking data sets with unique identifiers, providing a verification system for identifiers, providing a systematic linking system to data sets), the ADS (providing a master verifier that connects the journal to the individual verifiers at the data centers, providing a linking server that allows stable links for the journals even if data sets move), and the AAS and the UChP (implementing LaTeX tags for identifiers, processing and verifying identifiers, implementing the links). Once the links are in place at the journal website, the publisher returns this information to the ADS and from there to the data centers in order to provide the data centers the information necessary to implement the opposite links from data sets to journal articles. The pipeline for this information flow is now fully in place and will be described in this poster. This work is supported by NASA under several grants.

  15. Talking Online: Reflecting on Online Communication Tools

    ERIC Educational Resources Information Center

    Greener, Susan

    2009-01-01

    Purpose: The purpose of this paper is to reflect on the value and constraints of varied online communication tools from web 2.0 to e-mail in a higher education (HE) teaching and learning context, where these tools are used to support or be the main focus of learning. Design/methodology/approach: A structured reflection is produced with the aid of…

  16. Signature scheme based on bilinear pairs

    NASA Astrophysics Data System (ADS)

    Tong, Rui Y.; Geng, Yong J.

    2013-03-01

    An identity-based signature scheme is proposed using bilinear pairing technology. The scheme uses the user's identity information, such as an email address, IP address, or telephone number, as the public key, so it removes the cost of building and managing a public key infrastructure; by adopting the CL-PKC framework to generate the user's private key, it also avoids the problem of the private key generating center forging signatures.
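    For context, a representative pairing-based identity-based signature (in the style of the Cha-Cheon scheme; the notation below is mine and is not necessarily the exact scheme proposed in this record) works over groups with a bilinear map as follows.

        Let $e: G_1 \times G_1 \to G_2$ be a bilinear pairing with generator $P$,
        master secret $s$, public key $P_{pub} = sP$, and hash functions
        $H_1: \{0,1\}^* \to G_1$, $H_2: \{0,1\}^* \times G_1 \to \mathbb{Z}_q$.
        Key extraction: $Q_{ID} = H_1(ID)$ and private key $S_{ID} = s\,Q_{ID}$.
        Signing $m$: pick random $r$, set $U = r\,Q_{ID}$, $h = H_2(m, U)$,
        $V = (r + h)\,S_{ID}$; the signature is $(U, V)$.
        Verification: accept iff
        \[
        e(P, V) \;=\; e\bigl(P_{pub},\; U + h\,Q_{ID}\bigr),
        \]
        which holds because $e(P, (r+h)\,s\,Q_{ID}) = e(sP, (r+h)\,Q_{ID})$.

    The point of the identity-based construction is visible in the verification equation: the verifier needs only the signer's identity string and the system parameters, with no per-user certificate.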

  17. Research Plan for Fire Signatures and Detection

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Viewgraphs on the prevention, suppression, and detection of fires aboard a spacecraft are presented. The topics include: 1) Fire Prevention, Detection, and Suppression Sub-Element Products; 2) FPDS Organizing Questions; 3) FPDS Organizing Questions; 4) Signatures, Sensors, and Simulations; 5) Quantification of Fire and Pre-Fire Signatures; 6) Smoke; 7) DAFT Hardware; 8) Additional Benefits of DAFT; 9) Development and Characterization of Sensors; 10) Simulation of the Transport of Smoke and Fire Precursors; and 11) FPDS Organizing Questions.

  18. Quantum blind signature with an offline repository

    NASA Astrophysics Data System (ADS)

    Ribeiro, J.; Souto, A.; Mateus, P.

    2015-04-01

    We propose a quantum blind signature scheme that achieves perfect security under the assumption of an honest offline repository. The security of the protocol also relies on perfect private quantum channels, which are achievable using quantum one-time pads with keys shared via a quantum key distribution (QKD) protocol. The proposed approach ensures that signatures cannot be copied and that the sender must commit to a single message, which are important advantages over classical protocols for certain applications.

  19. Online Sellers’ Website Quality Influencing Online Buyers’ Purchase Intention

    NASA Astrophysics Data System (ADS)

    Shea Lee, Tan; Ariff, Mohd Shoki Md; Zakuan, Norhayati; Sulaiman, Zuraidah; Zameri Mat Saman, Muhamad

    2016-05-01

    The increasing adoption of the Internet among young users in Malaysia provides high prospects for online sellers. Young users aged between 18 and 25 years old are important to online sellers because they are actively involved in online purchasing, and this group of online buyers is expected to dominate the future online market. Therefore, examining online sellers' website quality and online buyers' purchase intention is crucial. Based on the Theory of Planned Behavior (TPB), a conceptual model of online sellers' website quality and purchase intention of online buyers was developed. The E-tailQ instrument, composed of website design, reliability/fulfillment, security, privacy & trust, and customer service, was adapted in this study. Using an online questionnaire and a convenience sampling procedure, primary data were obtained from 240 online buyers aged between 18 and 25 years old. It was discovered that website design, website reliability/fulfillment, website security, privacy & trust, and website customer service positively and significantly influence the intention of online buyers to continuously purchase via online channels. This study concludes that online sellers' website quality is important in predicting online buyers' purchase intention. Recommendations and implications of this study are discussed, focusing on how online sellers should improve their website quality to stay competitive in online business.

  20. The AdaptiV Approach to Verification of Adaptive Systems

    SciTech Connect

    Rouff, Christopher; Buskens, Richard; Pullum, Laura L; Cui, Xiaohui; Hinchey, Mike

    2012-01-01

    Adaptive systems are critical for future space and other unmanned and intelligent systems. Verification of these systems is also critical for their use in systems with potential harm to human life or with large financial investments. Due to their nondeterministic nature and extremely large state space, current methods for verification of software systems are not adequate to provide a high level of assurance. The combination of stabilization science, high performance computing simulations, compositional verification and traditional verification techniques, plus operational monitors, provides a complete approach to verification and deployment of adaptive systems that has not been used before. This paper gives an overview of this approach.

  1. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  2. Network Signatures of Survival in Glioblastoma Multiforme

    PubMed Central

    Patel, Vishal N.; Gokulrangan, Giridharan; Chowdhury, Salim A.; Chen, Yanwen; Sloan, Andrew E.; Koyutürk, Mehmet; Barnholtz-Sloan, Jill; Chance, Mark R.

    2013-01-01

    To determine a molecular basis for prognostic differences in glioblastoma multiforme (GBM), we employed a combinatorial network analysis framework to exhaustively search for molecular patterns in protein-protein interaction (PPI) networks. We identified a dysregulated molecular signature distinguishing short-term (survival<225 days) from long-term (survival>635 days) survivors of GBM using whole genome expression data from The Cancer Genome Atlas (TCGA). A 50-gene subnetwork signature achieved 80% prediction accuracy when tested against an independent gene expression dataset. Functional annotations for the subnetwork signature included “protein kinase cascade,” “IκB kinase/NFκB cascade,” and “regulation of programmed cell death” – all of which were not significant in signatures of existing subtypes. Finally, we used label-free proteomics to examine how our subnetwork signature predicted protein level expression differences in an independent GBM cohort of 16 patients. We found that the genes discovered using network biology had a higher probability of dysregulated protein expression than either genes exhibiting individual differential expression or genes derived from known GBM subtypes. In particular, the long-term survivor subtype was characterized by increased protein expression of DNM1 and MAPK1 and decreased expression of HSPA9, PSMD3, and CANX. Overall, we demonstrate that the combinatorial analysis of gene expression data constrained by PPIs outlines an approach for the discovery of robust and translatable molecular signatures in GBM. PMID:24068912

  3. Assessing the Quality of Bioforensic Signatures

    SciTech Connect

    Sego, Landon H.; Holmes, Aimee E.; Gosink, Luke J.; Webb-Robertson, Bobbie-Jo M.; Kreuzer, Helen W.; Anderson, Richard M.; Brothers, Alan J.; Corley, Courtney D.; Tardiff, Mark F.

    2013-06-04

    We present a mathematical framework for assessing the quality of signature systems in terms of fidelity, cost, risk, and utility—a method we refer to as Signature Quality Metrics (SQM). We demonstrate the SQM approach by assessing the quality of a signature system designed to predict the culture medium used to grow a microorganism. The system consists of four chemical assays designed to identify various ingredients that could be used to produce the culture medium. The analytical measurements resulting from any combination of these four assays can be used in a Bayesian network to predict the probabilities that the microorganism was grown using one of eleven culture media. We evaluated fifteen combinations of the signature system by removing one or more of the assays from the Bayes network. We demonstrated that SQM can be used to distinguish between the various combinations in terms of attributes of interest. The approach assisted in clearly identifying assays that were least informative, in large part because they could only discriminate between very few culture media, and in particular, culture media that are rarely used. There are limitations associated with the data that were used to train and test the signature system. Consequently, our intent is not to draw formal conclusions regarding this particular bioforensic system, but rather to illustrate an analytical approach that could be useful in comparing one signature system to another.
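    The decision step in such a signature system, turning a combination of assay outcomes into posterior probabilities over candidate culture media, reduces to Bayes' rule. The sketch below uses invented priors and likelihoods for three media and two binary assays purely for illustration; it is not the trained network from this study.

        # Illustrative Bayesian update over candidate culture media given binary
        # assay outcomes (all numbers are invented for the example).
        import numpy as np

        media = ["LB", "blood agar", "minimal medium"]
        prior = np.array([0.4, 0.3, 0.3])           # prior over the candidate media

        # P(assay positive | medium) for two hypothetical ingredient assays:
        likelihood_pos = np.array([
            [0.9, 0.2, 0.1],    # assay 1 (e.g., a peptone ingredient marker)
            [0.3, 0.8, 0.1],    # assay 2 (e.g., a heme ingredient marker)
        ])

        def posterior(assay_results):
            """assay_results: list of 0/1 outcomes, one per assay."""
            post = prior.copy()
            for assay, result in enumerate(assay_results):
                p_pos = likelihood_pos[assay]
                post *= p_pos if result == 1 else (1.0 - p_pos)  # multiply likelihoods
            return post / post.sum()                             # normalize

        print(dict(zip(media, np.round(posterior([1, 0]), 3))))

    Removing an assay from such a model (as the SQM evaluation does) simply drops one likelihood factor, which makes it easy to see how much each assay contributes to discriminating among the media.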

  4. Chemical and Physical Signatures for Microbial Forensics

    SciTech Connect

    Cliff, John B.; Kreuzer, Helen W.; Ehrhardt, Christopher J.; Wunschel, David S.

    2012-01-03

    Chemical and physical signatures for microbial forensics. John Cliff and Helen Kreuzer-Martin, eds. Humana Press.
    Chapter 1. Introduction: Review of history and statement of need. Randy Murch, Virginia Tech
    Chapter 2. The Microbe: Structure, morphology, and physiology of the microbe as they relate to potential signatures of growth conditions. Joany Jackman, Johns Hopkins University
    Chapter 3. Science for Forensics: Special considerations for the forensic arena - quality control, sample integrity, etc. Mark Wilson (retired FBI), Western Carolina University
    Chapter 4. Physical signatures: Light and electron microscopy, atomic force microscopy, gravimetry, etc. Joseph Michael, Sandia National Laboratory
    Chapter 5. Lipids: FAME, PLFA, steroids, LPS, etc. James Robertson, Federal Bureau of Investigation
    Chapter 6. Carbohydrates: Cell wall components, cytoplasm components, methods. Alvin Fox, University of South Carolina School of Medicine; David Wunschel, Pacific Northwest National Laboratory
    Chapter 7. Peptides: Peptides, proteins, lipoproteins. David Wunschel, Pacific Northwest National Laboratory
    Chapter 8. Elemental content: CNOHPS (treated in passing), metals, prospective cell types. John Cliff, International Atomic Energy Agency
    Chapter 9. Isotopic signatures: Stable isotopes C, N, H, O, S; 14C dating; potential for heavy elements. Helen Kreuzer-Martin, Pacific Northwest National Laboratory; Michaele Kashgarian, Lawrence Livermore National Laboratory
    Chapter 10. Extracellular signatures: Cellular debris, heme, agar, headspace, spent media, etc. Karen Wahl, Pacific Northwest National Laboratory
    Chapter 11. Data Reduction and Integrated Microbial Forensics: Statistical concepts, parametric and multivariate statistics, integrating signatures. Kristin Jarman, Pacific Northwest National Laboratory

  5. ID-Based Blind Signature and Proxy Blind Signature without Trusted PKG

    NASA Astrophysics Data System (ADS)

    Yu, Yihua; Zheng, Shihui; Yang, Yixian

    Private key escrow is an inherent disadvantage of ID-based cryptosystems, i.e., the PKG knows each signer's private key and can forge the signature of any signer. Blind signatures play a central role in electronic cash systems. Private key escrow is more severe in an electronic cash system since money is directly involved. To avoid the key escrow problem, we propose an ID-based blind signature and a proxy blind signature without a trusted PKG. If the dishonest PKG impersonates an honest signer to sign a document, the signer can provide a proof to convince others that the PKG is dishonest.

  6. Signature extension through the application of cluster matching algorithms to determine appropriate signature transformations

    NASA Technical Reports Server (NTRS)

    Lambeck, P. F.; Rice, D. P.

    1976-01-01

    Signature extension is intended to increase the space-time range over which a set of training statistics can be used to classify data without significant loss of recognition accuracy. A first cluster matching algorithm MASC (Multiplicative and Additive Signature Correction) was developed at the Environmental Research Institute of Michigan to test the concept of using associations between training and recognition area cluster statistics to define an average signature transformation. A more recent signature extension module CROP-A (Cluster Regression Ordered on Principal Axis) has shown evidence of making significant associations between training and recognition area cluster statistics, with the clusters to be matched being selected automatically by the algorithm.
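    The "multiplicative and additive signature correction" idea, deriving a per-band gain and offset from associated training-area and recognition-area cluster statistics, can be illustrated with a least-squares fit over matched cluster means. This is a simplified sketch under my own assumptions, not the ERIM MASC or CROP-A algorithm.

        # Simplified illustration: fit a per-band multiplicative (gain) and additive
        # (offset) correction so that transformed training-area cluster means match
        # the recognition-area cluster means they were associated with.
        import numpy as np

        def fit_signature_correction(train_means, recog_means):
            """train_means, recog_means: arrays of shape (n_matched_clusters, n_bands)."""
            n_bands = train_means.shape[1]
            gains, offsets = np.empty(n_bands), np.empty(n_bands)
            for band in range(n_bands):
                # least-squares fit of recog = gain * train + offset in each band
                gains[band], offsets[band] = np.polyfit(train_means[:, band],
                                                        recog_means[:, band], 1)
            return gains, offsets

        def extend_signature(mean_vector, gains, offsets):
            return gains * mean_vector + offsets    # transform a training signature

    The transformed training statistics can then be used to classify the recognition area directly, which is the sense in which the signatures are "extended" across space or time.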

  7. Subduction signature in backarc mantle?

    NASA Astrophysics Data System (ADS)

    Nelson, W. R.; Snow, J. E.; Brandon, A. D.; Ohara, Y.

    2013-12-01

    Abyssal peridotites exposed during seafloor extension provide a rare glimpse into the processes occurring within the oceanic mantle. Whole rock and mineral-scale major element data from abyssal peridotites record processes intimately associated with melt-depletion and melt-rock interaction occurring just prior to exposure of the mantle at the surface. Isotopic data, however, can provide insight into the long-term evolution of the oceanic mantle. A number of studies of mantle material exposed along mid-ocean ridges have demonstrated that abyssal peridotites from Mid-Atlantic Ridge, Gakkel Ridge, and Southwest Indian Ridge commonly display a range of whole rock Os isotopic ratios (187Os/188Os = 0.118- 0.130; Brandon et al., 2000; Standish et al., 2002; Alard et al., 2005; Harvey et al., 2006; Liu et al., 2008). The range of isotopic values in each region demonstrates that the oceanic mantle does not melt uniformly over time. Instead, anciently depleted regions (187Os/188Os ≈ 0.118) are juxtaposed against relatively fertile regions (187Os/188Os ≈ 0.130) that are isotopically similar to established primitive mantle values (187Os/188Os = 0.1296; Meisel et al. 2001). Abyssal peridotites from the Godzilla Megamullion and Chaotic Terrain in the backarc Parece Vela Basin (Philippine Sea) display a range of Os isotopic values extending to similar unradiogenic values. However, some of the backarc basin abyssal peridotites record more radiogenic 187Os/188Os values (0.135-0.170) than mid-ocean ridge peridotites. Comparable radiogenic signatures are reported only in highly weathered abyssal peridotites (187Os/188Os ≤ 0.17, Standish et al., 2002) and subduction-related volcanic arc peridotites (187Os/188Os ≤ 0.16, Brandon et al., 1996; Widom et al., 2003). In both the weathered peridotites and arc peridotites, the 187Os/188Os value is negatively correlated with Os abundance: the most radiogenic value has the lowest Os abundance (< 1 ppb) making them highly susceptible to

  8. UTEX modeling of xenon signature sensitivity to geology and explosion cavity characteristics following an underground nuclear explosion

    NASA Astrophysics Data System (ADS)

    Lowrey, J. D.; Haas, D.

    2013-12-01

    Underground nuclear explosions (UNEs) produce anthropogenic isotopes that can potentially be used in the verification component of the Comprehensive Nuclear-Test-Ban Treaty. Several isotopes of radioactive xenon gas have been identified as radionuclides of interest within the International Monitoring System (IMS) and in an On-Site Inspection (OSI). Substantial research has been previously undertaken to characterize the geologic and atmospheric mechanisms that can drive the movement of radionuclide gas from a well-contained UNE, considering both sensitivities on gas arrival time and signature variability of xenon due to the nature of subsurface transport. This work further considers sensitivities of radioxenon gas arrival time and signatures to large variability in geologic stratification and generalized explosion cavity characteristics, as well as compares this influence to variability in the shallow surface.

  9. Meta-analysis of age-related gene expression profiles identifies common signatures of aging

    PubMed Central

    de Magalhães, João Pedro; Curado, João; Church, George M.

    2009-01-01

    Motivation: Numerous microarray studies of aging have been conducted, yet given the noisy nature of gene expression changes with age, elucidating the transcriptional features of aging and how these relate to physiological, biochemical and pathological changes remains a critical problem. Results: We performed a meta-analysis of age-related gene expression profiles using 27 datasets from mice, rats and humans. Our results reveal several common signatures of aging, including 56 genes consistently overexpressed with age, the most significant of which was APOD, and 17 genes underexpressed with age. We characterized the biological processes associated with these signatures and found that age-related gene expression changes most notably involve an overexpression of inflammation and immune response genes and of genes associated with the lysosome. An underexpression of collagen genes and of genes associated with energy metabolism, particularly mitochondrial genes, as well as alterations in the expression of genes related to apoptosis, cell cycle and cellular senescence biomarkers, were also observed. By employing a new method that emphasizes sensitivity, our work further reveals previously unknown transcriptional changes with age in many genes, processes and functions. We suggest these molecular signatures reflect a combination of degenerative processes but also transcriptional responses to the process of aging. Overall, our results help to understand how transcriptional changes relate to the process of aging and could serve as targets for future studies. Availability: http://genomics.senescence.info/uarrays/signatures.html Contact: jp@senescence.info Supplementary information: Supplementary data are available at Bioinformatics online. PMID:19189975

  10. Bayesian online compressed sensing.

    PubMed

    Rossi, Paulo V; Kabashima, Yoshiyuki; Inoue, Jun-Ichi

    2016-08-01

    In this paper, we explore the possibilities and limitations of recovering sparse signals in an online fashion. Employing a mean field approximation to the Bayes recursion formula yields an online signal recovery algorithm that can be performed with a computational cost that is linearly proportional to the signal length per update. Analysis of the resulting algorithm indicates that the online algorithm asymptotically saturates the optimal performance limit achieved by the offline method in the presence of Gaussian measurement noise, while differences in the allowable computational costs may result in fundamental gaps of the achievable performance in the absence of noise. PMID:27627276
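
    As an illustration of the flavor of algorithm described, the following minimal sketch performs streaming Bayesian recovery with a factorized (mean-field) Gaussian posterior, so each incoming measurement is absorbed at a cost linear in the signal length. It is a simplified stand-in, not the paper's algorithm: a Gaussian prior replaces the sparsity-inducing prior, and the posterior covariance is projected back to its diagonal after every update.

```python
import numpy as np

# Streaming Bayesian recovery with a factorized Gaussian posterior over the signal.
# Each scalar measurement y = a @ x + noise triggers one assumed-density-filtering
# update whose cost is O(N) in the signal length (only the diagonal of the posterior
# covariance is retained). Simplified stand-in for the abstract's algorithm.

def online_update(m, v, a, y, noise_var):
    """m, v: per-component posterior mean and variance; a: measurement vector."""
    va = v * a                                   # O(N)
    s = noise_var + a @ va                       # predictive variance of y
    gain = va / s
    m_new = m + gain * (y - a @ m)               # posterior mean update
    v_new = v - va**2 / s                        # mean-field projection: keep diagonal only
    return m_new, v_new

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    n, k, noise_var = 200, 10, 1e-3
    x = np.zeros(n)
    x[rng.choice(n, k, replace=False)] = rng.normal(size=k)   # sparse ground truth
    m, v = np.zeros(n), np.ones(n)                             # prior N(0, 1) per component
    for t in range(400):                                       # measurements arrive one at a time
        a = rng.normal(size=n) / np.sqrt(n)
        y = a @ x + rng.normal(scale=np.sqrt(noise_var))
        m, v = online_update(m, v, a, y, noise_var)
        if (t + 1) % 100 == 0:
            print(f"after {t + 1:3d} measurements: MSE = {np.mean((m - x)**2):.4f}")
```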

  11. Bayesian online compressed sensing

    NASA Astrophysics Data System (ADS)

    Rossi, Paulo V.; Kabashima, Yoshiyuki; Inoue, Jun-ichi

    2016-08-01

    In this paper, we explore the possibilities and limitations of recovering sparse signals in an online fashion. Employing a mean field approximation to the Bayes recursion formula yields an online signal recovery algorithm that can be performed with a computational cost that is linearly proportional to the signal length per update. Analysis of the resulting algorithm indicates that the online algorithm asymptotically saturates the optimal performance limit achieved by the offline method in the presence of Gaussian measurement noise, while differences in the allowable computational costs may result in fundamental gaps of the achievable performance in the absence of noise.

  12. MMW, IR, and SAM signature collection

    NASA Astrophysics Data System (ADS)

    Reichstetter, Fred; Ward, Mary E.

    2002-08-01

    During the development of smart weapon seekers/sensors, it is imperative to collect high-quality signatures of the targets the system is intended to engage. These signatures are used to support algorithm development so the system can find and engage targets of interest at the specific kill area on the target. Eglin AFB, FL, is the Air Force development center for munitions, and in support of the development effort the 46th Test Wing (46 TW) has initiated significant improvements in collection capabilities for signatures in the millimeter-wave (MMW), infrared (IR), and seismic, acoustic, and magnetic (SAM) spectra. Additionally, the Joint Munitions Test and Evaluation program office maintains a fleet of foreign ground vehicle targets used for such signature collection, including tanks, SCUD missile launchers, air defense units such as the SA-6, SA-8, and SA-13, and associated ground support trucks and general-purpose vehicles. The major test facility includes a 300 ft tower used for mounting the instrumentation suite, which currently includes 10, 35, and 94 GHz MMW and 2-5 μm and 8-12 μm IR instrumentation systems. The facility has undergone major improvements, including background signature reduction, construction of a high-bay building to house the turntable on which the targets are mounted, and an additional in-ground stationary turntable primarily for IR signature collection. Our experience using this facility to collect signatures for the smart weapons development community has confirmed a significant improvement in quality and efficiency. The need for the stationary-turntable collection capability was driven by the requirements of the IR community, which is interested in collecting signatures in clutter; this runs contrary to the MMW community's preference for minimum background clutter. The resulting location, adjacent to the MMW tower, allows variations in the type and amount of clutter background that can be incorporated and also provides maximum utilization of

  13. Electronic Signatures: They're Legal, Now What?

    ERIC Educational Resources Information Center

    Broderick, Martha A.; Gibson, Virginia R.; Tarasewich, Peter

    2001-01-01

    In the United States, electronic signatures recently became as legally binding as printed signatures. Reviews the status of electronic signatures in the United States, and compares it to work done by the United Nations. Summarizes the technology that can be used to implement electronic signatures. Discusses problems and open issues surrounding the…

  14. Irma 5.1 multisensor signature prediction model

    NASA Astrophysics Data System (ADS)

    Savage, James; Coker, Charles; Edwards, Dave; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles

    2006-05-01

    includes polarization effects, time jittering, speckle effects, and atmospheric turbulence. More importantly, the Munitions Directorate has funded three field tests to verify and validate the re-engineered ladar channel. Each field test was comprehensive, comprising one month of sensor characterization and a week of data collection. After each field test, the analysis included comparing Irma-predicted signatures with measured signatures and, where necessary, refining the model to produce realistic imagery. This paper focuses on two areas of the Irma 5.1 development effort: the analysis results from the validation and verification of the Irma 5.1 ladar channel, and the software development plan and validation efforts for the Irma passive channel. As scheduled, the Irma passive code is being re-engineered in an object-oriented language (C++), and field data collection is being conducted to validate the re-engineered passive code. This software upgrade will remove many constraints and limitations of the legacy code, including limits on image size and facet counts. The field test to validate the passive channel is expected to be complete in the second quarter of 2006.
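
    The abstract does not state which metrics were used when comparing predicted and measured signatures, so the sketch below shows only generic, commonly used image-agreement measures (RMSE and Pearson correlation) on hypothetical co-registered images; it is illustrative and not the Irma validation methodology.

```python
import numpy as np

# Minimal sketch of comparing a predicted sensor image against a measured one.
# RMSE and Pearson correlation are generic choices shown purely for illustration;
# the metrics actually used in the Irma validation effort are not described in
# the abstract.

def compare_signatures(predicted: np.ndarray, measured: np.ndarray):
    """Return (RMSE, Pearson r) for two co-registered images of the same shape."""
    p = predicted.astype(float).ravel()
    m = measured.astype(float).ravel()
    rmse = float(np.sqrt(np.mean((p - m) ** 2)))
    r = float(np.corrcoef(p, m)[0, 1])
    return rmse, r

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    measured = rng.random((128, 128))                                    # stand-in collected image
    predicted = measured + rng.normal(scale=0.05, size=measured.shape)   # stand-in prediction
    rmse, r = compare_signatures(predicted, measured)
    print(f"RMSE = {rmse:.3f}, Pearson r = {r:.3f}")
```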

  15. Exploring Online Purchasing.

    ERIC Educational Resources Information Center

    Fickes, Michael

    2001-01-01

    Examines the setup of American University's electronic procurement system and explains how it works and the benefits it brings to the school. Cautionary advice on setting up an online purchasing system is also offered. (GR)

  16. Online learning for all.

    PubMed

    Doherty, Simon

    2015-05-30

    Last year, Simon Doherty, president of the North of Ireland Veterinary Association and the BVA's NI Branch, completed a number of MOOCs (massive open online courses) having read about them in Vet Record Careers; here, he describes his experiences.

  17. Online bartering motivations.

    PubMed

    Lee, Hsiang-Ming; Chen, Tsai; Hung, Min-Li

    2014-08-01

    This study examined the role of enjoyment in people's decision to barter online. A survey in barter BBS/discussion forums and websites collected data from 135 participants (30 men, 105 women; 71% in the age group of 21-30 years) who barter online. To test a modification of the Expectation Confirmation Model, perceived enjoyment, confirmation of expectations, perceived usefulness, satisfaction, and continuance intention were measured. The data analysis showed that the expanded ECM had good explanatory power, with all paths supported except for perceived usefulness-satisfaction. In the proposed model, 33.1% of the variance in continuance intentions was predicted by the independent variables. Thus, the expanded ECM can provide supplementary information that is relevant for understanding continued online bartering usage. Barter website managers may encourage users' intentions to continue using these websites by emphasizing enjoyable aspects of online bartering.

  18. Retrieving Patent Information Online

    ERIC Educational Resources Information Center

    Kaback, Stuart M.

    1978-01-01

    This paper discusses patent information retrieval from online files in terms of types of questions, file contents, coverage, timeliness, and other file variations. CLAIMS, Derwent, WPI, APIPAT and Chemical Abstracts Service are described. (KP)

  19. Invoices for Online Services.

    ERIC Educational Resources Information Center

    Forrest, Vicki

    1988-01-01

    Review of current billing practices for online services in Great Britain discusses determinants of successful numeric data display; invoice structure; display format; invoice content; display characteristics; and invoice regularity. Sample invoices from several services are included. (three references) (MES)

  20. The Anatomy of Online Offerings

    ERIC Educational Resources Information Center

    Hoskins, Barbara J.

    2014-01-01

    The perceptions about online teaching and learning are frequently different from the reality. Some students say they expected the online course to be easier than the traditional face-to-face course and are surprised by the rigor, while skeptics decry the quality of online offerings since students cannot possibly learn as well online as they do in…