Science.gov

Sample records for online signature verification

  1. Threshold Equalization for On-Line Signature Verification

    NASA Astrophysics Data System (ADS)

    Nakanishi, Isao; Sakamoto, Hiroyuki; Itoh, Yoshio; Fukui, Yutaka

    In on-line signature verification, the complexity of a signature's shape can influence the optimal verification threshold for individual signatures. Writer-dependent threshold selection has been proposed, but it requires forgery data, which is not easy to collect in practical applications. Therefore, a threshold equalization method using only genuine data is needed. In this letter, we propose three different threshold equalization methods based on signature complexity. Their effectiveness is confirmed in experiments using a multi-matcher DWT on-line signature verification system.

  2. Glove-based approach to online signature verification.

    PubMed

    Kamel, Nidal S; Sayeed, Shohel; Ellis, Grant A

    2008-06-01

    Utilizing the multiple degrees of freedom offered by the data glove for each finger and the hand, a novel on-line signature verification system using the singular value decomposition (SVD) numerical tool for signature classification and verification is presented. The proposed technique uses the SVD to find the r singular vectors that sense the maximal energy of the glove data matrix A, called the principal subspace, so that the effective dimensionality of A can be reduced. Having modeled the data glove signature through its r-dimensional principal subspace, signature authentication is performed by finding the angles between the different subspaces. A demonstration of the data glove as an effective high-bandwidth data entry device for signature verification is presented. The SVD-based signature verification technique is tested, and its performance is shown to detect forged signatures with a false acceptance rate of less than 1.2%. PMID:18421114
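The subspace comparison described above can be sketched in a few lines of NumPy. This is a toy illustration of the general SVD/principal-angle idea, not the paper's implementation: the matrix sizes, the rank r, and the acceptance threshold are invented for the example.

```python
import numpy as np

def principal_subspace(A, r):
    """Orthonormal basis of the r-dimensional subspace capturing the
    maximal energy of data matrix A (its top-r left singular vectors)."""
    U, _, _ = np.linalg.svd(A, full_matrices=False)
    return U[:, :r]

def subspace_angles(U1, U2):
    """Principal angles (radians) between two orthonormal subspaces."""
    s = np.linalg.svd(U1.T @ U2, compute_uv=False)
    return np.arccos(np.clip(s, -1.0, 1.0))

# Toy glove data: rows = sensor channels, columns = time samples.
rng = np.random.default_rng(0)
signal = rng.standard_normal((14, 3)) @ rng.standard_normal((3, 200))
reference = signal + 0.01 * rng.standard_normal((14, 200))   # enrollment
attempt = signal + 0.01 * rng.standard_normal((14, 200))     # new signing
angles = subspace_angles(principal_subspace(reference, 3),
                         principal_subspace(attempt, 3))
accept = bool(angles.max() < 0.2)   # illustrative decision threshold
```

Two signings of the same low-rank "signature" yield nearly identical principal subspaces, so all principal angles stay small and the attempt is accepted.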

  3. On-line signature verification method by Laplacian spectral analysis and dynamic time warping

    NASA Astrophysics Data System (ADS)

    Li, Changting; Peng, Liangrui; Liu, Changsong; Ding, Xiaoqing

    2013-12-01

    As smartphones and touch screens become increasingly popular, on-line signature verification technology can serve as a means of personal identification for mobile computing. In this paper, a novel Laplacian spectral analysis (LSA) based on-line signature verification method is presented, and an integration framework of LSA and dynamic time warping (DTW) based methods is proposed for practical application. In the LSA-based method, a Laplacian matrix is constructed by regarding the on-line signature as a graph, with the signature's writing-speed information encoded in the Laplacian matrix of the graph. The eigenvalue spectrum of the Laplacian matrix is analyzed and used for signature verification. In the proposed integration framework, DTW is used at two stages. First, it provides stroke-matching results that help the LSA method construct the corresponding graph. Second, the verification results of DTW are fused with those of the LSA method. Experimental results on a public signature database and on practical signature data from mobile phones demonstrate the effectiveness of the proposed method.
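A minimal sketch of the spectrum-based comparison is shown below, assuming a simple chain graph over successive pen points with edge weights given by the local writing speed. The paper's actual graph construction (aided by DTW stroke matching) is more elaborate; all sizes and the toy trajectories here are invented.

```python
import numpy as np

def laplacian_spectrum(points, k=10):
    """Smallest k eigenvalues of the Laplacian of a speed-weighted chain
    graph over successive pen points (a simplified signature graph)."""
    speeds = np.linalg.norm(np.diff(points, axis=0), axis=1)
    n = len(points)
    W = np.zeros((n, n))
    idx = np.arange(n - 1)
    W[idx, idx + 1] = speeds      # edge weight = local writing speed
    W[idx + 1, idx] = speeds
    L = np.diag(W.sum(axis=1)) - W          # graph Laplacian
    return np.linalg.eigvalsh(L)[:k]        # eigenvalues in ascending order

# Toy pen trajectories: a signature and a slightly perturbed re-signing
t = np.linspace(0, 2 * np.pi, 60)
genuine = np.c_[np.cos(t), np.sin(2 * t)]
retrace = genuine + 0.01 * np.sin(5 * t)[:, None]
dist = np.linalg.norm(laplacian_spectrum(genuine) - laplacian_spectrum(retrace))
```

Because the eigenvalue spectrum is a permutation-free summary of the weighted graph, two signings with similar shape and pacing produce nearby spectra (small `dist`), while a forgery with different dynamics would not.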

  4. Online Handwritten Signature Verification Using Neural Network Classifier Based on Principal Component Analysis

    PubMed Central

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

    One of the main difficulties in designing an online signature verification (OSV) system is finding the most distinctive features with high discriminating capability, particularly given the high variability inherent in genuine handwritten signatures and the possibility of skilled forgeries closely resembling their genuine counterparts. In this paper, we propose a systematic approach to online signature verification using a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique applied to the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database, yielding a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%. PMID:25133227
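The PCA-then-MLP pipeline can be sketched with scikit-learn on synthetic stand-in feature vectors. The paper's specific selection rule over the usually discarded PCA components is not reproduced here; the dimensions, class distributions, and network size are invented for the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.neural_network import MLPClassifier

rng = np.random.default_rng(1)
# Synthetic stand-ins for signature feature vectors: 200 genuine, 200 forged
genuine = rng.standard_normal((200, 30)) + 1.0
forged = rng.standard_normal((200, 30)) - 1.0
X = np.vstack([genuine, forged])
y = np.array([1] * 200 + [0] * 200)

# Project onto a subset of PCA features, then train an MLP verifier
pca = PCA(n_components=10).fit(X)
Z = pca.transform(X)
clf = MLPClassifier(hidden_layer_sizes=(16,), max_iter=500,
                    random_state=0).fit(Z, y)
train_acc = clf.score(Z, y)
```

In a real OSV system the features would come from pen dynamics (position, pressure, timing) and evaluation would use held-out genuine and skilled-forgery samples rather than training accuracy.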

  5. Online handwritten signature verification using neural network classifier based on principal component analysis.

    PubMed

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Yussof, Salman; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

    One of the main difficulties in designing an online signature verification (OSV) system is finding the most distinctive features with high discriminating capability, particularly given the high variability inherent in genuine handwritten signatures and the possibility of skilled forgeries closely resembling their genuine counterparts. In this paper, we propose a systematic approach to online signature verification using a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique applied to the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database, yielding a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%. PMID:25133227

  6. Online Signature Verification System for Kaisyo Script Based on Structured Learning and Segmentation of HMM

    NASA Astrophysics Data System (ADS)

    Zhang, Dapeng; Inagaki, Shinkichi; Suzuki, Tatsuya; Kanada, Naoki

    This paper presents a new hidden Markov model (HMM) for the online signature verification of oriental characters such as Japanese and Chinese. These characters usually consist of many individual strokes, such as dots and straight lines. Taking this characteristic into account, a new HMM is proposed that is composed of many sub-models, each corresponding to an individual stroke. In addition, a pen-up state representing the movement between strokes is explicitly introduced. A parameter re-estimation scheme for this special class of HMM is then derived by exploiting the structure of the proposed model. Thanks to the structured learning mechanism, the proposed HMM not only drastically reduces the computational time needed for the learning process but also shows higher recognition performance in rejecting skilled forgeries. Finally, the usefulness of the proposed scheme is demonstrated by comparison with conventional models.

  7. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

    The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on the pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.

  8. Signature Verification Based on Handwritten Text Recognition

    NASA Astrophysics Data System (ADS)

    Viriri, Serestina; Tapamo, Jules-R.

    Signatures continue to be an important biometric trait because they remain widely used for authenticating the identity of human beings. This paper presents an efficient text-based directional signature recognition algorithm that verifies signatures even when they are composed of special unconstrained cursive characters that are superimposed and embellished. The algorithm extends the character-based signature verification technique. Experiments carried out on the GPDS signature database, and on an additional database created from signatures captured using the ePadInk tablet, show that the approach is effective and efficient, with a positive verification rate of 94.95%.

  9. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

    As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With the growing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. Its main drawback is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented and the overall performance of the developed system is tested. The results show that the system can be used effectively for secure online verification applications. PMID:17365425

  10. Signature verification with writing posture analysis

    NASA Astrophysics Data System (ADS)

    Cheng, Hsu-Yung; Yu, Chih-Chang

    2013-07-01

    A video-based handwritten signature verification framework is proposed in this paper. Using a camera as the sensor has the advantage that the entire writing process can be captured along with the signature. The main contribution of this work is that writing postures, which cannot be easily imitated or forged, are analyzed for verification. The proposed system achieves low false rejection rates while maintaining low false acceptance rates on a database containing both unskilled and skilled imitation signatures.

  11. FIR signature verification system characterizing dynamics of handwriting features

    NASA Astrophysics Data System (ADS)

    Thumwarin, Pitak; Pernwong, Jitawat; Matsuura, Takenobu

    2013-12-01

    This paper proposes an online signature verification method based on a finite impulse response (FIR) system characterizing the time-frequency characteristics of dynamic handwriting features. First, the barycenter determined from both the center point of the signature and two adjacent pen-point positions in the signing process, instead of a single pen-point position, is used to reduce the fluctuation of handwriting motion. Among the available dynamic handwriting features, motion pressure and area pressure are employed to investigate handwriting behavior. The stable dynamic handwriting features can thus be described by the relation between the time-frequency characteristics of the dynamic handwriting features. In this study, that relation is represented by an FIR system with the wavelet coefficients of the dynamic handwriting features as both input and output of the system. The impulse response of the FIR system is used as the individual feature for a particular signature. In short, a signature can be verified by evaluating the difference between the impulse responses of the FIR systems for a reference signature and the signature to be verified. The signature verification experiments in this paper were conducted using the SUBCORPUS MCYT-100 signature database, consisting of 5,000 signatures from 100 signers. The proposed method yielded an equal error rate (EER) of 3.21% on skilled forgeries.
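The FIR-identification step can be illustrated with ordinary least squares: given an input and an output sequence, estimate the impulse response that maps one to the other. Here plain random sequences stand in for the wavelet coefficients of the handwriting features, and the 3-tap response is invented for the example.

```python
import numpy as np

def estimate_fir(x, y, taps):
    """Least-squares estimate of an FIR impulse response h such that
    y[n] ~ sum_k h[k] * x[n - k]."""
    n = len(x)
    X = np.zeros((n, taps))
    for k in range(taps):
        X[k:, k] = x[:n - k]          # shifted copies of the input
    h, *_ = np.linalg.lstsq(X, y, rcond=None)
    return h

rng = np.random.default_rng(2)
h_true = np.array([0.5, -0.3, 0.2])   # hypothetical "signature" response
x = rng.standard_normal(400)          # stand-in for wavelet coefficients
y = np.convolve(x, h_true)[:400]      # FIR-filtered output
h_est = estimate_fir(x, y, taps=3)
```

Verification would then compare the impulse response estimated from a questioned signature against the one stored for the reference signature.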

  12. Efficient online signature authentication approach

    NASA Astrophysics Data System (ADS)

    Kaouther, Saidani; Messaoud, Mostefai; Abderraouf, Bouziane; Youssef, Chahir

    2014-11-01

    Signature authentication systems often focus their processing on acquired dynamic and/or static signature descriptors to authenticate persons. This approach gives satisfactory results in ordinary cases but remains vulnerable to skilled forgeries, mainly because there is no relation between the signatory and his or her signature. We show that including the hand shape in the authentication process considerably reduces the false acceptance rate for skilled forgeries and improves authentication accuracy. A new online hand signature authentication approach based on both signature and hand shape descriptors is proposed. The signature acquisition is completely transparent, which allows a high level of security against fraudulent imitation attempts. Authentication performance is evaluated with extensive experiments. The obtained test results [equal error rate (EER) = 2%, genuine acceptance rate (GAR) = 96%] confirm the efficiency of the proposed approach.

  13. Offline signature verification using local binary pattern and octave pattern

    NASA Astrophysics Data System (ADS)

    Ahlawat, Sahil; Goel, Anubhav; Prasad, Surabhi; Singh, Preety

    2014-01-01

    Signature verification holds a significant place in today's world, as most bank transactions, stock trading, etc. are validated via signatures. Signatures are considered one of the most effective biometric identifiers, but unfortunately signature forgery attempts are quite rampant. To prevent this, a robust signature verification mechanism is essential. In this paper, a new method is proposed that uses the local binary pattern and geometrical features. A new geometric property, the octave pattern, has been devised. Performance is analyzed by comparing random, semi-skilled, and skilled forgeries with the genuine signature.
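The local binary pattern used as a texture descriptor here can be sketched directly in NumPy; this is the standard 8-neighbour LBP and a histogram comparison, not the paper's full pipeline, and the toy image is invented.

```python
import numpy as np

def lbp_image(img):
    """8-neighbour local binary pattern codes for the interior pixels."""
    offsets = [(-1, -1), (-1, 0), (-1, 1), (0, 1),
               (1, 1), (1, 0), (1, -1), (0, -1)]
    h, w = img.shape
    center = img[1:-1, 1:-1]
    code = np.zeros_like(center, dtype=np.uint8)
    for bit, (dy, dx) in enumerate(offsets):
        neigh = img[1 + dy:h - 1 + dy, 1 + dx:w - 1 + dx]
        code |= (neigh >= center).astype(np.uint8) << bit
    return code

def lbp_histogram(img):
    """Normalized 256-bin LBP histogram used as a texture descriptor."""
    hist = np.bincount(lbp_image(img).ravel(), minlength=256).astype(float)
    return hist / hist.sum()

# Toy usage: a binary "signature" image vs. a slightly shifted copy
rng = np.random.default_rng(3)
sig = (rng.random((64, 64)) > 0.8).astype(np.uint8) * 255
dist = np.abs(lbp_histogram(sig) - lbp_histogram(np.roll(sig, 1, axis=1))).sum()
```

Because the histogram discards pixel positions, a small translation of the same texture yields a small `dist`, which is what makes LBP histograms attractive for comparing signatures scanned with slight misalignment.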

  14. Online adaptation and verification of VMAT

    SciTech Connect

    Crijns, Wouter; Defraene, Gilles; Depuydt, Tom; Haustermans, Karin; Van Herck, Hans; Maes, Frederik; Van den Heuvel, Frank

    2015-07-15

    Purpose: This work presents a method for fast volumetric modulated arc therapy (VMAT) adaptation in response to interfraction anatomical variations. Additionally, plan parameters extracted from the adapted plans are used to verify the quality of these plans. The methods were tested as a prostate class solution and compared to replanning and to the current clinical practice. Methods: The proposed VMAT adaptation is an extension of the authors' previous intensity modulated radiotherapy (IMRT) adaptation. It follows a direct (forward) planning approach: the multileaf collimator (MLC) apertures are corrected in the beam's eye view (BEV) and the monitor units (MUs) are corrected using point dose calculations. All MLC and MU corrections are driven by the positions of four fiducial points only, without the need for a full contour set. Quality assurance (QA) of the adapted plans is performed using plan parameters that can be calculated online and that relate to the delivered dose or the plan quality. Five potential parameters are studied for this purpose: the number of MU, the equivalent field size (EqFS), the modulation complexity score (MCS), and the components of the MCS: the aperture area variability (AAV) and the leaf sequence variability (LSV). The full adaptation and its separate steps were evaluated in simulation experiments involving a prostate phantom subjected to various interfraction transformations. The efficacy of the VMAT adaptation was scored by target mean dose (CTV_mean), conformity (CI_95%), tumor control probability (TCP), and normal tissue complication probability (NTCP). The impact of the adaptation on the plan parameters (QA) was assessed by comparison with prediction intervals (PI) derived from a statistical model of the typical variation of these parameters in a population of VMAT prostate plans (n = 63). These prediction intervals are the adaptation equivalent of the tolerance tables for couch shifts in the current clinical practice. Results: The proposed adaptation of a two-arc VMAT plan resulted in the intended CTV_mean (Δ ≤ 3%) and TCP (ΔTCP ≤ 0.001). Moreover, the method assures the intended CI_95% (Δ ≤ 11%), resulting in lowered rectal NTCP for all cases. Compared to replanning, the adaptation is faster (13 s vs 10 min) and more intuitive. Compared to the current clinical practice, it better protects the healthy tissue. Compared to IMRT, VMAT is more robust to anatomical variations, but it is also less sensitive to the different correction steps. The observed variations of the plan parameters in the database included a linear dependence on the date of treatment planning and on the target radius. The MCS is not retained as a QA metric due to the contrasting behavior of its components (LSV and AAV). If three out of four plan parameters (MU, EqFS, AAV, and LSV) are required to lie inside a 50% prediction interval (3/4-50%PI), all adapted plans are accepted. In contrast, none of the replanned plans meet this loose criterion, mainly because they have no connection to the initially optimized and verified plan. Conclusions: A direct (forward) VMAT adaptation performs as well as (inverse) replanning but is faster and can be extended to real-time adaptation. The prediction intervals for the machine parameters are equivalent to the tolerance tables for couch shifts in the current clinical practice. A 3/4-50%PI QA criterion accepts all the adapted plans but rejects all the replanned plans.
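The prediction-interval acceptance rule can be sketched as follows. The normal-theory prediction interval, the synthetic parameter populations, and the specific parameter values are all illustrative assumptions, not the paper's statistical model.

```python
import numpy as np
from scipy import stats

def prediction_interval(samples, level=0.50):
    """Two-sided prediction interval for one new observation, assuming the
    population of plan parameters is approximately normal."""
    n = len(samples)
    m, s = np.mean(samples), np.std(samples, ddof=1)
    t = stats.t.ppf(0.5 + level / 2, df=n - 1)
    half = t * s * np.sqrt(1 + 1 / n)
    return m - half, m + half

def accept_plan(population, plan, level=0.50, required=3):
    """3-out-of-4 rule: accept if at least `required` of the plan's
    parameters fall inside their prediction intervals."""
    inside = 0
    for name, value in plan.items():
        lo, hi = prediction_interval(population[name], level)
        inside += lo <= value <= hi
    return inside >= required

# Toy usage with synthetic populations for MU, EqFS, AAV, and LSV (n = 63)
rng = np.random.default_rng(5)
population = {p: rng.normal(loc, scale, 63) for p, (loc, scale) in
              {"MU": (500, 40), "EqFS": (9, 1), "AAV": (0.4, 0.05),
               "LSV": (0.8, 0.05)}.items()}
typical_plan = {"MU": 505, "EqFS": 9.1, "AAV": 0.41, "LSV": 0.79}
ok = accept_plan(population, typical_plan)
```

A plan whose parameters sit near the population centre passes the rule, while a plan with several far-outlying parameters is flagged for review.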

  15. Multimedia document authentication using on-line signatures as watermarks

    NASA Astrophysics Data System (ADS)

    Namboodiri, Anoop M.; Jain, Anil K.

    2004-06-01

    Authentication of digital documents is an important concern as digital documents replace traditional paper-based documents for official and legal purposes. This is especially true for documents exchanged over the Internet, which could be accessed and modified by intruders. The most popular methods for authenticating digital documents are public-key encryption-based authentication and digital watermarking. Traditional watermarking techniques embed a predetermined character string, such as a company logo, in a document. We propose a fragile watermarking system that uses an on-line signature of the author as the watermark in a document. Embedding a biometric characteristic such as a signature in a document enables us to verify the identity of the author using a set of reference signatures, in addition to ascertaining the document's integrity. The receiver of the document reconstructs the signature used to watermark the document, which is then used to verify the author's claimed identity. The paper presents a signature encoding scheme that facilitates reconstruction by the receiver while reducing the chances of collusion attacks.

  16. Age and gender-invariant features of handwritten signatures for verification systems

    NASA Astrophysics Data System (ADS)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

    The handwritten signature is one of the most natural biometrics, the study of human physiological and behavioral patterns. Behavioral biometrics includes signatures, which may differ with the owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal forgery feature idea, whereby a global classifier is able to classify a signature as genuine or forged without actual knowledge of the signature template and its owner. Additionally, the reduction of dimensionality with the MRMR method is discussed.

  17. On-line infrared process signature measurements through combustion atmospheres

    NASA Astrophysics Data System (ADS)

    Zweibaum, F. M.; Kozlowski, A. T.; Surette, W. E., Jr.

    1980-01-01

    A number of on-line infrared process signature measurements have been made through combustion atmospheres, including those in jet engines, piston engines, and coal gasification reactors. The difficulties involved include operation in the presence of pressures as high as 1800 psi, temperatures as high as 3200 F, and explosive, corrosive, and dust-laden atmospheres. Calibration problems have resulted from the use of purge gases to clear the viewing tubes and from the obscuration of the view ports by combustion products. A review of the solutions employed to counteract these problems is presented, and areas in which better solutions are required are suggested.

  18. Offline signature verification and skilled forgery detection using HMM and sum graph features with ANN and knowledge based classifier

    NASA Astrophysics Data System (ADS)

    Mehta, Mohit; Choudhary, Vijay; Das, Rupam; Khan, Ilyas

    2010-02-01

    Signature verification is one of the most widely researched areas in document analysis and signature biometrics. Various methodologies have been proposed for accurate signature verification and forgery detection. In this paper we propose a unique two-stage model for detecting skilled forgery in signatures by combining two feature types, namely sum graph features and an HMM model for signature generation, and classifying them with a knowledge-based classifier and a probabilistic neural network. We propose the technique of using the HMM as a feature rather than as a classifier, the role widely proposed by most authors in signature recognition. Results show a higher false rejection rate than false acceptance rate. The system detects forgeries with an accuracy of 80% and verifies genuine signatures with 91% accuracy. The two-stage model can be used in realistic signature biometric applications, such as banking, where the authenticity of a signature must be verified before processing documents like checks.

  19. Gated Treatment Delivery Verification With On-Line Megavoltage Fluoroscopy

    SciTech Connect

    Tai An; Christensen, James D.; Gore, Elizabeth; Khamene, Ali; Boettger, Thomas; Li, X. Allen

    2010-04-15

    Purpose: To develop and clinically demonstrate the use of on-line real-time megavoltage (MV) fluoroscopy for gated treatment delivery verification. Methods and Materials: Megavoltage fluoroscopy (MVF) image sequences were acquired using a flat panel equipped for MV cone-beam CT in synchrony with the respiratory signal obtained from the Anzai gating device. The MVF images can be obtained immediately before or during gated treatment delivery. A prototype software tool (named RTReg4D) was developed to register MVF images with phase-sequenced digitally reconstructed radiograph images generated from the treatment planning system based on four-dimensional CT. The image registration can be used to reposition the patient before or during treatment delivery. To demonstrate the reliability and clinical usefulness, the system was first tested using a thoracic phantom and then prospectively in actual patient treatments under an institutional review board-approved protocol. Results: The quality of the MVF images for lung tumors is adequate for image registration with phase-sequenced digitally reconstructed radiographs. The MVF was found to be useful for monitoring inter- and intrafractional variations of tumor positions. With the planning target volume contour displayed on the MVF images, the system can verify whether the moving target stays within the planning target volume margin during gated delivery. Conclusions: The use of MVF images was found to be clinically effective in detecting discrepancies in tumor location before and during respiration-gated treatment delivery. The tools and process developed can be useful for gated treatment delivery verification.

  20. EVEREST: an efficient method for verification of digital signatures in real-time teleradiology.

    PubMed

    Bicakci, Kemal; Baykal, Nazife

    2004-01-01

    The introduction of digital medical images requires a legally binding digital signature that guarantees the authenticity and integrity of the image. In real-time teleradiology services, the system is expected to respond very quickly; however, verifying the signature requires a considerable amount of time to compute the hash value of the image, since the image may be huge (tens of megabytes). Motivated by this fact, we propose EVEREST, an efficient methodology for verification. The key observation is that in traditional verification the processor of the verifying machine is idle (I/O blocked) while the image is downloaded. In EVEREST, to improve real-time efficiency, the receiver performs most of the hash computation while receiving the image itself. Another important advantage of the scheme is communication efficiency, since receiving the entire image file is no longer necessary to detect tampering. PMID:15361011
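The core idea of overlapping hash computation with the download can be sketched with Python's incremental hashing API. This illustrates only the general principle; EVEREST's actual scheme additionally allows tampering to be detected before the whole file arrives, which a single whole-file hash cannot.

```python
import hashlib
import io

def streaming_digest(stream, chunk_size=64 * 1024):
    """Hash the image chunk by chunk as it arrives, instead of waiting
    for the full download: the CPU work overlaps with I/O, so
    verification finishes almost as soon as the last chunk is received."""
    h = hashlib.sha256()
    while chunk := stream.read(chunk_size):
        h.update(chunk)
    return h.hexdigest()

# Usage: the incremental digest equals the one-shot digest of the image
image = bytes(range(256)) * 1000          # stand-in for a medical image
digest = streaming_digest(io.BytesIO(image))
```

In a teleradiology client, `stream` would be the network socket delivering the image, so by the time the transfer completes only the final chunk remains to be hashed.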

  1. On the pinned field image binarization for signature generation in image ownership verification method

    NASA Astrophysics Data System (ADS)

    Lee, Mn-Ta; Chang, Hsuan Ting

    2011-12-01

    The issue of pinned field image binarization for signature generation in the ownership verification of a protected image is investigated. The pinned field captures the texture information of the protected image and can be employed to enhance watermark robustness. In the proposed method, four optimization schemes are utilized to determine the threshold values for transforming the pinned field into a binary feature image, which is then used to generate an effective signature image. Experimental results show that the optimization schemes significantly improve the signature robustness over the previous method (Lee and Chang, Opt. Eng. 49(9), 097005, 2010). Considering both the watermark retrieval rate and the computation speed, the genetic algorithm is strongly recommended. In addition, compared with Chang and Lin's scheme (J. Syst. Softw. 81(7), 1118-1129, 2008), the proposed scheme also performs better.

  2. GRAZING-ANGLE FOURIER TRANSFORM INFRARED SPECTROSCOPY FOR ONLINE SURFACE CLEANLINESS VERIFICATION. YEAR 1

    EPA Science Inventory

    As part of the Online Surface Cleanliness Project, the Naval Facilities Engineering Service Center (NFESC) conducted a study of grazing-angle reflectance Fourier Transform Infrared (FTIR) Spectroscopy as a tool for online cleanliness verification at Department of Defense (DoD) cl...

  3. On-line failure detection and damping measurement of aerospace structures by random decrement signatures

    NASA Technical Reports Server (NTRS)

    Cole, H. A., Jr.

    1973-01-01

    Random decrement signatures of structures vibrating in a random environment are studied through the use of computer-generated and experimental data. The statistical properties obtained indicate that these signatures are stable in form and scale and hence should have wide application in on-line failure detection and damping measurement. On-line procedures are described, and equations are given for estimating the record lengths required to obtain signatures of a prescribed precision.
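The random decrement procedure itself is simple to sketch: average every response segment that begins where the signal crosses a chosen trigger level. The trigger choice and the toy AR(2) resonator standing in for a randomly excited structure are illustrative assumptions.

```python
import numpy as np

def random_decrement(x, trigger, length):
    """Average all segments of x that begin where x crosses the trigger
    level upward. The random excitation averages out across segments,
    leaving a free-decay-like signature whose envelope reflects damping."""
    starts = np.where((x[:-1] < trigger) & (x[1:] >= trigger))[0] + 1
    starts = starts[starts + length <= len(x)]
    return np.mean([x[s:s + length] for s in starts], axis=0)

# Toy structure: a lightly damped resonator driven by white noise
rng = np.random.default_rng(4)
n = 20000
x = np.zeros(n)
for i in range(2, n):
    x[i] = 1.815 * x[i - 1] - 0.9025 * x[i - 2] + rng.standard_normal()
sig = random_decrement(x, trigger=x.std(), length=200)
```

A change in the decay rate of `sig` between monitoring sessions would indicate a change in structural damping, which is the failure-detection cue the abstract describes.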

  4. Online Writer Verification Using Feature Parameters Based on the Document Examiners' Knowledge

    NASA Astrophysics Data System (ADS)

    Nakamura, Yoshikazu; Kidode, Masatsugu

    This paper investigates writer verification using feature parameters based on the knowledge of document examiners, which are automatically extracted from handwritten kanji characters on a digitizing tablet. Criteria for feature selection are established using an evaluation measure obtained by modifying the decidability measure d-prime, and the criteria are applied to evaluation measures calculated from learning samples. Two classifiers based on the frequency distribution of deviations of the selected features are then proposed, and a design method using learning samples is shown. The effectiveness of the proposed method is evaluated by verification experiments on a database including skilled forgeries. The experimental results show that the proposed methods are effective for writer verification.

  5. A method for online verification of adapted fields using an independent dose monitor

    SciTech Connect

    Chang Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-07-15

    Purpose: Clinical implementation of online adaptive radiotherapy requires the generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system. Methods: The fields are modified by translating each multileaf collimator (MLC) defined aperture in the direction of the patient setup error, and magnifying it to account for the distance variation to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of the adapted fields. The system consists of a large-area ion chamber with a spatial gradient in electrode separation, mounted below the MLC, to provide a spatially sensitive signal for each beam segment, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients were modified in response to six randomly chosen setup errors in three orthogonal directions. Results: A total of approximately 49 beams for the modified fields were verified by the IQM system; 97% of the measured IQM signals agreed with the predicted values to within 2%. Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields.

  6. Optical security verification by synthesizing thin films with unique polarimetric signatures.

    PubMed

    Carnicer, Artur; Arteaga, Oriol; Pascual, Esther; Canillas, Adolf; Vallmitjana, Santiago; Javidi, Bahram; Bertran, Enric

    2015-11-15

    This Letter reports the production and optical polarimetric verification of codes based on thin-film technology for security applications. Because thin-film structures display distinctive polarization signatures, these data are used to authenticate the encoded message. Samples are analyzed using an imaging ellipsometer able to measure the 16 components of the Mueller matrix, so the behavior of the thin film under polarized light becomes completely characterized. This information is utilized to distinguish between true and false codes by means of correlation. Without the imaging optics, the components of the Mueller matrix become noise-like distributions and, consequently, the encoded message is no longer available. A set of Stokes vectors is then generated numerically for any polarization state of the illuminating beam, so that machine learning techniques can be used to perform classification. We show that successful authentication is possible using the k-nearest neighbors algorithm on thin-film codes that have been anisotropically phase-encoded with a pseudorandom phase code. PMID:26565884

  7. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... I-9. Documents Acceptable for Employment Eligibility Verification, 73 FR 76505 (Dec. 17, 2008). C... storage of the Form I-9. 71 FR 34510 (June 15, 2006). The interim rule implemented Public Law 108-390, 118... rulemaking. Documents Acceptable for Employment Eligibility Verification, 73 FR 76505 (Dec. 17,...

  8. An optimized online verification imaging procedure for external beam partial breast irradiation.

    PubMed

    Willis, David J; Kron, Tomas; Chua, Boon

    2011-01-01

    The purpose of this study was to evaluate the capabilities of a kilovoltage (kV) on-board imager (OBI)-equipped linear accelerator in the setting of on-line verification imaging for external-beam partial breast irradiation. Available imaging techniques were optimized and assessed for image quality using a modified anthropomorphic phantom. Imaging dose was also assessed. Imaging techniques were assessed for physical clearance between patient and treatment machine using a volunteer. Nonorthogonal kV image pairs were identified as optimal in terms of image quality, clearance, and dose. After institutional review board approval, this approach was used for 17 patients receiving accelerated partial breast irradiation. Imaging was performed before every fraction verification with online correction of setup deviations >5 mm (total image sessions = 170). Treatment staff rated risk of collision and visibility of tumor bed surgical clips where present. Image session duration and detected setup deviations were recorded. For all cases, both image projections (n = 34) had low collision risk. Surgical clips were rated as well as visualized in all cases where they were present (n = 5). The average imaging session time was 6 min, 16 sec, and a reduction in duration was observed as staff became familiar with the technique. Setup deviations of up to 1.3 cm were detected before treatment and subsequently confirmed offline. Nonorthogonal kV image pairs allowed effective and efficient online verification for partial breast irradiation. It has yet to be tested in a multicenter study to determine whether it is dependent on skilled treatment staff. PMID:20510600

  10. Laboratory verification of on-line lithium analysis using ultraviolet absorption spectrometry

    SciTech Connect

    Beemster, B.J.; Schlager, K.J.; Schloegel, K.M.; Kahle, S.J.; Fredrichs, T.L.

    1992-12-31

    Several laboratory experiments were performed to evaluate the capability of absorption spectrometry in the ultraviolet-visible wavelength range, with the objective of developing methods for on-line analysis of lithium directly in the primary coolant of Pressurized Water Reactors using optical probes. Although initial laboratory tests seemed to indicate that lithium could be detected using primary absorption (detection of natural spectra unassisted by reagents), subsequent field tests demonstrated that no primary absorption spectrum exists for lithium in the ultraviolet-visible wavelength range. A second series of tests, conducted recently, did however confirm literature reports that reagents are available that react with lithium to form chelates possessing detectable absorption and fluorescence signatures. These results point to the possible use of secondary techniques for on-line analysis of lithium.

  11. Is Your Avatar Ethical? On-Line Course Tools that Are Methods for Student Identity and Verification

    ERIC Educational Resources Information Center

    Semple, Mid; Hatala, Jeffrey; Franks, Patricia; Rossi, Margherita A.

    2011-01-01

    On-line college courses present a mandate for student identity verification for accreditation and funding sources. Student authentication requires course modification to detect fraud and misrepresentation of authorship in assignment submissions. The reality is that some college students cheat in face-to-face classrooms; however, the potential for…

  13. Efficient cost-sensitive human-machine collaboration for offline signature verification

    NASA Astrophysics Data System (ADS)

    Coetzer, Johannes; Swanepoel, Jacques; Sabourin, Robert

    2012-01-01

    We propose a novel strategy for the optimal combination of human and machine decisions in a cost-sensitive environment. The proposed algorithm should be especially beneficial to financial institutions where off-line signatures, each associated with a specific transaction value, require authentication. When presented with a collection of genuine and fraudulent training signatures, produced by so-called guinea pig writers, the proficiency of a workforce of human employees and a score-generating machine can be estimated and represented in receiver operating characteristic (ROC) space. Using a set of Boolean fusion functions, the majority vote decision of the human workforce is combined with each threshold-specific machine-generated decision. The performance of the candidate ensembles is estimated and represented in ROC space, after which only the optimal ensembles and associated decision trees are retained. When presented with a questioned signature linked to an arbitrary writer, the system first uses the ROC-based cost gradient associated with the transaction value to select the ensemble that minimises the expected cost, and then uses the corresponding decision tree to authenticate the signature in question. We show that, when utilising the entire human workforce, the incorporation of a machine streamlines the authentication process and decreases the expected cost for all operating conditions.
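
    The cost-gradient selection described above can be illustrated with a small sketch: given candidate operating points in ROC space and misclassification costs tied to the transaction value, choose the point that minimises the expected cost. The ROC points, cost values, and class prior below are invented for illustration and are not taken from the paper.

```python
def expected_cost(fpr, tpr, c_fp, c_fn, p_pos=0.5):
    """Expected cost of an ROC operating point (fpr, tpr):
    false acceptances cost c_fp, false rejections cost c_fn."""
    return c_fp * (1 - p_pos) * fpr + c_fn * p_pos * (1 - tpr)

def best_operating_point(roc, c_fp, c_fn, p_pos=0.5):
    """Select the ROC point (e.g. a threshold-specific ensemble)
    that minimises the expected cost."""
    return min(roc, key=lambda pt: expected_cost(pt[0], pt[1], c_fp, c_fn, p_pos))

# Hypothetical (FPR, TPR) points for candidate human-machine ensembles.
roc = [(0.01, 0.60), (0.05, 0.85), (0.20, 0.97)]

# A high transaction value makes false acceptances expensive, pushing the
# choice toward the low-FPR point; a low value favours high TPR instead.
print(best_operating_point(roc, c_fp=1000, c_fn=10))   # (0.01, 0.6)
print(best_operating_point(roc, c_fp=10, c_fn=1000))   # (0.2, 0.97)
```

    This captures the core idea that the operating point is re-selected per questioned signature as the transaction value (and hence the cost gradient) changes.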

  14. Investigation of the spatial resolution of an online dose verification device

    SciTech Connect

    Asuni, G.; Rickey, D. W.; McCurdy, B. M. C.

    2012-02-15

    Purpose: The aim of this work is to characterize a new online dose verification device, COMPASS transmission detector array (IBA Dosimetry, Schwarzenbruck, Germany). The array is composed of 1600 cylindrical ionization chambers of 3.8 mm diameter, separated by 6.5 mm center-to-center spacing, in a 40 x 40 arrangement. Methods: The line spread function (LSF) of a single ion chamber in the detector was measured with a narrow slit collimator for a 6 MV photon beam. The 0.25 x 10 mm{sup 2} slit was formed by two machined lead blocks. The LSF was obtained by laterally translating the detector in 0.25 mm steps underneath the slit over a range of 24 mm and taking a measurement at each step. This measurement was validated with Monte Carlo simulation using BEAMnrc and DOSXYZnrc. The presampling modulation transfer function (MTF), the Fourier transform of the line spread function, was determined and compared to calculated (Monte Carlo and analytical) MTFs. Two head-and-neck intensity modulated radiation therapy (IMRT) fields were measured using the device and were used to validate the LSF measurement. These fields were simulated with the BEAMnrc Monte Carlo model, and the Monte Carlo generated incident fluence was convolved with the 2D detector response function (derived from the measured LSF) to obtain calculated dose. The measured and calculated dose distributions were then quantitatively compared using {chi}-comparison criteria of 3% dose difference and 3 mm distance-to-agreement for in-field points (defined as those above the 10% maximum dose threshold). Results: The full width at half-maximum (FWHM) of the measured detector response for a single chamber is 4.3 mm, which is comparable to the chamber diameter of 3.8 mm. The pre-sampling MTF was calculated, and the resolution of one chamber was estimated as 0.25 lp/mm from the first zero crossing. 
For both examined IMRT fields, the {chi}-comparison between measured and calculated data shows good agreement, with 95.1% and 96.3% of in-field points below {chi} of 1.0 for fields 1 and 2, respectively (with an average {chi} of 0.29 for IMRT field 1 and 0.24 for IMRT field 2). Conclusions: The LSF for a novel online detector has been measured at 6 MV using a narrow-slit technique, and this measurement has been validated by Monte Carlo simulation. The detector response function derived from the line spread function has been applied to recover measured IMRT fields. The results show that the device measures IMRT fields accurately within acceptable tolerance.
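
    The presampling MTF is the magnitude of the Fourier transform of the LSF, normalised to 1 at zero frequency. A minimal sketch is below; the Gaussian LSF with a 4.3 mm FWHM, sampled at the 0.25 mm step size of the slit scan, is only a stand-in for the measured response.

```python
import cmath
import math

def mtf_from_lsf(lsf, dx):
    """Presampling MTF: magnitude of the discrete Fourier transform of the
    line spread function, normalised to 1 at zero frequency.
    Returns (frequencies in lp/mm, MTF values) up to the Nyquist frequency."""
    n = len(lsf)
    freqs, mtf = [], []
    for k in range(n // 2 + 1):
        f = sum(lsf[m] * cmath.exp(-2j * math.pi * k * m / n) for m in range(n))
        freqs.append(k / (n * dx))
        mtf.append(abs(f))
    scale = mtf[0]
    return freqs, [v / scale for v in mtf]

# Illustrative LSF: a Gaussian with 4.3 mm FWHM (the single-chamber width
# reported above), sampled at 0.25 mm steps over -12 mm .. +12 mm.
dx = 0.25
sigma = 4.3 / (2 * math.sqrt(2 * math.log(2)))
xs = [dx * (i - 48) for i in range(97)]
lsf = [math.exp(-x * x / (2 * sigma * sigma)) for x in xs]

freqs, mtf = mtf_from_lsf(lsf, dx)
# The MTF of a Gaussian LSF falls smoothly with frequency; a wider LSF
# (poorer spatial resolution) gives a faster fall-off.
print(round(mtf[0], 3), round(mtf[len(mtf) // 2], 3))  # 1.0 0.0
```

    A real evaluation would use the measured LSF samples in place of the Gaussian and read the resolution limit off the resulting curve.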

  15. Measurement-based Monte Carlo dose calculation system for IMRT pretreatment and on-line transit dose verifications.

    PubMed

    Lin, Mu-Han; Chao, Tsi-Chian; Lee, Chung-Chi; Tung, Chuan-Jong; Yeh, Chie-Yi; Hong, Ji-Hong

    2009-04-01

    The aim of this study was to develop a dose simulation system based on portal dosimetry measurements and the BEAM Monte Carlo code for intensity-modulated (IM) radiotherapy dose verification. This measurement-based Monte Carlo (MBMC) system can perform, within one systematic calculation, both pretreatment and on-line transit dose verifications. BEAMnrc and DOSXYZnrc 2006 were used to simulate radiation transport from the treatment head, through the patient, to the plane of the aS500 electronic portal imaging device (EPID). In order to represent the nonuniform fluence distribution of an IM field within the MBMC simulation, an EPID-measured efficiency map was used to redistribute particle weightings of the simulated phase space distribution of an open field at a plane above a patient/phantom. This efficiency map was obtained by dividing the measured energy fluence distribution of an IM field to that of an open field at the EPID plane. The simulated dose distribution at the midplane of a homogeneous polystyrene phantom was compared to the corresponding distribution obtained from the Eclipse treatment planning system (TPS) for pretreatment verification. It also generated a simulated transit dose distribution to serve as the on-line verification reference for comparison to that measured by the EPID. Two head-and-neck (NPC1 and NPC2) and one prostate cancer fields were tested in this study. To validate the accuracy of the MBMC system, film dosimetry was performed and served as the dosimetry reference. Excellent agreement between the film dosimetry and the MBMC simulation was obtained for pretreatment verification. For all three cases tested, gamma evaluation with 3%/3 mm criteria showed a high pass percentage (> 99.7%) within the area in which the dose was greater than 30% of the maximum dose. In contrast to the TPS, the MBMC system was able to preserve multileaf collimator delivery effects such as the tongue-and-groove effect and interleaf leakage. 
In the NPC1 field, the TPS showed 16.5% overdose due to the tongue-and-groove effect and 14.6% overdose due to improper leaf stepping. Similarly, in the NPC2 field, the TPS showed 14.1% overdose due to the tongue-and-groove effect and 8.9% overdose due to improper leaf stepping. In the prostate cancer field, the TPS showed 6.8% overdose due to improper leaf stepping. No tongue-and-groove effect was observed for this field. For transit dose verification, agreements among the EPID measurement, the film dosimetry, and the MBMC system were also excellent with a minimum gamma pass percentage of 99.6%. PMID:19472622
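
    The efficiency-map step described above (EPID-measured fluence of the IM field divided by that of the open field, used to redistribute phase-space particle weightings) can be sketched as follows. The 2x2 grid, particle record format, and `pixel_of` mapping are invented for this toy example.

```python
def efficiency_map(im_fluence, open_fluence, eps=1e-9):
    """Per-pixel efficiency map: the measured energy fluence of the
    intensity-modulated field divided by that of the open field."""
    return [[im / max(op, eps) for im, op in zip(ri, ro)]
            for ri, ro in zip(im_fluence, open_fluence)]

def reweight(particles, eff, pixel_of):
    """Redistribute phase-space particle weightings with the efficiency map.
    `pixel_of` maps a particle's (x, y) position to a map pixel (i, j)."""
    out = []
    for p in particles:
        i, j = pixel_of(p["x"], p["y"])
        q = dict(p)
        q["weight"] = p["weight"] * eff[i][j]
        out.append(q)
    return out

# Toy 2x2 example: the IM field delivers half the open-field fluence in one
# pixel and blocks another entirely.
open_f = [[1.0, 1.0], [1.0, 1.0]]
im_f   = [[1.0, 0.5], [0.0, 1.0]]
eff = efficiency_map(im_f, open_f)

particles = [{"x": 0, "y": 1, "weight": 2.0}]
print(reweight(particles, eff, lambda x, y: (x, y)))  # weight halved to 1.0
```

    Because the reweighting is applied above the patient, a single simulated open-field phase space can serve any IM field, which is what makes the combined pretreatment and transit verification possible in one calculation.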

  16. Effect of Translational and Rotational Errors on Complex Dose Distributions With Off-Line and On-Line Position Verification

    SciTech Connect

    Lips, Irene M.; Heide, Uulke A. van der; Kotte, Alexis N.T.J.; Vulpen, Marco van; Bel, Arjan

    2009-08-01

    Purpose: To investigate the influence of translational and rotational errors on prostate intensity-modulated radiotherapy (IMRT) with an integrated boost to the tumor and to evaluate the effect of the use of an on-line correction protocol. Methods and Materials: For 19 patients, who had been treated with prostate IMRT and fiducial marker-based position verification, highly inhomogeneous IMRT plans, including an integrated tumor boost, were made using varying margins (2, 4, 6, and 8 mm). The measured translational and rotational errors were used to calculate the dose using two positioning strategies: an off-line and an on-line protocol to correct the translational shifts. The estimated dose to the targets and the organs at risk was compared with the intended dose. Results: Residual deviations after off-line correction led to statistically significant, but very small, reductions in dose coverage. Even when a 2-mm margin was used, the average reduction in dose to 99% of the volume was 1.4 {+-} 1.9 Gy for the tumor, 1.5 {+-} 1.5 Gy for the prostate without seminal vesicles (boost volume), and 4.3 {+-} 4.6 Gy, including the seminal vesicles (clinical target volume). Patients with large systematic rotational errors demonstrated a substantial decrease in dose, especially for the clinical target volume. If an on-line correction protocol was used, the average mean dose and dose to 99% of the volume of the targets improved. However, the extensive dose reduction for patients with large rotational errors barely recovered with on-line correction. Conclusion: For complex prostate IMRT with an integrated tumor boost, the use of an on-line correction protocol yields little improvement without the correction of rotational errors.

  17. Aging in Biometrics: An Experimental Analysis on On-Line Signature

    PubMed Central

    Galbally, Javier; Martinez-Diaz, Marcos; Fierrez, Julian

    2013-01-01

    The first consistent and reproducible evaluation of the effect of aging on dynamic signature is reported. Experiments are carried out on a database generated from two previous datasets which were acquired, under very similar conditions, in 6 sessions distributed in a 15-month time span. Three different systems, representing the current most popular approaches in signature recognition, are used in the experiments, proving the degradation suffered by this trait with the passing of time. Several template update strategies are also studied as possible measures to reduce the impact of aging on the systems' performance. Different results regarding the way in which signatures tend to change with time, and their most and least stable features, are also given. PMID:23894557

  19. MARQ: an online tool to mine GEO for experiments with similar or opposite gene expression signatures

    PubMed Central

    Vazquez, Miguel; Nogales-Cadenas, Ruben; Arroyo, Javier; Botías, Pedro; García, Raul; Carazo, Jose M.; Tirado, Francisco; Pascual-Montano, Alberto; Carmona-Saez, Pedro

    2010-01-01

    The enormous amount of data available in public gene expression repositories such as Gene Expression Omnibus (GEO) offers an inestimable resource to explore gene expression programs across several organisms and conditions. This information can be used to discover experiments that induce similar or opposite gene expression patterns to a given query, which in turn may lead to the discovery of new relationships among diseases, drugs or pathways, as well as the generation of new hypotheses. In this work, we present MARQ, a web-based application that allows researchers to compare a query set of genes, e.g. a set of over- and under-expressed genes, against a signature database built from GEO datasets for different organisms and platforms. MARQ offers an easy-to-use and integrated environment to mine GEO, in order to identify conditions that induce similar or opposite gene expression patterns to a given experimental condition. MARQ also includes additional functionalities for the exploration of the results, including a meta-analysis pipeline to find genes that are differentially expressed across different experiments. The application is freely available at http://marq.dacya.ucm.es. PMID:20513648

  20. Authentication Based on Pole-zero Models of Signature Velocity.

    PubMed

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-10-01

    With the increase of communication and financial transactions over the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Fast and precise algorithms for signature verification are therefore very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested on a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database and the SVC2004 and Sabanci University (SUSIG) benchmark databases. Experimental results based on the Persian, SVC2004, and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62%, and 3.91% on skilled forgeries, respectively. PMID:24696797

  2. Patient-Specific 3D Pretreatment and Potential 3D Online Dose Verification of Monte Carlo-Calculated IMRT Prostate Treatment Plans

    SciTech Connect

    Boggula, Ramesh; Jahnke, Lennart; Wertz, Hansjoerg; Lohr, Frank; Wenz, Frederik

    2011-11-15

    Purpose: Fast and reliable comprehensive quality assurance tools are required to validate the safety and accuracy of complex intensity-modulated radiotherapy (IMRT) plans for prostate treatment. In this study, we evaluated the performance of the COMPASS system for both off-line and potential online procedures for the verification of IMRT treatment plans. Methods and Materials: COMPASS has a dedicated beam model and dose engine; it can reconstruct three-dimensional dose distributions on the patient anatomy based on measured fluences using either the MatriXX two-dimensional (2D) array (offline) or a 2D transmission detector (T2D) (online). For benchmarking the COMPASS dose calculation, various dose-volume indices were compared against Monte Carlo-calculated dose distributions for five prostate patient treatment plans. Gamma index evaluation and absolute point dose measurements were also performed in an inhomogeneous pelvis phantom using extended dose range films and an ion chamber for five additional treatment plans. Results: MatriXX-based dose reconstruction showed excellent agreement with the ion chamber (<0.5%, except for one treatment plan, which showed 1.5%), film ({approx}100% pixels passing gamma criteria 3%/3 mm) and mean dose-volume indices (<2%). The T2D-based dose reconstruction showed good agreement as well with the ion chamber (<2%), film ({approx}99% pixels passing gamma criteria 3%/3 mm), and mean dose-volume indices (<5.5%). Conclusion: The COMPASS system qualifies for routine prostate IMRT pretreatment verification with the MatriXX detector and has the potential for on-line verification of treatment delivery using T2D.
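
    The 3%/3 mm criteria quoted in several of these records combine into the gamma index of Low et al. A minimal 1-D sketch is below; the toy dose profiles are invented, and a real evaluation would additionally restrict scoring to in-field points (e.g. above 10% of the maximum dose) and search in 2-D or 3-D.

```python
import math

def gamma_1d(ref, meas, dx, dose_tol=0.03, dist_tol=3.0):
    """1-D gamma index: for each reference point, the minimum over measured
    points of sqrt((dose_diff/dose_tol)^2 + (distance/dist_tol)^2), with the
    dose difference taken relative to the reference maximum."""
    dmax = max(ref)
    gammas = []
    for i, r in enumerate(ref):
        best = min(
            math.sqrt(((m - r) / (dose_tol * dmax)) ** 2
                      + ((j - i) * dx / dist_tol) ** 2)
            for j, m in enumerate(meas)
        )
        gammas.append(best)
    return gammas

# Toy profiles on a 1 mm grid: the measurement is shifted by one pixel
# relative to the reference, well inside the 3 mm distance tolerance.
ref  = [0, 10, 50, 100, 50, 10, 0]
meas = [0, 0, 10, 50, 100, 50, 10]
g = gamma_1d(ref, meas, dx=1.0)
passing = sum(v <= 1.0 for v in g) / len(g)
print(round(passing, 2))  # 0.86
```

    A point passes when its gamma value is at most 1.0, i.e. some measured point lies within the combined dose/distance ellipse; pass rates like the 95-100% figures above are fractions of such points.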

  3. Online Kidney Position Verification Using Non-Contrast Radiographs on a Linear Accelerator with on Board KV X-Ray Imaging Capability

    SciTech Connect

    Willis, David J.; Kron, Tomas; Hubbard, Patricia; Haworth, Annette; Wheeler, Greg; Duchesne, Gillian M.

    2009-01-01

    The kidneys are dose-limiting organs in abdominal radiotherapy. Kilovoltage (kV) radiographs can be acquired using on-board imager (OBI)-equipped linear accelerators with better soft tissue contrast and lower radiation doses than conventional portal imaging. A feasibility study was conducted to test the suitability of anterior-posterior (AP) non-contrast kV radiographs acquired at treatment time for online kidney position verification. Anthropomorphic phantoms were used to evaluate image quality and radiation dose. Institutional Review Board approval was given for a pilot study that enrolled 5 adults and 5 children. Customized digitally reconstructed radiographs (DRRs) were generated to provide a priori information on kidney shape and position. Radiotherapy treatment staff performed online evaluation of kidney visibility on OBI radiographs. Kidney dose measured in a pediatric anthropomorphic phantom was 0.1 cGy for kV imaging and 1.7 cGy for MV imaging. Kidneys were rated as well visualized in 60% of patients (90% confidence interval, 34-81%). The likelihood of visualization appears to be influenced by the relative AP separation of the abdomen and kidneys, the axial profile of the kidneys, and their relative contrast with surrounding structures. Online verification of kidney position using AP non-contrast kV radiographs on an OBI-equipped linear accelerator appears feasible for patients with suitable abdominal anatomy. Kidney position information provided is limited to 2-dimensional 'snapshots,' but this is adequate in some clinical situations and potentially advantageous in respiratory-correlated treatments. Successful clinical implementation requires customized partial DRRs, appropriate imaging parameters, and credentialing of treatment staff.

  4. Cone-Beam Computed Tomography for On-Line Image Guidance of Lung Stereotactic Radiotherapy: Localization, Verification, and Intrafraction Tumor Position

    SciTech Connect

    Purdie, Thomas G. (E-mail: Tom.Purdie@rmp.uhn.on.ca); Bissonnette, Jean-Pierre; Franks, Kevin; Bezjak, Andrea; Payne, David; Sie, Fanny; Sharpe, Michael B.; Jaffray, David A.

    2007-05-01

    Purpose: Cone-beam computed tomography (CBCT) in-room imaging allows accurate inter- and intrafraction target localization in stereotactic body radiotherapy of lung tumors. Methods and Materials: Image-guided stereotactic body radiotherapy was performed in 28 patients (89 fractions) with medically inoperable Stage T1-T2 non-small-cell lung carcinoma. The targets from the CBCT and planning data set (helical or four-dimensional CT) were matched on-line to determine the couch shift required for target localization. Matching based on the bony anatomy was also performed retrospectively. Verification of target localization was done using either megavoltage portal imaging or CBCT imaging; repeat CBCT imaging was used to assess the intrafraction tumor position. Results: The mean three-dimensional tumor motion for patients with upper lesions (n = 21) and mid-lobe or lower lobe lesions (n = 7) was 4.2 and 6.7 mm, respectively. The mean difference between the target and bony anatomy matching using CBCT was 6.8 mm (SD, 4.9, maximum, 30.3); the difference exceeded 13.9 mm in 10% of the treatment fractions. The mean residual error after target localization using CBCT imaging was 1.9 mm (SD, 1.1, maximum, 4.4). The mean intrafraction tumor deviation was significantly greater (5.3 mm vs. 2.2 mm) when the interval between localization and repeat CBCT imaging (n = 8) exceeded 34 min. Conclusion: In-room volumetric imaging, such as CBCT, is essential for target localization accuracy in lung stereotactic body radiotherapy. Imaging that relies on bony anatomy as a surrogate of the target may provide erroneous results in both localization and verification.

  5. SU-E-J-46: Development of a Compton Camera Prototype for Online Range Verification of Laser-Accelerated Proton Beams

    SciTech Connect

    Thirolf, PG; Bortfeldt, J; Lang, C; Parodi, K; Aldawood, S; Boehmer, M; Gernhaeuser, R; Maier, L; Castelhano, I; Kolff, H van der; Schaart, DR

    2014-06-01

    Purpose: Development of a photon detection system designed for online range verification of laser-accelerated proton beams via prompt-gamma imaging of nuclear reactions. Methods: We develop a Compton camera for the position-sensitive detection of prompt photons emitted from nuclear reactions between the proton beam and biological samples. The detector is designed to be capable to reconstruct the photon source origin not only from the Compton scattering kinematics of the primary photon, but also to allow for tracking of the Compton-scattered electrons. Results: Simulation studies resulted in the design of the Compton camera based on a LaBr{sub 3}(Ce) scintillation crystal acting as absorber, preceded by a stacked array of 6 double-sided silicon strip detectors as scatterers. From the design simulations, an angular resolution of ≤ 2° and an image reconstruction efficiency of 10{sup −3} −10{sup −5} (at 2–6 MeV) can be expected. The LaBr{sub 3} crystal has been characterized with calibration sources, resulting in a time resolution of 273 ps (FWHM) and an energy resolution of about 3.8% (FWHM). Using a collimated (1 mm diameter) {sup 137}Cs calibration source, the light distribution was measured for each of 64 pixels (6×6 mm{sup 2}). Data were also taken with 0.5 mm collimation and 0.5 mm step size to generate a reference library of light distributions that allows for reconstructing the interaction position of the initial photon using a k-nearest neighbor (k-NN) algorithm developed by the Delft group. Conclusion: The Compton-camera approach for prompt-gamma detection offers promising perspectives for ion beam range verification. A Compton camera prototype is presently being developed and characterized in Garching. 
Furthermore, an arrangement of, e.g., 4 camera modules could even be used in a ‘gamma-PET’ mode to detect delayed annihilation radiation from positron emitters during irradiation interrupts, with improved performance in the presence of an additional third (prompt) photon (as in {sup 10}C and {sup 14}O). This work was supported by the DFG Cluster of Excellence MAP (Munich-Centre for Advanced Photonics).

  6. Verifiable threshold signature schemes against conspiracy attack.

    PubMed

    Gan, Yuan-Ju

    2004-01-01

    In this study, the author designs new verifiable (t,n) threshold untraceable signature schemes. The proposed schemes have the following properties: (1) Verification: the shadows of the secret distributed by the trusted center can be verified by all of the participants; (2) Security: even if the number of dishonest members exceeds the threshold value, they cannot obtain the system secret parameters, such as the group secret key, or forge another member's individual signature; (3) Efficient verification: the verifier can verify the group signature easily, and the verification time of the group signature is equivalent to that of an individual signature; (4) Untraceability: the signers of the group signature cannot be traced. PMID:14663852
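
    The "verifiable shadows" property can be illustrated with a Feldman-style sketch, which is a standard construction and not the author's scheme: the dealer publishes commitments to the polynomial coefficients, and each participant checks its shadow against them. The tiny parameters (p = 23, q = 11, g = 2, where g generates the order-q subgroup) are for demonstration only and offer no security.

```python
import random

def make_shares(secret, t, n, q=11, p=23, g=2):
    """Feldman-style verifiable (t, n) sharing: shares are points on a
    random degree-(t-1) polynomial over GF(q); the public commitments
    are g raised to each coefficient, modulo p (here p = 2q + 1)."""
    coeffs = [secret % q] + [random.randrange(q) for _ in range(t - 1)]
    shares = [(i, sum(a * i**j for j, a in enumerate(coeffs)) % q)
              for i in range(1, n + 1)]
    commitments = [pow(g, a, p) for a in coeffs]
    return shares, commitments

def verify_share(i, s, commitments, p=23, g=2):
    """Each participant checks its shadow s against the commitments:
    g^s must equal the product of C_j^(i^j) modulo p."""
    rhs = 1
    for j, c in enumerate(commitments):
        rhs = rhs * pow(c, i**j, p) % p
    return pow(g, s, p) == rhs

random.seed(1)
shares, comms = make_shares(secret=7, t=3, n=5)
print(all(verify_share(i, s, comms) for i, s in shares))           # True
print(verify_share(shares[0][0], (shares[0][1] + 1) % 11, comms))  # False
```

    The check works because the commitment product equals g raised to the polynomial evaluated at i, so a tampered shadow fails without revealing the coefficients themselves.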

  7. Signature data generation method

    NASA Astrophysics Data System (ADS)

    Jiao, Tianshi; Zhang, Changshui

    2001-09-01

    In on-line signature identification it is usually difficult to obtain enough samples. This paper focuses on generating new signatures from only a few samples collected in advance. The generation process uses knowledge drawn from the observation of many signatures. This knowledge includes the relationship between the writing-speed sequences of skilled signatures at different scales, that is, the relationship between speed and scale. It also includes the relationship between the writing speeds of adjacent points within a signature, which can be considered the relationship between speed and local surrounding. By modeling the speed sequence of a signature as a conditional k-th order Markov chain, the relationship between speed and local surrounding is described by the transition probabilities of the chain. The generated signatures show that the proposed method is useful, and that the relations identified are stable and can be applied directly to signature identification.
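
    The k-th order Markov modeling of the speed sequence can be sketched as below. The quantised speed levels, toy sample sequences, and fallback for unseen contexts are all invented for illustration; the paper's actual model operates on real pen-velocity data.

```python
import random
from collections import defaultdict

def learn_transitions(sequences, k=2):
    """k-th order transition table: the next quantised pen speed
    conditioned on the previous k speeds (the 'local surrounding')."""
    table = defaultdict(list)
    for seq in sequences:
        for t in range(k, len(seq)):
            table[tuple(seq[t - k:t])].append(seq[t])
    return table

def generate(table, seed, length, k=2, rng=None):
    """Sample a new speed sequence from the learned conditional
    distribution, starting from a k-length seed context."""
    rng = rng or random.Random(0)
    out = list(seed)
    while len(out) < length:
        # Unseen context: fall back to repeating the current speed.
        candidates = table.get(tuple(out[-k:]), [out[-1]])
        out.append(rng.choice(candidates))
    return out

# Toy quantised speed sequences from a few collected signatures
# (1 = slow, 3 = fast); the real signal would be a sampled pen velocity.
samples = [[1, 2, 3, 3, 2, 1, 1, 2, 3, 3, 2, 1],
           [1, 2, 3, 2, 1, 1, 2, 3, 2, 1]]
table = learn_transitions(samples, k=2)
new_seq = generate(table, seed=[1, 2], length=10)
print(new_seq[:2], len(new_seq))  # [1, 2] 10
```

    Sampling from the learned transition table yields speed sequences that share the local dynamics of the collected signatures, which is the basis for synthesising additional training samples.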

  8. Signature-based store checking buffer

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification, are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify the content is error-free.
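The buffer's logic can be sketched in software: fold each redundant instance's stores into a running hash and compare digests only at the barrier, instead of comparing every store pair-wise. This is an illustrative analogue, not the patented hardware design; the class and method names are invented:

```python
import hashlib

class StoreFingerprintBuffer:
    """Software analogue of a store fingerprint buffer: each redundant
    computation instance accumulates its stores into a signature, and the
    barrier check passes only if all signatures agree."""
    def __init__(self, n_instances):
        self.sigs = [hashlib.sha256() for _ in range(n_instances)]

    def record(self, instance, address, value):
        # Fold one store (address, value) into that instance's signature.
        self.sigs[instance].update(f"{address}:{value}".encode())

    def barrier_check(self):
        # Content is declared error-free iff every instance's digest matches.
        return len({s.hexdigest() for s in self.sigs}) == 1
```

Comparing one digest per instance at the barrier is what makes the scheme cheap: the cost is independent of the number of stores between barriers.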

  9. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention.

    PubMed

    Proyer, Ren T; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths, which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using SS with one on using individual low scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on LS rather than SS and those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness. PMID:25954221

  10. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention

    PubMed Central

    Proyer, René T.; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths, which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using SS with one on using individual low scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on LS rather than SS and those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness. PMID:25954221

  11. 76 FR 60112 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-09-28

    ... From the Federal Register Online via the Government Publishing Office SOCIAL SECURITY ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social Security... service, visit our Internet site, Social Security Online, at http://www.socialsecurity.gov . Gerard...

  12. A Quantum Multi-proxy Blind Signature Scheme Based on Genuine Four-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping

    2015-06-01

    In this paper, we propose a multi-proxy blind signature scheme based on controlled teleportation. Genuine four-qubit entangled state functions as quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security analysis shows the scheme satisfies the security features of multi-proxy signature, unforgeability, undeniability, blindness and unconditional security.

  13. A Quantum Multi-proxy Blind Signature Scheme Based on Genuine Four-Qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Tian, Juan-Hong; Zhang, Jian-Zhong; Li, Yan-Ping

    2016-02-01

    In this paper, we propose a multi-proxy blind signature scheme based on controlled teleportation. Genuine four-qubit entangled state functions as quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security analysis shows the scheme satisfies the security features of multi-proxy signature, unforgeability, undeniability, blindness and unconditional security.

  14. Characterization of a fiber-coupled Al{sub 2}O{sub 3}:C luminescence dosimetry system for online in vivo dose verification during {sup 192}Ir brachytherapy

    SciTech Connect

    Andersen, Claus E.; Nielsen, Soeren Kynde; Greilich, Steffen; Helt-Hansen, Jakob; Lindegaard, Jacob Christian; Tanderup, Kari

    2009-03-15

    A prototype of a new dose-verification system has been developed to facilitate prevention and identification of dose delivery errors in remotely afterloaded brachytherapy. The system allows for automatic online in vivo dosimetry directly in the tumor region using small passive detector probes that fit into applicators such as standard needles or catheters. The system measures the absorbed dose rate (0.1 s time resolution) and total absorbed dose on the basis of radioluminescence (RL) and optically stimulated luminescence (OSL) from aluminum oxide crystals attached to optical fiber cables (1 mm outer diameter). The system was tested in the range from 0 to 4 Gy using a solid-water phantom, a Varian GammaMed Plus {sup 192}Ir PDR afterloader, and dosimetry probes inserted into stainless-steel brachytherapy needles. The calibrated system was found to be linear in the tested dose range. The reproducibility (one standard deviation) for RL and OSL measurements was 1.3%. The measured depth-dose profiles agreed well with the theoretical expectations computed with the EGSNRC Monte Carlo code, suggesting that the energy dependence for the dosimeter probes (relative to water) is less than 6% for source-to-probe distances in the range of 2-50 mm. Under certain conditions, the RL signal could be greatly disturbed by the so-called stem signal (i.e., unwanted light generated in the fiber cable upon irradiation). The OSL signal is not subject to this source of error. The tested system appears to be adequate for in vivo brachytherapy dosimetry.

  15. ETV - VERIFICATION TESTING (ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM)

    EPA Science Inventory

    Verification testing is a major component of the Environmental Technology Verification (ETV) program. The ETV Program was instituted to verify the performance of innovative technical solutions to problems that threaten human health or the environment and was created to substantia...

  16. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search the needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide and conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
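The swarm idea can be sketched under simplifying assumptions (the paper diversifies full model-checker runs; here, differently-seeded random walks stand in for them, and threads stand in for separate machines): launch many small searches in parallel and accept the first member that reaches a target state.

```python
import random
from concurrent.futures import ThreadPoolExecutor

def random_search(seed, start, successors, is_target, max_steps=10_000):
    """One swarm member: a bounded randomized walk with its own seed,
    so each member explores a different slice of the state space."""
    rng, state, seen = random.Random(seed), start, set()
    for _ in range(max_steps):
        if is_target(state):
            return state
        seen.add(state)
        frontier = [s for s in successors(state) if s not in seen]
        if not frontier:
            return None
        state = rng.choice(frontier)
    return None

def swarm_verify(start, successors, is_target, n_workers=8):
    """Run n_workers diversified searches and return any hit."""
    with ThreadPoolExecutor(n_workers) as ex:
        futures = [ex.submit(random_search, seed, start, successors, is_target)
                   for seed in range(n_workers)]
        hits = [f.result() for f in futures]
    return next((h for h in hits if h is not None), None)
```

The point of the diversification is that no coordination between members is needed, which is exactly why the usual divide-and-conquer obstacles for model checkers do not apply.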

  17. Towards a Better Understanding of the Oxygen Isotope Signature of Atmospheric CO2: Determining the 18O-Exchange Between CO2 and H2O in Leaves and Soil On-line with Laser-Based Spectroscopy

    NASA Astrophysics Data System (ADS)

    Gangi, L.; Rothfuss, Y.; Vereecken, H.; Brueggemann, N.

    2013-12-01

    The oxygen isotope signature of carbon dioxide (δ18O-CO2) is a powerful tool to disentangle CO2 fluxes in terrestrial ecosystems, as CO2 attains a contrasting 18O signature by the interaction with isotopically different soil and leaf water pools during soil respiration and photosynthesis, respectively. However, using the δ18O-CO2 signal to quantify plant-soil-atmosphere CO2 fluxes is still challenging due to a lack of knowledge concerning the magnitude and effect of individual fractionation processes during CO2 and H2O diffusion and during CO2-H2O isotopic exchange in soils and leaves, especially related to short-term changes in environmental conditions (non-steady state). This study addresses this research gap by combined on-line monitoring of the oxygen isotopic signature of CO2 and water vapor during gas exchange in soil and plant leaves with laser-based spectroscopy, using soil columns and plant chambers. In both experimental setups, the measured δ18O of water vapor was used to infer the δ18O of liquid water, and, together with the δ18O-CO2, the degree of oxygen isotopic equilibrium between the two species (θ). Gas exchange experiments with different functional plant types (C3 coniferous, C3 monocotyledonous, C3 dicotyledonous, C4) revealed that θ and the influence of the plant on the ambient δ18O-CO2 (CO18O-isoforcing) not only varied on a diurnal timescale but also when plants were exposed to limited water availability, elevated air temperature, and abrupt changes in light intensity (sunflecks). Maximum θ before treatments ranged between 0.7 and 0.8 for the C3 dicotyledonous (poplar) and C3 monocotyledonous (wheat) plants, and between 0.5 and 0.6 for the conifer (spruce) and C4 plant (maize) while maximum CO18O-isoforcing was highest in wheat (0.03 m s-1 ‰), similar in poplar and maize (0.02 m s-1 ‰), and lowest in spruce (0.01 m s-1 ‰). 
Multiple regression analysis showed that up to 97 % of temporal dynamics in CO18O-isoforcing could be explained by variations in stomatal conductance, θ, and δ18O of H2O at the evaporation site. The determined maximum in vivo activity of carbonic anhydrase, the enzyme which catalyzes the CO2-H2O oxygen isotope exchange inside leaves, varied between the different plant species and was, as observed for θ, higher in poplar and wheat, and lower in maize and spruce. Preliminary experiments with soil columns filled with sand demonstrated that gas-permeable microporous polypropylene tubing, which was installed at different depths in the soil columns, was appropriate for determining δ18O-H2O and δ18O-CO2 simultaneously without fractionation. Hence, this new methodology is promising for further studies on the oxygen isotopic exchange between CO2 and H2O in soils. Altogether, this study highlights that the δ18O-CO2 exchange in the soil-plant-atmosphere continuum is highly dynamic in response to short-term variations in environmental conditions, and emphasizes the need for an improved parameterization of models simulating δ18O-CO2.

  18. Signatures of Reputation

    NASA Astrophysics Data System (ADS)

    Bethencourt, John; Shi, Elaine; Song, Dawn

    Reputation systems have become an increasingly important tool for highlighting quality information and filtering spam within online forums. However, the dependence of a user's reputation on their history of activities seems to preclude any possibility of anonymity. We show that useful reputation information can, in fact, coexist with strong privacy guarantees. We introduce and formalize a novel cryptographic primitive we call signatures of reputation which supports monotonic measures of reputation in a completely anonymous setting. In our system, a user can express trust in others by voting for them, collect votes to build up her own reputation, and attach a proof of her reputation to any data she publishes, all while maintaining the unlinkability of her actions.

  19. Modeling the Lexical Morphology of Western Handwritten Signatures

    PubMed Central

    Diaz-Cabrera, Moises; Ferrer, Miguel A.; Morales, Aythami

    2015-01-01

    A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a General Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwriting signatures. PMID:25860942
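A sketch of the distribution family the authors fit, assuming the standard three-parameter GEV parameterization (location mu, scale sigma, shape xi); the function name is ours:

```python
import math

def gev_cdf(x, mu, sigma, xi):
    """Generalized Extreme Value CDF -- the family used to model
    signature-morphology features such as stroke counts."""
    if abs(xi) < 1e-12:  # xi -> 0: Gumbel limit
        return math.exp(-math.exp(-(x - mu) / sigma))
    t = 1 + xi * (x - mu) / sigma
    if t <= 0:  # outside the distribution's support
        return 0.0 if xi > 0 else 1.0
    return math.exp(-t ** (-1 / xi))
```

The shape parameter xi controls the tail: xi > 0 gives a heavy upper tail and a lower bound on the support, which is a natural fit for strictly positive morphology counts.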

  20. Modeling the lexical morphology of Western handwritten signatures.

    PubMed

    Diaz-Cabrera, Moises; Ferrer, Miguel A; Morales, Aythami

    2015-01-01

    A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a General Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwriting signatures. PMID:25860942

  1. Verification Tools Secure Online Shopping, Banking

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Just like rover or rocket technology sent into space, the software that controls these technologies must be extensively tested to ensure reliability and effectiveness. Ames Research Center invented the open-source Java Pathfinder (JPF) toolset for the deep testing of Java-based programs. Fujitsu Labs of America Inc., based in Sunnyvale, California, improved the capabilities of the JPF Symbolic Pathfinder tool, establishing the tool as a means of thoroughly testing the functionality and security of Web-based Java applications such as those used for Internet shopping and banking.

  2. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    PubMed Central

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve its goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247
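The zero-watermarking idea can be sketched minimally (the paper's algorithm is more involved; the helper names here are hypothetical): derive per-block hashes from the cover text, store them separately as the "watermark", and later locate tampering without ever having modified the text itself.

```python
import hashlib

def block_signature(text, block_size=16):
    """Hash fixed-size blocks of the cover text so tampering can be
    located, not just detected; the text is never modified."""
    blocks = [text[i:i + block_size] for i in range(0, len(text), block_size)]
    return [hashlib.sha256(b.encode()).hexdigest() for b in blocks]

def locate_tampering(text, signature, block_size=16):
    """Return the indices of blocks whose current hash disagrees
    with the stored signature."""
    current = block_signature(text, block_size)
    return [i for i, (a, b) in enumerate(zip(current, signature)) if a != b]
```

Storing the block hashes outside the document is what distinguishes zero-watermarking from embedding schemes: integrity is verifiable, yet the cover text stays byte-identical.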

  3. A hybrid digital-signature and zero-watermarking approach for authentication and protection of sensitive electronic documents.

    PubMed

    Tayan, Omar; Kabir, Muhammad N; Alginahi, Yasser M

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve its goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247

  4. Quantum blind dual-signature scheme without arbitrator

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

    Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., initial phase, signing phase and verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. It does not rely much on an arbitrator in the verification phase as the previous quantum signature schemes usually do. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or an attacker. It provides a potential application for e-commerce or e-payment systems with the current technology.

  5. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  6. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  7. Developing composite signatures

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.; Carpenter, Tom; Cappelaere, Patrice G.; Frye, Stu; Lemoigne-Stewart, Jacqueline J.; Mandle, Dan; Montgomery, Sarah; Williams-Bess, Autumn

    2011-06-01

    A composite signature is a group of signatures that are related in such a way as to more completely or further define a target or operational endeavor at a higher fidelity. This paper explores the merits of using composite signatures, in lieu of waiting for opportunities for the more elusive diagnostic signatures, to satisfy key essential elements of information (EEI) associated with civil disaster-related problems. It discusses efforts to refine composite signature development methodology and quantify the relative value of composite vs. diagnostic signatures. The objectives are to: 1) investigate and develop innovative composite signatures associated with civil disasters, including physical, chemical and pattern/behavioral; 2) explore the feasibility of collecting representative composite signatures using current and emerging intelligence, surveillance, and reconnaissance (ISR) collection architectures leveraging civilian and commercial architectures; and 3) collaborate extensively with scientists and engineers from U.S. government organizations and laboratories, the defense industry, and academic institutions.

  8. Signatures support program

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.

    2009-05-01

    The Signatures Support Program (SSP) leverages the full spectrum of signature-related activities (collections, processing, development, storage, maintenance, and dissemination) within the Department of Defense (DOD), the intelligence community (IC), other Federal agencies, and civil institutions. The Enterprise encompasses acoustic, seismic, radio frequency, infrared, radar, nuclear radiation, and electro-optical signatures. The SSP serves the war fighter, the IC, and civil institutions by supporting military operations, intelligence operations, homeland defense, disaster relief, acquisitions, and research and development. Data centers host and maintain signature holdings, collectively forming the national signatures pool. The geographically distributed organizations are the authoritative sources and repositories for signature data; the centers are responsible for data content and quality. The SSP proactively engages DOD, IC, other Federal entities, academia, and industry to locate signatures for inclusion in the distributed national signatures pool and provides world-wide 24/7 access via the SSP application.

  9. Exotic signatures from supersymmetry

    SciTech Connect

    Hall, L.J. (Dept. of Physics, Lawrence Berkeley Lab., CA)

    1989-08-01

    Minor changes to the standard supersymmetric model, such as soft flavor violation and R parity violation, cause large changes in the signatures. The origin of these changes and the resulting signatures are discussed. 15 refs., 7 figs., 2 tabs.

  10. Simulation verification techniques study

    NASA Technical Reports Server (NTRS)

    Schoonmaker, P. B.; Wenglinski, T. H.

    1975-01-01

    Results are summarized of the simulation verification techniques study which consisted of two tasks: to develop techniques for simulator hardware checkout and to develop techniques for simulation performance verification (validation). The hardware verification task involved definition of simulation hardware (hardware units and integrated simulator configurations), survey of current hardware self-test techniques, and definition of hardware and software techniques for checkout of simulator subsystems. The performance verification task included definition of simulation performance parameters (and critical performance parameters), definition of methods for establishing standards of performance (sources of reference data or validation), and definition of methods for validating performance. Both major tasks included definition of verification software and assessment of verification data base impact. An annotated bibliography of all documents generated during this study is provided.

  11. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... From the Federal Register Online via the Government Publishing Office SOCIAL SECURITY ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social Security...-6401, , for more information about the CBSV service, visit our Internet site, Social Security...

  12. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  13. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  14. An Arbitrated Quantum Signature with Bell States

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Qin, Su-Juan; Huang, Wei

    2014-05-01

    Entanglement is the main resource in quantum communication. The main aims of the arbitrated quantum signature (AQS) scheme are to present an application of entanglement in cryptology and to prove the possibility of the quantum signature. More specifically, the main function of quantum entangled states in the existing AQS schemes is to assist the signatory to transfer quantum states to the receiver. However, teleportation and the Leung quantum one-time pad (L-QOTP) algorithm are not enough to design a secure AQS scheme. For example, Pauli operations commute or anticommute with each other, which makes forgery and disavowal attacks easy to implement. To conquer this shortcoming, we construct an improved AQS scheme using a new QOTP algorithm. This scheme has three advantages: it randomly uses the Hadamard operation in the new QOTP to resist attacks exploiting the anticommutativity of nontrivial Pauli operators, while preserving almost all merits of the existing AQS schemes; even in the process of handling disputes, no party has a chance to change the message and its signature without being discovered; and the receiver can verify the integrity of the signature and discover the disavowal of the signatory even in the last step of verification.
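The forgery risk the authors point to rests on a basic algebraic fact: distinct nontrivial Pauli operators either commute or anticommute. A quick stdlib check of the X/Z pair (plain 2x2 integer matrices, names ours):

```python
def matmul(a, b):
    """2x2 matrix product, enough to check (anti)commutation."""
    return [[sum(a[i][k] * b[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

X = [[0, 1], [1, 0]]    # Pauli X (bit flip)
Z = [[1, 0], [0, -1]]   # Pauli Z (phase flip)

xz = matmul(X, Z)
zx = matmul(Z, X)
# XZ = -ZX: the pair anticommutes, which is the regularity an attacker
# can exploit against a plain Pauli-based QOTP encryption.
```

Mixing in the Hadamard operation, as the improved scheme does, breaks this fixed (anti)commutation pattern and removes the attacker's handle.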

  15. Scientists Using TCGA Data Identify 21 Mutational Signatures in Cancer

    Cancer.gov

    Many mutations have been implicated in human cancer, but the biological mechanisms that produce them remain largely unknown. In a study published online in Nature on August 14, 2013, researchers identified 21 signatures of mutational processes underlying 30 types of cancer. Characterizing mutational signatures may provide a greater understanding of the mechanistic basis of cancer and potentially lead to better treatments that target its root causes.

  16. Mobile Pit verification system design based on passive special nuclear material verification in weapons storage facilities

    SciTech Connect

    Paul, J. N.; Chin, M. R.; Sjoden, G. E.

    2013-07-01

    A mobile 'drive by' passive radiation detection system to be applied in special nuclear materials (SNM) storage facilities for validation and compliance purposes has been designed through the use of computational modeling and new radiation detection methods. This project was the result of work over a 1 year period to create optimal design specifications, including the creation of 3D models using both Monte Carlo and deterministic codes to characterize the gamma and neutron leakage out of each surface of SNM-bearing canisters. Results were compared and agreement was demonstrated between both models. Container leakages were then used to determine the expected reaction rates, using transport theory, in the detectors when placed at varying distances from the can. A 'typical' background signature was incorporated to determine the minimum signatures versus the probability of detection to evaluate moving source protocols with collimation. This established the criteria for verification of source presence and time gating at a given vehicle speed. New methods for the passive detection of SNM were employed and shown to give reliable identification of age and material for highly enriched uranium (HEU) and weapons grade plutonium (WGPu). The finalized 'Mobile Pit Verification System' (MPVS) design demonstrated that a 'drive-by' detection system, collimated and operating at nominally 2 mph, is capable of rapidly verifying each and every weapon pit stored in regularly spaced, shelved storage containers, using completely passive gamma and neutron signatures for HEU and WGPu. This system is ready for real evaluation to demonstrate passive total material accountability in storage facilities. (authors)

  17. Voltage verification unit

    DOEpatents

    Martin, Edward J. (Virginia Beach, VA)

    2008-01-15

A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out; once the Lock Out/Tag Out is completed, testing or "trying" can be accomplished by simply reading a display on the voltage verification unit, without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  18. Digital Signature Management.

    ERIC Educational Resources Information Center

    Hassler, Vesna; Biely, Helmut

    1999-01-01

    Describes the Digital Signature Project that was developed in Austria to establish an infrastructure for applying smart card-based digital signatures in banking and electronic-commerce applications. Discusses the need to conform to international standards, an international certification infrastructure, and security features for a public directory…

  20. A Blind Quantum Signature Scheme with χ-type Entangled States

    NASA Astrophysics Data System (ADS)

    Yin, Xun-Ru; Ma, Wen-Ping; Liu, Wei-Yan

    2012-02-01

A blind quantum signature scheme with χ-type entangled states is proposed, which can be applied to an E-voting system. In this scheme, the particles in the χ-type state sequence are used first for quantum key distribution and then for quantum signature. Our scheme is characterized by its blindness, impossibility of forgery, and impossibility of disavowal. In addition, our scheme can perform an audit program with respect to the validity of the verification process in the light of actual requirements. The security of the scheme is also analyzed.

  1. Signature detection and matching for document image retrieval.

    PubMed

    Zhu, Guangyu; Zheng, Yefeng; Doermann, David; Jaeger, Stefan

    2009-11-01

As one of the most pervasive methods of individual identification and document authentication, signatures present convincing evidence and provide an important form of indexing for effective document image processing and retrieval in a broad range of applications. However, detection and segmentation of free-form objects such as signatures from a cluttered background is currently an open document analysis problem. In this paper, we focus on two fundamental problems in signature-based document image retrieval. First, we propose a novel multiscale approach to jointly detecting and segmenting signatures from document images. Rather than focusing on local features that typically have large variations, our approach captures the structural saliency using a signature production model and computes the dynamic curvature of 2D contour fragments over multiple scales. This detection framework is general and computationally tractable. Second, we treat the problem of signature retrieval in the unconstrained setting of translation, scale, and rotation invariant nonrigid shape matching. We propose two novel measures of shape dissimilarity based on anisotropic scaling and registration residual error and present a supervised learning framework for combining complementary shape information from different dissimilarity metrics using LDA. We quantitatively study state-of-the-art shape representations, shape matching algorithms, measures of dissimilarity, and the use of multiple instances as query in document image retrieval. We further demonstrate our matching techniques in offline signature verification. Extensive experiments using large real-world collections of English and Arabic machine-printed and handwritten documents demonstrate the excellent performance of our approaches. PMID:19762928
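
The paper's final step, combining complementary dissimilarity measures with LDA, can be sketched as follows. The two score distributions below are synthetic stand-ins for measures like the anisotropic-scaling cost and registration residual described above; nothing here reproduces the authors' actual features:

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(1)

# Synthetic pairs: two complementary dissimilarity scores per signature pair.
n = 200
genuine = rng.normal([0.2, 0.3], 0.1, size=(n, 2))  # matching pairs: low scores
forgery = rng.normal([0.7, 0.8], 0.1, size=(n, 2))  # non-matching: high scores
X = np.vstack([genuine, forgery])
y = np.array([1] * n + [0] * n)                     # 1 = same writer

# LDA learns a linear combination of the two dissimilarities that best
# separates matching from non-matching pairs.
lda = LinearDiscriminantAnalysis().fit(X, y)
accuracy = lda.score(X, y)
print(accuracy > 0.9)
```

The point of the combination is that each dissimilarity captures different shape information, so the learned linear blend outperforms either score thresholded alone.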

  2. Twin Signature Schemes, Revisited

    NASA Astrophysics Data System (ADS)

    Schäge, Sven

In this paper, we revisit the twin signature scheme by Naccache, Pointcheval and Stern from CCS 2001 that is secure under the Strong RSA (SRSA) assumption and improve its efficiency in several ways. First, we present a new twin signature scheme that is based on the Strong Diffie-Hellman (SDH) assumption in bilinear groups and allows for very short signatures and key material. A big advantage of this scheme is that, in contrast to the original scheme, it does not require a computationally expensive function for mapping messages to primes. We prove this new scheme secure under adaptive chosen message attacks. Second, we present a modification that allows us to significantly increase efficiency when signing long messages. This construction uses collision-resistant hash functions as its basis. As a result, our improvements make the signature length independent of the message size. Our construction deviates from the standard hash-and-sign approach in which the hash value of the message is signed in place of the message itself. We show that in the case of twin signatures, one can exploit the properties of the hash function as an integral part of the signature scheme. This improvement can be applied to both the SRSA based and SDH based twin signature scheme.

  3. An archaeal genomic signature

    NASA Technical Reports Server (NTRS)

    Graham, D. E.; Overbeek, R.; Olsen, G. J.; Woese, C. R.

    2000-01-01

    Comparisons of complete genome sequences allow the most objective and comprehensive descriptions possible of a lineage's evolution. This communication uses the completed genomes from four major euryarchaeal taxa to define a genomic signature for the Euryarchaeota and, by extension, the Archaea as a whole. The signature is defined in terms of the set of protein-encoding genes found in at least two diverse members of the euryarchaeal taxa that function uniquely within the Archaea; most signature proteins have no recognizable bacterial or eukaryal homologs. By this definition, 351 clusters of signature proteins have been identified. Functions of most proteins in this signature set are currently unknown. At least 70% of the clusters that contain proteins from all the euryarchaeal genomes also have crenarchaeal homologs. This conservative set, which appears refractory to horizontal gene transfer to the Bacteria or the Eukarya, would seem to reflect the significant innovations that were unique and fundamental to the archaeal "design fabric." Genomic protein signature analysis methods may be extended to characterize the evolution of any phylogenetically defined lineage. The complete set of protein clusters for the archaeal genomic signature is presented as supplementary material (see the PNAS web site, www.pnas.org).
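
The signature definition itself, protein families present in at least two diverse euryarchaeal genomes and lacking homologs outside the Archaea, reduces to simple set operations. A toy sketch with invented family and genome names (not real gene identifiers):

```python
from collections import Counter

# Toy illustration of the signature definition: families present in at least
# two euryarchaeal genomes and absent outside the Archaea. All names invented.
euryarchaeal_genomes = {
    "genomeA": {"fam1", "fam2", "fam3", "fam5"},
    "genomeB": {"fam1", "fam3", "fam4"},
    "genomeC": {"fam2", "fam3", "fam6"},
    "genomeD": {"fam3", "fam5", "fam7"},
}
# Families with recognizable bacterial or eukaryal homologs are excluded.
non_archaeal_families = {"fam2", "fam6"}

counts = Counter(f for fams in euryarchaeal_genomes.values() for f in fams)
signature = {f for f, c in counts.items()
             if c >= 2 and f not in non_archaeal_families}
print(sorted(signature))  # ['fam1', 'fam3', 'fam5']
```

In the actual study the "families" are clusters of homologous protein-encoding genes, and membership is decided by sequence similarity searches rather than exact name matching.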

  4. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    McMillan, Hilary; Westerberg, Ida

    2015-04-01

Information that summarises the hydrological behaviour or flow regime of a catchment is essential for comparing the responses of different catchments to understand catchment organisation and similarity, and for many other modelling and water-management applications. Such information, derived as index values from observed data, is known as hydrological signatures; signatures include descriptors of high flows (e.g. mean annual flood), low flows (e.g. mean annual low flow, recession shape), flow variability, the flow duration curve, and the runoff ratio. Because hydrological signatures are calculated from observed data such as rainfall and flow records, they are affected by uncertainty in those data. Subjective choices in the method used to calculate the signatures create a further source of uncertainty. Uncertainties in the signatures may affect our ability to compare different locations, to detect changes, or to compare future water resource management scenarios. The aim of this study was to contribute to the hydrological community's awareness and knowledge of data uncertainty in hydrological signatures, including typical sources, magnitudes and methods for its assessment. We proposed a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrated it for a variety of commonly used signatures. The study was made for two data-rich catchments, the 50 km2 Mahurangi catchment in New Zealand and the 135 km2 Brue catchment in the UK. For rainfall data the uncertainty sources included point measurement uncertainty, the number of gauges used in calculation of the catchment spatial average, and uncertainties relating to lack of quality control. For flow data the uncertainty sources included uncertainties in stage/discharge measurement and in the approximation of the true stage-discharge relation by a rating curve. 
The resulting uncertainties were compared across the different signatures and catchments, to quantify uncertainty magnitude and bias, and to test how uncertainty depended on the density of the raingauge network and flow gauging station characteristics. The uncertainties were sometimes large (i.e. typical intervals of 10-40% relative uncertainty) and highly variable between signatures. Uncertainty in the mean discharge was around 10% for both catchments, while signatures describing the flow variability had much higher uncertainties in the Mahurangi where there was a fast rainfall-runoff response and greater high-flow rating uncertainty. Event and total runoff ratios had uncertainties from 10% to 15% depending on the number of rain gauges used; precipitation uncertainty was related to interpolation rather than point uncertainty. Uncertainty distributions in these signatures were skewed, and meant that differences in signature values between these catchments were often not significant. We hope that this study encourages others to use signatures in a way that is robust to data uncertainty.
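
The Monte Carlo approach the study proposes can be sketched in a few lines: perturb the observed record with an assumed error model, recompute each signature, and summarise the resulting distribution. The flow record and the 10% multiplicative error model below are illustrative assumptions, not the study's data:

```python
import numpy as np

rng = np.random.default_rng(42)

# Synthetic daily flow record (m^3/s); values are illustrative only.
flow = rng.lognormal(mean=1.0, sigma=0.8, size=365)

def signatures(q):
    """Two common hydrological signatures: mean flow and the Q95 low flow."""
    return np.mean(q), np.percentile(q, 5)

# Monte Carlo propagation of rating-curve uncertainty: perturb the record
# with a multiplicative error (+/-10% standard deviation, an assumed value)
# and recompute the signatures for each realisation.
samples = np.array([signatures(flow * rng.normal(1.0, 0.10, size=flow.size))
                    for _ in range(2000)])

mean_lo, mean_hi = np.percentile(samples[:, 0], [2.5, 97.5])
q95_lo, q95_hi = np.percentile(samples[:, 1], [2.5, 97.5])
print(f"mean flow 95% interval: [{mean_lo:.2f}, {mean_hi:.2f}]")
```

In the study itself the perturbations are drawn from measured rating-curve and raingauge error distributions rather than a fixed 10% model, which is what makes the resulting intervals signature- and catchment-specific.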

  5. Description of a Computerized, On-Line Interlibrary Loan System.

    ERIC Educational Resources Information Center

    Kilgour, Frederick G.

    This paper describes the first two months of operation of the OCLC interlibrary loan system, an online system designed to increase speed and effectiveness in obtaining interlibrary loans. This system provides (1) bibliographic verification of interlibrary loan records and location of materials by using online union catalog records, (2) automatic…

  6. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.

  7. [Sarcoma gene signatures].

    PubMed

    Chibon, F; Coindre, J-M

    2011-02-01

    This review reports the main gene signature specific for the diagnosis, prognosis or prediction of drug response in sarcomas. Almost half of sarcomas show a simple genetic lesion which is specific for the diagnosis: recurrent translocations in 10 to 15% of sarcomas, specific activating and inactivating mutations in GIST and rhabdoid tumor respectively, and MDM2 amplification in well-differentiated and dedifferentiated liposarcomas as well as in intimal sarcoma. A recent study reported a gene expression signature which is much better than histological grading for predicting metastasis outcome. This signature is composed of 67 genes all belonging to pathways involved in chromosome integrity suggesting an important role of these mechanisms in the development of metastases. On the other hand, and except for GIST with KIT and PDGFRA mutations, there is no validated predictive gene signature so far. PMID:21287318

  8. Meteor signature interpretation

    SciTech Connect

    Canavan, G.H.

    1997-01-01

Meteor signatures contain information about the constituents of space debris and present potential false alarms to early-warning systems. Better models could both extract the maximum scientific information possible and reduce their danger. Accurate predictions can be produced by models of modest complexity, which can be inverted to predict the sizes, compositions, and trajectories of objects from their signatures for most objects of interest and concern.

  9. Ladar signature simulation

    NASA Astrophysics Data System (ADS)

    Estep, Jeff; Gu, Zu-Han

An account is given of a development program for a laboratory apparatus that would conduct monostatic measurements of the bidirectional reflectance properties of target samples and generate laser radar signatures on the basis of these data. The resulting Monostatic Bidirectional Reflectometer will be sufficiently flexible to accommodate various laser sources and detectors, accurately measuring the target bidirectional reflectance distribution function and combining these data with geometrical target models for signature predictions.

  10. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  11. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.

  12. UV signature mutations.

    PubMed

    Brash, Douglas E

    2015-01-01

Sequencing complete tumor genomes and exomes has sparked the cancer field's interest in mutation signatures for identifying the tumor's carcinogen. This review and meta-analysis discusses signatures and their proper use. We first distinguish between a mutagen's canonical mutations, deviations from a random distribution of base changes that create a pattern typical of that mutagen, and the subset of signature mutations, which are unique to that mutagen and permit inference backward from mutations to mutagen. To verify UV signature mutations, we assembled literature datasets on cells exposed to UVC, UVB, UVA, or solar simulator light (SSL) and tested canonical UV mutation features as criteria for clustering datasets. A confirmed UV signature was: ≥60% of mutations are C→T at a dipyrimidine site, with ≥5% CC→TT. Other canonical features such as a bias for mutations on the nontranscribed strand or at the 3' pyrimidine had limited application. The most robust classifier combined these features with criteria for the rarity of non-UV canonical mutations. In addition, several signatures proposed for specific UV wavelengths were limited to specific genes or species; UV's nonsignature mutations may cause melanoma BRAF mutations; and the mutagen for sunlight-related skin neoplasms may vary between continents. PMID:25354245
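
The confirmed UV-signature criterion quoted in the abstract translates directly into a classifier. A minimal sketch, in which the mutation-record field names (`change`, `dipyrimidine`) are invented for illustration:

```python
def is_uv_signature(mutations, min_ct=0.60, min_cctt=0.05):
    """Check the confirmed UV-signature criterion from the review:
    >= 60% of mutations are C->T at a dipyrimidine site and >= 5% are CC->TT.
    `mutations` is a list of dicts; the field names here are invented."""
    total = len(mutations)
    if total == 0:
        return False
    ct = sum(1 for m in mutations
             if m["change"] == "C>T" and m["dipyrimidine"])
    cctt = sum(1 for m in mutations if m["change"] == "CC>TT")
    return ct / total >= min_ct and cctt / total >= min_cctt

# Toy dataset: 7 C>T at dipyrimidine sites, 1 CC>TT, 2 unrelated mutations.
muts = ([{"change": "C>T", "dipyrimidine": True}] * 7
        + [{"change": "CC>TT", "dipyrimidine": True}]
        + [{"change": "A>G", "dipyrimidine": False}] * 2)
print(is_uv_signature(muts))  # 7/10 C>T and 1/10 CC>TT -> True
```

The review's most robust classifier additionally requires that non-UV canonical mutations be rare, which would add further negative checks to this rule.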

  13. The 70-Gene Prognostic Signature for Korean Breast Cancer Patients

    PubMed Central

    Na, Kuk Young; Lee, Jeong Eon; Kim, Hee Jeong; Yang, Jung-Hyun; Ahn, Sei-Hyun; Moon, Byung-In; Kim, Ra Mi; Ko, Si Mon; Jung, Yong Sik

    2011-01-01

    Purpose A 70-gene prognostic signature has prognostic value in patients with node-negative breast cancer in Europe. This diagnostic test known as "MammaPrint (70-gene prognostic signature)" was recently validated and implementation was feasible. Therefore, we assessed the 70-gene prognostic signature in Korean patients with breast cancer. We compared the risk predicted by the 70-gene prognostic signature with commonly used clinicopathological guidelines among Korean patients with breast cancer. We also analyzed the 70-gene prognostic signature and clinicopathological feature of the patients in comparison with a previous validation study. Methods Forty-eight eligible patients with breast cancer (clinical T1-2N0M0) were selected from four hospitals in Korea. Fresh tumor samples were analyzed with a customized microarray for the 70-gene prognostic signature. Concordance between the risk predicted by the 70-gene prognostic signature and risk predicted by commonly used clinicopathological guidelines (St. Gallen guidelines, National Institutes of Health [NIH] guideline, and Adjuvant! Online) was evaluated. Results Prognosis signatures were assessed in 36 patients. No significant differences were observed in the clinicopathological features of patients compared with previous studies. The 70-gene prognosis signature identified five (13.9%) patients with a low-risk prognosis signature and 31 (86.1%) patients with a high-risk prognosis signature. Clinical risk was concordant with the prognosis signature for 29 patients (80.6%) according to the St. Gallen guidelines; 30 patients (83.4%) according to the NIH guidelines; and 23 patients (63.8%) according to the Adjuvant! Online. Our results were different from previous validation studies in Europe with about a 40% low-risk prognosis and about a 60% high-risk prognosis. The high incidence in the high-risk group was consistent with data in Japan. 
Conclusion The results of the 70-gene prognostic signature in Korean patients with breast cancer were somewhat different from those identified in Europe. Whether this difference reflects a genetic disparity between Asians and Europeans should be investigated. Further large-scale studies with follow-up evaluation are required to assess whether the 70-gene prognostic signature can predict the prognosis of Korean patients with breast cancer. PMID:21847392

  14. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. 
It is argued that the understanding of predictive capability of a computational model is built on the level of achievement in V&V activities, how closely related the V&V benchmarks are to the actual application of interest, and the quantification of uncertainties related to the application of interest.
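
Code verification against a known analytical solution, one of the benchmark types recommended above, typically means confirming the observed order of accuracy under grid refinement. A minimal sketch for a second-order finite-difference operator with an exact solution:

```python
import numpy as np

# Code-verification sketch: measure the observed order of accuracy of a
# second-order central difference against a known analytical solution.
def second_derivative_error(n):
    x = np.linspace(0.0, np.pi, n + 1)
    h = x[1] - x[0]
    u = np.sin(x)
    d2u = (u[:-2] - 2.0 * u[1:-1] + u[2:]) / h**2  # interior points only
    exact = -np.sin(x[1:-1])                       # exact second derivative
    return np.max(np.abs(d2u - exact))

e_coarse = second_derivative_error(64)
e_fine = second_derivative_error(128)

# Halving h should reduce the error by ~2^p for a p-th order scheme.
observed_order = np.log2(e_coarse / e_fine)
print(round(observed_order, 2))  # close to 2 for a second-order scheme
```

A manufactured-solution benchmark follows the same recipe, except the "exact" solution is chosen first and a source term is derived from it so that the full governing equations can be exercised.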

  15. A Quantum Proxy Signature Scheme Based on Genuine Five-qubit Entangled State

    NASA Astrophysics Data System (ADS)

    Cao, Hai-Jing; Huang, Jun; Yu, Yao-Feng; Jiang, Xiu-Li

    2014-09-01

In this paper a very efficient and secure proxy signature scheme is proposed. It is based on controlled quantum teleportation, with a genuine five-qubit entangled state functioning as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. Quantum key distribution and one-time pad are adopted in our scheme, which guarantees not only the unconditional security of the scheme but also the anonymity of the message owner.

  16. Quantum Proxy Multi-Signature Scheme Using Genuinely Entangled Six Qubits State

    NASA Astrophysics Data System (ADS)

    Cao, Hai-Jing; Wang, Huai-Sheng; Li, Peng-Fei

    2013-04-01

A quantum proxy multi-signature scheme is presented based on controlled teleportation, with a genuinely entangled six-qubit quantum state functioning as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. Quantum key distribution and one-time pad are adopted in our scheme, which guarantees not only the unconditional security of the scheme but also the anonymity of the message owner.

  17. Verification of EPC Soundness

    NASA Astrophysics Data System (ADS)

    Mendling, Jan

The aim of this book is to evaluate the power of metrics for predicting errors in business process models. In order to do so, we need to establish a clear and unambiguous understanding of which EPC business process models are correct and how they can be verified. This chapter presents verification techniques that can be applied to identify errors in EPCs, with a focus on reachability graph analysis and reduction rules. Other verification techniques such as calculating invariants (see [313, 440]), reasoning (see [337, 112]) or model integration (see [403]) will not be considered.

  18. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

The NASA Goddard Space Flight Center's General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates is presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning and tracking of the verification programs.

  19. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  20. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through referral-based crowdsourcing: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530
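
The winning Red Balloon strategy referred to above used a geometric split contract: the finder of an answer receives a reward, their inviter half as much, the next inviter half again, and so on, which keeps the total payout bounded no matter how long the referral chain is. A sketch of that payment rule (the $2000 finder reward matches the publicized MIT strategy; treating this rule as exactly the paper's optimal scheme is a simplification):

```python
def referral_rewards(chain_length, finder_reward=2000.0):
    """Geometric split-contract rewards: the finder gets R, their inviter
    R/2, the next R/4, and so on up the referral chain.
    The total paid per answer stays below 2R regardless of chain length."""
    return [finder_reward / 2**k for k in range(chain_length)]

rewards = referral_rewards(4)
print(rewards)                    # [2000.0, 1000.0, 500.0, 250.0]
print(sum(rewards) < 2 * 2000.0)  # bounded total: True
```

Because each link's payment comes out of the value its invitation created, every participant has an incentive both to recruit and to report truthfully once verification penalties are added, which is the setting the paper formalizes.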

  2. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  3. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  4. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives to demonstrate the European capability to safely land a surface package on Mars, to perform Mars atmosphere investigation, and to provide communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an Entry Descent & Landing Demonstrator Module (EDM) and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. A second mission, with its launch foreseen in 2018, is led by NASA, who provides the spacecraft and launcher, the EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples and investigating them for signs of past and present life with exobiological experiments, and investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e.: the Spacecraft Composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. 
The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests between the different levels (system, modules, subsystems, etc.), and giving an overview of the main tests defined at spacecraft level. The paper is mainly focused on the verification aspects of the EDL Demonstrator Module and the Rover Module, for which an intense testing activity without previous heritage in Europe is foreseen. In particular, the Descent Module has to survive the Mars atmospheric entry and landing, and its surface platform has to stay operational for 8 sols on the Martian surface, transmitting scientific data to the Orbiter. The Rover Module has to perform a 180-sol mission in the Mars surface environment. These operative conditions cannot be verified only by analysis; consequently a test campaign is defined, including mechanical tests to simulate the entry loads, thermal tests in Mars environment, and the simulation of Rover operations on a 'Mars-like' terrain. Finally, the paper presents an overview of the documentation flow defined to ensure the correct translation of the mission requirements into verification activities (test, analysis, review of design) until the final verification close-out of those requirements with the final verification reports.

  5. Multibody modeling and verification

    NASA Technical Reports Server (NTRS)

    Wiens, Gloria J.

    1989-01-01

    A summary of a ten week project on flexible multibody modeling, verification and control is presented. Emphasis was on the need for experimental verification. A literature survey was conducted for gathering information on the existence of experimental work related to flexible multibody systems. The first portion of the assigned task encompassed the modeling aspects of flexible multibodies that can undergo large angular displacements. Research in the area of modeling aspects were also surveyed, with special attention given to the component mode approach. Resulting from this is a research plan on various modeling aspects to be investigated over the next year. The relationship between the large angular displacements, boundary conditions, mode selection, and system modes is of particular interest. The other portion of the assigned task was the generation of a test plan for experimental verification of analytical and/or computer analysis techniques used for flexible multibody systems. Based on current and expected frequency ranges of flexible multibody systems to be used in space applications, an initial test article was selected and designed. A preliminary TREETOPS computer analysis was run to ensure frequency content in the low frequency range, 0.1 to 50 Hz. The initial specifications of experimental measurement and instrumentation components were also generated. Resulting from this effort is the initial multi-phase plan for a Ground Test Facility of Flexible Multibody Systems for Modeling Verification and Control. The plan focusses on the Multibody Modeling and Verification (MMV) Laboratory. General requirements of the Unobtrusive Sensor and Effector (USE) and the Robot Enhancement (RE) laboratories were considered during the laboratory development.

  6. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.

  7. Current signature sensor

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Inventor); Lucena, Angel (Inventor); Ihlefeld, Curtis (Inventor); Burns, Bradley (Inventor); Bassignani, Karin E. (Inventor)

    2005-01-01

    A solenoid health monitoring system uses a signal conditioner and controller assembly in one embodiment that includes analog circuitry and a DSP controller. The analog circuitry provides signal conditioning to the low-level raw signal coming from a signal acquisition assembly. Software running in a DSP analyzes the incoming data (recorded current signature) and determines the state of the solenoid: whether it is energized, de-energized, or in a transitioning state. In one embodiment, the software identifies key features in the current signature during the transition phase and is able to determine the health of the solenoid.
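
    The idea of inferring state from a recorded current trace can be sketched in a few lines. The following is an illustrative toy, not the patented implementation: the threshold values, the swing-based transition test, and the sample traces are all assumptions for demonstration.

    ```python
    # Toy sketch: classify a solenoid's state from a sampled current trace.
    # Threshold levels and the swing test are illustrative assumptions only.
    def classify_solenoid_state(samples, on_level=0.8, off_level=0.1, ripple=0.05):
        """Return 'energized', 'de-energized', or 'transitioning'."""
        mean = sum(samples) / len(samples)
        # Large sample-to-sample excursions indicate the inrush/decay
        # transition phase rather than a settled steady state.
        swing = max(samples) - min(samples)
        if swing > ripple:
            return "transitioning"
        if mean >= on_level:
            return "energized"
        if mean <= off_level:
            return "de-energized"
        return "transitioning"

    print(classify_solenoid_state([0.85, 0.86, 0.84, 0.85]))  # steady high current
    print(classify_solenoid_state([0.02, 0.01, 0.02, 0.01]))  # steady low current
    print(classify_solenoid_state([0.1, 0.4, 0.7, 0.85]))     # rising inrush
    ```

    A real monitor would additionally extract transition-phase features (rise time, inrush peak shape) to assess solenoid health, as the abstract describes.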

  8. Current Signature Sensor

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Inventor); Lucena, Angel (Inventor); Ihlefeld, Curtis (Inventor); Burns, Bradley (Inventor); Bassignani, Mario (Inventor); Bassignani, Karin E. (Inventor)

    2005-01-01

    A solenoid health monitoring system uses a signal conditioner and controller assembly in one embodiment that includes analog circuitry and a DSP controller. The analog circuitry provides signal conditioning to the low-level raw signal coming from a signal acquisition assembly. Software running in a DSP analyzes the incoming data (recorded current signature) and determines the state of the solenoid: whether it is energized, de-energized, or in a transitioning state. In one embodiment, the software identifies key features in the current signature during the transition phase and is able to determine the health of the solenoid.

  9. A quantum proxy group signature scheme based on an entangled five-qubit state

    NASA Astrophysics Data System (ADS)

    Wang, Meiling; Ma, Wenping; Wang, Lili; Yin, Xunru

    2015-09-01

    A quantum proxy group signature (QPGS) scheme based on controlled teleportation is presented, using an entangled five-qubit quantum state as the quantum channel. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. The security of the scheme is guaranteed by the entanglement correlations of the entangled five-qubit state, by secret keys based on quantum key distribution (QKD), and by the one-time pad algorithm, all of which have been proven to be unconditionally secure, as well as by the anonymity of the signature.

  10. Secure verification by multifactor optical validation

    NASA Astrophysics Data System (ADS)

    Millán, María S.; Pérez-Cabré, Elisabet; Javidi, Bahram

    2006-08-01

    We propose a novel multifactor encryption-authentication technique that reinforces optical security by allowing the simultaneous AND-verification of more than one primary image. We describe a method to obtain four-factor authentication. The authenticators are: two different primary images containing signatures or biometric information and two different white random sequences that act as key codes. So far, optical security techniques deal with a single primary image (an object, a signature, or a biometric signal), not combined primary images. Our method involves double random-phase encoding, fully phase-based encryption and a combined nonlinear JTC and a classical 4f-correlator for simultaneous recognition and authentication of multiple images. There is no a priori constraint about the type of primary images to encode. Two reference images, double-phase encoded and encrypted in an ID tag (or card), are compared with the actual input images obtained in situ from the person whose authentication is wanted. The two key phase codes are known by the authentication processor. The complex-amplitude encoded image of the ID tag has a dim appearance that does not reveal the content of any primary reference image or the key codes. The encoded image function fulfils the general requirements of invisible content, extreme difficulty in counterfeiting and real-time automatic verification. The possibility of introducing nonlinearities in the Fourier plane of the optical processor will be exploited to improve the system performance. This optical technique is attractive for high-security purposes that require multifactor reliable authentication.

  11. Verification of Numerical Algorithms

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The following strategy is suggested for specification and proof: (1) Defer the construction of a formal program specification with respect to I/O assertions until the correctness of the program with respect to an abstract mathematical model of program intent is demonstrated. (2) Prove that an abstract machine (using infinite precision arithmetic) would compute that object exactly. (3) Prove that the computational sequences of arithmetic operations that occur in the abstract machine must be precisely the same at every step as those occurring on an actual machine (with finite precision arithmetic) executing the same program. (4) Use a verification condition (VC) generator that knows about the semantics of arithmetic operations to annotate the program with assertions that bound (or in some circumstances estimate) the difference between the actual machine state variables and the corresponding ones of the abstract machine. Construct the formal program specification by combining the verification conditions into theorems about computational error that can be proved with mechanical assistance.
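
    The abstract-machine vs. actual-machine comparison of steps (2)-(4) can be illustrated concretely. This is a toy sketch, not the report's formal machinery: the "abstract machine" is played by exact rational arithmetic, the "actual machine" by binary floats, and the error bound is the standard forward bound for recursive summation, assumed here for illustration.

    ```python
    from fractions import Fraction

    # Run the same computational sequence on an "abstract machine" (exact
    # rational arithmetic) and on the actual machine (binary floats), then
    # bound the difference between the corresponding state variables.
    def sum_float(xs):
        s = 0.0
        for x in xs:
            s += x              # finite-precision operation
        return s

    def sum_exact(xs):
        s = Fraction(0)
        for x in xs:
            s += Fraction(x)    # identical sequence, infinite precision
        return s

    xs = [0.1] * 10
    actual = sum_float(xs)
    abstract = sum_exact(xs)
    error = abs(Fraction(actual) - abstract)

    # Forward error bound for recursive summation: n * eps * sum(|x_i|).
    eps = 2.0 ** -52
    bound = len(xs) * eps * float(sum_exact([abs(x) for x in xs]))
    print(float(error) <= bound)   # True: the VC-style assertion holds
    ```

    The final comparison is exactly the kind of assertion a VC generator would attach to the program in step (4).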

  12. A Signature Style

    ERIC Educational Resources Information Center

    Smiles, Robin V.

    2005-01-01

    This article discusses Dr. Amalia Amaki and her approach to art as her signature style by turning everyday items into fine art. Amaki is an assistant professor of art, art history, and Black American studies at the University of Delaware. She loves taking an unexpected object and redefining it in the context of art--like a button, a fan, a faded…

  13. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

    The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  14. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

    We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences of simulations incorporates expert judgment into the process directly via a flexible optimization framework, and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis, and together with the robust statistics guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology is based on utilizing multiple constrained optimization problems to solve the verification model in a manner that varies the analysis' underlying assumptions. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach then produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data and expert informed error estimation including uncertainties for both the solution itself and order of convergence. Our method produces high quality results for the well-behaved cases relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
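
    The median-based, constraint-guarded flavor of this analysis can be sketched simply. This is a minimal illustration of the idea, not the authors' optimization framework: convergence order is estimated pairwise from errors assumed to behave as e ≈ C·h^p, expert-judgment bounds discard implausible rates, and the median (rather than a mean-like least-squares fit) resists anomalous grids. The grid data are invented for demonstration.

    ```python
    import math
    from itertools import combinations
    from statistics import median

    # Robust estimate of the observed order of convergence p from errors
    # e_i ~ C * h_i**p on a sequence of grid spacings h_i: take the median
    # over all pairwise estimates so one anomalous grid cannot dominate.
    def robust_order(h, e, p_bounds=(0.5, 4.0)):
        estimates = []
        for (h1, e1), (h2, e2) in combinations(list(zip(h, e)), 2):
            p = math.log(e1 / e2) / math.log(h1 / h2)
            # Expert-judgment constraint: discard implausible rates.
            if p_bounds[0] <= p <= p_bounds[1]:
                estimates.append(p)
        return median(estimates)

    h = [0.1, 0.05, 0.025, 0.0125]
    e = [1e-2, 2.5e-3, 6.3e-4, 1.56e-4]   # roughly second-order behavior
    print(round(robust_order(h, e), 2))   # close to 2
    ```

    The median's breakdown point is what makes the estimate usable on the ill-behaved sequences the abstract mentions, where a single outlying simulation would wreck a least-squares fit.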

  15. TFE Verification Program

    SciTech Connect

    Not Available

    1990-03-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized, but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern. The general logic and strategy of the program to achieve its objectives is shown on Fig. 1-1. Five prior programs form the basis for the TFE Verification Program: (1) the AEC/NASA program of the 1960s and early 1970s; (2) the SP-100 concept development program; (3) the SP-100 thermionic technology program; (4) the thermionic irradiations program in TRIGA in FY-86; and (5) the Thermionic Technology Program in 1986 and 1987. 18 refs., 64 figs., 43 tabs.

  16. Online Pricing.

    ERIC Educational Resources Information Center

    Garman, Nancy; And Others

    1990-01-01

    The first of four articles describes the move by the European Space Agency to eliminate connect time charges on its online retrieval system. The remaining articles describe the pricing structure of DIALOG, compare the two pricing schemes, and discuss online pricing from the user's point of view. (CLB)

  17. Field Instructors and Online Training: An Exploratory Survey

    ERIC Educational Resources Information Center

    Dedman, Denise E.; Palmer, Louann Bierlein

    2011-01-01

    Despite field placement being the signature pedagogy of the social work profession, little research exists regarding methods for training field instructors. This study captures their perceptions regarding the use of online training. An online survey of 642 field instructors from 4 universities produced 208 responses. Less than 4% rejected the idea…

  18. Web-based interrogation of gene expression signatures using EXALT

    PubMed Central

    2009-01-01

    Background Widespread use of high-throughput techniques such as microarrays to monitor gene expression levels has resulted in an explosive growth of data sets in public domains. Integration and exploration of these complex and heterogeneous data have become a major challenge. Results The EXALT (EXpression signature AnaLysis Tool) online program enables meta-analysis of gene expression profiles derived from publicly accessible sources. Searches can be executed online against two large databases currently containing more than 28,000 gene expression signatures derived from GEO (Gene Expression Omnibus) and published expression profiles of human cancer. Comparisons among gene expression signatures can be performed with homology analysis and co-expression analysis. Results can be visualized instantly in a plot or a heat map. Three typical use cases are illustrated. Conclusions The EXALT online program is uniquely suited for discovering relationships among transcriptional profiles and searching gene expression patterns derived from diverse physiological and pathological settings. The EXALT online program is freely available for non-commercial users from http://seq.mc.vanderbilt.edu/exalt/. PMID:20003458
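
    The kind of comparison such a tool performs can be illustrated with a toy sign-agreement score. This is a hedged sketch only; EXALT's actual homology-analysis algorithm is not reproduced here, and the gene names, directions, and scoring rule are illustrative assumptions.

    ```python
    # Toy comparison of two gene expression signatures, each represented as
    # a map of gene -> direction (+1 up-regulated, -1 down-regulated).
    # Similarity is scored by sign agreement over the shared genes.
    def signature_similarity(sig_a, sig_b):
        shared = set(sig_a) & set(sig_b)
        if not shared:
            return 0.0
        agree = sum(1 for g in shared if sig_a[g] == sig_b[g])
        disagree = len(shared) - agree
        return (agree - disagree) / len(shared)  # +1 concordant, -1 anti-correlated

    tumor = {"MYC": +1, "TP53": -1, "EGFR": +1, "CDKN2A": -1}
    cell_line = {"MYC": +1, "TP53": -1, "EGFR": -1, "KRAS": +1}
    print(signature_similarity(tumor, cell_line))  # 2 of 3 shared genes agree
    ```

    A meta-analysis engine then ranks thousands of stored signatures by such a score against the query, which is what makes cross-dataset searches like those described above feasible.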

  19. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification; nevertheless, none of the earlier quantum money constructions is known to possess it.

  20. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

    We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification; nevertheless, none of the earlier quantum money constructions is known to possess it.

  1. Wake Signature Detection

    NASA Astrophysics Data System (ADS)

    Spedding, Geoffrey R.

    2014-01-01

    An accumulated body of quantitative evidence shows that bluff-body wakes in stably stratified environments have an unusual degree of coherence and organization, so characteristic geometries such as arrays of alternating-signed vortices have very long lifetimes, as measured in units of buoyancy timescales, or in the downstream distance scaled by a body length. The combination of pattern geometry and persistence renders the detection of these wakes possible in principle. It now appears that identifiable signatures can be found from many disparate sources: Islands, fish, and plankton all have been noted to generate features that can be detected by climate modelers, hopeful navigators in open oceans, or hungry predators. The various types of wakes are reviewed with notes on why their signatures are important and to whom. A general theory of wake pattern formation is lacking and would have to span many orders of magnitude in Reynolds number.

  2. Identification of host response signatures of infection.

    SciTech Connect

    Branda, Steven S.; Sinha, Anupama; Bent, Zachary

    2013-02-01

    Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficiently and reliably distinguishing infected from healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged, engineered), and readily-accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones comprised of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sophisticated sample preparation pipelines, and developed new SGS data analysis tools and strategies, in order to pioneer use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). 
This work provides a strong foundation for large-scale, highly-efficient efforts to identify and verify infection-specific host NA signatures in human populations.

  3. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

    This document provides verification test results for normal, lognormal, and uniform distributions that are used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the Chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
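
    One of the checks named above, the Kolmogorov-Smirnov test, is easy to sketch. This is a minimal illustration, not the LHS verification suite: LHS itself is not implemented here (plain pseudo-random draws stand in for the sample values under test), and the target distribution is uniform(0,1) so the CDF is trivial.

    ```python
    import random

    # Kolmogorov-Smirnov statistic of a sample against the uniform(0,1)
    # CDF F(x) = x: the largest gap between empirical and theoretical CDFs.
    def ks_statistic_uniform(samples):
        xs = sorted(samples)
        n = len(xs)
        d = 0.0
        for i, x in enumerate(xs):
            cdf = x
            d = max(d, abs((i + 1) / n - cdf), abs(cdf - i / n))
        return d

    random.seed(0)
    sample = [random.random() for _ in range(1000)]
    d = ks_statistic_uniform(sample)
    critical = 1.36 / (1000 ** 0.5)   # large-sample 5% critical value, ~0.043
    print(round(d, 3), round(critical, 3))
    ```

    A correct generator produces d below the critical value about 95% of the time; a systematically wrong distribution drives d well above it, which is exactly how distribution correctness is flagged.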

  4. SPHERE Science Verification

    NASA Astrophysics Data System (ADS)

    Leibundgut, B.; Beuzit, J.-L.; Gibson, N.; Girard, J.; Kasper, M.; Kerber, F.; Lundin, L.; Mawet, D.; McClure, M.; Milli, J.; Petr-Gotzens, M.; Siebenmorgen, R.; van den Ancker, M.; Wahhaj, Z.

    2015-03-01

    Science Verification (SV) for the latest instrument to arrive on Paranal, the high-contrast and spectro-polarimetric extreme adaptive optics instrument SPHERE, is described. The process through which the SV proposals were solicited and evaluated is briefly outlined; the resulting observations took place in December 2014 and February 2015. A wide range of targets was observed, ranging from the Solar System, young stars with planets and discs, circumstellar environments of evolved stars to a galaxy nucleus. Some of the first results are previewed.

  5. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis; Mahadevan, Karthikeyan

    2011-01-25

    A recursive verification protocol to reduce the time variance due to delays in the network by putting the subject node at most one hop from the verifier node provides for an efficient manner to test wireless sensor nodes. Since the software signatures are time based, recursive testing will give a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, who in turn checks its neighbor, and continuing this process until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, the software verification downstream is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, having a node tested twice, or not at all, can be avoided.
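
    The hop-by-hop recursion described in this protocol can be sketched as a graph traversal. This is an illustrative sketch under stated assumptions, not the patented protocol: the topology, node names, and the `check` callback (standing in for the time-based software-signature test) are all hypothetical.

    ```python
    from collections import deque

    # Each verified node attests its own neighbors, so every signature test
    # spans at most one hop. A failed node halts verification downstream of
    # itself; nodes reachable by another path are still verified.
    def verify_network(adjacency, start, check):
        verified, failed = set(), set()
        frontier = deque([start])
        verified.add(start)                # the main verifier trusts itself
        while frontier:
            verifier = frontier.popleft()
            for neighbor in adjacency[verifier]:
                if neighbor in verified or neighbor in failed:
                    continue               # each node is tested at most once
                if check(verifier, neighbor):
                    verified.add(neighbor)
                    frontier.append(neighbor)
                else:
                    failed.add(neighbor)   # do not recurse past a failure
        return verified, failed

    adjacency = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": []}
    ok = lambda verifier, node: node != "B"    # node B fails its check
    verified, failed = verify_network(adjacency, "A", ok)
    print(sorted(verified), sorted(failed))    # D is still reached via C
    ```

    Tracking the `verified`/`failed` sets is what prevents a node from being tested twice or not at all, matching the last sentence of the abstract.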

  6. Knowledge Signatures for Information Integration

    SciTech Connect

    Thomson, Judi; Cowell, Andrew J.; Paulson, Patrick R.; Butner, R. Scott; Whiting, Mark A.

    2003-10-25

    This paper introduces the notion of a knowledge signature: a concise, ontologically-driven representation of the semantic characteristics of data. Knowledge signatures provide programmatic access to data semantics while allowing comparisons to be made across different types of data such as text, images or video, enabling efficient, automated information integration. Through observation, which determines the degree of association between data and ontological concepts, and refinement, which uses the axioms and structure of the domain ontology to place the signature more accurately within the context of the domain, knowledge signatures can be created. A comparison of such signatures for two different pieces of data results in a measure of their semantic separation. This paper discusses the definition of knowledge signatures along with the design and prototype implementation of a knowledge signature generator.
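
    The core idea, comparing ontologically grounded signatures to get a semantic separation, can be sketched with vectors of concept-association degrees. This is a hedged toy only: the paper's observation and refinement steps are not reproduced, and the concept names and weights below are invented for illustration.

    ```python
    import math

    # Represent each piece of data (text, image, video) as a vector of
    # association degrees with ontology concepts; semantic separation is
    # the cosine distance between the two vectors.
    def semantic_separation(sig_a, sig_b):
        concepts = sorted(set(sig_a) | set(sig_b))
        a = [sig_a.get(c, 0.0) for c in concepts]
        b = [sig_b.get(c, 0.0) for c in concepts]
        dot = sum(x * y for x, y in zip(a, b))
        norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
        return 1.0 - dot / norm        # 0 = same meaning, 1 = unrelated

    news_text = {"vehicle": 0.9, "explosion": 0.7, "city": 0.4}
    cctv_image = {"vehicle": 0.8, "explosion": 0.6, "crowd": 0.3}
    weather_map = {"precipitation": 0.9, "city": 0.5}
    print(semantic_separation(news_text, cctv_image) <
          semantic_separation(news_text, weather_map))   # True
    ```

    Because both signatures live in the same concept space, the comparison works across data types, which is the integration property the paper emphasizes.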

  7. 38 CFR 74.2 - What are the eligibility requirements a concern must meet for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... requirements a concern must meet for VetBiz VIP Verification Program? 74.2 Section 74.2 Pensions, Bonuses, and... Guidelines 74.2 What are the eligibility requirements a concern must meet for VetBiz VIP Verification... online Vendor Information Pages database forms at http://www.VetBiz.gov, and has been examined by...

  8. 38 CFR 74.2 - What are the eligibility requirements a concern must meet for VetBiz VIP Verification Program?

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... requirements a concern must meet for VetBiz VIP Verification Program? 74.2 Section 74.2 Pensions, Bonuses, and... Guidelines 74.2 What are the eligibility requirements a concern must meet for VetBiz VIP Verification... online Vendor Information Pages database forms at http://www.VetBiz.gov, and has been examined by...

  9. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  10. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

    The increasingly higher number of transistors possible in VLSI circuits compounds the difficulty of ensuring correct designs. As the number of possible test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools that are in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  11. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  12. Model-driven software verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev

    2003-01-01

    In this paper we explore a different approach to software verification. With this approach, a software application can be included, without substantial change, into a verification test harness and then verified directly, while preserving the ability to apply data abstraction techniques. Only the test harness is written in the language of the model checker.

  13. Quantum blind signature based on Two-State Vector Formalism

    NASA Astrophysics Data System (ADS)

    Qi, Su; Zheng, Huang; Qiaoyan, Wen; Wenmin, Li

    2010-11-01

    Two-State Vector Formalism (TSVF), including pre- and postselected states, is a complete description of a system between two measurements. Consequently, TSVF gives a perfect solution to the Mean King problem. In this paper, utilizing the dramatic correlation in the verification, we propose a quantum blind signature scheme based on TSVF. Compared with Wen's scheme, our scheme has 100% efficiency. Our scheme guarantees unconditional security. Moreover, the proposed scheme, which is easy to implement, can be applied to E-payment systems.

  14. Signature CERN-URSS

    ScienceCinema

    None

    2011-04-25

    DG W. Jentschke welcomes the assembly and the guests for the signing of the protocol between CERN and the USSR, which is an important event. It was in 1955 that 55 Soviet visitors came to CERN for the first time. The first DG of CERN, F. Bloch, and Mr. Amaldi are also present. While W. Jentschke's English speech is translated into Russian, Mr. Morozov's Russian speech is translated into English.

  15. Signatures of nonthermal melting

    PubMed Central

    Zier, Tobias; Zijlstra, Eeuwe S.; Kalitsov, Alan; Theodonis, Ioannis; Garcia, Martin E.

    2015-01-01

    Intense ultrashort laser pulses can melt crystals in less than a picosecond but, in spite of over thirty years of active research, for many materials it is not known to what extent thermal and nonthermal microscopic processes cause this ultrafast phenomenon. Here, we perform ab-initio molecular-dynamics simulations of silicon on a laser-excited potential-energy surface, exclusively revealing nonthermal signatures of laser-induced melting. From our simulated atomic trajectories, we compute the decay of five structure factors and the time-dependent structure function. We demonstrate how these quantities provide criteria to distinguish predominantly nonthermal from thermal melting. PMID:26798822
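
The structure-factor decay described above can be illustrated with a short, generic sketch (not the authors' ab-initio pipeline): for a set of atomic positions, the static structure factor at a Bragg wavevector is of order N for a crystal and drops as the lattice disorders. The function and the 1D-chain example below are purely illustrative.

```python
import numpy as np

def structure_factor(positions, q):
    """Static structure factor S(q) = |sum_j exp(i q . r_j)|^2 / N for
    atomic positions (shape N x d) and one wavevector q (length d).
    For a crystal, S(q) is of order N at a Bragg peak; its decay after
    laser excitation is the kind of melting signature computed in the paper."""
    r = np.asarray(positions, dtype=float)
    phase = r @ np.asarray(q, dtype=float)       # q . r_j for every atom j
    amplitude = np.exp(1j * phase).sum()         # coherent sum of scattered waves
    return float(abs(amplitude) ** 2) / len(r)

# Illustrative 1D chain with unit spacing, probed at its Bragg vector 2*pi:
lattice = np.arange(8.0).reshape(-1, 1)
S_peak = structure_factor(lattice, [2 * np.pi])   # all atoms in phase: S = N = 8
```

Perturbing the positions (a toy stand-in for melting) reduces S(q) below the crystalline value.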

  16. Minimal signatures of naturalness

    NASA Astrophysics Data System (ADS)

    El Hedri, Sonia; Hook, Anson

    2013-10-01

    We study the naturalness problem using a model independent bottom up approach by considering models where only the interaction terms needed to cancel the Higgs quadratic divergences are present. If quadratic divergences are canceled by terms linear in the Higgs field, then the collider phenomenology is well covered by current electroweakino and fourth generation searches. If quadratic divergences are canceled by terms bilinear in the Higgs field, then the signatures are highly dependent on the quantum numbers of the new particles. Precision Higgs measurements can reveal the presence of new particles with either vevs or Standard Model charges. If the new particles are scalar dark matter candidates, their direct and indirect detection signatures will be highly correlated and within the reach of XENON100 and Fermi. Observation at one of these experiments would imply observation at the other one. Observable LHC decay channels can also arise if the new particles mix with lighter states. These decay channels involve only the Higgs boson and not the gauge bosons. Observation of such decays would give evidence that the new particle is tied to the naturalness problem.

  17. Techniques of Online Citation Verification in NLM Databases.

    PubMed

    McKinn, E J; Johnson, E D

    1983-01-01

    This article is an exploration of techniques for identifying items in the health-related literature using NLM system databases. Examples used are derived from reference desk and interlibrary loan files, but the techniques should be useful to any medical librarian who needs to verify a citation. A knowledge of NLM indexing practices and system commands is assumed. Because of space limitations the TOXLINE, TOXBACK74, TOXBACK65, and AVLINE files have been omitted, but other bibliographic files and SERLINE are included. Tables designed for at-terminal use are presented. PMID:15774371

  18. Overview of Code Verification

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The verified code for the SIFT Executive is not the code that executes on the SIFT system as delivered. The running versions of the SIFT Executive contain optimizations and special code relating to the messy interface to the hardware broadcast interface and to packing of data to conserve space in the store of the BDX930 processors. The running code was in fact developed prior to and without consideration of any mechanical verification. This was regarded as necessary experimentation with the SIFT hardware and special purpose Pascal compiler. The Pascal code sections cover: the selection of a schedule from the global executive broadcast, scheduling, dispatching, three way voting, and error reporting actions of the SIFT Executive. Not included in these sections of Pascal code are: the global executive, five way voting, clock synchronization, interactive consistency, low level broadcasting, and program loading, initialization, and schedule construction.

  19. Signature-based authentication system using watermarking in the ridgelet and Radon-DCT domain

    NASA Astrophysics Data System (ADS)

    Maiorana, Emanuele; Campisi, Patrizio; Neri, Alessandro

    2007-10-01

    In this paper we propose a signature-based biometric system, where watermarking is applied to signature images in order to hide and keep secret some signature features in a static representation of the signature itself. Being a behavioral biometric, signatures are intrinsically different from other commonly used biometric data, possessing dynamic properties which cannot be extracted from a single signature image. The marked images can be used for user authentication, allowing their static characteristics to be analyzed by automatic algorithms or security attendants. When higher security is needed, the embedded features can be extracted and used, thus realizing a multi-level decision procedure. The proposed watermarking techniques are tailored to images with sharpened edges, like a signature picture. In order to obtain a robust method, able to hide relevant data while keeping the original structure of the host intact, the mark is embedded as close as possible to the lines that constitute the signature, using the properties of the Radon transform. An extensive set of experimental results, obtained by varying the system's parameters and concerning both the mark extraction and the verification performances, shows the effectiveness of our approach.

  20. Advanced spectral signature discrimination algorithm

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Cao, Wenjie; Samat, Alim

    2013-05-01

    This paper presents a novel approach to hyperspectral signature analysis. Hyperspectral signature analysis has been studied extensively in the literature, and many different algorithms have been developed to discriminate between hyperspectral signatures. Binary coding approaches like SPAM and SFBC use basic statistical thresholding operations to binarize a signature, which is then compared using Hamming distance. This framework has been extended in techniques like SDFC, wherein a set of primitive structures characterizes local variations in a signature together with overall statistical measures such as the mean. Such structures harness only local variations and do not exploit any covariation between spectrally distinct parts of the signature. The approach of this research is to harvest such information using a technique similar to circular convolution. We consider the signature as cyclic by joining its two ends, and then create two copies of the spectral signature. These three signatures can be placed next to each other like the rotating discs of a combination lock. We then find local structures at different circular shifts between the three cyclic spectral signatures. Texture features as in SDFC can be used to study the local structural variation for each circular shift. Different measures can then be created by building histograms from the shifts and applying different techniques for information extraction to the histograms; depending on the technique used, different variants of the proposed algorithm are obtained. Experiments show the viability of the proposed methods and their performance compared to current binary signature coding techniques.
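
The circular-shift idea can be sketched in a few lines. The features below are a simplified stand-in for the SDFC-style texture features described above: the signature is treated as cyclic, compared against circularly shifted copies of itself, and a normalized histogram of the local differences is built per shift. The function names, shift set, and bin count are illustrative choices, not the paper's.

```python
import numpy as np

def circular_shift_features(signature, shifts=(1, 5, 10), bins=16):
    """Toy cyclic-signature features: histogram of local differences
    between a spectral signature and circularly shifted copies of itself
    (illustrative stand-in for the paper's SDFC-style texture features)."""
    s = np.asarray(signature, dtype=float)
    feats = []
    for k in shifts:
        shifted = np.roll(s, k)          # cyclic copy, like a rotated disc
        local = s - shifted              # local structural variation at this shift
        hist, _ = np.histogram(local, bins=bins, range=(-1.0, 1.0))
        feats.append(hist / hist.sum())  # normalized histogram per shift
    return np.concatenate(feats)

def signature_distance(a, b, **kw):
    """L1 distance between the shift-histogram features of two signatures."""
    return np.abs(circular_shift_features(a, **kw) - circular_shift_features(b, **kw)).sum()
```

Two identical signatures give distance 0, and structurally different signatures give a positive distance; swapping in other histogram comparisons yields the different algorithm variants the abstract mentions.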

  1. Online Monitoring of Induction Motors

    SciTech Connect

    McJunkin, Timothy R.; Agarwal, Vivek; Lybeck, Nancy Jean

    2016-01-01

    The online monitoring of active components project, under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program, researched diagnostic and prognostic models for alternating current induction motors (IMs). Idaho National Laboratory (INL) worked with the Electric Power Research Institute (EPRI) to augment and revise the fault signatures previously implemented in the Asset Fault Signature Database of EPRI’s Fleet Wide Prognostic and Health Management (FW PHM) Suite software. Induction motor diagnostic models were researched using the experimental data collected by Idaho State University. Prognostic models were explored in the literature and through a limited experiment with a 40 HP motor to seed the Remaining Useful Life Database of the FW PHM Suite.

  2. A signature analysis based method for elliptical shape

    NASA Astrophysics Data System (ADS)

    Guarneri, Ivana; Guarnera, Mirko; Messina, Giuseppe; Tomaselli, Valeria

    2010-01-01

    High-level context image analysis covers many fields, such as face recognition, smile detection, automatic red-eye removal, iris recognition, and fingerprint verification. Techniques involved in these fields need to be supported by more powerful and accurate routines. The aim of the proposed algorithm is to detect elliptical shapes in digital input images. It can be successfully applied in topics such as signal detection or red-eye removal, where assessing the degree of elliptical shape can improve performance. The method has been designed to handle low resolution and partial occlusions. The algorithm is based on signature contour analysis and exploits some geometrical properties of elliptical points. The proposed method is structured in two parts: first, the best ellipse which approximates the object shape is estimated; then, through the analysis and comparison of the reference ellipse signature and the object signature, the algorithm establishes whether the object is elliptical or not. The first part is based on symmetry properties of the points belonging to the ellipse, while the second part is based on the signature operator, which is a functional representation of a contour. A set of real images has been tested and the results point out the effectiveness of the algorithm in terms of accuracy and execution time.
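
The signature operator mentioned above, a functional representation of a contour, can be sketched as a centroid-distance-versus-angle function. The helper below compares an object's signature with that of a reference ellipse; the sampling density and the normalized score are illustrative choices, not the paper's exact formulation.

```python
import numpy as np

def contour_signature(points, n_angles=90):
    """Centroid-distance signature of a closed contour, resampled
    uniformly in polar angle (a common 'signature' contour descriptor)."""
    pts = np.asarray(points, dtype=float)
    c = pts.mean(axis=0)
    ang = np.arctan2(pts[:, 1] - c[1], pts[:, 0] - c[0])
    r = np.hypot(pts[:, 0] - c[0], pts[:, 1] - c[1])
    order = np.argsort(ang)
    grid = np.linspace(-np.pi, np.pi, n_angles, endpoint=False)
    return np.interp(grid, ang[order], r[order], period=2 * np.pi)

def ellipse_likeness(points, a, b):
    """Compare an object's signature with that of a reference ellipse
    with semi-axes a and b; a smaller score means 'more elliptical'."""
    t = np.linspace(0, 2 * np.pi, 360, endpoint=False)
    ref = np.column_stack((a * np.cos(t), b * np.sin(t)))
    s_obj, s_ref = contour_signature(points), contour_signature(ref)
    return np.abs(s_obj - s_ref).mean() / s_ref.mean()
```

An elliptical contour scores near zero against its best-fit ellipse, while a non-elliptical contour scores higher, which is the accept/reject decision the abstract describes.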

  3. 75 FR 12772 - Federal Labor Standards Payee Verification and Payment Processing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-03-17

    ... discrepancies have been resolved, and to issue wage restitution payments on behalf of construction and... From the Federal Register Online via the Government Printing Office DEPARTMENT OF HOUSING AND URBAN DEVELOPMENT Federal Labor Standards Payee Verification and Payment Processing AGENCY: Office...

  4. 76 FR 20536 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

    ... From the Federal Register Online via the Government Publishing Office ENVIRONMENTAL PROTECTION AGENCY 40 CFR Part 75 RIN 2060-AQ06 Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing Correction In rule document 2011-6216 appearing on pages 17288-17325 in...

  5. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Verification. 123.8 Section 123.8 Food and Drugs... CONSUMPTION FISH AND FISHERY PRODUCTS General Provisions § 123.8 Verification. (a) Overall verification. Every... likely to occur, and that the plan is being effectively implemented. Verification shall include, at...

  6. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification. 123.8 Section 123.8 Food and Drugs... CONSUMPTION FISH AND FISHERY PRODUCTS General Provisions § 123.8 Verification. (a) Overall verification. Every... likely to occur, and that the plan is being effectively implemented. Verification shall include, at...

  7. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  8. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  9. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  10. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  11. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  12. Does the Community of Inquiry Framework Predict Outcomes in Online MBA Courses?

    ERIC Educational Resources Information Center

    Arbaugh, J. B.

    2008-01-01

    While Garrison and colleagues' (2000) Community of Inquiry (CoI) framework has generated substantial interest among online learning researchers, it has yet to be subjected to extensive quantitative verification or tested for external validity. Using a sample of students from 55 online MBA courses, the findings of this study suggest strong…

  13. Pose-oblivious shape signature.

    PubMed

    Gal, Ran; Shamir, Ariel; Cohen-Or, Daniel

    2007-01-01

    A 3D shape signature is a compact representation for some essence of a shape. Shape signatures are commonly utilized as a fast indexing mechanism for shape retrieval. Effective shape signatures capture some global geometric properties which are scale, translation, and rotation invariant. In this paper, we introduce an effective shape signature which is also pose-oblivious. This means that the signature is also insensitive to transformations which change the pose of a 3D shape such as skeletal articulations. Although some topology-based matching methods can be considered pose-oblivious as well, our new signature retains the simplicity and speed of signature indexing. Moreover, contrary to topology-based methods, the new signature is also insensitive to the topology change of the shape, allowing us to match similar shapes with different genus. Our shape signature is a 2D histogram which is a combination of the distribution of two scalar functions defined on the boundary surface of the 3D shape. The first is a definition of a novel function called the local-diameter function. This function measures the diameter of the 3D shape in the neighborhood of each vertex. The histogram of this function is an informative measure of the shape which is insensitive to pose changes. The second is the centricity function that measures the average geodesic distance from one vertex to all other vertices on the mesh. We evaluate and compare a number of methods for measuring the similarity between two signatures, and demonstrate the effectiveness of our pose-oblivious shape signature within a 3D search engine application for different databases containing hundreds of models. PMID:17218743
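
The 2D-histogram signature idea can be sketched for a point-sampled shape. Computing the true local-diameter function requires mesh ray casting, so the sketch below substitutes two simple stand-in scalar functions (distance to centroid, and mean distance to all other points) while keeping the core mechanism of the paper: a normalized joint histogram of two per-point functions, compared between shapes. All names and parameters are illustrative assumptions.

```python
import numpy as np

def shape_signature_2d(points, bins=8):
    """Illustrative 2D-histogram signature: joint distribution of two
    scalar functions on the shape's points. As proxies for the paper's
    local-diameter and centricity functions we use each point's distance
    to the centroid and its mean distance to all other points."""
    P = np.asarray(points, dtype=float)
    d_centroid = np.linalg.norm(P - P.mean(axis=0), axis=1)
    pair = np.linalg.norm(P[:, None, :] - P[None, :, :], axis=-1)
    centricity = pair.mean(axis=1)
    # Normalize both functions to [0, 1] so the signature is scale invariant.
    f1 = (d_centroid - d_centroid.min()) / np.ptp(d_centroid)
    f2 = (centricity - centricity.min()) / np.ptp(centricity)
    H, _, _ = np.histogram2d(f1, f2, bins=bins, range=[[0, 1], [0, 1]])
    return H / H.sum()

def signature_dissimilarity(A, B):
    """Half the L1 distance between two normalized 2D signatures (0 = identical)."""
    return 0.5 * np.abs(shape_signature_2d(A) - shape_signature_2d(B)).sum()
```

Because only normalized distances enter the histogram, a rotated, translated, and scaled copy of a shape gets essentially the same signature, which is the invariance the abstract requires.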

  14. Signatures of aging revisited

    SciTech Connect

    Drell, S.; Jeanloz, R.; Cornwall, J.; Dyson, F.; Eardley, D.

    1998-03-18

    This study is a follow-on to the review made by JASON during its 1997 Summer Study of what is known about the aging of critical constituents, particularly the high explosives, metals (Pu, U), and polymers in the enduring stockpile. The JASON report (JSR-97-320) that summarized the findings was based on briefings by the three weapons labs (LANL, LLNL, SNL). They presented excellent technical analyses covering a broad range of scientific and engineering problems pertaining to determining signatures of aging. But the report also noted: "Missing, however, from the briefings and the written documents made available to us by the labs and DOE, was evidence of an adequately sharp focus and high priorities on a number of essential near-term needs of maintaining weapons in the stockpile."

  15. Ion Signatures of Reconnection

    NASA Technical Reports Server (NTRS)

    Chandler, Michael O.; Moore, T. E.; Mozer, F. S.; Russell, C. T.

    1998-01-01

    Magnetic reconnection during periods of northward interplanetary magnetic field results in complex field line behavior. It has been shown that the velocity distribution of ions can be used as a diagnostic to determine the location of the reconnection site as well as the resulting field line topology. Ion observations in the high altitude (6-9Re) cusp region from Polar/TIDE reveal a mix of distinct ion populations (including cold ionospheric ions and magnetosheath ions) which can be attributed to different sources. In addition, the phase-space distributions of these ions reveal features which are attributed to reconnection and interactions with the magnetopause current layer (e.g. acceleration, counterstreaming, mixing of magnetosheath and ionospheric ions, and "D"-shaped distributions). These signatures have been used in several cases to infer the location of the reconnection site, the topology of the resulting field lines, and the location of the observation point relative to the magnetopause.

  16. Being Online

    ERIC Educational Resources Information Center

    Hale, Sharon Joy Ng

    2007-01-01

    Online education is particularly well suited to the needs of community college students. Although community colleges have lower per-unit fees than four-year colleges and universities, many community college students still experience economic hardship. Even with fee waivers, students may have problems finding the money for textbooks,…

  17. Online inspection

    Technology Transfer Automated Retrieval System (TEKTRAN)

    An online line-scan imaging system capable of both hyperspectral and multispectral visible/near-infrared reflectance imaging was developed to inspect freshly slaughtered chickens on a processing line for wholesomeness. In-plant testing results indicated that the imaging inspection system achieved o...

  19. Online 1990.

    ERIC Educational Resources Information Center

    Goldstein, Morris

    This paper examines the co-existence of online and CD-ROM technologies in terms of their existing pricing structures, marketing strategies, functionality, and future roles. "Fixed Price Unlimited Usage" (FPUU) pricing and flat-rate pricing are discussed as viable alternatives to current pricing practices. In addition, it is argued that the…

  20. Multisensors signature prediction workbench

    NASA Astrophysics Data System (ADS)

    Latger, Jean; Cathala, Thierry

    2015-10-01

    Guidance of weapon systems relies on sensors to analyze target signatures. Defense weapon systems also need to detect and then identify threats, again using sensors. Sensor performance is highly dependent on conditions, e.g. time of day, atmospheric propagation, and background. Visible cameras are very efficient in diurnal fine-weather conditions, long-wave infrared sensors for night vision, and radar systems for seeing through the atmosphere and/or foliage. Moreover, multi-sensor systems, combining several collocated sensors with associated fusion algorithms, provide better efficiency (typically for Enhanced Vision Systems). But these sophisticated systems are all the more difficult to conceive, assess, and qualify, and in that frame multi-sensor simulation is highly required. This paper focuses on multi-sensor simulation tools. The first part surveys the state of the art of such simulation workbenches, with a special focus on SE-Workbench, which is described with regard to infrared/EO sensors, millimeter-wave sensors, active EO sensors, and GNSS sensors. A general overview of the objectives of simulating target and background signatures follows, depending on the type of simulation required (parametric studies, open-loop simulation, closed-loop simulation, hybridization of software simulation and hardware, etc.). After this review of objectives, the paper presents some basic requirements for simulation implementation, such as deterministic behavior of the simulation, mandatory for repeating it many times in parametric studies. Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU), and the tradeoff between physical accuracy and computational performance. Examples of results using SE-Workbench are shown and discussed.

  1. Signatures of dark matter

    NASA Astrophysics Data System (ADS)

    Baltz, Edward Anthony

    It is well known that most of the mass in the universe remains unobserved save for its gravitational effect on luminous matter. The nature of this "dark matter" remains a mystery. From measurements of the primordial deuterium abundance, the theory of big bang nucleosynthesis predicts that there are not enough baryons to account for the amount of dark matter observed, thus the missing mass must take an exotic form. Several promising candidates have been proposed. In this work I will describe my research along two main lines of inquiry into the dark matter puzzle. The first possibility is that the dark matter is exotic massive particles, such as those predicted by supersymmetric extensions to the standard model of particle physics. Such particles are generically called WIMPs, for weakly interacting massive particles. Focusing on the so-called neutralino in supersymmetric models, I discuss the possible signatures of such particles, including their direct detection via nuclear recoil experiments and their indirect detection via annihilations in the halos of galaxies, producing high energy antiprotons, positrons and gamma rays. I also discuss signatures of the possible slow decays of such particles. The second possibility is that there is a population of black holes formed in the early universe. Any dark objects in galactic halos, black holes included, are called MACHOs, for massive compact halo objects. Such objects can be detected by their gravitational microlensing effects. Several possibilities for sources of baryonic dark matter are also interesting for gravitational microlensing. These include brown dwarf stars and old, cool white dwarf stars. I discuss the theory of gravitational microlensing, focusing on the technique of pixel microlensing. I make predictions for several planned microlensing experiments with ground based and space based telescopes. Furthermore, I discuss binary lenses in the context of pixel microlensing. Finally, I develop a new technique for extracting the masses of the lenses from a pixel microlensing survey.

  2. Electromagnetic Signature Technique as a Promising Tool to Verify Nuclear Weapons Storage and Dismantlement under a Nuclear Arms Control Regime

    SciTech Connect

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.; Ramuhalli, Pradeep

    2012-08-01

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
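
The "yes/no" information-barrier concept described above can be illustrated with a toy comparison: the inspector's instrument reduces a measured low-frequency electromagnetic response to a single pass/fail answer against a stored template, never exposing the raw signature. The vector representation and the tolerance below are assumptions for illustration only, not PNNL's actual measurement protocol.

```python
import numpy as np

def verify_container(measured, template, tol=0.05):
    """Toy information-barrier check (illustrative only): compare a measured
    EM response vector against a stored template and return ONLY a yes/no
    answer, so no classified signature detail is revealed to the inspector."""
    measured = np.asarray(measured, dtype=float)
    template = np.asarray(template, dtype=float)
    err = np.linalg.norm(measured - template) / np.linalg.norm(template)
    return bool(err < tol)   # single bit out: presence confirmed or not
```

A response close to the template passes; a response from a different (or empty) container fails, which is all an arms control inspector would see.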

  3. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  4. Novel signatures of cancer-associated fibroblasts.

    PubMed

    Bozóky, Benedek; Savchenko, Andrii; Csermely, Péter; Korcsmáros, Tamás; Dúl, Zoltán; Pontén, Fredrik; Székely, László; Klein, George

    2013-07-15

    Increasing evidence indicates the importance of the tumor microenvironment, in particular cancer-associated fibroblasts, in cancer development and progression. In our study, we developed a novel, visually based method to identify new immunohistochemical signatures of these fibroblasts. The method employed a protein list based on 759 protein products of genes identified by RNA profiling from our previous study, comparing fibroblasts with differential growth-modulating effects on human cancer cells, and their first neighbors in the human protein interactome. These 2,654 proteins were analyzed in the Human Protein Atlas online database by comparing their immunohistochemical expression patterns in normal versus tumor-associated fibroblasts. Twelve new proteins differentially expressed in cancer-associated fibroblasts were identified (DLG1, BHLHE40, ROCK2, RAB31, AZI2, PKM2, ARHGAP31, ARHGAP26, ITCH, EGLN1, RNF19A and PLOD2), four of which can be connected to the Rho kinase signaling pathway. They were further analyzed in several additional tumor stromata, and the majority showed congruence among the different tumors. Many of them were also positive in normal myofibroblast-like cells. The new signatures can be useful in immunohistochemical analysis of different tumor stromata and may also give us an insight into the pathways activated in them in their true in vivo context. The method itself could be used for other similar analyses to identify proteins expressed in other cell types in tumors and their surrounding microenvironment. PMID:23319410

  5. The In-House Impact of Online Searching.

    ERIC Educational Resources Information Center

    Wismer, Donald

    1985-01-01

    Reports on in-house use of online services at the Maine State Library for searches not specifically requested by client or initiated by staff member to advance job flow. Use of TALIMAINE service for interlibrary loan verification, document delivery, book selection, and staff development, and client breakdown for 1983-1984 are described. (EJS)

  6. SCUDO: a tool for signature-based clustering of expression profiles

    PubMed Central

    Lauria, Mario; Moyseos, Petros; Priami, Corrado

    2015-01-01

    SCUDO (Signature-based ClUstering for DiagnOstic purposes) is an online tool for the analysis of gene expression profiles for diagnostic and classification purposes. The tool is based on a new method for the clustering of profiles based on a subject-specific, as opposed to disease-specific, signature. Our approach relies on construction of a reference map of transcriptional signatures, from both healthy and affected subjects, derived from their respective mRNA or miRNA profiles. A diagnosis for a new individual can then be performed by determining the position of the individual's transcriptional signature on the map. The diagnostic power of our method has been convincingly demonstrated in an open scientific competition (SBV Improver Diagnostic Signature Challenge), scoring second place overall and first place in one of the sub-challenges. PMID:25958391
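
The subject-specific signature idea can be sketched as a rank-based scheme: each subject's signature is the set of its most and least expressed genes, and a new subject is placed on the reference map by signature overlap with known subjects. This is an illustrative simplification, not SCUDO's actual algorithm; all function names and the parameter k are assumptions.

```python
import numpy as np

def subject_signature(profile, k=3):
    """Subject-specific signature (illustrative, in the spirit of SCUDO):
    the sets of the k highest- and k lowest-expressed genes in one profile."""
    order = np.argsort(profile)
    return set(order[-k:]), set(order[:k])

def signature_distance(sig_a, sig_b, k=3):
    """Dissimilarity = 1 minus the mean overlap of the top and bottom gene sets."""
    top_a, bot_a = sig_a
    top_b, bot_b = sig_b
    overlap = (len(top_a & top_b) + len(bot_a & bot_b)) / (2 * k)
    return 1.0 - overlap

def classify(profile, reference_profiles, labels, k=3):
    """Diagnose a new subject by the label of the nearest reference signature."""
    sig = subject_signature(profile, k)
    d = [signature_distance(sig, subject_signature(r, k), k) for r in reference_profiles]
    return labels[int(np.argmin(d))]
```

With reference profiles from both healthy and affected subjects, a new profile lands near the group whose extreme-expression genes it shares, mirroring the map-position diagnosis described in the abstract.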

  7. A (t, n)-Threshold Scheme of Multi-party Quantum Group Signature with Irregular Quantum Fourier Transform

    NASA Astrophysics Data System (ADS)

    Shi, Jinjing; Shi, Ronghua; Guo, Ying; Peng, Xiaoqi; Lee, Moon Ho; Park, Dongsun

    2012-04-01

    A novel (t, n)-threshold scheme for multi-party quantum group signature is proposed based on the irregular quantum Fourier transform (QFT), in which every t-qubit quantum message needs n participants to generate the quantum group signature. All the quantum operation gates in the quantum circuit can be distributed and arranged randomly in the irregular QFT algorithm, which significantly increases the von Neumann entropy of the signed quantum message and the randomness of the quantum signature generation. The generation and verification of the quantum group signature can both be performed in quantum circuits with the parallel algorithm. Security analysis shows that a valid and legal quantum (t, n)-threshold group signature can be achieved.
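
For reference, the regular quantum Fourier transform that the scheme's irregular variant rearranges can be written down directly as an N x N unitary. The sketch below builds that standard matrix and checks the unitarity on which signature verification relies; it does not implement the paper's irregular gate distribution, which is the novel part of the scheme.

```python
import numpy as np

def qft_matrix(n_qubits):
    """Matrix of the standard quantum Fourier transform on n qubits:
    F[j, k] = w^(j*k) / sqrt(N), with N = 2^n and w = exp(2*pi*i/N)."""
    N = 2 ** n_qubits
    w = np.exp(2j * np.pi / N)
    j, k = np.meshgrid(np.arange(N), np.arange(N))
    return w ** (j * k) / np.sqrt(N)

# Any legitimate signing transform must be unitary so verification can invert it.
U = qft_matrix(3)
assert np.allclose(U.conj().T @ U, np.eye(8))
```

Applying U and then its conjugate transpose returns any state unchanged, which is the invertibility a verifier needs to check a QFT-based signature.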

  8. Verification of classified fissile material using unclassified attributes

    SciTech Connect

    Nicholas, N.J.; Fearey, B.L.; Puckett, J.M.; Tape, J.W.

    1998-12-31

    This paper reports on the most recent efforts of US technical experts to explore verification by IAEA of unclassified attributes of classified excess fissile material. Two propositions are discussed: (1) that multiple unclassified attributes could be declared by the host nation and then verified (and reverified) by the IAEA in order to provide confidence in that declaration of a classified (or unclassified) inventory while protecting classified or sensitive information; and (2) that attributes could be measured, remeasured, or monitored to provide continuity of knowledge in a nonintrusive and unclassified manner. They believe attributes should relate to characteristics of excess weapons materials and should be verifiable and authenticatable with methods usable by IAEA inspectors. Further, attributes (along with the methods to measure them) must not reveal any classified information. The approach that the authors have taken is as follows: (1) assume certain attributes of classified excess material, (2) identify passive signatures, (3) determine range of applicable measurement physics, (4) develop a set of criteria to assess and select measurement technologies, (5) select existing instrumentation for proof-of-principle measurements and demonstration, and (6) develop and design information barriers to protect classified information. While the attribute verification concepts and measurements discussed in this paper appear promising, neither the attribute verification approach nor the measurement technologies have been fully developed, tested, and evaluated.

  9. On the signature of LINCOS

    NASA Astrophysics Data System (ADS)

    Ollongren, Alexander

    2010-12-01

    Suppose the international SETI effort yields the discovery of some signal of evidently non-natural origin. Could it contain linguistic information formulated in some kind of Lingua Cosmica? One way to get insight into this matter is to consider what specific (bio)linguistic signature(s) could be attached to a cosmic language for interstellar communication, designed by humans or an alien society having reached a level of intelligence (and technology) comparable to or surpassing ours. For this purpose, we consider in the present paper the logico-linguistic system LINCOS for (A)CETI, developed over a number of years by the author in several papers and a monograph [1]. The system has a two-fold signature, which distinguishes it significantly from natural languages. In fact, abstract and concrete signatures can be distinguished. That an abstract kind occurs is due to the manner in which abstractions of reality are represented in LINCOS texts. They can take compound forms because the system is multi-expressive, partly due to the availability of inductive (recursive) entities. On the other hand, the concrete signature of LINCOS is related to the distribution of delimiters and predefined tokens in texts. Assigning measures to concrete signatures will be discussed elsewhere. The present contribution concentrates on the abstract signature of the language. At the same time, it is realized that an alien Lingua Cosmica might, but does not necessarily, have this kind of signature.

  10. A proposed neutral line signature

    NASA Technical Reports Server (NTRS)

    Doxas, I.; Speiser, T. W.; Dusenbery, P. B.; Horton, W.

    1992-01-01

    An identifying signature is proposed for the existence and location of the neutral line in the magnetotail. The signature, abrupt density and temperature changes in the Earth-tail direction, was first discovered in test particle simulations. Such temperature variations have been observed in ISEE data (Huang et al., 1992), but their connection to the possible existence of a neutral line in the tail has not yet been established. The proposed signature develops earlier than the ion velocity space ridge of Martin and Speiser (1988), but can only be seen by spacecraft in the vicinity of the neutral line, while the latter can locate a neutral line remotely.

  11. Nondestructive verification with minimal movement of irradiated light-water-reactor fuel assemblies

    SciTech Connect

    Phillips, J.R.; Bosler, G.E.; Halbig, J.K.; Klosterbuer, S.F.; Menlove, H.O.

    1982-10-01

    Nondestructive verification of irradiated light-water reactor fuel assemblies can be performed rapidly and precisely by measuring their gross gamma-ray and neutron signatures. A portable system measured fuel assemblies with exposures ranging from 18.4 to 40.6 GWd/tU and with cooling times ranging from 1575 to 2638 days. Differences in the measured results for side or corner measurements are discussed. 25 figures, 20 tables.

  12. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have but it is not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight, STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  13. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    SciTech Connect

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.; Kreyling, Sean J.; Henry, Michael J.; Corley, Courtney D.; Whattam, Kevin M.

    2013-07-11

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  14. Intrusion detection using secure signatures

    DOEpatents

    Nelson, Trent Darnel; Haile, Jedediah

    2014-09-30

    A method and device for intrusion detection using secure signatures, comprising capturing network data. A search hash value, employing at least one one-way function, is generated from the captured network data using a first hash function. The presence of a search hash value match in a secure signature table comprising search hash values and encrypted rules is determined. After determining a search hash value match, a decryption key is generated from the captured network data using a second hash function, different from the first hash function. One or more of the encrypted rules of the secure signature table having a hash value equal to the generated search hash value are then decrypted using the generated decryption key. The one or more decrypted secure signature rules are then processed for a match, and one or more user notifications are deployed if a match is identified.
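
    The two-hash flow described above can be sketched in a few lines. This is an illustrative reading of the patent abstract, not its actual implementation: SHA-256 and SHA-512 stand in for the two distinct hash functions, and a toy XOR cipher stands in for whatever cipher protects the rules.

```python
import hashlib

# Illustrative two-hash flow: one hash keys the table lookup, a second,
# different hash derives the decryption key for the matching rule.
# XOR stands in for a real cipher to keep the sketch dependency-free.

def search_hash(data: bytes) -> bytes:
    """First one-way function: derives the secure-table lookup key."""
    return hashlib.sha256(b"search|" + data).digest()

def rule_key(data: bytes) -> bytes:
    """Second one-way function: derives the rule decryption key."""
    return hashlib.sha512(b"key|" + data).digest()

def xor_crypt(blob: bytes, key: bytes) -> bytes:
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(blob))

def build_table(payloads, rules):
    """Secure signature table: search hash -> rule encrypted under a key
    derivable only from traffic that reproduces the payload."""
    return {search_hash(p): xor_crypt(r, rule_key(p))
            for p, r in zip(payloads, rules)}

def inspect(captured: bytes, table: dict):
    h = search_hash(captured)
    if h not in table:                  # no signature match: rules stay opaque
        return None
    return xor_crypt(table[h], rule_key(captured))

table = build_table([b"EVIL_PAYLOAD"], [b"alert: drop connection"])
hit = inspect(b"EVIL_PAYLOAD", table)   # recovers the plaintext rule
miss = inspect(b"benign traffic", table)
```

    Note that non-matching traffic yields neither the rule plaintext nor the key, which is the point of keeping the rules encrypted in the table.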

  15. Ballistic signature identification systems study

    NASA Technical Reports Server (NTRS)

    Reich, A.; Hine, T. L.

    1976-01-01

    The results are described of an attempt to establish a uniform procedure for documenting (recording) expended bullet signatures as effortlessly as possible and to build a comprehensive library of these signatures in a form that will permit the automated comparison of a new suspect bullet with the prestored library. The ultimate objective is to achieve a standardized format that will permit nationwide interaction between police departments, crime laboratories, and other interested law enforcement agencies.

  16. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  17. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  18. Online Organic Chemistry

    ERIC Educational Resources Information Center

    Janowicz, Philip A.

    2010-01-01

    This is a comprehensive study of the many facets of an entirely online organic chemistry course. Online homework with structure-drawing capabilities was found to be more effective than written homework. Online lecture was found to be just as effective as in-person lecture, and students prefer an online lecture format with shorter Webcasts. Online

  19. ETV TECHNOLOGIES UNDERGOING VERIFICATION (ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM)

    EPA Science Inventory

    There are 109 technologies undergoing verification in the following categories: Air, Water, Monitoring, Pollution Prevention and Independent. For the air category there are 12 in the Air Pollution Control Technology Pilot and 6 in the Greenhouse Gas Technology Pilot. In the water...

  20. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  1. 17 CFR 232.302 - Signatures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... with an electronic filing, the term signature means an electronic entry in the form of a magnetic... comprising a name, executed, adopted or authorized as a signature. Signatures are not required in unofficial... and 270.30a-2 of this chapter) shall manually sign a signature page or other document...

  2. 17 CFR 232.302 - Signatures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... with an electronic filing, the term signature means an electronic entry in the form of a magnetic... comprising a name, executed, adopted or authorized as a signature. Signatures are not required in unofficial... and 270.30a-2 of this chapter) shall manually sign a signature page or other document...

  3. 17 CFR 232.302 - Signatures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... with an electronic filing, the term signature means an electronic entry in the form of a magnetic... comprising a name, executed, adopted or authorized as a signature. Signatures are not required in unofficial... and 270.30a-2 of this chapter) shall manually sign a signature page or other document...

  4. 17 CFR 232.302 - Signatures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... with an electronic filing, the term signature means an electronic entry in the form of a magnetic... comprising a name, executed, adopted or authorized as a signature. Signatures are not required in unofficial... and 270.30a-2 of this chapter) shall manually sign a signature page or other document...

  5. Quantum messages with signatures forgeable in arbitrated quantum signature schemes

    NASA Astrophysics Data System (ADS)

    Kim, Taewan; Choi, Jeong Woon; Jho, Nam-Su; Lee, Soojoon

    2015-02-01

    Even though a method to perfectly sign quantum messages is not known, the arbitrated quantum signature scheme has been considered one of the good candidates. However, its forgery problem has been an obstacle to the scheme becoming a successful method. In this paper, we consider a situation slightly different from the forgery problem: we check whether at least one quantum message with signature can be forged in a given scheme, even if not all messages can be forged. If there are only a finite number of forgeable quantum messages in the scheme, then the scheme can be secured against the forgery attack by not sending forgeable quantum messages, so our situation does not directly correspond to checking whether the scheme is secure against the attack. However, if users run a given scheme without any consideration of forgeable quantum messages, then a sender might transmit such forgeable messages to a receiver, and in that case an attacker who knows them can forge the messages. Thus it is important and necessary to look into forgeable quantum messages. We show here that such a forgeable quantum message-signature pair always exists for every known scheme with quantum encryption and rotation, and numerically show that no forgeable quantum message-signature pairs exist in an arbitrated quantum signature scheme.

  6. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
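
    The pseudo-predator construction described above can be sketched as follows. All prey data, diet proportions, and bootstrap sample sizes here are invented for illustration; Bromaghin's contribution is precisely an objective way to choose those sample sizes, which this sketch does not reproduce.

```python
import numpy as np

# Sketch: a pseudo-predator fatty acid signature is built by bootstrap
# resampling each prey species' signatures, averaging, weighting by an
# assumed diet, and renormalizing to proportions.

rng = np.random.default_rng(0)

def pseudo_predator(prey_sigs: dict, diet: dict, n_boot: dict) -> np.ndarray:
    """prey_sigs: species -> (n_animals, n_fatty_acids) proportion matrix.
    diet: species -> diet proportion (sums to 1).
    n_boot: species -> bootstrap sample size."""
    parts = []
    for sp, sigs in prey_sigs.items():
        idx = rng.integers(0, sigs.shape[0], size=n_boot[sp])  # resample animals
        parts.append(diet[sp] * sigs[idx].mean(axis=0))        # diet-weighted mean
    sig = np.sum(parts, axis=0)
    return sig / sig.sum()                                     # renormalize

# Synthetic prey library: 30 animals x 5 fatty acids per species.
prey = {"seal": rng.dirichlet(np.ones(5), size=30),
        "fish": rng.dirichlet(np.ones(5), size=30)}
sig = pseudo_predator(prey, {"seal": 0.7, "fish": 0.3},
                      {"seal": 20, "fish": 20})
```

    Repeating this construction many times gives a population of pseudo-predators with known diets against which a QFASA estimator can be scored.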

  7. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Verification. 2.902 Section 2.902... RULES AND REGULATIONS Equipment Authorization Procedures General Provisions 2.902 Verification. (a... the Commission pursuant to 2.957, of this part. (b) Verification attaches to all items...

  8. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions 2.902 Verification. (a) Verification is a procedure where...

  9. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Verification. 2.902 Section 2.902... RULES AND REGULATIONS Equipment Authorization Procedures General Provisions 2.902 Verification. (a... the Commission pursuant to 2.957, of this part. (b) Verification attaches to all items...

  10. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow

  11. Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2014-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on development of asset fault signatures to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features, based on technical examinations, that can be used to detect a specific fault type. At the most basic level, fault signatures are comprised of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
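
    The basic signature structure described above (an asset type, a fault type, and one or more fault features) might be rendered as a simple data structure. The field names, example values, and the naive matching rule below are hypothetical illustrations, not the FW-PHM Suite schema.

```python
from dataclasses import dataclass, field

# Hypothetical rendering of a fault signature: asset type + fault type +
# a set of symptoms (fault features) indicative of the fault.

@dataclass(frozen=True)
class FaultFeature:
    name: str          # observable symptom, e.g. a monitored parameter
    behavior: str      # how the symptom manifests

@dataclass
class AssetFaultSignature:
    asset_type: str
    fault_type: str
    features: list[FaultFeature] = field(default_factory=list)

    def matches(self, observed: set[str]) -> bool:
        """Crude diagnostic check: every symptom name was observed."""
        return all(f.name in observed for f in self.features)

sig = AssetFaultSignature(
    asset_type="emergency diesel generator",
    fault_type="cooling system degradation",
    features=[FaultFeature("coolant_temp", "trending high"),
              FaultFeature("coolant_pressure", "trending low")])
ok = sig.matches({"coolant_temp", "coolant_pressure", "rpm"})
```

    A database of such records, queried against live symptoms, is the kind of standardized lookup the diagnostic workflow relies on.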

  12. Real-time online unsupervised detection and classification for remotely sensed imagery

    NASA Astrophysics Data System (ADS)

    Du, Qian

    2004-08-01

    Real-time online processing is important to provide immediate data analysis for resolving critical situations in real applications of hyperspectral imaging. We have developed a Constrained Linear Discriminant Analysis (CLDA) algorithm, an excellent approach to hyperspectral image classification, and investigated its real-time online implementation. Because the required prior object spectral signatures may be unavailable in practice, we propose its unsupervised version in this paper. The new algorithm includes unsupervised signature estimation in real time followed by the real-time CLDA algorithm for classification. The unsupervised signature estimation is based on the linear mixture model and a least-squares error criterion. A preliminary result using a HYDICE scene demonstrates its feasibility and effectiveness.
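
    The least-squares step of the linear mixture model can be illustrated with synthetic data. This is an unconstrained ordinary-least-squares sketch, not the CLDA algorithm itself (which adds classification constraints), and the signature matrix here is random rather than estimated from imagery.

```python
import numpy as np

# Linear mixture model: pixel = M @ a + noise, where the columns of M are
# endmember signatures and a holds abundances. Least squares recovers a;
# classifying by the dominant abundance gives a per-pixel label.

rng = np.random.default_rng(1)
bands, n_sig = 10, 3
M = rng.random((bands, n_sig))              # signature matrix (synthetic)

true_a = np.array([0.6, 0.3, 0.1])          # ground-truth abundances
pixel = M @ true_a + 1e-6 * rng.standard_normal(bands)

a_hat, *_ = np.linalg.lstsq(M, pixel, rcond=None)  # least-squares abundances
label = int(np.argmax(a_hat))               # classify by dominant signature
```

    In the unsupervised setting the columns of M would themselves be estimated from the scene before this step runs.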

  13. Improved method for coliform verification.

    PubMed

    Diehl, J D

    1991-02-01

    Modification of a method for coliform verification presented in Standard Methods for the Examination of Water and Wastewater is described. Modification of the method, which is based on beta-galactosidase production, involves incorporation of a lactose operon inducer in medium upon which presumptive coliform isolates are cultured prior to beta-galactosidase assay. PMID:1901712

  14. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

    This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100s of warheads, and then 10s of warheads, before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100s, and 10s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain-of-custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  15. VERIFICATION OF WATER QUALITY MODELS

    EPA Science Inventory

    The basic concepts of water quality models are reviewed and the need to recognize calibration and verification of models with observed data is stressed. Post auditing of models after environmental control procedures are implemented is necessary to determine true model prediction ...

  16. Signature molecular descriptor : advanced applications.

    SciTech Connect

    Visco, Donald Patrick, Jr.

    2010-04-01

    In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area, and serendipity was the mechanism for solution. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge into a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in the academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem as well as how to go from an output of the algorithm to an actual chemical structure.
This report provides details on a technique to describe molecules on a computer, called Signature, as well as the computer-aided molecule design algorithm built around Signature. Two applications are provided of the CAMD algorithm with Signature. The first describes the design of green solvents based on data in the GlaxoSmithKline (GSK) Solvent Selection Guide. The second provides novel non-steroidal glucocorticoid receptor ligands with some optimally predicted properties. In addition to using the CAMD algorithm with Signature, it is demonstrated how to employ Signature in a high-throughput screening study. Here, after classifying both active and inactive inhibitors for the protein Factor XIa using Signature, the model developed is used to screen a large, publicly-available database called PubChem for the most active compounds.

  17. Temperature effects on airgun signatures

    SciTech Connect

    Langhammer, J.; Landroe, M.

    1993-08-01

    Experiments in an 850 liter water tank were performed in order to study temperature effects on airgun signatures, and to achieve a better understanding of the physical processes that influence an airgun signature. The source was a Bolt airgun with a chamber volume of 1.6 cu. in. The pressure used was 100 bar and the gun depth was 0.5 m. The water temperature in the tank was varied between 5 C and 45 C. Near-field signatures were recorded at different water temperatures. Typical signature characteristics such as the primary-to-bubble ratio and the bubble time period increased with increasing water temperature. For comparison, and in order to check whether this is valid for larger guns, computer modeling of airguns with chamber volumes of 1.6 and 40 cu. in. was performed. In the modeling, the same behavior of the signatures with increasing water temperature is observed. The increase in the primary-to-bubble ratio and the bubble time period with increasing water temperature can be explained by an increased mass transfer across the bubble wall.

  18. Graph Analytics for Signature Discovery

    SciTech Connect

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh; Lo, Chaomei

    2013-06-01

    Within large amounts of seemingly unstructured data it can be difficult to find signatures of events. In our work we transform unstructured data into a graph representation. By doing this we expose underlying structure in the data and can take advantage of existing graph analytics capabilities, as well as develop new capabilities. Currently we focus on applications in the cybersecurity and communication domains. Within cybersecurity we aim to find signatures for perpetrators using the pass-the-hash attack, and in communications we look for emails or phone calls going up or down a chain of command. In both of these areas, and in many others, the signature we look for is a path with certain temporal properties. In this paper we discuss our methodology for finding these temporal paths within large graphs.
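
    A minimal sketch of the temporal-path idea: enumerate paths whose edge timestamps strictly increase, as an email or call moving up a chain of command would. The graph and events below are invented for illustration, not the paper's method or data.

```python
from collections import defaultdict

def temporal_paths(edges, src, dst):
    """edges: list of (u, v, t). Returns all src->dst paths whose edge
    timestamps strictly increase along the path."""
    adj = defaultdict(list)
    for u, v, t in edges:
        adj[u].append((v, t))
    out = []
    def dfs(node, last_t, path):
        if node == dst:
            out.append(path[:])
            return
        for nxt, t in adj[node]:
            if t > last_t:                  # enforce temporal ordering
                dfs(nxt, t, path + [(node, nxt, t)])
    dfs(src, float("-inf"), [])
    return out

# Invented communication events: (sender, receiver, time)
events = [("analyst", "manager", 1), ("manager", "director", 3),
          ("manager", "director", 0), ("analyst", "director", 5)]
paths = temporal_paths(events, "analyst", "director")
```

    The edge at time 0 is never used after the time-1 edge, so only the temporally consistent chains survive; real deployments would add pruning to cope with graph scale.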

  19. Signature Visualization of Software Binaries

    SciTech Connect

    Panas, T

    2008-07-01

    In this paper we present work on the visualization of software binaries. In particular, we utilize ROSE, an open source compiler infrastructure, to pre-process software binaries, and we apply a landscape metaphor to visualize the signature of each binary (malware). We define the signature of a binary as a metric-based layout of the functions contained in the binary. In our initial experiment, we visualize the signatures of a series of computer worms that all originate from the same line. These visualizations are useful for a number of reasons. First, the images reveal how the archetype has evolved over a series of versions of one worm. Second, one can see the distinct changes between versions. This allows the viewer to form conclusions about the development cycle of a particular worm.

  20. Molecular signatures for vaccine development.

    PubMed

    Maertzdorf, J; Kaufmann, S H E; Weiner, J

    2015-09-29

    The immune system has evolved complex and specialized mechanisms to mount specific defense responses against the various types of pathogens it encounters. For the development of new vaccines, it is crucial to gain a better understanding of what these mechanisms are and how they work. The field of vaccinology has adopted high-throughput profiling techniques to gain more detailed insights into the various immune responses elicited by different vaccines and natural infections. From all detailed transcriptional profiles generated today, a general picture of immunological responses emerges. First, almost every type of vaccine induces an early interferon-dominated signature. Second, different vaccine formulations induce distinct transcriptional signatures, representing the highly specialized defense mechanisms that must cope with the different pathogens and insults they cause. Transcriptional profiling has shifted its attention toward early molecular signatures, with a growing awareness that early innate responses are likely critical instructors for the development of adaptive immunity at later time points. PMID:25858856

  1. catRAPID signature: identification of ribonucleoproteins and RNA-binding regions

    PubMed Central

    Livi, Carmen Maria; Klus, Petr; Delli Ponti, Riccardo; Tartaglia, Gian Gaetano

    2016-01-01

    Motivation: Recent technological advances revealed that an unexpectedly large number of proteins interact with transcripts even if the RNA-binding domains are not annotated. We introduce catRAPID signature to identify ribonucleoproteins based on physico-chemical features instead of sequence similarity searches. The algorithm, trained on human proteins and tested on model organisms, calculates the overall RNA-binding propensity followed by the prediction of RNA-binding regions. catRAPID signature outperforms other algorithms in the identification of RNA-binding proteins and detection of non-classical RNA-binding regions. Results are visualized on a webpage and can be downloaded or forwarded to catRAPID omics for predictions of RNA targets. Availability and implementation: catRAPID signature can be accessed at http://s.tartaglialab.com/new_submission/signature. Contact: gian.tartaglia@crg.es or gian@tartaglialab.com Supplementary information: Supplementary data are available at Bioinformatics online. PMID:26520853

  2. Hybrid Enrichment Assay Methods for a UF6 Cylinder Verification Station: FY10 Progress Report

    SciTech Connect

    Smith, Leon E.; Jordan, David V.; Orton, Christopher R.; Misner, Alex C.; Mace, Emily K.

    2010-08-01

    Pacific Northwest National Laboratory (PNNL) is developing the concept of an automated UF6 cylinder verification station that would be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until the arrival of International Atomic Energy Agency (IAEA) inspectors. At the center of this unattended system is a hybrid enrichment assay technique that combines the traditional enrichment-meter method (based on the 186 keV peak from 235U) with non-traditional neutron-induced high-energy gamma-ray signatures (spawned primarily by 234U alpha emissions and 19F(alpha, neutron) reactions). Previous work by PNNL provided proof-of-principle for the non-traditional signatures to support accurate, full-volume interrogation of the cylinder enrichment, thereby reducing the systematic uncertainties in enrichment assay due to UF6 heterogeneity and providing greater sensitivity to material substitution scenarios. The work described here builds on that preliminary evaluation of the non-traditional signatures, but focuses on a prototype field system utilizing NaI(Tl) and LaBr3(Ce) spectrometers, and enrichment analysis algorithms that integrate the traditional and non-traditional signatures. Results for the assay of Type-30B cylinders ranging from 0.2 to 4.95 wt% 235U, at an AREVA fuel fabrication plant in Richland, WA, are described for the following enrichment analysis methods: 1) traditional enrichment meter signature (186 keV peak) as calculated using a square-wave convolute (SWC) algorithm; 2) non-traditional high-energy gamma-ray signature that provides neutron detection without neutron detectors and 3) hybrid algorithm that merges the traditional and non-traditional signatures. 
Uncertainties for each method, relative to the declared enrichment for each cylinder, are calculated and compared to the uncertainties from an attended HPGe verification station at AREVA, and the IAEA’s uncertainty target values for feed, tail and product cylinders. A summary of the major findings from the field measurements and subsequent analysis follows: • Traditional enrichment-meter assay using specially collimated NaI spectrometers and a Square-Wave-Convolute algorithm can achieve uncertainties comparable to HPGe and LaBr for product, natural and depleted cylinders. • Non-traditional signatures measured using NaI spectrometers enable interrogation of the entire cylinder volume and accurate measurement of absolute 235U mass in product, natural and depleted cylinders. • A hybrid enrichment assay method can achieve lower uncertainties than either the traditional or non-traditional methods acting independently because there is a low degree of correlation in the systematic errors of the two individual methods (wall thickness variation and 234U/235U variation, respectively). This work has indicated that the hybrid NDA method has the potential to serve as the foundation for an unattended cylinder verification station. When compared to today’s handheld cylinder-verification approach, such a station would have the following advantages: 1) improved enrichment assay accuracy for product, tail and feed cylinders; 2) full-volume assay of absolute 235U mass; 3) assay of minor isotopes (234U and 232U) important to verification of feedstock origin; 4) single instrumentation design for both Type 30B and Type 48 cylinders; and 5) substantial reduction in the inspector manpower associated with cylinder verification.
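The statistical reason the hybrid method beats either signature alone can be illustrated with a minimal inverse-variance combination sketch. This is a generic estimator-fusion illustration, not PNNL's actual hybrid algorithm; all names and numbers are hypothetical:

```python
import math

def combine_estimates(x1, s1, x2, s2, rho=0.0):
    """Minimum-variance linear combination of two estimates x1, x2
    with 1-sigma uncertainties s1, s2 and error correlation rho.
    Assumes s1**2 + s2**2 > 2*rho*s1*s2 (non-degenerate case)."""
    cov = rho * s1 * s2
    # Weight on the first estimate that minimizes the combined variance
    w = (s2**2 - cov) / (s1**2 + s2**2 - 2 * cov)
    x = w * x1 + (1 - w) * x2
    var = s1**2 * s2**2 * (1 - rho**2) / (s1**2 + s2**2 - 2 * cov)
    return x, math.sqrt(var)
```

With uncorrelated, equal uncertainties the combined uncertainty drops by a factor of sqrt(2); as rho approaches 1 the gain vanishes, which is why the low correlation between wall-thickness and 234U/235U systematics noted above matters.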

  3. Ballistic Signature Identification System Study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The first phase of a research project directed toward development of a high speed automatic process to be used to match gun barrel signatures imparted to fired bullets was documented. An optical projection technique has been devised to produce and photograph a planar image of the entire signature, and the phototransparency produced is subjected to analysis using digital Fourier transform techniques. The success of this approach appears to be limited primarily by the accuracy of the photographic step since no significant processing limitations have been encountered.
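The matching step described here, comparing Fourier-domain representations of two barrel signatures, can be sketched with a generic FFT-based cross-correlation score. This is a standard shift-invariant matching technique, not the study's actual processing chain; the function name is illustrative:

```python
import numpy as np

def signature_match_score(sig_a, sig_b):
    """Peak of the normalized circular cross-correlation of two 1-D
    signatures, computed via FFT. Returns ~1.0 for identical
    signatures (up to a circular shift), near 0 for unrelated ones."""
    a = (sig_a - sig_a.mean()) / (sig_a.std() * len(sig_a))
    b = (sig_b - sig_b.mean()) / sig_b.std()
    # Circular correlation theorem: corr = IFFT(FFT(a) * conj(FFT(b)))
    corr = np.fft.ifft(np.fft.fft(a) * np.conj(np.fft.fft(b))).real
    return corr.max()
```

Because the correlation is circular, the score is insensitive to where around the bullet's circumference the signature trace begins.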

  4. Online Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Meyer Jordan, Bradley, IV; The, Lih-Sin; Robbins, Stuart

    2004-05-01

Nuclear-reaction network codes are important to astronomers seeking to explore nucleosynthetic implications of astrophysical models and to nuclear physicists seeking to understand the role of nuclear properties or reaction rates in element formation. However, many users do not have the time or inclination to download and compile the codes, to manage the requisite input files, or to explore the often complex output with their own graphics programs. To help make nucleosynthesis calculations more readily available, we have placed the Clemson Nucleosynthesis code on the world-wide web at http://www.ces.clemson.edu/physics/nucleo/nuclearNetwork. At this web site, any Internet user may set his or her own reaction network, nuclear properties and reaction rates, and thermodynamic trajectories. The user then submits the nucleosynthesis calculation, which runs on a dedicated server professionally maintained at Clemson University. Once the calculation is completed, the user may explore the results through dynamically produced and downloadable tables and graphs. Online help guides the user through the necessary steps. We hope this web site will prove a user-friendly and helpful tool for professional scientists as well as for students seeking to explore element formation.
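The core of any reaction-network code is numerical integration of coupled abundance ODEs. A minimal sketch for a single hypothetical transmutation channel, integrated with classic fourth-order Runge-Kutta (not the Clemson code's actual network or API):

```python
def decay_chain(y1_0, y2_0, lam, dt, steps):
    """Toy two-species network: species 1 transmutes to species 2 at
    rate lam. Integrates dY1/dt = -lam*Y1, dY2/dt = +lam*Y1 with RK4."""
    def f(y):
        # Right-hand side of the abundance ODEs
        return (-lam * y[0], lam * y[0])
    y = (y1_0, y2_0)
    for _ in range(steps):
        k1 = f(y)
        k2 = f(tuple(yi + 0.5 * dt * ki for yi, ki in zip(y, k1)))
        k3 = f(tuple(yi + 0.5 * dt * ki for yi, ki in zip(y, k2)))
        k4 = f(tuple(yi + dt * ki for yi, ki in zip(y, k3)))
        y = tuple(yi + dt / 6 * (a + 2 * b + 2 * c + d)
                  for yi, a, b, c, d in zip(y, k1, k2, k3, k4))
    return y
```

Real networks couple hundreds of species through temperature-dependent rates and use implicit solvers because the system is stiff, but the structure (rates in, abundances out, conservation of total abundance) is the same.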

  5. Library Online Systems.

    ERIC Educational Resources Information Center

    Folda, Linda; And Others

    1989-01-01

Issues related to library online systems are discussed in six articles. Topics covered include staff education through vendor demonstrations, evaluation of online public access catalogs, the impact of integrated online systems on cataloging operations, the merits of smart and dumb barcodes, and points to consider in planning for the next online…

  6. University Student Online Plagiarism

    ERIC Educational Resources Information Center

    Wang, Yu-mei

    2008-01-01

    This article reports a study investigating university student online plagiarism. The following questions are investigated: (a) What is the incidence of student online plagiarism? (b) What are student perceptions regarding online plagiarism? (c) Are there any differences in terms of student perceptions of online plagiarism and print plagiarism? (d)…

  8. Strategies for Online Educators

    ERIC Educational Resources Information Center

    Motte, Kristy

    2013-01-01

For a variety of reasons, online education is an increasingly viable option for many students seeking to further their education. Because of this, the demand for online instructors continues to increase. Instructors transitioning to the online environment from the traditional classroom may find teaching online overwhelming. While some practices…

  9. Online Organic Chemistry

    ERIC Educational Resources Information Center

    Janowicz, Philip A.

    2010-01-01

    This is a comprehensive study of the many facets of an entirely online organic chemistry course. Online homework with structure-drawing capabilities was found to be more effective than written homework. Online lecture was found to be just as effective as in-person lecture, and students prefer an online lecture format with shorter Webcasts. Online…

  10. Online Search Optimization.

    ERIC Educational Resources Information Center

    Homan, Michael; Worley, Penny

This course syllabus describes methods for optimizing online searching, using as an example searching on the National Library of Medicine (NLM) online system. Four major activities considered are the online interview, query analysis and search planning, online interaction, and post-search analysis. Within the context of these activities, concepts…

  11. Fueled emitter, TFE Verification Program

    NASA Astrophysics Data System (ADS)

    1994-07-01

    The program objective is to demonstrate the technology readiness of a TFE suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program built directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addressed that concern.

  12. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project testing. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  13. Gender verification testing in sport.

    PubMed

    Ferris, E A

    1992-07-01

    Gender verification testing in sport, first introduced in 1966 by the International Amateur Athletic Federation (IAAF) in response to fears that males with a physical advantage in terms of muscle mass and strength were cheating by masquerading as females in women's competition, has led to unfair disqualifications of women athletes and untold psychological harm. The discredited sex chromatin test, which identifies only the sex chromosome component of gender and is therefore misleading, was abandoned in 1991 by the IAAF in favour of medical checks for all athletes, women and men, which preclude the need for gender testing. But, women athletes will still be tested at the Olympic Games at Albertville and Barcelona using polymerase chain reaction (PCR) to amplify DNA sequences on the Y chromosome which identifies genetic sex only. Gender verification testing may in time be abolished when the sporting community are fully cognizant of its scientific and ethical implications. PMID:1450892

  14. Topological Signatures for Population Admixture

    Technology Transfer Automated Retrieval System (TEKTRAN)

Topological Signatures for Population Admixture. Deniz Yorukoglu, Filippo Utro, David Kuhn, Saugata Basu and Laxmi Parida. Abstract Background: As populations with multi-linear transmission (i.e., mixing of genetic material from two parents, say) evolve over generations, the genetic transmission...

  15. Material discrimination using bispectral signatures

    NASA Astrophysics Data System (ADS)

    Nyffenegger, Paul A.; Hinich, Melvin J.

    2003-10-01

A method is presented for material discrimination and characterization using bispectral signatures acquired from an object actively probed with acoustic pulses. Although bispectral techniques have proven useful in a diverse array of fields including passive acoustic ranging, bispectral processing in active acoustic applications has not been widely explored. The mechanisms responsible for the bispectral signatures revealed using active acoustics have not been well studied and little is known about the relative contributions to the bispectrum originating in the physical properties of the target material itself rather than from target structural acoustics and the propagation media. In a pilot experiment, we determine bispectral signatures for three targets of differing composition but similar dimensions using a submerged ultrasonic apparatus. The experiment is designed to isolate effects due to target properties from those attributable to propagation path, source, or receiver. The source wavelet is a broad-spectrum linear frequency modulated pulse. The normalized bispectrum is calculated using conventional nonparametric methods, and is averaged across many frames. Results indicate that at ultrasonic frequencies this technique provides signatures with the potential of discriminating between classes of materials such as plastic, metal, and rock. [Work supported by Applied Research Laboratories.]
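The frame-averaged normalized bispectrum mentioned in this abstract can be sketched as follows. This is a generic bicoherence-style estimator with one common normalization convention; the paper's exact normalization is not specified here:

```python
import numpy as np

def normalized_bispectrum(frames):
    """Frame-averaged bispectrum B(f1, f2) = E[X(f1) X(f2) X*(f1+f2)],
    normalized by the mean power at the three frequencies so that a
    fully phase-coupled frequency triad gives |b| near 1."""
    n_frames, n = frames.shape
    X = np.fft.fft(frames, axis=1)
    P = (np.abs(X) ** 2).mean(axis=0)      # mean power spectrum
    half = n // 2
    b = np.zeros((half, half), dtype=complex)
    for i in range(half):
        for j in range(half):
            triple = (X[:, i] * X[:, j] * np.conj(X[:, (i + j) % n])).mean()
            denom = np.sqrt(P[i] * P[j] * P[(i + j) % n])
            b[i, j] = triple / max(denom, 1e-12)
    return b
```

Quadratic phase coupling (the phase at f1+f2 locked to the sum of the phases at f1 and f2) survives the frame averaging and produces a peak at (f1, f2), while components with independent phases average toward zero.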

  16. Disaster relief through composite signatures

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.; Hyde, Brian; Carpenter, Tom; Nichols, Steve

    2012-06-01

A composite signature is a group of signatures that are related in such a way to more completely or further define a target or operational endeavor at a higher fidelity. This paper builds on previous work developing innovative composite signatures associated with civil disasters, including physical, chemical and pattern/behavioral. For the composite signature approach to be successful it requires effective data fusion and visualization. This plays a key role in both preparedness and the response and recovery which are critical to saving lives. Visualization tools enhance the overall understanding of the crisis by pulling together and analyzing the data, and providing a clear and complete analysis of the information to the organizations/agencies dependent on it for a successful operation. An example of this, Freedom Web, is an easy-to-use data visualization and collaboration solution for use in homeland security, emergency preparedness, situational awareness, and event management. The solution provides a nationwide common operating picture for all levels of government through a web based, map interface. The tool was designed to be utilized by non-geospatial experts and is easily tailored to the specific needs of the users. Consisting of standard COTS and open source databases and a web server, users can view, edit, share, and highlight information easily and quickly through a standard internet browser.

  17. Signature simulation of mixed materials

    NASA Astrophysics Data System (ADS)

    Carson, Tyler D.; Salvaggio, Carl

    2015-05-01

    Soil target signatures vary due to geometry, chemical composition, and scene radiometry. Although radiative transfer models and function-fit physical models may describe certain targets in limited depth, the ability to incorporate all three signature variables is difficult. This work describes a method to simulate the transient signatures of soil by first considering scene geometry synthetically created using 3D physics engines. Through the assignment of spectral data from the Nonconventional Exploitation Factors Data System (NEFDS), the synthetic scene is represented as a physical mixture of particles. Finally, first principles radiometry is modeled using the Digital Imaging and Remote Sensing Image Generation (DIRSIG) model. With DIRSIG, radiometric and sensing conditions were systematically manipulated to produce and record goniometric signatures. The implementation of this virtual goniometer allows users to examine how a target bidirectional reflectance distribution function (BRDF) will change with geometry, composition, and illumination direction. By using 3D computer graphics models, this process does not require geometric assumptions that are native to many radiative transfer models. It delivers a discrete method to circumnavigate the significant cost of time and treasure associated with hardware-based goniometric data collections.
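A virtual goniometer of the kind described here reduces to evaluating a BRDF over a grid of illumination and view directions. A toy sketch with an analytic Lambertian-plus-Phong BRDF, purely illustrative; DIRSIG's actual material models are far richer:

```python
import math

def phong_brdf(wi, wo, n, kd=0.8, ks=0.2, shininess=10.0):
    """Toy BRDF: Lambertian diffuse term plus a normalized Phong
    specular lobe. wi, wo, n are unit 3-vectors: incident direction,
    view direction, and surface normal (all pointing away from the
    surface). kd/ks are hypothetical diffuse/specular albedos."""
    def dot(a, b):
        return sum(x * y for x, y in zip(a, b))
    # Mirror reflection of the incident direction about the normal
    r = tuple(2 * dot(wi, n) * ni - wii for ni, wii in zip(n, wi))
    spec = max(dot(r, wo), 0.0) ** shininess
    return kd / math.pi + ks * (shininess + 2) / (2 * math.pi) * spec
```

Sweeping `wo` over a hemisphere of view directions while holding `wi` fixed records exactly the kind of goniometric signature the abstract describes, without physical hardware.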

  18. Calibration and verification portable ceilometer

    NASA Astrophysics Data System (ADS)

    Kryuchkov, A. V.; Grishin, A. I.

    2015-11-01

This article considers problems in the calibration and verification of a portable meter of cloud-base height. The characteristics and operating conditions of calibration kits are analyzed, and the feasibility of methods based on a compact calibration device is demonstrated. Based on the study, the authors propose a technical implementation, formulate its basic characteristics, and identify the sources and extent of error.

  19. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  20. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

Verification's importance has changed dramatically over time, although it always has been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated arms control does not allow for verification provisions by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  1. Attitudes toward buying online.

    PubMed

    Yang, Bijou; Lester, David

    2004-02-01

    A survey of 11 positive features and 10 discouraging features of online shopping was carried out on 180 students and identified certain behavioral patterns for online shoppers versus non-shoppers. It was found that online shoppers have consistently stronger positive feelings about online shopping than do non-shoppers. On the other hand, non-shoppers have more negative feelings about online shopping than do shoppers, but not consistently so. Online shoppers are aware of some of the discouraging features of online shopping, but these features do not deter them from shopping online. The implication for marketers is that they should focus on making the experience of online shopping more accommodating and more user-friendly since the positive features of online shopping ("convenience" and "efficiency") appear to be more important than the negative features ("effort/impersonality"). PMID:15006173

  2. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking post New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads before final elimination could be considered of the last few remaining warheads and weapons. This paper will focus on these three threshold reduction levels: 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  3. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordination. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  4. Measuring verification device error rates

    SciTech Connect

    Watson, J.G.; Johnstone, I.M.; Driscoll, E.C. Jr.

    1987-07-01

A verification device generates a Type I (II) error when it recommends to reject (accept) a valid (false) identity claim. For a given identity, the rates or probabilities of these errors quantify random variations of the device from claim to claim. These are intra-identity variations. To some degree, these rates depend on the particular identity being challenged, and there exists a distribution of error rates characterizing inter-identity variations. However, for most security system applications we only need to know averages of this distribution. These averages are called the pooled error rates. In this paper the authors present the statistical underpinnings for the measurement of pooled Type I and Type II error rates. The authors consider a conceptual experiment, "a crate of biased coins". This model illustrates the effects of sampling both within trials of the same individual and among trials from different individuals. Application of this simple model to verification devices yields pooled error rate estimates and confidence limits for these estimates. A sample certification procedure for verification devices is given in the appendix.
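The "crate of biased coins" model can be simulated directly: inter-identity variation is a distribution over per-coin bias, and intra-identity variation is binomial sampling within each coin. A sketch with an arbitrary Beta prior for the bias distribution (the prior and its parameters are illustrative, not from the paper):

```python
import random

def pooled_error_rate(n_identities, trials_per_identity,
                      alpha=2.0, beta=8.0, seed=1):
    """Each identity (coin) has its own error probability
    p_i ~ Beta(alpha, beta) -- inter-identity variation; each trial
    for that identity errs with probability p_i -- intra-identity
    variation. Returns the pooled error-rate estimate over all trials."""
    rng = random.Random(seed)
    errors = total = 0
    for _ in range(n_identities):
        p_i = rng.betavariate(alpha, beta)
        for _ in range(trials_per_identity):
            errors += rng.random() < p_i   # bool counts as 0/1
            total += 1
    return errors / total
```

With many identities the pooled estimate converges to the mean of the bias distribution, alpha/(alpha+beta), which is exactly the "average of the distribution" the abstract says most security applications need.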

  5. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. 
Regression verification addresses the problem of proving equivalence of closely related program versions [19]. These techniques compare two programs with a large degree of syntactic similarity to prove that portions of one program version are equivalent to the other. Regression verification can be used for guaranteeing backward compatibility, and for showing behavioral equivalence in programs with syntactic differences, e.g., when a program is refactored to improve its performance, maintainability, or readability. Existing regression verification techniques leverage similarities between program versions by using abstraction and decomposition techniques to improve scalability of the analysis [10, 12, 19]. The abstractions and decomposition in these techniques, e.g., summaries of unchanged code [12] or semantically equivalent methods [19], compute an over-approximation of the program behaviors. The equivalence checking results of these techniques are sound but not complete-they may characterize programs as not functionally equivalent when, in fact, they are equivalent. In this work we describe a novel approach that leverages the impact of the differences between two programs for scaling regression verification. We partition program behaviors of each version into (a) behaviors impacted by the changes and (b) behaviors not impacted (unimpacted) by the changes. Only the impacted program behaviors are used during equivalence checking. We then prove that checking equivalence of the impacted program behaviors is equivalent to checking equivalence of all program behaviors for a given depth bound. In this work we use symbolic execution to generate the program behaviors and leverage control- and data-dependence information to facilitate the partitioning of program behaviors. The impacted program behaviors are termed as impact summaries. 
The dependence analyses that facilitate the generation of the impact summaries, we believe, could be used in conjunction with other abstraction and decomposition based approaches, [10, 12], as a complementary reduction technique. An evaluation of our regression verification technique shows that our approach is capable of leveraging similarities between program versions to reduce the size of the queries and the time required to check for logical equivalence. The main contributions of this work are: - A regression verification technique to generate impact summaries that can be checked for functional equivalence using an off-the-shelf decision procedure. - A proof that our approach is sound and complete with respect to the depth bound of symbolic execution. - An implementation of our technique using the LLVM compiler infrastructure, the klee Symbolic Virtual Machine [4], and a variety of Satisfiability Modulo Theory (SMT) solvers, e.g., STP [7] and Z3 [6]. - An empirical evaluation on a set of C artifacts which shows that the use of impact summaries can reduce the cost of regression verification.
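The impact-partitioning idea can be illustrated with a toy bounded equivalence check: classify inputs as impacted or unimpacted by the change, and compare the two versions only on the impacted partition. Real tools use symbolic execution and an SMT solver rather than enumeration; the versions and predicate below are hypothetical:

```python
from itertools import product

def bounded_equivalence(f_old, f_new, impacted, domain):
    """Compare two program versions over a bounded input domain,
    skipping inputs classified as unimpacted by the change (mirroring
    the impact-summary idea at a toy scale)."""
    for args in domain:
        if impacted(*args) and f_old(*args) != f_new(*args):
            return False, args      # counterexample found
    return True, None

def v1(x, y):
    return x + y if x >= 0 else x - y

def v2(x, y):                       # refactored only on the x >= 0 branch
    return y + x if x >= 0 else x - y
```

Because the change touches only the `x >= 0` branch, checking the impacted partition suffices: equivalence on those behaviors implies equivalence on all behaviors within the bound, which is the soundness-and-completeness claim of the paper in miniature.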

  6. Analyzing Online Behaviors, Roles, and Learning Communities via Online Discussions

    ERIC Educational Resources Information Center

    Yeh, Yu-Chu

    2010-01-01

Online learning communities are an important means of sharing and creating knowledge. Online behaviors and online roles can reveal how online learning communities function. However, few studies have elucidated the relationships among online behaviors, online roles, and online learning communities. In this study, 32 preservice teachers participated in…

  7. Gender verification in competitive sports.

    PubMed

    Simpson, J L; Ljungqvist, A; de la Chapelle, A; Ferguson-Smith, M A; Genel, M; Carlson, A S; Ehrhardt, A A; Ferris, E

    1993-11-01

    The possibility that men might masquerade as women and be unfair competitors in women's sports is accepted as outrageous by athletes and the public alike. Since the 1930s, media reports have fuelled claims that individuals who once competed as female athletes subsequently appeared to be men. In most of these cases there was probably ambiguity of the external genitalia, possibly as a result of male pseudohermaphroditism. Nonetheless, beginning at the Rome Olympic Games in 1960, the International Amateur Athletics Federation (IAAF) began establishing rules of eligibility for women athletes. Initially, physical examination was used as a method for gender verification, but this plan was widely resented. Thus, sex chromatin testing (buccal smear) was introduced at the Mexico City Olympic Games in 1968. The principle was that genetic females (46,XX) show a single X-chromatic mass, whereas males (46,XY) do not. Unfortunately, sex chromatin analysis fell out of common diagnostic use by geneticists shortly after the International Olympic Committee (IOC) began its implementation for gender verification. The lack of laboratories routinely performing the test aggravated the problem of errors in interpretation by inexperienced workers, yielding false-positive and false-negative results. However, an even greater problem is that there exist phenotypic females with male sex chromatin patterns (e.g. androgen insensitivity, XY gonadal dysgenesis). These individuals have no athletic advantage as a result of their congenital abnormality and reasonably should not be excluded from competition. That is, only the chromosomal (genetic) sex is analysed by sex chromatin testing, not the anatomical or psychosocial status. For all the above reasons sex chromatin testing unfairly excludes many athletes. Although the IOC offered follow-up physical examinations that could have restored eligibility for those 'failing' sex chromatin tests, most affected athletes seemed to prefer to 'retire'. 
All these problems remain with the current laboratory based gender verification test, polymerase chain reaction based testing of the SRY gene, the main candidate for male sex determination. Thus, this 'advance' in fact still fails to address the fundamental inequities of laboratory based gender verification tests. The IAAF considered the issue in 1991 and 1992, and concluded that gender verification testing was not needed. This was thought to be especially true because of the current use of urine testing to exclude doping: voiding is observed by an official in order to verify that a sample from a given athlete has actually come from his or her urethra. That males could masquerade as females in these circumstances seems extraordinarily unlikely. Screening for gender is no longer undertaken at IAAF competitions. PMID:8272686

  8. CAMPR3: a database on sequences, structures and signatures of antimicrobial peptides

    PubMed Central

    Waghu, Faiza Hanif; Barai, Ram Shankar; Gurung, Pratima; Idicula-Thomas, Susan

    2016-01-01

Antimicrobial peptides (AMPs) are known to have family-specific sequence composition, which can be mined for discovery and design of AMPs. Here, we present CAMPR3, an update to the existing CAMP database available online at www.camp3.bicnirrh.res.in. It is a database of sequences, structures and family-specific signatures of prokaryotic and eukaryotic AMPs. Family-specific sequence signatures comprising patterns and Hidden Markov Models were generated for 45 AMP families by analysing 1386 experimentally studied AMPs. These were further used to retrieve AMPs from online sequence databases. More than 4000 AMPs could be identified using these signatures. AMP family signatures provided in CAMPR3 can thus be used to accelerate and expand the discovery of AMPs. CAMPR3 presently holds 10247 sequences, 757 structures and 114 family-specific signatures of AMPs. Users can use the sequence optimization algorithm for the rational design of AMPs. The database, integrated with tools for AMP sequence and structure analysis, will be a valuable resource for family-based studies on AMPs. PMID:26467475
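
As a rough illustration of how family-specific sequence signatures can retrieve candidate AMPs, the sketch below scans sequences against regular-expression motifs. The two patterns are invented stand-ins for this example, not actual CAMP signatures (which also include Hidden Markov Models):

```python
import re

# Hypothetical family signatures in a PROSITE-like style, for illustration
# only; real CAMP signatures are curated from experimentally studied AMPs.
SIGNATURES = {
    "defensin_like": re.compile(r"C.{2,6}C.{3,9}C"),  # spaced cysteine motif
    "cationic_rich": re.compile(r"[KR]{3,}"),         # run of basic residues
}

def scan_for_amp_signatures(sequence, signatures=SIGNATURES):
    """Return the names of all family signatures matched by a sequence."""
    return [name for name, pat in signatures.items() if pat.search(sequence)]

# A toy cysteine-rich, cationic peptide matches both illustrative families.
print(scan_for_amp_signatures("GKKKRCAAFCLRRGCS"))
```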

  9. CAMPR3: a database on sequences, structures and signatures of antimicrobial peptides.

    PubMed

    Waghu, Faiza Hanif; Barai, Ram Shankar; Gurung, Pratima; Idicula-Thomas, Susan

    2016-01-01

Antimicrobial peptides (AMPs) are known to have family-specific sequence composition, which can be mined for discovery and design of AMPs. Here, we present CAMPR3, an update to the existing CAMP database available online at www.camp3.bicnirrh.res.in. It is a database of sequences, structures and family-specific signatures of prokaryotic and eukaryotic AMPs. Family-specific sequence signatures comprising patterns and Hidden Markov Models were generated for 45 AMP families by analysing 1386 experimentally studied AMPs. These were further used to retrieve AMPs from online sequence databases. More than 4000 AMPs could be identified using these signatures. AMP family signatures provided in CAMPR3 can thus be used to accelerate and expand the discovery of AMPs. CAMPR3 presently holds 10247 sequences, 757 structures and 114 family-specific signatures of AMPs. Users can use the sequence optimization algorithm for the rational design of AMPs. The database, integrated with tools for AMP sequence and structure analysis, will be a valuable resource for family-based studies on AMPs. PMID:26467475

  10. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
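
A fault signature as described above, i.e. a structured representation of the evidence an expert would use to detect and verify a fault, can be sketched as a small catalog with a symptom-overlap matcher. The asset type, fault names, and symptom fields below are illustrative assumptions, not the actual AFS Database schema:

```python
from dataclasses import dataclass, field

# Illustrative catalog entry for a fault signature; the field names are
# assumptions for this sketch, not the FW-PHM Suite schema.
@dataclass
class FaultSignature:
    asset_type: str
    fault_name: str
    symptoms: set = field(default_factory=set)  # observable indicators

CATALOG = [
    FaultSignature("GSU transformer", "winding insulation degradation",
                   {"dissolved_gas_high", "hot_spot_temp_rise"}),
    FaultSignature("GSU transformer", "cooling system fault",
                   {"top_oil_temp_rise", "fan_current_low"}),
]

def rank_faults(asset_type, observed):
    """Rank catalog faults for an asset by fraction of matching symptoms."""
    scored = [(len(sig.symptoms & observed) / len(sig.symptoms), sig.fault_name)
              for sig in CATALOG if sig.asset_type == asset_type]
    return sorted(scored, reverse=True)

print(rank_faults("GSU transformer", {"dissolved_gas_high", "hot_spot_temp_rise"}))
```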

  11. Validation of an online replanning technique for prostate adaptive radiotherapy

    NASA Astrophysics Data System (ADS)

    Peng, Cheng; Chen, Guangpei; Ahunbay, Ergun; Wang, Dian; Lawton, Colleen; Li, X. Allen

    2011-06-01

We have previously developed an online adaptive replanning technique to rapidly adapt the original plan according to daily CT. This paper reports the quality assurance (QA) developments in its clinical implementation for prostate cancer patients. A series of pre-clinical validation tests were carried out to verify the overall accuracy and consistency of the online replanning procedure. These tests include (a) phantom measurements of 22 individual patient adaptive plans to verify their accuracy and deliverability and (b) efficiency and applicability of the online replanning process. A four-step QA procedure was established to ensure the safe and accurate delivery of an adaptive plan, including (1) offline phantom measurement of the original plan, (2) online independent monitor unit (MU) calculation for a redundancy check, (3) online verification of plan-data transfer using in-house software and (4) offline validation of actually delivered beam parameters. The pre-clinical validations demonstrate that the newly implemented online replanning technique is dosimetrically accurate and practically efficient. The four-step QA procedure is capable of identifying possible errors in the process of online adaptive radiotherapy and of ensuring the safe and accurate delivery of the adaptive plans. Based on the success of this work, the online replanning technique has been used in the clinic to correct for interfractional changes during prostate radiation therapy.
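
Step (2) of the QA procedure, the independent monitor-unit redundancy check, amounts to comparing two MU calculations against a tolerance. A minimal sketch, with an assumed 3% action level (clinics set their own):

```python
def mu_redundancy_check(plan_mu, independent_mu, tolerance_pct=3.0):
    """Flag a beam if the independent MU calculation deviates from the
    treatment planning system value by more than tolerance_pct percent."""
    deviation = abs(independent_mu - plan_mu) / plan_mu * 100.0
    return deviation <= tolerance_pct, deviation

# Example per-beam check for an adaptive plan; beam names and MU values
# are made up for illustration.
for beam, (tps, indep) in {"beam1": (120.0, 121.5), "beam2": (95.0, 101.0)}.items():
    ok, dev = mu_redundancy_check(tps, indep)
    print(beam, "PASS" if ok else "FAIL", round(dev, 2))
```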

  12. The Global Diffusion of Societal Verification Tools: A Quantitative Assessment of the Public’s Ability to Engage Nonproliferation Treaty Monitoring

    SciTech Connect

    Sayre, Amanda M.; Kreyling, Sean J.; West, Curtis L.

    2015-07-11

    The spread of nuclear and dual-use technologies and the need for more robust, effective and efficient nonproliferation and arms control treaties has led to an increasing need for innovative verification approaches and technologies. This need, paired with advancements in online computing, mobile devices, commercially available satellite imagery and the evolution of online social networks, has led to a resurgence of the concept of societal verification for arms control and nonproliferation treaties. In the event a country accepts its citizens’ assistance in supporting transparency, confidence-building and societal verification, the host government will need a population that is willing and able to participate. While scholarly interest in societal verification continues to grow, social scientific research on the topic is lacking. The aim of this paper is to begin the process of understanding public societal verification capabilities, extend the availability of quantitative research on societal verification and set in motion complementary research to increase the breadth and depth of knowledge on this topic. This paper presents a potential framework and outlines a research roadmap for the development of such a societal verification capability index.
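
A capability index of the kind the roadmap calls for could, in its simplest form, be a weighted combination of normalized indicators. The indicator names and weights below are purely illustrative assumptions, not the framework proposed in the paper:

```python
# Minimal sketch of a societal verification capability index as a weighted
# sum of indicators already scaled to [0, 1].
def capability_index(indicators, weights):
    """Combine [0, 1]-scaled indicators into a single [0, 1] index."""
    assert set(indicators) == set(weights)
    total = sum(weights.values())
    return sum(weights[k] * indicators[k] for k in indicators) / total

# Hypothetical country profile; values and weights are invented.
country = {"internet_penetration": 0.9, "mobile_access": 0.8,
           "social_media_use": 0.6, "press_freedom": 0.5}
weights = {"internet_penetration": 2.0, "mobile_access": 1.0,
           "social_media_use": 1.0, "press_freedom": 1.0}
print(round(capability_index(country, weights), 3))
```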

  13. Application of computer vision to automatic prescription verification in pharmaceutical mail order

    NASA Astrophysics Data System (ADS)

    Alouani, Ali T.

    2005-05-01

In large-volume pharmaceutical mail order, before shipping out prescriptions, licensed pharmacists ensure that the drug in the bottle matches the information provided in the patient prescription. Typically, the pharmacist has about 2 sec to complete the verification of one prescription. Performing about 1800 prescription verifications per hour is tedious and can generate human errors as a result of visual and brain fatigue. Available automatic drug verification systems are limited to a single pill at a time. This is not suitable for large-volume pharmaceutical mail order, where a prescription can have as many as 60 pills and where thousands of prescriptions are filled every day. In an attempt to reduce human fatigue and cost and to limit human error, the automatic prescription verification system (APVS) was invented to meet the needs of large-scale pharmaceutical mail order. This paper deals with the design and implementation of the first prototype online automatic prescription verification machine to perform the same task currently done by a pharmacist. The emphasis here is on the visual aspects of the machine. The system has been successfully tested on 43,000 prescriptions.

  14. A Quantum Proxy Weak Blind Signature Scheme

    NASA Astrophysics Data System (ADS)

    Cao, Hai-Jing; Zhu, Yan-Yan; Li, Peng-Fei

    2013-09-01

We present a weak blind signature scheme based on a genuinely entangled six-qubit state. Different from classical blind signature schemes and current quantum signature schemes, our quantum weak blind signature scheme guarantees not only unconditional security but also the anonymity of the message owner. To achieve this, quantum key distribution and the one-time pad are adopted in our scheme. Our scheme has the characteristics of both classical and quantum security.
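
The classical ingredient of the scheme, the one-time pad, can be sketched as byte-wise XOR; here a locally generated random key stands in for the key that quantum key distribution would supply:

```python
import secrets

# One-time-pad step only (the quantum key distribution that would supply
# the shared key is outside the scope of this sketch).
def otp_encrypt(message: bytes, key: bytes) -> bytes:
    assert len(key) == len(message), "one-time pad key must match message length"
    return bytes(m ^ k for m, k in zip(message, key))

message = b"blind signature payload"
key = secrets.token_bytes(len(message))   # stand-in for a QKD-derived key
cipher = otp_encrypt(message, key)
print(otp_encrypt(cipher, key) == message)  # XOR is its own inverse
```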

  15. Quantum group blind signature scheme without entanglement

    NASA Astrophysics Data System (ADS)

    Xu, Rui; Huang, Liusheng; Yang, Wei; He, Libao

    2011-07-01

In this paper we propose a quantum group blind signature scheme designed for a distributed e-voting system. Our scheme combines the properties of group signatures and blind signatures to provide anonymity of voters in an e-voting system. The unconditional security of our scheme is ensured by quantum mechanics. Because it does not employ entanglement, the proposed scheme is easier to realize than other quantum signature schemes.

  16. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  17. ENVIRONMENTAL TECHNOLOGY VERIFICATION COATINGS AND COATING EQUIPMENT PROGRAM (ETV CCEP): LIQUID COATINGS--GENERIC VERIFICATION PROTOCOL

    EPA Science Inventory

    This report is a generic verification protocol or GVP which provides standards for testing liquid coatings for their enviornmental impacts under the Environmental Technology Verification program. It provides generic guidelines for product specific testing and quality assurance p...

18. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION: GENERIC VERIFICATION PROTOCOL FOR BIOLOGICAL AND AEROSOL TESTING OF GENERAL VENTILATION AIR CLEANERS

    EPA Science Inventory

    The U.S. Environmental Protection Agency established the Environmental Technology Verification Program to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of product performance. Research Triangl...

  1. Irma 5.2 multi-sensor signature prediction model

    NASA Astrophysics Data System (ADS)

    Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Chow, Anthony; Yamaoka, Neil; Kim, Charles

    2007-04-01

    The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/MN) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after an extensive verification and validation of an upgraded and reengineered active channel. Since 2005, the reengineering effort has focused on the Irma passive channel. Field measurements for the validation effort include the unpolarized data collection. Irma 5.2 is scheduled for release in the summer of 2007. This paper will report the validation test results of the Irma passive models and discuss the new features in Irma 5.2.

  2. Irma 5.2 multi-sensor signature prediction model

    NASA Astrophysics Data System (ADS)

    Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Pau, John

    2008-04-01

    The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/RW) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after extensive verification and validation of an upgraded and reengineered ladar channel. The reengineering effort then shifted focus to the Irma passive channel. Field measurements for the validation effort include both polarized and unpolarized data collection. Irma 5.2 was released in 2007 with a reengineered passive channel. This paper summarizes the capabilities of Irma and the progress toward Irma 5.3, which includes a reengineered radar channel.

  3. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

Methodologies are discussed for four of the major approaches to program upgrading, namely dynamic testing, symbolic execution, formal verification and static analysis. The different patterns of strengths, weaknesses and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification and documentation functions.

  4. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  5. 21 CFR 11.50 - Signature manifestations.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Signature manifestations. 11.50 Section 11.50 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.50 Signature manifestations. (a) Signed electronic records shall contain information...

  6. 21 CFR 11.50 - Signature manifestations.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Signature manifestations. 11.50 Section 11.50 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.50 Signature manifestations. (a) Signed electronic records shall contain information...

  7. 17 CFR 232.302 - Signatures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 17 Commodity and Securities Exchanges 2 2010-04-01 2010-04-01 false Signatures. 232.302 Section 232.302 Commodity and Securities Exchanges SECURITIES AND EXCHANGE COMMISSION REGULATION S-T-GENERAL RULES AND REGULATIONS FOR ELECTRONIC FILINGS Preparation of Electronic Submissions 232.302 Signatures. (a) Required signatures to, or within,...

  8. 48 CFR 4.102 - Contractor's signature.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 48 Federal Acquisition Regulations System 1 2012-10-01 2012-10-01 false Contractor's signature. 4... ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with an... be signed by that individual, and the signature shall be followed by the individual's typed,...

  9. Dust devil signatures in infrasound records of the International Monitoring System

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.; Christie, Douglas

    2015-03-01

    We explore whether dust devils have a recognizable signature in infrasound array records, since several Comprehensive Nuclear-Test-Ban Treaty verification stations conducting continuous measurements with microbarometers are in desert areas which see dust devils. The passage of dust devils (and other boundary layer vortices, whether dust laden or not) causes a local temporary drop in pressure: the high-pass time domain filtering in microbarometers results in a "heartbeat" signature, which we observe at the Warramunga station in Australia. We also observe a ~50 min pseudoperiodicity in the occurrence of these signatures and some higher-frequency infrasound. Dust devils do not significantly degrade the treaty verification capability. The pipe arrays for spatial averaging used in infrasound monitoring degrade the detection efficiency of small devils, but the long observation time may allow a useful census of large vortices, and thus, the high-sensitivity infrasonic array data from the monitoring network can be useful in studying columnar vortices in the lower atmosphere.
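
The "heartbeat" signature can be reproduced with a toy model: a passing vortex causes a temporary Gaussian pressure dip, and a first-order high-pass filter (standing in for the microbarometer's time-domain filtering) turns the dip into a biphasic pulse. All parameters below are illustrative:

```python
import math

# Toy dust-devil encounter: a Gaussian pressure dip of ~50 Pa over tens of
# seconds, passed through a discrete first-order high-pass filter.
dt, tau = 0.5, 30.0                    # sample interval (s), filter time constant (s)
alpha = tau / (tau + dt)
t = [i * dt for i in range(481)]       # four minutes of samples
dip = [-50.0 * math.exp(-((ti - 120.0) / 15.0) ** 2) for ti in t]  # Pa

hp = [0.0]
for n in range(1, len(dip)):
    # y[n] = alpha * (y[n-1] + x[n] - x[n-1]): standard RC high-pass update
    hp.append(alpha * (hp[-1] + dip[n] - dip[n - 1]))

# The filtered trace swings below zero and then overshoots above zero as
# the dip passes: the biphasic "heartbeat" shape.
print(round(min(hp), 1), round(max(hp), 1))
```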

  10. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.
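
The idea of encoding verification conditions as Horn clauses can be illustrated on a toy loop. The sketch below brute-force checks, over a small finite domain, the three Horn-style conditions (initiation, consecution, safety) for a candidate invariant of the loop `i = 0; while i < n: i += 1` with postcondition `i == n`. This is a hand-rolled illustration of the concept, not SeaHorn's actual pipeline:

```python
from itertools import product

# Candidate inductive invariant for the loop; an assumption for this sketch.
def inv(i, n):
    return 0 <= i <= n

def check_vcs(domain):
    """Check the three Horn-clause verification conditions exhaustively."""
    for i, n in product(domain, repeat=2):
        if n >= 0 and not inv(0, n):                   # init    => Inv(0, n)
            return False
        if inv(i, n) and i < n and not inv(i + 1, n):  # Inv & guard => Inv(i+1, n)
            return False
        if inv(i, n) and not i < n and i != n:         # Inv & !guard => i == n
            return False
    return True

print(check_vcs(range(0, 6)))
```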

  11. Verification Technologies, first/second quarters 1992

    SciTech Connect

Staehle, G.; Talaber, C.; Stull, S.

    1992-01-01

    The purpose of this document is to enhance communication between the technologists who study the means to verify compliance with the policy makers who negotiate treaties. Topics discussed include: microwave dielectrometer for chemical weapon (cw) verification; fourier-transform ultrasonic interrogation of sealed chemical and conventional munitions; low-power portable x-ray system for chemical treaty verification; portable isotopic neutron spectroscopy for nondestructive evaluation of cw; laser acoustic spectroscopy for cw verification; ion-tube neutron spectroscopy; frequency response method; acoustic resonance spectroscopy; ultrasonic pulse-echo measurements for cw verification; and investigating fluid-filled munitions with inertial damping.

  12. Verification Technologies, first/second quarters 1992

    SciTech Connect

Staehle, G.; Talaber, C.; Stull, S.

    1992-10-01

    The purpose of this document is to enhance communication between the technologists who study the means to verify compliance with the policy makers who negotiate treaties. Topics discussed include: microwave dielectrometer for chemical weapon (cw) verification; fourier-transform ultrasonic interrogation of sealed chemical and conventional munitions; low-power portable x-ray system for chemical treaty verification; portable isotopic neutron spectroscopy for nondestructive evaluation of cw; laser acoustic spectroscopy for cw verification; ion-tube neutron spectroscopy; frequency response method; acoustic resonance spectroscopy; ultrasonic pulse-echo measurements for cw verification; and investigating fluid-filled munitions with inertial damping.

  13. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  14. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept, in which a master data-verification program uses multiple special-purpose subroutines and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed that would take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
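
A master verification program driven by a screen file of criteria might look like the following sketch, which flags values that fail a plausible-range check or a rate-of-change check. The parameter name and thresholds are invented for illustration, not USGS screen-file values:

```python
# Illustrative "screen file": per-parameter verification criteria.
SCREEN = {"stage_ft": {"range": (0.0, 30.0), "max_step": 2.0}}

def verify_series(param, values):
    """Return indices of values failing range or rate-of-change criteria."""
    crit, flagged = SCREEN[param], []
    lo, hi = crit["range"]
    for i, v in enumerate(values):
        if not lo <= v <= hi:
            flagged.append(i)                       # outside plausible range
        elif i > 0 and abs(v - values[i - 1]) > crit["max_step"]:
            flagged.append(i)                       # implausibly fast change
    return flagged

# Value 3 is out of range; value 4 jumps too fast from its neighbor.
print(verify_series("stage_ft", [4.1, 4.3, 4.2, 35.0, 9.9]))
```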

  15. Acceptance sampling methods for sample results verification

    SciTech Connect

    Jesse, C.A.

    1993-06-01

This report proposes a statistical sampling method for use during the sample results verification portion of the validation of data packages. In particular, this method was derived specifically for the validation of data packages for metals target analyte analysis performed under United States Environmental Protection Agency Contract Laboratory Program protocols, where sample results verification can be quite time-consuming. The purpose of such a statistical method is to provide options in addition to the 'all or nothing' options that currently exist for sample results verification. The proposed method allows the amount of data validated during the sample results verification process to be based on a balance between risks and the cost of inspection.
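
A single-sampling acceptance plan of the kind proposed (inspect n sample results, accept the package if at most c fail verification) has a standard binomial operating characteristic. The plan parameters below are illustrative, not those derived in the report:

```python
from math import comb

def accept_probability(n, c, p_defective):
    """P(accept) = P(at most c defectives in a random sample of n)."""
    return sum(comb(n, k) * p_defective**k * (1 - p_defective)**(n - k)
               for k in range(c + 1))

# With an illustrative plan n=13, c=1: packages with few defective results
# (2%) are almost always accepted; badly flawed packages (20%) usually are not.
print(round(accept_probability(13, 1, 0.02), 3))
print(round(accept_probability(13, 1, 0.20), 3))
```

This trade-off between the two acceptance probabilities is exactly the "balance between risks and the cost of inspection" the abstract refers to.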

  16. Polarization signatures of airborne particulates

    NASA Astrophysics Data System (ADS)

    Raman, Prashant; Fuller, Kirk A.; Gregory, Don A.

    2013-07-01

    Exploratory research has been conducted with the aim of completely determining the polarization signatures of selected particulates as a function of wavelength. This may lead to a better understanding of the interaction between electromagnetic radiation and such materials, perhaps leading to the point detection of bio-aerosols present in the atmosphere. To this end, a polarimeter capable of measuring the complete Mueller matrix of highly scattering samples in transmission and reflection (with good spectral resolution from 300 to 1100 nm) has been developed. The polarization properties of Bacillus subtilis (surrogate for anthrax spore) are compared to ambient particulate matter species such as pollen, dust, and soot. Differentiating features in the polarization signatures of these samples have been identified, thus demonstrating the potential applicability of this technique for the detection of bio-aerosol in the ambient atmosphere.

  17. Evolutionary Signatures of River Networks

    NASA Astrophysics Data System (ADS)

    Paik, K.

    2014-12-01

River networks exhibit fractal characteristics, and it has long been wondered how such regular patterns are formed. This subject has been actively investigated mainly by two schools of thought: chance and organization. Along this line, several fundamental questions have been partially addressed or remain open, including whether river networks pursue certain optimal conditions and, if so, what the ultimate optimality signature is. Hydrologists have traditionally perceived this issue from fluvially oriented perspectives. Nevertheless, geological processes can in reality be more dominant in the formation of river networks. To shed new light on this subject, it is necessary to better understand the complex feedbacks between various processes over different time scales, and eventually the emerging characteristic signature. Here, I present highlights of earlier studies along this line and some noteworthy recent approaches.

  18. Odor signatures and kin recognition.

    PubMed

    Porter, R H; Cernoch, J M; Balogh, R D

    1985-03-01

The basis of olfactory signatures mediating human kin recognition was investigated in two experiments. The odors of mothers and offspring were correctly matched (by subjects unfamiliar with the stimulus individuals) at a greater than chance frequency. In contrast, subjects were not reliably able to match the odors of husbands and wives. These data support the hypotheses that characteristic individual odors are genetically mediated and that kin recognition should be facilitated by the similarity of such familiar odors among close relatives. PMID:4011726
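
The "greater than chance frequency" claim rests on a standard one-sided binomial test. The sketch below shows the computation with made-up trial counts; the paper's actual design and numbers are not reproduced here:

```python
from math import comb

def binomial_p_value(n, s, p_chance):
    """One-sided p-value: P(at least s successes in n trials under chance)."""
    return sum(comb(n, k) * p_chance**k * (1 - p_chance)**(n - k)
               for k in range(s, n + 1))

# Hypothetical design: 20 subjects each pick one of 3 odor stimuli
# (chance = 1/3); 13 correct matches would be well above chance.
p = binomial_p_value(20, 13, 1 / 3)
print(p < 0.01)
```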

  19. Developments in Signature Process Control

    NASA Astrophysics Data System (ADS)

    Keller, L. B.; Dominski, Marty

    1993-01-01

Developments in the adaptive process control technique known as Signature Process Control for Advanced Composites (SPCC) are described. This computer control method for autoclave processing of composites was used to develop an optimum cure cycle for AFR 700B polyimide and for an experimental poly-isoimide. An improved process cycle was developed for Avimid N polyimide. The potential for extending the SPCC technique to pre-preg quality control, press molding, pultrusion and RTM is briefly discussed.

  20. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. 
When developing or improving turbulence models, both verification and validation are important steps in the process. Verification ensures that the CFD code is solving the equations as intended (no errors in the implementation). This is typically done either through the method of manufactured solutions (MMS) or through careful step-by-step comparisons with other verified codes. After the verification step is concluded, validation is performed to document the ability of the turbulence model to represent different types of flow physics. Validation can involve a large number of test case comparisons with experiments, theory, or DNS. Organized workshops have proved to be valuable resources for the turbulence modeling community in its pursuit of turbulence modeling verification and validation. Workshop contributors using different CFD codes run the same cases, often according to strict guidelines, and compare results. Through these comparisons, it is often possible to (1) identify codes that have likely implementation errors, and (2) gain insight into the capabilities and shortcomings of different turbulence models to predict the flow physics associated with particular types of flows. These are valuable lessons because they help bring consistency to CFD codes by encouraging the correction of faulty programming and facilitating the adoption of better models. They also sometimes point to specific areas needing improvement in the models. In this paper, several recent workshops are summarized primarily from the point of view of turbulence modeling verification and validation. Furthermore, the NASA Langley Turbulence Modeling Resource website is described. The purpose of this site is to provide a central location where RANS turbulence models are documented, and test cases, grids, and data are provided.
The goal of this paper is to provide an abbreviated survey of turbulence modeling verification and validation efforts, summarize some of the outcomes, and give some ideas for future endeavors in this area.
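The method of manufactured solutions mentioned in this abstract can be sketched in a few lines: choose an exact solution, derive the source term it implies, solve on successively finer grids, and check that the observed order of accuracy matches the scheme's design order. The 1-D Poisson problem and second-order discretization below are illustrative choices, not tied to any particular CFD code.

```python
import numpy as np

def solve_poisson(n, f, ua, ub):
    """Solve -u'' = f on (0, 1) with Dirichlet BCs u(0)=ua, u(1)=ub,
    using second-order central differences on n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (np.diag(2.0 * np.ones(n))
         - np.diag(np.ones(n - 1), 1)
         - np.diag(np.ones(n - 1), -1)) / h**2
    b = f(x)
    b[0] += ua / h**2     # fold boundary values into the right-hand side
    b[-1] += ub / h**2
    return x, np.linalg.solve(A, b)

# Manufactured solution u(x) = sin(pi x); it implies the source f = pi^2 sin(pi x).
u_exact = lambda x: np.sin(np.pi * x)
f = lambda x: np.pi**2 * np.sin(np.pi * x)

errors = []
for n in (32, 64):
    x, u = solve_poisson(n, f, 0.0, 0.0)
    errors.append(np.max(np.abs(u - u_exact(x))))

# Observed order of accuracy: log of the error ratio over log of the
# grid-spacing ratio; should be close to 2 for this second-order scheme.
order = np.log(errors[0] / errors[1]) / np.log(65.0 / 33.0)
```

A verified implementation reproduces the design order; a pronounced deficit (for example, first-order convergence from a nominally second-order scheme) usually flags an implementation error.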

  1. Microbial Lifestyle and Genome Signatures

    PubMed Central

    Dutta, Chitra; Paul, Sandip

    2012-01-01

Microbes are known for their unique ability to adapt to varying lifestyles and environments, even extreme or adverse ones. The genomic architecture of a microbe may bear the signatures not only of its phylogenetic position, but also of the kind of lifestyle to which it is adapted. The present review aims to provide an account of the specific genome signatures observed in microbes acclimatized to distinct lifestyles or ecological niches. Niche-specific signatures identified at different levels of microbial genome organization, such as base composition, GC-skew, purine-pyrimidine ratio, dinucleotide abundance, codon bias, and oligonucleotide composition, are discussed. Among the specific cases highlighted in the review are the phenomena of genome shrinkage in obligately host-restricted microbes, genome expansion in strictly intra-amoebal pathogens, strand-specific codon usage in intracellular species, acquisition of genome islands in pathogenic or symbiotic organisms, discriminatory genomic traits of marine microbes with distinct trophic strategies, and conspicuous sequence features of certain extremophiles such as those adapted to high temperature or high salinity. PMID:23024607

  2. Functional Role of Ribosomal Signatures

    PubMed Central

    Chen, Ke; Eargle, John; Sarkar, Krishnarjun; Gruebele, Martin; Luthey-Schulten, Zaida

    2010-01-01

Although structure and sequence signatures in ribosomal RNA and proteins are defining characteristics of the three domains of life and instrumental in constructing the modern phylogeny, little is known about their functional roles in the ribosome. In this work, the largest coevolving RNA/protein signatures in the bacterial 30S ribosome are investigated both experimentally and computationally through all-atom molecular-dynamics simulations. The complex includes the N-terminal fragment of the ribosomal protein S4, which is a primary binding protein that initiates 30S small subunit assembly from the 5′ domain, and helix 16 (h16), which is part of the five-way junction in 16S rRNA. Our results show that the S4 N-terminus signature is intrinsically disordered in solution, whereas h16 is relatively stable by itself. The dynamic disordered property of the protein is exploited to couple the folding and binding process to the five-way junction, and the results provide insight into the mechanism for the early and fast binding of S4 in the assembly of the ribosomal small subunit. PMID:21156135

  3. Trojan technical specification verification project

    SciTech Connect

    Bates, L.; Rickenback, M.

    1991-01-01

    The Trojan Technical Specification Verification (TTSV) project at the Trojan plant of Portland General Electric Company was motivated by the recognition that many numbers in the Trojan technical specifications (TTS) potentially lacked the consideration of instrument- and/or process-related errors. The plant setpoints were known to consider such errors, but many of the values associated with the limiting conditions for operation (LCO) did not. In addition, the existing plant instrument error analyses were based on industry values that do not reflect the Trojan plant-specific experience. The purpose of this project is to ensure that the Trojan plant setpoint and LCO values include plant-specific instrument error.

  4. Conformance Verification of Privacy Policies

    NASA Astrophysics Data System (ADS)

    Fu, Xiang

Web applications are both consumers and providers of information. To increase customer confidence, many websites choose to publish their privacy protection policies. However, policy conformance is often neglected. We propose a logic-based framework for formally specifying and reasoning about the implementation of privacy protection by a web application. A first-order extension of computation tree logic is used to specify a policy. A verification paradigm, built upon a static control/data flow analysis, is presented to verify whether a policy is satisfied.

  5. SHIELD verification and validation report

    SciTech Connect

    Boman, C.

    1992-02-01

    This document outlines the verification and validation effort for the SHIELD, SHLDED, GEDIT, GENPRT, FIPROD, FPCALC, and PROCES modules of the SHIELD system code. Along with its predecessors, SHIELD has been in use at the Savannah River Site (SRS) for more than ten years. During this time the code has been extensively tested and a variety of validation documents have been issued. The primary function of this report is to specify the features and capabilities for which SHIELD is to be considered validated, and to reference the documents that establish the validation.

  6. Why do verification and validation?

    DOE PAGESBeta

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.

  7. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  8. A new quantum blind signature with unlinkability

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhang, Jian-Biao; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-08-01

Recently, some quantum blind signature protocols have been proposed. However, the previous schemes cannot satisfy the unlinkability requirement. To overcome this drawback, we propose a new quantum blind signature based on Bell states with the help of an authentic party. In this paper, we provide a method to inject a randomizing factor into a message when it is signed by the signer and then remove the blind factor from the blinded signature when it is verified by the verifier. Even when the message owner publishes the message-signature pair, the signer cannot identify the association between the message-signature pair and the blind signature he generated. Therefore, our scheme truly achieves the unlinkability property. Finally, analysis results show that this scheme satisfies the basic security requirements of a weak signature, such as no-counterfeiting, no-disavowing, blindness, and traceability, and its overall efficiency is no lower than that of the previous schemes.
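The blinding/unblinding idea described in this record, injecting a randomizing factor before signing and stripping it at verification, also underlies classical blind signatures. The sketch below is the textbook RSA blind-signature analogue, not the quantum scheme itself, with toy, insecure parameters chosen purely for illustration.

```python
import hashlib
import math

# Toy RSA blind signature (textbook key sizes, NOT secure) illustrating
# blinding and unblinding; the record's actual scheme uses Bell states.
p, q = 61, 53
n = p * q
e = 17
d = pow(e, -1, (p - 1) * (q - 1))      # signer's private exponent

m = int.from_bytes(hashlib.sha256(b"message").digest(), "big") % n

r = 50                                 # owner's blinding factor
assert math.gcd(r, n) == 1
blinded = (m * pow(r, e, n)) % n       # owner blinds m before sending it
sig_blind = pow(blinded, d, n)         # signer signs without seeing m
sig = (sig_blind * pow(r, -1, n)) % n  # owner strips the blinding factor

assert pow(sig, e, n) == m             # verifies as an ordinary RSA signature
```

Because (m·r^e)^d = m^d·r mod n, multiplying by r⁻¹ yields m^d, the ordinary signature, while the signer only ever saw the blinded value.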

  9. Spectral signature selection for mapping unvegetated soils

    NASA Technical Reports Server (NTRS)

    May, G. A.; Petersen, G. W.

    1975-01-01

Airborne multispectral scanner data covering the wavelength interval from 0.40-2.60 microns were collected at an altitude of 1000 m above the terrain in southeastern Pennsylvania. Uniform training areas were selected within three sites from this flightline. Soil samples were collected from each site, and a procedure was developed to allow assignment of scan line and element number from the multispectral scanner data to each sampling location. These soil samples were analyzed on a spectrophotometer and laboratory spectral signatures were derived. After correcting for solar radiation and atmospheric attenuation, the laboratory signatures were compared to the spectral signatures derived from these same soils using multispectral scanner data. Both signatures were used in supervised and unsupervised classification routines. Computer-generated maps using the laboratory-derived and multispectral-scanner-derived signatures were similar to maps resulting from field surveys. Approximately 90% agreement was obtained between classification maps produced using multispectral-scanner-derived signatures and laboratory-derived signatures.
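A minimum-distance-to-means classifier is one of the simplest supervised routines of the kind this abstract references. The sketch below uses invented class signatures (hypothetical per-band reflectance values, not the paper's data) to show how each pixel spectrum is assigned to the nearest training signature.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical training signatures: mean reflectance in four bands for
# three soil classes (values invented for illustration).
centroids = np.array([
    [0.10, 0.15, 0.30, 0.45],
    [0.20, 0.25, 0.35, 0.40],
    [0.05, 0.10, 0.20, 0.60],
])

def classify(pixels, centroids):
    """Assign each pixel spectrum to the class whose training signature
    is nearest in Euclidean distance (minimum distance to means)."""
    d = np.linalg.norm(pixels[:, None, :] - centroids[None, :, :], axis=2)
    return np.argmin(d, axis=1)

# Simulate noisy scanner pixels drawn around each class signature.
labels = rng.integers(0, 3, size=200)
pixels = centroids[labels] + rng.normal(0.0, 0.02, size=(200, 4))

pred = classify(pixels, centroids)
agreement = float(np.mean(pred == labels))  # fraction of pixels mapped correctly
```

With well-separated signatures and modest noise, the agreement fraction is high, analogous in spirit to the ~90% map agreement the study reports.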

  10. TOXLINE (TOXICOLOGY INFORMATION ONLINE)

    EPA Science Inventory

TOXLINE® (TOXicology information onLINE) is the National Library of Medicine's extensive collection of online bibliographic information covering the pharmacological, biochemical, physiological, and toxicological effects of drugs and other chemicals. TOXLINE and TOXLINE65 together...

  11. The Online Underworld.

    ERIC Educational Resources Information Center

    Scrogan, Len

    1988-01-01

    Discusses some of the misuses of telecommunicating using school computers, including online piracy, hacking, phreaking, online crime, and destruction boards. Suggests ways that schools can deal with these problems. (TW)

  12. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.

  13. Brain oscillatory signatures of motor tasks.

    PubMed

    Ramos-Murguialday, Ander; Birbaumer, Niels

    2015-06-01

    Noninvasive brain-computer-interfaces (BCI) coupled with prosthetic devices were recently introduced in the rehabilitation of chronic stroke and other disorders of the motor system. These BCI systems and motor rehabilitation in general involve several motor tasks for training. This study investigates the neurophysiological bases of an EEG-oscillation-driven BCI combined with a neuroprosthetic device to define the specific oscillatory signature of the BCI task. Controlling movements of a hand robotic orthosis with motor imagery of the same movement generates sensorimotor rhythm oscillation changes and involves three elements of tasks also used in stroke motor rehabilitation: passive and active movement, motor imagery, and motor intention. We recorded EEG while nine healthy participants performed five different motor tasks consisting of closing and opening of the hand as follows: 1) motor imagery without any external feedback and without overt hand movement, 2) motor imagery that moves the orthosis proportional to the produced brain oscillation change with online proprioceptive and visual feedback of the hand moving through a neuroprosthetic device (BCI condition), 3) passive and 4) active movement of the hand with feedback (seeing and feeling the hand moving), and 5) rest. During the BCI condition, participants received contingent online feedback of the decrease of power of the sensorimotor rhythm, which induced orthosis movement and therefore proprioceptive and visual information from the moving hand. We analyzed brain activity during the five conditions using time-frequency domain bootstrap-based statistical comparisons and Morlet transforms. Activity during rest was used as a reference. 
Significant contralateral and ipsilateral event-related desynchronization of sensorimotor rhythm was present during all motor tasks, largest in contralateral-postcentral, medio-central, and ipsilateral-precentral areas, identifying the ipsilateral precentral cortex as an integral part of motor regulation. Changes in task-specific frequency power compared with rest were similar between motor tasks, and only significant differences in the time course and some narrow specific frequency bands were observed between motor tasks. We identified EEG features representing active and passive proprioception (with and without muscle contraction) and active intention and passive involvement (with and without voluntary effort) differentiating brain oscillations during motor tasks that could substantially support the design of novel motor BCI-based rehabilitation therapies. The BCI task induced significantly different brain activity compared with the other motor tasks, indicating neural processes unique to the use of body-actuator control in a BCI context. PMID:25810484
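The Morlet-transform analysis this abstract describes can be sketched as follows: convolve the signal with a complex Morlet wavelet at the frequency of interest and express task-period band power as a percent change from a rest baseline (event-related desynchronization). The toy signal, window placements, and parameter values below are illustrative assumptions, not the study's data.

```python
import numpy as np

fs = 250.0                           # sampling rate, Hz
t = np.arange(0.0, 2.0, 1.0 / fs)

# Toy "EEG": a 12 Hz sensorimotor-band oscillation whose amplitude drops
# in the second half, mimicking event-related desynchronization (ERD).
amp = np.where(t < 1.0, 1.0, 0.4)
rng = np.random.default_rng(1)
sig = amp * np.sin(2 * np.pi * 12.0 * t) + 0.05 * rng.normal(size=t.size)

def morlet_power(sig, fs, freq, n_cycles=7):
    """Band power via convolution with a complex Morlet wavelet
    (a Gaussian-windowed complex exponential)."""
    sigma_t = n_cycles / (2 * np.pi * freq)
    tw = np.arange(-4 * sigma_t, 4 * sigma_t, 1.0 / fs)
    wavelet = np.exp(2j * np.pi * freq * tw) * np.exp(-tw**2 / (2 * sigma_t**2))
    wavelet /= np.sum(np.abs(wavelet))
    return np.abs(np.convolve(sig, wavelet, mode="same")) ** 2

power = morlet_power(sig, fs, 12.0)
baseline = power[(t > 0.2) & (t < 0.8)].mean()      # "rest" reference period
task = power[(t > 1.2) & (t < 1.8)].mean()          # "task" period
erd_percent = 100.0 * (task - baseline) / baseline  # negative => desynchronization
```

The sign convention matches the ERD literature: a drop in band power relative to rest gives a negative percentage.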

  14. Runtime Verification with State Estimation

    NASA Technical Reports Server (NTRS)

    Stoller, Scott D.; Bartocci, Ezio; Seyster, Justin; Grosu, Radu; Havelund, Klaus; Smolka, Scott A.; Zadok, Erez

    2011-01-01

We introduce the concept of Runtime Verification with State Estimation and show how this concept can be applied to estimate the probability that a temporal property is satisfied by a run of a program when monitoring overhead is reduced by sampling. In such situations, there may be gaps in the observed program executions, thus making accurate estimation challenging. To deal with the effects of sampling on runtime verification, we view event sequences as observation sequences of a Hidden Markov Model (HMM), use an HMM model of the monitored program to "fill in" sampling-induced gaps in observation sequences, and extend the classic forward algorithm for HMM state estimation (which determines the probability of a state sequence, given an observation sequence) to compute the probability that the property is satisfied by an execution of the program. To validate our approach, we present a case study based on the mission software for a Mars rover. The results of our case study demonstrate high prediction accuracy for the probabilities computed by our algorithm. They also show that our technique is much more accurate than simply evaluating the temporal property on the given observation sequences, ignoring the gaps.
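The classic forward algorithm that this abstract builds on can be sketched as follows, including the natural handling of a sampling-induced gap: a missing observation contributes only the transition step, which marginalizes over all events that could have occurred there. The two-state HMM and its probabilities are invented for illustration, not taken from the paper.

```python
import numpy as np

# Invented two-state HMM of a monitored program: hidden states are program
# phases, observations are event types; None marks a sampling gap.
A = np.array([[0.9, 0.1],     # phase transition probabilities
              [0.2, 0.8]])
B = np.array([[0.7, 0.3],     # P(event | phase)
              [0.1, 0.9]])
pi = np.array([0.5, 0.5])     # initial phase distribution

def forward(obs):
    """Classic forward algorithm: returns P(observation sequence).
    alpha[i] tracks P(o_1..o_t, state_t = i); a gap (None) skips the
    emission factor, marginalizing over the unobserved event."""
    alpha = pi * (B[:, obs[0]] if obs[0] is not None else 1.0)
    for o in obs[1:]:
        alpha = alpha @ A
        if o is not None:
            alpha = alpha * B[:, o]
    return float(alpha.sum())

p_full = forward([0, 0, 1, 1])
p_gap = forward([0, None, 1, 1])  # same trace with one event sampled out
```

Because the gapped trace sums over every event the gap could hide, its probability can only be at least that of any single fully observed trace consistent with it.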

  15. RISKIND verification and benchmark comparisons

    SciTech Connect

    Biwer, B.M.; Arnish, J.J.; Chen, S.Y.; Kamboj, S.

    1997-08-01

This report presents verification calculations and benchmark comparisons for RISKIND, a computer code designed to estimate potential radiological consequences and health risks to individuals and the population from exposures associated with the transportation of spent nuclear fuel and other radioactive materials. Spreadsheet calculations were performed to verify the proper operation of the major options and calculational steps in RISKIND. The program is unique in that it combines a variety of well-established models into a comprehensive treatment for assessing risks from the transportation of radioactive materials. Benchmark comparisons with other validated codes that incorporate similar models were also performed. For instance, the external gamma and neutron dose rate curves for a shipping package estimated by RISKIND were compared with those estimated by using the RADTRAN 4 code and NUREG-0170 methodology. Atmospheric dispersion of released material and the resulting dose estimates were also compared with results from the GENII and CAP88-PC codes. Verification results have shown the program to be performing its intended function correctly. The benchmark results indicate that the predictions made by RISKIND are within acceptable limits when compared with predictions from similar existing models.

  16. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

Working definition of cognitive bias: Patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is used in diverse fields: Economics, Politics, Intelligence, Marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. Effects cited in the paper and discussed here have been replicated many times over, and appear sound. Many biases have been described, but it is still unclear whether they are all distinct. There may only be a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: Overconfidence -> Questionable decisions to deploy. Availability -> Inability to conceive critical tests. Representativeness -> Overinterpretation of results. Positive Test Strategies -> Confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated. Worth considering at key points in the process.

  17. Video-Based Fingerprint Verification

    PubMed Central

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, inside similarity and outside similarity are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  18. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-01-01

    Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and aligning processes, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can lead to better accuracy than the multiple impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low. PMID:24008283

  19. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  20. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  1. The monitoring and verification of nuclear weapons

    NASA Astrophysics Data System (ADS)

    Garwin, Richard L.

    2014-05-01

This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum is an important tool, dependent on the use of information barriers.

  2. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan;...

  3. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

This document is intended to provide guidance in identifying technical issues that must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification medium.

  4. ETVOICE (ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM)

    EPA Science Inventory

The purpose of ETVoice, hosted by the USEPA's Environmental Technology Verification (ETV) Program, is to inform subscribers of verification products, events and news related to the ETV Program. Messages will be posted as needed to inform subscribers of hot topics. An archive of pa...

  5. 40 CFR 1066.220 - Linearity verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Linearity verification. 1066.220 Section 1066.220 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.220 Linearity verification. (a) Scope and frequency. Perform linearity...

  6. 40 CFR 1066.135 - Linearity verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Linearity verification. 1066.135 Section 1066.135 Protection of Environment ENVIRONMENTAL PROTECTION AGENCY (CONTINUED) AIR POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Equipment, Measurement Instruments, Fuel, and Analytical Gas Specifications § 1066.135 Linearity verification....

  7. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan;...

  8. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan;...

  9. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan;...

  10. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  11. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS 417.8 Agency verification. FSIS will verify the adequacy of the HACCP plan(s) by determining that each HACCP plan meets the requirements of this part and all other applicable regulations. Such verification may include: (a) Reviewing the HACCP plan;...

  12. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... REGULATORY REQUIREMENTS UNDER THE FEDERAL MEAT INSPECTION ACT AND THE POULTRY PRODUCTS INSPECTION ACT SANITATION 416.17 Agency verification. FSIS shall verify the adequacy and effectiveness of the Sanitation.... Such verification may include: (a) Reviewing the Sanitation SOP's; (b) Reviewing the daily...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  14. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 16 Commercial Practices 1 2013-01-01 2013-01-01 false Prescriber verification. 315.5 Section 315.5 Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact...

  15. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE... accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by the... for verification. When seeking verification of a contact lens prescription, a seller shall provide...

  16. 77 FR 40612 - Notice to All Interested Parties of the Termination of the Receivership of 10375, Signature Bank...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... From the Federal Register Online via the Government Publishing Office FEDERAL DEPOSIT INSURANCE CORPORATION Notice to All Interested Parties of the Termination of the Receivership of 10375, Signature Bank, Windsor, CO Notice is hereby given that the Federal Deposit Insurance Corporation (``FDIC'') as...

  17. Reflectors as Online Extraverts?

    ERIC Educational Resources Information Center

    Downing, Kevin; Chim, Tat Mei

    2004-01-01

    Increasingly, online learning is perceived as an effective method of instruction. Much recent educational research has focused on examining the purposes and situations for which online education is best suited. In this paper, students enrolled in two online courses are compared with their peers enrolled in equivalent classroom-based courses to

  18. Implementing Online Physical Education

    ERIC Educational Resources Information Center

    Mohnsen, Bonnie

    2012-01-01

    Online physical education, although seemingly an oxymoron, appears to be the wave of the future at least for some students. The purpose of this article is to explore research and options for online learning in physical education and to examine a curriculum, assessment, and instructional model for online learning. The article examines how physical

  19. Developing Online Doctoral Programmes

    ERIC Educational Resources Information Center

    Chipere, Ngoni

    2015-01-01

    The objectives of the study were to identify best practices in online doctoral programming and to synthesise these practices into a framework for developing online doctoral programmes. The field of online doctoral studies is nascent and presents challenges for conventional forms of literature review. The literature was therefore reviewed using a

  20. Online Learning. Symposium.

    ERIC Educational Resources Information Center

    2002

    This document contains three papers from a symposium on online learning that was conducted as part of a conference on human resource development (HRD). "An Instructional Strategy Framework for Online Learning Environments" (Scott D. Johnson, Steven R. Aragon) discusses the pitfalls of modeling online courses after traditional instruction instead…

  1. Assessing Online Learning

    ERIC Educational Resources Information Center

    Comeaux, Patricia, Ed.

    2004-01-01

    Students in traditional as well as online classrooms need more than grades from their instructors--they also need meaningful feedback to help bridge their academic knowledge and skills with their daily lives. With the increasing number of online learning classrooms, the question of how to consistently assess online learning has become increasingly

  2. Specificity in ROS Signaling and Transcript Signatures

    PubMed Central

    Vaahtera, Lauri; Brosché, Mikael; Wrzaczek, Michael

    2014-01-01

Significance: Reactive oxygen species (ROS), important signaling molecules in plants, are involved in developmental control and stress adaptation. ROS production can trigger broad transcriptional changes; however, it is not clear how specificity in transcriptional regulation is achieved. Recent Advances: A large collection of public transcriptome data from the model plant Arabidopsis thaliana is available for analysis. These data can be used for the analysis of biological processes that are associated with ROS signaling and for the identification of suitable transcriptional indicators. Several online tools, such as Genevestigator and Expression Angler, have simplified the task to analyze, interpret, and visualize this wealth of data. Critical Issues: The analysis of the exact transcriptional responses to ROS requires the production of specific ROS in distinct subcellular compartments with precise timing, which is experimentally difficult. Analyses are further complicated by the effect of ROS production in one subcellular location on the ROS accumulation in other compartments. In addition, even subtle differences in the method of ROS production or treatment can lead to significantly different outcomes when various stimuli are compared. Future Directions: Due to the difficulty of inducing ROS production specifically with regard to ROS type, subcellular localization, and timing, we propose that the concept of a “ROS marker gene” should be re-evaluated. We suggest guidelines for the analysis of transcriptional data in ROS signaling. The use of “ROS signatures,” which consist of a set of genes that together can show characteristic and indicative responses, should be preferred over the use of individual marker genes. Antioxid. Redox Signal. 21, 1422–1441. PMID:24180661

  3. New method of verificating optical flat flatness

    NASA Astrophysics Data System (ADS)

    Sun, Hao; Li, Xueyuan; Han, Sen; Zhu, Jianrong; Guo, Zhenglai; Fu, Yuegang

    2014-11-01

    Optical flats are commonly used in optical testing instruments, and flatness is the most important of their form-error parameters. As a measurement standard, the optical flat flatness (OFF) index must have good precision. Current measurement practice in China depends heavily on artificial visual interpretation, characterizing flatness through discrete points. The efficiency and accuracy of this method cannot meet the demands of industrial development. To improve testing efficiency and measurement accuracy, it is necessary to develop an optical flat verification system that can obtain all surface information rapidly and efficiently while complying with current national metrological verification procedures. This paper reviews the current optical flat verification method and solves the problems of previous tests by using a new method and its supporting software. Final results show that the new system improves verification efficiency and accuracy compared with the JJG 28-2000 metrological verification procedure.

  4. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  5. Satellite signatures in SLR observations

    NASA Technical Reports Server (NTRS)

    Appleby, G. M.

    1993-01-01

    We examine the evidence for the detection of satellite-dependent signatures in the laser range observations obtained by the UK single-photon Satellite Laser Ranging (SLR) System. Models of the expected observation distributions from Ajisai and Lageos are developed from the published satellite spread functions and from the characteristics of the SLR System and compared with the observations. The effects of varying return strengths are discussed using the models and by experimental observations of Ajisai, during which a range of return levels from single to multiple photons is achieved. The implications of these results for system-dependent center-of-mass corrections are discussed.

  6. Spectroscopic signature for ferroelectric ice

    NASA Astrophysics Data System (ADS)

    Wójcik, Marek J.; Gług, Maciej; Boczar, Marek; Boda, Łukasz

    2014-09-01

    Various forms of ice exist within our galaxy. A particularly intriguing type of ice, 'ferroelectric ice', was discovered experimentally and is stable at temperatures below 72 K. This form of ice can generate enormous electric fields and can play an important role in planetary formation. In this letter we present a Car-Parrinello simulation of infrared spectra of ferroelectric ice and compare them with spectra of hexagonal ice. The librational region of the spectra can be treated as a spectroscopic signature of ice XI and can help to identify ferroelectric ice in the Universe.

  7. Quantum signatures of chimera states

    NASA Astrophysics Data System (ADS)

    Bastidas, V. M.; Omelchenko, I.; Zakharova, A.; Schöll, E.; Brandes, T.

    2015-12-01

    Chimera states are complex spatiotemporal patterns in networks of identical oscillators, characterized by the coexistence of synchronized and desynchronized dynamics. Here we propose to extend the phenomenon of chimera states to the quantum regime, and uncover intriguing quantum signatures of these states. We calculate the quantum fluctuations about semiclassical trajectories and demonstrate that chimera states in the quantum regime can be characterized by bosonic squeezing, weighted quantum correlations, and measures of mutual information. Our findings reveal the relation of chimera states to quantum information theory, and give promising directions for experimental realization of chimera states in quantum systems.

  8. Signature limits in mindreading systems.

    PubMed

    Thompson, J Robert

    2014-01-01

    Recent evidence that young children seem to understand false belief in one sense, but not in another, has led to two-systems theorizing about mindreading. By analyzing the most detailed two-systems approach in the study of social cognition, the theory of mindreading defended by Ian Apperly and Stephen Butterfill, I argue that even when dutifully constructed, two-systems approaches in social cognition struggle to adequately define the mindreading systems in terms of signature processing limits, an issue that becomes most apparent when investigating mindreading in infancy. I end the article by developing several challenges that face any two-systems account of mindreading. PMID:24646207

  9. Observational Signatures of Magnetic Reconnection

    NASA Technical Reports Server (NTRS)

    Savage, Sabrina

    2014-01-01

    Magnetic reconnection is often referred to as the primary source of energy release during solar flares. Directly observing reconnection occurring in the solar atmosphere, however, is not trivial, considering that the scale size of the diffusion region is orders of magnitude smaller than the observational capabilities of current instrumentation, and coronal magnetic field measurements are not currently sufficient to capture the process. Therefore, predicting and studying observationally feasible signatures of the precursors and consequences of reconnection is necessary for guiding and verifying the simulations that dominate our understanding. I will present a set of such observations, particularly in connection with long-duration solar events, and compare them with recent simulations and theoretical predictions.

  10. Signature of anisotropic bubble collisions

    SciTech Connect

    Salem, Michael P.

    2010-09-15

    Our universe may have formed via bubble nucleation in an eternally inflating background. Furthermore, the background may have a compact dimension--the modulus of which tunnels out of a metastable minimum during bubble nucleation--which subsequently grows to become one of our three large spatial dimensions. When in this scenario our bubble universe collides with other ones like it, the collision geometry is constrained by the reduced symmetry of the tunneling instanton. While the regions affected by such bubble collisions still appear (to leading order) as disks in an observer's sky, the centers of these disks all lie on a single great circle, providing a distinct signature of anisotropic bubble nucleation.
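
    The predicted signature, that the centers of the collision disks all lie on a single great circle, is straightforward to test numerically: a great circle is the sphere's intersection with a plane through the origin, so unit vectors toward the disk centers must all be (nearly) orthogonal to one common normal. A minimal sketch with synthetic data (not from the paper, and no treatment of observational noise):

```python
import math

def cross(u, v):
    return (u[1]*v[2] - u[2]*v[1], u[2]*v[0] - u[0]*v[2], u[0]*v[1] - u[1]*v[0])

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def normalize(v):
    n = math.sqrt(dot(v, v))
    return tuple(c / n for c in v)

def on_common_great_circle(centers, tol=1e-6):
    """True if all unit vectors lie in one plane through the origin.
    Assumes the first two centers are not (anti)parallel."""
    normal = normalize(cross(centers[0], centers[1]))
    return all(abs(dot(normal, c)) < tol for c in centers)

# Synthetic disk centers placed on the great circle z = 0 (the equator).
equator = [normalize((math.cos(t), math.sin(t), 0.0)) for t in (0.3, 1.2, 2.5, 4.0)]
print(on_common_great_circle(equator))                                  # True
print(on_common_great_circle(equator + [normalize((0.1, 0.2, 0.9))]))   # False
```

With real data one would instead minimize the out-of-plane scatter and compare it to the measurement errors, but the orthogonality criterion is the same.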

  11. Gender verification of female athletes.

    PubMed

    Elsas, L J; Ljungqvist, A; Ferguson-Smith, M A; Simpson, J L; Genel, M; Carlson, A S; Ferris, E; de la Chapelle, A; Ehrhardt, A A

    2000-01-01

    The International Olympic Committee (IOC) officially mandated gender verification for female athletes beginning in 1968 and continuing through 1998. The rationale was to prevent masquerading males and women with "unfair, male-like" physical advantage from competing in female-only events. Visual observation and gynecological examination had been tried on a trial basis for two years at some competitions leading up to the 1968 Olympic Games, but these invasive and demeaning processes were jettisoned in favor of laboratory-based genetic tests. Sex chromatin and more recently DNA analyses for Y-specific male material were then required of all female athletes immediately preceding IOC-sanctioned sporting events, and many other international and national competitions following the IOC model. On-site gender verification has since been found to be highly discriminatory, and the cause of emotional trauma and social stigmatization for many females with problems of intersex who have been screened out from competition. Despite compelling evidence for the lack of scientific merit for chromosome-based screening for gender, as well as its functional and ethical inconsistencies, the IOC persisted in its policy for 30 years. The coauthors of this manuscript have worked with some success to rescind this policy through educating athletes and sports governors regarding the psychological and physical nature of sexual differentiation, and the inequities of genetic sex testing. In 1990, the International Amateur Athletics Federation (IAAF) called for abandonment of required genetic screening of women athletes, and by 1992 had adopted a fairer, medically justifiable model for preventing only male "impostors" in international track and field. At the recent recommendation of the IOC Athletes Commission, the Executive Board of the IOC has finally recognized the medical and functional inconsistencies and undue costs of chromosome-based methods. 
In 1999, the IOC ratified the abandonment of on-site genetic screening of females at the next Olympic Games in Australia. This article reviews the history and rationales for fairness in female-only sports that have led to the rise and fall of on-site, chromosome-based gender verification at international sporting events. PMID:11252710

  12. Theoretical Characterization of Visual Signatures

    NASA Astrophysics Data System (ADS)

    Kashinski, D. O.; Chase, G. M.; di Nallo, O. E.; Scales, A. N.; Vanderley, D. L.; Byrd, E. F. C.

    2015-05-01

    We are investigating the accuracy of theoretical models used to predict the visible, ultraviolet, and infrared spectra, as well as other properties, of product materials ejected from the muzzle of currently fielded systems. Recent advances in solid propellants have made the management of muzzle signature (flash) a principal issue in weapons development across the calibers. A priori prediction of the electromagnetic spectra of formulations will allow researchers to tailor blends that yield desired signatures and determine spectrographic detection ranges. Quantum chemistry methods at various levels of sophistication have been employed to optimize molecular geometries, compute unscaled vibrational frequencies, and determine the optical spectra of specific gas-phase species. Electronic excitations are being computed using Time Dependent Density Functional Theory (TD-DFT). A full statistical analysis and reliability assessment of computational results is currently underway. A comparison of theoretical results to experimental values found in the literature is used to assess any effects of functional choice and basis set on calculation accuracy. The status of this work will be presented at the conference. Work supported by the ARL, DoD HPCMP, and USMA.

  13. Update on PIN or Signature

    NASA Astrophysics Data System (ADS)

    Matyas, Vashek

    We promised a year back some data on the experiment that we ran with chip and PIN. If you recall, it was the first phase that we reported on here last year, where we used the University bookstore, and two PIN pads, one with very solid privacy shielding, the other one without any. We ran 17 people through the first one, 15 people through the second one, and we also had the students do, about half of them forging the signature, half of them signing their own signature, on the back of the card that is used for purchasing books, or whatever. We had a second phase of the experiment, after long negotiations, and very complicated logistics, with a supermarket in Brno where we were able to do anything that we wanted through the experiment for five hours on the floor, with only the supermarket manager, the head of security, and the camera operators knowing about the experiment. So the shop assistants, the ground floor security, everybody basically on the floor, did not know about the experiment. That was one of the reasons why the supermarket, or management, agreed to take part, they wanted to control their own internal security procedures.

  14. Collider signatures of Goldstone bosons

    NASA Astrophysics Data System (ADS)

    Cheung, Kingman; Keung, Wai-Yee; Yuan, Tzu-Chiang

    2014-01-01

    Recently Weinberg suggested that Goldstone bosons arising from the spontaneous breakdown of some global hidden symmetries can interact weakly in the early Universe and account for a fraction of the effective number of neutrino species Neff, which has been reported persistently 1σ away from its expected value of three. In this work, we study in some detail a number of experimental constraints on this interesting idea based on the simplest possibility of a global U(1), as studied by Weinberg. We work out the decay branching ratios of the associated light scalar field σ and suggest a possible collider signature at the Large Hadron Collider. In some corners of the parameter space, the scalar field σ can decay into a pair of pions with a branching ratio of order O(1)% while the rest is mostly a pair of Goldstone bosons. The collider signature would be gluon fusion into the standard model Higgs boson gg → H or associated production with a W gauge boson qq̄′ → HW, followed by H → σσ → (ππ)(αα), where α is the Goldstone boson.

  15. Seismic verification of underground explosions

    SciTech Connect

    Glenn, L.A.

    1985-06-01

    The first nuclear test agreement, the test moratorium, was made in 1958 and lasted until the Soviet Union unilaterally resumed testing in the atmosphere in 1961. It was followed by the Limited Test Ban Treaty of 1963, which prohibited nuclear tests in the atmosphere, in outer space, and underwater. In 1974 the Threshold Test Ban Treaty (TTBT) was signed, limiting underground tests after March 1976 to a maximum yield of 250 kt. The TTBT was followed by a treaty limiting peaceful nuclear explosions and both the United States and the Soviet Union claim to be abiding by the 150-kt yield limit. A comprehensive test ban treaty (CTBT), prohibiting all testing of nuclear weapons, has also been discussed. However, a verifiable CTBT is a contradiction in terms. No monitoring technology can offer absolute assurance that very-low-yield illicit explosions have not occurred. The verification process, evasion opportunities, and cavity decoupling are discussed in this paper.

  16. NUMERICAL VERIFICATION OF EQUILIBRIUM CHEMISTRY

    SciTech Connect

    Piro, Markus; Lewis, Brent; Thompson, Dr. William T.; Simunovic, Srdjan; Besmann, Theodore M

    2010-01-01

    A numerical tool is in an advanced state of development to compute the equilibrium compositions of phases and their proportions in multi-component systems of importance to the nuclear industry. The resulting software is being conceived for direct integration into large multi-physics fuel performance codes, particularly for providing boundary conditions in heat and mass transport modules. However, any numerical errors produced in equilibrium chemistry computations will be propagated in subsequent heat and mass transport calculations, thus falsely predicting nuclear fuel behaviour. The necessity for a reliable method to numerically verify chemical equilibrium computations is emphasized by the requirement to handle the very large number of elements necessary to capture the entire fission product inventory. A simple, reliable and comprehensive numerical verification method is presented which can be invoked by any equilibrium chemistry solver for quality assurance purposes.
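
    One generic consistency check that any equilibrium chemistry result must pass, whatever the verification method the paper itself develops, is elemental mass balance: the computed species amounts must reproduce the input element totals and be non-negative. A hedged sketch with a toy H/O system (illustrative only, not the paper's verification procedure):

```python
# Rows: elements (H, O); columns: species (H2, O2, H2O).
stoich = [
    [2, 0, 2],   # H atoms per molecule of each species
    [0, 2, 1],   # O atoms per molecule of each species
]

def element_totals(stoich, moles):
    """Stoichiometry matrix times the species mole vector."""
    return [sum(a * n for a, n in zip(row, moles)) for row in stoich]

def mass_balance_ok(stoich, moles, inputs, tol=1e-9):
    """Verify a solver's output conserves every element within tol."""
    residuals = [abs(t - b) for t, b in zip(element_totals(stoich, moles), inputs)]
    return max(residuals) < tol and all(n >= 0 for n in moles)

# Suppose a solver reports 0.5 mol H2, 0.25 mol O2, 1.5 mol H2O
# for inputs of 4.0 mol H and 2.0 mol O.
print(mass_balance_ok(stoich, [0.5, 0.25, 1.5], [4.0, 2.0]))  # True
```

Any residual above tolerance here would propagate directly into downstream heat and mass transport calculations, which is the failure mode the abstract warns about.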

  17. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for verification and validation of future swarm-based missions. The advantage of using formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study of swarm-based missions with which to experiment and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  18. Requirements, Verification, and Compliance (RVC) Database Tool

    NASA Technical Reports Server (NTRS)

    Rainwater, Neil E., II; McDuffee, Patrick B.; Thomas, L. Dale

    2001-01-01

    This paper describes the development, design, and implementation of the Requirements, Verification, and Compliance (RVC) database used on the International Space Welding Experiment (ISWE) project managed at Marshall Space Flight Center. The RVC is a systems engineer's tool for automating and managing the following information: requirements; requirements traceability; verification requirements; verification planning; verification success criteria; and compliance status. This information normally contained within documents (e.g. specifications, plans) is contained in an electronic database that allows the project team members to access, query, and status the requirements, verification, and compliance information from their individual desktop computers. Using commercial-off-the-shelf (COTS) database software that contains networking capabilities, the RVC was developed not only with cost savings in mind but primarily for the purpose of providing a more efficient and effective automated method of maintaining and distributing the systems engineering information. In addition, the RVC approach provides the systems engineer the capability to develop and tailor various reports containing the requirements, verification, and compliance information that meets the needs of the project team members. The automated approach of the RVC for capturing and distributing the information improves the productivity of the systems engineer by allowing that person to concentrate more on the job of developing good requirements and verification programs and not on the effort of being a "document developer".
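
    The core of such a tool, requirements linked to verification activities with a queryable compliance status, can be sketched with any relational database. The schema and data below are a hypothetical reconstruction for illustration, not the actual RVC design:

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
CREATE TABLE requirement (id TEXT PRIMARY KEY, text TEXT);
CREATE TABLE verification (
    req_id TEXT REFERENCES requirement(id),
    method TEXT,   -- e.g. test, analysis, inspection, demonstration
    status TEXT    -- e.g. open, passed, failed
);
""")
con.executemany("INSERT INTO requirement VALUES (?, ?)",
                [("R-001", "Weld chamber holds vacuum"),
                 ("R-002", "Arc current stays within limits")])
con.executemany("INSERT INTO verification VALUES (?, ?, ?)",
                [("R-001", "test", "passed"),
                 ("R-002", "analysis", "open")])

# Compliance report: every requirement with its current verification status,
# the kind of tailored query the RVC made available to project team members.
rows = con.execute("""
    SELECT r.id, v.method, v.status
    FROM requirement r JOIN verification v ON v.req_id = r.id
    ORDER BY r.id
""").fetchall()
for row in rows:
    print(row)
```

A networked COTS database, as the paper describes, adds multi-user access and report generation on top of exactly this kind of relational core.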

  19. Research towards a systematic signature discovery process

    SciTech Connect

    Baker, Nathan A.; Barr, Jonathan L.; Bonheyo, George T.; Joslyn, Cliff A.; Krishnaswami, Kannan; Oxley, Mark; Quadrel, Richard W.; Sego, Landon H.; Tardiff, Mark F.; Wynne, Adam S.

    2013-06-04

    In its most general form, a signature is a unique or distinguishing measurement, pattern, or collection of data that identifies a phenomenon (object, action, or behavior) of interest. The discovery of signatures is an important aspect of a wide range of disciplines from basic science to national security for the rapid and efficient detection and/or prediction of phenomena. Current practice in signature discovery is typically accomplished by asking domain experts to characterize and/or model individual phenomena to identify what might compose a useful signature. What is lacking is an approach that can be applied across a broad spectrum of domains and information sources to efficiently and robustly construct candidate signatures, validate their reliability, measure their quality, and overcome the challenge of detection -- all in the face of dynamic conditions, measurement obfuscation and noisy data environments. Our research has focused on the identification of common elements of signature discovery across application domains and the synthesis of those elements into a systematic process for more robust and efficient signature development. In this way, a systematic signature discovery process lays the groundwork for leveraging knowledge obtained from signatures to a particular domain or problem area, and, more generally, to problems outside that domain. This paper presents the initial results of this research by discussing a mathematical framework for representing signatures and placing that framework in the context of a systematic signature discovery process. Additionally, the basic steps of this process are described with details about the methods available to support the different stages of signature discovery, development, and deployment.

  20. Secure Obfuscation for Encrypted Group Signatures.

    PubMed

    Shi, Yang; Zhao, Qinpei; Fan, Hongfei; Liu, Qin

    2015-01-01

    In recent years, group signature techniques have been widely used in constructing privacy-preserving security schemes for various information systems. However, conventional techniques keep the schemes secure only in normal black-box attack contexts. In other words, these schemes suppose that (the implementation of) the group signature generation algorithm is running on a platform that is perfectly protected from various intrusions and attacks. As a complement to existing studies, how to generate group signatures securely in a more austere security context, such as a white-box attack context, is studied in this paper. We use obfuscation as an approach to acquire a higher level of security. Concretely, we introduce a special group signature functionality, an encrypted group signature, and then provide an obfuscator for the proposed functionality. A series of new security notions for both the functionality and its obfuscator has been introduced. The most important one is the average-case secure virtual black-box property w.r.t. dependent oracles and restricted dependent oracles, which captures the requirement of protecting the output of the proposed obfuscator against collusion attacks from group members. The security notions fit many other specialized obfuscators, such as obfuscators for identity-based signatures, threshold signatures and key-insulated signatures. Finally, the correctness and security of the proposed obfuscator have been proven. Thereby, the obfuscated encrypted group signature functionality can be applied to variants of privacy-preserving security schemes and enhance the security level of these schemes. PMID:26167686

  2. Novel Quantum Proxy Signature without Entanglement

    NASA Astrophysics Data System (ADS)

    Xu, Guang-bao

    2015-08-01

    Proxy signature is an important research topic in classical cryptography since it has many applications in real life. But only a few quantum proxy signature schemes have been proposed up to now. In this paper, we propose a quantum proxy signature scheme designed on the basis of the quantum one-time pad. Our scheme can be realized easily since it uses only single-particle states. Security analysis shows that it is secure and meets all the properties of a proxy signature, such as verifiability, distinguishability, unforgeability and undeniability.

  3. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. Owing to the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device uses the master meter method to verify LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level and a flexible construction, reaching the international advanced level. The LNG dispenser verification device will thus promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacture.
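
    The master meter method compares the volume indicated by the dispenser under test with a reference (master) meter in the same flow path, and checks the relative indication error against a maximum permissible error. A minimal sketch of that arithmetic (the tolerance value is illustrative, not taken from any regulation):

```python
def indication_error(dispenser_reading, master_reading):
    """Relative error of the dispenser against the master meter, in percent."""
    return (dispenser_reading - master_reading) / master_reading * 100.0

def passes(dispenser_reading, master_reading, mpe_percent=1.0):
    # mpe_percent is an illustrative maximum permissible error, not a legal value.
    return abs(indication_error(dispenser_reading, master_reading)) <= mpe_percent

print(round(indication_error(100.6, 100.0), 2))  # 0.6 (% error)
print(passes(100.6, 100.0))                       # True
print(passes(102.0, 100.0))                       # False
```

A field verification would repeat this at several flow rates and delivery volumes; the per-point calculation is the same.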

  4. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is described. The Penelope user inputs mathematical definitions, Larch-style specifications and Ada code and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques assisting the reuse of a verification effort on modified code.

  5. The infrasonic signature of the 2009 major Sudden Stratospheric Warming

    NASA Astrophysics Data System (ADS)

    Evers, L.; Siegmund, P.

    2009-12-01

    The study of infrasound is experiencing a renaissance since it was chosen as a verification technique for the Comprehensive Nuclear-Test-Ban Treaty (CTBT). The success of the verification technique strongly depends on knowledge of upper atmospheric processes. The ability of infrasound to probe the upper atmosphere is starting to be exploited, taking the field beyond its monitoring application. Processes in the stratosphere couple to the troposphere and influence our daily weather and climate. Infrasound delivers actual observations of the state of the stratosphere with a high spatial and temporal resolution. Here we show the passively obtained infrasonic signature of a drastic change in the stratosphere due to the major Sudden Stratospheric Warming (SSW) of January 2009. A major SSW started around January 15. At an altitude of 30 km, the average temperature north of 65°N increased in one week by more than 50 °C, leading to exceptionally high temperatures of about -20 °C. Simultaneously, the polar vortex reversed direction from eastward to westward. The warming was accompanied by a split-up of the polar vortex and an increased amplitude of the zonal wavenumber 2 planetary waves. Infrasound recordings on the Northern Hemisphere have been analysed. These arrays are part of the International Monitoring System (IMS) for the CTBT. Interacting oceanic waves almost continuously emit infrasound, and the whole atmospheric wind and temperature structure determines the detectability of these so-called microbaroms. Changes in this detectability have been associated with wind and temperature changes around 50 km altitude due to the major SSW. With this study, we demonstrate the enormous capacity of infrasound for passive acoustic remote sensing of stratospheric processes on a global scale with surface-based instruments.

  6. 37 CFR 262.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... Verification of royalty payments. (a) General. This section prescribes procedures by which any Copyright Owner... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A...

  7. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 2 2011-04-01 2011-04-01 false Verification and validation. 120.11 Section 120.11... § 120.11 Verification and validation. (a) Verification. Each processor shall verify that the Hazard...) Verification activities shall include: (i) A review of any consumer complaints that have been received by...

  8. 37 CFR 262.7 - Verification of royalty payments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty... Verification of royalty payments. (a) General. This section prescribes procedures by which any Copyright Owner... Designated Agent have agreed as to proper verification methods. (b) Frequency of verification. A...

  9. 37 CFR 380.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty... Verification of royalty distributions. (a) General. This section prescribes procedures by which any Copyright... Collective have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner...

  10. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... Verification of royalty distributions. (a) General. This section prescribes procedures by which any Copyright... Collective have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner...

  11. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty... Verification of royalty distributions. (a) General. This section prescribes procedures by which any Copyright... Collective have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner...

  12. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 2 2010-04-01 2010-04-01 false Verification and validation. 120.11 Section 120.11... § 120.11 Verification and validation. (a) Verification. Each processor shall verify that the Hazard...) Verification activities shall include: (i) A review of any consumer complaints that have been received by...

  13. 37 CFR 380.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... TRANSMISSIONS, NEW SUBSCRIPTION SERVICES AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 380.7 Verification of... have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner...

  14. Arms control verification: The technologies that make it possible

    SciTech Connect

    Tsipis, K.; Hafemeister, D.W.; Janeway, P.

    1986-01-01

    This book presents papers on arms control verification. Topics considered include the politics of treaty verification and compliance, national security, remote sensing, image processing, image enhancement by digital computer, charge-coupled device image sensors, radar imaging, infrared surveillance, monitoring, seismological aspects, satellite verifications, seismic verification, and verifying a fissile material production freeze.

  15. Signature Product Code for Predicting Protein-Protein Interactions

    Energy Science and Technology Software Center (ESTSC)

    2004-09-25

    The SigProdV1.0 software consists of four programs which together allow the prediction of protein-protein interactions using only amino acid sequences and experimental data. The software is based on the use of tensor products of amino acid trimers coupled with classifiers known as support vector machines. Essentially the program looks for amino acid trimer pairs which occur more frequently in protein pairs which are known to interact. These trimer pairs are then used to make predictions about unknown protein pairs. A detailed description of the method can be found in the paper: S. Martin, D. Roe, J.L. Faulon, "Predicting protein-protein interactions using signature products," Bioinformatics, available online from Advance Access, Aug. 19, 2004.
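As a rough illustration of the signature-product idea, the sketch below counts overlapping amino acid trimers in two sequences and forms pair features as products of the counts (a tensor product of the two trimer-count vectors). This is a minimal reading of the abstract, not the SigProdV1.0 code: the sequences are invented, and the support vector machine classification step is omitted.

```python
from collections import Counter
from itertools import product

def trimer_counts(seq):
    """Count overlapping amino-acid trimers in a protein sequence."""
    return Counter(seq[i:i + 3] for i in range(len(seq) - 2))

def signature_product(seq_a, seq_b):
    """Pair feature vector: for each (trimer_a, trimer_b) combination,
    the product of the trimer counts in the two sequences."""
    ca, cb = trimer_counts(seq_a), trimer_counts(seq_b)
    return {(ta, tb): na * nb
            for (ta, na), (tb, nb) in product(ca.items(), cb.items())}

# Toy "sequences" -- real inputs would be full amino acid sequences.
feats = signature_product("MKVLA", "MKVL")
```

In the full method, feature vectors like `feats` for protein pairs with known interaction status would be fed to a support vector machine, which then scores unknown pairs.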

  16. Higher-order harmonic signature analysis for loudspeaker defect detection

    NASA Astrophysics Data System (ADS)

    Thompson, Shane; Pagliaro, Anthony; Celmer, Robert; Foley, Daniel; Temme, Steve

    2003-10-01

    Loudspeaker assembly faults, such as a rubbing voice coil, bent frame, loose spider, etc., have traditionally been detected using experienced human listeners at the end of a production line. Previous attempts to develop production measurement systems for on-line testing typically analyze only low-order harmonics for the primary purpose of measuring total harmonic distortion (THD), and thus are not specifically designed to detect defective rub, buzz, and ticking sounds. This paper describes a new method wherein the total energy of high-order harmonics groups, for example, 10th through the 20th or 31st through the 40th, are measured and analyzed. By grouping high-order harmonics and resolving their respective total energies, distinct signatures can be obtained that correlate to the root cause of audible rub and buzz distortions [Temme (2000)]. The paper discusses loudspeakers tested with specific defects, as well as results of a computer-based electroacoustic measurement and analysis system used for detection.
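The grouped high-order harmonic measurement can be sketched as follows: the energies of a band of harmonics of the drive tone (for example the 10th through 20th) are summed from the magnitude spectrum. This is a minimal illustration of the idea, not the measurement system in the paper; the tone frequency, sample rate, and hard-clipping "defect" are invented for the example.

```python
import numpy as np

def harmonic_group_energy(signal, fs, f0, first, last):
    """Total energy of harmonics `first`..`last` of fundamental f0,
    read from the magnitude spectrum of `signal` sampled at `fs` Hz."""
    spectrum = np.abs(np.fft.rfft(signal))
    freqs = np.fft.rfftfreq(len(signal), 1.0 / fs)
    energy = 0.0
    for n in range(first, last + 1):
        k = np.argmin(np.abs(freqs - n * f0))  # nearest FFT bin to n-th harmonic
        energy += spectrum[k] ** 2
    return energy

# A clean 100 Hz tone has essentially no energy in harmonics 10-20,
# while a clipped (distorted) tone does.
fs, f0 = 8000, 100
t = np.arange(fs) / fs
clean = np.sin(2 * np.pi * f0 * t)
clipped = np.clip(3 * clean, -1.0, 1.0)
```

Comparing `harmonic_group_energy(clipped, fs, f0, 10, 20)` against the same measure for the clean tone separates the distorted unit, which is the kind of signature the paper correlates with rub and buzz defects.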

  17. A prospective evaluation of a breast cancer prognosis signature in the observational RASTER study

    PubMed Central

    Drukker, CA; Bueno-de-Mesquita, JM; Retl, VP; van Harten, WH; van Tinteren, H; Wesseling, J; Roumen, RMH; Knauer, M; van 't Veer, LJ; Sonke, GS; Rutgers, EJT; van de Vijver, MJ; Linn, SC

    2013-01-01

    The 70-gene signature (MammaPrint) has been developed on retrospective series of breast cancer patients to predict the risk of breast cancer distant metastases. The microarRAy-prognoSTics-in-breast-cancER (RASTER) study was the first study designed to prospectively evaluate the performance of the 70-gene signature, whose result was available for 427 patients (cT1-3N0M0). Adjuvant systemic treatment decisions were based on the Dutch CBO 2004 guidelines, the 70-gene signature, and doctors' and patients' preferences. Five-year distant-recurrence-free-interval (DRFI) probabilities were compared between subgroups based on the 70-gene signature and Adjuvant! Online (AOL) (10-year survival probability <90% was defined as high-risk). Median follow-up was 61.6 months. Fifteen percent (33/219) of the 70-gene signature low-risk patients received adjuvant chemotherapy (ACT) versus 81% (169/208) of the 70-gene signature high-risk patients. The 5-year DRFI probabilities for 70-gene signature low-risk (n = 219) and high-risk (n = 208) patients were 97.0% and 91.7%. The 5-year DRFI probabilities for AOL low-risk (n = 132) and high-risk (n = 295) patients were 96.7% and 93.4%. For 70-gene signature low-risk/AOL high-risk patients (n = 124), of whom 76% (n = 94) had not received ACT, 5-year DRFI was 98.4%. In the AOL high-risk group, 32% (94/295) fewer patients would be eligible to receive ACT if the 70-gene signature were used. In this prospective community-based observational study, the 5-year DRFI probabilities confirmed the additional prognostic value of the 70-gene signature to clinicopathological risk estimations such as AOL. Omission of adjuvant chemotherapy, as judged appropriate by doctors and patients and instigated by a low-risk 70-gene signature result, appeared not to compromise outcome. PMID:23371464

  18. Online Monitoring of Plant Assets in the Nuclear Industry

    SciTech Connect

    Nancy Lybeck; Vivek Agarwal; Binh Pham; Richard Rusaw; Randy Bickford

    2013-10-01

    Today’s online monitoring technologies provide opportunities to perform predictive and proactive health management of assets within many different industries, in particular the defense and aerospace industries. The nuclear industry can leverage these technologies to enhance safety, productivity, and reliability of the aging fleet of existing nuclear power plants. The U.S. Department of Energy’s Light Water Reactor Sustainability Program is collaborating with the Electric Power Research Institute’s (EPRI’s) Long-Term Operations program to implement online monitoring in existing nuclear power plants. Proactive online monitoring in the nuclear industry is being explored using EPRI’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software, a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. This paper focuses on development of asset fault signatures used to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features based on technical examinations that can be used to detect a specific fault type. Fault signatures are developed based on the results of detailed technical research and on the knowledge and experience of technical experts. The Diagnostic Advisor of the FW-PHM Suite software matches developed fault signatures with operational data to provide early identification of critical faults and troubleshooting advice that could be used to distinguish between faults with similar symptoms. This research is important as it will support the automation of predictive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
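A toy sketch of the fault-signature matching idea: each fault is described by a set of distinctive features, and candidate faults are ranked by how much of their signature appears in the operational data. The fault names and features below are hypothetical, and the actual FW-PHM Diagnostic Advisor is far richer than this simple set-overlap score.

```python
def match_faults(observed, signatures):
    """Rank candidate faults by the fraction of their signature
    features present in the observed operational data."""
    scores = {
        fault: len(observed & feats) / len(feats)
        for fault, feats in signatures.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

# Hypothetical fault signatures for a generator step-up transformer
signatures = {
    "winding overheating": {"high top-oil temp", "elevated ethylene", "load normal"},
    "partial discharge":   {"elevated hydrogen", "acoustic emission", "load normal"},
}
observed = {"high top-oil temp", "elevated ethylene", "load normal"}
ranked = match_faults(observed, signatures)
```

Ranking faults this way illustrates how signatures can distinguish between faults with overlapping symptoms: both hypothetical faults share "load normal", but only one signature is fully matched.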

  19. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity.
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation of an event. The goal of this paper is to instigate a discussion within the verification community as to where and how social media can be effectively utilized to complement and enhance traditional treaty verification efforts. In addition, this paper seeks to identify areas of future research and development necessary to adapt social media analytic tools and techniques, and to form the seed for social media analytics to aid and inform arms control and nonproliferation policymakers and analysts. While social media analysis (as well as open source analysis as a whole) will not ever be able to replace traditional arms control verification measures, they do supply unique signatures that can augment existing analysis.

  20. Signature of anisotropic bubble collisions

    NASA Astrophysics Data System (ADS)

    Salem, Michael P.

    2010-09-01

    Our universe may have formed via bubble nucleation in an eternally inflating background. Furthermore, the background may have a compact dimension, the modulus of which tunnels out of a metastable minimum during bubble nucleation and subsequently grows to become one of our three large spatial dimensions. When in this scenario our bubble universe collides with other ones like it, the collision geometry is constrained by the reduced symmetry of the tunneling instanton. While the regions affected by such bubble collisions still appear (to leading order) as disks in an observer's sky, the centers of these disks all lie on a single great circle, providing a distinct signature of anisotropic bubble nucleation.

  1. Holographic signatures of cosmological singularities.

    PubMed

    Engelhardt, Netta; Hertog, Thomas; Horowitz, Gary T

    2014-09-19

    To gain insight into the quantum nature of cosmological singularities, we study anisotropic Kasner solutions in gauge-gravity duality. The dual description of the bulk evolution towards the singularity involves N=4 super Yang-Mills theory on the expanding branch of deformed de Sitter space and is well defined. We compute two-point correlators of Yang-Mills operators of large dimensions using spacelike geodesics anchored on the boundary. The correlators show a strong signature of the singularity around horizon scales and decay at large boundary separation at different rates in different directions. More generally, the boundary evolution exhibits a process of particle creation similar to that in inflation. This leads us to conjecture that information on the quantum nature of cosmological singularities is encoded in long-wavelength features of the boundary wave function. PMID:25279620

  2. Metabolic Signatures of Bacterial Vaginosis

    PubMed Central

    Morgan, Martin T.; Fiedler, Tina L.; Djukovic, Danijel; Hoffman, Noah G.; Raftery, Daniel; Marrazzo, Jeanne M.

    2015-01-01

    Bacterial vaginosis (BV) is characterized by shifts in the vaginal microbiota from Lactobacillus-dominant to a microbiota with diverse anaerobic bacteria. Few studies have linked specific metabolites with bacteria found in the human vagina. Here, we report dramatic differences in metabolite compositions and concentrations associated with BV using a global metabolomics approach. We further validated important metabolites using samples from a second cohort of women and a different platform to measure metabolites. In the primary study, we compared metabolite profiles in cervicovaginal lavage fluid from 40 women with BV and 20 women without BV. Vaginal bacterial representation was determined using broad-range PCR with pyrosequencing and concentrations of bacteria by quantitative PCR. We detected 279 named biochemicals; levels of 62% of metabolites were significantly different in women with BV. Unsupervised clustering of metabolites separated women with and without BV. Women with BV have metabolite profiles marked by lower concentrations of amino acids and dipeptides, concomitant with higher levels of amino acid catabolites and polyamines. Higher levels of the signaling eicosanoid 12-hydroxyeicosatetraenoic acid (12-HETE), a biomarker for inflammation, were noted in BV. Lactobacillus crispatus and Lactobacillus jensenii exhibited similar metabolite correlation patterns, which were distinct from correlation patterns exhibited by BV-associated bacteria. Several metabolites were significantly associated with clinical signs and symptoms (Amsel criteria) used to diagnose BV, and no metabolite was associated with all four clinical criteria. BV has strong metabolic signatures across multiple metabolic pathways, and these signatures are associated with the presence and concentrations of particular bacteria. PMID:25873373

  3. Irma multisensor predictive signature model

    NASA Astrophysics Data System (ADS)

    Watson, John S.; Flynn, David S.; Wellfare, Michael R.; Richards, Mike; Prestwood, Lee

    1995-06-01

    The Irma synthetic signature model was one of the first high resolution synthetic infrared (IR) target and background signature models to be developed for tactical air-to-surface weapon scenarios. Originally developed in 1980 by the Armament Directorate of the Air Force Wright Laboratory (WL/MN), the Irma model was used exclusively to generate IR scenes for smart weapons research and development. In 1988, a number of significant upgrades to Irma were initiated including the addition of a laser channel. This two channel version, Irma 3.0, was released to the user community in 1990. In 1992, an improved scene generator was incorporated into the Irma model which supported correlated frame-to-frame imagery. This and other improvements were released in Irma 2.2. Recently, Irma 3.2, a passive IR/millimeter wave (MMW) code, was completed. Currently, upgrades are underway to include an active MMW channel. Designated Irma 4.0, this code will serve as a cornerstone of sensor fusion research in the laboratory from 6.1 concept development to 6.3 technology demonstration programs for precision guided munitions. Several significant milestones have been reached in this development process and are demonstrated. The Irma 4.0 software design has been developed and interim results are available. Irma is being developed to facilitate multi-sensor smart weapons research and development. It is currently in distribution to over 80 agencies within the U.S. Air Force, U.S. Army, U.S. Navy, ARPA, NASA, Department of Transportation, academia, and industry.

  4. Irma multisensor predictive signature model

    NASA Astrophysics Data System (ADS)

    Watson, John S.; Wellfare, Michael R.; Foster, Joseph; Owens, Monte A.; Vechinski, Douglas A.; Richards, Mike; Resnick, Andrew; Underwood, Vincent

    1998-07-01

    The Irma synthetic signature model was one of the first high resolution infrared (IR) target and background signature models to be developed for tactical weapons application. Originally developed in 1980 by the Munitions Directorate of the Air Force Research Laboratory, the Irma model was used exclusively to generate IR scenes for smart weapons research and development. In 1988, a number of significant upgrades to Irma were initiated including the addition of a laser channel. This two channel version, Irma 3.0, was released to the user community in 1990. In 1992, an improved scene generator was incorporated into the Irma model which supported correlated frame-to-frame imagery. This and other improvements were released in Irma 2.2. Irma 3.2, a passive IR/millimeter wave (MMW) code, was completed in 1994. This served as the cornerstone for the development of the co-registered active/passive IR/MMW model, Irma 4.0. Currently, upgrades are underway to include a near IR (NIR)/visible channel; a facet editor; utilities to support image viewing and scaling; and additional target/data files. The Irma 4.1 software development effort is nearly complete. The purpose of this paper is to illustrate the results of the development. Planned upgrades for Irma 5.0 will be provided as well. Irma is being developed to facilitate multi-sensor research and development. It is currently being used to support a number of civilian and military applications. The current Irma user base includes over 100 agencies within the Air Force, Army, Navy, DARPA, NASA, Department of Transportation, academia, and industry.

  5. Automated UF6 Cylinder Enrichment Assay: Status of the Hybrid Enrichment Verification Array (HEVA) Project: POTAS Phase II

    SciTech Connect

    Jordan, David V.; Orton, Christopher R.; Mace, Emily K.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Smith, Leon E.

    2012-06-01

    Pacific Northwest National Laboratory (PNNL) intends to automate the UF6 cylinder nondestructive assay (NDA) verification currently performed by the International Atomic Energy Agency (IAEA) at enrichment plants. PNNL is proposing the installation of a portal monitor at a key measurement point to positively identify each cylinder, measure its mass and enrichment, store the data along with operator inputs in a secure database, and maintain continuity of knowledge on measured cylinders until inspector arrival. This report summarizes the status of the research and development of an enrichment assay methodology supporting the cylinder verification concept. The enrichment assay approach exploits a hybrid of two passively-detected ionizing-radiation signatures: the traditional enrichment meter signature (186-keV photon peak area) and a non-traditional signature, manifested in the high-energy (3 to 8 MeV) gamma-ray continuum, generated by neutron emission from UF6. PNNL has designed, fabricated, and field-tested several prototype assay sensor packages in an effort to demonstrate proof-of-principle for the hybrid assay approach, quantify the expected assay precision for various categories of cylinder contents, and assess the potential for unsupervised deployment of the technology in a portal-monitor form factor. We refer to recent sensor-package prototypes as the Hybrid Enrichment Verification Array (HEVA). The report provides an overview of the assay signatures and summarizes the results of several HEVA field measurement campaigns on populations of Type 30B UF6 cylinders containing low-enriched uranium (LEU), natural uranium (NU), and depleted uranium (DU). Approaches to performance optimization of the assay technique via radiation transport modeling are briefly described, as are spectroscopic and data-analysis algorithms.
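The traditional enrichment-meter signature mentioned in the abstract can be sketched in a few lines: for a sufficiently thick UF6 deposit, the net 186-keV peak count rate is proportional to the U-235 enrichment, so a single calibration standard fixes the proportionality constant. The count rates and enrichments below are invented for illustration, and the hybrid method's non-traditional high-energy continuum (neutron-driven) signature is not modeled here.

```python
def enrichment_from_peak(count_rate, cal_rate, cal_enrichment):
    """Enrichment-meter estimate: for an 'infinitely thick' sample the
    186-keV net peak rate scales linearly with U-235 fraction, so one
    calibration standard determines the constant."""
    k = cal_enrichment / cal_rate  # wt% U-235 per (counts/s), from the standard
    return k * count_rate

# Hypothetical calibration: a 4.95 wt% standard giving 120 counts/s;
# an unknown cylinder measuring 72.7 counts/s then assays near 3 wt%.
est = enrichment_from_peak(72.7, 120.0, 4.95)
```

In practice the constant also folds in detector efficiency, geometry, and wall attenuation, which is why the abstract emphasizes radiation transport modeling and field calibration campaigns.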

  6. VERIFICATION OF GLOBAL CLIMATE CHANGE MITIGATION TECHNOLOGIES

    EPA Science Inventory

    This is a continuation of independent performance evaluations of environmental technologies under EPA's Environmental Technology Verification Program. Emissions of some greenhouse gases, most notably methane, can be controlled profitably now, even in the absence of regulations. ...

  7. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  8. Calibration and verification of environmental models

    NASA Technical Reports Server (NTRS)

    Lee, S. S.; Sengupta, S.; Weinberg, N.; Hiser, H.

    1976-01-01

    The problems of calibration and verification of mesoscale models used for investigating power plant discharges are considered. The value of remote sensors for data acquisition is discussed as well as an investigation of Biscayne Bay in southern Florida.

  9. ETV - ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) - RISK MANAGEMENT

    EPA Science Inventory

    In October 1995, the Environmental Technology Verification (ETV) Program was established by EPA. The goal of ETV is to provide credible performance data for commercial-ready environmental technologies to speed their implementation for the benefit of vendors, purchasers, permitter...

  10. Stennis Space Center Verification and Validation Capabilities

    NASA Technical Reports Server (NTRS)

    Daehler, Erik

    2002-01-01

    Verification and validation capabilities are discussed for: spatial response, reflectance radiometry, positional accuracy, in-flight instruments, lab calibration, thermal radiometry, hyperspectral radiometry, and portable atmospheric monitoring.

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  13. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification that this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided, such as maintaining a database, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP; however, no knowledge of these operating systems or of INTERLISP is assumed. The system requires three executable files, HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be present on the system for the editing functions to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  14. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM: QUALITY MANAGEMENT PLAN

    EPA Science Inventory

    This Quality Management Plan details specific policies and procedures for managing quality-related activities for the Environmental Technology Verification (ETV) Program of the U.S. Environmental Protection Agency.

  15. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  16. U.S. Environmental Technology Verification Program

    EPA Science Inventory

    Overview of the U.S. Environmental Technology Verification Program (ETV), the ETV Greenhouse Gas Technology Center, and energy-related ETV projects. Presented at the Department of Energy's National Renewable Laboratory in Boulder, Colorado on June 23, 2008.

  17. THE EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM

    EPA Science Inventory

    The Environmental Protection Agency (EPA) instituted the Environmental Technology Verification Program--or ETV--to verify the performance of innovative technical solutions to problems that threaten human health or the environment. ETV was created to substantially accelerate the e...

  18. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question 'Are we correctly solving the model equations?' This process aids the developers in that it identifies potential software bugs and gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and on verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measuring the coverage of the verification test suite relative to intended code applications is discussed.
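Grid refinement studies of the kind described above are commonly summarized by an observed order of accuracy: with errors measured against the analytical solution on two grids related by refinement ratio r, p = ln(e_coarse/e_fine)/ln(r). A minimal sketch of that calculation (the error values below are invented, not Calore results):

```python
import math

def observed_order(err_coarse, err_fine, refinement_ratio):
    """Observed order of accuracy from discretization errors on two
    grids, each measured against an exact (analytical) solution."""
    return math.log(err_coarse / err_fine) / math.log(refinement_ratio)

# A second-order scheme halving the grid spacing cuts the error ~4x.
p = observed_order(1.0e-3, 2.5e-4, 2.0)  # -> 2.0
```

Agreement between the observed order p and the scheme's formal order is the quantitative evidence that the model equations are being solved correctly.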

  19. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  20. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  1. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  2. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  3. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained. These consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and a change requiring references to a function of no arguments to have empty parentheses.
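The core of verification condition generation for an assignment is Hoare's assignment axiom: the VC for {P} x := e {Q} is P implies Q with e substituted for x. The toy sketch below performs the substitution on condition strings rather than parsed PASCAL terms, so it only illustrates the rule, not the system's actual implementation:

```python
import re

def wp_assign(postcondition, var, expr):
    """Weakest precondition of `var := expr` w.r.t. a postcondition:
    substitute `expr` for `var` (Hoare's assignment axiom)."""
    return re.sub(rf"\b{var}\b", f"({expr})", postcondition)

def vc_assign(precondition, var, expr, postcondition):
    """Verification condition for {P} var := expr {Q}: P ==> Q[var := expr]."""
    return f"({precondition}) ==> ({wp_assign(postcondition, var, expr)})"

# {x >= 0} x := x + 1 {x > 0} yields a condition a theorem prover can discharge.
vc = vc_assign("x >= 0", "x", "x + 1", "x > 0")
```

A real system like PASCAL-HDM applies such rules over the whole program's proof outline and hands the resulting conditions to a theorem prover (here, Shostak's).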

  4. Handbook: Design of automated redundancy verification

    NASA Technical Reports Server (NTRS)

    Ford, F. A.; Hasslinger, T. W.; Moreno, F. J.

    1971-01-01

    The use of the handbook is discussed and the design progress is reviewed. A description of the problem is presented, and examples are given to illustrate the necessity for redundancy verification, along with the types of situations to which it is typically applied. Reusable space vehicles, such as the space shuttle, are recognized as being significant in the development of the automated redundancy verification problem.

  5. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications-oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  6. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and a comparative study of DADS, DISCOS, and CONTOPS, which are existing public domain and commercial multibody dynamic simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  7. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    FRAPCON fuel performance code is being modified to be able to model performance of the nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort for verification of the FRAPCON thermal model. It was found that, with minor modifications, FRAPCON thermal model temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, code input, and calculation results.

  8. Computation of signatures of linear airgun arrays

    SciTech Connect

    Vaage, S.; Ursin, B.

    1987-03-01

    Far-field signatures from an airgun array are usually obtained by carrying out extensive field measurements. In order to decrease the need for such measurements, the authors have developed a method for computing signatures from linear airgun arrays where the distances between the airguns are such that the non-linear interaction among the airguns is negligible. The signature from a single airgun of a given type is computed from the following airgun parameters: airgun chamber volume, chamber pressure, airgun depth and position of the waveshape plate within the chamber. For calibration purposes, a recorded signature for one set of airgun parameters has to be provided for each type of airgun. The signatures are computed by using empirical relations between signature properties and the airgun parameters, and by treating the primary and bubble pulses separately. The far-field signature from a linear airgun array can now be computed by summation of the delayed signatures from the airguns in the array. Practical results are shown for an array with different PAR (Bolt) 1500 C airguns.
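The final summation step described above can be sketched as follows. The single-gun signature is treated as a given input (the paper derives it from empirical relations not reproduced here), and the sound speed and geometry handling are illustrative assumptions:

```python
# Far-field signature of a linear airgun array by delayed summation of
# single-gun signatures (valid when gun-to-gun non-linear interaction is
# negligible, as assumed above).

C_WATER = 1500.0  # nominal speed of sound in water, m/s (assumption)

def array_signature(gun_signatures, gun_positions, direction, dt):
    """Sum time-delayed single-gun signatures.

    gun_signatures: sampled signatures (lists of floats), one per gun
    gun_positions:  along-array gun positions in metres
    direction:      cosine of the take-off angle along the array axis
    dt:             sampling interval in seconds
    """
    # Travel-time delay of each gun relative to the first one.
    delays = [(x - gun_positions[0]) * direction / C_WATER
              for x in gun_positions]
    shifts = [round(d / dt) for d in delays]  # delay in whole samples
    n = max(len(s) + max(sh, 0) for s, sh in zip(gun_signatures, shifts))
    out = [0.0] * n
    for sig, sh in zip(gun_signatures, shifts):
        for i, v in enumerate(sig):
            j = i + sh
            if 0 <= j < n:
                out[j] += v
    return out
```

At broadside (`direction = 0.0`) all delays vanish and the array signature is simply the sum of the individual gun signatures.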

  9. 21 CFR 11.50 - Signature manifestations.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.50 Signature manifestations. (a) Signed electronic records shall contain information associated with the signing that clearly indicates all of the following... the same controls as for electronic records and shall be included as part of any human readable...

  10. 21 CFR 11.50 - Signature manifestations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.50 Signature manifestations. (a) Signed electronic records shall contain information associated with the signing that clearly indicates all of the following... the same controls as for electronic records and shall be included as part of any human readable...

  11. 21 CFR 11.50 - Signature manifestations.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.50 Signature manifestations. (a) Signed electronic records shall contain information associated with the signing that clearly indicates all of the following... the same controls as for electronic records and shall be included as part of any human readable...

  12. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 5 Administrative Personnel 2 2010-01-01 2010-01-01 false Electronic signatures. 850.106 Section... (CONTINUED) RETIREMENT SYSTEMS MODERNIZATION General Provisions § 850.106 Electronic signatures. (a) Subject to any provisions prescribed by the Director under § 850.104— (1) An electronic communication may...

  13. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 5 Administrative Personnel 2 2011-01-01 2011-01-01 false Electronic signatures. 850.106 Section... (CONTINUED) RETIREMENT SYSTEMS MODERNIZATION General Provisions § 850.106 Electronic signatures. (a) Subject to any provisions prescribed by the Director under § 850.104— (1) An electronic communication may...

  14. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... 5 Administrative Personnel 2 2013-01-01 2013-01-01 false Electronic signatures. 850.106 Section... (CONTINUED) RETIREMENT SYSTEMS MODERNIZATION General Provisions § 850.106 Electronic signatures. (a) Subject to any provisions prescribed by the Director under § 850.104— (1) An electronic communication may...

  15. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 5 Administrative Personnel 2 2014-01-01 2014-01-01 false Electronic signatures. 850.106 Section... (CONTINUED) ELECTRONIC RETIREMENT PROCESSING General Provisions § 850.106 Electronic signatures. (a) Subject to any provisions prescribed by the Director under § 850.104— (1) An electronic communication may...

  16. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 5 Administrative Personnel 2 2012-01-01 2012-01-01 false Electronic signatures. 850.106 Section... (CONTINUED) RETIREMENT SYSTEMS MODERNIZATION General Provisions § 850.106 Electronic signatures. (a) Subject to any provisions prescribed by the Director under § 850.104— (1) An electronic communication may...

  17. 48 CFR 4.102 - Contractor's signature.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 48 Federal Acquisition Regulations System 1 2013-10-01 2013-10-01 false Contractor's signature. 4.102 Section 4.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with...

  18. 48 CFR 4.102 - Contractor's signature.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Contractor's signature. 4.102 Section 4.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with...

  19. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-09-01

Most quantum signature schemes reported in the literature can be verified by a designated person, but they are not true designated verifier signature schemes in the traditional sense, because the designated person lacks the capability to efficiently simulate a signature indistinguishable from the signer's. This fails to satisfy the requirements of special environments such as e-voting, calls for tenders, and software licensing. To solve this problem, a true quantum designated verifier signature scheme is proposed in this paper. Owing to the properties of unitary transformations and a quantum one-way function, only the verifier designated by the signer can verify the validity of a signature, and the designated verifier cannot prove to a third party that the signature was produced by the signer rather than simulated by himself through a transcript simulation algorithm. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of this scheme. Analysis results show that this new scheme satisfies the main security requirements of a designated verifier signature scheme and resists the major attack strategies.

  20. Online shopping hesitation.

    PubMed

    Cho, Chang-Hoan; Kang, Jaewon; Cheon, Hongsik John

    2006-06-01

This study was designed to understand which factors influence consumer hesitation or delay in online product purchases. The study examined four groups of variables (i.e., consumer characteristics, contextual factors, perceived uncertainty factors, and medium/channel innovation factors) that predict three types of online shopping hesitation (i.e., overall hesitation, shopping cart abandonment, and hesitation at the final payment stage). We found that different sets of delay factors are related to different aspects of online shopping hesitation. The study concludes with suggestions for various delay-reduction devices to help consumers resolve their online decision hesitation. PMID:16780394

  1. 21 CFR 11.70 - Signature/record linking.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed to electronic records shall be linked to their respective electronic records to ensure that the signatures cannot be excised, copied, or otherwise transferred...

  2. 21 CFR 11.70 - Signature/record linking.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed to electronic records shall be linked to their respective electronic records to ensure that the signatures cannot be excised, copied, or otherwise transferred...

  3. 21 CFR 11.70 - Signature/record linking.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed to electronic records shall be linked to their respective electronic records to ensure that the signatures cannot be excised, copied, or otherwise transferred...

  4. 21 CFR 11.70 - Signature/record linking.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Signature/record linking. 11.70 Section 11.70 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed...

  5. 21 CFR 11.70 - Signature/record linking.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Signature/record linking. 11.70 Section 11.70 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Records 11.70 Signature/record linking. Electronic signatures and handwritten signatures executed...

  6. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 21 Food and Drugs 1 2011-04-01 2011-04-01 false Electronic signature components and controls. 11... SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Signatures § 11.200 Electronic signature components and controls. (a) Electronic signatures that are not based upon biometrics shall:...

  7. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 21 Food and Drugs 1 2013-04-01 2013-04-01 false Electronic signature components and controls. 11... SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Signatures § 11.200 Electronic signature components and controls. (a) Electronic signatures that are not based upon biometrics shall:...

  8. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 21 Food and Drugs 1 2010-04-01 2010-04-01 false Electronic signature components and controls. 11... SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Signatures § 11.200 Electronic signature components and controls. (a) Electronic signatures that are not based upon biometrics shall:...

  9. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 1 2012-04-01 2012-04-01 false Electronic signature components and controls. 11... SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Signatures § 11.200 Electronic signature components and controls. (a) Electronic signatures that are not based upon biometrics shall:...

  10. OPC verification considering CMP induced topography

    NASA Astrophysics Data System (ADS)

    Kuncha, Rakesh Kumar; Narayana Samy, Aravind; Katakamsetty, Ushasree

    2015-09-01

OPC verification is important to identify critical wafer hotspots prior to mask fabrication. It helps to identify process-limiting structures and possible yield limiters. These hotspots are also used by litho engineers to set up process conditions up front. OPC verification generally involves checks performed at nominal and process-window conditions. The process-window conditions take into consideration typical lithographic process variations. In this standard flow, post-CMP topography variation was also lumped into these process variations via focus. But in current technologies, especially in higher metal layers, CMP-induced topography variation has become a major contributor limiting the overall process window, resulting in a different best focus for structures with different topography. This gives rise to the need for an OPC verification flow that takes these location-specific variations into account in order to determine whether the mask data can be used. This paper proposes a method to incorporate the topography-induced focus shift into the OPC verification flow. OPC verification checks are performed at the new nominal and process-window conditions to identify the real hotspots seen on the wafer. Results show that hotspots highlighted with the proposed new flow correlate better with wafer results. Runtime was also taken into consideration when the flow was developed; experiments on various products show better accuracy with minimal runtime impact.
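The core idea can be sketched with a toy model in which the local CMP topography shifts a structure's best focus, and a structure is flagged as a hotspot when the shifted focus falls outside the lithographic depth of focus. The function, field names, and numbers below are hypothetical; the real flow uses far more detailed models:

```python
# Toy hotspot check: each structure carries the best-focus offset induced by
# its local post-CMP topography; it is a hotspot if that offset exceeds half
# the usable depth of focus.  All names and values are illustrative.

def find_hotspots(structures, depth_of_focus_nm):
    """structures: list of (name, topo_shift_nm) pairs, where topo_shift_nm
    is the local best-focus offset caused by CMP topography."""
    half_window = depth_of_focus_nm / 2.0
    return [name for name, topo_shift_nm in structures
            if abs(topo_shift_nm) > half_window]
```

For example, with a 60 nm depth of focus, a structure whose topography shifts its best focus by 40 nm is flagged while a 10 nm shift is not.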

  11. Security Weaknesses in Arbitrated Quantum Signature Protocols

    NASA Astrophysics Data System (ADS)

    Liu, Feng; Zhang, Kejia; Cao, Tianqing

    2014-01-01

Arbitrated quantum signature (AQS) is a cryptographic scenario in which the sender (signer), Alice, generates the signature of a message, and then a receiver (verifier), Bob, can verify the signature with the help of a trusted arbitrator, Trent. In this paper, we point out that there exist security weaknesses in two AQS protocols. Our analysis shows that Alice can successfully disavow any of her signatures by a simple attack in the first protocol. Furthermore, we study the security weaknesses of the second protocol from the aspects of forgery and disavowal. Some potential improvements of this kind of protocol are given. We also design a new method to authenticate a signature or a message, which effectively makes AQS protocols immune to Alice's disavowal attack and Bob's forgery attack.

  12. Real time gamma-ray signature identifier

    DOEpatents

    Rowland, Mark (Alamo, CA); Gosnell, Tom B. (Moraga, CA); Ham, Cheryl (Livermore, CA); Perkins, Dwight (Livermore, CA); Wong, James (Dublin, CA)

    2012-05-15

A real-time gamma-ray signature/source identification method and system using principal components analysis (PCA). One or more comprehensive spectral libraries of nuclear material types and configurations are transformed and substantially reduced into concise representations (signatures) that represent and index each individual predetermined spectrum in principal component (PC) space. An unknown gamma-ray signature may then be compared against these representative signatures to find a match, or at least to characterize the unknown signature among all entries in the library, with a single regression or simple projection into the PC space. This substantially reduces processing time and computing resources and enables real-time characterization and/or identification.
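A minimal from-scratch sketch of the PCA matching idea (not the patented system's libraries or preprocessing): build a principal-component basis from the spectral library once, then identify an unknown spectrum with a single projection and a nearest-neighbour search in PC space. The power-iteration PCA and the toy data shapes are illustrative assumptions:

```python
# Toy PCA-based spectral matching: reduce the library to a few principal
# components offline, then identify an unknown spectrum with one projection.

def _matvec(m, v):
    return [sum(row[j] * v[j] for j in range(len(v))) for row in m]

def _top_component(cov, iters=200):
    """Power iteration for the dominant eigenvector of a symmetric matrix."""
    v = [1.0 + 0.1 * i for i in range(len(cov))]
    for _ in range(iters):
        w = _matvec(cov, v)
        norm = sum(x * x for x in w) ** 0.5 + 1e-12
        v = [x / norm for x in w]
    return v

def pca_basis(spectra, n_components=2):
    """Mean spectrum and leading principal components of a spectral library."""
    n, d = len(spectra), len(spectra[0])
    mean = [sum(s[j] for s in spectra) / n for j in range(d)]
    x = [[s[j] - mean[j] for j in range(d)] for s in spectra]
    cov = [[sum(r[i] * r[j] for r in x) / n for j in range(d)]
           for i in range(d)]
    basis = []
    for _ in range(n_components):
        v = _top_component(cov)
        cv = _matvec(cov, v)
        lam = sum(v[i] * cv[i] for i in range(d))
        basis.append(v)
        # Deflate the found component so the next pass yields the next PC.
        cov = [[cov[i][j] - lam * v[i] * v[j] for j in range(d)]
               for i in range(d)]
    return mean, basis

def project(spectrum, mean, basis):
    centred = [spectrum[j] - mean[j] for j in range(len(spectrum))]
    return [sum(b[j] * centred[j] for j in range(len(centred))) for b in basis]

def identify(unknown, library, mean, basis):
    """Index of the library spectrum nearest the unknown in PC space."""
    target = project(unknown, mean, basis)
    dists = [sum((a - b) ** 2 for a, b in zip(project(s, mean, basis), target))
             for s in library]
    return dists.index(min(dists))
```

Once `mean, basis = pca_basis(library)` has been computed, each identification costs only one projection plus a distance scan, which is the source of the real-time claim above.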

  13. Signatures of mutational processes in human cancer

    PubMed Central

Alexandrov, Ludmil B.; Nik-Zainal, Serena; Wedge, David C.; Aparicio, Samuel A.J.R.; Behjati, Sam; Biankin, Andrew V.; Bignell, Graham R.; Bolli, Niccolo; Borg, Åke; Børresen-Dale, Anne-Lise; Boyault, Sandrine; Burkhardt, Birgit; Butler, Adam P.; Caldas, Carlos; Davies, Helen R.; Desmedt, Christine; Eils, Roland; Eyfjörd, Jórunn Erla; Foekens, John A.; Greaves, Mel; Hosoda, Fumie; Hutter, Barbara; Ilicic, Tomislav; Imbeaud, Sandrine; Imielinski, Marcin; Jäger, Natalie; Jones, David T.W.; Jones, David; Knappskog, Stian; Kool, Marcel; Lakhani, Sunil R.; López-Otín, Carlos; Martin, Sancha; Munshi, Nikhil C.; Nakamura, Hiromi; Northcott, Paul A.; Pajic, Marina; Papaemmanuil, Elli; Paradiso, Angelo; Pearson, John V.; Puente, Xosé S.; Raine, Keiran; Ramakrishna, Manasa; Richardson, Andrea L.; Richter, Julia; Rosenstiel, Philip; Schlesner, Matthias; Schumacher, Ton N.; Span, Paul N.; Teague, Jon W.; Totoki, Yasushi; Tutt, Andrew N.J.; Valdés-Mas, Rafael; van Buuren, Marit M.; van 't Veer, Laura; Vincent-Salomon, Anne; Waddell, Nicola; Yates, Lucy R.; Zucman-Rossi, Jessica; Futreal, P. Andrew; McDermott, Ultan; Lichter, Peter; Meyerson, Matthew; Grimmond, Sean M.; Siebert, Reiner; Campo, Elías; Shibata, Tatsuhiro; Pfister, Stefan M.; Campbell, Peter J.; Stratton, Michael R.

    2013-01-01

    All cancers are caused by somatic mutations. However, understanding of the biological processes generating these mutations is limited. The catalogue of somatic mutations from a cancer genome bears the signatures of the mutational processes that have been operative. Here, we analysed 4,938,362 mutations from 7,042 cancers and extracted more than 20 distinct mutational signatures. Some are present in many cancer types, notably a signature attributed to the APOBEC family of cytidine deaminases, whereas others are confined to a single class. Certain signatures are associated with age of the patient at cancer diagnosis, known mutagenic exposures or defects in DNA maintenance, but many are of cryptic origin. In addition to these genome-wide mutational signatures, hypermutation localized to small genomic regions, kataegis, is found in many cancer types. The results reveal the diversity of mutational processes underlying the development of cancer with potential implications for understanding of cancer etiology, prevention and therapy. PMID:23945592
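The signature extraction behind this analysis is based on non-negative matrix factorization (NMF) in the authors' published framework; a toy Lee-Seung multiplicative-update sketch of that decomposition, factoring a mutation-count matrix V (mutation contexts × samples) into signatures W and exposures H, might look like this (the matrix sizes, initialization, and iteration count are illustrative, not the study's pipeline):

```python
import random

def nmf(v, rank, iters=400, seed=0):
    """Toy NMF: factor a non-negative matrix V (contexts x samples) as W x H
    using Lee-Seung multiplicative updates for the Frobenius objective."""
    rng = random.Random(seed)
    n, m = len(v), len(v[0])
    w = [[rng.random() + 0.1 for _ in range(rank)] for _ in range(n)]
    h = [[rng.random() + 0.1 for _ in range(m)] for _ in range(rank)]
    eps = 1e-9

    def recon():
        return [[sum(w[i][k] * h[k][j] for k in range(rank))
                 for j in range(m)] for i in range(n)]

    for _ in range(iters):
        wh = recon()
        for k in range(rank):          # H <- H * (W^T V) / (W^T W H)
            for j in range(m):
                num = sum(w[i][k] * v[i][j] for i in range(n))
                den = sum(w[i][k] * wh[i][j] for i in range(n)) + eps
                h[k][j] *= num / den
        wh = recon()
        for i in range(n):             # W <- W * (V H^T) / (W H H^T)
            for k in range(rank):
                num = sum(v[i][j] * h[k][j] for j in range(m))
                den = sum(wh[i][j] * h[k][j] for j in range(m)) + eps
                w[i][k] *= num / den
    return w, h
```

The multiplicative form keeps W and H non-negative throughout, which is what lets the columns of W be read as additive mutational signatures and the rows of H as per-sample exposures.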

  14. Going Online the MI Way.

    ERIC Educational Resources Information Center

    Feldt, Jill

    This booklet describes online searching using Materials Information, a metallurgy and metals science information service of the Institute of Metals in London and ASM International in Cleveland, Ohio, which is available through the major online vendors. Described in detail are online searching, online databases, costs, online hosts or vendors,…

  15. Development of advanced seal verification

    NASA Technical Reports Server (NTRS)

    Workman, Gary L.; Kosten, Susan E.; Abushagur, Mustafa A.

    1992-01-01

The purpose of this research is to develop a technique to monitor and ensure seal integrity with a sensor that has no active elements to burn out during a long-duration activity, such as a leakage test or, especially, a mission in space. The original concept proposed is that by implementing fiber optic sensors, changes in the integrity of a seal can be monitored in real time, and at no time should the optical fiber sensor fail. The electrical components which provide optical excitation and detection through the fiber are not part of the seal; hence, if these electrical components fail, they can be easily changed without breaking the seal. The optical connections required for the concept to work do present a functional problem to work out. The utility of the optical fiber sensor for seal monitoring should be general enough that the degradation of a seal can be determined before catastrophic failure occurs and appropriate action taken. Two parallel efforts were performed in determining the feasibility of using optical fiber sensors for seal verification. In one study, interferometric measurements of the mechanical response of the optical fiber sensors to seal integrity were investigated. In a second study, the optical fiber was installed in a typical vacuum chamber, and feasibility studies on microbend experiments in the vacuum chamber were performed. Also, an attempt was made to quantify the amount of pressure actually being applied to the optical fiber, using finite element analysis software by Algor.

  16. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.

  17. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and to provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor widely appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  18. Dynamic characteristics of signatures: effects of writer style on genuine and simulated signatures.

    PubMed

    Mohammed, Linton; Found, Bryan; Caligiuri, Michael; Rogers, Doug

    2015-01-01

    The aims of this study were to determine if computer-measured dynamic features (duration, size, velocity, jerk, and pen pressure) differ between genuine and simulated signatures. Sixty subjects (3 equal groups of 3 signature styles) each provided 10 naturally written (genuine) signatures. Each of these subjects then provided 15 simulations of each of three model signatures. The genuine (N = 600) and simulated (N = 2700) signatures were collected using a digitizing tablet. MovAlyzeR(®) software was used to estimate kinematic parameters for each pen stroke. Stroke duration, velocity, and pen pressure were found to discriminate between genuine and simulated signatures regardless of the simulator's own style of signature or the style of signature being simulated. However, there was a significant interaction between style and condition for size and jerk (a measure of smoothness). The results of this study, based on quantitative analysis and dynamic handwriting features, indicate that the style of the simulator's own signature and the style of signature being simulated can impact the characteristics of handwriting movements for simulations. Writer style characteristics might therefore need to be taken into consideration as potentially significant when evaluating signature features with a view to forming opinions regarding authenticity. PMID:25420668
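The kinematic features named above (duration, size, velocity, jerk) can be estimated from a sampled pen trajectory with plain finite differences; this is an illustrative sketch, not MovAlyzeR®'s proprietary estimators, and pen pressure is omitted since it comes directly from the tablet:

```python
# Per-stroke kinematic features from pen coordinates sampled every dt seconds,
# via successive finite differences (velocity -> acceleration -> jerk).

def stroke_features(xs, ys, dt):
    """xs, ys: pen coordinates of one stroke; dt: sampling interval (s)."""
    n = len(xs)
    duration = (n - 1) * dt
    # "Size" here is total path length of the stroke.
    path = sum(((xs[i+1] - xs[i]) ** 2 + (ys[i+1] - ys[i]) ** 2) ** 0.5
               for i in range(n - 1))
    vx = [(xs[i+1] - xs[i]) / dt for i in range(n - 1)]
    vy = [(ys[i+1] - ys[i]) / dt for i in range(n - 1)]
    speed = [(a * a + b * b) ** 0.5 for a, b in zip(vx, vy)]
    mean_speed = sum(speed) / len(speed)
    ax = [(vx[i+1] - vx[i]) / dt for i in range(len(vx) - 1)]
    ay = [(vy[i+1] - vy[i]) / dt for i in range(len(vy) - 1)]
    jx = [(ax[i+1] - ax[i]) / dt for i in range(len(ax) - 1)]
    jy = [(ay[i+1] - ay[i]) / dt for i in range(len(ay) - 1)]
    mean_jerk = (sum((a * a + b * b) ** 0.5 for a, b in zip(jx, jy)) / len(jx)
                 if jx else 0.0)
    return {"duration": duration, "size": path,
            "mean_velocity": mean_speed, "mean_jerk": mean_jerk}
```

A simulated signature drawn slowly and corrected mid-stroke would show longer duration, lower mean velocity, and higher jerk (less smoothness) than a fluent genuine one, which is the discrimination the study reports.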

  19. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... signature components and controls. (a) Electronic signatures that are not based upon biometrics shall: (1... signatures based upon biometrics shall be designed to ensure that they cannot be used by anyone other...

  20. Online Knowledge Communities.

    ERIC Educational Resources Information Center

    de Vries, Sjoerd; Bloemen, Paul; Roossink, Lonneke

    This paper describes the concept of online knowledge communities. The concept is defined, and six qualities of online communities are identified: members (user roles are clearly defined); mission (generally accepted goal-statement, ideas, beliefs, etc.); commitment (members give their loyalty to the mission); social interaction (frequent

  1. Accessible Online Learning

    ERIC Educational Resources Information Center

    Case, D. Elizabeth; Davidson, Roseanna C.

    2011-01-01

    The number of online courses offered at the postsecondary level is increasing at a rate greater than the increase in overall higher education enrollment, with approximately one of every four higher education students taking at least one course online (Allen and Seaman, 2009). In 2008, students with disabilities represented nearly 11% of all

  2. Adolescent Online Victimization

    ERIC Educational Resources Information Center

    Young, Adena; Young, Atrisha; Fullwood, Harry

    2007-01-01

    Online victimization is a concern among many who work with youth. This article reviews the latest research on online victimization then promotes honest dialogue, personal responsibility of the youth, and proper reporting actions as strategies to reduce this type of victimization. (Contains 1 figure and 1 table.)

  3. Online Learning for Teachers.

    ERIC Educational Resources Information Center

    Kesner, Rebecca J., Ed.

    2001-01-01

    This newsletter contains two articles on teacher use of educational technology. The first article, "Online Learning for Teachers," (Stephen G. Barkley) explains that online learning has the ability to multiply both the effectiveness and efficiency of traditional onsite training by eliminating the need for travel. It describes the five components

  4. Children's Online Privacy.

    ERIC Educational Resources Information Center

    Aidman, Amy

    2000-01-01

    The first federal Internet privacy law (the Children's Online Privacy Protection Act) provides safeguards for children by regulating collection of their personal information. Unfortunately, teens are not protected. Legislation is pending to protect children from online marketers such as ZapMe! Interactive technologies require constant vigilance.

  5. Serving the Online Learner

    ERIC Educational Resources Information Center

    Boettcher, Judith V.

    2007-01-01

    Systems and services for recruiting, advising, and support of online students have seldom been at the top of the list when planning online and distance learning programs. That is now changing: Forces pushing advising and support services into the foreground include recognition of the student learner as "customer" and the increasing expectations

  6. Facilitating Online Discussions Effectively

    ERIC Educational Resources Information Center

    Rovai, Alfred P.

    2007-01-01

    This article presents a synthesis of the theoretical and research literature on facilitating asynchronous online discussions effectively. Online courses need to be designed so that they provide motivation for students to engage in productive discussions and clearly describe what is expected, perhaps in the form of a discussion rubric.…

  7. Online Videoconferencing Products: Update

    ERIC Educational Resources Information Center

    Burton, Douglas; Kitchen, Tim

    2011-01-01

    Software allowing real-time online video connectivity is rapidly evolving. The ability to connect students, staff, and guest speakers instantaneously carries great benefits for the online distance education classroom. This evaluation report compares four software applications at opposite ends of the cost spectrum: "DimDim", "Elluminate VCS",

  9. Classroom versus Online Assessment

    ERIC Educational Resources Information Center

    Spivey, Michael F.; McMillan, Jeffrey J.

    2014-01-01

    The authors examined students' effort and performance using online versus traditional classroom testing procedures. The instructor and instructional methodology were the same in different sections of an introductory finance class. Only the procedure in which students were tested--online versus in the classroom--differed. The authors measured

  10. On-Line Nutrition.

    ERIC Educational Resources Information Center

    Kongshem, Lars

    1995-01-01

    Several sources of nutrition information are available on the Internet. Good online sources include the U.S. Department of Agriculture Food and Consumer Service bulletin board, the Food and Drug Administration's Center for Food Safety and Applied Nutrition, and the IFIC (International Food Information Council) Foundation On-Line. E-mail addresses…

  11. Assessing Online Learning.

    ERIC Educational Resources Information Center

    Wagner, June G.

    2001-01-01

    This document contains three articles devoted to assessing online learning. "Online Learning: A Digital Revolution" profiles innovative World Wide Web-based programs at high schools and colleges in the United States and worldwide and discusses the following topics: new demographic realities; the need for continuous (lifelong) learning; and

  12. Online Collaboration: Curriculum Unbound!

    ERIC Educational Resources Information Center

    Waters, John K.

    2007-01-01

    Freed from the nuisances of paper-based methods, districts are making creative use of digital tools to move their curricular documents online, where educators can collaborate on course development and lesson planning. Back in 2003, Amarillo Independent School District (Texas) had begun using the Blackboard Content System to provide lessons online.

  13. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    PubMed Central

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-01-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in the clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to study plasma metabolite profiling from 57 HAPE and 57 control subjects. 14 differential plasma metabolites responsible for the discrimination between the two groups from discovery set (35 HAPE subjects and 35 healthy controls) were identified. Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE was evaluated, where the area under the receiver operating characteristic curve (AUC) was 0.981 and 0.942 in the discovery set and the validation set (22 HAPE subjects and 22 healthy controls), respectively. Taken together, these results suggested that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples. PMID:26459926
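The reported AUCs summarise how well the combined three-metabolite score separates HAPE subjects from controls; a minimal rank-based (Mann-Whitney) AUC computation might look like this (the scores and labels in the example are illustrative, not the study's data):

```python
# Area under the ROC curve as the probability that a randomly chosen positive
# case scores higher than a randomly chosen negative case (ties count half).

def roc_auc(scores, labels):
    """scores: panel/classifier scores; labels: 1 = case (HAPE), 0 = control."""
    pos = [s for s, l in zip(scores, labels) if l == 1]
    neg = [s for s, l in zip(scores, labels) if l == 0]
    wins = 0.0
    for p in pos:
        for q in neg:
            if p > q:
                wins += 1.0
            elif p == q:
                wins += 0.5
    return wins / (len(pos) * len(neg))
```

An AUC of 1.0 means every case outscores every control; 0.5 is chance-level discrimination, so values like the study's 0.981 indicate near-perfect separation on that set.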

  14. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    NASA Astrophysics Data System (ADS)

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-10-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in the clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to study plasma metabolite profiling from 57 HAPE and 57 control subjects. 14 differential plasma metabolites responsible for the discrimination between the two groups in the discovery set (35 HAPE subjects and 35 healthy controls) were identified. Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE was evaluated, where the area under the receiver operating characteristic curve (AUC) was 0.981 and 0.942 in the discovery set and the validation set (22 HAPE subjects and 22 healthy controls), respectively. Taken together, these results suggested that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples.

  15. Online Advertising in Social Networks

    NASA Astrophysics Data System (ADS)

    Bagherjeiran, Abraham; Bhatt, Rushi P.; Parekh, Rajesh; Chaoji, Vineet

    Online social networks offer opportunities to analyze user behavior and social connectivity and leverage resulting insights for effective online advertising. This chapter focuses on the role of social network information in online display advertising.

  16. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options to sensitive problems and to address other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  17. DNA Methylation Signature of Childhood Chronic Physical Aggression in T Cells of Both Men and Women

    PubMed Central

    Guillemin, Claire; Provençal, Nadine; Suderman, Matthew; Côté, Sylvana M.; Vitaro, Frank; Hallett, Michael; Tremblay, Richard E.; Szyf, Moshe

    2014-01-01

    Background High frequency of physical aggression is the central feature of severe conduct disorder and is associated with a wide range of social, mental and physical health problems. We have previously tested the hypothesis that differential DNA methylation signatures in peripheral T cells are associated with a chronic aggression trajectory in males. Despite the fact that sex differences appear to play a pivotal role in determining the development, magnitude and frequency of aggression, most previous studies focused on males, so little is known about female chronic physical aggression. We therefore tested here whether or not there is a signature of physical aggression in female DNA methylation and, if there is, how it relates to the signature observed in males. Methodology/Principal Findings Methylation profiles were created using the method of methylated DNA immunoprecipitation (MeDIP) followed by microarray hybridization and statistical and bioinformatic analyses on T cell DNA obtained from adult women who were found to be on a chronic physical aggression trajectory (CPA) between 6 and 12 years of age compared to women who followed a normal physical aggression trajectory. We confirmed the existence of a well-defined, genome-wide signature of DNA methylation associated with chronic physical aggression in the peripheral T cells of adult females that includes many of the genes similarly associated with physical aggression in the same cell types of adult males. Conclusions This study in a small number of women presents preliminary evidence for a genome-wide variation in promoter DNA methylation that associates with CPA in women that warrants larger studies for further verification. A significant proportion of these associations were previously observed in men with CPA, supporting the hypothesis that the epigenetic signature of early life aggression in females is composed of a component specific to females and another common to both males and females. PMID:24475181

  18. Molecular signatures of vaccine adjuvants.

    PubMed

    Olafsdottir, Thorunn; Lindqvist, Madelene; Harandi, Ali M

    2015-09-29

    Mass vaccination has saved millions of human lives and improved the quality of life in both developing and developed countries. The emergence of new pathogens and the inadequate protection conferred by some of the existing vaccines, such as those for tuberculosis, influenza and pertussis, especially in certain age groups, have resulted in a move from empirically developed vaccines toward more pathogen-tailored and rationally engineered vaccines. A deeper understanding of the interaction of innate and adaptive immunity at the molecular level enables the development of vaccines that selectively target certain types of immune responses without excessive reactogenicity. Adjuvants constitute an imperative element of modern vaccines. Although a variety of candidate adjuvants have been evaluated in the past few decades, only a limited number of vaccine adjuvants are currently available for human use. A better understanding of the mode of action of adjuvants is pivotal to harness the potential of existing and new adjuvants in shaping a desired immune response. Recent advances in systems biology, powered by emerging cutting-edge omics technologies, have led to the identification of molecular signatures rapidly induced after vaccination in the blood that correlate with and predict a later protective immune response or vaccine safety. This can pave the way to prospectively determining the potency and safety of vaccines and adjuvants. This review is intended to highlight the importance of big data analysis in advancing our understanding of the mechanisms of action of adjuvants to inform rational development of future human vaccines. PMID:25989447

  19. Molecular Signatures of Major Depression

    PubMed Central

    Cai, Na; Chang, Simon; Li, Yihan; Li, Qibin; Hu, Jingchu; Liang, Jieqin; Song, Li; Kretzschmar, Warren; Gan, Xiangchao; Nicod, Jerome; Rivera, Margarita; Deng, Hong; Du, Bo; Li, Keqing; Sang, Wenhu; Gao, Jingfang; Gao, Shugui; Ha, Baowei; Ho, Hung-Yao; Hu, Chunmei; Hu, Jian; Hu, Zhenfei; Huang, Guoping; Jiang, Guoqing; Jiang, Tao; Jin, Wei; Li, Gongying; Li, Kan; Li, Yi; Li, Yingrui; Li, Youhui; Lin, Yu-Ting; Liu, Lanfen; Liu, Tiebang; Liu, Ying; Liu, Yuan; Lu, Yao; Lv, Luxian; Meng, Huaqing; Qian, Puyi; Sang, Hong; Shen, Jianhua; Shi, Jianguo; Sun, Jing; Tao, Ming; Wang, Gang; Wang, Guangbiao; Wang, Jian; Wang, Linmao; Wang, Xueyi; Wang, Xumei; Yang, Huanming; Yang, Lijun; Yin, Ye; Zhang, Jinbei; Zhang, Kerang; Sun, Ning; Zhang, Wei; Zhang, Xiuqing; Zhang, Zhen; Zhong, Hui; Breen, Gerome; Wang, Jun; Marchini, Jonathan; Chen, Yiping; Xu, Qi; Xu, Xun; Mott, Richard; Huang, Guo-Jen; Kendler, Kenneth; Flint, Jonathan

    2015-01-01

    Summary Adversity, particularly in early life, can cause illness. Clues to the responsible mechanisms may lie with the discovery of molecular signatures of stress, some of which include alterations to an individual’s somatic genome. Here, using genome sequences from 11,670 women, we observed a highly significant association between a stress-related disease, major depression, and the amount of mtDNA (p = 9.00 × 10−42, odds ratio 1.33 [95% confidence interval [CI] = 1.29–1.37]) and telomere length (p = 2.84 × 10−14, odds ratio 0.85 [95% CI = 0.81–0.89]). While both telomere length and mtDNA amount were associated with adverse life events, conditional regression analyses showed the molecular changes were contingent on the depressed state. We tested this hypothesis with experiments in mice, demonstrating that stress causes both molecular changes, which are partly reversible and can be elicited by the administration of corticosterone. Together, these results demonstrate that changes in the amount of mtDNA and telomere length are consequences of stress and entering a depressed state. These findings identify increased amounts of mtDNA as a molecular marker of MD and have important implications for understanding how stress causes the disease. PMID:25913401
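
The odds ratios and confidence intervals reported above come from regression models. As a sketch of the arithmetic only (the coefficient and standard error below are assumed values chosen for illustration, not taken from the study), an odds ratio and its 95% Wald interval derive from a logistic-regression coefficient beta and its standard error:

```python
import math

# Sketch with hypothetical inputs: OR = exp(beta),
# 95% Wald CI = exp(beta +/- 1.96 * se).

def odds_ratio_ci(beta, se, z=1.96):
    """Return (OR, lower, upper) for a Wald confidence interval."""
    return (math.exp(beta),
            math.exp(beta - z * se),
            math.exp(beta + z * se))

beta, se = 0.285, 0.015          # assumed values for illustration
or_, lo, hi = odds_ratio_ci(beta, se)
print(f"OR = {or_:.2f} [95% CI {lo:.2f}-{hi:.2f}]")   # OR = 1.33 [95% CI 1.29-1.37]
```

The interval is symmetric on the log-odds scale, which is why the printed CI is slightly asymmetric around the odds ratio itself.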

  20. Symbolic signatures for deformable shapes.

    PubMed

    Ruiz-Correa, Salvador; Shapiro, Linda G; Meila, Marina; Berson, Gabriel; Cunningham, Michael L; Sze, Raymond W

    2006-01-01

    Recognizing classes of objects from their shape is an unsolved problem in machine vision that entails the ability of a computer system to represent and generalize complex geometrical information on the basis of a finite amount of prior data. A practical approach to this problem is particularly difficult to implement, not only because the shape variability of relevant object classes is generally large, but also because standard sensing devices used to capture the real world only provide a partial view of a scene, so there is partial information pertaining to the objects of interest. In this work, we develop an algorithmic framework for recognizing classes of deformable shapes from range data. The basic idea of our component-based approach is to generalize existing surface representations that have proven effective in recognizing specific 3D objects to the problem of object classes using our newly introduced symbolic-signature representation that is robust to deformations, as opposed to a numeric representation that is often tied to a specific shape. Based on this approach, we present a system that is capable of recognizing and classifying a variety of object shape classes from range data. We demonstrate our system in a series of large-scale experiments that were motivated by specific applications in scene analysis and medical diagnosis. PMID:16402621

  1. (abstract) Topographic Signatures in Geology

    NASA Technical Reports Server (NTRS)

    Farr, Tom G.; Evans, Diane L.

    1996-01-01

    Topographic information is required for many Earth Science investigations. For example, topography is an important element in regional and global geomorphic studies because it reflects the interplay between the climate-driven processes of erosion and the tectonic processes of uplift. A number of techniques have been developed to analyze digital topographic data, including Fourier texture analysis. A Fourier transform of the topography of an area allows the spatial frequency content of the topography to be analyzed. Band-pass filtering of the transform produces images representing the amplitude of different spatial wavelengths. These are then used in a multi-band classification to map units based on their spatial frequency content. The results using a radar image instead of digital topography showed good correspondence to a geologic map; however, brightness variations in the image unrelated to topography caused errors. An additional benefit of the use of Fourier band-pass images for classification is that the textural signatures of the units are quantitative measures of the spatial characteristics of the units that may be used to map similar units in similar environments.
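
The band-pass step can be sketched in one dimension (a real analysis would apply a 2-D FFT to the DEM with radial frequency masks; the profile below is synthetic and the band limits are arbitrary):

```python
import cmath
import math

# Sketch of Fourier band-pass texture extraction on a 1-D elevation
# profile: transform, zero all bins outside a spatial-frequency band,
# invert, and take the amplitude of the remaining signal.

def dft(x):
    N = len(x)
    return [sum(x[n] * cmath.exp(-2j * math.pi * k * n / N) for n in range(N))
            for k in range(N)]

def idft(X):
    N = len(X)
    return [sum(X[k] * cmath.exp(2j * math.pi * k * n / N) for k in range(N)) / N
            for n in range(N)]

def band_pass(x, k_lo, k_hi):
    """Keep only bins whose spatial frequency min(k, N-k) lies in the band."""
    X = dft(x)
    N = len(X)
    kept = [X[k] if k_lo <= min(k, N - k) <= k_hi else 0 for k in range(N)]
    return [abs(v) for v in idft(kept)]

# Synthetic profile: long-wavelength ridge (k=1) plus fine texture (k=8).
N = 64
profile = [10 * math.cos(2 * math.pi * n / N) +
           2 * math.cos(2 * math.pi * 8 * n / N) for n in range(N)]

texture = band_pass(profile, 6, 10)   # isolates the k=8 component
print(round(max(texture), 3))         # ~2.0, the short-wavelength amplitude
```

Images of such band amplitudes at several wavelength ranges would then feed a multi-band classifier, as the abstract describes.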

  2. Molecular signatures of major depression.

    PubMed

    Cai, Na; Chang, Simon; Li, Yihan; Li, Qibin; Hu, Jingchu; Liang, Jieqin; Song, Li; Kretzschmar, Warren; Gan, Xiangchao; Nicod, Jerome; Rivera, Margarita; Deng, Hong; Du, Bo; Li, Keqing; Sang, Wenhu; Gao, Jingfang; Gao, Shugui; Ha, Baowei; Ho, Hung-Yao; Hu, Chunmei; Hu, Jian; Hu, Zhenfei; Huang, Guoping; Jiang, Guoqing; Jiang, Tao; Jin, Wei; Li, Gongying; Li, Kan; Li, Yi; Li, Yingrui; Li, Youhui; Lin, Yu-Ting; Liu, Lanfen; Liu, Tiebang; Liu, Ying; Liu, Yuan; Lu, Yao; Lv, Luxian; Meng, Huaqing; Qian, Puyi; Sang, Hong; Shen, Jianhua; Shi, Jianguo; Sun, Jing; Tao, Ming; Wang, Gang; Wang, Guangbiao; Wang, Jian; Wang, Linmao; Wang, Xueyi; Wang, Xumei; Yang, Huanming; Yang, Lijun; Yin, Ye; Zhang, Jinbei; Zhang, Kerang; Sun, Ning; Zhang, Wei; Zhang, Xiuqing; Zhang, Zhen; Zhong, Hui; Breen, Gerome; Wang, Jun; Marchini, Jonathan; Chen, Yiping; Xu, Qi; Xu, Xun; Mott, Richard; Huang, Guo-Jen; Kendler, Kenneth; Flint, Jonathan

    2015-05-01

    Adversity, particularly in early life, can cause illness. Clues to the responsible mechanisms may lie with the discovery of molecular signatures of stress, some of which include alterations to an individual's somatic genome. Here, using genome sequences from 11,670 women, we observed a highly significant association between a stress-related disease, major depression, and the amount of mtDNA (p = 9.00 × 10(-42), odds ratio 1.33 [95% confidence interval [CI] = 1.29-1.37]) and telomere length (p = 2.84 × 10(-14), odds ratio 0.85 [95% CI = 0.81-0.89]). While both telomere length and mtDNA amount were associated with adverse life events, conditional regression analyses showed the molecular changes were contingent on the depressed state. We tested this hypothesis with experiments in mice, demonstrating that stress causes both molecular changes, which are partly reversible and can be elicited by the administration of corticosterone. Together, these results demonstrate that changes in the amount of mtDNA and telomere length are consequences of stress and entering a depressed state. These findings identify increased amounts of mtDNA as a molecular marker of MD and have important implications for understanding how stress causes the disease. PMID:25913401

  3. Chemical Signatures in Dwarf Galaxies

    NASA Astrophysics Data System (ADS)

    Venn, Kim A.; Hill, Vanessa M.

    2008-12-01

    Chemical signatures in dwarf galaxies describe the examination of specific elemental abundance ratios to investigate the formation and evolution of dwarf galaxies, particularly when compared with the variety of stellar populations in the Galaxy. Abundance ratios can come from HII region emission lines, planetary nebulae, or supernova remnants, but mostly they come from stars. Stars can live a very long time: a 0.8 M☉ star born at the time of the Big Bang would only now be ascending the red giant branch, and if its quiescent main-sequence lifetime has been largely uneventful, its surface chemistry may still resemble its natal chemistry. Detailed abundances of stars in dwarf galaxies can be used to reconstruct their chemical evolution, which we now find to be distinct from any other component of the Galaxy, questioning the assertion that dwarf galaxies like these built up the Galaxy. Potential solutions to reconciling dwarf galaxy abundances and Galaxy formation models include the timescale for significant merging and the possibility of uncovering different stellar populations in the new ultra-faint dwarfs.

  4. Microwave Signatures of Inundation Area

    NASA Astrophysics Data System (ADS)

    Takbiri, Zeinab; Ebtehaj, Ardeshir M.; Foufoula-Georgiou, Efi

    2015-12-01

    The goal of this research is to use the inundation data available from 250 m Moderate-Resolution Imaging Spectroradiometer (MODIS) Near Real Time (NRT) as supplementary data to quantify the signature of different levels of intra-monthly percentages of water-cover on spectral channels of the Tropical Rainfall Measuring Mission Microwave Imager (TMI). The analysis relies on an unsupervised clustering methodology, which uses information content of the magnitudes and polarization differences of all TMI channels. This clustering method is applied to the monthly averages of the spectral brightness temperatures in calendar year 2013, focusing on two important deltaic regions: Mekong and Ganges-Brahmaputra-Meghna. The results illustrate that during the non-raining condition, the high-frequency brightness temperatures at 85V,H GHz can be used as a surrogate of inundation, while for the "all-sky" condition, the low-frequency channel of 10 GHz is the one that is hardly influenced by atmospheric constituents but is not very suitable for the detection of small-feature inundation because of its coarse resolution. The monthly inundation dynamics of the two aforementioned delta regions are demonstrated on each frequency channel separately and on the clusters of all TMI spectral channels for non-rainy and "all-sky" scenes. The mean and variance of the identified clusters are used to detect the inundated percentage in another deltaic region, the Yellow River delta, while the MODIS NRT data are exploited to characterize the error of inundation mapping.
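
The unsupervised-clustering step can be sketched with plain k-means on two spectral features, such as an 85-GHz brightness temperature and its V-H polarization difference (the pixel values below are invented; the study used all TMI channels and their polarization differences):

```python
# Minimal k-means sketch, pure stdlib: group pixels by brightness-
# temperature features. Water is typically cold and strongly polarized;
# land is warm and weakly polarized. Numbers are hypothetical.

def kmeans(points, centers, iters=20):
    for _ in range(iters):
        groups = [[] for _ in centers]
        for p in points:
            i = min(range(len(centers)),
                    key=lambda c: sum((a - b) ** 2
                                      for a, b in zip(p, centers[c])))
            groups[i].append(p)
        centers = [tuple(sum(col) / len(g) for col in zip(*g)) if g else c
                   for g, c in zip(groups, centers)]
    return centers, groups

# (Tb_85V in K, Tb_85V - Tb_85H in K)
pixels = [(210, 35), (215, 32), (220, 30),    # inundated pixels
          (275, 4), (280, 5), (282, 3)]       # dry-land pixels
centers, groups = kmeans(pixels, [(200, 30), (290, 5)])
print(centers)
```

The cluster means and variances would then serve, as in the abstract, to label inundation percentages in a region not used for training.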

  5. Signature geometry and quantum engineering

    NASA Astrophysics Data System (ADS)

    Samociuk, Stefan

    2013-09-01

    As the operating frequency of electromagnetic-based devices increases, physical design geometry is playing an ever more important role. Evidence is considered in support of a relationship between the dimensionality of primitive geometric forms, such as transistors, and corresponding electromagnetic coupling efficiency. The industry of electronics is defined as the construction of devices by the patterning of primitive forms onto physical materials. Examples are given to show that the evolution of these primitives, down to nanoscales, requires exacting geometry and three-dimensional content. Consideration of microwave monolithic integrated circuits (MMIC), photonics and metamaterials (MM) supports this trend and also adds new requirements of strict geometric periodicity and multiplicity. Signature geometries (SG) are characterized by distinctive attributes, and examples are given. The transcendent form transcode algorithm (TTA) is introduced as a multidimensional SG, and its use in designing photonic integrated circuits and metamaterials is discussed. A Creative Commons licensed research database, TRANSFORM, containing TTA geometries in OASIS file formats is described. An experimental methodology for using the database is given. Multidimensional SG and the extraction of three-dimensional cross sections as primitive forms are discussed as a foundation for quantum engineering and the exploitation of phenomena other than the electromagnetic.

  6. Multifractal signatures of infectious diseases.

    PubMed

    Holdsworth, Amber M; Kevlahan, Nicholas K-R; Earn, David J D

    2012-09-01

    Incidence of infection time-series data for the childhood diseases measles, chicken pox, rubella and whooping cough are described in the language of multifractals. We explore the potential of using the wavelet transform maximum modulus (WTMM) method to characterize the multiscale structure of the observed time series and of simulated data generated by the stochastic susceptible-exposed-infectious-recovered (SEIR) epidemic model. The singularity spectra of the observed time series suggest that each disease is characterized by a unique multifractal signature, which distinguishes that particular disease from the others. The wavelet scaling functions confirm that the time series of measles, rubella and whooping cough are clearly multifractal, while chicken pox has a more monofractal structure in time. The stochastic SEIR epidemic model is unable to reproduce the qualitative singularity structure of the reported incidence data: it is too smooth and does not appear to have a multifractal singularity structure. The precise reasons for the failure of the SEIR epidemic model to reproduce the correct multiscale structure of the reported incidence data remain unclear. PMID:22442094
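
A deterministic skeleton of the SEIR model named above can be written in a few lines (the paper's version is stochastic; the rates and initial conditions here are illustrative, not fitted to any disease):

```python
# Deterministic SEIR sketch with forward-Euler steps:
# dS/dt = -beta*S*I/N, dE/dt = beta*S*I/N - sigma*E,
# dI/dt = sigma*E - gamma*I, dR/dt = gamma*I.

def seir(S, E, I, R, beta, sigma, gamma, dt, steps):
    N = S + E + I + R
    traj = [(S, E, I, R)]
    for _ in range(steps):
        new_exp = beta * S * I / N * dt   # S -> E transitions this step
        new_inf = sigma * E * dt          # E -> I
        new_rec = gamma * I * dt          # I -> R
        S -= new_exp
        E += new_exp - new_inf
        I += new_inf - new_rec
        R += new_rec
        traj.append((S, E, I, R))
    return traj

traj = seir(S=9990, E=0, I=10, R=0,
            beta=0.9, sigma=1 / 8, gamma=1 / 5,   # made-up rates
            dt=0.1, steps=2000)
peak_I = max(I for _, _, I, _ in traj)
print(round(peak_I))
```

A stochastic version replaces each per-step flow with a random draw (e.g., binomial), which is what gives the simulated incidence series whose multifractal structure the paper compares against data.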

  7. Direct verification of warped hierarchy-and-flavor models

    SciTech Connect

    Davoudiasl, Hooman; Soni, Amarjit; Rizzo, Thomas G.

    2008-02-01

    We consider direct experimental verification of warped models, based on the Randall-Sundrum (RS) scenario, that explain gauge and flavor hierarchies, assuming that the gauge fields and fermions of the standard model (SM) propagate in the 5D bulk. Most studies have focused on the bosonic Kaluza-Klein (KK) signatures and indicate that discovering gauge KK modes is likely possible, yet challenging, while graviton KK modes are unlikely to be accessible at the CERN LHC, even with a luminosity upgrade. We show that direct evidence for bulk SM fermions, i.e. their KK modes, is likely also beyond the reach of a luminosity-upgraded LHC. Thus, neither the spin-2 KK graviton, the most distinct RS signal, nor the KK SM fermions, direct evidence for bulk flavor, seem to be within the reach of the LHC. We then consider hadron colliders with √s = 21, 28, and 60 TeV. We find that discovering the first KK modes of SM fermions and the graviton typically requires the Next Hadron Collider (NHC) with √s ≈ 60 TeV and O(1) ab⁻¹ of integrated luminosity. If the LHC yields hints of these warped models, establishing that nature is described by them, or their 4D conformal field theory duals, requires an NHC-class machine in the post-LHC experimental program.

  8. On Direct Verification of Warped Hierarchy-and-Flavor Models

    SciTech Connect

    Davoudiasl, Hooman; Rizzo, Thomas G.; Soni, Amarjit

    2007-10-15

    We consider direct experimental verification of warped models, based on the Randall-Sundrum (RS) scenario, that explain gauge and flavor hierarchies, assuming that the gauge fields and fermions of the Standard Model (SM) propagate in the 5D bulk. Most studies have focused on the bosonic Kaluza-Klein (KK) signatures and indicate that discovering gauge KK modes is likely possible, yet challenging, while graviton KK modes are unlikely to be accessible at the LHC, even with a luminosity upgrade. We show that direct evidence for bulk SM fermions, i.e. their KK modes, is likely also beyond the reach of a luminosity-upgraded LHC. Thus, neither the spin-2 KK graviton, the most distinct RS signal, nor the KK SM fermions, direct evidence for bulk flavor, seem to be within the reach of the LHC. We then consider hadron colliders with √s = 21, 28, and 60 TeV. We find that discovering the first KK modes of SM fermions and the graviton typically requires the Next Hadron Collider (NHC) with √s ≈ 60 TeV and O(1) ab⁻¹ of integrated luminosity. If the LHC yields hints of these warped models, establishing that Nature is described by them, or their 4D CFT duals, requires an NHC-class machine in the post-LHC experimental program.

  9. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.
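
One automated check over such a directed failure-propagation graph can be sketched as follows (the model fragment, node names, and the specific check are invented for illustration; the NASA tools verify many more properties):

```python
# Sketch of an automated FFM consistency check: verify that every
# failure-mode node can reach at least one monitored effect through
# the propagation graph, flagging dead-end paths for manual review.

def reachable(graph, start):
    """All nodes reachable from `start` by depth-first search."""
    seen, stack = set(), [start]
    while stack:
        n = stack.pop()
        if n in seen:
            continue
        seen.add(n)
        stack.extend(graph.get(n, []))
    return seen

# Hypothetical model fragment; edges point along effect propagation.
graph = {
    "valve_stuck": ["low_flow"],
    "low_flow": ["pump_cavitation"],
    "pump_cavitation": ["pressure_sensor_low"],   # monitored effect
    "seal_leak": [],                              # dead end: likely a modeling error
}
monitored = {"pressure_sensor_low"}
failure_modes = ["valve_stuck", "seal_leak"]

unverified = [f for f in failure_modes
              if not (reachable(graph, f) & monitored)]
print(unverified)   # -> ['seal_leak']
```

Running such checks over every component model replaces the error-prone manual inspection the abstract describes.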

  10. Automated verification of system configuration

    NASA Astrophysics Data System (ADS)

    Andrews, W. H., Jr.; Baker, S. P.; Blalock, A. V.

    1991-05-01

    Errors in field wiring can result in significant correction costs (if the errors are discovered prior to use), in erroneous or unusable data (if the errors are not discovered in time), or in serious accidents (if the errors corrupt critical data). Detailed field wiring checkout and rework are tedious and expensive, but they are essential steps in the quality assurance process for large, complex instrumentation and control systems. A recent Oak Ridge National Laboratory (ORNL) development, the CONFiguration IDEntification System (CONFIDES), automates verification of field wiring. In CONFIDES, an identifier module is installed on or integrated into each component (e.g., sensor, actuator, cable, distribution panel) to be verified. Interrogator modules, controlled by a personal computer (PC), are installed at the connections of the field wiring to the inputs of the data acquisition and control system (DACS). Interrogator modules poll the components connected to each channel of the DACS and can determine the path taken by each channel's signal to or from the end device for that channel. The system will provide not only the identification (ID) code for the cables and patch panels in the path to a particular sensor or actuator but for individual cable conductor IDs as well. One version of the system uses existing signal wires for communications between CONFIDES modules. Another, more powerful version requires a single dedicated conductor in each cable. Both versions can operate with or without instrument power applied, and neither interferes with the normal operation of the DACS. Identifier modules can provide a variety of information including status and calibration data.
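
The checkout that such a system automates reduces to comparing, per channel, the chain of ID codes reported along the signal path against the as-designed wiring table. A minimal sketch, with entirely hypothetical channel names, ID codes, and polling interface:

```python
# Sketch of automated wiring verification: for each DACS channel,
# compare the interrogator-reported ID chain (sensor, cable, panel)
# against the as-designed table and collect mismatches.

as_designed = {
    "ch01": ["TC-101", "CBL-17", "PNL-03"],
    "ch02": ["PT-205", "CBL-22", "PNL-03"],
}

def poll_channel(channel):
    """Stand-in for the interrogator-module query (hypothetical data)."""
    readings = {
        "ch01": ["TC-101", "CBL-17", "PNL-03"],
        "ch02": ["PT-205", "CBL-29", "PNL-03"],   # wrong cable pulled
    }
    return readings[channel]

errors = {ch: (expected, got)
          for ch, expected in as_designed.items()
          if (got := poll_channel(ch)) != expected}
print(errors)   # only ch02 is flagged
```

Reporting both the expected and observed chains, as here, tells the technician exactly which element of the path (here the cable ID) diverges.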

  11. Talking Online: Reflecting on Online Communication Tools

    ERIC Educational Resources Information Center

    Greener, Susan

    2009-01-01

    Purpose: The purpose of this paper is to reflect on the value and constraints of varied online communication tools from web 2.0 to e-mail in a higher education (HE) teaching and learning context, where these tools are used to support or be the main focus of learning. Design/methodology/approach: A structured reflection is produced with the aid of…

  12. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    NASA Astrophysics Data System (ADS)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

    Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. The recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn have increased the number of possible application areas for biometric identity verification. The biometric data, being derived from human bodies (and especially when used to identify or verify those bodies) is considered personally identifiable information (PII). The collection, use and disclosure of biometric data (image or template) invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases by others can also be stolen, copied, or otherwise misused in ways that can materially affect the individual involved. As biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before the systems are massively deployed in security systems. Along with security, the privacy of users is also an important factor, as the constructions of lines in palmprints contain personal characteristics, from face images a person can be recognised, and fake signatures can be practised by carefully watching the signature images available in the database. We propose a cryptographic approach to encrypt the images of palmprints, faces, and signatures by an advanced Hill cipher technique for hiding the information in the images. It also provides security to these images from being attacked by the above-mentioned attacks. During feature extraction, the encrypted images are first decrypted; the features are then extracted and used for identification or verification.
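
The underlying Hill-cipher operation on image data can be sketched as follows. This shows the classic cipher, not the paper's "advanced" variant, whose details the abstract does not give; the key matrix below is an arbitrary example whose determinant (9) is coprime to 256, which is what makes it invertible modulo 256:

```python
# Hill-cipher sketch on grayscale pixel values: consecutive pixel pairs
# are multiplied by a 2x2 key matrix modulo 256. Decryption uses the
# modular inverse of the key (verified: KEY @ KEY_INV = I mod 256).

KEY = [[3, 3], [2, 5]]            # det = 9, gcd(9, 256) = 1
KEY_INV = [[29, 85], [142, 171]]  # inverse of KEY modulo 256

def hill(pixels, key):
    """Apply the 2x2 key to consecutive pixel pairs modulo 256."""
    out = []
    for i in range(0, len(pixels), 2):
        p0, p1 = pixels[i], pixels[i + 1]
        out.append((key[0][0] * p0 + key[0][1] * p1) % 256)
        out.append((key[1][0] * p0 + key[1][1] * p1) % 256)
    return out

image = [12, 200, 34, 56, 90, 255]      # toy grayscale data, even length
cipher = hill(image, KEY)
plain = hill(cipher, KEY_INV)
print(plain == image)   # -> True: encryption round-trips exactly
```

Because the round trip is exact, features extracted after decryption are identical to features extracted from the original image, which is the property the verification pipeline depends on.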

  13. Quantification, Prediction, and the Online Impact of Sentence Truth-Value: Evidence from Event-Related Potentials

    ERIC Educational Resources Information Center

    Nieuwland, Mante S.

    2016-01-01

    Do negative quantifiers like "few" reduce people's ability to rapidly evaluate incoming language with respect to world knowledge? Previous research has addressed this question by examining whether online measures of quantifier comprehension match the "final" interpretation reflected in verification judgments. However, these…

  14. Simulation-Based Verification of Livingstone Applications

    NASA Technical Reports Server (NTRS)

    Lindsey, Anthony E.; Pecheur, Charles

    2003-01-01

    AI software is viewed as a means to give greater autonomy to automated systems, capable of coping with harsh and unpredictable environments in deep space missions. Autonomous systems pose a serious challenge to traditional test-based verification approaches, because of the enormous space of possible situations that they aim to address. Before these systems are put in control of critical applications, appropriate new verification approaches need to be developed. This article describes Livingstone PathFinder (LPF), a verification tool for autonomous diagnosis applications based on NASA's Livingstone model-based diagnosis system. LPF applies state space exploration algorithms to an instrumented testbed, consisting of the Livingstone diagnosis system embedded in a simulated operating environment. The article describes different facets of LPF and reports some experimental results from applying LPF to a Livingstone model of the main propulsion feed subsystem of the X-34 space vehicle. Section 2 provides an overview of Livingstone; Section 3 describes the LPF architecture; Section 4 discusses its applicability; Section 5 reviews some experimental results; Section 6 compares LPF to related verification approaches; Section 7 draws conclusions and discusses some perspectives.
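
The core idea, exploring the reachable states of a simulated testbed and searching for a state that violates a property, can be sketched with a breadth-first search (the state names, transition model, and error predicate below are invented; LPF itself drives the real Livingstone engine and simulator):

```python
from collections import deque

# Sketch of state-space exploration over a simulated system: BFS through
# reachable states, returning the path to the first state satisfying an
# error condition (e.g., the diagnoser disagreeing with the simulator).

def transitions(state):
    """Stand-in simulator step: map each state to its successors."""
    model = {
        "nominal": ["valve_open", "valve_closed"],
        "valve_open": ["flow_ok"],
        "valve_closed": ["no_flow"],
        "flow_ok": [],
        "no_flow": ["diagnosis_mismatch"],   # injected fault goes undiagnosed
        "diagnosis_mismatch": [],
    }
    return model[state]

def find_error(start, is_error):
    queue, parent = deque([start]), {start: None}
    while queue:
        s = queue.popleft()
        if is_error(s):
            path = []
            while s is not None:
                path.append(s)
                s = parent[s]
            return path[::-1]
        for nxt in transitions(s):
            if nxt not in parent:
                parent[nxt] = s
                queue.append(nxt)
    return None

path = find_error("nominal", lambda s: s == "diagnosis_mismatch")
print(path)   # -> ['nominal', 'valve_closed', 'no_flow', 'diagnosis_mismatch']
```

Returning the full path, not just the bad state, is what makes such a tool useful: it is a reproducible scenario the diagnosis engineers can replay.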

  15. Connecting to On-line Data, a Progress Report

    NASA Astrophysics Data System (ADS)

    Eichhorn, G.; Astrophysics Datacenter Executive Committee (ADEC)

    2004-12-01

    The Astrophysics Datacenter Executive Committee (ADEC) has worked with the American Astronomical Society (AAS) and the University of Chicago Press (UChP) to implement links from the on-line literature to on-line data and vice versa. A first demonstration of this system is on-line in the Astrophysical Journal Supplement, Volume 154, Issue 1, a special issue about first results from Spitzer. Several of these on-line articles have links to on-line data. This linking system requires the collaboration of the data centers (marking data sets with unique identifiers, providing a verification system for identifiers, providing a systematic linking system to data sets), the ADS (providing a master verifier that connects the journal to the individual verifiers at the data centers, providing a linking server that allows stable links for the journals even if data sets move), and the AAS and the UChP (implementing LaTeX tags for identifiers, processing and verifying identifiers, implementing the links). Once the links are in place at the journal website, the publisher returns this information to the ADS and from there to the data centers in order to provide the data centers the information necessary to implement the opposite links from data sets to journal articles. The pipeline for this information flow is now fully in place and will be described in this poster. This work is supported by NASA under several grants.

  16. ACCRETING CIRCUMPLANETARY DISKS: OBSERVATIONAL SIGNATURES

    SciTech Connect

    Zhu, Zhaohuan

    2015-01-20

    I calculate the spectral energy distributions of accreting circumplanetary disks using atmospheric radiative transfer models. Circumplanetary disks accreting at only 10⁻¹⁰ M⊙ yr⁻¹ around a 1 M_J planet can be brighter than the planet itself. A moderately accreting circumplanetary disk (Ṁ ∼ 10⁻⁸ M⊙ yr⁻¹; enough to form a 10 M_J planet within 1 Myr) around a 1 M_J planet has a maximum temperature of ∼2000 K, and at near-infrared wavelengths (J, H, K bands) this disk is as bright as a late-M-type brown dwarf or a 10 M_J planet with a "hot start". To use direct imaging to find the accretion disks around low-mass planets (e.g., 1 M_J) and distinguish them from brown dwarfs or hot high-mass planets, it is crucial to obtain photometry at mid-infrared bands (L', M, N bands), because the emission from circumplanetary disks falls off more slowly toward longer wavelengths than that of brown dwarfs or planets. If young planets have strong magnetic fields (≳100 G), the fields may truncate slowly accreting circumplanetary disks (Ṁ ≲ 10⁻⁹ M⊙ yr⁻¹) and lead to magnetospheric accretion, which can provide additional accretion signatures, such as UV/optical excess from the accretion shock and line emission.
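
    The magnetospheric truncation invoked above follows the standard scaling used for accreting stars (a textbook relation, assumed here; the abstract does not derive it): the disk is truncated roughly where magnetic pressure balances the ram pressure of the accretion flow,

```latex
R_T \simeq \left( \frac{\mu^4}{2\, G M_p \dot{M}^2} \right)^{1/7},
\qquad \mu = B_p R_p^3 ,
```

    where M_p, R_p, and B_p are the planet's mass, radius, and surface field. A stronger field or a lower accretion rate pushes R_T outward, which is consistent with the abstract's statement that only slowly accreting disks are truncated for fields of order 100 G.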

  17. Isotopic signatures by bulk analyses

    SciTech Connect

    Efurd, D.W.; Rokop, D.J.

    1997-12-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications for isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analyses of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that for the areas sampled the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Facility (RFP). The largest source of anthropogenic radioactivity presently affecting surface-waters at RFP is the sediments that are currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5 and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment and water samples collected at RFP was naturally occurring, the result of processes at RFP or the result of global fallout. No extraneous anthropogenic alpha, beta or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally.

  18. Experimental demonstration of photonic quantum digital signatures

    NASA Astrophysics Data System (ADS)

    Collins, Robert J.; Clarke, Patrick J.; Dunjko, Vedran; Andersson, Erika; Jeffers, John; Buller, Gerald S.

    2012-09-01

    Digital signature schemes are often used in interconnected computer networks to verify the origin and authenticity of messages. Current classical digital signature schemes based on so-called "one-way functions" rely on computational complexity to provide security over sufficiently long timescales. However, there are currently no mathematical proofs that such functions will always be computationally complex. Quantum digital signatures offer a means of confirming both the origin and authenticity of a message, with security guaranteed by information-theoretic limits. The message cannot be forged or repudiated. We have constructed, tested and analyzed the security of what is, to the best of our knowledge, the first example of an experimental quantum digital signature system.

  19. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  20. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  1. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  2. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  3. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  4. Magnetic Signatures on Planets Without Magnetic Fields

    NASA Astrophysics Data System (ADS)

    McEnroe, S. A.; Dyar, M. D.; Brown, L. B.

    2002-03-01

    On extraterrestrial bodies with no present day magnetic fields, the majority of the magnetic signature must come from high coercivity phases such as hemo-ilmenite, ilmenohematite, or very fine-grained magnetite.

  5. Simulation and Experimental Validation of Electromagnetic Signatures for Monitoring of Nuclear Material Storage Containers

    SciTech Connect

    Aker, Pamela M.; Bunch, Kyle J.; Jones, Anthony M.

    2013-01-01

    Previous research at the Pacific Northwest National Laboratory (PNNL) has demonstrated that the low frequency electromagnetic (EM) response of a sealed metallic container interrogated with an encircling coil is a strong function of its contents and can be used to form a distinct signature which can confirm the presence of specific components without revealing hidden geometry or classified design information. Finite element simulations have recently been performed to further investigate this response for a variety of configurations composed of an encircling coil and a typical nuclear material storage container. Excellent agreement was obtained between simulated and measured impedance signatures of electrically conducting spheres placed inside an AT-400R nuclear container. Simulations were used to determine the effects of excitation frequency and the geometry of the encircling coil, nuclear container, and internal contents. The results show that it is possible to use electromagnetic models to evaluate the application of the EM signature technique to proposed versions of nuclear weapons containers which can accommodate restrictions imposed by international arms control and treaty verification legislation.
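
    The signature-matching step can be illustrated with a minimal sketch. The names, impedance values, and threshold below are invented for the illustration; the abstract does not describe the PNNL system's processing at this level. The idea is simply that a container's EM response, sampled at several interrogation frequencies, forms a vector of complex coil impedances that can be compared against a stored reference:

```python
import numpy as np

def signature_distance(measured, reference):
    """Normalized Euclidean distance between complex impedance signatures."""
    measured = np.asarray(measured, dtype=complex)
    reference = np.asarray(reference, dtype=complex)
    return np.linalg.norm(measured - reference) / np.linalg.norm(reference)

def matches(measured, reference, threshold=0.05):
    """Accept a match when the signature lies within a relative tolerance."""
    return signature_distance(measured, reference) < threshold

reference = np.array([10 + 2j, 8 + 5j, 6 + 9j])  # stored signature (ohms)
same_item = reference * 1.01                      # 1% measurement drift
other_item = np.array([4 + 1j, 9 + 2j, 2 + 7j])  # different contents

print(matches(same_item, reference))   # drift stays within tolerance
print(matches(other_item, reference))  # different contents fall outside
```

    A signature comparison of this kind confirms the presence of expected contents without revealing geometry, since only the aggregate impedance response is stored.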

  6. A feature based comparison of pen and swipe based signature characteristics.

    PubMed

    Robertson, Joshua; Guest, Richard

    2015-10-01

    Dynamic Signature Verification (DSV) is a biometric modality that identifies anatomical and behavioral characteristics when an individual signs their name. Conventionally, signature data has been captured using pen/tablet apparatus. However, the use of other devices such as touch-screen tablets has expanded in recent years, affording the possibility of assessing biometric interaction on this new technology. To explore the potential of employing DSV techniques when a user signs or swipes with their finger, we report a study to correlate pen- and finger-generated features. Investigating the stability and correlation between a set of characteristic features recorded in participants' signatures and touch-based swipe gestures, a statistical analysis was conducted to assess consistency between capture scenarios. The results indicate that a range of static and dynamic features, such as the rate of jerk, size, duration, and the distance the pen traveled, can lead to interoperability between these two input methods within a potential biometric context. The data suggest, as a general principle, that the same underlying constructional mechanisms are evident. PMID:26097008
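
    Several of the features named above (duration, path distance, rate of jerk) can be computed directly from timestamped pen or finger coordinates. The sketch below assumes near-uniform sampling and is only an illustration of these quantities, not the paper's actual feature-extraction pipeline:

```python
import numpy as np

def signature_features(t, x, y):
    """Duration, path distance, and mean jerk magnitude of a sampled stroke."""
    t, x, y = map(np.asarray, (t, x, y))
    duration = t[-1] - t[0]
    distance = np.hypot(np.diff(x), np.diff(y)).sum()  # summed chord lengths
    dt = np.diff(t).mean()                             # assume near-uniform sampling
    vx, vy = np.gradient(x, dt), np.gradient(y, dt)    # velocity
    ax, ay = np.gradient(vx, dt), np.gradient(vy, dt)  # acceleration
    jx, jy = np.gradient(ax, dt), np.gradient(ay, dt)  # jerk (3rd derivative)
    mean_jerk = np.hypot(jx, jy).mean()
    return {"duration": duration, "distance": distance, "mean_jerk": mean_jerk}

# A looping synthetic stroke: two traversals of the unit circle in one second
t = np.linspace(0.0, 1.0, 101)
x, y = np.cos(4 * np.pi * t), np.sin(4 * np.pi * t)
print(signature_features(t, x, y))
```

    Comparing such features across pen and finger captures of the same signer is the kind of correlation analysis the study describes.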

  7. Verification, Seismology, and Negative Evidence

    NASA Astrophysics Data System (ADS)

    Wallace, T. C.; Koper, K. D.; Paquette, A. M.

    2001-05-01

    During the senatorial debate on the merits of ratifying the CTBT it was often stated that the treaty was "unverifiable". This is inherently a political statement, since verification always has some level of uncertainty. However, opponents and advocates of the treaty alike cast this as a technical problem: either the IMS monitoring system has the ability to detect small clandestine nuclear weapons tests or it does not. In the year and a half since the vote on ratification, studies have been commissioned to determine the capability of various monitoring systems (IMS, IMS plus other networks) in an attempt to determine the "largest" test that could avoid detection. Unfortunately, the standard for this evaluation is a hypothetical test, an event that has not occurred. In other words, the metric for success is proving an event did not happen, or negative evidence. There are no known examples of full nuclear weapons tests that were undetected, although some were not recognized until after the fact. One example of an explosion that was only recognized later is a small PNE used to suppress a dangerous buildup of methane in a Ukrainian coal mine in Sept. 1979. The experiment, which was 0.3 kt, was on the lists of PNEs that were released in the late 1980s; in fact, the event was recorded on NORSAR and had a magnitude of 3.3. Today this event would be routinely located and flagged as suspicious. A true example of the null hypothesis test arose on Feb. 25 of this year, when the London Sunday Times reported a detailed account of a clandestine Iraqi test of a simple nuclear weapon in 1989. This test was alleged to be a 10 kt device detonated in a decoupling cavity. The reports also state that the explosion produced a seismic signal of magnitude 2.7 and went unreported by any catalog outside Iraq. We examined seismic catalogs from Israel, Jordan, and Iran and could find no evidence of this event. Examination of continuous seismic data provides a baseline for the detection capability for the suspect test site. In 1989 the detection threshold was approximately 3.1 from regional stations. Using a conservative estimate of a decoupling factor of 50, and the lack of a seismic signal, gives a yield of 0.8 kt. The number of seismic stations in operation in the Middle East has increased significantly since 1989, and the detection threshold has decreased by 0.25 magnitude units. The usefulness of negative evidence is in providing a baseline for measuring monitoring capability.
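
    The yield arithmetic in the abstract can be reproduced with a generic magnitude-yield relation. The coefficients below (mb = 4.45 + 0.75 log10 Y, with Y in kt) are an assumption for the sketch; the abstract does not state which relation the authors used:

```python
import math

def max_hidden_yield(detection_mb, decoupling_factor):
    """Largest yield (kt) that could stay below a detection threshold.

    Inverts the assumed fully-coupled relation mb = 4.45 + 0.75*log10(Y),
    then scales by the cavity decoupling factor.
    """
    coupled_yield = 10 ** ((detection_mb - 4.45) / 0.75)
    return coupled_yield * decoupling_factor

# Detection threshold mb 3.1 and a conservative decoupling factor of 50
print(round(max_hidden_yield(3.1, 50), 1))
```

    With these assumed coefficients the result is about 0.8 kt, consistent with the figure quoted in the abstract.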

  8. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  9. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  10. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board: two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  11. Fuel Retrieval System (FRS) Design Verification

    SciTech Connect

    YANOCHKO, R.M.

    2000-01-27

    This document was prepared as part of an independent review to explain design verification activities already completed, and to define the remaining design verification actions for the Fuel Retrieval System. The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR).

  12. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.
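
    The bit-for-bit evaluation mentioned above can be sketched in a few lines (the arrays here stand in for model output variables; LIVV's actual file handling is not described in the abstract). The point of a bit-for-bit check is that it applies no floating-point tolerance at all:

```python
import numpy as np

def bit_for_bit(a, b):
    """True only when two outputs are exactly identical, value for value."""
    a, b = np.asarray(a), np.asarray(b)
    return bool(a.shape == b.shape and np.array_equal(a, b))

baseline = np.array([1.0, 2.0, 3.0])
identical = baseline.copy()
perturbed = baseline + 1e-15   # tiny drift: passes most tolerances, fails bit-for-bit

print(bit_for_bit(baseline, identical))
print(bit_for_bit(baseline, perturbed))
```

    Configuration or compiler changes that are "harmless" under a tolerance-based comparison still show up under this check, which is why V&V suites distinguish the two.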

  13. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrive at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to insure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to insure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy`s National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  14. Land Ice Verification and Validation Kit

    Energy Science and Technology Software Center (ESTSC)

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  15. Verification and validation for magnetic fusion

    SciTech Connect

    Greenwald, Martin

    2010-05-15

    Dramatic progress in the scope and power of plasma simulations over the past decade has extended our understanding of these complex phenomena. However, as codes embody imperfect models for physical reality, a necessary step toward developing a predictive capability is demonstrating agreement, without bias, between simulations and experimental results. While comparisons between computer calculations and experimental data are common, there is a compelling need to make these comparisons more systematic and more quantitative. Tests of models are divided into two phases, usually called verification and validation. Verification is an essentially mathematical demonstration that a chosen physical model, rendered as a set of equations, has been accurately solved by a computer code. Validation is a physical process which attempts to ascertain the extent to which the model used by a code correctly represents reality within some domain of applicability, to some specified level of accuracy. This paper will cover principles and practices for verification and validation including lessons learned from related fields.

  16. Research Plan for Fire Signatures and Detection

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Viewgraphs on the prevention, suppression, and detection of fires aboard a spacecraft are presented. The topics include: 1) Fire Prevention, Detection, and Suppression Sub-Element Products; 2) FPDS Organizing Questions; 3) FPDS Organizing Questions; 4) Signatures, Sensors, and Simulations; 5) Quantification of Fire and Pre-Fire Signatures; 6) Smoke; 7) DAFT Hardware; 8) Additional Benefits of DAFT; 9) Development and Characterization of Sensors; 10) Simulation of the Transport of Smoke and Fire Precursors; and 11) FPDS Organizing Questions.

  17. Signature scheme based on bilinear pairs

    NASA Astrophysics Data System (ADS)

    Tong, Rui Y.; Geng, Yong J.

    2013-03-01

    An identity-based signature scheme is proposed using bilinear pairing technology. The scheme uses a user's identity information, such as an email address, IP address, or telephone number, as the public key, which eliminates the cost of building and managing a public key infrastructure. By using the CL-PKC framework to generate the user's private key, the scheme also avoids the problem of the private key generation center forging signatures.

  18. Assessing the Quality of Bioforensic Signatures

    SciTech Connect

    Sego, Landon H.; Holmes, Aimee E.; Gosink, Luke J.; Webb-Robertson, Bobbie-Jo M.; Kreuzer, Helen W.; Anderson, Richard M.; Brothers, Alan J.; Corley, Courtney D.; Tardiff, Mark F.

    2013-06-04

    We present a mathematical framework for assessing the quality of signature systems in terms of fidelity, cost, risk, and utility—a method we refer to as Signature Quality Metrics (SQM). We demonstrate the SQM approach by assessing the quality of a signature system designed to predict the culture medium used to grow a microorganism. The system consists of four chemical assays designed to identify various ingredients that could be used to produce the culture medium. The analytical measurements resulting from any combination of these four assays can be used in a Bayesian network to predict the probabilities that the microorganism was grown using one of eleven culture media. We evaluated fifteen combinations of the signature system by removing one or more of the assays from the Bayes network. We demonstrated that SQM can be used to distinguish between the various combinations in terms of attributes of interest. The approach assisted in clearly identifying the assays that were least informative, in large part because they could discriminate between only a few culture media, and in particular, culture media that are rarely used. There are limitations associated with the data that were used to train and test the signature system. Consequently, our intent is not to draw formal conclusions regarding this particular bioforensic system, but rather to illustrate an analytical approach that could be useful in comparing one signature system to another.
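
    The Bayes-network idea can be shown with a toy example. The media, assays, and likelihood values below are invented for the sketch (the real system uses four assays and eleven media); with conditionally independent assays the network reduces to a naive-Bayes update:

```python
# P(assay positive | medium); assays assumed conditionally independent
likelihoods = {
    "medium_A": {"assay1": 0.9, "assay2": 0.2},
    "medium_B": {"assay1": 0.3, "assay2": 0.8},
}
prior = {"medium_A": 0.5, "medium_B": 0.5}

def posterior(observed_positive):
    """Posterior over media given which assays came back positive."""
    unnorm = {}
    for medium, lk in likelihoods.items():
        p = prior[medium]
        for assay, p_pos in lk.items():
            p *= p_pos if assay in observed_positive else (1 - p_pos)
        unnorm[medium] = p
    total = sum(unnorm.values())
    return {m: p / total for m, p in unnorm.items()}

# assay1 positive, assay2 negative strongly favors medium_A
print(posterior({"assay1"}))
```

    Removing an assay from `likelihoods` and re-running mimics the paper's evaluation of reduced signature systems: the posterior sharpness that is lost measures how informative that assay was.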

  19. Chemical and Physical Signatures for Microbial Forensics

    SciTech Connect

    Cliff, John B.; Kreuzer, Helen W.; Ehrhardt, Christopher J.; Wunschel, David S.

    2012-01-03

    Chemical and Physical Signatures for Microbial Forensics. John Cliff and Helen Kreuzer-Martin, eds. Humana Press.
    Chapter 1. Introduction: Review of history and statement of need. Randy Murch, Virginia Tech
    Chapter 2. The Microbe: Structure, morphology, and physiology of the microbe as they relate to potential signatures of growth conditions. Joany Jackman, Johns Hopkins University
    Chapter 3. Science for Forensics: Special considerations for the forensic arena - quality control, sample integrity, etc. Mark Wilson (retired FBI), Western Carolina University
    Chapter 4. Physical signatures: Light and electron microscopy, atomic force microscopy, gravimetry, etc. Joseph Michael, Sandia National Laboratory
    Chapter 5. Lipids: FAME, PLFA, steroids, LPS, etc. James Robertson, Federal Bureau of Investigation
    Chapter 6. Carbohydrates: Cell wall components, cytoplasm components, methods. Alvin Fox, University of South Carolina School of Medicine; David Wunschel, Pacific Northwest National Laboratory
    Chapter 7. Peptides: Peptides, proteins, lipoproteins. David Wunschel, Pacific Northwest National Laboratory
    Chapter 8. Elemental content: CNOHPS (treated in passing), metals, prospective cell types. John Cliff, International Atomic Energy Agency
    Chapter 9. Isotopic signatures: Stable isotopes C, N, H, O, S; 14C dating; potential for heavy elements. Helen Kreuzer-Martin, Pacific Northwest National Laboratory; Michaele Kashgarian, Lawrence Livermore National Laboratory
    Chapter 10. Extracellular signatures: Cellular debris, heme, agar, headspace, spent media, etc. Karen Wahl, Pacific Northwest National Laboratory
    Chapter 11. Data Reduction and Integrated Microbial Forensics: Statistical concepts, parametric and multivariate statistics, integrating signatures. Kristin Jarman, Pacific Northwest National Laboratory

  20. Source signature estimation in the seismic experiment

    SciTech Connect

    Osen, A.; Amundsen, L.; Secrest, B.G.

    1994-12-31

    The source signature is a necessary input to many algorithms in seismic data processing for extracting reliable information of the earth's structure. In the last few years a number of papers have addressed the problem of deterministic source signature estimation using the acoustic wave equation. In particular, many algorithms utilizing two marine field measurements have been proposed. The most general source signature identification method for marine seismic (acoustic) data has been given by Weglein and Secrest (1990), who showed that the source signature could be estimated by solving a Kirchhoff-like integral over the receiver surface. The algorithm requires no knowledge of the scattering medium below the receivers. When the data are measured along horizontal surfaces, the authors show that the source signature problem easily can be solved in the frequency-horizontal wavenumber domain by a linear least-squares inversion technique. Furthermore, if the sources in the array are located at the same depth, the source signature estimation algorithm corresponds to frequency-wavenumber algorithms published previously.
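
    The per-frequency linear least-squares idea can be sketched numerically. The forward operator below is random stand-in data, not a physical Green's function; the point is only that when the data obey d(ω) = g(ω) s(ω), the signature s(ω) is recovered frequency by frequency as the least-squares solution:

```python
import numpy as np

rng = np.random.default_rng(0)
n_receivers, n_freqs = 8, 4

# Complex source signature to recover, one value per frequency
true_signature = rng.standard_normal(n_freqs) + 1j * rng.standard_normal(n_freqs)

estimates = []
for k in range(n_freqs):
    # Stand-in propagation vector from the source to each receiver
    g = rng.standard_normal(n_receivers) + 1j * rng.standard_normal(n_receivers)
    d = g * true_signature[k]                 # noise-free forward model
    # Least-squares solve of the overdetermined system g * s = d
    s_hat, *_ = np.linalg.lstsq(g[:, None], d, rcond=None)
    estimates.append(s_hat[0])

print(np.allclose(estimates, true_signature))  # exact recovery without noise
```

    With measurement noise the same solve returns the minimum-norm-residual estimate, which is the practical content of the frequency-wavenumber inversion described in the abstract.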

  1. Signature extension through the application of cluster matching algorithms to determine appropriate signature transformations

    NASA Technical Reports Server (NTRS)

    Lambeck, P. F.; Rice, D. P.

    1976-01-01

    Signature extension is intended to increase the space-time range over which a set of training statistics can be used to classify data without significant loss of recognition accuracy. A first cluster matching algorithm MASC (Multiplicative and Additive Signature Correction) was developed at the Environmental Research Institute of Michigan to test the concept of using associations between training and recognition area cluster statistics to define an average signature transformation. A more recent signature extension module CROP-A (Cluster Regression Ordered on Principal Axis) has shown evidence of making significant associations between training and recognition area cluster statistics, with the clusters to be matched being selected automatically by the algorithm.
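
    A multiplicative-and-additive correction in the spirit of MASC can be sketched once clusters are already matched (the matching itself, and MASC's actual formulation, are not detailed in the abstract; the per-band affine fit below is an assumption for illustration):

```python
import numpy as np

def affine_correction(train_means, recog_means):
    """Fit per-band gain a and offset b so recog ≈ a*train + b.

    train_means, recog_means: (n_matched_clusters, n_bands) arrays of
    matched cluster mean vectors from the training and recognition areas.
    """
    train_means = np.asarray(train_means, float)
    recog_means = np.asarray(recog_means, float)
    gains, offsets = [], []
    for band in range(train_means.shape[1]):
        a, b = np.polyfit(train_means[:, band], recog_means[:, band], 1)
        gains.append(a)
        offsets.append(b)
    return np.array(gains), np.array(offsets)

# Synthetic matched clusters: recognition area differs by a known affine shift
train = np.array([[10.0, 20.0], [30.0, 40.0], [50.0, 60.0]])
recog = 1.2 * train + 5.0
a, b = affine_correction(train, recog)
print(a, b)  # recovers gain ≈ 1.2 and offset ≈ 5.0 per band
```

    Applying `a * signature + b` to the training statistics is the "average signature transformation" step; CROP-A's contribution, per the abstract, is selecting which clusters to match automatically.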

  2. 340 and 310 drawing field verification

    SciTech Connect

    Langdon, J.

    1996-09-27

    The purpose of the drawing field verification work plan is to provide reliable drawings for the 310 Treated Effluent Disposal Facility (TEDF) and 340 Waste Handling Facility (340 Facility). The initial scope of this work plan is to provide field verified and updated versions of all the 340 Facility essential drawings. This plan can also be used for field verification of any other drawings that the facility management directs to be so updated. Any drawings revised by this work plan will be issued in an AutoCAD format.

  3. Fuel Retrieval System Design Verification Report

    SciTech Connect

    GROTH, B.D.

    2000-04-11

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire, Table 1, is included which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No. 9 (Miller 2000).

  4. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  5. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented, along with a theory of generic interpreters that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  6. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  7. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Havelund, Klaus; Lau, Sonie (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed on Deep Space 1 (DS1). The verification is done using UPPAAL, a real-time model-checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.
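
    The core of the approach described above — exhaustively exploring a plan model's state space to check that no reachable state violates a property — can be illustrated with a minimal sketch. This is not HSTS or UPPAAL syntax; the plan states, transitions, and safety property below are invented for illustration, and UPPAAL's continuous clocks are discretized here:

    ```python
    from collections import deque

    # Toy transition system standing in for a plan model: states are
    # (task, clock) pairs, and transitions advance the plan. In UPPAAL
    # the clock would be a continuous real; here it is an integer tick.
    TRANSITIONS = {
        ("idle", 0): [("warmup", 1)],
        ("warmup", 1): [("thrust", 2), ("abort", 2)],
        ("thrust", 2): [("idle", 0)],
        ("abort", 2): [],
    }

    def violates(state):
        # Safety property (invented): the plan never thrusts after tick 2.
        task, clock = state
        return task == "thrust" and clock > 2

    def check_safety(initial):
        """Breadth-first exploration of all reachable states; returns a
        counterexample trace if the property is violated, else None."""
        frontier = deque([[initial]])
        seen = {initial}
        while frontier:
            trace = frontier.popleft()
            state = trace[-1]
            if violates(state):
                return trace
            for nxt in TRANSITIONS.get(state, []):
                if nxt not in seen:
                    seen.add(nxt)
                    frontier.append(trace + [nxt])
        return None

    print(check_safety(("idle", 0)))  # None: no reachable state violates the bound
    ```

    A real model checker adds symbolic clock zones and far more efficient state storage, but the verdict it returns has the same shape: either the property holds on every reachable state, or a concrete violating trace is produced.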

  8. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency‘s Environmental Technology Verification Progr...

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM FOR MONITORING AND CHARACTERIZATION

    EPA Science Inventory

    The Environmental Technology Verification Program is a service of the Environmental Protection Agency designed to accelerate the development and commercialization of improved environmental technology through third party verification and reporting of performance. The goal of ETV i...

  11. ETV INTERNATIONAL OUTREACH ACTIVITIES (ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program's international outreach activities have extended as far as Canada, Germany, Taiwan and the Philippines. Vendors from Canada and Germany were hosted at verification tests of turbidimeters. In May 1999, EPA's ETV Coordinator...

  12. 37 CFR 382.16 - Verification of royalty distributions.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... CONGRESS RATES AND TERMS FOR STATUTORY LICENSES RATES AND TERMS FOR DIGITAL TRANSMISSIONS OF SOUND... Collective have agreed as to proper verification methods. (b) Frequency of verification. A Copyright Owner...

  13. Online bartering motivations.

    PubMed

    Lee, Hsiang-Ming; Chen, Tsai; Hung, Min-Li

    2014-08-01

    This study examined the role of enjoyment in people's decision to barter online. A survey in barter BBS/discussion forums and websites collected data from 135 participants (30 men, 105 women; 71% in the age group of 21-30 years) who barter online. To test a modification of the Expectation Confirmation Model, perceived enjoyment, confirmation of expectations, perceived usefulness, satisfaction, and continuance intention were measured. The data analysis showed that the expanded ECM had good explanatory power, with all paths supported except for perceived usefulness-satisfaction. In the proposed model, 33.1% of the variance in continuance intentions was predicted by the independent variables. Thus, the expanded ECM can provide supplementary information that is relevant for understanding continued online bartering usage. Barter website managers may encourage users' intentions to continue using these websites by emphasizing enjoyable aspects of online bartering. PMID:25153951

  14. Protecting Children's Online Privacy.

    ERIC Educational Resources Information Center

    Kresses, Mamie

    2001-01-01

    Discusses provisions of the new federal Children's Online Privacy Protection Act that principals should know to protect student privacy on the Internet. Also discusses relevant provisions of the Family Educational Rights and Privacy Act. (PKP)

  15. Searching Online for 'Hemorrhoids'?

    MedlinePLUS

    ... lumps near the anus. Treating Hemorrhoids: There are a number of over-the-counter ...

  16. MEDLINE (MEDLARS ONLINE)

    EPA Science Inventory

    MEDLINE (MEDlars onLINE) is the National Library of Medicine's (NLM) premier bibliographic database covering the fields of medicine, nursing, dentistry, veterinary medicine, the health care system, and the preclinical sciences. It contains bibliographic citations (e.g., authors, ...

  17. Subduction signature in backarc mantle?

    NASA Astrophysics Data System (ADS)

    Nelson, W. R.; Snow, J. E.; Brandon, A. D.; Ohara, Y.

    2013-12-01

    Abyssal peridotites exposed during seafloor extension provide a rare glimpse into the processes occurring within the oceanic mantle. Whole rock and mineral-scale major element data from abyssal peridotites record processes intimately associated with melt-depletion and melt-rock interaction occurring just prior to exposure of the mantle at the surface. Isotopic data, however, can provide insight into the long-term evolution of the oceanic mantle. A number of studies of mantle material exposed along mid-ocean ridges have demonstrated that abyssal peridotites from the Mid-Atlantic Ridge, Gakkel Ridge, and Southwest Indian Ridge commonly display a range of whole rock Os isotopic ratios (187Os/188Os = 0.118-0.130; Brandon et al., 2000; Standish et al., 2002; Alard et al., 2005; Harvey et al., 2006; Liu et al., 2008). The range of isotopic values in each region demonstrates that the oceanic mantle does not melt uniformly over time. Instead, anciently depleted regions (187Os/188Os ≈ 0.118) are juxtaposed against relatively fertile regions (187Os/188Os ≈ 0.130) that are isotopically similar to established primitive mantle values (187Os/188Os = 0.1296; Meisel et al., 2001). Abyssal peridotites from the Godzilla Megamullion and Chaotic Terrain in the backarc Parece Vela Basin (Philippine Sea) display a range of Os isotopic values extending to similar unradiogenic values. However, some of the backarc basin abyssal peridotites record more radiogenic 187Os/188Os values (0.135-0.170) than mid-ocean ridge peridotites. Comparable radiogenic signatures are reported only in highly weathered abyssal peridotites (187Os/188Os ≈ 0.17, Standish et al., 2002) and subduction-related volcanic arc peridotites (187Os/188Os ≈ 0.16, Brandon et al., 1996; Widom et al., 2003). In both the weathered peridotites and arc peridotites, the 187Os/188Os value is negatively correlated with Os abundance: the most radiogenic values have the lowest Os abundances (<1 ppb), making them highly susceptible to overprinting by radiogenic fluids. In contrast, abyssal peridotites from the Parece Vela Basin show no correlation between 187Os/188Os and Os abundance; abyssal peridotites with radiogenic 187Os/188Os contain 2-3 ppb Os. Additionally, there is no correlation between 187Os/188Os and major element indicators of melt-rock interaction or Re abundance. Similar to the conclusions of Brandon et al. (1996), we suggest that radiogenic Os can be partitioned into slab-derived fluids and subsequently introduced into the overlying mantle. As a result, the radiogenic Os isotopic ratios found in backarc abyssal peridotites may be a consequence of subduction processes.
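
    The abstract's key discriminant — whether radiogenic 187Os/188Os correlates with low Os abundance (weathering overprint) or not (a subduction signature) — amounts to a simple correlation test. The sample values below are invented for illustration and are not the study's data:

    ```python
    # Hypothetical whole-rock data: (187Os/188Os ratio, Os abundance in ppb).
    # Weathered-style samples: radiogenic ratios appear only at very low Os,
    # so ratio and abundance are strongly negatively correlated.
    weathered = [(0.120, 3.0), (0.135, 1.5), (0.160, 0.8), (0.170, 0.4)]
    # Backarc-style samples: radiogenic ratios occur at normal (2-3 ppb)
    # Os abundances, so there is no such correlation.
    backarc = [(0.125, 2.5), (0.150, 2.9), (0.165, 2.4), (0.140, 2.6)]

    def pearson_r(pairs):
        """Plain Pearson correlation coefficient, no external libraries."""
        n = len(pairs)
        xs = [p[0] for p in pairs]
        ys = [p[1] for p in pairs]
        mx, my = sum(xs) / n, sum(ys) / n
        cov = sum((x - mx) * (y - my) for x, y in pairs)
        sx = sum((x - mx) ** 2 for x in xs) ** 0.5
        sy = sum((y - my) ** 2 for y in ys) ** 0.5
        return cov / (sx * sy)

    print(f"weathered r = {pearson_r(weathered):.2f}")  # strongly negative
    print(f"backarc   r = {pearson_r(backarc):.2f}")    # near zero
    ```

    A strongly negative r reproduces the weathering/arc pattern, while an r near zero matches the Parece Vela observation that motivates the slab-fluid interpretation.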

  18. Employers Often Distrust Online Degrees

    ERIC Educational Resources Information Center

    Carnevale, Dan

    2007-01-01

    This article explains why employers are reluctant to accept potential employees with online degrees. The results of several surveys of those who evaluate potential employees and make hiring decisions indicate a bias against online degrees, even as more and more colleges are offering programs online. To those officials, the words "online education"…

  19. The Anatomy of Online Offerings

    ERIC Educational Resources Information Center

    Hoskins, Barbara J.

    2014-01-01

    The perceptions about online teaching and learning are frequently different from the reality. Some students say they expected the online course to be easier than the traditional face-to-face course and are surprised by the rigor, while skeptics decry the quality of online offerings since students cannot possibly learn as well online as they do in…

  20. Autonomy Support for Online Students

    ERIC Educational Resources Information Center

    Lee, Eunbae; Pate, Joseph A.; Cozart, Deanna

    2015-01-01

    Despite the rapid growth of online learning in higher education, the dropout rates for online courses have reached 50 percent. Lack of student engagement ranks as a critical reason for frequent online course dropout. This article discusses autonomy support as a strategy to enhance online students' intrinsic motivation and engagement. Drawing from…

  1. Getting Ahead by Getting Online.

    ERIC Educational Resources Information Center

    Hamilton-Pennell, Christine

    2002-01-01

    Discussion of online learning and distance education focuses on continuing education possibilities for librarians. Highlights include how online courses work; overcoming feelings of isolation; the use of technology; the effectiveness of online learning; future trends and challenges; considerations before signing up; and finding online professional…

  3. Online TESL/TEFL Training.

    ERIC Educational Resources Information Center

    Nixon, Thomas

    2003-01-01

    Discusses distance learning with a focus on online learning, which is growing in popularity at a faster rate than other types of programs. Looks at the advantages and disadvantages of online learning, the future of online learning, and specifically TESOL training online. (Author/VWL)

  5. Earth Science Enterprise Scientific Data Purchase Project: Verification and Validation

    NASA Technical Reports Server (NTRS)

    Jenner, Jeff; Policelli, Fritz; Fletcher, Rosea; Holecamp, Kara; Owen, Carolyn; Nicholson, Lamar; Dartez, Deanna

    2000-01-01

    This paper presents viewgraphs on the Earth Science Enterprise Scientific Data Purchase Project's verification and validation process. The topics include: 1) What is Verification and Validation? 2) Why Verification and Validation? 3) Background; 4) ESE Data Purchase Validation Process; 5) Data Validation System and Ingest Queue; 6) Shipment Verification; 7) Tracking and Metrics; 8) Validation of Contract Specifications; 9) Earth Watch Data Validation; 10) Validation of Vertical Accuracy; and 11) Results of Vertical Accuracy Assessment.

  6. 47 CFR 54.419 - Validity of electronic signatures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... electronic signatures. (a) For the purposes of this subpart, an electronic signature, defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or process... 47 Telecommunication 3 2014-10-01 2014-10-01 false Validity of electronic signatures....

  7. 47 CFR 54.419 - Validity of electronic signatures.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... electronic signatures. (a) For the purposes of this subpart, an electronic signature, defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or process... 47 Telecommunication 3 2012-10-01 2012-10-01 false Validity of electronic signatures....

  8. 47 CFR 54.680 - Validity of electronic signatures.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... Validity of electronic signatures. (a) For the purposes of this subpart, an electronic signature (defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or... 47 Telecommunication 3 2014-10-01 2014-10-01 false Validity of electronic signatures....

  9. 47 CFR 54.419 - Validity of electronic signatures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... electronic signatures. (a) For the purposes of this subpart, an electronic signature, defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or process... 47 Telecommunication 3 2013-10-01 2013-10-01 false Validity of electronic signatures....

  10. 47 CFR 54.680 - Validity of electronic signatures.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... Validity of electronic signatures. (a) For the purposes of this subpart, an electronic signature (defined by the Electronic Signatures in Global and National Commerce Act, as an electronic sound, symbol, or... 47 Telecommunication 3 2013-10-01 2013-10-01 false Validity of electronic signatures....

  11. ENVIRONMENTAL TECHNOLOGY VERIFICATION REPORT - PERFORMANCE VERIFICATION OF THE W.L. GORE & ASSOCIATES GORE-SORBER SCREENING SURVEY

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) has created the Environmental Technology Verification Program (ETV) to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The goal of the...

  12. 37 CFR 261.7 - Verification of royalty payments.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty... TRANSMISSIONS AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 261.7 Verification of royalty payments. (a) General. This section prescribes general rules pertaining to the verification by any Copyright Owner...

  13. 6 CFR 37.13 - Document verification requirements.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 6 Domestic Security 1 2010-01-01 2010-01-01 false Document verification requirements. 37.13... LICENSES AND IDENTIFICATION CARDS Minimum Documentation, Verification, and Card Issuance Requirements § 37.13 Document verification requirements. (a) States shall make reasonable efforts to ensure that...

  14. 37 CFR 380.26 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty... Webcasters § 380.26 Verification of royalty distributions. (a) General. This section prescribes procedures by... or Performer and the Collective have agreed as to proper verification methods. (b) Frequency...

  15. 19 CFR 181.73 - Notification of verification visit.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 2 2010-04-01 2010-04-01 false Notification of verification visit. 181.73 Section 181.73 Customs Duties U.S. CUSTOMS AND BORDER PROTECTION, DEPARTMENT OF HOMELAND SECURITY; DEPARTMENT OF THE TREASURY (CONTINUED) NORTH AMERICAN FREE TRADE AGREEMENT Origin Verifications and Determinations § 181.73 Notification of verification...

  16. 37 CFR 384.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty... BUSINESS ESTABLISHMENT SERVICES § 384.7 Verification of royalty distributions. (a) General. This section... Copyright Owner and the Collective have agreed as to proper verification methods. (b) Frequency...

  17. 19 CFR 181.74 - Verification visit procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 19 Customs Duties 2 2011-04-01 2011-04-01 false Verification visit procedures. 181.74 Section 181... § 181.74 Verification visit procedures. (a) Written consent required. Prior to conducting a verification... (a) of this section shall be delivered by certified or registered mail, or by any other method...

  18. 19 CFR 351.307 - Verification of information.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 19 Customs Duties 3 2010-04-01 2010-04-01 false Verification of information. 351.307 Section 351... COUNTERVAILING DUTIES Information and Argument § 351.307 Verification of information. (a) Introduction. Prior to... verify relevant factual information. This section clarifies when verification will occur, the contents...

  19. 37 CFR 380.7 - Verification of royalty distributions.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2011-07-01 2011-07-01 false Verification of royalty... Noncommercial Webcasters § 380.7 Verification of royalty distributions. (a) General. This section prescribes... Copyright Owner or Performer and the Collective have agreed as to proper verification methods. (b)...

  20. 37 CFR 261.7 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 37 Patents, Trademarks, and Copyrights 1 2010-07-01 2010-07-01 false Verification of royalty... TRANSMISSIONS AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 261.7 Verification of royalty payments. (a) General. This section prescribes general rules pertaining to the verification by any Copyright Owner...