Science.gov

Sample records for online signature verification

  1. Enhanced Cancelable Biometrics for Online Signature Verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

Cancelable approaches to biometric person authentication have been studied as a way to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. In this paper, we propose a scheme to improve the performance of a cancelable approach for online signature verification. Our scheme generates two cancelable datasets from one raw dataset and uses both for verification. Preliminary experiments were performed using a distance-based online signature verification algorithm. The experimental results show that the proposed scheme is promising.
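
    The abstract does not specify the transform itself. As a hedged illustration of the general idea (not the authors' algorithm), a key-seeded random rotation is one classic cancelable transform: it preserves Euclidean distances, so a distance-based verifier can operate directly on protected templates, and revoking the key yields an unlinkable new template.

```python
import numpy as np

def cancelable_transform(features, key):
    """Transform a feature vector with a key-seeded random rotation.

    Only the transformed template is stored; if it leaks, the key is
    revoked and re-enrollment with a new key yields an unlinkable
    template. Because the rotation is orthogonal, distances between
    vectors are preserved, so a distance-based verifier still works.
    """
    rng = np.random.default_rng(key)
    # Random orthogonal matrix via QR decomposition of a Gaussian matrix.
    q, _ = np.linalg.qr(rng.standard_normal((features.size, features.size)))
    return q @ features

def verify(probe_features, enrolled_template, key, threshold=1.0):
    """Accept when the transformed probe lies within `threshold` of the
    enrolled (already-transformed) template."""
    distance = np.linalg.norm(cancelable_transform(probe_features, key)
                              - enrolled_template)
    return distance < threshold
```

    The threshold and feature layout here are placeholders; a real deployment would tune them on enrollment data.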

  2. Fusion strategies for boosting cancelable online signature verification

    NASA Astrophysics Data System (ADS)

    Muramatsu, Daigo; Inuma, Manabu; Shikata, Junji; Otsuka, Akira

    2010-04-01

Cancelable approaches for biometric person authentication have been studied to protect enrolled biometric data, and several algorithms have been proposed. One drawback of cancelable approaches is that their performance is inferior to that of non-cancelable approaches. As one solution, we previously proposed a scheme that enhances the performance of a cancelable approach for online signature verification by combining scores calculated from two transformed datasets generated with two keys. Generally, cancelable approaches apply the same verification algorithm to transformed data as to raw (non-transformed) data, and in our previous work a verification system developed for a non-transformed dataset was used to calculate the scores from transformed data. In this paper, we modify the verification system by using transformed data for training. Several experiments were performed using public databases, and the results show that this modification improved performance. Our cancelable system combines two scores to make a decision; several fusion strategies are also considered, and the experimental results are reported here.
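
    The abstract names fusion of two scores without detailing the rules. A minimal sketch of the common fixed fusion rules (weighted sum, max, min) for two matcher scores might look like the following; the weight and threshold values are illustrative, not values trained in the paper.

```python
def fuse_scores(score_a, score_b, strategy="sum", weight=0.5):
    """Combine two verification scores (higher = more likely genuine)
    using a fixed fusion rule."""
    if strategy == "sum":        # weighted sum rule
        return weight * score_a + (1.0 - weight) * score_b
    if strategy == "max":        # accept if either matcher is confident
        return max(score_a, score_b)
    if strategy == "min":        # require both matchers to be confident
        return min(score_a, score_b)
    raise ValueError(f"unknown fusion strategy: {strategy}")

def decide(score_a, score_b, threshold, strategy="sum"):
    """Final accept/reject decision on the fused score."""
    return fuse_scores(score_a, score_b, strategy) >= threshold
```

    The min rule is the most conservative (both transformed datasets must agree), while the max rule favors convenience over security.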

  3. Online handwritten signature verification using neural network classifier based on principal component analysis.

    PubMed

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Yussof, Salman; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

One of the main difficulties in designing an online signature verification (OSV) system is finding the most distinctive features with high discriminating power, particularly given the high variability inherent in genuine handwritten signatures and the possibility of skilled forgeries closely resembling their genuine counterparts. In this paper, we propose a systematic approach to online signature verification using a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique that exploits the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database and yielded a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%.

  4. Online Handwritten Signature Verification Using Neural Network Classifier Based on Principal Component Analysis

    PubMed Central

    Iranmanesh, Vahab; Ahmad, Sharifah Mumtazah Syed; Adnan, Wan Azizun Wan; Arigbabu, Olasimbo Ayodeji; Malallah, Fahad Layth

    2014-01-01

One of the main difficulties in designing an online signature verification (OSV) system is finding the most distinctive features with high discriminating power, particularly given the high variability inherent in genuine handwritten signatures and the possibility of skilled forgeries closely resembling their genuine counterparts. In this paper, we propose a systematic approach to online signature verification using a multilayer perceptron (MLP) on a subset of principal component analysis (PCA) features. The proposed approach illustrates a feature selection technique that exploits the usually discarded information from the PCA computation, which can be significant in attaining reduced error rates. The experiment is performed using 4000 signature samples from the SIGMA database and yielded a false acceptance rate (FAR) of 7.4% and a false rejection rate (FRR) of 6.4%. PMID:25133227

  5. Retail applications of signature verification

    NASA Astrophysics Data System (ADS)

    Zimmerman, Thomas G.; Russell, Gregory F.; Heilper, Andre; Smith, Barton A.; Hu, Jianying; Markman, Dmitry; Graham, Jon E.; Drews, Clemens

    2004-08-01

The dramatic rise in identity theft, the ever-pressing need to provide convenient checkout services to attract and retain loyal customers, and the growing use of multi-function signature capture devices in the retail sector provide favorable conditions for the deployment of dynamic signature verification (DSV) in retail settings. We report on the development of a DSV system to meet the needs of the retail sector. We currently have a database of approximately 10,000 signatures collected from 600 subjects and forgers. Previous work at IBM on DSV has been merged and extended to achieve robust performance on pen position data available from commercial point-of-sale hardware, achieving equal error rates on skilled forgeries and authentic signatures of 1.5% to 4%.
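
    The equal error rate (EER) quoted here is the standard single-number summary for verification systems. A minimal sketch of how it is computed from genuine and forgery score lists (assuming higher scores mean "more genuine"; the paper's scoring details are not given in the abstract):

```python
import numpy as np

def equal_error_rate(genuine_scores, forgery_scores):
    """Sweep the decision threshold over all observed scores and return
    the error rate at the operating point where the false acceptance
    rate (FAR) and false rejection rate (FRR) are closest."""
    genuine = np.asarray(genuine_scores, dtype=float)
    forgery = np.asarray(forgery_scores, dtype=float)
    best_gap, eer = float("inf"), 1.0
    for t in np.sort(np.concatenate([genuine, forgery])):
        far = float(np.mean(forgery >= t))  # forgeries accepted
        frr = float(np.mean(genuine < t))   # genuine signatures rejected
        if abs(far - frr) < best_gap:
            best_gap, eer = abs(far - frr), (far + frr) / 2.0
    return eer
```

    On perfectly separable score sets the EER is zero; a reported 1.5% to 4% EER means FAR and FRR meet in that range.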

  6. Input apparatus for dynamic signature verification systems

    DOEpatents

    EerNisse, Errol P.; Land, Cecil E.; Snelling, Jay B.

    1978-01-01

    The disclosure relates to signature verification input apparatus comprising a writing instrument and platen containing piezoelectric transducers which generate signals in response to writing pressures.

  7. Generic Verification Protocol for Verification of Online Turbidimeters

    EPA Science Inventory

    This protocol provides generic procedures for implementing a verification test for the performance of online turbidimeters. The verification tests described in this document will be conducted under the Environmental Technology Verification (ETV) Program. Verification tests will...

  8. Online fingerprint verification.

    PubMed

    Upendra, K; Singh, S; Kumar, V; Verma, H K

    2007-01-01

As organizations search for more secure authentication methods for user access, e-commerce, and other security applications, biometrics is gaining increasing attention. With an increasing emphasis on emerging automatic personal identification applications, fingerprint-based identification is becoming more popular. The most widely used fingerprint representation is the minutiae-based representation. The main drawback of this representation is that it does not utilize a significant component of the rich discriminatory information available in fingerprints: local ridge structures cannot be completely characterized by minutiae. It is also difficult to quickly match two fingerprint images containing different numbers of unregistered minutiae points. In this study, a filter-bank-based representation, which eliminates these weaknesses, is implemented, and the overall performance of the developed system is tested. The results show that this system can be used effectively for secure online verification applications.
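
    Filter-bank representations of this kind typically pass the image through a bank of oriented Gabor filters and summarize each response. The sketch below is a generic FingerCode-style illustration with assumed parameter values (kernel size, frequency, sigma), not the study's implementation.

```python
import numpy as np

def gabor_kernel(size, theta, freq=0.25, sigma=3.0):
    """Real Gabor kernel tuned to ridge frequency `freq` (cycles/pixel)
    at orientation `theta`; parameter values are illustrative."""
    half = size // 2
    y, x = np.mgrid[-half:half + 1, -half:half + 1]
    xr = x * np.cos(theta) + y * np.sin(theta)
    yr = -x * np.sin(theta) + y * np.cos(theta)
    envelope = np.exp(-(xr**2 + yr**2) / (2.0 * sigma**2))
    return envelope * np.cos(2.0 * np.pi * freq * xr)

def ridge_features(image, n_orientations=8, size=9):
    """FingerCode-style descriptor: the standard deviation of the image's
    response to each oriented Gabor filter (convolution via FFT)."""
    spectrum = np.fft.fft2(image)
    feats = []
    for k in range(n_orientations):
        kern = gabor_kernel(size, np.pi * k / n_orientations)
        response = np.real(np.fft.ifft2(spectrum * np.fft.fft2(kern, image.shape)))
        feats.append(response.std())
    return np.array(feats)
```

    Unlike a minutiae list, this yields a fixed-length vector, so two prints can be compared with a simple distance regardless of how many minutiae each contains.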

  9. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers.

    PubMed

    Rosso, Osvaldo A; Ospina, Raydonal; Frery, Alejandro C

    2016-01-01

We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups.
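
    The core ingredient, the Bandt-Pompe symbolization with Shannon entropy, is compact enough to sketch. This is a generic normalized permutation-entropy implementation (the statistical complexity and Fisher information descriptors, and the one-class SVM stage, are omitted here).

```python
import math
from itertools import permutations

def permutation_entropy(series, order=3, normalize=True):
    """Shannon entropy of the Bandt-Pompe ordinal-pattern distribution:
    each window of `order` consecutive samples is mapped to the
    permutation that sorts it, and the entropy of the resulting
    pattern histogram is computed (normalized by log(order!))."""
    counts = {p: 0 for p in permutations(range(order))}
    for i in range(len(series) - order + 1):
        window = series[i:i + order]
        pattern = tuple(sorted(range(order), key=window.__getitem__))
        counts[pattern] += 1
    total = sum(counts.values())
    probs = [c / total for c in counts.values() if c > 0]
    entropy = -sum(p * math.log(p) for p in probs)
    return entropy / math.log(math.factorial(order)) if normalize else entropy
```

    Applied to the X and Y coordinate series of a signature, this yields two of the six descriptors; a monotone stroke gives entropy 0, an erratic one approaches 1.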

  10. Classification and Verification of Handwritten Signatures with Time Causal Information Theory Quantifiers

    PubMed Central

    Ospina, Raydonal; Frery, Alejandro C.

    2016-01-01

We present a new approach for handwritten signature classification and verification based on descriptors stemming from time causal information theory. The proposal uses the Shannon entropy, the statistical complexity, and the Fisher information evaluated over the Bandt and Pompe symbolization of the horizontal and vertical coordinates of signatures. These six features are easy and fast to compute, and they are the input to a One-Class Support Vector Machine classifier. The results are better than state-of-the-art online techniques that employ higher-dimensional feature spaces which often require specialized software and hardware. We assess the consistency of our proposal with respect to the size of the training sample, and we also use it to classify the signatures into meaningful groups. PMID:27907014

  11. Online adaptation and verification of VMAT

    SciTech Connect

    Crijns, Wouter; Defraene, Gilles; Depuydt, Tom; Haustermans, Karin; Van Herck, Hans; Maes, Frederik; Van den Heuvel, Frank

    2015-07-15

Purpose: This work presents a method for fast volumetric modulated arc therapy (VMAT) adaptation in response to interfraction anatomical variations. Additionally, plan parameters extracted from the adapted plans are used to verify the quality of these plans. The methods were tested as a prostate class solution and compared to replanning and to the current clinical practice. Methods: The proposed VMAT adaptation is an extension of the authors' previous intensity modulated radiotherapy (IMRT) adaptation. It follows a direct (forward) planning approach: the multileaf collimator (MLC) apertures are corrected in the beam's eye view (BEV) and the monitor units (MUs) are corrected using point dose calculations. All MLC and MU corrections are driven by the positions of four fiducial points only, without need for a full contour set. Quality assurance (QA) of the adapted plans is performed using plan parameters that can be calculated online and that have a relation to the delivered dose or the plan quality. Five potential parameters are studied for this purpose: the number of MUs, the equivalent field size (EqFS), the modulation complexity score (MCS), and the components of the MCS: the aperture area variability (AAV) and the leaf sequence variability (LSV). The full adaptation and its separate steps were evaluated in simulation experiments involving a prostate phantom subjected to various interfraction transformations. The efficacy of the current VMAT adaptation was scored by target mean dose (CTV_mean), conformity (CI_95%), tumor control probability (TCP), and normal tissue complication probability (NTCP). The impact of the adaptation on the plan parameters (QA) was assessed by comparison with prediction intervals (PI) derived from a statistical model of the typical variation of these parameters in a population of VMAT prostate plans (n = 63). These prediction intervals are the adaptation equivalent of the tolerance tables for couch shifts in the current clinical

  12. Age and gender-invariant features of handwritten signatures for verification systems

    NASA Astrophysics Data System (ADS)

    AbdAli, Sura; Putz-Leszczynska, Joanna

    2014-11-01

The handwritten signature is one of the most natural biometrics. As a behavioral biometric, a signature may differ with its owner's gender or age because of intrinsic or extrinsic factors. This paper presents the results of the authors' research on the influence of age and gender on verification factors. The experiments in this research were conducted using a database that contains signatures and their associated metadata. The algorithm used is based on the universal-forgery-feature idea, whereby a global classifier is able to classify a signature as genuine or forged without actual knowledge of the signature template and its owner. Additionally, dimensionality reduction with the mRMR (minimum redundancy maximum relevance) method is discussed.

  13. Spectral signature verification using statistical analysis and text mining

    NASA Astrophysics Data System (ADS)

    DeCoster, Mallory E.; Firpi, Alexe H.; Jacobs, Samantha K.; Cone, Shelli R.; Tzeng, Nigel H.; Rodriguez, Benjamin M.

    2016-05-01

In the spectral science community, numerous spectral signatures are stored in databases representing many sample materials collected from a variety of spectrometers and spectroscopists. Due to the variety and variability of the spectra that comprise many spectral databases, it is necessary to establish a metric for validating the quality of spectral signatures. This has been an area of great discussion and debate in the spectral science community. This paper discusses a method that independently validates two different aspects of a spectral signature, the textual meta-data and the numerical spectral data, to arrive at a final qualitative assessment. Results associated with the spectral data stored in the Signature Database (SigDB) are presented. The numerical data comprising a sample material's spectrum are validated based on statistical properties derived from an ideal population set. The quality of the test spectrum is ranked based on a spectral angle mapper (SAM) comparison to the mean spectrum derived from the population set. Additionally, the contextual data of a test spectrum are qualitatively analyzed using lexical-analysis text mining. This technique analyzes the syntax of the meta-data to uncover local learning patterns and trends within the spectral data that are indicative of the test spectrum's quality. Text mining applications have been successfully implemented for security (text encryption/decryption), biomedical, and marketing applications. The text-mining lexical-analysis algorithm is trained on the meta-data patterns of a subset of high- and low-quality spectra in order to have a model to apply to the entire SigDB data set. The statistical and textual methods combine to assess the quality of a test spectrum in a database without the need for an expert user. This method has been compared to other validation methods accepted by the spectral science community, and has provided promising results when a baseline spectral signature is
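
    The SAM comparison at the heart of the numerical validation is a standard, easily stated computation: the angle between two spectra viewed as vectors. A minimal sketch (the tolerance cutoff below is illustrative, not SigDB's):

```python
import numpy as np

def spectral_angle(test_spectrum, reference_spectrum):
    """Spectral Angle Mapper (SAM): the angle in radians between two
    spectra treated as vectors. Zero means identical spectral shape,
    regardless of overall brightness/scaling."""
    t = np.asarray(test_spectrum, dtype=float)
    r = np.asarray(reference_spectrum, dtype=float)
    cos_angle = np.dot(t, r) / (np.linalg.norm(t) * np.linalg.norm(r))
    return float(np.arccos(np.clip(cos_angle, -1.0, 1.0)))

def rank_against_population(test_spectrum, population, angle_tol=0.1):
    """Compare a test spectrum to the mean of an ideal population set,
    returning the SAM angle and a pass/fail flag against a tolerance."""
    mean_spectrum = np.mean(population, axis=0)
    angle = spectral_angle(test_spectrum, mean_spectrum)
    return angle, angle <= angle_tol
```

    Because SAM is scale-invariant, a spectrum that is merely brighter or dimmer than the population mean still scores an angle near zero.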

  14. On Hunting Animals of the Biometric Menagerie for Online Signature.

    PubMed

    Houmani, Nesma; Garcia-Salicetti, Sonia

    2016-01-01

Individuals behave differently with regard to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names to reflect their characteristics with respect to biometric systems. This concept has been illustrated for the face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined to automatically retrieve writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database.

  15. On Hunting Animals of the Biometric Menagerie for Online Signature

    PubMed Central

    Houmani, Nesma; Garcia-Salicetti, Sonia

    2016-01-01

Individuals behave differently with regard to biometric authentication systems. This fact was formalized in the literature by the concept of the Biometric Menagerie, which defines and labels user groups with animal names to reflect their characteristics with respect to biometric systems. This concept has been illustrated for the face, fingerprint, iris, and speech modalities. The present study extends the Biometric Menagerie to online signatures by proposing a novel methodology that ties specific quality measures for signatures to categories of the Biometric Menagerie. Such measures are combined to automatically retrieve writer categories of the extended version of the Biometric Menagerie. Performance analysis with different types of classifiers shows the pertinence of our approach on the well-known MCYT-100 database. PMID:27054836

  16. On-line infrared process signature measurements through combustion atmospheres

    NASA Astrophysics Data System (ADS)

    Zweibaum, F. M.; Kozlowski, A. T.; Surette, W. E., Jr.

    1980-01-01

A number of on-line infrared process signature measurements have been made through combustion atmospheres, including those in jet engines, piston engines, and coal gasification reactors. The difficulties involved include operation in the presence of pressures as high as 1800 psi, temperatures as high as 3200°F, and explosive, corrosive, and dust-laden atmospheres. Calibration problems have resulted from the use of purge gases to clear the viewing tubes and from the obscuration of the view ports by combustion products. A review of the solutions employed to counteract these problems is presented, and areas in which better solutions are required are suggested.

  17. Analysis of an indirect neutron signature for enhanced UF6 cylinder verification

    NASA Astrophysics Data System (ADS)

    Kulisek, J. A.; McDonald, B. S.; Smith, L. E.; Zalavadia, M. A.; Webster, J. B.

    2017-02-01

    The International Atomic Energy Agency (IAEA) currently uses handheld gamma-ray spectrometers combined with ultrasonic wall-thickness gauges to verify the declared enrichment of uranium hexafluoride (UF6) cylinders. The current method provides relatively low accuracy for the assay of 235U enrichment, especially for natural and depleted UF6. Furthermore, the current method provides no capability to assay the absolute mass of 235U in the cylinder due to the localized instrument geometry and limited penetration of the 186-keV gamma-ray signature from 235U. Also, the current verification process is a time-consuming component of on-site inspections at uranium enrichment plants. Toward the goal of a more-capable cylinder assay method, the Pacific Northwest National Laboratory has developed the hybrid enrichment verification array (HEVA). HEVA measures both the traditional 186-keV direct signature and a non-traditional, high-energy neutron-induced signature (HEVANT). HEVANT enables full-volume assay of UF6 cylinders by exploiting the relatively larger mean free paths of the neutrons emitted from the UF6. In this work, Monte Carlo modeling is used as the basis for characterizing HEVANT in terms of the individual contributions to HEVANT from nuclides and hardware components. Monte Carlo modeling is also used to quantify the intrinsic efficiency of HEVA for neutron detection in a cylinder-assay geometry. Modeling predictions are validated against neutron-induced gamma-ray spectra from laboratory measurements and a relatively large population of Type 30B cylinders spanning a range of enrichments. Implications of the analysis and findings on the viability of HEVA for cylinder verification are discussed, such as the resistance of the HEVANT signature to manipulation by the nearby placement of neutron-conversion materials.

  18. Online signature recognition using principal component analysis and artificial neural network

    NASA Astrophysics Data System (ADS)

    Hwang, Seung-Jun; Park, Seung-Je; Baek, Joong-Hwan

    2016-12-01

In this paper, we propose an algorithm for on-line signature recognition using the fingertip position in the air, captured from the depth image acquired by a Kinect. We extract 10 statistical features from each of the X, Y, and Z axes, which are invariant to shifting and scaling of the signature trajectories in three-dimensional space. An artificial neural network is adopted to solve the complex signature classification problem. The 30-dimensional feature vector is reduced to 10 principal components using principal component analysis, retaining 99.02% of the total variance. We implement the proposed algorithm and test it on actual on-line signatures. In experiments, we verify that the proposed method successfully classifies 15 different on-line signatures. Experimental results show a recognition rate of 98.47% when using only 10 feature vectors.

  19. On-line failure detection and damping measurement of aerospace structures by random decrement signatures

    NASA Technical Reports Server (NTRS)

    Cole, H. A., Jr.

    1973-01-01

Random decrement signatures of structures vibrating in a random environment are studied through the use of computer-generated and experimental data. The statistical properties obtained indicate that these signatures are stable in form and scale and hence should have wide application in on-line failure detection and damping measurement. On-line procedures are described, and equations for estimating the record-length requirements to obtain signatures of a prescribed precision are given.
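
    The random decrement technique itself is simple to state: average all response segments that begin at a fixed trigger condition, so the random excitation cancels and the free-decay signature remains. A minimal sketch of a level-crossing variant (trigger condition and segment handling are illustrative choices):

```python
import numpy as np

def random_decrement(signal, trigger_level, signature_length):
    """Average the segments that start each time the response crosses
    `trigger_level` upward. The random (excitation) component averages
    toward zero, leaving the free-decay signature from which damping
    can be estimated."""
    segments = []
    i = 1
    while i < len(signal) - signature_length:
        if signal[i - 1] < trigger_level <= signal[i]:
            segments.append(signal[i:i + signature_length])
            i += signature_length  # skip ahead: non-overlapping segments
        else:
            i += 1
    if not segments:
        raise ValueError("trigger level never crossed")
    return np.mean(segments, axis=0)
```

    Every averaged segment starts near the trigger level, which is why the signature's initial value and subsequent decay are stable in form and scale, as the abstract notes.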

  20. Optical security verification by synthesizing thin films with unique polarimetric signatures.

    PubMed

    Carnicer, Artur; Arteaga, Oriol; Pascual, Esther; Canillas, Adolf; Vallmitjana, Santiago; Javidi, Bahram; Bertran, Enric

    2015-11-15

This Letter reports the production and optical polarimetric verification of codes based on thin-film technology for security applications. Because thin-film structures display distinctive polarization signatures, these data are used to authenticate the encoded message. Samples are analyzed using an imaging ellipsometer able to measure the 16 components of the Mueller matrix. As a result, the behavior of the thin film under polarized light becomes completely characterized. This information is used to distinguish between true and false codes by means of correlation. Without the imaging optics, the components of the Mueller matrix become noise-like distributions and, consequently, the encoded message is no longer available. A set of Stokes vectors is then generated numerically for any polarization state of the illuminating beam, so machine learning techniques can be used to perform classification. We show that successful authentication is possible using the k-nearest neighbors algorithm on thin-film codes that have been anisotropically phase-encoded with a pseudorandom phase code.

  1. A method for online verification of adapted fields using an independent dose monitor

    SciTech Connect

Chang, Jina; Norrlinger, Bernhard D.; Heaton, Robert K.; Jaffray, David A.; Cho, Young-Bin; Islam, Mohammad K.; Mahon, Robert

    2013-07-15

Purpose: Clinical implementation of online adaptive radiotherapy requires generation of modified fields and a method of dosimetric verification in a short time. We present a method of treatment field modification to account for patient setup error, and an online method of verification using an independent monitoring system. Methods: The fields are modified by translating each multileaf collimator (MLC) defined aperture in the direction of the patient setup error, and magnifying to account for distance variation to the marked isocentre. A modified version of a previously reported online beam monitoring system, the integral quality monitoring (IQM) system, was investigated for validation of adapted fields. The system consists of a large-area ion chamber with a spatial gradient in electrode separation to provide a spatially sensitive signal for each beam segment, mounted below the MLC, and a calculation algorithm to predict the signal. IMRT plans of ten prostate patients were modified in response to six randomly chosen setup errors in three orthogonal directions. Results: A total of approximately 49 beams for the modified fields were verified by the IQM system, of which 97% of measured IQM signals agree with the predicted value to within 2%. Conclusions: The modified IQM system was found to be suitable for online verification of adapted treatment fields.

  2. SU-E-T-602: Patient-Specific Online Dose Verification Based On Transmission Detector Measurements

    SciTech Connect

Thoelking, J; Yuvaraj, S; Jens, F; Lohr, F; Wenz, F; Wertz, H

    2015-06-15

Purpose: Intensity modulated radiotherapy requires a comprehensive quality assurance program in general and, ideally, independent verification of dose delivery. Since conventional 2D detector arrays allow only pre-treatment verification, there is a debate concerning the need for online dose verification. This study presents the clinical performance, including dosimetric plan verification in 2D as well as in 3D, and the error detection abilities of a new transmission detector (TD) for online dose verification of a 6 MV photon beam. Methods: To validate the dosimetric performance of the new device, dose reconstructions based on TD measurements were compared to a conventional pre-treatment verification method (reference) and to the treatment planning system (TPS) for 18 IMRT and VMAT treatment plans. Furthermore, dose reconstruction inside the patient based on TD read-out was evaluated by comparing various dose volume indices and 3D gamma evaluations against independent dose computation and the TPS. To investigate the sensitivity of the new device, different types of systematic and random errors for leaf positions and linac output were introduced into IMRT treatment sequences. Results: The 2D gamma index evaluation of transmission detector based dose reconstruction showed excellent agreement for all IMRT and VMAT plans compared to reference measurements (99.3±1.2)% and TPS (99.1±0.7)%. Good agreement was also obtained for 3D dose reconstruction based on TD read-out compared to dose computation (mean gamma value of PTV = 0.27±0.04). Only a minimal dose underestimation within the target volume was observed when analyzing DVH indices (<1%). Positional errors in leaf banks larger than 1 mm and errors in linac output larger than 2% could be clearly identified with the TD. Conclusion: Since the 2D and 3D evaluations for all IMRT and VMAT treatment plans were in excellent agreement with reference measurements and dose computation, the new TD is suitable to qualify for routine treatment plan
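
    The gamma index used for these pass rates combines a dose-difference tolerance with a distance-to-agreement tolerance. A heavily simplified 1-D, grid-point-only sketch conveys the idea (clinical tools interpolate the reference profile and work in 2D/3D; the 3%/3 mm defaults below are common convention, not necessarily this study's criteria):

```python
import numpy as np

def gamma_pass_rate_1d(measured, computed, dx=1.0, dose_tol=0.03, dist_tol=3.0):
    """Simplified 1-D global gamma evaluation: for each measured point,
    take the minimum combined dose-difference / distance-to-agreement
    metric against the computed profile, and report the fraction of
    points with gamma <= 1."""
    measured = np.asarray(measured, dtype=float)
    computed = np.asarray(computed, dtype=float)
    dose_ref = dose_tol * computed.max()          # global dose criterion
    x = np.arange(len(computed)) * dx             # positions in mm
    passed = 0
    for xi, mi in zip(x, measured):
        gamma_sq = ((x - xi) / dist_tol) ** 2 + ((computed - mi) / dose_ref) ** 2
        if np.sqrt(gamma_sq.min()) <= 1.0:
            passed += 1
    return passed / len(measured)
```

    A point passes if the computed profile comes close enough in either dose or position (or a combination), which is why gamma is more forgiving than a pure dose-difference map in steep gradients.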

  3. A Prototype of Mathematical Treatment of Pen Pressure Data for Signature Verification.

    PubMed

    Li, Chi-Keung; Wong, Siu-Kay; Chim, Lai-Chu Joyce

    2017-03-26

    A prototype using simple mathematical treatment of the pen pressure data recorded by a digital pen movement recording device was derived. In this study, a total of 48 sets of signature and initial specimens were collected. Pearson's correlation coefficient was used to compare the data of the pen pressure patterns. From the 820 pair comparisons of the 48 sets of genuine signatures, a high degree of matching was found in which 95.4% (782 pairs) and 80% (656 pairs) had rPA > 0.7 and rPA > 0.8, respectively. In the comparison of the 23 forged signatures with their corresponding control signatures, 20 of them (89.2% of pairs) had rPA values < 0.6, showing a lower degree of matching when compared with the results of the genuine signatures. The prototype could be used as a complementary technique to improve the objectivity of signature examination and also has a good potential to be developed as a tool for automated signature identification.
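
    The comparison statistic here is Pearson's correlation between pen-pressure patterns. A minimal sketch of that flavor of computation, with a resampling step so signatures of different durations can be compared point by point (the paper's exact rPA preprocessing is not given in the abstract):

```python
import numpy as np

def resample(curve, n_points=256):
    """Linearly resample a pen-pressure-versus-time curve to a fixed
    length, so two signatures of different durations can be compared
    point by point."""
    curve = np.asarray(curve, dtype=float)
    old_t = np.linspace(0.0, 1.0, len(curve))
    new_t = np.linspace(0.0, 1.0, n_points)
    return np.interp(new_t, old_t, curve)

def pressure_correlation(pressure_a, pressure_b):
    """Pearson's r between two resampled pen-pressure patterns; values
    near 1 indicate closely matching pressure dynamics."""
    a = resample(pressure_a)
    b = resample(pressure_b)
    return float(np.corrcoef(a, b)[0, 1])
```

    Under this scheme, thresholds like r > 0.7 or r > 0.8 for genuine pairs and r < 0.6 for forgeries map directly onto the figures reported in the abstract.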

  4. Is Your Avatar Ethical? On-Line Course Tools that Are Methods for Student Identity and Verification

    ERIC Educational Resources Information Center

    Semple, Mid; Hatala, Jeffrey; Franks, Patricia; Rossi, Margherita A.

    2011-01-01

    On-line college courses present a mandate for student identity verification for accreditation and funding sources. Student authentication requires course modification to detect fraud and misrepresentation of authorship in assignment submissions. The reality is that some college students cheat in face-to-face classrooms; however, the potential for…

  5. Efficient cost-sensitive human-machine collaboration for offline signature verification

    NASA Astrophysics Data System (ADS)

    Coetzer, Johannes; Swanepoel, Jacques; Sabourin, Robert

    2012-01-01

    We propose a novel strategy for the optimal combination of human and machine decisions in a cost-sensitive environment. The proposed algorithm should be especially beneficial to financial institutions where off-line signatures, each associated with a specific transaction value, require authentication. When presented with a collection of genuine and fraudulent training signatures, produced by so-called guinea pig writers, the proficiency of a workforce of human employees and a score-generating machine can be estimated and represented in receiver operating characteristic (ROC) space. Using a set of Boolean fusion functions, the majority vote decision of the human workforce is combined with each threshold-specific machine-generated decision. The performance of the candidate ensembles is estimated and represented in ROC space, after which only the optimal ensembles and associated decision trees are retained. When presented with a questioned signature linked to an arbitrary writer, the system first uses the ROC-based cost gradient associated with the transaction value to select the ensemble that minimises the expected cost, and then uses the corresponding decision tree to authenticate the signature in question. We show that, when utilising the entire human workforce, the incorporation of a machine streamlines the authentication process and decreases the expected cost for all operating conditions.
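
    The transaction-value-driven selection of an ROC operating point can be sketched as an expected-cost minimisation. The prior forgery rate and fixed handling fee below are illustrative stand-ins for quantities the deploying institution would supply; the paper's actual decision-tree machinery is not reproduced here.

```python
def expected_cost(far, frr, prior_forgery, cost_false_accept, cost_false_reject):
    """Expected cost of operating an ensemble at ROC point (FAR, FRR)."""
    return (prior_forgery * far * cost_false_accept
            + (1.0 - prior_forgery) * frr * cost_false_reject)

def pick_operating_point(roc_points, transaction_value,
                         prior_forgery=0.01, handling_fee=5.0):
    """Select the ROC point that minimises expected cost when a false
    accept forfeits the transaction value and a false reject costs a
    fixed handling fee."""
    costs = [expected_cost(far, frr, prior_forgery,
                           transaction_value, handling_fee)
             for far, frr in roc_points]
    return roc_points[costs.index(min(costs))]
```

    The behaviour matches the paper's intuition: large transactions push the system toward strict (low-FAR) operating points, small ones toward lenient (low-FRR) points.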

  6. WE-D-BRA-04: Online 3D EPID-Based Dose Verification for Optimum Patient Safety

    SciTech Connect

    Spreeuw, H; Rozendaal, R; Olaciregui-Ruiz, I; Mans, A; Mijnheer, B; Herk, M van; Gonzalez, P

    2015-06-15

    Purpose: To develop an online 3D dose verification tool based on EPID transit dosimetry to ensure optimum patient safety in radiotherapy treatments. Methods: A new software package was developed which processes EPID portal images online using a back-projection algorithm for the 3D dose reconstruction. The package processes portal images faster than the acquisition rate of the portal imager (∼ 2.5 fps). After a portal image is acquired, the software seeks for “hot spots” in the reconstructed 3D dose distribution. A hot spot is in this study defined as a 4 cm{sup 3} cube where the average cumulative reconstructed dose exceeds the average total planned dose by at least 20% and 50 cGy. If a hot spot is detected, an alert is generated resulting in a linac halt. The software has been tested by irradiating an Alderson phantom after introducing various types of serious delivery errors. Results: In our first experiment the Alderson phantom was irradiated with two arcs from a 6 MV VMAT H&N treatment having a large leaf position error or a large monitor unit error. For both arcs and both errors the linac was halted before dose delivery was completed. When no error was introduced, the linac was not halted. The complete processing of a single portal frame, including hot spot detection, takes about 220 ms on a dual hexacore Intel Xeon 25 X5650 CPU at 2.66 GHz. Conclusion: A prototype online 3D dose verification tool using portal imaging has been developed and successfully tested for various kinds of gross delivery errors. The detection of hot spots was proven to be effective for the timely detection of these errors. Current work is focused on hot spot detection criteria for various treatment sites and the introduction of a clinical pilot program with online verification of hypo-fractionated (lung) treatments.
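
    The hot-spot rule (a cube whose mean reconstructed dose exceeds the mean planned dose by both 20% and 50 cGy) lends itself to a compact sketch. The brute-force scan below is illustrative only; the cube size is given in voxels rather than the paper's 4 cm³, and a real-time tool would use cumulative-sum volumes instead of nested loops.

```python
import numpy as np

def has_hot_spot(reconstructed, planned, cube=2, rel_excess=0.20, abs_excess=50.0):
    """Scan every cube of `cube`^3 voxels and flag the delivery if the
    mean reconstructed dose (cGy) exceeds the mean planned dose by more
    than `rel_excess` (fractional) AND `abs_excess` (cGy), mirroring
    the paper's 20% / 50 cGy hot-spot criterion."""
    nx, ny, nz = reconstructed.shape
    for i in range(nx - cube + 1):
        for j in range(ny - cube + 1):
            for k in range(nz - cube + 1):
                recon_mean = reconstructed[i:i+cube, j:j+cube, k:k+cube].mean()
                plan_mean = planned[i:i+cube, j:j+cube, k:k+cube].mean()
                if (recon_mean > plan_mean * (1.0 + rel_excess)
                        and recon_mean - plan_mean > abs_excess):
                    return True
    return False
```

    Requiring both the relative and absolute excess keeps low-dose regions from triggering on percentage alone, and high-dose regions from triggering on a clinically minor absolute difference.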

  7. Statistical evaluation of the influence of writing postures on on-line signatures. Study of the impact of time.

    PubMed

    Thiéry, A; Marquis, R; Montani, I

    2013-07-10

    The aim of this study is to investigate the influence of unusual writing positions on a person's signature, in comparison to a standard writing position. Ten writers were asked to provide their signature six times in each of four different writing positions, including the standard one. To take into consideration the effect of day-to-day variation, this process was repeated over 12 sessions, giving a total of 288 signatures per subject. The signatures were collected simultaneously in off-line and on-line acquisition modes, using an interactive tablet and a ballpoint pen. Unidimensional variables (height-to-width ratio; time with or without in-air displacement) and time-dependent variables (pressure; X and Y coordinates; altitude and azimuth angles) were extracted from each signature. For the unidimensional variables, the position effect was assessed through ANOVA and Dunnett contrast tests. For the time-dependent variables, the signatures were compared using dynamic time warping, and the position effect was evaluated through classification by linear discriminant analysis. Both types of variables gave similar results: no general tendency regarding the position factor could be highlighted. The influence of the position factor varies according to the subject as well as the variable studied. The impact of the session factor was shown to mask the impact that could be ascribed to the writing position factor: the day-to-day variation has a greater effect on the studied signature variables than the position factor. The results of this study suggest guidelines for best practice in the area of signature comparisons and demonstrate the importance of a signature collection procedure covering an adequate number of sampling sessions, with a sufficient number of samples per session.
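
    Dynamic time warping, used in the study above to compare the time-dependent variables, aligns two sequences of possibly different lengths by minimizing a cumulative point-wise cost. A textbook sketch (not the authors' exact configuration):

```python
import numpy as np

def dtw_distance(a, b):
    """Classic dynamic time warping distance between two 1D sequences,
    e.g. the pen-pressure profiles of two signatures. D[i, j] holds the
    minimal cumulative cost of aligning a[:i] with b[:j]."""
    n, m = len(a), len(b)
    D = np.full((n + 1, m + 1), np.inf)
    D[0, 0] = 0.0
    for i in range(1, n + 1):
        for j in range(1, m + 1):
            cost = abs(a[i - 1] - b[j - 1])
            # extend the cheapest of: insertion, deletion, match
            D[i, j] = cost + min(D[i - 1, j], D[i, j - 1], D[i - 1, j - 1])
    return D[n, m]
```

    Because the warping path may repeat samples, two signatures written at different speeds can still align with low cost, which is why DTW is a common choice for on-line signature comparison.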

  8. Aging in biometrics: an experimental analysis on on-line signature.

    PubMed

    Galbally, Javier; Martinez-Diaz, Marcos; Fierrez, Julian

    2013-01-01

    The first consistent and reproducible evaluation of the effect of aging on dynamic signature is reported. Experiments are carried out on a database generated from two previous datasets which were acquired, under very similar conditions, in 6 sessions distributed in a 15-month time span. Three different systems, representing the current most popular approaches in signature recognition, are used in the experiments, proving the degradation suffered by this trait with the passing of time. Several template update strategies are also studied as possible measures to reduce the impact of aging on the system's performance. Different results regarding the way in which signatures tend to change with time, and their most and least stable features, are also given.

  9. Aging in Biometrics: An Experimental Analysis on On-Line Signature

    PubMed Central

    Galbally, Javier; Martinez-Diaz, Marcos; Fierrez, Julian

    2013-01-01

    The first consistent and reproducible evaluation of the effect of aging on dynamic signature is reported. Experiments are carried out on a database generated from two previous datasets which were acquired, under very similar conditions, in 6 sessions distributed in a 15-month time span. Three different systems, representing the current most popular approaches in signature recognition, are used in the experiments, proving the degradation suffered by this trait with the passing of time. Several template update strategies are also studied as possible measures to reduce the impact of aging on the system’s performance. Different results regarding the way in which signatures tend to change with time, and their most and least stable features, are also given. PMID:23894557

  10. Grazing-Angle Fourier Transform Infrared Spectroscopy for Online Surface Cleanliness Verification. Year 1

    DTIC Science & Technology

    2000-07-01

    As part of the Online Surface Cleanliness Project, the Naval Facilities Engineering Service Center (NFESC) conducted a study of grazing-angle...demonstrated in the laboratory for the detection of organic contaminant residues on reflective surfaces. Applications where surface cleanliness is critical

  11. Workload and Interaction: Unisa's Signature Courses--A Design Template for Transitioning to Online DE?

    ERIC Educational Resources Information Center

    Hülsmann, Thomas; Shabalala, Lindiwe

    2016-01-01

    The principal contradiction of online distance education is the disparity that exists between economies of scale and the new interactive capabilities of digital technologies. This is particularly felt where mega-universities in developing countries seek to make better use of these affordances while at the same time protecting their economies of…

  12. Authentication Based on Pole-zero Models of Signature Velocity

    PubMed Central

    Rashidi, Saeid; Fallah, Ali; Towhidkhah, Farzad

    2013-01-01

    With the increase of communication and financial transactions over the internet, on-line signature verification is an accepted biometric technology for access control and plays a significant role in authentication and authorization in modern society. Fast and precise algorithms for signature verification are therefore very attractive. The goal of this paper is the modeling of the velocity signal, whose pattern and properties are stable for each person. Using pole-zero models based on the discrete cosine transform, a precise modeling method is proposed, and features are then extracted from the strokes. Using linear, Parzen-window, and support vector machine classifiers, the signature verification technique was tested with a large number of authentic and forged signatures and demonstrated good potential. The signatures were collected from three different databases: a proprietary database, and the SVC2004 and Sabanci University signature database (SUSIG) benchmarks. Experimental results on the Persian, SVC2004 and SUSIG databases show that our method achieves equal error rates of 5.91%, 5.62% and 3.91%, respectively, on skilled forgeries. PMID:24696797
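
    The equal error rate quoted above is the standard summary metric for verification systems: the operating point where the false rejection rate equals the false acceptance rate. A generic evaluation sketch (hypothetical function name, not tied to the paper's pole-zero models or classifiers):

```python
import numpy as np

def equal_error_rate(genuine_scores, forgery_scores):
    """Estimate the EER from verification scores, where a higher score
    means 'more genuine-like'. Sweeps all observed scores as candidate
    thresholds and returns the error rate where FRR and FAR are
    closest."""
    genuine = np.asarray(genuine_scores, float)
    forgery = np.asarray(forgery_scores, float)
    best_gap, eer = np.inf, None
    for t in np.sort(np.concatenate([genuine, forgery])):
        frr = np.mean(genuine < t)    # genuine wrongly rejected
        far = np.mean(forgery >= t)   # forgeries wrongly accepted
        if abs(frr - far) < best_gap:
            best_gap, eer = abs(frr - far), (frr + far) / 2.0
    return eer
```

    Skilled-forgery EERs such as the 3.91-5.91% reported here are computed this way per database, with forgers who have seen the genuine signature supplying the impostor scores.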

  13. MARQ: an online tool to mine GEO for experiments with similar or opposite gene expression signatures.

    PubMed

    Vazquez, Miguel; Nogales-Cadenas, Ruben; Arroyo, Javier; Botías, Pedro; García, Raul; Carazo, Jose M; Tirado, Francisco; Pascual-Montano, Alberto; Carmona-Saez, Pedro

    2010-07-01

    The enormous amount of data available in public gene expression repositories such as Gene Expression Omnibus (GEO) offers an inestimable resource to explore gene expression programs across several organisms and conditions. This information can be used to discover experiments that induce similar or opposite gene expression patterns to a given query, which in turn may lead to the discovery of new relationships among diseases, drugs or pathways, as well as the generation of new hypotheses. In this work, we present MARQ, a web-based application that allows researchers to compare a query set of genes, e.g. a set of over- and under-expressed genes, against a signature database built from GEO datasets for different organisms and platforms. MARQ offers an easy-to-use and integrated environment to mine GEO, in order to identify conditions that induce similar or opposite gene expression patterns to a given experimental condition. MARQ also includes additional functionalities for the exploration of the results, including a meta-analysis pipeline to find genes that are differentially expressed across different experiments. The application is freely available at http://marq.dacya.ucm.es.

  14. Online Kidney Position Verification Using Non-Contrast Radiographs on a Linear Accelerator with on Board KV X-Ray Imaging Capability

    SciTech Connect

    Willis, David J. Kron, Tomas; Hubbard, Patricia; Haworth, Annette; Wheeler, Greg; Duchesne, Gillian M.

    2009-01-01

    The kidneys are dose-limiting organs in abdominal radiotherapy. Kilovoltage (kV) radiographs can be acquired using on-board imager (OBI)-equipped linear accelerators with better soft tissue contrast and lower radiation doses than conventional portal imaging. A feasibility study was conducted to test the suitability of anterior-posterior (AP) non-contrast kV radiographs acquired at treatment time for online kidney position verification. Anthropomorphic phantoms were used to evaluate image quality and radiation dose. Institutional Review Board approval was given for a pilot study that enrolled 5 adults and 5 children. Customized digitally reconstructed radiographs (DRRs) were generated to provide a priori information on kidney shape and position. Radiotherapy treatment staff performed online evaluation of kidney visibility on OBI radiographs. Kidney dose measured in a pediatric anthropomorphic phantom was 0.1 cGy for kV imaging and 1.7 cGy for MV imaging. Kidneys were rated as well visualized in 60% of patients (90% confidence interval, 34-81%). The likelihood of visualization appears to be influenced by the relative AP separation of the abdomen and kidneys, the axial profile of the kidneys, and their relative contrast with surrounding structures. Online verification of kidney position using AP non-contrast kV radiographs on an OBI-equipped linear accelerator appears feasible for patients with suitable abdominal anatomy. Kidney position information provided is limited to 2-dimensional 'snapshots,' but this is adequate in some clinical situations and potentially advantageous in respiratory-correlated treatments. Successful clinical implementation requires customized partial DRRs, appropriate imaging parameters, and credentialing of treatment staff.

  15. Online kidney position verification using non-contrast radiographs on a linear accelerator with on board KV X-Ray imaging capability.

    PubMed

    Willis, David J; Kron, Tomas; Hubbard, Patricia; Haworth, Annette; Wheeler, Greg; Duchesne, Gillian M

    2009-01-01

    The kidneys are dose-limiting organs in abdominal radiotherapy. Kilovoltage (kV) radiographs can be acquired using on-board imager (OBI)-equipped linear accelerators with better soft tissue contrast and lower radiation doses than conventional portal imaging. A feasibility study was conducted to test the suitability of anterior-posterior (AP) non-contrast kV radiographs acquired at treatment time for online kidney position verification. Anthropomorphic phantoms were used to evaluate image quality and radiation dose. Institutional Review Board approval was given for a pilot study that enrolled 5 adults and 5 children. Customized digitally reconstructed radiographs (DRRs) were generated to provide a priori information on kidney shape and position. Radiotherapy treatment staff performed online evaluation of kidney visibility on OBI radiographs. Kidney dose measured in a pediatric anthropomorphic phantom was 0.1 cGy for kV imaging and 1.7 cGy for MV imaging. Kidneys were rated as well visualized in 60% of patients (90% confidence interval, 34-81%). The likelihood of visualization appears to be influenced by the relative AP separation of the abdomen and kidneys, the axial profile of the kidneys, and their relative contrast with surrounding structures. Online verification of kidney position using AP non-contrast kV radiographs on an OBI-equipped linear accelerator appears feasible for patients with suitable abdominal anatomy. Kidney position information provided is limited to 2-dimensional "snapshots," but this is adequate in some clinical situations and potentially advantageous in respiratory-correlated treatments. Successful clinical implementation requires customized partial DRRs, appropriate imaging parameters, and credentialing of treatment staff.

  16. WE-EF-303-06: Feasibility of PET Image-Based On-Line Proton Beam-Range Verification with Simulated Uniform Phantom and Human Brain Studies

    SciTech Connect

    Lou, K; Sun, X; Zhu, X; Grosshans, D; Clark, J; Shao, Y

    2015-06-15

    Purpose: To study the feasibility of clinical on-line proton beam range verification with PET imaging. Methods: We simulated a 179.2-MeV proton beam with 5-mm diameter irradiating a PMMA phantom of human brain size, which was then imaged by a brain PET with 300*300*100-mm{sup 3} FOV and different system sensitivities and spatial resolutions. We calculated the mean and standard deviation of the positron activity range (AR) from reconstructed PET images, with respect to different data acquisition times (from 5 sec to 300 sec in 5-sec steps). We also developed a technique, “Smoothed Maximum Value (SMV)”, to improve AR measurement under a given dose. Furthermore, we simulated a human brain irradiated by a 110-MeV proton beam of 50-mm diameter with 0.3-Gy dose at the Bragg peak, imaged by the above PET system with 40% system sensitivity at the center of the FOV and 1.7-mm spatial resolution. Results: MC simulations on the PMMA phantom showed that, regardless of PET system sensitivities and spatial resolutions, the accuracy and precision of AR were proportional to the reciprocal of the square root of the image count if image smoothing was not applied. With image smoothing or the SMV method, the accuracy and precision could be substantially improved. For a cylindrical PMMA phantom (200 mm diameter and 290 mm long), the accuracy and precision of AR measurement could reach 1.0 and 1.7 mm with 100 sec of data acquired by the brain PET. The study with a human brain showed it was feasible to achieve sub-millimeter accuracy and precision of AR measurement with an acquisition time within 60 sec. Conclusion: This study established the relationship between count statistics and the accuracy and precision of activity-range verification. It showed the feasibility of clinical on-line beam-range verification with high-performance PET systems and improved AR measurement techniques. Cancer Prevention and Research Institute of Texas grant RP120326, NIH grant R21CA187717, The Cancer Center Support (Core) Grant CA
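
    The inverse-square-root scaling reported above is the familiar counting-statistics behaviour: the spread of an estimate built from N independent counts falls as 1/sqrt(N). A toy numerical check (not the paper's simulation; the seeded generator and unit-variance samples are illustrative stand-ins for image counts):

```python
import numpy as np

rng = np.random.default_rng(0)

def spread_of_mean(n, trials=2000):
    """Standard deviation, over many trials, of the mean of n
    unit-variance samples; a stand-in for the spread of an
    activity-range estimate built from n image counts."""
    means = rng.normal(0.0, 1.0, size=(trials, n)).mean(axis=1)
    return means.std()

s100 = spread_of_mean(100)
s400 = spread_of_mean(400)
# quadrupling the counts should roughly halve the spread
```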

  17. SU-E-J-146: A Research of PET-CT SUV Range for the Online Dose Verification in Carbon Ion Radiation Therapy

    SciTech Connect

    Sun, L; Hu, W; Moyers, M; Zhao, J; Hsi, W

    2015-06-15

    Purpose: Positron-emitting isotope distributions can be used for the image fusion of the carbon ion planning CT and online target verification PETCT, after radiation in the same decay period,the relationship between the same target volume and the SUV value of different every single fraction dose can be found,then the range of SUV for the radiation target could be decided.So this online range also can provide reference for the correlation and consistency in planning target dose verification and evaluation for the clinical trial. Methods: The Rando head phantom can be used as real body,the 10cc cube volume target contouring is done,beam ISO Center depth is 7.6cm and the 90 degree fixed carbon ion beams should be delivered in single fraction effective dose of 2.5GyE,5GyE and 8GyE.After irradiation,390 seconds later the 30 minutes PET-CT scanning is performed,parameters are set to 50Kg virtual weight,0.05mCi activity.MIM Maestro is used for the image processing and fusion,five 16mm diameter SUV spheres have been chosen in the different direction in the target.The average SUV in target for different fraction dose can be found by software. Results: For 10cc volume target,390 seconds decay period,the Single fraction effective dose equal to 2.5Gy,Ethe SUV mean value is 3.42,the relative range is 1.72 to 6.83;Equal to 5GyE,SUV mean value is 9.946,the relative range is 7.016 to 12.54;Equal or above to 8GyE,SUV mean value is 20.496,the relative range is 11.16 to 34.73. Conclusion: Making an evaluation for accuracy of the dose distribution using the SUV range which is from the planning CT with after treatment online PET-CT fusion for the normal single fraction carbon ion treatment is available.Even to the plan which single fraction dose is above 2GyE,in the condition of other parameters all the same,the SUV range is linearly dependent with single fraction dose,so this method also can be used in the hyper-fraction treatment plan.

  18. SU-E-T-582: On-Line Dosimetric Verification of Respiratory Gated Volumetric Modulated Arc Therapy Using the Electronic Portal Imaging Device

    SciTech Connect

    Schaly, B; Gaede, S; Xhaferllari, I

    2015-06-15

    Purpose: To investigate the clinical utility of on-line verification of respiratory gated VMAT dosimetry during treatment. Methods: Portal dose images were acquired during treatment in integrated mode on a Varian TrueBeam (v. 1.6) linear accelerator for gated lung and liver patients that used flattening filtered beams. The source to imager distance (SID) was set to 160 cm to ensure imager clearance in case the isocenter was off midline. Note that acquisition of integrated images resulted in no extra dose to the patient. Fraction 1 was taken as baseline and all portal dose images were compared to that of the baseline, where the gamma comparison and dose difference were used to measure day-to-day exit dose variation. All images were analyzed in the Portal Dosimetry module of Aria (v. 10). The portal imager on the TrueBeam was calibrated by following the instructions for dosimetry calibration in service mode, where we define 1 calibrated unit (CU) equal to 1 Gy for 10×10 cm field size at 100 cm SID. This reference condition was measured frequently to verify imager calibration. Results: The gamma value (3%, 3 mm, 5% threshold) ranged between 92% and 100% for the lung and liver cases studied. The exit dose can vary by as much as 10% of the maximum dose for an individual fraction. The integrated images combined with the information given by the corresponding on-line soft tissue matched cone-beam computed tomography (CBCT) images were useful in explaining dose variation. For gated lung treatment, dose variation was mainly due to the diaphragm position. For gated liver treatment, the dose variation was due to both diaphragm position and weight loss. Conclusion: Integrated images can be useful in verifying dose delivery consistency during respiratory gated VMAT, although the CBCT information is needed to explain dose differences due to anatomical changes.
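
    The gamma comparison cited above (3%, 3 mm, 5% threshold) combines a dose-difference tolerance with a distance-to-agreement tolerance. A simplified 1D version of the standard gamma test (a sketch only, not Varian's Portal Dosimetry implementation):

```python
import numpy as np

def gamma_pass_rate(ref, eval_, spacing_mm, dose_tol=0.03,
                    dist_mm=3.0, threshold=0.05):
    """1D global gamma analysis: for each reference point above the
    low-dose threshold, find the minimum combined dose/distance metric
    over the evaluated profile; the point passes if gamma <= 1.
    Returns the fraction of evaluated points that pass."""
    ref = np.asarray(ref, float)
    eval_ = np.asarray(eval_, float)
    dmax = ref.max()
    x = np.arange(len(ref)) * spacing_mm
    passed = total = 0
    for i, d_r in enumerate(ref):
        if d_r < threshold * dmax:
            continue  # ignore the low-dose region
        dd = (eval_ - d_r) / (dose_tol * dmax)  # dose-difference term
        dx = (x - x[i]) / dist_mm               # distance-to-agreement term
        gamma = np.sqrt(dd ** 2 + dx ** 2).min()
        total += 1
        passed += gamma <= 1.0
    return passed / total
```

    The clinical version extends the same metric to 2D portal-dose images, where per-fraction pass rates like the 92-100% reported here are tallied against the baseline fraction.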

  19. Creation of a Reference Image with Monte Carlo Simulations for Online EPID Verification of Daily Patient Setup

    SciTech Connect

    Descalle, M-A; Chuang, C; Pouliot, J

    2002-01-30

    Patient positioning accuracy remains an issue for external beam radiotherapy. Currently, kilovoltage verification images are used as reference by clinicians to compare the actual patient treatment position with the planned position. These images are qualitatively different from treatment-time megavoltage portal images. This study will investigate the feasibility of using PEREGRINE, a 3D Monte Carlo calculation engine, to create reference images for portal image comparisons. Portal images were acquired using an amorphous-silicon flat-panel EPID for (1) the head and pelvic sections of an anthropomorphic phantom with 7-8 mm displacements applied, and (2) a prostate patient on five treatment days. Planning CT scans were used to generate simulated reference images with PEREGRINE. A correlation algorithm quantified the setup deviations between simulated and portal images. Monte Carlo simulated images exhibit similar qualities to portal images, the phantom slabs appear clearly. Initial positioning differences and applied displacements were detected and quantified. We find that images simulated with Monte Carlo methods can be used as reference images to detect and quantify set-up errors during treatment.

  20. SU-E-J-46: Development of a Compton Camera Prototype for Online Range Verification of Laser-Accelerated Proton Beams

    SciTech Connect

    Thirolf, PG; Bortfeldt, J; Lang, C; Parodi, K; Aldawood, S; Boehmer, M; Gernhaeuser, R; Maier, L; Castelhano, I; Kolff, H van der; Schaart, DR

    2014-06-01

    Purpose: Development of a photon detection system designed for online range verification of laser-accelerated proton beams via prompt-gamma imaging of nuclear reactions. Methods: We are developing a Compton camera for the position-sensitive detection of prompt photons emitted from nuclear reactions between the proton beam and biological samples. The detector is designed to reconstruct the photon source origin not only from the Compton scattering kinematics of the primary photon, but also to allow tracking of the Compton-scattered electrons. Results: Simulation studies resulted in a Compton camera design based on a LaBr{sub 3}(Ce) scintillation crystal acting as absorber, preceded by a stacked array of 6 double-sided silicon strip detectors as scatterers. From the design simulations, an angular resolution of ≤ 2° and an image reconstruction efficiency of 10{sup −3}−10{sup −5} (at 2–6 MeV) can be expected. The LaBr{sub 3} crystal has been characterized with calibration sources, yielding a time resolution of 273 ps (FWHM) and an energy resolution of about 3.8% (FWHM). Using a collimated (1 mm diameter) {sup 137}Cs calibration source, the light distribution was measured for each of 64 pixels (6×6 mm{sup 2}). Data were also taken with 0.5 mm collimation and 0.5 mm step size to generate a reference library of light distributions that allows the interaction position of the initial photon to be reconstructed using a k-nearest neighbor (k-NN) algorithm developed by the Delft group. Conclusion: The Compton-camera approach for prompt-gamma detection offers promising perspectives for ion beam range verification. A Compton camera prototype is presently being developed and characterized in Garching. Furthermore, an arrangement of, e.g., 4 camera modules could even be used in a ‘gamma-PET’ mode to detect delayed annihilation radiation from positron emitters in the irradiation interrupts (with improved performance in the presence of an
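
    The idea of k-NN position reconstruction from a library of reference light distributions can be sketched generically as follows (hypothetical array shapes and function name; not the Delft group's algorithm):

```python
import numpy as np

def knn_position(event, library, positions, k=5):
    """Estimate the photon interaction position from a measured light
    distribution by averaging the known positions of the k nearest
    reference light distributions (Euclidean distance in pixel-signal
    space).

    event:     measured per-pixel light distribution, shape (n_pixels,)
    library:   reference distributions, shape (N, n_pixels)
    positions: known (x, y) source positions, shape (N, 2)
    """
    library = np.asarray(library, float)
    positions = np.asarray(positions, float)
    d = np.linalg.norm(library - np.asarray(event, float), axis=1)
    nearest = np.argsort(d)[:k]
    return positions[nearest].mean(axis=0)
```

    A dense reference scan (here, 0.5 mm steps with a collimated source) is what makes such a lookup accurate: the finer the grid, the closer the nearest references sit to the true interaction point.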

  1. Actively Promoting Student Engagement within an Online Environment: Developing and Implementing a Signature Subject on "Contemporary Issues in Sex and Sexuality"

    ERIC Educational Resources Information Center

    Fletcher, Gillian; Dowsett, Gary W.; Austin, Lilian

    2012-01-01

    La Trobe University is committed to improving the first year experience, and to developing its online teaching portfolio in response to increasing student demand. This article will acknowledge that these two objectives will remain contradictory if online learning systems are used predominantly as repositories of information with little thought…

  2. Development and Multi-laboratory Verification of US EPA Method 543 for the Analysis of Drinking Water Contaminants by Online Solid Phase Extraction-LC–MS-MS

    EPA Science Inventory

    A drinking water method for seven pesticides and pesticide degradates is presented that addresses the occurrence monitoring needs of the US Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs online solid pha...

  3. Signature control

    NASA Astrophysics Data System (ADS)

    Pyati, Vittal P.

    The reduction of vehicle radar signature is accomplished by means of vehicle shaping, the use of materials that absorb at microwave frequencies, and either passive or active cancellation techniques; such techniques are also useful in reducing IR emissions associated with the propulsion system. In some anticipated scenarios, the objective is not signature reduction but signature control, for deception, via decoy vehicles that mimic the signature characteristics of actual weapons systems. As the stealthiness of airframes and missiles increases, their propulsion systems' exhaust plumes assume a more important role in detection by an adversary.

  4. PhiSiGns: an online tool to identify signature genes in phages and design PCR primers for examining phage diversity

    PubMed Central

    2012-01-01

    Background Phages (viruses that infect bacteria) have gained significant attention because of their abundance, diversity and important ecological roles. However, the lack of a universal gene shared by all phages presents a challenge for phage identification and characterization, especially in environmental samples where it is difficult to culture phage-host systems. Homologous conserved genes (or "signature genes") present in groups of closely-related phages can be used to explore phage diversity and define evolutionary relationships amongst these phages. Bioinformatic approaches are needed to identify candidate signature genes and design PCR primers to amplify those genes from environmental samples; however, there is currently no existing computational tool that biologists can use for this purpose. Results Here we present PhiSiGns, a web-based and standalone application that performs a pairwise comparison of each gene present in user-selected phage genomes, identifies signature genes, generates alignments of these genes, and designs potential PCR primer pairs. PhiSiGns is available at (http://www.phantome.org/phisigns/; http://phisigns.sourceforge.net/) with a link to the source code. Here we describe the specifications of PhiSiGns and demonstrate its application with a case study. Conclusions PhiSiGns provides phage biologists with a user-friendly tool to identify signature genes and design PCR primers to amplify related genes from uncultured phages in environmental samples. This bioinformatics tool will facilitate the development of novel signature genes for use as molecular markers in studies of phage diversity, phylogeny, and evolution. PMID:22385976

  5. 78 FR 56266 - Consent Based Social Security Number Verification (CBSV) Service

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-12

    ... Security number (SSN) verification services to enrolled private businesses, State and local government... From the Federal Register Online via the Government Publishing Office SOCIAL SECURITY ADMINISTRATION Consent Based Social Security Number Verification (CBSV) Service AGENCY: Social...

  6. 78 FR 23743 - Proposed Information Collection; Comment Request; Delivery Verification Procedure for Imports

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-22

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF COMMERCE Bureau of Industry and Security Proposed Information Collection; Comment Request; Delivery Verification... furnish their foreign supplier with a U.S. Delivery Verification Certificate validating that...

  7. Signature-based store checking buffer

    DOEpatents

    Sridharan, Vilas; Gurumurthi, Sudhanva

    2015-06-02

    A system and method for optimizing redundant output verification are provided. A hardware-based store fingerprint buffer receives multiple instances of output from multiple instances of computation. The store fingerprint buffer generates a signature from the content included in the multiple instances of output. When a barrier is reached, the store fingerprint buffer uses the signature to verify the content is error-free.

  8. Single particle mass spectral signatures from vehicle exhaust particles and the source apportionment of on-line PM2.5 by single particle aerosol mass spectrometry.

    PubMed

    Yang, Jian; Ma, Shexia; Gao, Bo; Li, Xiaoying; Zhang, Yanjun; Cai, Jing; Li, Mei; Yao, Ling'ai; Huang, Bo; Zheng, Mei

    2017-03-24

    In order to accurately apportion the many distinct types of individual particles observed, it is necessary to characterize the fingerprints of individual particles emitted directly from known sources. In this study, single particle mass spectral signatures from vehicle exhaust particles were measured in a tunnel. These data were used to evaluate particle signatures in a real-world PM2.5 apportionment study. The dominant chemical type in the average positive and negative mass spectra of vehicle exhaust particles is EC species. Four distinct particle types describe the majority of particles emitted by vehicle exhaust in this tunnel. Each particle class is labeled according to the most significant chemical features in its average positive and negative mass spectral signatures: ECOC, NaK, Metal and PAHs species. A single particle aerosol mass spectrometer (SPAMS) was also deployed during the winter of 2013 in Guangzhou to determine both the size and chemical composition of individual atmospheric particles, with vacuum aerodynamic diameters (dva) in the range of 0.2-2 μm. A total of 487,570 particles were chemically analyzed with positive and negative ion mass spectra, and a large set of single particle mass spectra was collected and analyzed to identify the speciation. Using the typical tracer ions of different source types and classification by the ART-2a algorithm, which uses source fingerprints to apportion ambient particles, the major sources of single particles were estimated. Coal combustion, vehicle exhaust, and secondary ions were the most abundant particle sources, contributing 28.5%, 17.8%, and 18.2%, respectively. The fraction of vehicle exhaust particles decreased slightly with particle size in the condensation mode.

  9. On-line high-performance liquid chromatography-ultraviolet-nuclear magnetic resonance method of the markers of nerve agents for verification of the Chemical Weapons Convention.

    PubMed

    Mazumder, Avik; Gupta, Hemendra K; Garg, Prabhat; Jain, Rajeev; Dubey, Devendra K

    2009-07-03

    This paper details an on-flow liquid chromatography-ultraviolet-nuclear magnetic resonance (LC-UV-NMR) method for the retrospective detection and identification of alkyl alkylphosphonic acids (AAPAs) and alkylphosphonic acids (APAs), the markers of toxic nerve agents, for verification of the Chemical Weapons Convention (CWC). Initially, the LC-UV-NMR parameters were optimized for benzyl derivatives of the APAs and AAPAs. The optimized parameters include a C(18) stationary phase, a methanol:water 78:22 (v/v) mobile phase, UV detection at 268 nm and (1)H NMR acquisition conditions. The protocol described herein allowed the detection of analytes through acquisition of high-quality NMR spectra from aqueous solutions of the APAs and AAPAs containing high concentrations of interfering background chemicals, which were removed by the preceding sample preparation. The reported standard deviation for quantification relates to the UV detector, which showed relative standard deviations (RSDs) within ±1.1%, while the NMR detector had a lower limit of detection of up to 16 μg (absolute). Finally, the developed LC-UV-NMR method was applied to identify the APAs and AAPAs in real water samples, following solid phase extraction and derivatization. The method is fast (total experiment time approximately 2 h), sensitive, rugged and efficient.

  10. Development and Multi-laboratory Verification of US EPA Method 543 for the Analysis of Drinking Water Contaminants by Online Solid Phase Extraction-LC-MS-MS.

    PubMed

    Shoemaker, Jody A

    2016-06-26

    A drinking water method for seven pesticides and pesticide degradates is presented that addresses the occurrence monitoring needs of the US Environmental Protection Agency (EPA) for a future Unregulated Contaminant Monitoring Regulation (UCMR). The method employs online solid phase extraction-liquid chromatography-tandem mass spectrometry (SPE-LC-MS-MS). Online SPE-LC-MS-MS has the potential to offer cost-effective, faster, more sensitive and more rugged methods than the traditional offline SPE approach due to complete automation of the SPE process, as well as seamless integration with the LC-MS-MS system. The method uses 2-chloroacetamide, ascorbic acid and Trizma to preserve the drinking water samples for up to 28 days. The mean recoveries in drinking water (from a surface water source) fortified with method analytes are 87.1-112% with relative standard deviations of <14%. Single laboratory lowest concentration minimum reporting levels of 0.27-1.7 ng/L are demonstrated with this methodology. Multi-laboratory data are presented that demonstrate method ruggedness and transferability. The final method meets all of the EPA's UCMR survey requirements for sample collection and storage, precision, accuracy, and sensitivity.

  11. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention.

    PubMed

    Proyer, René T; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths, which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using SS with one on using individual low scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on LS rather than SS and those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness.

  12. Strengths-based positive psychology interventions: a randomized placebo-controlled online trial on long-term effects for a signature strengths- vs. a lesser strengths-intervention

    PubMed Central

    Proyer, René T.; Gander, Fabian; Wellenzohn, Sara; Ruch, Willibald

    2015-01-01

    Recent years have seen an increasing interest in research in positive psychology interventions. There is broad evidence for their effectiveness in increasing well-being and ameliorating depression. Intentional activities that focus on those character strengths, which are most typical for a person (i.e., signature strengths, SS) and encourage their usage in a new way have been identified as highly effective. The current study aims at comparing an intervention aimed at using SS with one on using individual low scoring (or lesser) strengths in a randomized placebo-controlled trial. A total of 375 adults were randomly assigned to one of the two intervention conditions [i.e., using five signature vs. five lesser strengths (LS) in a new way] or a placebo control condition (i.e., early memories). We measured happiness and depressive symptoms at five time points (i.e., pre- and post-test, 1-, 3-, and 6-months follow-ups) and character strengths at pre-test. The main findings are that (1) there were increases in happiness for up to 3 months and decreases in depressive symptoms in the short term in both intervention conditions; (2) participants found working with strengths equally rewarding (enjoyment and benefit) in both conditions; (3) those participants that reported generally higher levels of strengths benefitted more from working on LS rather than SS and those with comparatively lower levels of strengths tended to benefit more from working on SS; and (4) deviations from an average profile derived from a large sample of German-speakers completing the Values-in-Action Inventory of Strengths were associated with greater benefit from the interventions in the SS-condition. We conclude that working on character strengths is effective for increasing happiness and discuss how these interventions could be tailored to the individual for promoting their effectiveness. PMID:25954221

  13. MO-F-CAMPUS-J-03: Development of a Human Brain PET for On-Line Proton Beam-Range Verification

    SciTech Connect

    Shao, Yiping

    2015-06-15

    Purpose: To develop a prototype PET for verifying proton beam-range before each fractionated therapy, enabling on-line re-planning of proton therapy. Methods: Latest “edge-less” silicon photomultiplier (SiPM) arrays and customized ASIC readout electronics were used to develop PET detectors with depth-of-interaction (DOI) measurement capability. Each detector consists of one LYSO array with each end coupled to a SiPM array. Multiple detectors can be seamlessly tiled together to form a large detector panel. Detectors with 1.5×1.5 and 2.0×2.0 mm crystals at 20 or 30 mm lengths were studied. Readout of individual SiPMs or signal multiplexing was used to transfer 3D interaction position-coded analog signals through flexible-print-circuit cables or a PCB board to dedicated ASIC front-end electronics, which output digital timing pulses that encode the interaction information. These digital pulses can be transferred through standard LVDS cables to, and decoded by, an FPGA-based data acquisition system for coincidence-event collection and data transfer. The modular detector and scalable electronics/data acquisition will enable flexible PET system configuration for different imaging geometries. Results: Initial detector performance measurements show excellent crystal identification even with 30 mm long crystals, energy and timing resolutions of ∼18% and 2.8 ns, respectively, and around 2–3 mm DOI resolution. A small prototype PET scanner with one detector ring has been built and evaluated, validating the technology and design. A large-size detector panel has been fabricated by scaling up from the modular detectors. Different designs of resistor- and capacitor-based signal-multiplexing boards were tested and selected based on optimal crystal identification and timing performance. Stackable readout electronics boards and FPGA-based data acquisition boards were developed and tested. A brain PET is under construction. Conclusion: Technology of large-size DOI detector based on SiPM array and advanced readout has been

  14. Verification Games: Crowd-Sourced Formal Verification

    DTIC Science & Technology

    2016-03-01

    Formal Verification ... the verification tools developed by the Programming Languages and Software Engineering group were improved. A series of games ... were developed by the Center for Game Science: Pipe Jam, Traffic Jam, Flow Jam and Paradox. Verification tools and games were integrated to verify ...

  15. More Efficient Threshold Signature Scheme in Gap Diffie-Hellman Group

    NASA Astrophysics Data System (ADS)

    Nyang, Daehun; Yamamura, Akihiro

    By modifying the private key and the public key setting in Boneh-Lynn-Shacham's short signature scheme, a variation of the BLS short signature scheme is proposed. Based on this variation, we present a very efficient threshold signature scheme in which the number of pairing computations for signature share verification is reduced by half.

  16. Swarm Verification

    NASA Technical Reports Server (NTRS)

    Holzmann, Gerard J.; Joshi, Rajeev; Groce, Alex

    2008-01-01

    Reportedly, supercomputer designer Seymour Cray once said that he would sooner use two strong oxen to plow a field than a thousand chickens. Although this is undoubtedly wise when it comes to plowing a field, it is not so clear for other types of tasks. Model checking problems are of the proverbial "search for a needle in a haystack" type. Such problems can often be parallelized easily. Alas, none of the usual divide-and-conquer methods can be used to parallelize the working of a model checker. Given that it has become easier than ever to gain access to large numbers of computers to perform even routine tasks, it is becoming more and more attractive to find alternate ways to use these resources to speed up model checking tasks. This paper describes one such method, called swarm verification.
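
    The swarm idea can be sketched as follows (an illustrative toy, not the Spin-based implementation the paper describes): many workers explore the same state graph, each with a differently seeded search order, so the swarm covers diverse parts of the space, and the first counterexample found by any worker ends the search. The helper names and the example graph are invented for illustration.

```python
import random

def dfs_worker(graph, start, is_bug, seed, max_steps):
    """Bounded DFS with a seed-specific neighbor ordering."""
    rng = random.Random(seed)
    stack, seen, steps = [start], {start}, 0
    while stack and steps < max_steps:
        state = stack.pop()
        steps += 1
        if is_bug(state):
            return state  # counterexample found
        succs = list(graph.get(state, []))
        rng.shuffle(succs)  # each worker searches in a different order
        for s in succs:
            if s not in seen:
                seen.add(s)
                stack.append(s)
    return None

def swarm_search(graph, start, is_bug, n_workers=8, max_steps=50):
    # Sequential stand-in for parallel workers; in practice these run
    # concurrently and the first hit cancels the rest.
    for seed in range(n_workers):
        hit = dfs_worker(graph, start, is_bug, seed, max_steps)
        if hit is not None:
            return hit
    return None

# Tiny state graph with a "bug" state (5) reachable from the start (0).
graph = {0: [1, 2], 1: [3], 2: [4], 3: [5], 4: [5], 5: []}
print(swarm_search(graph, 0, lambda s: s == 5))  # → 5
```

    The point of the diversified search orders is that even when each worker's budget (`max_steps`) is far too small to cover the whole space, different workers tend to reach different deep corners of it.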

  17. Wavelet Features Based Fingerprint Verification

    NASA Astrophysics Data System (ADS)

    Bagadi, Shweta U.; Thalange, Asha V.; Jain, Giridhar P.

    2010-11-01

    In this work, we present an automatic fingerprint identification system based on Level 3 features. Systems based only on minutiae features do not perform well for poor-quality images. In practice, we often encounter extremely dry or wet fingerprint images with cuts, warts, etc. Due to such fingerprints, minutiae-based systems show poor performance for real-time authentication applications. To alleviate the problem of poor-quality fingerprints, and to improve overall performance of the system, this paper proposes fingerprint verification based on wavelet statistical features and co-occurrence matrix features. The features include mean, standard deviation, energy, entropy, contrast, local homogeneity, cluster shade, cluster prominence, and information measure of correlation. In this method, matching can be done between the input image and the stored template without exhaustive search using the extracted features. The wavelet-transform-based approach is better than the existing minutiae-based method, and it takes less response time and hence is suitable for on-line verification with high accuracy.
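
    As a rough sketch of the feature-extraction step (the paper's exact wavelet family, decomposition depth and co-occurrence settings are not given here, so a one-level Haar transform and four of the listed statistics stand in for them):

```python
import numpy as np

def haar_dwt2(img):
    """One-level 2-D Haar transform: approximation plus 3 detail subbands."""
    a = (img[0::2, :] + img[1::2, :]) / 2.0   # row averages
    d = (img[0::2, :] - img[1::2, :]) / 2.0   # row differences
    ll = (a[:, 0::2] + a[:, 1::2]) / 2.0
    lh = (a[:, 0::2] - a[:, 1::2]) / 2.0
    hl = (d[:, 0::2] + d[:, 1::2]) / 2.0
    hh = (d[:, 0::2] - d[:, 1::2]) / 2.0
    return ll, lh, hl, hh

def band_features(band):
    """Mean, standard deviation, energy and Shannon entropy of one subband."""
    e = band ** 2
    p = e / (e.sum() + 1e-12)
    entropy = -(p * np.log2(p + 1e-12)).sum()
    return [band.mean(), band.std(), e.sum(), entropy]

# Random array standing in for a fingerprint image.
img = np.random.default_rng(0).random((64, 64))
features = np.concatenate([band_features(b) for b in haar_dwt2(img)])
print(features.shape)  # 4 subbands x 4 statistics → (16,)
```

    Matching then reduces to comparing such fixed-length feature vectors (e.g. by Euclidean distance) instead of exhaustively aligning minutiae, which is what makes the approach fast enough for on-line use.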

  18. Modeling the lexical morphology of Western handwritten signatures.

    PubMed

    Diaz-Cabrera, Moises; Ferrer, Miguel A; Morales, Aythami

    2015-01-01

    A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a General Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwriting signatures.
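
    A minimal sketch of the modeling step, assuming SciPy is available and using synthetic data in place of the real signature measurements (`genextreme` is SciPy's GEV family; the measured characteristic, e.g. a signature width, is invented here purely for illustration):

```python
import numpy as np
from scipy.stats import genextreme

# Draw a synthetic morphology measurement from a known GEV, then recover the
# parameters by maximum-likelihood fitting, as one would for the real data.
rng = np.random.default_rng(1)
widths = genextreme.rvs(c=-0.1, loc=40.0, scale=8.0, size=1533,
                        random_state=rng)

shape, loc, scale = genextreme.fit(widths)
print(round(loc, 1), round(scale, 1))
```

    With parameters in hand, the fitted distribution can answer practical questions, e.g. how improbable a given signature width is for this population of signers.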

  19. Modeling the Lexical Morphology of Western Handwritten Signatures

    PubMed Central

    Diaz-Cabrera, Moises; Ferrer, Miguel A.; Morales, Aythami

    2015-01-01

    A handwritten signature is the final response to a complex cognitive and neuromuscular process which is the result of the learning process. Because of the many factors involved in signing, it is possible to study the signature from many points of view: graphologists, forensic experts, neurologists and computer vision experts have all examined them. Researchers study written signatures for psychiatric, penal, health and automatic verification purposes. As a potentially useful, multi-purpose study, this paper is focused on the lexical morphology of handwritten signatures. This we understand to mean the identification, analysis, and description of the signature structures of a given signer. In this work we analyze different public datasets involving 1533 signers from different Western geographical areas. Some relevant characteristics of signature lexical morphology have been selected, examined in terms of their probability distribution functions and modeled through a General Extreme Value distribution. This study suggests some useful models for multi-disciplinary sciences which depend on handwriting signatures. PMID:25860942

  20. Towards a Better Understanding of the Oxygen Isotope Signature of Atmospheric CO2: Determining the 18O-Exchange Between CO2 and H2O in Leaves and Soil On-line with Laser-Based Spectroscopy

    NASA Astrophysics Data System (ADS)

    Gangi, L.; Rothfuss, Y.; Vereecken, H.; Brueggemann, N.

    2013-12-01

    The oxygen isotope signature of carbon dioxide (δ18O-CO2) is a powerful tool to disentangle CO2 fluxes in terrestrial ecosystems, as CO2 attains a contrasting 18O signature by the interaction with isotopically different soil and leaf water pools during soil respiration and photosynthesis, respectively. However, using the δ18O-CO2 signal to quantify plant-soil-atmosphere CO2 fluxes is still challenging due to a lack of knowledge concerning the magnitude and effect of individual fractionation processes during CO2 and H2O diffusion and during CO2-H2O isotopic exchange in soils and leaves, especially related to short-term changes in environmental conditions (non-steady state). This study addresses this research gap by combined on-line monitoring of the oxygen isotopic signature of CO2 and water vapor during gas exchange in soil and plant leaves with laser-based spectroscopy, using soil columns and plant chambers. In both experimental setups, the measured δ18O of water vapor was used to infer the δ18O of liquid water, and, together with the δ18O-CO2, the degree of oxygen isotopic equilibrium between the two species (θ). Gas exchange experiments with different functional plant types (C3 coniferous, C3 monocotyledonous, C3 dicotyledonous, C4) revealed that θ and the influence of the plant on the ambient δ18O-CO2 (CO18O-isoforcing) not only varied on a diurnal timescale but also when plants were exposed to limited water availability, elevated air temperature, and abrupt changes in light intensity (sunflecks). Maximum θ before treatments ranged between 0.7 and 0.8 for the C3 dicotyledonous (poplar) and C3 monocotyledonous (wheat) plants, and between 0.5 and 0.6 for the conifer (spruce) and C4 plant (maize) while maximum CO18O-isoforcing was highest in wheat (0.03 m s-1 ‰), similar in poplar and maize (0.02 m s-1 ‰), and lowest in spruce (0.01 m s-1 ‰). Multiple regression analysis showed that up to 97 % of temporal dynamics in CO18O-isoforcing could be

  1. Verification of VLSI designs

    NASA Technical Reports Server (NTRS)

    Windley, P. J.

    1991-01-01

    In this paper we explore the specification and verification of VLSI designs. The paper focuses on abstract specification and verification of functionality using mathematical logic, as opposed to low-level boolean equivalence verification such as that done using BDDs and model checking. Specification and verification, sometimes called formal methods, is one tool for increasing computer dependability in the face of an exponentially increasing testing effort.

  2. Columbus pressurized module verification

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Comandatore, Emanuele

    1986-01-01

    The baseline verification approach of the COLUMBUS Pressurized Module was defined during the A and B1 project phases. Peculiarities of the verification program are the testing requirements derived from the permanent manned presence in space. The model philosophy and the test program have been developed in line with the overall verification concept. Such critical areas as meteoroid protections, heat pipe radiators and module seals are identified and tested. Verification problem areas are identified and recommendations for the next development are proposed.

  3. Verification Tools Secure Online Shopping, Banking

    NASA Technical Reports Server (NTRS)

    2010-01-01

    Just like rover or rocket technology sent into space, the software that controls these technologies must be extensively tested to ensure reliability and effectiveness. Ames Research Center invented the open-source Java Pathfinder (JPF) toolset for the deep testing of Java-based programs. Fujitsu Labs of America Inc., based in Sunnyvale, California, improved the capabilities of the JPF Symbolic Pathfinder tool, establishing the tool as a means of thoroughly testing the functionality and security of Web-based Java applications such as those used for Internet shopping and banking.

  4. A hybrid digital-signature and zero-watermarking approach for authentication and protection of sensitive electronic documents.

    PubMed

    Tayan, Omar; Kabir, Muhammad N; Alginahi, Yasser M

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints.
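
    The core idea, deriving an integrity tag from the cover text without modifying it, can be sketched with a keyed hash (a stand-in for the paper's richer zero-watermark generation; the key, the normalization step and the tag format are all assumptions for this sketch):

```python
import hashlib
import hmac
import unicodedata

def make_tag(text, key):
    """Derive an authentication tag from the text itself; the text is unchanged."""
    canon = unicodedata.normalize("NFC", text).encode("utf-8")
    return hmac.new(key, canon, hashlib.sha256).hexdigest()

def verify(text, key, tag):
    """Recompute the tag and compare in constant time."""
    return hmac.compare_digest(make_tag(text, key), tag)

key = b"shared-secret"
doc = "Sensitive plain-text content."
tag = make_tag(doc, key)
print(verify(doc, key, tag))                 # True
print(verify(doc + " tampered", key, tag))   # False
```

    Because the tag is stored or transmitted alongside the document rather than embedded in it, the cover text stays byte-for-byte intact, which is exactly the constraint for sensitive plain-text media. Locating (not just detecting) tampering would require tagging smaller units, e.g. one tag per paragraph.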

  5. A Hybrid Digital-Signature and Zero-Watermarking Approach for Authentication and Protection of Sensitive Electronic Documents

    PubMed Central

    Kabir, Muhammad N.; Alginahi, Yasser M.

    2014-01-01

    This paper addresses the problems and threats associated with verification of integrity, proof of authenticity, tamper detection, and copyright protection for digital-text content. Such issues were largely addressed in the literature for images, audio, and video, with only a few papers addressing the challenge of sensitive plain-text media under known constraints. Specifically, with text as the predominant online communication medium, it becomes crucial that techniques are deployed to protect such information. A number of digital-signature, hashing, and watermarking schemes have been proposed that essentially bind source data or embed invisible data in a cover medium to achieve their goal. While many such complex schemes with resource redundancies are sufficient in offline and less-sensitive texts, this paper proposes a hybrid approach based on zero-watermarking and digital-signature-like manipulations for sensitive text documents in order to achieve content originality and integrity verification without physically modifying the cover text in any way. The proposed algorithm was implemented and shown to be robust against undetected content modifications and is capable of confirming proof of originality whilst detecting and locating deliberate/nondeliberate tampering. Additionally, enhancements in resource utilisation and reduced redundancies were achieved in comparison to traditional encryption-based approaches. Finally, analysis and remarks are made about the current state of the art, and future research issues are discussed under the given constraints. PMID:25254247

  6. Determining activities of radionuclides from coincidence signatures

    NASA Astrophysics Data System (ADS)

    Warren, Glen A.; Smith, L. Eric; Aalseth, Craig E.; Ellis, Edward; Hossbach, Todd W.; Valsan, Andrei B.

    2006-05-01

    The spectral analysis of simultaneously observed photons in separate detectors may provide an invaluable tool for radioisotope identification applications. A general recursive method to determine the activity of an isotope from the observed coincidence signature rate is discussed. The method coherently accounts for effects of true coincidence summing within a single detector and detection efficiencies. A verification of the approach with computer simulations is also discussed.
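
    In its simplest non-recursive form (ignoring the true-coincidence-summing corrections the paper's recursive method accounts for), the activity follows directly from the observed coincidence rate, the cascade branching ratio and the two detection efficiencies:

```python
# For a cascade of two photons detected in separate detectors,
#   R_coinc = A * BR * eff1 * eff2
# so the source activity A follows from the observed coincidence rate.
# (Illustrative only; the paper's method additionally corrects for summing
# within a single detector.)

def activity_from_coincidence(r_coinc, branching_ratio, eff1, eff2):
    """Activity in Bq from a coincidence rate in counts per second."""
    return r_coinc / (branching_ratio * eff1 * eff2)

# Example: a 60Co-like two-photon cascade, 1.2 cps coincidence rate,
# photopeak efficiencies of 3% and 4% (made-up numbers).
print(activity_from_coincidence(1.2, 1.0, 0.03, 0.04))  # ≈ 1000 Bq
```

    The recursive refinement in the paper is needed because summing moves counts between the singles and coincidence signatures, so the efficiencies themselves depend on the decay scheme rather than being simple constants as assumed here.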

  7. Short Signature Scheme From Bilinear Pairings

    DTIC Science & Technology

    2010-11-01

    model. 3.3 Efficiency: We compare our signature scheme with the BLS scheme and ZSS scheme from the implementation point of view, counting the operations PO, SM, PA, Squ, Inv, MTP and H (given as BLS / ZSS / ours): ... 1 SM / 1 SM / 2 SM; Signing: 1 MTP, 1 SM / 1 H, 1 Inv, 1 SM / 1 H, 1 Squ, 1 Inv, 1 SM; Verification: 1 MTP, 2 PO / 1 H, 1 SM, 1 PO / 1 H, 1 Squ, 1 SM, 2 PA, 1 PO

  8. Quantum blind dual-signature scheme without arbitrator

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Huang, Dazu; Shi, Jinjing; Guo, Ying

    2016-03-01

    Motivated by the elegant features of a blind signature, we suggest the design of a quantum blind dual-signature scheme with three phases, i.e., initial phase, signing phase and verification phase. Different from conventional schemes, legal messages are signed not only by the blind signatory but also by the sender in the signing phase. It does not rely much on an arbitrator in the verification phase as previous quantum signature schemes usually do. The security is guaranteed by entanglement in quantum information processing. Security analysis demonstrates that the signature can be neither forged nor disavowed by illegal participants or attackers. It provides a potential application for e-commerce or e-payment systems with current technology.

  9. Software verification and testing

    NASA Technical Reports Server (NTRS)

    1985-01-01

    General procedures for software verification and validation are provided as a guide for managers, programmers, and analysts involved in software development. The verification and validation procedures described are based primarily on testing techniques. Testing refers to the execution of all or part of a software system for the purpose of detecting errors. Planning, execution, and analysis of tests are outlined in this document. Code reading and static analysis techniques for software verification are also described.

  10. Electronic Signatures for Public Procurement across Europe

    NASA Astrophysics Data System (ADS)

    Ølnes, Jon; Andresen, Anette; Arbia, Stefano; Ernst, Markus; Hagen, Martin; Klein, Stephan; Manca, Giovanni; Rossi, Adriano; Schipplick, Frank; Tatti, Daniele; Wessolowski, Gesa; Windheuser, Jan

    The PEPPOL (Pan-European Public Procurement On-Line) project is a large scale pilot under the CIP programme of the EU, exploring electronic public procurement in a unified European market. An important element is interoperability of electronic signatures across borders, identified today as a major obstacle to cross-border procurement. PEPPOL will address use of signatures in procurement processes, in particular tendering but also post-award processes like orders and invoices. Signature policies, i.e. quality requirements and requirements on information captured in the signing process, will be developed. This as well as technical interoperability of e-signatures across Europe will finally be piloted in demonstrators starting late 2009 or early 2010.

  11. 76 FR 39070 - Proposed Information Collection; Comment Request; Import, End-User, and Delivery Verification...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-07-05

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF COMMERCE Bureau of Industry and Security Proposed Information Collection; Comment Request; Import, End- User, and Delivery Verification Certificates AGENCY: Bureau of Industry and Security, Commerce. ACTION:...

  12. 78 FR 21144 - Introduction of the Revised Employment Eligibility Verification Form; Correction

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-04-09

    ... From the Federal Register Online via the Government Publishing Office DEPARTMENT OF HOMELAND SECURITY U.S. Citizenship and Immigration Services Introduction of the Revised Employment Eligibility Verification Form; Correction AGENCY: U.S. Citizenship and Immigration Services, DHS. ACTION:...

  13. Signatures support program

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.

    2009-05-01

    The Signatures Support Program (SSP) leverages the full spectrum of signature-related activities (collections, processing, development, storage, maintenance, and dissemination) within the Department of Defense (DOD), the intelligence community (IC), other Federal agencies, and civil institutions. The Enterprise encompasses acoustic, seismic, radio frequency, infrared, radar, nuclear radiation, and electro-optical signatures. The SSP serves the war fighter, the IC, and civil institutions by supporting military operations, intelligence operations, homeland defense, disaster relief, acquisitions, and research and development. Data centers host and maintain signature holdings, collectively forming the national signatures pool. The geographically distributed organizations are the authoritative sources and repositories for signature data; the centers are responsible for data content and quality. The SSP proactively engages DOD, IC, other Federal entities, academia, and industry to locate signatures for inclusion in the distributed national signatures pool and provides world-wide 24/7 access via the SSP application.

  14. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  15. 17 CFR 1.4 - Electronic signatures, acknowledgments and verifications.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... commission merchant or introducing broker, a retail forex customer of a retail foreign exchange dealer or..., retail forex customer, participant, client, counterparty, swap dealer, or major swap participant will...

  16. Voltage verification unit

    DOEpatents

    Martin, Edward J.

    2008-01-15

    A voltage verification unit and method for determining the absence of potentially dangerous potentials within a power supply enclosure without Mode 2 work is disclosed. With this device and method, a qualified worker follows a relatively simple protocol that involves a function test (hot, cold, hot) of the voltage verification unit before Lock Out/Tag Out and, once the Lock Out/Tag Out is completed, testing or "trying" by simply reading a display on the voltage verification unit, all without exposure of the operator to the interior of the voltage supply enclosure. According to a preferred embodiment, the voltage verification unit includes test leads to allow diagnostics with other meters, without the necessity of accessing potentially dangerous bus bars or the like.

  17. Uncertainty in hydrological signatures

    NASA Astrophysics Data System (ADS)

    Westerberg, I. K.; McMillan, H. K.

    2015-09-01

    Information about rainfall-runoff processes is essential for hydrological analyses, modelling and water-management applications. A hydrological, or diagnostic, signature quantifies such information from observed data as an index value. Signatures are widely used, e.g. for catchment classification, model calibration and change detection. Uncertainties in the observed data - including measurement inaccuracy and representativeness as well as errors relating to data management - propagate to the signature values and reduce their information content. Subjective choices in the calculation method are a further source of uncertainty. We review the uncertainties relevant to different signatures based on rainfall and flow data. We propose a generally applicable method to calculate these uncertainties based on Monte Carlo sampling and demonstrate it in two catchments for common signatures including rainfall-runoff thresholds, recession analysis and basic descriptive signatures of flow distribution and dynamics. Our intention is to contribute to awareness and knowledge of signature uncertainty, including typical sources, magnitude and methods for its assessment. We found that the uncertainties were often large (i.e. typical intervals of ±10-40 % relative uncertainty) and highly variable between signatures. There was greater uncertainty in signatures that use high-frequency responses, small data subsets, or subsets prone to measurement errors. There was lower uncertainty in signatures that use spatial or temporal averages. Some signatures were sensitive to particular uncertainty types such as rating-curve form. We found that signatures can be designed to be robust to some uncertainty sources. Signature uncertainties of the magnitudes we found have the potential to change the conclusions of hydrological and ecohydrological analyses, such as cross-catchment comparisons or inferences about dominant processes.
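
    The Monte Carlo idea can be sketched for a single signature (synthetic daily flow data and a simple multiplicative measurement-error model stand in for the real observational uncertainties analyzed in the paper):

```python
import numpy as np

rng = np.random.default_rng(42)
flow = rng.lognormal(mean=0.0, sigma=1.0, size=365)  # synthetic daily flow

def signature_q95(q):
    """High-flow signature: 95th percentile of daily flow."""
    return np.percentile(q, 95)

# Perturb the flow record with ~10% multiplicative measurement error and
# recompute the signature many times to propagate the data uncertainty.
samples = [
    signature_q95(flow * rng.normal(1.0, 0.1, size=flow.size))
    for _ in range(2000)
]
lo, hi = np.percentile(samples, [2.5, 97.5])
print(f"Q95 = {signature_q95(flow):.2f}, 95% interval [{lo:.2f}, {hi:.2f}]")
```

    Reporting the signature as an interval rather than a single index value is what allows downstream analyses, such as cross-catchment comparisons, to distinguish genuine differences from measurement noise.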

  18. Hybrid Enrichment Verification Array: Module Characterization Studies

    SciTech Connect

    Zalavadia, Mital A.; Smith, Leon E.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Mace, Emily K.; Deshmukh, Nikhil S.

    2016-03-01

    The work presented in this report is focused on the characterization and refinement of the Hybrid Enrichment Verification Array (HEVA) approach, which combines the traditional 186-keV 235U signature with high-energy prompt gamma rays from neutron capture in the detector and surrounding collimator material, to determine the relative enrichment and 235U mass of the cylinder. The design of the HEVA modules (hardware and software) deployed in the current field trial builds on over seven years of study and evolution by PNNL, and consists of a ø3''×3'' NaI(Tl) scintillator coupled to an Osprey digital multi-channel analyzer tube base from Canberra. The core of the HEVA methodology, the high-energy prompt gamma-ray signature, serves as an indirect method for the measurement of total neutron emission from the cylinder. A method for measuring the intrinsic efficiency of this “non-traditional” neutron signature and the results from a benchmark experiment are presented. Also discussed are potential perturbing effects on the non-traditional signature, including short-lived activation of materials in the HEVA module. Modeling and empirical results are presented to demonstrate that such effects are expected to be negligible for the envisioned implementation scenario. In comparison to previous versions, the new design boosts the high-energy prompt gamma-ray signature, provides more flexible and effective collimation, and improves count-rate management via commercially available pulse-processing electronics with a special modification prompted by PNNL.

  19. Toward Automatic Verification of Goal-Oriented Flow Simulations

    NASA Technical Reports Server (NTRS)

    Nemec, Marian; Aftosmis, Michael J.

    2014-01-01

    We demonstrate the power of adaptive mesh refinement with adjoint-based error estimates in verification of simulations governed by the steady Euler equations. The flow equations are discretized using a finite volume scheme on a Cartesian mesh with cut cells at the wall boundaries. The discretization error in selected simulation outputs is estimated using the method of adjoint-weighted residuals. Practical aspects of the implementation are emphasized, particularly in the formulation of the refinement criterion and the mesh adaptation strategy. Following a thorough code verification example, we demonstrate simulation verification of two- and three-dimensional problems. These involve an airfoil performance database, a pressure signature of a body in supersonic flow and a launch abort with strong jet interactions. The results show reliable estimates and automatic control of discretization error in all simulations at an affordable computational cost. Moreover, the approach remains effective even when theoretical assumptions, e.g., steady-state and solution smoothness, are relaxed.
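
    For a linear model problem the adjoint-weighted residual estimate is exact, which makes the mechanism easy to see (a numpy sketch of the principle only, not the Cartesian cut-cell Euler solver of the paper): for A u = f and an output J(u) = g·u, the output error of an approximate solution u_h equals psi·(f − A u_h), where the adjoint satisfies Aᵀ psi = g.

```python
import numpy as np

rng = np.random.default_rng(3)
n = 20
A = np.eye(n) * 4 + rng.random((n, n)) * 0.1  # well-conditioned system matrix
f = rng.random(n)                              # right-hand side
g = rng.random(n)                              # defines the output J(u) = g.u

u = np.linalg.solve(A, f)            # exact discrete solution
u_h = u + rng.normal(0, 1e-3, n)     # perturbed, "coarse" solution
psi = np.linalg.solve(A.T, g)        # discrete adjoint solution

estimate = psi @ (f - A @ u_h)       # adjoint-weighted residual
true_err = g @ u - g @ u_h           # actual output error
print(estimate, true_err)
```

    For the nonlinear Euler equations the identity holds only to leading order, which is why the paper pairs the estimate with adaptive refinement that drives the linearization error down along with the discretization error.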

  20. Electronic health records: what does your signature signify?

    PubMed Central

    2012-01-01

    Electronic health records serve multiple purposes, including clinical communication, legal documentation, financial transaction capture, research and analytics. Electronic signatures attached to entries in EHRs have different logical and legal meanings for different users. Some of these are vestiges from historic paper formats that require reconsideration. Traditionally accepted functions of signatures, such as identity verification, attestation, consent, authorization and non-repudiation can become ambiguous in the context of computer-assisted workflow processes that incorporate functions like logins, auto-fill and audit trails. This article exposes the incompatibility of expectations among typical users of electronically signed information. PMID:22888846

  1. Explaining Verification Conditions

    NASA Technical Reports Server (NTRS)

    Deney, Ewen; Fischer, Bernd

    2006-01-01

    The Hoare approach to program verification relies on the construction and discharge of verification conditions (VCs) but offers no support to trace, analyze, and understand the VCs themselves. We describe a systematic extension of the Hoare rules by labels so that the calculus itself can be used to build up explanations of the VCs. The labels are maintained through the different processing steps and rendered as natural language explanations. The explanations can easily be customized and can capture different aspects of the VCs; here, we focus on their structure and purpose. The approach is fully declarative and the generated explanations are based only on an analysis of the labels rather than directly on the logical meaning of the underlying VCs or their proofs. Keywords: program verification, Hoare calculus, traceability.

  2. Nuclear disarmament verification

    SciTech Connect

    DeVolpi, A.

    1993-12-31

    Arms control treaties, unilateral actions, and cooperative activities -- reflecting the defusing of East-West tensions -- are causing nuclear weapons to be disarmed and dismantled worldwide. In order to provide for future reductions and to build confidence in the permanency of this disarmament, verification procedures and technologies would play an important role. This paper outlines arms-control objectives, treaty organization, and actions that could be undertaken. For the purposes of this Workshop on Verification, nuclear disarmament has been divided into five topical subareas: Converting nuclear-weapons production complexes, Eliminating and monitoring nuclear-weapons delivery systems, Disabling and destroying nuclear warheads, Demilitarizing or non-military utilization of special nuclear materials, and Inhibiting nuclear arms in non-nuclear-weapons states. This paper concludes with an overview of potential methods for verification.

  3. Voice verification upgrade

    NASA Astrophysics Data System (ADS)

    Davis, R. L.; Sinnamon, J. T.; Cox, D. L.

    1982-06-01

This contract had two major objectives. The first was to build, test, and deliver to the government an entry control system using speaker verification (voice authentication) as the mechanism for verifying the user's claimed identity. This system included a physical mantrap with an integral weight scale to prevent more than one user from gaining access with a single verification (tailgating). The speaker verification part of the entry control system contained all the updates and embellishments to the algorithm that was developed earlier for the BISS (Base and Installation Security System) under contract with the Electronic Systems Division of the USAF. These updates were tested prior to and during the contract on an operational system used at Texas Instruments in Dallas, Texas, for controlling entry to the Corporate Information Center (CIC).

  4. Verification and validation benchmarks.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy

    2007-02-01

    Verification and validation (V&V) are the primary means to assess the accuracy and reliability of computational simulations. V&V methods and procedures have fundamentally improved the credibility of simulations in several high-consequence fields, such as nuclear reactor safety, underground nuclear waste storage, and nuclear weapon safety. Although the terminology is not uniform across engineering disciplines, code verification deals with assessing the reliability of the software coding, and solution verification deals with assessing the numerical accuracy of the solution to a computational model. Validation addresses the physics modeling accuracy of a computational simulation by comparing the computational results with experimental data. Code verification benchmarks and validation benchmarks have been constructed for a number of years in every field of computational simulation. However, no comprehensive guidelines have been proposed for the construction and use of V&V benchmarks. For example, the field of nuclear reactor safety has not focused on code verification benchmarks, but it has placed great emphasis on developing validation benchmarks. Many of these validation benchmarks are closely related to the operations of actual reactors at near-safety-critical conditions, as opposed to being more fundamental-physics benchmarks. This paper presents recommendations for the effective design and use of code verification benchmarks based on manufactured solutions, classical analytical solutions, and highly accurate numerical solutions. In addition, this paper presents recommendations for the design and use of validation benchmarks, highlighting the careful design of building-block experiments, the estimation of experimental measurement uncertainty for both inputs and outputs to the code, validation metrics, and the role of model calibration in validation. It is argued that the understanding of predictive capability of a computational model is built on the level of
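A code-verification benchmark of the kind recommended above can be sketched with a classical analytical solution: solve a discretized problem with a known exact answer on two grids and confirm that the observed order of accuracy matches the scheme's formal order. The solver and grid sizes below are illustrative assumptions, not from the paper.

```python
import numpy as np

def observed_error(n):
    # Solve -u'' = pi^2 sin(pi x) on (0, 1) with u(0) = u(1) = 0 using the
    # second-order 3-point stencil; exact solution u = sin(pi x).
    h = 1.0 / (n + 1)
    x = np.linspace(h, 1.0 - h, n)
    A = (2.0 * np.eye(n) - np.eye(n, k=1) - np.eye(n, k=-1)) / h**2
    u = np.linalg.solve(A, np.pi**2 * np.sin(np.pi * x))
    err = np.sqrt(h * np.sum((u - np.sin(np.pi * x))**2))  # discrete L2 error
    return err, h

(e1, h1), (e2, h2) = observed_error(32), observed_error(64)
p = np.log(e1 / e2) / np.log(h1 / h2)   # observed order of accuracy, ~2
```

If the observed order disagrees with the formal order, the coding (or the benchmark itself) is suspect; this is the basic pass/fail logic of code-verification benchmarks.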

  5. Requirement Assurance: A Verification Process

    NASA Technical Reports Server (NTRS)

    Alexander, Michael G.

    2011-01-01

    Requirement Assurance is an act of requirement verification which assures the stakeholder or customer that a product requirement has produced its "as realized product" and has been verified with conclusive evidence. Product requirement verification answers the question, "did the product meet the stated specification, performance, or design documentation?". In order to ensure the system was built correctly, the practicing system engineer must verify each product requirement using verification methods of inspection, analysis, demonstration, or test. The products of these methods are the "verification artifacts" or "closure artifacts" which are the objective evidence needed to prove the product requirements meet the verification success criteria. Institutional direction is given to the System Engineer in NPR 7123.1A NASA Systems Engineering Processes and Requirements with regards to the requirement verification process. In response, the verification methodology offered in this report meets both the institutional process and requirement verification best practices.

  6. General Environmental Verification Specification

    NASA Technical Reports Server (NTRS)

    Milne, J. Scott, Jr.; Kaufman, Daniel S.

    2003-01-01

    The NASA Goddard Space Flight Center s General Environmental Verification Specification (GEVS) for STS and ELV Payloads, Subsystems, and Components is currently being revised based on lessons learned from GSFC engineering and flight assurance. The GEVS has been used by Goddard flight projects for the past 17 years as a baseline from which to tailor their environmental test programs. A summary of the requirements and updates are presented along with the rationale behind the changes. The major test areas covered by the GEVS include mechanical, thermal, and EMC, as well as more general requirements for planning, tracking of the verification programs.

  7. Voice Verification Upgrade.

    DTIC Science & Technology

    1982-06-01

...to develop speaker verification techniques for use over degraded communication channels -- specifically telephone lines. A test of BISS-type speaker verification technology was performed on a degraded channel and compensation techniques were then developed. The fifth program (Total Voice SV ...). Rome Air Development Center.

  8. Digital Signature Management.

    ERIC Educational Resources Information Center

    Hassler, Vesna; Biely, Helmut

    1999-01-01

    Describes the Digital Signature Project that was developed in Austria to establish an infrastructure for applying smart card-based digital signatures in banking and electronic-commerce applications. Discusses the need to conform to international standards, an international certification infrastructure, and security features for a public directory…

  9. Verification in referral-based crowdsourcing.

    PubMed

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through "referral-based crowdsourcing": the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge.
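The Red Balloon winning strategy mentioned above used recursive incentives: the finder receives a fixed reward and each referrer up the chain receives half of what the person they recruited received, so the total payout is bounded by twice the finder's reward no matter how long the chain. A minimal sketch (the amounts and function name are illustrative):

```python
def referral_payouts(finder_reward, chain_length):
    # Recursive-incentive payout: the finder receives finder_reward, and each
    # ancestor in the referral chain receives half of what its recruit received.
    payouts = []
    r = float(finder_reward)
    for _ in range(chain_length):
        payouts.append(r)
        r /= 2.0
    return payouts

payouts = referral_payouts(2000.0, 4)   # finder plus three levels of referrers
```

The geometric halving is what keeps the scheme budget-feasible while still rewarding every link in the referral chain.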

  10. Verification in Referral-Based Crowdsourcing

    PubMed Central

    Naroditskiy, Victor; Rahwan, Iyad; Cebrian, Manuel; Jennings, Nicholas R.

    2012-01-01

    Online social networks offer unprecedented potential for rallying a large number of people to accomplish a given task. Here we focus on information gathering tasks where rare information is sought through “referral-based crowdsourcing”: the information request is propagated recursively through invitations among members of a social network. Whereas previous work analyzed incentives for the referral process in a setting with only correct reports, misreporting is known to be both pervasive in crowdsourcing applications, and difficult/costly to filter out. A motivating example for our work is the DARPA Red Balloon Challenge where the level of misreporting was very high. In order to undertake a formal study of verification, we introduce a model where agents can exert costly effort to perform verification and false reports can be penalized. This is the first model of verification and it provides many directions for future research, which we point out. Our main theoretical result is the compensation scheme that minimizes the cost of retrieving the correct answer. Notably, this optimal compensation scheme coincides with the winning strategy of the Red Balloon Challenge. PMID:23071530

  11. Computer Graphics Verification

    NASA Technical Reports Server (NTRS)

    1992-01-01

Video processing creates technical animation sequences using studio-quality equipment to realistically represent fluid flow over space shuttle surfaces, helicopter rotors, and turbine blades. Computer Systems Co-op Tim Weatherford is shown performing computer graphics verification. Part of a Co-op brochure.

  12. ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

This presentation will be given at the EPA Science Forum 2005 in Washington, DC. The Environmental Technology Verification Program (ETV) was initiated in 1995 to speed implementation of new and innovative commercial-ready environmental technologies by providing objective, 3rd pa...

  13. FPGA Verification Accelerator (FVAX)

    NASA Technical Reports Server (NTRS)

    Oh, Jane; Burke, Gary

    2008-01-01

Is verification acceleration possible? Increasing the visibility of the internal nodes of the FPGA results in much faster debug time, and forcing internal signals directly allows a problem condition to be set up very quickly. Is this all? No; this is part of a comprehensive effort to improve the JPL FPGA design and V&V process.

  14. Exomars Mission Verification Approach

    NASA Astrophysics Data System (ADS)

    Cassi, Carlo; Gilardi, Franco; Bethge, Boris

According to the long-term cooperation plan established by ESA and NASA in June 2009, the ExoMars project now consists of two missions. The first mission will be launched in 2016 under ESA lead, with the objectives of demonstrating the European capability to safely land a surface package on Mars, performing Mars atmosphere investigation, and providing communication capability for present and future ESA/NASA missions. For this mission ESA provides a spacecraft composite, made up of an "Entry Descent & Landing Demonstrator Module (EDM)" and a Mars Orbiter Module (OM); NASA provides the launch vehicle and the scientific instruments located on the Orbiter for Mars atmosphere characterisation. The second mission, with its launch foreseen in 2018, is led by NASA, which provides the spacecraft, launcher, EDL system, and a rover. ESA contributes the ExoMars Rover Module (RM) to provide surface mobility. It includes a drill system allowing drilling down to 2 meters, collecting samples, and investigating them for signs of past and present life with exobiological experiments, as well as investigating the Mars water/geochemical environment. In this scenario Thales Alenia Space Italia, as ESA prime industrial contractor, is in charge of the design, manufacturing, integration and verification of the ESA ExoMars modules, i.e. the spacecraft composite (OM + EDM) for the 2016 mission, the RM for the 2018 mission, and the Rover Operations Control Centre, which will be located at Altec-Turin (Italy). The verification process of the above products is quite complex and will include some peculiarities with limited or no heritage in Europe. Furthermore, the verification approach has to be optimised to allow full verification despite significant schedule and budget constraints. The paper presents the verification philosophy tailored for the ExoMars mission in line with the above considerations, starting from the model philosophy, showing the verification activities flow and the sharing of tests

  15. UV Signature Mutations †

    PubMed Central

    2014-01-01

    Sequencing complete tumor genomes and exomes has sparked the cancer field's interest in mutation signatures for identifying the tumor's carcinogen. This review and meta-analysis discusses signatures and their proper use. We first distinguish between a mutagen's canonical mutations – deviations from a random distribution of base changes to create a pattern typical of that mutagen – and the subset of signature mutations, which are unique to that mutagen and permit inference backward from mutations to mutagen. To verify UV signature mutations, we assembled literature datasets on cells exposed to UVC, UVB, UVA, or solar simulator light (SSL) and tested canonical UV mutation features as criteria for clustering datasets. A confirmed UV signature was: ≥60% of mutations are C→T at a dipyrimidine site, with ≥5% CC→TT. Other canonical features such as a bias for mutations on the non-transcribed strand or at the 3' pyrimidine had limited application. The most robust classifier combined these features with criteria for the rarity of non-UV canonical mutations. In addition, several signatures proposed for specific UV wavelengths were limited to specific genes or species; non-signature mutations induced by UV may cause melanoma BRAF mutations; and the mutagen for sunlight-related skin neoplasms may vary between continents. PMID:25354245
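The confirmed UV-signature criterion stated above (≥60% of mutations are C→T at a dipyrimidine site, with ≥5% CC→TT) reduces to a simple threshold test. A sketch under an assumed mutation-record format (the field names are illustrative, and tandem CC→TT changes are counted as a separate category for simplicity):

```python
def is_uv_signature(mutations):
    # Each mutation is a dict with 'change' (e.g. 'C>T' or the tandem 'CC>TT')
    # and 'dipyrimidine' (True if it occurs at a dipyrimidine site).
    n = len(mutations)
    if n == 0:
        return False
    c_to_t = sum(m['change'] == 'C>T' and m['dipyrimidine'] for m in mutations)
    cc_to_tt = sum(m['change'] == 'CC>TT' for m in mutations)
    return c_to_t / n >= 0.60 and cc_to_tt / n >= 0.05
```

As the review notes, a robust classifier would also check for the rarity of non-UV canonical mutations, which this sketch omits.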

  16. An archaeal genomic signature

    NASA Technical Reports Server (NTRS)

    Graham, D. E.; Overbeek, R.; Olsen, G. J.; Woese, C. R.

    2000-01-01

    Comparisons of complete genome sequences allow the most objective and comprehensive descriptions possible of a lineage's evolution. This communication uses the completed genomes from four major euryarchaeal taxa to define a genomic signature for the Euryarchaeota and, by extension, the Archaea as a whole. The signature is defined in terms of the set of protein-encoding genes found in at least two diverse members of the euryarchaeal taxa that function uniquely within the Archaea; most signature proteins have no recognizable bacterial or eukaryal homologs. By this definition, 351 clusters of signature proteins have been identified. Functions of most proteins in this signature set are currently unknown. At least 70% of the clusters that contain proteins from all the euryarchaeal genomes also have crenarchaeal homologs. This conservative set, which appears refractory to horizontal gene transfer to the Bacteria or the Eukarya, would seem to reflect the significant innovations that were unique and fundamental to the archaeal "design fabric." Genomic protein signature analysis methods may be extended to characterize the evolution of any phylogenetically defined lineage. The complete set of protein clusters for the archaeal genomic signature is presented as supplementary material (see the PNAS web site, www.pnas.org).
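The signature definition above (protein clusters found in at least two diverse euryarchaeal genomes but lacking recognizable bacterial or eukaryal homologs) is essentially a set operation. A sketch with hypothetical genome names and cluster identifiers:

```python
from collections import Counter

def signature_clusters(euryarchaeal, bacterial_homologs, eukaryal_homologs):
    # euryarchaeal: genome name -> set of protein-cluster ids present in it.
    # The other two arguments: cluster ids with recognizable homologs there.
    counts = Counter()
    for clusters in euryarchaeal.values():
        counts.update(clusters)
    shared = {c for c, k in counts.items() if k >= 2}   # in >= 2 diverse taxa
    return shared - bacterial_homologs - eukaryal_homologs
```

The same filter, applied with different genome sets, is what the authors propose for characterizing any phylogenetically defined lineage.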

  17. Twin Signature Schemes, Revisited

    NASA Astrophysics Data System (ADS)

    Schäge, Sven

    In this paper, we revisit the twin signature scheme by Naccache, Pointcheval and Stern from CCS 2001 that is secure under the Strong RSA (SRSA) assumption and improve its efficiency in several ways. First, we present a new twin signature scheme that is based on the Strong Diffie-Hellman (SDH) assumption in bilinear groups and allows for very short signatures and key material. A big advantage of this scheme is that, in contrast to the original scheme, it does not require a computationally expensive function for mapping messages to primes. We prove this new scheme secure under adaptive chosen message attacks. Second, we present a modification that allows to significantly increase efficiency when signing long messages. This construction uses collision-resistant hash functions as its basis. As a result, our improvements make the signature length independent of the message size. Our construction deviates from the standard hash-and-sign approach in which the hash value of the message is signed in place of the message itself. We show that in the case of twin signatures, one can exploit the properties of the hash function as an integral part of the signature scheme. This improvement can be applied to both the SRSA based and SDH based twin signature scheme.

  18. Description of a Computerized, On-Line Interlibrary Loan System.

    ERIC Educational Resources Information Center

    Kilgour, Frederick G.

    This paper describes the first two months of operation of the OCLC interlibrary loan system, an online system designed to increase speed and effectiveness in obtaining interlibrary loans. This system provides (1) bibliographic verification of interlibrary loan records and location of materials by using online union catalog records, (2) automatic…

  19. Extraction and analysis of signatures from the Gene Expression Omnibus by the crowd

    NASA Astrophysics Data System (ADS)

    Wang, Zichen; Monteiro, Caroline D.; Jagodnik, Kathleen M.; Fernandez, Nicolas F.; Gundersen, Gregory W.; Rouillard, Andrew D.; Jenkins, Sherry L.; Feldmann, Axel S.; Hu, Kevin S.; McDermott, Michael G.; Duan, Qiaonan; Clark, Neil R.; Jones, Matthew R.; Kou, Yan; Goff, Troy; Woodland, Holly; Amaral, Fabio M. R.; Szeto, Gregory L.; Fuchs, Oliver; Schüssler-Fiorenza Rose, Sophia M.; Sharma, Shvetank; Schwartz, Uwe; Bausela, Xabier Bengoetxea; Szymkiewicz, Maciej; Maroulis, Vasileios; Salykin, Anton; Barra, Carolina M.; Kruth, Candice D.; Bongio, Nicholas J.; Mathur, Vaibhav; Todoric, Radmila D.; Rubin, Udi E.; Malatras, Apostolos; Fulp, Carl T.; Galindo, John A.; Motiejunaite, Ruta; Jüschke, Christoph; Dishuck, Philip C.; Lahl, Katharina; Jafari, Mohieddin; Aibar, Sara; Zaravinos, Apostolos; Steenhuizen, Linda H.; Allison, Lindsey R.; Gamallo, Pablo; de Andres Segura, Fernando; Dae Devlin, Tyler; Pérez-García, Vicente; Ma'Ayan, Avi

    2016-09-01

    Gene expression data are accumulating exponentially in public repositories. Reanalysis and integration of themed collections from these studies may provide new insights, but requires further human curation. Here we report a crowdsourcing project to annotate and reanalyse a large number of gene expression profiles from Gene Expression Omnibus (GEO). Through a massive open online course on Coursera, over 70 participants from over 25 countries identify and annotate 2,460 single-gene perturbation signatures, 839 disease versus normal signatures, and 906 drug perturbation signatures. All these signatures are unique and are manually validated for quality. Global analysis of these signatures confirms known associations and identifies novel associations between genes, diseases and drugs. The manually curated signatures are used as a training set to develop classifiers for extracting similar signatures from the entire GEO repository. We develop a web portal to serve these signatures for query, download and visualization.

  20. Extraction and analysis of signatures from the Gene Expression Omnibus by the crowd

    PubMed Central

    Wang, Zichen; Monteiro, Caroline D.; Jagodnik, Kathleen M.; Fernandez, Nicolas F.; Gundersen, Gregory W.; Rouillard, Andrew D.; Jenkins, Sherry L.; Feldmann, Axel S.; Hu, Kevin S.; McDermott, Michael G.; Duan, Qiaonan; Clark, Neil R.; Jones, Matthew R.; Kou, Yan; Goff, Troy; Woodland, Holly; Amaral, Fabio M R.; Szeto, Gregory L.; Fuchs, Oliver; Schüssler-Fiorenza Rose, Sophia M.; Sharma, Shvetank; Schwartz, Uwe; Bausela, Xabier Bengoetxea; Szymkiewicz, Maciej; Maroulis, Vasileios; Salykin, Anton; Barra, Carolina M.; Kruth, Candice D.; Bongio, Nicholas J.; Mathur, Vaibhav; Todoric, Radmila D; Rubin, Udi E.; Malatras, Apostolos; Fulp, Carl T.; Galindo, John A.; Motiejunaite, Ruta; Jüschke, Christoph; Dishuck, Philip C.; Lahl, Katharina; Jafari, Mohieddin; Aibar, Sara; Zaravinos, Apostolos; Steenhuizen, Linda H.; Allison, Lindsey R.; Gamallo, Pablo; de Andres Segura, Fernando; Dae Devlin, Tyler; Pérez-García, Vicente; Ma'ayan, Avi

    2016-01-01

    Gene expression data are accumulating exponentially in public repositories. Reanalysis and integration of themed collections from these studies may provide new insights, but requires further human curation. Here we report a crowdsourcing project to annotate and reanalyse a large number of gene expression profiles from Gene Expression Omnibus (GEO). Through a massive open online course on Coursera, over 70 participants from over 25 countries identify and annotate 2,460 single-gene perturbation signatures, 839 disease versus normal signatures, and 906 drug perturbation signatures. All these signatures are unique and are manually validated for quality. Global analysis of these signatures confirms known associations and identifies novel associations between genes, diseases and drugs. The manually curated signatures are used as a training set to develop classifiers for extracting similar signatures from the entire GEO repository. We develop a web portal to serve these signatures for query, download and visualization. PMID:27667448

  1. How to Prove Security of a Signature with a Tighter Security Reduction

    NASA Astrophysics Data System (ADS)

    Guo, Fuchun; Mu, Yi; Susilo, Willy

It is a challenging task to construct a signature scheme that can be tightly reduced to a weak security assumption in the standard model. In this paper, we introduce a simple chameleon-hash-based transformation and show that it can tighten the security reduction of a signature scheme that suffers from a loose one. Taking the Waters signature from Eurocrypt 2005 as an example, we demonstrate an improvement of the security reduction in which the probability of success can be made constant and independent of the number of signature queries from an adversary. Our reduction methodology has never been considered in the literature and is applicable to many signature schemes, such as identity-based signature schemes, online/offline signatures, and signatures with strong unforgeability.

  2. A Quantum Multi-Proxy Weak Blind Signature Scheme Based on Entanglement Swapping

    NASA Astrophysics Data System (ADS)

    Yan, LiLi; Chang, Yan; Zhang, ShiBin; Han, GuiHua; Sheng, ZhiWei

    2017-02-01

In this paper, we present a multi-proxy weak blind signature scheme based on quantum entanglement swapping of Bell states. In the scheme, proxy signers can complete the signature on behalf of the original signer, with his/her authority. It can be applied to electronic voting systems, electronic payment systems, etc. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. It guarantees not only unconditional security but also the anonymity of the message owner. The security analysis shows that the scheme satisfies the security features of a multi-proxy weak blind signature: signers cannot disavow their signatures, the signature cannot be forged by others, and the message owner can be traced.

  3. A Quantum Multi-Proxy Weak Blind Signature Scheme Based on Entanglement Swapping

    NASA Astrophysics Data System (ADS)

    Yan, LiLi; Chang, Yan; Zhang, ShiBin; Han, GuiHua; Sheng, ZhiWei

    2016-11-01

In this paper, we present a multi-proxy weak blind signature scheme based on quantum entanglement swapping of Bell states. In the scheme, proxy signers can complete the signature on behalf of the original signer, with his/her authority. It can be applied to electronic voting systems, electronic payment systems, etc. The scheme uses the physical characteristics of quantum mechanics to implement delegation, signature and verification. It guarantees not only unconditional security but also the anonymity of the message owner. The security analysis shows that the scheme satisfies the security features of a multi-proxy weak blind signature: signers cannot disavow their signatures, the signature cannot be forged by others, and the message owner can be traced.

  4. Improved Verification for Aerospace Systems

    NASA Technical Reports Server (NTRS)

    Powell, Mark A.

    2008-01-01

    Aerospace systems are subject to many stringent performance requirements to be verified with low risk. This report investigates verification planning using conditional approaches vice the standard classical statistical methods, and usage of historical surrogate data for requirement validation and in verification planning. The example used in this report to illustrate the results of these investigations is a proposed mission assurance requirement with the concomitant maximum acceptable verification risk for the NASA Constellation Program Orion Launch Abort System (LAS). This report demonstrates the following improvements: 1) verification planning using conditional approaches vice classical statistical methods results in plans that are more achievable and feasible; 2) historical surrogate data can be used to bound validation of performance requirements; and, 3) incorporation of historical surrogate data in verification planning using conditional approaches produces even less costly and more reasonable verification plans. The procedures presented in this report may produce similar improvements and cost savings in verification for any stringent performance requirement for an aerospace system.

  5. Are there molecular signatures?

    SciTech Connect

    Bennett, W.P.

    1995-10-01

This report describes molecular signatures and mutational spectrum analysis. The mutation spectrum is defined as the type and location of DNA base changes. There are currently about five well-documented cases. Mutations and radon-associated tumors are discussed.

  6. Meteor signature interpretation

    SciTech Connect

    Canavan, G.H.

    1997-01-01

Meteor signatures contain information about the constituents of space debris and present potential false alarms to early warning systems. Better models could both extract the maximum scientific information possible and reduce this danger. Accurate predictions can be produced by models of modest complexity, which can be inverted to predict the sizes, compositions, and trajectories of objects from their signatures for most objects of interest and concern.

  7. The 70-Gene Prognostic Signature for Korean Breast Cancer Patients

    PubMed Central

    Na, Kuk Young; Lee, Jeong Eon; Kim, Hee Jeong; Yang, Jung-Hyun; Ahn, Sei-Hyun; Moon, Byung-In; Kim, Ra Mi; Ko, Si Mon; Jung, Yong Sik

    2011-01-01

    Purpose A 70-gene prognostic signature has prognostic value in patients with node-negative breast cancer in Europe. This diagnostic test known as "MammaPrint™ (70-gene prognostic signature)" was recently validated and implementation was feasible. Therefore, we assessed the 70-gene prognostic signature in Korean patients with breast cancer. We compared the risk predicted by the 70-gene prognostic signature with commonly used clinicopathological guidelines among Korean patients with breast cancer. We also analyzed the 70-gene prognostic signature and clinicopathological feature of the patients in comparison with a previous validation study. Methods Forty-eight eligible patients with breast cancer (clinical T1-2N0M0) were selected from four hospitals in Korea. Fresh tumor samples were analyzed with a customized microarray for the 70-gene prognostic signature. Concordance between the risk predicted by the 70-gene prognostic signature and risk predicted by commonly used clinicopathological guidelines (St. Gallen guidelines, National Institutes of Health [NIH] guideline, and Adjuvant! Online) was evaluated. Results Prognosis signatures were assessed in 36 patients. No significant differences were observed in the clinicopathological features of patients compared with previous studies. The 70-gene prognosis signature identified five (13.9%) patients with a low-risk prognosis signature and 31 (86.1%) patients with a high-risk prognosis signature. Clinical risk was concordant with the prognosis signature for 29 patients (80.6%) according to the St. Gallen guidelines; 30 patients (83.4%) according to the NIH guidelines; and 23 patients (63.8%) according to the Adjuvant! Online. Our results were different from previous validation studies in Europe with about a 40% low-risk prognosis and about a 60% high-risk prognosis. The high incidence in the high-risk group was consistent with data in Japan. Conclusion The results of 70-gene prognostic signature of Korean patients with

  8. Microcode Verification Project.

    DTIC Science & Technology

    1980-05-01

Microcode Verification Project, University of Southern California, Stephen D. ... in the production, testing, and maintenance of Air Force software. This effort was undertaken in response to that goal. The objective of the effort was ... rather than hard wiring, is a recent development in computer technology. Hardware diagnostics do not fulfill testing requirements for these computers.

  9. Robust verification analysis

    NASA Astrophysics Data System (ADS)

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-01

We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics, and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences incorporates expert judgment directly into the process via a flexible optimization framework and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis and, together with the robust statistics, guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology solves the verification model via multiple constrained optimization problems, in a manner that varies the underlying assumptions of the analysis. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high-quality results for well-behaved cases, relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances, predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.
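As a rough illustration of the median-based core of this approach, the sketch below fits the standard error model f(h) = f* + A·h^p over all refinement triples of a grid sequence, clips each recovered convergence order to hypothetical expert-judgment bounds, and reports the median. The function names, bounds, and manufactured data are illustrative assumptions, not the authors' implementation.

```python
import numpy as np
from itertools import combinations

def robust_order(hs, fs, p_min=0.5, p_max=4.0):
    """Median convergence order over all grid triples that share a
    constant refinement ratio, with each estimate constrained to
    expert-judgment bounds [p_min, p_max]."""
    orders = []
    for i, j, k in combinations(range(len(hs)), 3):
        h0, h1, h2 = hs[i], hs[j], hs[k]
        if not np.isclose(h0 / h1, h1 / h2):
            continue  # triple must have a single refinement ratio
        ratio = (fs[i] - fs[j]) / (fs[j] - fs[k])
        if ratio <= 0:
            continue  # non-monotone (ill-behaved) triple: skip it
        p = np.log(ratio) / np.log(h0 / h1)
        orders.append(np.clip(p, p_min, p_max))
    return float(np.median(orders))

# Manufactured sequence f(h) = 1 + 2*h**2: second-order convergence.
hs = [0.4, 0.2, 0.1, 0.05]
fs = [1 + 2 * h**2 for h in hs]
print(robust_order(hs, fs))  # ≈ 2.0 for this clean sequence
```

On a well-behaved sequence every admissible triple agrees; on an ill-behaved one, the skip rules and the median keep anomalous triples from dominating the estimate, which is the spirit of the robust statistics described above.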

  10. Robust verification analysis

    SciTech Connect

    Rider, William; Witkowski, Walt; Kamm, James R.; Wildey, Tim

    2016-02-15

We introduce a new methodology for inferring the accuracy of computational simulations through the practice of solution verification. We demonstrate this methodology on examples from computational heat transfer, fluid dynamics, and radiation transport. Our methodology is suited to both well- and ill-behaved sequences of simulations. Our approach to the analysis of these sequences incorporates expert judgment directly into the process via a flexible optimization framework and the application of robust statistics. The expert judgment is systematically applied as constraints to the analysis and, together with the robust statistics, guards against over-emphasis on anomalous analysis results. We have named our methodology Robust Verification. Our methodology solves the verification model via multiple constrained optimization problems, in a manner that varies the underlying assumptions of the analysis. Constraints applied in the analysis can include expert judgment regarding convergence rates (bounds and expectations) as well as bounding values for physical quantities (e.g., positivity of energy or density). This approach produces a number of error models, which are then analyzed through robust statistical techniques (median instead of mean statistics). This provides self-contained, data- and expert-informed error estimation, including uncertainties for both the solution itself and the order of convergence. Our method produces high-quality results for well-behaved cases, relatively consistent with existing practice. The methodology can also produce reliable results for ill-behaved circumstances, predicated on appropriate expert judgment. We demonstrate the method and compare the results with standard approaches used for both code and solution verification on well-behaved and ill-behaved simulations.

  11. RESRAD-BUILD verification.

    SciTech Connect

    Kamboj, S.; Yu, C.; Biwer, B. M.; Klett, T.

    2002-01-31

The results generated by the RESRAD-BUILD code (version 3.0) were verified with hand or spreadsheet calculations using equations given in the RESRAD-BUILD manual for different pathways. For verification purposes, different radionuclides--H-3, C-14, Na-22, Al-26, Cl-36, Mn-54, Co-60, Au-195, Ra-226, Ra-228, Th-228, and U-238--were chosen to test all pathways and models. Tritium, Ra-226, and Th-228 were chosen because of the special tritium and radon models in the RESRAD-BUILD code. Other radionuclides were selected to represent a spectrum of radiation types and energies. Verification of the RESRAD-BUILD code was conducted with an initial check of all the input parameters for correctness against their original source documents. Verification of the calculations was performed external to the RESRAD-BUILD code with Microsoft® Excel to verify all the major portions of the code. In some cases, RESRAD-BUILD results were compared with those of external codes, such as MCNP (Monte Carlo N-particle) and RESRAD. The verification was conducted on a step-by-step basis and used different test cases as templates. The following types of calculations were investigated: (1) source injection rate, (2) air concentration in the room, (3) air particulate deposition, (4) radon pathway model, (5) tritium model for volume source, (6) external exposure model, (7) different pathway doses, and (8) time dependence of dose. Some minor errors were identified in version 3.0; these errors have been corrected in later versions of the code. Some possible improvements in the code were also identified.

  12. Computer-Aided High Precision Verification Of Miniature Spring Structure

    NASA Astrophysics Data System (ADS)

    Bow, Sing T.; Wang, Da-hao; Chen, Tsung-sheng; Newell, Darrell E.

    1990-01-01

A system is proposed for the high-precision on-line verification of miniature spring structures, including the overall height, the diameters of the various coils, and the pitches between neighboring coils of miniature conical springs. High-precision measurements without physical contact and with short processing times are achieved. Deformations of any kind on the conical springs can be identified, even from the worst viewing direction.

  13. Invisibly Sanitizable Signature without Pairings

    NASA Astrophysics Data System (ADS)

    Yum, Dae Hyun; Lee, Pil Joong

Sanitizable signatures allow sanitizers to delete some pre-determined parts of a signed document without invalidating the signature. While ordinary sanitizable signatures allow verifiers to know how many subdocuments have been sanitized, invisibly sanitizable signatures do not leave any clue about the sanitized subdocuments; verifiers do not know whether or not sanitizing has been performed. The previous invisibly sanitizable signature scheme was constructed from an aggregate signature scheme with pairings. In this article, we present the first invisibly sanitizable signature scheme that does not use pairings. Our proposed scheme is secure under the RSA assumption.

  14. Quantum money with classical verification

    SciTech Connect

    Gavinsky, Dmitry

    2014-12-04

We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  15. Quantum money with classical verification

    NASA Astrophysics Data System (ADS)

    Gavinsky, Dmitry

    2014-12-01

We propose and construct a quantum money scheme that allows verification through classical communication with a bank. This is the first demonstration that a secure quantum money scheme exists that does not require quantum communication for coin verification. Our scheme is secure against adaptive adversaries; this property is not directly related to the possibility of classical verification, but none of the earlier quantum money constructions is known to possess it.

  16. Variability study of Ka-band HRR polarimetric signatures on 11 T-72 tanks

    NASA Astrophysics Data System (ADS)

    Nixon, William E.; Neilson, H. J.; Szatkowski, G. N.; Giles, Robert H.; Kersey, William T.; Perkins, L. C.; Waldman, Jerry

    1998-09-01

In an effort to effectively understand signature verification requirements through the variability of a structure's RCS characteristics, the U.S. Army National Ground Intelligence Center (NGIC), with technical support from STL, originated a signature project plan to obtain MMW signatures from multiple similar tanks. In implementing this plan, NGIC/STL directed and sponsored turntable measurements performed by the U.S. Army Research Laboratory Sensors and Electromagnetic Resource Directorate on eleven T-72 tanks using an HRR full-polarimetric Ka-band radar. The physical condition and configuration of these vehicles were documented by careful inspection and then photographed during the acquisition sequence at 45° azimuth intervals. The turntable signature of one vehicle was acquired eight times over the three-day signature acquisition period to establish the measurement variability on any single target. At several intervals between target measurements, the turntable signature of a 30 m² trihedral was also acquired as a calibration reference for the signature library. Through an RCS goodness-of-fit correlation and ISAR comparison study, the signature-to-signature variability was evaluated for the eighteen HRR turntable measurements of the T-72 tanks. This signature data is available from NGIC on request for Government Agencies and Government Contractors with an established need-to-know.

  17. Practical quantum digital signature

    NASA Astrophysics Data System (ADS)

    Yin, Hua-Lei; Fu, Yao; Chen, Zeng-Bing

    2016-03-01

Guaranteeing nonrepudiation, unforgeability, and transferability of a signature is one of the most vital safeguards in today's e-commerce era. Based on fundamental laws of quantum physics, quantum digital signature (QDS) aims to provide information-theoretic security for this cryptographic task. However, to date, the previously proposed QDS protocols have been impractical due to various challenging problems, most importantly the requirement of authenticated (secure) quantum channels between participants. Here, we present the first quantum digital signature protocol that removes the assumption of authenticated quantum channels while remaining secure against collective attacks. Moreover, our QDS protocol can be practically implemented over more than 100 km under the current mature technology used in quantum key distribution.

  18. Factor models for cancer signatures

    NASA Astrophysics Data System (ADS)

    Kakushadze, Zura; Yu, Willie

    2016-11-01

    We present a novel method for extracting cancer signatures by applying statistical risk models (http://ssrn.com/abstract=2732453) from quantitative finance to cancer genome data. Using 1389 whole genome sequenced samples from 14 cancers, we identify an "overall" mode of somatic mutational noise. We give a prescription for factoring out this noise and source code for fixing the number of signatures. We apply nonnegative matrix factorization (NMF) to genome data aggregated by cancer subtype and filtered using our method. The resultant signatures have substantially lower variability than those from unfiltered data. Also, the computational cost of signature extraction is cut by about a factor of 10. We find 3 novel cancer signatures, including a liver cancer dominant signature (96% contribution) and a renal cell carcinoma signature (70% contribution). Our method accelerates finding new cancer signatures and improves their overall stability. Reciprocally, the methods for extracting cancer signatures could have interesting applications in quantitative finance.
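A minimal sketch of the NMF step this abstract builds on, using plain NumPy multiplicative updates on a toy mutation catalog. The planted signatures, matrix sizes, and iteration count are illustrative assumptions; the authors' noise-factoring and filtering steps are not reproduced here.

```python
import numpy as np

def nmf(M, k, iters=500, seed=0):
    """Multiplicative-update NMF: factor M (mutation types x samples)
    as W @ H, with W holding k candidate signatures and H their
    per-sample exposures (all entries nonnegative)."""
    rng = np.random.default_rng(seed)
    m, n = M.shape
    W = rng.random((m, k)) + 1e-3
    H = rng.random((k, n)) + 1e-3
    for _ in range(iters):
        H *= (W.T @ M) / (W.T @ W @ H + 1e-12)
        W *= (M @ H.T) / (W @ H @ H.T + 1e-12)
    s = W.sum(axis=0)  # normalize each signature to sum to 1,
    W /= s             # moving the scale into the exposures
    H *= s[:, None]
    return W, H

# Toy catalog built from two planted signatures over 3 mutation types.
rng = np.random.default_rng(1)
truth = np.array([[0.7, 0.0],
                  [0.3, 0.1],
                  [0.0, 0.9]])
exposures = rng.random((2, 8))  # 8 hypothetical samples
M = truth @ exposures
W, H = nmf(M, k=2)
print(np.linalg.norm(M - W @ H) / np.linalg.norm(M))  # small residual
```

Real signature extraction runs on 96-channel mutation catalogs and repeats the factorization over resampled data to assess stability; this sketch only shows the factorization core.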

  19. Current signature sensor

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Inventor); Lucena, Angel (Inventor); Ihlefeld, Curtis (Inventor); Burns, Bradley (Inventor); Bassignani, Karin E. (Inventor)

    2005-01-01

A solenoid health monitoring system uses a signal conditioner and controller assembly that, in one embodiment, includes analog circuitry and a DSP controller. The analog circuitry provides signal conditioning to the low-level raw signal coming from a signal acquisition assembly. Software running in the DSP analyzes the incoming data (the recorded current signature) and determines the state of the solenoid: energized, de-energized, or transitioning. In one embodiment, the software identifies key features in the current signature during the transition phase and is thereby able to determine the health of the solenoid.

  20. Current Signature Sensor

    NASA Technical Reports Server (NTRS)

    Perotti, Jose M. (Inventor); Lucena, Angel (Inventor); Ihlefeld, Curtis (Inventor); Burns, Bradley (Inventor); Bassignani, Mario (Inventor); Bassignani, Karin E. (Inventor)

    2005-01-01

A solenoid health monitoring system uses a signal conditioner and controller assembly that, in one embodiment, includes analog circuitry and a DSP controller. The analog circuitry provides signal conditioning to the low-level raw signal coming from a signal acquisition assembly. Software running in the DSP analyzes the incoming data (the recorded current signature) and determines the state of the solenoid: energized, de-energized, or transitioning. In one embodiment, the software identifies key features in the current signature during the transition phase and is thereby able to determine the health of the solenoid.

  1. Method and computer product to increase accuracy of time-based software verification for sensor networks

    DOEpatents

    Foo Kune, Denis [Saint Paul, MN]; Mahadevan, Karthikeyan [Mountain View, CA]

    2011-01-25

A recursive verification protocol that reduces the time variance due to network delays, by putting the subject node at most one hop from the verifier node, provides an efficient means of testing wireless sensor nodes. Since the software signatures are time based, recursive testing gives a much cleaner signal for positive verification of the software running on any one node in the sensor network. In this protocol, the main verifier checks its neighbor, which in turn checks its neighbor, continuing until all nodes have been verified. This ensures minimum time delays for the software verification. Should a node fail the test, software verification downstream of it is halted until an alternative path (one not including the failed node) is found. Utilizing techniques well known in the art, testing a node twice, or not at all, can be avoided.
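The one-hop chain described in this abstract can be sketched as a breadth-first walk in which every verified node becomes the verifier for its own neighbors. The `check` callback standing in for the timed software-signature test, and the five-node topology, are hypothetical.

```python
from collections import deque

def chain_verify(adjacency, verifier, check):
    """One-hop recursive verification: starting from the trusted
    verifier, each verified node checks its unverified neighbors.
    check(parent, child) stands in for the timed software-signature
    test; a failing node halts verification downstream of it, while
    alternative paths avoiding the failed node are still explored."""
    verified, failed = {verifier}, set()
    frontier = deque([verifier])
    while frontier:
        node = frontier.popleft()
        for nb in adjacency[node]:
            if nb in verified or nb in failed:
                continue  # each node is tested at most once
            if check(node, nb):
                verified.add(nb)
                frontier.append(nb)  # nb now verifies its neighbors
            else:
                failed.add(nb)       # halt downstream of nb
    return verified, failed

# Hypothetical 5-node network; node 3 runs tampered software.
adj = {0: [1, 2], 1: [0, 3], 2: [0, 3, 4], 3: [1, 2, 4], 4: [2, 3]}
ok = lambda parent, child: child != 3
verified, failed = chain_verify(adj, verifier=0, check=ok)
print(sorted(verified), sorted(failed))  # node 4 is reached via 2
```

Because each test crosses only one hop, the network delay seen by any single check is bounded, which is what keeps the time-based signature measurement clean.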

  2. Verification of LHS distributions.

    SciTech Connect

    Swiler, Laura Painton

    2006-04-01

This document provides verification test results for the normal, lognormal, and uniform distributions used in Sandia's Latin Hypercube Sampling (LHS) software. The purpose of this testing is to verify that the sample values being generated in LHS are distributed according to the desired distribution types. The testing of distribution correctness is done by examining summary statistics, graphical comparisons using quantile-quantile plots, and formal statistical tests such as the chi-square test, the Kolmogorov-Smirnov test, and the Anderson-Darling test. The overall results from the testing indicate that the generation of normal, lognormal, and uniform distributions in LHS is acceptable.
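A small sketch of the kind of check described: draw a Latin Hypercube sample of the uniform distribution and compare its empirical CDF to U(0,1) with a hand-rolled one-sample Kolmogorov-Smirnov statistic. The sample size and the ~5% critical-value approximation 1.36/√n are illustrative; this is not Sandia's LHS code.

```python
import numpy as np

def lhs_uniform(n, seed=0):
    """Latin Hypercube sample of U(0,1): one point in each of n
    equal-probability strata, jittered within its stratum, then
    shuffled so the sample order carries no structure."""
    rng = np.random.default_rng(seed)
    strata = (np.arange(n) + rng.random(n)) / n
    return rng.permutation(strata)

def ks_statistic(x):
    """One-sample Kolmogorov-Smirnov statistic against U(0,1)."""
    x = np.sort(x)
    n = len(x)
    cdf = x  # the U(0,1) CDF is the identity on [0, 1]
    d_plus = np.max(np.arange(1, n + 1) / n - cdf)
    d_minus = np.max(cdf - np.arange(n) / n)
    return max(d_plus, d_minus)

n = 1000
sample = lhs_uniform(n)
d = ks_statistic(sample)
print(d < 1.36 / np.sqrt(n))  # below the ~5% critical value
```

Stratification makes the LHS empirical CDF track the target CDF far more tightly than plain random sampling, so the KS statistic here is roughly 1/n rather than the ~1/√n typical of unstratified draws.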

  3. Production readiness verification testing

    NASA Technical Reports Server (NTRS)

    James, A. M.; Bohon, H. L.

    1980-01-01

A Production Readiness Verification Testing (PRVT) program has been established to determine whether structures fabricated from advanced composites can be committed on a production basis to commercial airline service. The program utilizes subcomponents that reflect the variabilities in structure that can realistically be expected from current production and quality control technology to estimate the production qualities, the variation in static strength, and the durability of advanced composite structures. The results of the static tests, and a durability assessment after one year of continuous load/environment testing of twenty-two duplicates of each of two structural components (a segment of the front spar and cover of a vertical stabilizer box structure), are discussed.

  4. Implementation framework for digital signatures for electronic data interchange in healthcare.

    PubMed

    De Moor, Georges; Claerhout, Brecht; De Meyer, Filip

    2004-01-01

This paper aims to propose an action plan for the deployment of digital signatures in Belgian healthcare. The action plan is the result of a number of technical, legal, and organisational requirements. It starts by establishing the functional components needed to set up a framework for the deployment of digital signatures. The main components should implement an infrastructure for: the creation of digital signatures; the verification of digital signatures; the certification of signature keys; the certification of attributes; and the handling of revocation. The tasks in the action plan are the logical consequence of all the functions that need to be addressed. The objective of this report is to list what has to be done and how it can be done in the context of healthcare, rather than to state who will perform the functions required.

  5. Software Verification and Validation Procedure

    SciTech Connect

    Olund, Thomas S.

    2008-09-15

    This Software Verification and Validation procedure provides the action steps for the Tank Waste Information Network System (TWINS) testing process. The primary objective of the testing process is to provide assurance that the software functions as intended, and meets the requirements specified by the client. Verification and validation establish the primary basis for TWINS software product acceptance.

  6. HDL to verification logic translator

    NASA Technical Reports Server (NTRS)

    Gambles, J. W.; Windley, P. J.

    1992-01-01

The increasingly high transistor counts possible in VLSI circuits compound the difficulty of ensuring correct designs. As the number of test cases required to exhaustively simulate a circuit design explodes, a better method is required to confirm the absence of design faults. Formal verification methods provide a way to prove, using logic, that a circuit structure correctly implements its specification. Before verification is accepted by VLSI design engineers, the stand-alone verification tools in use in the research community must be integrated with the CAD tools used by the designers. One problem facing the acceptance of formal verification into circuit design methodology is that the structural circuit descriptions used by the designers are not appropriate for verification work, and those required for verification lack some of the features needed for design. We offer a solution to this dilemma: an automatic translation from the designers' HDL models into definitions for the higher-order logic (HOL) verification system. The translated definitions become the low-level basis of circuit verification, which in turn increases the designer's confidence in the correctness of higher-level behavioral models.

  7. Deductive Verification of Cryptographic Software

    NASA Technical Reports Server (NTRS)

    Almeida, Jose Barcelar; Barbosa, Manuel; Pinto, Jorge Sousa; Vieira, Barbara

    2009-01-01

    We report on the application of an off-the-shelf verification platform to the RC4 stream cipher cryptographic software implementation (as available in the openSSL library), and introduce a deductive verification technique based on self-composition for proving the absence of error propagation.

  8. TFE Verification Program

    SciTech Connect

    Not Available

    1993-05-01

    The objective of the semiannual progress report is to summarize the technical results obtained during the latest reporting period. The information presented herein will include evaluated test data, design evaluations, the results of analyses and the significance of results. The program objective is to demonstrate the technology readiness of a TFE (thermionic fuel element) suitable for use as the basic element in a thermionic reactor with electric power output in the 0.5 to 5.0 MW(e) range, and a full-power life of 7 years. The TFE Verification Program builds directly on the technology and data base developed in the 1960s and early 1970s in an AEC/NASA program, and in the SP-100 program conducted in 1983, 1984 and 1985. In the SP-100 program, the attractive features of thermionic power conversion technology were recognized but concern was expressed over the lack of fast reactor irradiation data. The TFE Verification Program addresses this concern.

  9. A Signature Style

    ERIC Educational Resources Information Center

    Smiles, Robin V.

    2005-01-01

This article discusses Dr. Amalia Amaki and her approach to art, her signature style of turning everyday items into fine art. Amaki is an assistant professor of art, art history, and Black American studies at the University of Delaware. She loves taking an unexpected object and redefining it in the context of art, like a button, a fan, a faded…

  10. Data requirements for verification of ram glow chemistry

    NASA Technical Reports Server (NTRS)

    Swenson, G. R.; Mende, S. B.

    1985-01-01

A set of questions is posed regarding the surface chemistry producing the ram glow on the space shuttle. The questions surround verification of the chemical cycle involved in the physical processes leading to the glow. The questions, and a matrix of measurements required for most of the answers, are presented. The measurements include knowledge of the flux composition to and from a ram surface as well as spectroscopic signatures from the UV through the visible to the IR. A pallet set of experiments proposed to accomplish the measurements is discussed. An interim experiment involving an available infrared instrument to be operated from the shuttle Orbiter cabin is also discussed.

  11. Identification of host response signatures of infection.

    SciTech Connect

    Branda, Steven S.; Sinha, Anupama; Bent, Zachary

    2013-02-01

Biological weapons of mass destruction and emerging infectious diseases represent a serious and growing threat to our national security. Effective response to a bioattack or disease outbreak critically depends upon efficiently and reliably distinguishing infected from healthy individuals, to enable rational use of scarce, invasive, and/or costly countermeasures (diagnostics, therapies, quarantine). Screening based on direct detection of the causative pathogen can be problematic, because culture- and probe-based assays are confounded by unanticipated pathogens (e.g., deeply diverged or engineered), and readily accessible specimens (e.g., blood) often contain little or no pathogen, particularly at pre-symptomatic stages of disease. Thus, in addition to the pathogen itself, one would like to detect infection-specific host response signatures in the specimen, preferably ones composed of nucleic acids (NA), which can be recovered and amplified from tiny specimens (e.g., fingerstick draws). Proof-of-concept studies have not been definitive, however, largely due to use of sub-optimal sample preparation and detection technologies. For purposes of pathogen detection, Sandia has developed novel molecular biology methods that enable selective isolation of NA unique to, or shared between, complex samples, followed by identification and quantitation via Second Generation Sequencing (SGS). The central hypothesis of the current study is that variations on this approach will support efficient identification and verification of NA-based host response signatures of infectious disease. To test this hypothesis, we re-engineered Sandia's sample preparation pipelines and developed new SGS data analysis tools and strategies in order to pioneer the use of SGS for identification of host NA correlating with infection. Proof-of-concept studies were carried out using specimens drawn from pathogen-infected non-human primates (NHP). This work provides a strong foundation for

  12. Field Instructors and Online Training: An Exploratory Survey

    ERIC Educational Resources Information Center

    Dedman, Denise E.; Palmer, Louann Bierlein

    2011-01-01

    Despite field placement being the signature pedagogy of the social work profession, little research exists regarding methods for training field instructors. This study captures their perceptions regarding the use of online training. An online survey of 642 field instructors from 4 universities produced 208 responses. Less than 4% rejected the idea…

  13. Verification of VENTSAR

    SciTech Connect

    Simpkins, A.A.

    1995-01-01

    The VENTSAR code is an upgraded and improved version of the VENTX code, which estimates concentrations on or near a building from a release at a nearby location. The code calculates the concentrations either for a given meteorological exceedance probability or for a given stability and wind speed combination. A single building can be modeled which lies in the path of the plume, or a penthouse can be added to the top of the building. Plume rise may also be considered. Release types can be either chemical or radioactive. Downwind concentrations are determined at user-specified incremental distances. This verification report was prepared to demonstrate that VENTSAR is properly executing all algorithms and transferring data. Hand calculations were also performed to ensure proper application of methodologies.

  14. Shift Verification and Validation

    SciTech Connect

    Pandya, Tara M.; Evans, Thomas M.; Davidson, Gregory G; Johnson, Seth R.; Godfrey, Andrew T.

    2016-09-07

    This documentation outlines the verification and validation of Shift for the Consortium for Advanced Simulation of Light Water Reactors (CASL). Five main types of problems were used for validation: small criticality benchmark problems; full-core reactor benchmarks for light water reactors; fixed-source coupled neutron-photon dosimetry benchmarks; depletion/burnup benchmarks; and full-core reactor performance benchmarks. We compared Shift results to measured data and other simulated Monte Carlo radiation transport code results, and found very good agreement in a variety of comparison measures. These include prediction of critical eigenvalue, radial and axial pin power distributions, rod worth, leakage spectra, and nuclide inventories over a burn cycle. Based on this validation of Shift, we are confident in Shift to provide reference results for CASL benchmarking.

  15. Wake Signature Detection

    NASA Astrophysics Data System (ADS)

    Spedding, Geoffrey R.

    2014-01-01

    An accumulated body of quantitative evidence shows that bluff-body wakes in stably stratified environments have an unusual degree of coherence and organization, so characteristic geometries such as arrays of alternating-signed vortices have very long lifetimes, as measured in units of buoyancy timescales, or in the downstream distance scaled by a body length. The combination of pattern geometry and persistence renders the detection of these wakes possible in principle. It now appears that identifiable signatures can be found from many disparate sources: Islands, fish, and plankton all have been noted to generate features that can be detected by climate modelers, hopeful navigators in open oceans, or hungry predators. The various types of wakes are reviewed with notes on why their signatures are important and to whom. A general theory of wake pattern formation is lacking and would have to span many orders of magnitude in Reynolds number.

  16. SMAWT Signature Test

    DTIC Science & Technology

    1974-10-01

...were generally inversely proportional to the size assessments of the flash and smoke. Table 26 shows the percent of change in average judgments of... Average Time of Gunner's View Obscuration by Smoke During Firings From the Wood Line... Average Obscuration Times of Gunner's View Obscuration by Smoke - Grass Line... Normalized Comparisons of the Relative Grades Assigned to Systems Signature Components

  17. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  18. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  19. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  20. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  1. 40 CFR 1065.390 - PM balance verifications and weighing process verification.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false PM balance verifications and weighing... § 1065.390 PM balance verifications and weighing process verification. (a) Scope and frequency. This section describes three verifications. (1) Independent verification of PM balance performance within...

  2. 76 FR 20536 - Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-04-13

... From the Federal Register Online via the Government Publishing Office. ENVIRONMENTAL PROTECTION AGENCY, 40 CFR Part 75, RIN 2060-AQ06, Protocol Gas Verification Program and Minimum Competency Requirements for Air Emission Testing. Correction: In rule document 2011-6216, appearing on pages 17288-17325 in...

  3. Knowledge Signatures for Information Integration

    SciTech Connect

    Thomson, Judi; Cowell, Andrew J.; Paulson, Patrick R.; Butner, R. Scott; Whiting, Mark A.

    2003-10-25

    This paper introduces the notion of a knowledge signature: a concise, ontologically-driven representation of the semantic characteristics of data. Knowledge signatures provide programmatic access to data semantics while allowing comparisons to be made across different types of data such as text, images or video, enabling efficient, automated information integration. Through observation, which determines the degree of association between data and ontological concepts, and refinement, which uses the axioms and structure of the domain ontology to place the signature more accurately within the context of the domain, knowledge signatures can be created. A comparison of such signatures for two different pieces of data results in a measure of their semantic separation. This paper discusses the definition of knowledge signatures along with the design and prototype implementation of a knowledge signature generator.
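One way to picture the final comparison step is to treat a knowledge signature as a vector of concept-association degrees and measure semantic separation as a distance between vectors. The concept list, the scores, and the cosine-distance choice below are illustrative assumptions; the paper's observation and refinement stages are not modeled.

```python
import math

# Hypothetical ontology concepts and association scores in [0, 1].
CONCEPTS = ["vehicle", "weapon", "terrain", "person"]

def separation(sig_a, sig_b):
    """Semantic separation of two knowledge signatures, taken here
    as the cosine distance between concept-association vectors:
    0.0 means semantically identical, values near 1.0 unrelated."""
    dot = sum(a * b for a, b in zip(sig_a, sig_b))
    norm_a = math.sqrt(sum(a * a for a in sig_a))
    norm_b = math.sqrt(sum(b * b for b in sig_b))
    return 1.0 - dot / (norm_a * norm_b)

text_sig = [0.9, 0.1, 0.4, 0.2]   # e.g. scores observed for a text
image_sig = [0.8, 0.0, 0.5, 0.1]  # e.g. scores observed for an image
print(separation(text_sig, image_sig))
```

The appeal of this representation is exactly what the abstract notes: once both a text and an image are projected onto the same concept vector, they become directly comparable regardless of media type.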

  4. Generic interpreters and microprocessor verification

    NASA Technical Reports Server (NTRS)

    Windley, Phillip J.

    1990-01-01

    The following topics are covered in viewgraph form: (1) generic interpreters; (2) Viper microprocessors; (3) microprocessor verification; (4) determining correctness; (5) hierarchical decomposition; (6) interpreter theory; (7) AVM-1; (8) phase-level specification; and future work.

  5. Search for signatures in miRNAs associated with cancer

    PubMed Central

    Kothandan, Ram; Biswas, Sumit

    2013-01-01

Since the first discovery in the early 1990s, the predicted and validated population of microRNAs (miRNAs or miRs) has grown significantly. These small (~22 nucleotides long) regulators of gene expression have also been implicated in and associated with several genes in the cancer pathway. Globally, the identification and verification of microRNAs as biomarkers for cancer cell types has been the area of thrust for most miRNA biologists. However, there has been a noticeable vacuum when it comes to identifying a common signature or trademark that could demarcate a miR as associated with the development or suppression of cancer. To answer these questions, we report an in silico study involving the identification of global signatures in experimentally validated microRNAs that have been associated with cancer. This study has revealed the presence of significant common signatures, viz., sequential and hybridization signatures, which may distinguish a miR as associated with cancer. Based on our analysis, we suggest the utility of such signatures in the design and development of algorithms for prediction of miRs involved in the cancer pathway. PMID:23861569

  6. Reliable Entanglement Verification

    NASA Astrophysics Data System (ADS)

    Arrazola, Juan; Gittsovich, Oleg; Donohue, John; Lavoie, Jonathan; Resch, Kevin; Lütkenhaus, Norbert

    2013-05-01

Entanglement plays a central role in quantum protocols. It is therefore important to be able to verify the presence of entanglement in physical systems from experimental data. In the evaluation of these data, the proper treatment of statistical effects requires special attention, as one can never claim to have verified the presence of entanglement with certainty. Recently, increased attention has been paid to the development of proper frameworks to pose and answer these types of questions. In this work, we apply recent results by Christandl and Renner on reliable quantum state tomography to construct a reliable entanglement verification procedure based on the concept of confidence regions. The statements made require neither the specification of a prior distribution nor the assumption of an independent and identically distributed (i.i.d.) source of states. Moreover, we develop efficient numerical tools that are necessary to employ this approach in practice, rendering the procedure ready to be employed in current experiments. We demonstrate this fact by analyzing the data of an experiment where entangled two-photon states were generated and whose entanglement is verified with the use of an accessible nonlinear witness.

  7. Woodward Effect Experimental Verifications

    NASA Astrophysics Data System (ADS)

    March, Paul

    2004-02-01

The work of J. F. Woodward (1990; 1996a; 1996b; 1998; 2002a; 2002b; 2004) on the existence of ``mass fluctuations'' and their use in exotic propulsion schemes was examined for possible application in improving space flight propulsion and power generation. Woodward examined Einstein's General Relativity Theory (GRT) and assumed that if the strong Machian interpretation of GRT, as well as gravitational/inertial Wheeler-Feynman radiation reaction forces, hold, then when an elementary particle is accelerated through a potential gradient, its rest mass should fluctuate around its mean value during its acceleration. Woodward also used GRT to clarify the precise experimental conditions necessary for observing and exploiting these mass fluctuations, or the ``Woodward effect'' (W-E). Later, in collaboration with his former graduate student T. Mahood, Woodward pushed the experimental verification boundaries of these proposals. If these purported mass fluctuations occur as Woodward claims, and his assumption that gravity and inertia are both byproducts of the same GRT-based phenomenon per Mach's Principle is correct, then many innovative applications such as propellantless propulsion and gravitational exotic matter generators may be feasible. This paper examines the reality of mass fluctuations and the feasibility of using the W-E to design propellantless propulsion devices in the near to mid-term future. The latest experimental results, utilizing MHD-like force rectification systems, are also presented.

  8. Cold fusion verification

    NASA Astrophysics Data System (ADS)

    North, M. H.; Mastny, G. F.; Wesley, E. J.

    1991-03-01

The objective of this work was to verify and reproduce experimental observations of Cold Nuclear Fusion (CNF), as originally reported in 1989. The method was to start with the original report and add such additional information as became available to build a set of operational electrolytic CNF cells. Verification was to be achieved by first observing cells for neutron production; for those cells that demonstrated a nuclear effect, careful calorimetric measurements were planned. The authors concluded, after laboratory experience, reading published work, talking with others in the field, and attending conferences, that CNF is probably a chimera and will go the way of N-rays and polywater. The neutron detector used for these tests was a completely packaged unit built into a metal suitcase that afforded electrostatic shielding for the detectors and self-contained electronics. It was battery-powered, although it was on charge for most of the long tests. The sensor element consists of He detectors arranged in three independent layers in a solid moderating block. The counts from each of the three layers, as well as the sum of all the detectors, were brought out and recorded separately. The neutron measurements were made with both the neutron detector and the sample tested in a cave made of thick moderating material that surrounded the two units on the sides and bottom.

  9. Signatures of nonthermal melting

    PubMed Central

    Zier, Tobias; Zijlstra, Eeuwe S.; Kalitsov, Alan; Theodonis, Ioannis; Garcia, Martin E.

    2015-01-01

    Intense ultrashort laser pulses can melt crystals in less than a picosecond but, in spite of over thirty years of active research, for many materials it is not known to what extent thermal and nonthermal microscopic processes cause this ultrafast phenomenon. Here, we perform ab-initio molecular-dynamics simulations of silicon on a laser-excited potential-energy surface, exclusively revealing nonthermal signatures of laser-induced melting. From our simulated atomic trajectories, we compute the decay of five structure factors and the time-dependent structure function. We demonstrate how these quantities provide criteria to distinguish predominantly nonthermal from thermal melting. PMID:26798822

  10. Signature CERN-URSS

    SciTech Connect

    2006-01-24

DG W. Jentschke welcomes the assembly and the guests for the signing of the protocol between CERN and the USSR, which is an important event. It was in 1955 that 55 Soviet visitors came to CERN for the first time. The first DG of CERN, F. Bloch, and Mr. Amaldi are also present. While W. Jentschke's speech in English is translated into Russian, Mr. Morozov's speech in Russian is translated into English.

  11. Electromagnetic Signature Technique as a Promising Tool to Verify Nuclear Weapons Storage and Dismantlement under a Nuclear Arms Control Regime

    SciTech Connect

    Bunch, Kyle J.; Williams, Laura S.; Jones, Anthony M.; Ramuhalli, Pradeep

    2012-08-01

    The 2010 ratification of the New START Treaty has been widely regarded as a noteworthy national security achievement for both the Obama administration and the Medvedev-Putin regime, but deeper cuts are envisioned under future arms control regimes. Future verification needs will include monitoring the storage of warhead components and fissile materials and verifying dismantlement of warheads, pits, secondaries, and other materials. From both the diplomatic and technical perspectives, verification under future arms control regimes will pose new challenges. Since acceptable verification technology must protect sensitive design information and attributes, non-nuclear non-sensitive signatures may provide a significant verification tool without the use of additional information barriers. The use of electromagnetic signatures to monitor nuclear material storage containers is a promising technology with the potential to fulfill these challenging requirements. Research performed at Pacific Northwest National Laboratory (PNNL) has demonstrated that low frequency electromagnetic signatures of sealed metallic containers can be used to confirm the presence of specific components on a “yes/no” basis without revealing classified information. Arms control inspectors might use this technique to verify the presence or absence of monitored items, including both nuclear and non-nuclear materials. Although additional research is needed to study signature aspects such as uniqueness and investigate container-specific scenarios, the technique potentially offers a rapid and cost-effective tool to verify reduction and dismantlement of U.S. and Russian nuclear weapons.
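The "yes/no" confirmation idea above can be illustrated as template matching: a measured low-frequency response is compared against an enrolled reference, and only a binary verdict is released. This is a hedged sketch; the vector representation of the signature, the normalized-correlation measure, and the threshold are assumptions made for illustration, not the PNNL technique.

```python
import math

def normalized_correlation(x, y):
    """Pearson-style normalized correlation of two response curves."""
    mx, my = sum(x) / len(x), sum(y) / len(y)
    dx = [v - mx for v in x]
    dy = [v - my for v in y]
    num = sum(a * b for a, b in zip(dx, dy))
    den = math.sqrt(sum(a * a for a in dx) * sum(b * b for b in dy))
    return num / den

def confirm_presence(measured, template, threshold=0.95):
    """Binary yes/no verdict: does the measured electromagnetic
    response match the enrolled reference closely enough?
    Only the boolean leaves the system, not the template itself."""
    return normalized_correlation(measured, template) >= threshold

# Hypothetical enrolled reference and a fresh measurement of the same container
template = [0.1, 0.4, 0.9, 0.4, 0.1]
measured = [0.12, 0.41, 0.88, 0.39, 0.11]
print(confirm_presence(measured, template))  # → True
```

Because only the thresholded verdict is exposed, the classified details encoded in the reference response need not be revealed to inspectors.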

  12. Advanced spectral signature discrimination algorithm

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Cao, Wenjie; Samat, Alim

    2013-05-01

This paper presents a novel approach to the task of hyperspectral signature analysis. Hyperspectral signature analysis has been studied extensively, and many algorithms have been developed to discriminate between hyperspectral signatures. Binary coding approaches like SPAM and SFBC use basic statistical thresholding operations to binarize a signature, which is then compared using the Hamming distance. This framework has been extended in techniques like SDFC, wherein a set of primitive structures is used to characterize local variations in a signature together with overall statistical measures like the mean. Such structures, however, harness only local variations and do not exploit any covariation of spectrally distinct parts of the signature. The approach of this research is to harvest such information by using a technique similar to circular convolution. We treat the signature as cyclic by joining its two ends, and then create two copies of the spectral signature. These three signatures can be placed next to each other like the rotating discs of a combination lock. We then find local structures at different circular shifts between the three cyclic spectral signatures. Texture features, as in SDFC, can be used to study the local structural variation at each circular shift. Different measures can then be created by building histograms over the shifts and applying different techniques for information extraction from the histograms. Depending on the technique used, different variants of the proposed algorithm are obtained. Experiments using the proposed technique show the viability of the proposed methods and their performance compared to current binary signature coding techniques.
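The SPAM-style binarization, Hamming comparison, and circular-shift comparison mentioned above can be sketched in a few lines. The toy six-band signatures and the mean-thresholding rule are illustrative assumptions, not the authors' implementation:

```python
def binarize(signature):
    """SPAM-style binary code: threshold each band against the mean."""
    mean = sum(signature) / len(signature)
    return [1 if v > mean else 0 for v in signature]

def hamming(a, b):
    """Number of positions where two binary codes differ."""
    return sum(x != y for x, y in zip(a, b))

def min_circular_hamming(a, b):
    """Best match over all circular shifts of b -- the 'rotating
    combination-lock discs' idea applied to cyclic signatures."""
    n = len(b)
    return min(hamming(a, b[k:] + b[:k]) for k in range(n))

# Hypothetical six-band reflectance signatures
grass = [0.05, 0.08, 0.45, 0.30, 0.20, 0.15]
soil = [0.10, 0.15, 0.20, 0.25, 0.30, 0.35]
print(hamming(binarize(grass), binarize(soil)))  # → 3
```

Scanning texture features at each shift, rather than taking only the minimum distance, is what lets the full method capture covariation between spectrally distant parts of the signature.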

  13. Multimodal signature modeling of humans

    NASA Astrophysics Data System (ADS)

    Cathcart, J. Michael; Kocher, Brian; Prussing, Keith; Lane, Sarah; Thomas, Alan

    2010-04-01

Georgia Tech has been investigating methods for the detection of covert personnel in traditionally difficult environments (e.g., urban, caves). This program focuses on a detailed phenomenological analysis of human physiology and signatures, with the subsequent identification and characterization of potential observables. Both aspects are needed to support the development of personnel detection and tracking algorithms. The difficult nature of these personnel-related problems dictates a multimodal sensing approach. Human signature data of sufficient quality and quantity do not exist, so an accurate signature model for a human is needed. This model should also simulate various human activities so that motion-based observables can be exploited. This paper describes a multimodal signature modeling approach that incorporates human physiological aspects, thermoregulation, and dynamics into the signature calculation. This approach permits both passive and active signatures to be modeled. The current effort focused on the computation of signatures in urban environments. This paper discusses the development of a human motion model for use in simulating both electro-optical and radar-based signatures. Video sequences of humans in a simulated urban environment are also presented, along with results using these sequences for personnel tracking.

  14. Verification hybrid control of a wheeled mobile robot and manipulator

    NASA Astrophysics Data System (ADS)

    Muszynska, Magdalena; Burghardt, Andrzej; Kurc, Krzysztof; Szybicki, Dariusz

    2016-04-01

In this article, innovative approaches to tracking control of wheeled mobile robots and a manipulator are presented. The concepts include the application of neural-fuzzy systems to compensate for the controlled system's nonlinearities in the tracking control task. The proposed control algorithms work online, contain structures that adapt to the changing operating conditions of the controlled systems, and do not require preliminary learning. The algorithms were verified on real objects: a Scorbot-ER 4pc robotic manipulator and a Pioneer 2DX mobile robot.

  15. Hard and Soft Safety Verifications

    NASA Technical Reports Server (NTRS)

    Wetherholt, Jon; Anderson, Brenda

    2012-01-01

    The purpose of this paper is to examine the differences between and the effects of hard and soft safety verifications. Initially, the terminology should be defined and clarified. A hard safety verification is datum which demonstrates how a safety control is enacted. An example of this is relief valve testing. A soft safety verification is something which is usually described as nice to have but it is not necessary to prove safe operation. An example of a soft verification is the loss of the Solid Rocket Booster (SRB) casings from Shuttle flight, STS-4. When the main parachutes failed, the casings impacted the water and sank. In the nose cap of the SRBs, video cameras recorded the release of the parachutes to determine safe operation and to provide information for potential anomaly resolution. Generally, examination of the casings and nozzles contributed to understanding of the newly developed boosters and their operation. Safety verification of SRB operation was demonstrated by examination for erosion or wear of the casings and nozzle. Loss of the SRBs and associated data did not delay the launch of the next Shuttle flight.

  16. Online Monitoring of Induction Motors

    SciTech Connect

    McJunkin, Timothy R.; Agarwal, Vivek; Lybeck, Nancy Jean

    2016-01-01

The online monitoring of active components project, under the Advanced Instrumentation, Information, and Control Technologies Pathway of the Light Water Reactor Sustainability Program, researched diagnostic and prognostic models for alternating current induction motors (IM). Idaho National Laboratory (INL) worked with the Electric Power Research Institute (EPRI) to augment and revise the fault signatures previously implemented in the Asset Fault Signature Database of EPRI's Fleet Wide Prognostic and Health Management (FW PHM) Suite software. Induction motor diagnostic models were researched using experimental data collected by Idaho State University. Prognostic models were explored in the literature and through a limited experiment with a 40 HP motor to seed the Remaining Useful Life Database of the FW PHM Suite.

  17. Identifying, Visualizing, and Fusing Social Media Data to Support Nonproliferation and Arms Control Treaty Verification: Preliminary Results

    SciTech Connect

    Gastelum, Zoe N.; Cramer, Nicholas O.; Benz, Jacob M.; Kreyling, Sean J.; Henry, Michael J.; Corley, Courtney D.; Whattam, Kevin M.

    2013-07-11

    While international nonproliferation and arms control verification capabilities have their foundations in physical and chemical sensors, state declarations, and on-site inspections, verification experts are beginning to consider the importance of open source data to complement and support traditional means of verification. One of those new, and increasingly expanding, sources of open source information is social media, which can be ingested and understood through social media analytics (SMA). Pacific Northwest National Laboratory (PNNL) is conducting research to further our ability to identify, visualize, and fuse social media data to support nonproliferation and arms control treaty verification efforts. This paper will describe our preliminary research to examine social media signatures of nonproliferation or arms control proxy events. We will describe the development of our preliminary nonproliferation and arms control proxy events, outline our initial findings, and propose ideas for future work.

  18. Signature CERN-URSS

    ScienceCinema

    None

    2016-07-12

DG W. Jentschke welcomes the assembly and the guests for the signing of the protocol between CERN and the USSR, which is an important event. It was in 1955 that 55 Soviet visitors came to CERN for the first time. The first DG of CERN, F. Bloch, and Mr. Amaldi are also present. While W. Jentschke's speech in English is translated into Russian, Mr. Morozov's speech in Russian is translated into English.

  19. Signatures of aging revisited

    SciTech Connect

    Drell, S.; Jeanloz, R.; Cornwall, J.; Dyson, F.; Eardley, D.

    1998-03-18

This study is a follow-on to the review made by JASON during its 1997 Summer Study of what is known about the aging of critical constituents, particularly the high explosives, metals (Pu, U), and polymers in the enduring stockpile. The JASON report (JSR-97-320) that summarized the findings was based on briefings by the three weapons labs (LANL, LLNL, SNL). They presented excellent technical analyses covering a broad range of scientific and engineering problems pertaining to determining signatures of aging. But the report also noted: `Missing, however, from the briefings and the written documents made available to us by the labs and DOE, was evidence of an adequately sharp focus and high priorities on a number of essential near-term needs of maintaining weapons in the stockpile.'

  20. Landsat Signature Development Program

    NASA Technical Reports Server (NTRS)

    Hall, R. N.; Mcguire, K. G.; Bland, R. A.

    1976-01-01

    The Landsat Signature Development Program, LSDP, is designed to produce an unsupervised classification of a scene from a Landsat tape. This classification is based on the clustering tendencies of the multispectral scanner data processed from the scene. The program will generate a character map that, by identifying each of the general classes of surface features extracted from the scene data with a specific line printer symbol, indicates the approximate locations and distributions of these general classes within the scene. Also provided with the character map are a number of tables each of which describes either some aspect of the spectral properties of the resultant classes, some inter-class relationship, the incidence of picture elements assigned to the various classes in the character map classification of the scene, or some significant intermediate stage in the development of the final classes.
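The LSDP-style pipeline, clustering multispectral pixels without supervision and then rendering each pixel as the line-printer symbol of its class, can be sketched as follows. The tiny k-means, the two-band toy scene, and the symbol set are assumptions made for illustration, not the original program:

```python
import random

def kmeans(pixels, k, iters=20, seed=0):
    """Minimal k-means over multispectral pixel vectors (tuples)."""
    rng = random.Random(seed)
    centroids = rng.sample(pixels, k)
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in pixels:
            i = min(range(k),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))
            clusters[i].append(p)
        for j, members in enumerate(clusters):
            if members:  # keep old centroid if a cluster empties out
                centroids[j] = tuple(sum(band) / len(members)
                                     for band in zip(*members))
    return centroids

def character_map(scene, centroids, symbols=".*+AB"):
    """Render each pixel as the printer symbol of its nearest class."""
    return ["".join(
        symbols[min(range(len(centroids)),
                    key=lambda j: sum((a - b) ** 2 for a, b in zip(p, centroids[j])))]
        for p in row) for row in scene]

# Toy 2-band "scene": low-reflectance (water-like) vs. bright (vegetation-like)
scene = [[(0.1, 0.1), (0.1, 0.2), (0.8, 0.9)],
         [(0.2, 0.1), (0.9, 0.8), (0.9, 0.9)]]
cents = kmeans([p for row in scene for p in row], k=2)
for line in character_map(scene, cents):
    print(line)
```

Each printed row is one scan line of the character map; which symbol lands on which class depends on centroid initialization, as in any unsupervised labeling.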

  1. Multisensors signature prediction workbench

    NASA Astrophysics Data System (ADS)

    Latger, Jean; Cathala, Thierry

    2015-10-01

Guidance of weapon systems relies on sensors to analyze target signatures. Defense weapon systems also need to detect and then identify threats, again using sensors. Sensor performance is highly dependent on conditions, e.g. time of day, atmospheric propagation, background ... Visible cameras are very efficient in diurnal fine-weather conditions, long-wave infrared sensors for night vision, and radar systems for seeing through the atmosphere and/or foliage ... Moreover, multi-sensor systems, combining several collocated sensors with associated fusion algorithms, provide better efficiency (typically for Enhanced Vision Systems). But these sophisticated systems are all the more difficult to conceive, assess, and qualify. In that frame, multi-sensor simulation is highly valuable. This paper focuses on multi-sensor simulation tools. The first part surveys the state of the art of such simulation workbenches, with a special focus on SE-Workbench. SE-Workbench is described with regard to infrared/EO sensors, millimeter-wave sensors, active EO sensors, and GNSS sensors. Then a general overview of the objectives of simulating target and background signatures is presented, depending on the type of simulation required (parametric studies, open-loop simulation, closed-loop simulation, hybridization of SW simulation and HW ...). After this review of objectives, the paper presents some basic requirements for simulation implementation, such as deterministic behavior, mandatory for repeating a simulation many times in parametric studies... Several technical topics are then discussed, such as the rendering technique (ray tracing vs. rasterization), the implementation (CPU vs. GPGPU), and the tradeoff between physical accuracy and computational performance. Examples of results using SE-Workbench are shown and discussed.

  2. Signatures of dark matter

    NASA Astrophysics Data System (ADS)

    Baltz, Edward Anthony

    It is well known that most of the mass in the universe remains unobserved save for its gravitational effect on luminous matter. The nature of this ``dark matter'' remains a mystery. From measurements of the primordial deuterium abundance, the theory of big bang nucleosynthesis predicts that there are not enough baryons to account for the amount of dark matter observed, thus the missing mass must take an exotic form. Several promising candidates have been proposed. In this work I will describe my research along two main lines of inquiry into the dark matter puzzle. The first possibility is that the dark matter is exotic massive particles, such as those predicted by supersymmetric extensions to the standard model of particle physics. Such particles are generically called WIMPs, for weakly interacting massive particles. Focusing on the so-called neutralino in supersymmetric models, I discuss the possible signatures of such particles, including their direct detection via nuclear recoil experiments and their indirect detection via annihilations in the halos of galaxies, producing high energy antiprotons, positrons and gamma rays. I also discuss signatures of the possible slow decays of such particles. The second possibility is that there is a population of black holes formed in the early universe. Any dark objects in galactic halos, black holes included, are called MACHOs, for massive compact halo objects. Such objects can be detected by their gravitational microlensing effects. Several possibilities for sources of baryonic dark matter are also interesting for gravitational microlensing. These include brown dwarf stars and old, cool white dwarf stars. I discuss the theory of gravitational microlensing, focusing on the technique of pixel microlensing. I make predictions for several planned microlensing experiments with ground based and space based telescopes. Furthermore, I discuss binary lenses in the context of pixel microlensing. 
Finally, I develop a new technique for

  3. Signatures of AGN feedback

    NASA Astrophysics Data System (ADS)

    Wylezalek, D.; Zakamska, N.

    2016-06-01

Feedback from active galactic nuclei (AGN) is widely considered to be the main driver in regulating the growth of massive galaxies. It operates by either heating the gas that would otherwise be available for star formation or driving it out of the galaxy, preventing further increase in stellar mass. Observational proof for this scenario has, however, been hard to come by. We have assembled a large sample of 133 radio-quiet type-2 and red AGN at 0.1 < z < 1. We find that AGN with stronger outflow signatures are hosted in galaxies that are more `quenched', given their stellar mass, than galaxies with weaker outflow signatures. This correlation is only seen in AGN host galaxies with SFR > 100 M_{⊙} yr^{-1}, where presumably the coupling of the AGN-driven wind to the gas is strongest. This observation is consistent with the AGN having a net suppressive, or `negative', impact through feedback on the galaxies' star formation history.

  4. Structural verification for GAS experiments

    NASA Technical Reports Server (NTRS)

    Peden, Mark Daniel

    1992-01-01

    The purpose of this paper is to assist the Get Away Special (GAS) experimenter in conducting a thorough structural verification of its experiment structural configuration, thus expediting the structural review/approval process and the safety process in general. Material selection for structural subsystems will be covered with an emphasis on fasteners (GSFC fastener integrity requirements) and primary support structures (Stress Corrosion Cracking requirements and National Space Transportation System (NSTS) requirements). Different approaches to structural verifications (tests and analyses) will be outlined especially those stemming from lessons learned on load and fundamental frequency verification. In addition, fracture control will be covered for those payloads that utilize a door assembly or modify the containment provided by the standard GAS Experiment Mounting Plate (EMP). Structural hazard assessment and the preparation of structural hazard reports will be reviewed to form a summation of structural safety issues for inclusion in the safety data package.

  5. Technical challenges for dismantlement verification

    SciTech Connect

    Olinger, C.T.; Stanbro, W.D.; Johnston, R.G.; Nakhleh, C.W.; Dreicer, J.S.

    1997-11-01

    In preparation for future nuclear arms reduction treaties, including any potential successor treaties to START I and II, the authors have been examining possible methods for bilateral warhead dismantlement verification. Warhead dismantlement verification raises significant challenges in the political, legal, and technical arenas. This discussion will focus on the technical issues raised by warhead arms controls. Technical complications arise from several sources. These will be discussed under the headings of warhead authentication, chain-of-custody, dismantlement verification, non-nuclear component tracking, component monitoring, and irreversibility. The authors will discuss possible technical options to address these challenges as applied to a generic dismantlement and disposition process, in the process identifying limitations and vulnerabilities. They expect that these considerations will play a large role in any future arms reduction effort and, therefore, should be addressed in a timely fashion.

  6. Index of Spectrum Signature Data

    DTIC Science & Technology

    1985-05-01

Frederick Research Corporation, Alexandria, VA 163 AN/APG-030 Radar Receiver Measurements, Electromagnetic Compatibility Analysis Center, US Navy Marine ... Electromagnetic Compatibility Characteristics of the MK 86 Gun Fire Control System, Naval Weapons Lab, Dahlgren, VA 501 Partial Spectrum Signature... ECAC-I-IO-(SS) DEPARTMENT OF DEFENSE Electromagnetic Compatibility Analysis Center, Annapolis, Maryland 21402 INDEX OF SPECTRUM SIGNATURE DATA

  7. Cell short circuit, preshort signature

    NASA Technical Reports Server (NTRS)

    Lurie, C.

    1980-01-01

    Short-circuit events observed in ground test simulations of DSCS-3 battery in-orbit operations are analyzed. Voltage signatures appearing in the data preceding the short-circuit event are evaluated. The ground test simulation is briefly described along with performance during reconditioning discharges. Results suggest that a characteristic signature develops prior to a shorting event.
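As an illustration of signature-based early warning (not the analysis actually performed on the DSCS-3 ground-test data), a rolling-baseline detector can flag samples whose cell voltage drops well below the recent mean. The window size, drop threshold, and voltage trace below are invented for the sketch:

```python
def preshort_flags(voltages, window=5, drop=0.05):
    """Flag samples whose voltage falls more than `drop` volts below
    the mean of the preceding `window` samples -- a simple stand-in
    for a characteristic pre-short voltage signature."""
    flags = []
    for i, v in enumerate(voltages):
        if i < window:
            flags.append(False)  # not enough history for a baseline yet
            continue
        baseline = sum(voltages[i - window:i]) / window
        flags.append(baseline - v > drop)
    return flags

# Hypothetical cell-voltage trace (volts): steady, then a sudden sag
trace = [1.25, 1.25, 1.24, 1.25, 1.24, 1.24, 1.10, 1.05]
print(preshort_flags(trace))  # → [False]*6 + [True, True]
```

In practice the window and threshold would be tuned against traces from cells that later shorted, so that the flag fires before the event rather than merely during it.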

  8. Formal verification of mathematical software

    NASA Technical Reports Server (NTRS)

    Sutherland, D.

    1984-01-01

    Methods are investigated for formally specifying and verifying the correctness of mathematical software (software which uses floating point numbers and arithmetic). Previous work in the field was reviewed. A new model of floating point arithmetic called the asymptotic paradigm was developed and formalized. Two different conceptual approaches to program verification, the classical Verification Condition approach and the more recently developed Programming Logic approach, were adapted to use the asymptotic paradigm. These approaches were then used to verify several programs; the programs chosen were simplified versions of actual mathematical software.

  9. CHEMICAL INDUCTION MIXER VERIFICATION - ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Wet-Weather Flow Technologies Pilot of the Environmental Technology Verification (ETV) Program, which is supported by the U.S. Environmental Protection Agency and facilitated by NSF International, has recently evaluated the performance of chemical induction mixers used for di...

  10. Working Memory Mechanism in Proportional Quantifier Verification

    ERIC Educational Resources Information Center

    Zajenkowski, Marcin; Szymanik, Jakub; Garraffa, Maria

    2014-01-01

    The paper explores the cognitive mechanisms involved in the verification of sentences with proportional quantifiers (e.g. "More than half of the dots are blue"). The first study shows that the verification of proportional sentences is more demanding than the verification of sentences such as: "There are seven blue and eight yellow…
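The verification task studied above is simple to state computationally, which is what makes the cognitive cost of proportional quantifiers interesting. A minimal sketch of checking "more than half" over a scene of colored dots (the counts are illustrative, not the study's stimuli):

```python
def more_than_half(objects, predicate):
    """Verify a proportional quantifier: |{x : P(x)}| > |domain| / 2.
    Uses hits * 2 > total to avoid fractional comparison."""
    hits = sum(1 for x in objects if predicate(x))
    return hits * 2 > len(objects)

# A scene of 8 blue and 7 yellow dots
dots = ["blue"] * 8 + ["yellow"] * 7
print(more_than_half(dots, lambda d: d == "blue"))  # → True
```

Unlike numerical quantifiers ("there are seven blue dots"), this check requires maintaining two running counts and comparing them, which is one proposed source of the extra working-memory load.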

  11. Online Learning

    ERIC Educational Resources Information Center

    Perry, Edward H.; Pilati, Michelle L.

    2011-01-01

    Distance education, which began as correspondence courses in the nineteenth century and grew into educational television during the twentieth century, evolved into learning on the Web by the mid-1990s. Accompanying the rise in online learning has been a similar rise in organizations and publications dedicated to serving the needs of online…

  12. Online 1990.

    ERIC Educational Resources Information Center

    Goldstein, Morris

    This paper examines the co-existence of online and CD-ROM technologies in terms of their existing pricing structures, marketing strategies, functionality, and future roles. "Fixed Price Unlimited Usage" (FPUU) pricing and flat-rate pricing are discussed as viable alternatives to current pricing practices. In addition, it is argued that the…

  13. Automated verification system user's guide

    NASA Technical Reports Server (NTRS)

    Hoffman, R. H.

    1972-01-01

    Descriptions of the operational requirements for all of the programs of the Automated Verification System (AVS) are provided. The AVS programs are: (1) FORTRAN code analysis and instrumentation program (QAMOD); (2) Test Effectiveness Evaluation Program (QAPROC); (3) Transfer Control Variable Tracking Program (QATRAK); (4) Program Anatomy Table Generator (TABGEN); and (5) Network Path Analysis Program (RAMBLE).

  14. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-07-16

This paper will explore the difficulties of deep reductions by examining the technical verification challenges. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000 (Pifer 2010). Further reductions will include stepping stones at 100’s of warheads, and then 10’s of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national lab complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  15. Signatures of AGN feedback

    NASA Astrophysics Data System (ADS)

    Wylezalek, Dominika; Zakamska, Nadia L.; MaNGA-GMOS Team

    2017-01-01

Feedback from actively accreting SMBHs (Active Galactic Nuclei, AGN) is now widely considered to be the main driver in regulating the growth of massive galaxies. Observational proof for this scenario has, however, been hard to come by. Many attempts at finding a conclusive observational proof that AGN may be able to quench star formation and regulate the host galaxies' growth have shown that this problem is highly complex. I will present results from several projects that focus on understanding the power, reach and impact of feedback processes exerted by AGN. I will describe recent efforts in our group to relate feedback signatures to the specific star formation rate in their host galaxies, where our results are consistent with the AGN having a `negative' impact through feedback on the galaxies' star formation history (Wylezalek+2016a,b). Furthermore, I will show that powerful AGN-driven winds can be easily hidden and not be apparent in the integrated spectrum of the galaxy. This implies that large IFU surveys, such as the SDSS-IV MaNGA survey, might uncover many previously unknown AGN and outflows that are potentially very relevant for understanding the role of AGN in galaxy evolution (Wylezalek+2016c).

  16. Statistical clumped isotope signatures

    PubMed Central

    Röckmann, T.; Popa, M. E.; Krol, M. C.; Hofmann, M. E. G.

    2016-01-01

    High precision measurements of molecules containing more than one heavy isotope may provide novel constraints on element cycles in nature. These so-called clumped isotope signatures are reported relative to the random (stochastic) distribution of heavy isotopes over all available isotopocules of a molecule, which is the conventional reference. When multiple indistinguishable atoms of the same element are present in a molecule, this reference is calculated from the bulk (≈average) isotopic composition of the involved atoms. We show here that this referencing convention leads to apparent negative clumped isotope anomalies (anti-clumping) when the indistinguishable atoms originate from isotopically different populations. Such statistical clumped isotope anomalies must occur in any system where two or more indistinguishable atoms of the same element, but with different isotopic composition, combine in a molecule. The size of the anti-clumping signal is closely related to the difference of the initial isotope ratios of the indistinguishable atoms that have combined. Therefore, a measured statistical clumped isotope anomaly, relative to an expected (e.g. thermodynamical) clumped isotope composition, may allow assessment of the heterogeneity of the isotopic pools of atoms that are the substrate for formation of molecules. PMID:27535168
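
The mixing effect described above can be reproduced with a few lines of arithmetic. The sketch below is a toy model, assuming a symmetric molecule X2 built by pairing one atom from each of two pools with heavy-isotope fractions p1 and p2; by the AM-GM inequality, p1·p2 is below the square of the bulk average whenever the pools differ, which is exactly the apparent anti-clumping signal:

```python
def apparent_clumping(p1, p2):
    """Apparent clumped-isotope anomaly (as a fraction) for a symmetric
    molecule X2 whose two atoms come from pools with heavy-isotope
    fractions p1 and p2, referenced to the stochastic distribution
    computed from the bulk (average) composition."""
    observed_hh = p1 * p2            # doubly-substituted abundance
    bulk = (p1 + p2) / 2.0           # bulk heavy-isotope fraction
    stochastic_hh = bulk ** 2        # conventional stochastic reference
    return observed_hh / stochastic_hh - 1.0

print(apparent_clumping(0.010, 0.012))   # negative: apparent anti-clumping
print(apparent_clumping(0.011, 0.011))   # ~0: identical pools, no anomaly
```

The anomaly vanishes only when the two pools have identical composition, matching the paper's statement that the signal size tracks the difference between the initial isotope ratios.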

  17. Intrusion detection using secure signatures

    DOEpatents

    Nelson, Trent Darnel; Haile, Jedediah

    2014-09-30

A method and device for intrusion detection using secure signatures, comprising capturing network data. A search hash value, employing at least one one-way function, is generated from the captured network data using a first hash function. The presence of a search hash value match in a secure signature table comprising search hash values and encrypted rules is determined. After determining a search hash value match, a decryption key is generated from the captured network data using a second hash function, different from the first hash function. One or more of the encrypted rules of the secure signature table having a hash value equal to the generated search hash value are then decrypted using the generated decryption key. The one or more decrypted secure signature rules are then processed for a match, and one or more user notifications are deployed if a match is identified.
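
The flow described in the abstract can be sketched in a few lines of Python. This is an illustrative reconstruction, not the patented implementation: domain-separated SHA-256 stands in for the two distinct hash functions, and repeating-key XOR stands in for the rule cipher (a real system would use an authenticated cipher):

```python
import hashlib

def h_search(data: bytes) -> bytes:
    """First one-way function: produces the table lookup value."""
    return hashlib.sha256(b"search:" + data).digest()

def h_key(data: bytes) -> bytes:
    """Second, different hash function: derives the decryption key."""
    return hashlib.sha256(b"key:" + data).digest()

def xor_crypt(key: bytes, blob: bytes) -> bytes:
    # Stand-in cipher for illustration only.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(blob))

# Build the secure signature table: the rule is stored encrypted under a
# key derived from the very traffic that triggers it.
trigger = b"GET /evil"
table = {h_search(trigger): xor_crypt(h_key(trigger), b"ALERT: known exploit path")}

def inspect_packet(captured: bytes):
    entry = table.get(h_search(captured))
    if entry is None:
        return None                                    # no signature match
    return xor_crypt(h_key(captured), entry).decode()  # rule revealed only now

print(inspect_packet(b"GET /evil"))   # decrypted rule text
print(inspect_packet(b"GET /ok"))     # None
```

The point of the construction is that rules stay encrypted at rest and become readable only when matching traffic is actually observed.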

  18. Ballistic signature identification systems study

    NASA Technical Reports Server (NTRS)

    Reich, A.; Hine, T. L.

    1976-01-01

    The results are described of an attempt to establish a uniform procedure for documenting (recording) expended bullet signatures as effortlessly as possible and to build a comprehensive library of these signatures in a form that will permit the automated comparison of a new suspect bullet with the prestored library. The ultimate objective is to achieve a standardized format that will permit nationwide interaction between police departments, crime laboratories, and other interested law enforcement agencies.

  19. Hybrid Enrichment Assay Methods for a UF6 Cylinder Verification Station: FY10 Progress Report

    SciTech Connect

    Smith, Leon E.; Jordan, David V.; Orton, Christopher R.; Misner, Alex C.; Mace, Emily K.

    2010-08-01

    Pacific Northwest National Laboratory (PNNL) is developing the concept of an automated UF6 cylinder verification station that would be located at key measurement points to positively identify each cylinder, measure its mass and enrichment, store the collected data in a secure database, and maintain continuity of knowledge on measured cylinders until the arrival of International Atomic Energy Agency (IAEA) inspectors. At the center of this unattended system is a hybrid enrichment assay technique that combines the traditional enrichment-meter method (based on the 186 keV peak from 235U) with non-traditional neutron-induced high-energy gamma-ray signatures (spawned primarily by 234U alpha emissions and 19F(alpha, neutron) reactions). Previous work by PNNL provided proof-of-principle for the non-traditional signatures to support accurate, full-volume interrogation of the cylinder enrichment, thereby reducing the systematic uncertainties in enrichment assay due to UF6 heterogeneity and providing greater sensitivity to material substitution scenarios. The work described here builds on that preliminary evaluation of the non-traditional signatures, but focuses on a prototype field system utilizing NaI(Tl) and LaBr3(Ce) spectrometers, and enrichment analysis algorithms that integrate the traditional and non-traditional signatures. Results for the assay of Type-30B cylinders ranging from 0.2 to 4.95 wt% 235U, at an AREVA fuel fabrication plant in Richland, WA, are described for the following enrichment analysis methods: 1) the traditional enrichment-meter signature (186 keV peak), calculated using a square-wave convolute (SWC) algorithm; 2) the non-traditional high-energy gamma-ray signature, which provides neutron detection without neutron detectors; and 3) a hybrid algorithm that merges the traditional and non-traditional signatures.
Uncertainties for each method, relative to the declared enrichment for each cylinder, are calculated and compared to the uncertainties from an attended
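
The square-wave convolute used for the traditional 186 keV enrichment-meter signature is, at its core, a zero-area filter that returns the net peak area while a linear continuum cancels. A minimal sketch on synthetic data (channel numbers, peak shape, and window width are illustrative assumptions, not the PNNL system's actual parameters):

```python
import numpy as np

def swc_net_area(spectrum, c, w):
    """Zero-area square-wave filter: weight +1 over a w-channel peak
    window and -1 over w/2-channel flanks on each side, so any linear
    continuum cancels and the response is the net peak area."""
    h = w // 2
    peak = spectrum[c - h : c + h].sum()
    flanks = spectrum[c - 2 * h : c - h].sum() + spectrum[c + h : c + 2 * h].sum()
    return peak - flanks

# Synthetic spectrum: linear continuum plus a Gaussian peak placed (for
# readability) at channel 186, standing in for the 186 keV line.
ch = np.arange(512)
continuum = 2000.0 - 2.0 * ch
peak = 5000.0 * np.exp(-0.5 * ((ch - 186) / 3.0) ** 2)
spec = continuum + peak

print(swc_net_area(spec, 186, 24))        # ~ net peak counts (large, positive)
print(swc_net_area(continuum, 186, 24))   # ~ 0: linear continuum cancels
```

Because the flank windows are symmetric about the peak window and have the same total width, any continuum of the form a + b·channel contributes zero to the filter output.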

  20. NEXT Thruster Component Verification Testing

    NASA Technical Reports Server (NTRS)

    Pinero, Luis R.; Sovey, James S.

    2007-01-01

    Component testing is a critical part of thruster life validation activities under NASA's Evolutionary Xenon Thruster (NEXT) project. The high voltage propellant isolators were selected for design verification testing. Even though they are based on a heritage design, design changes were made because the isolators will be operated under different environmental conditions including temperature, voltage, and pressure. The life test of two NEXT isolators was therefore initiated and has accumulated more than 10,000 hr of operation. Measurements to date indicate only a negligibly small increase in leakage current. The cathode heaters were also selected for verification testing. The technology to fabricate these heaters, developed for the International Space Station plasma contactor hollow cathode assembly, was transferred to Aerojet for the fabrication of the NEXT prototype model ion thrusters. Testing the contractor-fabricated heaters is necessary to validate fabrication processes for high reliability heaters. This paper documents the status of the propellant isolator and cathode heater tests.

  1. Ontology Matching with Semantic Verification

    PubMed Central

    Jean-Mary, Yves R.; Shironoshita, E. Patrick; Kabuka, Mansur R.

    2009-01-01

    ASMOV (Automated Semantic Matching of Ontologies with Verification) is a novel algorithm that uses lexical and structural characteristics of two ontologies to iteratively calculate a similarity measure between them, derives an alignment, and then verifies it to ensure that it does not contain semantic inconsistencies. In this paper, we describe the ASMOV algorithm, and then present experimental results that measure its accuracy using the OAEI 2008 tests, and that evaluate its use with two different thesauri: WordNet, and the Unified Medical Language System (UMLS). These results show the increased accuracy obtained by combining lexical, structural and extensional matchers with semantic verification, and demonstrate the advantage of using a domain-specific thesaurus for the alignment of specialized ontologies. PMID:20186256

  2. Verification and transparency in future arms control

    SciTech Connect

    Pilat, J.F.

    1996-09-01

    Verification's importance has changed dramatically over time, although it has always been in the forefront of arms control. The goals and measures of verification and the criteria for success have changed with the times as well, reflecting such factors as the centrality of the prospective agreement to East-West relations during the Cold War, the state of relations between the United States and the Soviet Union, and the technologies available for monitoring. Verification's role may be declining in the post-Cold War period. The prospects for such a development will depend, first and foremost, on the high costs of traditional arms control, especially those associated with requirements for verification. Moreover, the growing interest in informal, or non-negotiated, arms control does not allow for verification provisions, by the very nature of these arrangements. Multilateral agreements are also becoming more prominent and argue against highly effective verification measures, in part because of fears of promoting proliferation by opening sensitive facilities to inspectors from potential proliferant states. As a result, it is likely that transparency and confidence-building measures will achieve greater prominence, both as supplements to and substitutes for traditional verification. Such measures are not panaceas and do not offer all that we came to expect from verification during the Cold War. But they may be the best possible means to deal with current problems of arms reductions and restraints at acceptable levels of expenditure.

  3. Crowd-Sourced Program Verification

    DTIC Science & Technology

    2012-12-01

    Approved by Robert L. Kaminski, Work Unit Manager, and Warren H. Debany, Jr., Technical Advisor, Information Exploitation & Operations. During the investigation, the contractor constructed a prototype of a crowd-sourced verification system that takes as input a given program and produces as output a

  4. Structural System Identification Technology Verification

    DTIC Science & Technology

    1981-11-01

    USAAVRADCOM-TR-81-D-28. Structural System Identification Technology Verification. N. Giansante, A. Berman, W. O. Flannelly, et al. Approved for public release; distribution unlimited. Prepared for the Applied Technology Laboratory, U.S. Army Research and Technology Laboratories (AVRADCOM), Fort Eustis, Va. 23604. Applied Technology Laboratory position statement: The Applied Technology Laboratory has been involved in the development of the Struc

  5. Formal verification of AI software

    NASA Technical Reports Server (NTRS)

    Rushby, John; Whitehurst, R. Alan

    1989-01-01

    The application of formal verification techniques to Artificial Intelligence (AI) software, particularly expert systems, is investigated. Constraint satisfaction and model inversion are identified as two formal specification paradigms for different classes of expert systems. A formal definition of consistency is developed, and the notion of approximate semantics is introduced. Examples are given of how these ideas can be applied in both declarative and imperative forms.

  6. Quantum messages with signatures forgeable in arbitrated quantum signature schemes

    NASA Astrophysics Data System (ADS)

    Kim, Taewan; Choi, Jeong Woon; Jho, Nam-Su; Lee, Soojoon

    2015-02-01

    Even though a method to perfectly sign quantum messages is not known, the arbitrated quantum signature scheme has been considered one of the good candidates. However, its forgery problem has been an obstacle to the scheme becoming a successful method. In this paper, we consider a situation slightly different from the forgery problem: checking whether at least one quantum message with signature can be forged in a given scheme, even if not all messages can be forged. If there are only a finite number of forgeable quantum messages in the scheme, then the scheme can be secured against the forgery attack by not sending forgeable quantum messages, and so our situation does not directly imply that we check whether the scheme is secure against the attack. However, if users run a given scheme without any consideration of forgeable quantum messages, then a sender might transmit such forgeable messages to a receiver, and in such a case an attacker can forge the messages if the attacker knows them. Thus it is important and necessary to look into forgeable quantum messages. We show here that there always exists such a forgeable quantum message-signature pair for every known scheme with quantum encryption and rotation, and numerically show that there are no forgeable quantum message-signature pairs that exist in an arbitrated quantum signature scheme.

  7. Simulating realistic predator signatures in quantitative fatty acid signature analysis

    USGS Publications Warehouse

    Bromaghin, Jeffrey F.

    2015-01-01

    Diet estimation is an important field within quantitative ecology, providing critical insights into many aspects of ecology and community dynamics. Quantitative fatty acid signature analysis (QFASA) is a prominent method of diet estimation, particularly for marine mammal and bird species. Investigators using QFASA commonly use computer simulation to evaluate statistical characteristics of diet estimators for the populations they study. Similar computer simulations have been used to explore and compare the performance of different variations of the original QFASA diet estimator. In both cases, computer simulations involve bootstrap sampling prey signature data to construct pseudo-predator signatures with known properties. However, bootstrap sample sizes have been selected arbitrarily and pseudo-predator signatures therefore may not have realistic properties. I develop an algorithm to objectively establish bootstrap sample sizes that generates pseudo-predator signatures with realistic properties, thereby enhancing the utility of computer simulation for assessing QFASA estimator performance. The algorithm also appears to be computationally efficient, resulting in bootstrap sample sizes that are smaller than those commonly used. I illustrate the algorithm with an example using data from Chukchi Sea polar bears (Ursus maritimus) and their marine mammal prey. The concepts underlying the approach may have value in other areas of quantitative ecology in which bootstrap samples are post-processed prior to their use.
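
The bootstrap construction of pseudo-predator signatures can be sketched as below. The prey library and diet proportions are hypothetical, and `n_boot` is the per-species bootstrap sample size whose objective selection is the subject of the paper (chosen arbitrarily here):

```python
import numpy as np

rng = np.random.default_rng(7)

# Hypothetical prey library: rows are sampled signatures (proportions of
# three fatty acids), one array per prey species.
prey_sigs = {
    "cod":  rng.dirichlet([5.0, 3.0, 2.0], size=40),
    "seal": rng.dirichlet([2.0, 2.0, 6.0], size=40),
}

def pseudo_predator(prey_sigs, diet, n_boot):
    """Bootstrap-sample each prey type's signatures, average the resampled
    rows, and mix the means by the known diet proportions."""
    n_fa = next(iter(prey_sigs.values())).shape[1]
    mix = np.zeros(n_fa)
    for species, proportion in diet.items():
        sigs = prey_sigs[species]
        idx = rng.integers(0, len(sigs), size=n_boot[species])
        mix += proportion * sigs[idx].mean(axis=0)
    return mix / mix.sum()              # a signature sums to one

sig = pseudo_predator(prey_sigs,
                      diet={"cod": 0.7, "seal": 0.3},
                      n_boot={"cod": 25, "seal": 25})
print(sig)      # pseudo-predator signature with known diet properties
```

Because the diet used to build the pseudo-predator is known, the resulting signature can be fed back through a QFASA estimator to measure estimator bias and variance.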

  8. Nuclear Data Verification and Standardization

    SciTech Connect

    Karam, Lisa R.; Arif, Muhammad; Thompson, Alan K.

    2011-10-01

    The objective of this interagency program is to provide accurate neutron interaction verification and standardization data for the U.S. Department of Energy Division of Nuclear Physics programs which include astrophysics, radioactive beam studies, and heavy-ion reactions. The measurements made in this program are also useful to other programs that indirectly use the unique properties of the neutron for diagnostic and analytical purposes. These include homeland security, personnel health and safety, nuclear waste disposal, treaty verification, national defense, and nuclear based energy production. The work includes the verification of reference standard cross sections and related neutron data employing the unique facilities and capabilities at NIST and other laboratories as required; leadership and participation in international intercomparisons and collaborations; and the preservation of standard reference deposits. An essential element of the program is critical evaluation of neutron interaction data standards including international coordinations. Data testing of critical data for important applications is included. The program is jointly supported by the Department of Energy and the National Institute of Standards and Technology.

  9. Regression Verification Using Impact Summaries

    NASA Technical Reports Server (NTRS)

    Backes, John; Person, Suzette J.; Rungta, Neha; Thachuk, Oksana

    2013-01-01

    Regression verification techniques are used to prove equivalence of syntactically similar programs. Checking equivalence of large programs, however, can be computationally expensive. Existing regression verification techniques rely on abstraction and decomposition techniques to reduce the computational effort of checking equivalence of the entire program. These techniques are sound but not complete. In this work, we propose a novel approach to improve scalability of regression verification by classifying the program behaviors generated during symbolic execution as either impacted or unimpacted. Our technique uses a combination of static analysis and symbolic execution to generate summaries of impacted program behaviors. The impact summaries are then checked for equivalence using an off-the-shelf decision procedure. We prove that our approach is both sound and complete for sequential programs, with respect to the depth bound of symbolic execution. Our evaluation on a set of sequential C artifacts shows that reducing the size of the summaries can help reduce the cost of software equivalence checking. Various reduction, abstraction, and compositional techniques have been developed to help scale software verification techniques to industrial-sized systems. Although such techniques have greatly increased the size and complexity of systems that can be checked, analysis of large software systems remains costly. Regression analysis techniques, e.g., regression testing [16], regression model checking [22], and regression verification [19], restrict the scope of the analysis by leveraging the differences between program versions. These techniques are based on the idea that if code is checked early in development, then subsequent versions can be checked against a prior (checked) version, leveraging the results of the previous analysis to reduce analysis cost of the current version. 
Regression verification addresses the problem of proving equivalence of closely related program

  10. Earthquake Forecasting, Validation and Verification

    NASA Astrophysics Data System (ADS)

    Rundle, J.; Holliday, J.; Turcotte, D.; Donnellan, A.; Tiampo, K.; Klein, B.

    2009-05-01

    Techniques for earthquake forecasting are in development using both seismicity data mining methods, as well as numerical simulations. The former rely on the development of methods to recognize patterns in data, while the latter rely on the use of dynamical models that attempt to faithfully replicate the actual fault systems. Testing such forecasts is necessary not only to determine forecast quality, but also to improve forecasts. A large number of techniques to validate and verify forecasts have been developed for weather and financial applications. Many of these have been elaborated in public locations, including, for example, the URL as listed below. Typically, the goal is to test for forecast resolution, reliability and sharpness. A good forecast is characterized by consistency, quality and value. Most, if not all of these forecast verification procedures can be readily applied to earthquake forecasts as well. In this talk, we discuss both methods of forecasting, as well as validation and verification using a number of these standard methods. We show how these test methods might be useful for both fault-based forecasting, a group of forecast methods that includes the WGCEP and simulator-based renewal models, and grid-based forecasting, which includes the Relative Intensity, Pattern Informatics, and smoothed seismicity methods. We find that applying these standard methods of forecast verification is straightforward. Judgments about the quality of a given forecast method can often depend on the test applied, as well as on the preconceptions and biases of the persons conducting the tests.
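
As one concrete instance of the standard verification toolkit mentioned above, the Brier score grades probabilistic forecasts of binary events, with the climatological base rate as a natural benchmark (numbers are illustrative only):

```python
import numpy as np

def brier(p, o):
    """Brier score: mean squared error of probabilistic forecasts p
    against binary outcomes o (0 is perfect; lower is better)."""
    p, o = np.asarray(p, float), np.asarray(o, float)
    return float(np.mean((p - o) ** 2))

forecast = [0.9, 0.1, 0.8, 0.2, 0.7]
outcome  = [1,   0,   1,   0,   0]

print(brier(forecast, outcome))              # 0.118
climatology = np.full(5, np.mean(outcome))   # always forecast the base rate
print(brier(climatology, outcome))           # 0.24: the forecast adds skill
```

A skill score such as 1 − Brier(forecast)/Brier(climatology) then summarizes how much the forecast improves on the reference, exactly the kind of resolution/reliability measure the talk proposes to carry over to earthquake forecasts.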

  11. Verification Challenges at Low Numbers

    SciTech Connect

    Benz, Jacob M.; Booker, Paul M.; McDonald, Benjamin S.

    2013-06-01

    Many papers have dealt with the political difficulties and ramifications of deep nuclear arms reductions, and the issues of “Going to Zero”. Political issues include extended deterrence, conventional weapons, ballistic missile defense, and regional and geo-political security issues. At each step on the road to low numbers, the verification required to ensure compliance of all parties will increase significantly. Looking beyond New START, the next step will likely include warhead limits in the neighborhood of 1000. Further reductions will include stepping stones at 1000 warheads, 100’s of warheads, and then 10’s of warheads before final elimination of the last few remaining warheads and weapons could be considered. This paper will focus on these three threshold reduction levels: 1000, 100’s, 10’s. For each, the issues and challenges will be discussed, potential solutions will be identified, and the verification technologies and chain of custody measures that address these solutions will be surveyed. It is important to note that many of the issues that need to be addressed have no current solution. In these cases, the paper will explore new or novel technologies that could be applied. These technologies will draw from the research and development that is ongoing throughout the national laboratory complex, and will look at technologies utilized in other areas of industry for their application to arms control verification.

  12. Development of Asset Fault Signatures for Prognostic and Health Management in the Nuclear Industry

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2014-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: Diagnostic Advisor, Asset Fault Signature (AFS) Database, Remaining Useful Life Advisor, and Remaining Useful Life Database. This paper focuses on development of asset fault signatures to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features, based on technical examination, that can be used to detect a specific fault type. At the most basic level, fault signatures are comprised of an asset type, a fault type, and a set of one or more fault features (symptoms) that are indicative of the specified fault. The AFS Database is populated with asset fault signatures via a content development exercise that is based on the results of intensive technical research and on the knowledge and experience of technical experts. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.

  13. Nuclear Resonance Fluorescence for Material Verification in Dismantlement

    SciTech Connect

    Warren, Glen A.; Detwiler, Rebecca S.

    2011-10-01

    Nuclear resonance fluorescence (NRF) is a well-established physical process that provides an isotope-specific signature that can be exploited for isotopic detection and characterization of samples. Pacific Northwest National Laboratory has been investigating possible applications of NRF for national security. Of the investigated applications, the verification of material in the dismantlement process is the most promising. Through a combination of benchmarking measurements and radiation transport modeling, we have shown that NRF techniques with existing bremsstrahlung photon sources and a modest detection system can be used to detect highly enriched uranium in the quantities and time limits relevant to the dismantlement process. Issues such as orientation, placement and material geometry do not significantly impact the sensitivity of the technique. We have also investigated how shielding of the uranium would be observed through non-NRF processes to enable the accurate assay of the material. This paper will discuss our findings on how NRF and photon-interrogation techniques may be applied to the material verification in the dismantlement process.

  14. Significance Analysis of Prognostic Signatures

    PubMed Central

    Beck, Andrew H.; Knoblauch, Nicholas W.; Hefti, Marco M.; Kaplan, Jennifer; Schnitt, Stuart J.; Culhane, Aedin C.; Schroeder, Markus S.; Risch, Thomas; Quackenbush, John; Haibe-Kains, Benjamin

    2013-01-01

    A major goal in translational cancer research is to identify biological signatures driving cancer progression and metastasis. A common technique applied in genomics research is to cluster patients using gene expression data from a candidate prognostic gene set, and if the resulting clusters show statistically significant outcome stratification, to associate the gene set with prognosis, suggesting its biological and clinical importance. Recent work has questioned the validity of this approach by showing in several breast cancer data sets that “random” gene sets tend to cluster patients into prognostically variable subgroups. This work suggests that new rigorous statistical methods are needed to identify biologically informative prognostic gene sets. To address this problem, we developed Significance Analysis of Prognostic Signatures (SAPS) which integrates standard prognostic tests with a new prognostic significance test based on stratifying patients into prognostic subtypes with random gene sets. SAPS ensures that a significant gene set is not only able to stratify patients into prognostically variable groups, but is also enriched for genes showing strong univariate associations with patient prognosis, and performs significantly better than random gene sets. We use SAPS to perform a large meta-analysis (the largest completed to date) of prognostic pathways in breast and ovarian cancer and their molecular subtypes. Our analyses show that only a small subset of the gene sets found statistically significant using standard measures achieve significance by SAPS. We identify new prognostic signatures in breast and ovarian cancer and their corresponding molecular subtypes, and we show that prognostic signatures in ER negative breast cancer are more similar to prognostic signatures in ovarian cancer than to prognostic signatures in ER positive breast cancer. SAPS is a powerful new method for deriving robust prognostic biological signatures from clinically annotated
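
The random-gene-set comparison at the heart of SAPS can be sketched without the full survival machinery. In this toy stand-in (my simplification: a mean-outcome gap replaces the log-rank statistic on survival times, and all data are synthetic), a candidate set's stratification score is compared against size-matched random gene sets:

```python
import numpy as np

rng = np.random.default_rng(1)

def strat_score(expr, outcome, genes):
    """Toy stratification score: split patients at the median of their mean
    expression over `genes`; return the gap in mean outcome between halves.
    (SAPS itself stratifies survival data and uses log-rank statistics.)"""
    m = expr[:, genes].mean(axis=1)
    hi = m > np.median(m)
    return abs(outcome[hi].mean() - outcome[~hi].mean())

def p_random(expr, outcome, gene_set, n_random=500):
    """Fraction of size-matched random gene sets that stratify patients at
    least as well as the candidate set (the P(random) idea in SAPS)."""
    s0 = strat_score(expr, outcome, gene_set)
    k = len(gene_set)
    hits = sum(strat_score(expr, outcome,
                           rng.choice(expr.shape[1], size=k, replace=False)) >= s0
               for _ in range(n_random))
    return (hits + 1) / (n_random + 1)

# Synthetic cohort: genes 0-4 drive the outcome, the rest are noise.
expr = rng.normal(size=(200, 100))
outcome = expr[:, :5].mean(axis=1) + 0.1 * rng.normal(size=200)

print(p_random(expr, outcome, list(range(5))))        # small: beats random sets
print(p_random(expr, outcome, [50, 60, 70, 80, 90]))  # typically far larger
```

The informative set earns a small p-value only because it outperforms random sets of the same size, which is the safeguard SAPS adds over standard cluster-then-test analyses.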

  15. Negotiating Femininities Online

    ERIC Educational Resources Information Center

    Davies, Julia

    2004-01-01

    Much has been written about the potential for online learning (Fryer, 1997; www.ngfl.gov.uk/ngfl/index.html). However, this literature typically emphasizes not online learning but online education. In this paper I focus on the potential for online learning, specifically learning about issues surrounding femininity in the presence of online peers,…

  16. PERFORMANCE VERIFICATION OF ANIMAL WATER TREATMENT TECHNOLOGIES THROUGH EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The U.S. Environmental Protection Agency created the Environmental Technology Verification Program (ETV) to further environmental protection by accelerating the commercialization of new and innovative technology through independent performance verification and dissemination of in...

  17. PERFORMANCE VERIFICATION OF STORMWATER TREATMENT DEVICES UNDER EPA'S ENVIRONMENTAL TECHNOLOGY VERIFICATION PROGRAM

    EPA Science Inventory

    The Environmental Technology Verification (ETV) Program was created to facilitate the deployment of innovative or improved environmental technologies through performance verification and dissemination of information. The program's goal is to further environmental protection by a...

  18. Signature molecular descriptor : advanced applications.

    SciTech Connect

    Visco, Donald Patrick, Jr.

    2010-04-01

    In this work we report on the development of the Signature Molecular Descriptor (or Signature) for use in the solution of inverse design problems as well as in high-throughput screening applications. The ultimate goal of using Signature is to identify novel and non-intuitive chemical structures with optimal predicted properties for a given application. We demonstrate this in three studies: green solvent design, glucocorticoid receptor ligand design and the design of inhibitors for Factor XIa. In many areas of engineering, compounds are designed and/or modified in incremental ways which rely upon heuristics or institutional knowledge. Often multiple experiments are performed and the optimal compound is identified in this brute-force fashion. Perhaps a traditional chemical scaffold is identified and movement of a substituent group around a ring constitutes the whole of the design process. Also notably, a chemical being evaluated in one area might demonstrate properties very attractive in another area and serendipity was the mechanism for solution. In contrast to such approaches, computer-aided molecular design (CAMD) looks to encompass both experimental and heuristic-based knowledge into a strategy that will design a molecule on a computer to meet a given target. Depending on the algorithm employed, the molecule which is designed might be quite novel (re: no CAS registration number) and/or non-intuitive relative to what is known about the problem at hand. While CAMD is a fairly recent strategy (dating to the early 1980s), it contains a variety of bottlenecks and limitations which have prevented the technique from garnering more attention in the academic, governmental and industrial institutions. A main reason for this is how the molecules are described in the computer. This step can control how models are developed for the properties of interest on a given problem as well as how to go from an output of the algorithm to an actual chemical structure. This report

  19. Sensorless, online motor diagnostics

    SciTech Connect

    Kliman, G.B.; Premerlani, W.J.; Yazici, B.; Koegl, R.A.; Mazereeuw, J.

    1997-04-01

    Electric motors play a very important role in the safe and efficient running of any industrial plant. Early detection of abnormalities in the motors will help avoid expensive failures. Motor current signature analysis (MCSA) implemented in a computer-based motor monitor can contribute to such condition-based maintenance functions. Such a system may also detect an abnormality in the process as well as the motor. Extensive online monitoring of the motors can lead to greater plant availability, extended plant life, higher quality product, and smoother plant operation. With advances in digital technology over the last several years, adequate data processing capability is now available on cost-effective, microprocessor-based, protective-relay platforms to monitor motors for a variety of abnormalities in addition to the normal protection functions. Such multifunction monitors, first introduced by Multilin, are displacing the multiplicity of electromechanical devices commonly applied for many years. Following some background information on motor monitoring, this article features recent developments in providing tools for the diagnosis of faults or incipient faults in electric motor drives: Sensorless torque measurement, direct detection of turn-to-turn short circuits, detection of cracked or broken rotor bars, and detection of bearing deterioration.
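    The sideband analysis behind MCSA can be illustrated with a short, self-contained sketch. This is not the Multilin implementation; the sampling rate, slip value, fault amplitudes, and the 1% detection threshold are all invented for the demo. The underlying idea is standard: broken rotor bars modulate the stator current, producing spectral sidebands at (1 ± 2s)·f_line, where s is the per-unit slip.

    ```python
    import numpy as np

    # Illustrative MCSA sketch: look for broken-rotor-bar sidebands at
    # (1 +/- 2s)*f_line in the stator current spectrum. Hypothetical values.

    def sideband_amplitudes(current, fs, f_line=60.0, slip=0.02):
        """Return spectral amplitudes at the line frequency and its
        (1 +/- 2s)*f_line fault sidebands."""
        n = len(current)
        spectrum = np.abs(np.fft.rfft(current * np.hanning(n))) / n
        freqs = np.fft.rfftfreq(n, d=1.0 / fs)

        def amp_at(f):
            return spectrum[np.argmin(np.abs(freqs - f))]

        lower = (1 - 2 * slip) * f_line
        upper = (1 + 2 * slip) * f_line
        return amp_at(f_line), amp_at(lower), amp_at(upper)

    # Synthetic demo: a healthy 60 Hz current plus small fault sidebands.
    fs = 10_000
    t = np.arange(0, 10, 1.0 / fs)
    faulty = (np.sin(2 * np.pi * 60 * t)
              + 0.05 * np.sin(2 * np.pi * (1 - 2 * 0.02) * 60 * t)
              + 0.05 * np.sin(2 * np.pi * (1 + 2 * 0.02) * 60 * t))
    main, lo, hi = sideband_amplitudes(faulty, fs)
    print(lo / main > 0.01 and hi / main > 0.01)  # sidebands detected
    ```

    A deployed monitor would of course trend these ratios over time and against load, rather than apply a fixed threshold.
    
    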

  20. The Global Diffusion of Societal Verification Tools: A Quantitative Assessment of the Public’s Ability to Engage Nonproliferation Treaty Monitoring

    SciTech Connect

    Sayre, Amanda M.; Kreyling, Sean J.; West, Curtis L.

    2015-07-11

    The spread of nuclear and dual-use technologies and the need for more robust, effective and efficient nonproliferation and arms control treaties has led to an increasing need for innovative verification approaches and technologies. This need, paired with advancements in online computing, mobile devices, commercially available satellite imagery and the evolution of online social networks, has led to a resurgence of the concept of societal verification for arms control and nonproliferation treaties. In the event a country accepts its citizens’ assistance in supporting transparency, confidence-building and societal verification, the host government will need a population that is willing and able to participate. While scholarly interest in societal verification continues to grow, social scientific research on the topic is lacking. The aim of this paper is to begin the process of understanding public societal verification capabilities, extend the availability of quantitative research on societal verification and set in motion complementary research to increase the breadth and depth of knowledge on this topic. This paper presents a potential framework and outlines a research roadmap for the development of such a societal verification capability index.

  1. Compendium of Arms Control Verification Proposals.

    DTIC Science & Technology

    1982-03-01

    ZONAL ON-SITE INSPECTION; CHAPTER D - CONTROL POSTS; CHAPTER E - RECORDS MONITORING ... describing in general the significant features of the verification method concerned. Chapters A to C deal with verification by direct on-site inspection (i.e., increasing as confidence develops), chapter D with control or observation posts, and chapter E with verification by examination of records.

  2. Magnetic cleanliness verification approach on tethered satellite

    NASA Technical Reports Server (NTRS)

    Messidoro, Piero; Braghin, Massimo; Grande, Maurizio

    1990-01-01

    Magnetic cleanliness testing was performed on the Tethered Satellite as the last step of an articulated verification campaign aimed at demonstrating the capability of the satellite to support its TEMAG (TEthered MAgnetometer) experiment. Tests at unit level and analytical predictions/correlations using a dedicated mathematical model (GANEW program) are also part of the verification activities. Details of the tests are presented, and the results of the verification are described together with recommendations for later programs.

  3. Using tools for verification, documentation and testing

    NASA Technical Reports Server (NTRS)

    Osterweil, L. J.

    1978-01-01

    Methodologies are discussed on four of the major approaches to program upgrading -- namely dynamic testing, symbolic execution, formal verification and static analysis. The different patterns of strengths, weaknesses and applications of these approaches are shown. It is demonstrated that these patterns are in many ways complementary, offering the hope that they can be coordinated and unified into a single comprehensive program testing and verification system capable of performing a diverse and useful variety of error detection, verification and documentation functions.

  4. Automated verification of flight software. User's manual

    NASA Technical Reports Server (NTRS)

    Saib, S. H.

    1982-01-01

    AVFS (Automated Verification of Flight Software), a collection of tools for analyzing source programs written in FORTRAN and AED, is documented. The quality and reliability of flight software are improved by: (1) indented listings of source programs, (2) static analysis to detect inconsistencies in the use of variables and parameters, (3) automated documentation, (4) instrumentation of source code, (5) retesting guidance, (6) analysis of assertions, (7) symbolic execution, (8) generation of verification conditions, and (9) simplification of verification conditions. Use of AVFS in the verification of flight software is described.

  5. The SeaHorn Verification Framework

    NASA Technical Reports Server (NTRS)

    Gurfinkel, Arie; Kahsai, Temesghen; Komuravelli, Anvesh; Navas, Jorge A.

    2015-01-01

    In this paper, we present SeaHorn, a software verification framework. The key distinguishing feature of SeaHorn is its modular design that separates the concerns of the syntax of the programming language, its operational semantics, and the verification semantics. SeaHorn encompasses several novelties: it (a) encodes verification conditions using an efficient yet precise inter-procedural technique, (b) provides flexibility in the verification semantics to allow different levels of precision, (c) leverages the state-of-the-art in software model checking and abstract interpretation for verification, and (d) uses Horn-clauses as an intermediate language to represent verification conditions which simplifies interfacing with multiple verification tools based on Horn-clauses. SeaHorn provides users with a powerful verification tool and researchers with an extensible and customizable framework for experimenting with new software verification techniques. The effectiveness and scalability of SeaHorn are demonstrated by an extensive experimental evaluation using benchmarks from SV-COMP 2015 and real avionics code.

  6. Hydrologic data-verification management program plan

    USGS Publications Warehouse

    Alexander, C.W.

    1982-01-01

    Data verification refers to the performance of quality control on hydrologic data that have been retrieved from the field and are being prepared for dissemination to water-data users. Water-data users now have access to computerized data files containing unpublished, unverified hydrologic data. Therefore, it is necessary to develop techniques and systems whereby the computer can perform some data-verification functions before the data are stored in user-accessible files. Computerized data-verification routines can be developed for this purpose. A single, unified concept describing a master data-verification program using multiple special-purpose subroutines, and a screen file containing verification criteria, can probably be adapted to any type and size of computer-processing system. Some traditional manual-verification procedures can be adapted for computerized verification, but new procedures can also be developed to take advantage of the powerful statistical tools and data-handling procedures available to the computer. Prototype data-verification systems should be developed for all three data-processing environments as soon as possible. The WATSTORE system probably affords the greatest opportunity for long-range research and testing of new verification subroutines. (USGS)
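    The master-program-plus-screen-file concept above can be sketched in a few lines. The station name, field names, and limits below are hypothetical, and a real screen file would live in the data system rather than in a Python dict:

    ```python
    # Minimal sketch of a master verification routine driven by a "screen
    # file" of per-station criteria. All stations, fields, and limits are
    # invented for illustration.

    SCREEN_FILE = {
        "station_01": {
            "discharge_cfs": (0.0, 5000.0),
            "gage_height_ft": (1.0, 30.0),
        },
    }

    def verify_record(station, record, screen=SCREEN_FILE):
        """Return a list of (field, value, reason) flags for one observation."""
        flags = []
        for field, value in record.items():
            limits = screen.get(station, {}).get(field)
            if limits is None:
                flags.append((field, value, "no criteria on file"))
            elif not (limits[0] <= value <= limits[1]):
                flags.append((field, value, "outside screen limits"))
        return flags

    flags = verify_record("station_01",
                          {"discharge_cfs": 7200.0, "gage_height_ft": 12.3})
    print(flags)  # the discharge value is flagged as outside its limits
    ```

    Range screening is only the simplest subroutine; the statistical checks the report anticipates (e.g., rate-of-change or regression-based tests) would slot in as additional entries in the criteria file.
    
    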

  7. 40 CFR 1065.550 - Gas analyzer range verification and drift verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... Cycles § 1065.550 Gas analyzer range verification and drift verification. (a) Range verification. If an... with a CLD and the removed water is corrected based on measured CO2, CO, THC, and NOX concentrations...-specific emissions over the entire duty cycle for drift. For each constituent to be verified, both sets...

  8. Signature Visualization of Software Binaries

    SciTech Connect

    Panas, T

    2008-07-01

    In this paper we present work on the visualization of software binaries. In particular, we utilize ROSE, an open source compiler infrastructure, to pre-process software binaries, and we apply a landscape metaphor to visualize the signature of each binary (malware). We define the signature of a binary as a metric-based layout of the functions contained in the binary. In our initial experiment, we visualize the signatures of a series of computer worms that all originate from the same lineage. These visualizations are useful for a number of reasons. First, the images reveal how the archetype has evolved over a series of versions of one worm. Second, one can see the distinct changes between versions. This allows the viewer to form conclusions about the development cycle of a particular worm.

  9. Graph Analytics for Signature Discovery

    SciTech Connect

    Hogan, Emilie A.; Johnson, John R.; Halappanavar, Mahantesh; Lo, Chaomei

    2013-06-01

    Within large amounts of seemingly unstructured data it can be difficult to find signatures of events. In our work we transform unstructured data into a graph representation. By doing this we expose underlying structure in the data and can take advantage of existing graph analytics capabilities, as well as develop new capabilities. Currently we focus on applications in the cybersecurity and communication domains. Within cybersecurity we aim to find signatures for perpetrators using the pass-the-hash attack, and in communications we look for emails or phone calls going up or down a chain of command. In both of these areas, and in many others, the signature we look for is a path with certain temporal properties. In this paper we discuss our methodology for finding these temporal paths within large graphs.
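    The temporal-path signature described above — a path whose edge timestamps strictly increase, as in a message moving up a chain of command — can be sketched with a simple depth-first search. The edge list is invented, and this is a sketch of the general technique, not the authors' implementation:

    ```python
    from collections import defaultdict

    # Find simple paths whose edge timestamps strictly increase, starting
    # from a given node. Edges are (src, dst, time) triples; data invented.

    def temporal_paths(edges, start, min_len=3):
        """Yield paths from `start` with at least `min_len` nodes whose
        edge timestamps strictly increase."""
        out = defaultdict(list)
        for u, v, t in edges:
            out[u].append((v, t))

        def dfs(node, last_t, path):
            if len(path) >= min_len:
                yield list(path)
            for nxt, t in out[node]:
                if t > last_t and nxt not in path:  # time order, no revisits
                    path.append(nxt)
                    yield from dfs(nxt, t, path)
                    path.pop()

        yield from dfs(start, float("-inf"), [start])

    edges = [("a", "b", 1), ("b", "c", 2), ("c", "d", 3), ("b", "d", 1)]
    paths = list(temporal_paths(edges, "a"))
    print(paths)  # the b->d edge at time 1 is excluded: it precedes a->b's time
    ```

    Note the last edge ("b", "d", 1) never extends a path through ("a", "b", 1) because its timestamp does not increase; that pruning is what distinguishes a temporal path from an ordinary one.
    
    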

  10. Measurement of sniper infrared signatures

    NASA Astrophysics Data System (ADS)

    Kastek, M.; Dulski, R.; Trzaskawka, P.; Bieszczad, G.

    2009-09-01

    The paper presents some practical aspects of sniper IR signature measurements. Descriptions of the particular signatures of a sniper and the background in typical scenarios are given. We take into consideration sniper activities in open areas as well as in urban environments. The measurements were made at a field test ground; high-precision laboratory measurements were also performed. Several infrared cameras were used during the measurements to cover all measurement assumptions. Some of the cameras are measurement-class devices with high accuracy and speed; the others are microbolometer cameras with FPA detectors similar to those used in real commercial counter-sniper systems. The registration was made in the SWIR and LWIR spectral bands simultaneously. An ultra-fast visual camera was also used for visible-spectrum registration. Exemplary sniper IR signatures for typical situations are presented.

  11. Online Staff Development.

    ERIC Educational Resources Information Center

    Pease, Pamela S.; Magnuson, Peter

    2003-01-01

    Describes the benefits for principals of online staff development for teachers. Sources of online courses and training include local and state departments of education, professional associations, colleges and universities, online universities, and commercial suppliers. (PKP)

  12. Textural signatures for wetland vegetation

    NASA Technical Reports Server (NTRS)

    Whitman, R. I.; Marcellus, K. L.

    1973-01-01

    This investigation indicates that unique textural signatures do exist for specific wetland communities at certain times in the growing season. When photographs with the proper resolution are obtained, the textural features can identify the spectral features of the vegetation community seen with lower resolution mapping data. The development of a matrix of optimum textural signatures is the goal of this research. Seasonal variations of spectral and textural features are particularly important when performing a vegetations analysis of fresh water marshes. This matrix will aid in flight planning, since expected seasonal variations and resolution requirements can be established prior to a given flight mission.

  13. Ballistic Signature Identification System Study

    NASA Technical Reports Server (NTRS)

    1976-01-01

    The first phase of a research project directed toward development of a high speed automatic process to be used to match gun barrel signatures imparted to fired bullets was documented. An optical projection technique has been devised to produce and photograph a planar image of the entire signature, and the phototransparency produced is subjected to analysis using digital Fourier transform techniques. The success of this approach appears to be limited primarily by the accuracy of the photographic step since no significant processing limitations have been encountered.

  14. Turbulence Modeling Verification and Validation

    NASA Technical Reports Server (NTRS)

    Rumsey, Christopher L.

    2014-01-01

    Computational fluid dynamics (CFD) software that solves the Reynolds-averaged Navier-Stokes (RANS) equations has been in routine use for more than a quarter of a century. It is currently employed not only for basic research in fluid dynamics, but also for the analysis and design processes in many industries worldwide, including aerospace, automotive, power generation, chemical manufacturing, polymer processing, and petroleum exploration. A key feature of RANS CFD is the turbulence model. Because the RANS equations are unclosed, a model is necessary to describe the effects of the turbulence on the mean flow, through the Reynolds stress terms. The turbulence model is one of the largest sources of uncertainty in RANS CFD, and most models are known to be flawed in one way or another. Alternative methods such as direct numerical simulations (DNS) and large eddy simulations (LES) rely less on modeling and hence include more physics than RANS. In DNS all turbulent scales are resolved, and in LES the large scales are resolved and the effects of the smallest turbulence scales are modeled. However, both DNS and LES are too expensive for most routine industrial usage on today's computers. Hybrid RANS-LES, which blends RANS near walls with LES away from walls, helps to moderate the cost while still retaining some of the scale-resolving capability of LES, but for some applications it can still be too expensive. Even considering its associated uncertainties, RANS turbulence modeling has proved to be very useful for a wide variety of applications. For example, in the aerospace field, many RANS models are considered to be reliable for computing attached flows. However, existing turbulence models are known to be inaccurate for many flows involving separation. Research has been ongoing for decades in an attempt to improve turbulence models for separated and other nonequilibrium flows. When developing or improving turbulence models, both verification and validation are important

  15. Structural dynamics verification facility study

    NASA Technical Reports Server (NTRS)

    Kiraly, L. J.; Hirchbein, M. S.; Mcaleese, J. M.; Fleming, D. P.

    1981-01-01

    The need for a structural dynamics verification facility to support structures programs was studied. Most of the industry operated facilities are used for highly focused research, component development, and problem solving, and are not used for the generic understanding of the coupled dynamic response of major engine subsystems. Capabilities for the proposed facility include: the ability to both excite and measure coupled structural dynamic response of elastic blades on elastic shafting, the mechanical simulation of various dynamical loadings representative of those seen in operating engines, and the measurement of engine dynamic deflections and interface forces caused by alternative engine mounting configurations and compliances.

  16. Optimal Imaging for Treaty Verification

    SciTech Connect

    Brubaker, Erik; Hilton, Nathan R.; Johnson, William; Marleau, Peter; Kupinski, Matthew; MacGahan, Christopher Jonathan

    2014-09-01

    Future arms control treaty verification regimes may use radiation imaging measurements to confirm and track nuclear warheads or other treaty accountable items (TAIs). This project leverages advanced inference methods developed for medical and adaptive imaging to improve task performance in arms control applications. Additionally, we seek a method to acquire and analyze imaging data of declared TAIs without creating an image of those objects or otherwise storing or revealing any classified information. Such a method would avoid the use of classified-information barriers (IB).

  17. Why do verification and validation?

    DOE PAGES

    Hu, Kenneth T.; Paez, Thomas L.

    2016-02-19

    In this discussion paper, we explore different ways to assess the value of verification and validation (V&V) of engineering models. We first present a literature review on the value of V&V and then use value chains and decision trees to show how value can be assessed from a decision maker's perspective. In this context, the value is what the decision maker is willing to pay for V&V analysis, with the understanding that the V&V results are uncertain. Finally, the 2014 Sandia V&V Challenge Workshop is used to illustrate these ideas.
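    The decision-tree framing above can be made concrete with a toy calculation: the value of V&V information is the gain in expected payoff from deciding with the V&V result versus without it. All probabilities and payoffs below are invented for illustration:

    ```python
    # Toy decision tree: should we "deploy" or "hold" an engineering model?
    # The value of (perfect) V&V information is the gain in expected payoff
    # from acting on the V&V result. All numbers are hypothetical.

    p_model_ok = 0.7                      # prior that the model is adequate
    payoff = {("deploy", True): 100.0,    # deploy an adequate model
              ("deploy", False): -200.0,  # deploy an inadequate model
              ("hold", True): 0.0,
              ("hold", False): 0.0}

    def expected(action, p_ok):
        return p_ok * payoff[(action, True)] + (1 - p_ok) * payoff[(action, False)]

    # Without V&V: pick the action with the better prior expected payoff.
    no_info = max(expected(a, p_model_ok) for a in ("deploy", "hold"))

    # With a perfect V&V test: act optimally in each revealed state.
    with_info = (p_model_ok * max(payoff[("deploy", True)], payoff[("hold", True)])
                 + (1 - p_model_ok) * max(payoff[("deploy", False)],
                                          payoff[("hold", False)]))

    print(with_info - no_info)  # expected value of (perfect) V&V information
    ```

    Real V&V is an imperfect test, so its value lies between zero and this perfect-information bound; the paper's point is precisely that the decision maker should not pay more than that value for the analysis.
    
    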

  18. Approaches to wind resource verification

    NASA Technical Reports Server (NTRS)

    Barchet, W. R.

    1982-01-01

    Verification of the regional wind energy resource assessments produced by the Pacific Northwest Laboratory addresses the question: Is the magnitude of the resource given in the assessments truly representative of the area of interest? Approaches using qualitative indicators of wind speed (tree deformation, eolian features), old and new data of opportunity not at sites specifically chosen for their exposure to the wind, and data by design from locations specifically selected to be good wind sites are described. Data requirements and evaluation procedures for verifying the resource are discussed.

  19. Informatics for Unveiling Hidden Genome Signatures

    PubMed Central

    Abe, Takashi; Kanaya, Shigehiko; Kinouchi, Makoto; Ichiba, Yuta; Kozuki, Tokio; Ikemura, Toshimichi

    2003-01-01

    With the increasing amount of available genome sequences, novel tools are needed for comprehensive analysis of species-specific sequence characteristics for a wide variety of genomes. We used an unsupervised neural network algorithm, a self-organizing map (SOM), to analyze di-, tri-, and tetranucleotide frequencies in a wide variety of prokaryotic and eukaryotic genomes. The SOM, which can cluster complex data efficiently, was shown to be an excellent tool for analyzing global characteristics of genome sequences and for revealing key combinations of oligonucleotides representing individual genomes. From analysis of 1- and 10-kb genomic sequences derived from 65 bacteria (a total of 170 Mb) and from 6 eukaryotes (460 Mb), clear species-specific separations of major portions of the sequences were obtained with the di-, tri-, and tetranucleotide SOMs. The unsupervised algorithm could recognize, in most 10-kb sequences, the species-specific characteristics (key combinations of oligonucleotide frequencies) that are signature features of each genome. We were able to classify DNA sequences within one and between many species into subgroups that corresponded generally to biological categories. Because the classification power is very high, the SOM is an efficient and fundamental bioinformatic strategy for extracting a wide range of genomic information from a vast amount of sequences. [Supplemental material is available online at www.genome.org.] PMID:12671005
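    The feature-extraction step feeding the SOM — computing di-, tri-, or tetranucleotide frequency vectors per sequence fragment — can be sketched as follows. The SOM training itself is omitted, and the example sequence is made up:

    ```python
    from collections import Counter
    from itertools import product

    # Compute the 4**k-dimensional k-mer relative-frequency vector of a DNA
    # sequence, the input representation used for SOM clustering above.

    def oligo_frequencies(seq, k=4):
        """Return relative frequencies of all 4**k k-mers, in a fixed order."""
        kmers = ["".join(p) for p in product("ACGT", repeat=k)]
        counts = Counter(seq[i:i + k] for i in range(len(seq) - k + 1))
        total = max(len(seq) - k + 1, 1)
        return [counts[km] / total for km in kmers]

    # Made-up example: dinucleotide (k=2) vector of a short repeat.
    vec = oligo_frequencies("ACGTACGTACGTACGT", k=2)
    print(len(vec))  # 16 dimensions; the frequencies sum to 1
    ```

    For the study's 1- and 10-kb fragments the same function applies unchanged; with k=4 each fragment becomes a 256-dimensional vector, and those vectors are what the SOM clusters by species.
    
    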

  20. Improved method of signature extraction

    NASA Technical Reports Server (NTRS)

    Christianson, D.; Gordon, M.; Kistler, R.; Kriegler, F. J.; Lampert, S.; Marshall, R. E.; Mclaughlin, R.; Smith, V.

    1977-01-01

    The system promises the capability of rapidly processing large amounts of data generated by currently available and planned multispectral sensors, such as those utilized on aircraft and spacecraft. Techniques developed for the system greatly decrease the operator time required for signature extraction from a multispectral data base.

  1. Topological Signatures for Population Admixture

    Technology Transfer Automated Retrieval System (TEKTRAN)

    Topological Signatures for Population Admixture. Deniz Yorukoglu, Filippo Utro, David Kuhn, Saugata Basu and Laxmi Parida. Abstract. Background: As populations with multi-linear transmission (i.e., mixing of genetic material from two parents, say) evolve over generations, the genetic transmission...

  2. MK 66 Rocket Signature Reduction

    DTIC Science & Technology

    1982-04-01

    Indian Head, Maryland. The objective of the study was to reduce the visible signature of the rocket motor. The rocket motor used for demonstration tests... Actual Emissions; Human Eye Adjusted Emissions; Cross... Additives are commonly used in gun propellants for elimination of muzzle flash. Their use in tactical rockets has been very limited, and

  3. Disaster relief through composite signatures

    NASA Astrophysics Data System (ADS)

    Hawley, Chadwick T.; Hyde, Brian; Carpenter, Tom; Nichols, Steve

    2012-06-01

    A composite signature is a group of signatures that are related in such a way as to more completely or further define a target or operational endeavor at a higher fidelity. This paper builds on previous work developing innovative composite signatures associated with civil disasters, including physical, chemical, and pattern/behavioral signatures. For the composite-signature approach to be successful, it requires effective data fusion and visualization. This plays a key role both in preparedness and in the response and recovery that are critical to saving lives. Visualization tools enhance the overall understanding of the crisis by pulling together and analyzing the data, and providing a clear and complete analysis of the information to the organizations/agencies dependent on it for a successful operation. An example of this, Freedom Web, is an easy-to-use data visualization and collaboration solution for use in homeland security, emergency preparedness, situational awareness, and event management. The solution provides a nationwide common operating picture for all levels of government through a web-based map interface. The tool was designed to be utilized by non-geospatial experts and is easily tailored to the specific needs of the users. Built from standard COTS and open-source databases and a web server, the solution lets users view, edit, share, and highlight information easily and quickly through a standard internet browser.

  4. CFE verification: The decision to inspect

    SciTech Connect

    Allentuck, J.

    1990-01-01

    Verification of compliance with the provisions of the treaty on Conventional Forces-Europe (CFE) is subject to inspection quotas of various kinds. Thus the decision to carry out a specific inspection or verification activity must be prudently made. This decision process is outlined, and means for conserving "quotas" are suggested. 4 refs., 1 fig.

  5. Identity Verification, Control, and Aggression in Marriage

    ERIC Educational Resources Information Center

    Stets, Jan E.; Burke, Peter J.

    2005-01-01

    In this research we study the identity verification process and its effects in marriage. Drawing on identity control theory, we hypothesize that a lack of verification in the spouse identity (1) threatens stable self-meanings and interaction patterns between spouses, and (2) challenges a (nonverified) spouse's perception of control over the…

  6. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2014-01-01 2014-01-01 false Verification program. 460.17 Section...

  7. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2013-01-01 2013-01-01 false Verification program. 460.17 Section...

  8. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17 Verification... software in an operational flight environment before allowing any space flight participant on board during... 14 Aeronautics and Space 4 2012-01-01 2012-01-01 false Verification program. 460.17 Section...

  9. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 24 Housing and Urban Development 5 2010-04-01 2010-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  10. 24 CFR 4001.112 - Income verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 24 Housing and Urban Development 5 2011-04-01 2011-04-01 false Income verification. 4001.112... Requirements and Underwriting Procedures § 4001.112 Income verification. The mortgagee shall use FHA's procedures to verify the mortgagor's income and shall comply with the following additional requirements:...

  11. IMPROVING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATIONS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) began the Environmental Technology Verification (ETV) Program in 1995 as a means of working with the private sector to establish a market-based verification process available to all environmental technologies. Under EPA's Office of R...

  12. The monitoring and verification of nuclear weapons

    SciTech Connect

    Garwin, Richard L.

    2014-05-09

    This paper partially reviews and updates the potential for monitoring and verification of nuclear weapons, including verification of their destruction. Cooperative monitoring with templates of the gamma-ray spectrum are an important tool, dependent on the use of information barriers.

  13. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  14. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  15. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  16. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  17. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  18. 18 CFR 286.107 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 286.107 Section 286.107 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... Contested Audit Findings and Proposed Remedies § 286.107 Verification. The facts stated in the...

  19. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  20. 18 CFR 41.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 41.5 Section 41.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  1. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  2. 18 CFR 158.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 158.5 Section 158.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT....5 Verification. The facts stated in the memorandum must be sworn to by persons having...

  3. 18 CFR 349.5 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification. 349.5 Section 349.5 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT... PROPOSED REMEDIES § 349.5 Verification. The facts stated in the memorandum must be sworn to by...

  4. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2013 CFR

    2013-10-01

    ... 47 Telecommunication 1 2013-10-01 2013-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  5. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 47 Telecommunication 1 2012-10-01 2012-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  6. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 47 Telecommunication 1 2014-10-01 2014-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  7. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 47 Telecommunication 1 2010-10-01 2010-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  8. 47 CFR 2.902 - Verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 47 Telecommunication 1 2011-10-01 2011-10-01 false Verification. 2.902 Section 2.902 Telecommunication FEDERAL COMMUNICATIONS COMMISSION GENERAL FREQUENCY ALLOCATIONS AND RADIO TREATY MATTERS; GENERAL RULES AND REGULATIONS Equipment Authorization Procedures General Provisions § 2.902 Verification....

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION FOR INDOOR AIR PRODUCTS

    EPA Science Inventory

    The paper discusses environmental technology verification (ETV) for indoor air products. RTI is developing the framework for a verification testing program for indoor air products, as part of EPA's ETV program. RTI is establishing test protocols for products that fit into three...

  10. Guidelines for qualifying cleaning and verification materials

    NASA Technical Reports Server (NTRS)

    Webb, D.

    1995-01-01

This document is intended to provide guidance in identifying technical issues which must be addressed in a comprehensive qualification plan for materials used in cleaning and cleanliness verification processes. Information presented herein is intended to facilitate development of a definitive checklist that should address all pertinent materials issues when down-selecting a cleaning/verification media.

  11. Gender Verification of Female Olympic Athletes.

    ERIC Educational Resources Information Center

    Dickinson, Barry D.; Genel, Myron; Robinowitz, Carolyn B.; Turner, Patricia L.; Woods, Gary L.

    2002-01-01

    Gender verification of female athletes has long been criticized by geneticists, endocrinologists, and others in the medical community. Recently, the International Olympic Committee's Athletic Commission called for discontinuation of mandatory laboratory-based gender verification of female athletes. This article discusses normal sexual…

  12. 14 CFR 460.17 - Verification program.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 14 Aeronautics and Space 4 2010-01-01 2010-01-01 false Verification program. 460.17 Section 460.17... TRANSPORTATION LICENSING HUMAN SPACE FLIGHT REQUIREMENTS Launch and Reentry with Crew § 460.17... a flight. Verification must include flight testing. ...

  13. ENVIRONMENTAL TECHNOLOGY VERIFICATION AND INDOOR AIR

    EPA Science Inventory

    The paper discusses environmental technology verification and indoor air. RTI has responsibility for a pilot program for indoor air products as part of the U.S. EPA's Environmental Technology Verification (ETV) program. The program objective is to further the development of sel...

  14. Irma 5.2 multi-sensor signature prediction model

    NASA Astrophysics Data System (ADS)

    Savage, James; Coker, Charles; Thai, Bea; Aboutalib, Omar; Pau, John

    2008-04-01

    The Irma synthetic signature prediction code is being developed by the Munitions Directorate of the Air Force Research Laboratory (AFRL/RW) to facilitate the research and development of multi-sensor systems. There are over 130 users within the Department of Defense, NASA, Department of Transportation, academia, and industry. Irma began as a high-resolution, physics-based Infrared (IR) target and background signature model for tactical weapon applications and has grown to include: a laser (or active) channel (1990), improved scene generator to support correlated frame-to-frame imagery (1992), and passive IR/millimeter wave (MMW) channel for a co-registered active/passive IR/MMW model (1994). Irma version 5.0 was released in 2000 and encompassed several upgrades to both the physical models and software; host support was expanded to Windows, Linux, Solaris, and SGI Irix platforms. In 2005, version 5.1 was released after extensive verification and validation of an upgraded and reengineered ladar channel. The reengineering effort then shifted focus to the Irma passive channel. Field measurements for the validation effort include both polarized and unpolarized data collection. Irma 5.2 was released in 2007 with a reengineered passive channel. This paper summarizes the capabilities of Irma and the progress toward Irma 5.3, which includes a reengineered radar channel.

  15. Microcalibrator system for chemical signature and reagent delivery.

    SciTech Connect

    Staton, Alan W.; Simonson, Robert Joseph; Adkins, Douglas Ray; Rawlinson, Kim Scott; Robinson, Alex Lockwood; Hance, Bradley G.; Manginell, Ronald Paul; Sanchez, Lawrence James; Ellison, Jennifer Anne; Sokolowski, Sara Suzette

    2005-03-01

    Networked systems of low-cost, small, integrable chemical sensors will enable monitoring of Nonproliferation and Materials Control targets and chemical weapons threats. Sandia-designed prototype chemical sensor systems are undergoing extended field testing supported by DOE and other government agencies. A required surety component will be verification of microanalytical system performance, which can be achieved by providing a programmable source of chemical signature(s) for autonomous calibration of analytical systems. In addition, such a controlled chemical source could be used to dispense microaliquots of derivatization reagents, extending the analysis capability of chemical sensors to a wider range of targets. We have developed a microfabricated system for controlled release of selected compounds (calibrants) into the analytical stream of microsensor systems. To minimize pumping and valve requirements of microfluidic systems, and to avoid degradation issues associated with storage of dilute solutions, we have utilized thermally labile organic salts as solid-phase reservoir materials. Reproducible deposition of tetrapropyl ammonium hydroxide onto arrays of microfabricated heating elements can provide a pair of calibration marker compounds (one fast and one slow-eluting compound) for GC analyses. The use of this microaliquot gas source array for hydrogen generation is currently under further development. The goal of the latter effort will be to provide a source of high-pressure, low viscosity GC carrier gas for Sandia's next-generation microfabricated gas-phase chemical analysis systems.

  16. Cognitive Bias in Systems Verification

    NASA Technical Reports Server (NTRS)

    Larson, Steve

    2012-01-01

Working definition of cognitive bias: patterns by which information is sought and interpreted that can lead to systematic errors in decisions. Cognitive bias is studied in diverse fields: economics, politics, intelligence, and marketing, to name a few. Attempts to ground cognitive science in physical characteristics of the cognitive apparatus exceed our knowledge. Studies are based on correlations; strict cause and effect is difficult to pinpoint. The effects cited in the paper and discussed here have been replicated many times over and appear sound. Many biases have been described, but it is still unclear whether they are all distinct; there may be only a handful of fundamental biases, which manifest in various ways. Bias can affect system verification in many ways: overconfidence -> questionable decisions to deploy; availability -> inability to conceive critical tests; representativeness -> overinterpretation of results; positive test strategies -> confirmation bias. Debiasing at the individual level is very difficult. The potential effect of bias on the verification process can be managed, but not eliminated, and is worth considering at key points in the process.

  17. Video-based fingerprint verification.

    PubMed

    Qin, Wei; Yin, Yilong; Liu, Lili

    2013-09-04

Conventional fingerprint verification systems use only static information. In this paper, fingerprint videos, which contain dynamic information, are utilized for verification. Fingerprint videos are acquired by the same capture device that acquires conventional fingerprint images, and the user experience of providing a fingerprint video is the same as that of providing a single impression. After preprocessing and alignment, "inside similarity" and "outside similarity" are defined and calculated to take advantage of both the dynamic and static information contained in fingerprint videos. Match scores between two matching fingerprint videos are then calculated by combining the two kinds of similarity. Experimental results show that the proposed video-based method leads to a relative reduction of 60 percent in the equal error rate (EER) in comparison to the conventional single-impression-based method. We also analyze the time complexity of our method when different combinations of strategies are used. Our method still outperforms the conventional method, even if both methods have the same time complexity. Finally, experimental results demonstrate that the proposed video-based method can achieve better accuracy than the multiple-impressions fusion method, and the proposed method has a much lower false acceptance rate (FAR) when the false rejection rate (FRR) is quite low.
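
The score-combination step this abstract describes can be sketched as a simple weighted-sum fusion followed by a threshold decision. This is a minimal illustration, not the paper's actual formula: the function names, the weight, and the threshold are all hypothetical.

```python
# Hypothetical sketch of fusing two similarity scores (the paper's "inside
# similarity" and "outside similarity") into one match score. The weight w
# and the decision threshold are illustrative values, not from the paper.

def fuse_scores(inside_sim: float, outside_sim: float, w: float = 0.5) -> float:
    """Weighted-sum fusion of the two similarity scores."""
    return w * inside_sim + (1.0 - w) * outside_sim

def verify(inside_sim: float, outside_sim: float, threshold: float = 0.6) -> bool:
    """Accept the claimed identity if the fused score clears the threshold."""
    return fuse_scores(inside_sim, outside_sim) >= threshold
```

In a real system the weight and threshold would be tuned on a development set to trade off FAR against FRR.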

  18. Quantitative reactive modeling and verification.

    PubMed

    Henzinger, Thomas A

    Formal verification aims to improve the quality of software by detecting errors before they do harm. At the basis of formal verification is the logical notion of correctness, which purports to capture whether or not a program behaves as desired. We suggest that the boolean partition of software into correct and incorrect programs falls short of the practical need to assess the behavior of software in a more nuanced fashion against multiple criteria. We therefore propose to introduce quantitative fitness measures for programs, specifically for measuring the function, performance, and robustness of reactive programs such as concurrent processes. This article describes the goals of the ERC Advanced Investigator Project QUAREM. The project aims to build and evaluate a theory of quantitative fitness measures for reactive models. Such a theory must strive to obtain quantitative generalizations of the paradigms that have been success stories in qualitative reactive modeling, such as compositionality, property-preserving abstraction and abstraction refinement, model checking, and synthesis. The theory will be evaluated not only in the context of software and hardware engineering, but also in the context of systems biology. In particular, we will use the quantitative reactive models and fitness measures developed in this project for testing hypotheses about the mechanisms behind data from biological experiments.

  19. Modular verification of concurrent systems

    SciTech Connect

    Sobel, A.E.K.

    1986-01-01

During the last ten years, a number of authors have proposed verification techniques that allow one to prove properties of individual processes by using global assumptions about the behavior of the remaining processes in the distributed program. As a result, one must justify these global assumptions before drawing any conclusions regarding the correctness of the entire program. This justification is often the most difficult part of the proof and presents a serious obstacle to hierarchical program development. This thesis develops a new approach to the verification of concurrent systems. The approach is modular and supports compositional development of programs, since the proofs of each individual process of a program are completely isolated from all others. The generality of this approach is illustrated by applying it to a representative set of contemporary concurrent programming languages, namely: CSP, Ada, Distributed Processes, and a shared-variable language. In addition, it is also shown how the approach may be used to deal with a number of other constructs that have been proposed for inclusion in concurrent languages: FORK and JOIN primitives, nested monitor calls, path expressions, atomic transactions, and asynchronous message passing. These results support the argument that the approach is universal and can be used to design proof systems for any concurrent language.

  20. Dust devil signatures in infrasound records of the International Monitoring System

    NASA Astrophysics Data System (ADS)

    Lorenz, Ralph D.; Christie, Douglas

    2015-03-01

    We explore whether dust devils have a recognizable signature in infrasound array records, since several Comprehensive Nuclear-Test-Ban Treaty verification stations conducting continuous measurements with microbarometers are in desert areas which see dust devils. The passage of dust devils (and other boundary layer vortices, whether dust laden or not) causes a local temporary drop in pressure: the high-pass time domain filtering in microbarometers results in a "heartbeat" signature, which we observe at the Warramunga station in Australia. We also observe a ~50 min pseudoperiodicity in the occurrence of these signatures and some higher-frequency infrasound. Dust devils do not significantly degrade the treaty verification capability. The pipe arrays for spatial averaging used in infrasound monitoring degrade the detection efficiency of small devils, but the long observation time may allow a useful census of large vortices, and thus, the high-sensitivity infrasonic array data from the monitoring network can be useful in studying columnar vortices in the lower atmosphere.
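
The "heartbeat" mechanism described above can be illustrated in a few lines: a one-sided pressure dip, pushed through a first-order high-pass filter (standing in for a microbarometer's AC coupling), emerges as a biphasic pulse. This is an illustrative sketch, not the station's actual processing chain; the dip amplitude, width, and filter coefficient are invented.

```python
# Illustrative sketch: a smooth local pressure drop (as a vortex passes the
# sensor) run through a first-order discrete high-pass filter comes out as
# a biphasic "heartbeat" pulse, even though the raw signal is one-sided.
import math

def highpass(x, alpha=0.95):
    """First-order high-pass filter: y[n] = alpha * (y[n-1] + x[n] - x[n-1])."""
    y = [0.0]
    for n in range(1, len(x)):
        y.append(alpha * (y[-1] + x[n] - x[n - 1]))
    return y

# Gaussian-shaped pressure dip (Pa) centered at t = 30 s, sampled at 10 Hz.
t = [i * 0.1 for i in range(600)]
dip = [-50.0 * math.exp(-((ti - 30.0) / 5.0) ** 2) for ti in t]
out = highpass(dip)

# The filtered trace swings negative going into the dip and positive coming
# out of it: the negative lobe precedes the positive lobe.
```

The two-lobed output is the time-domain signature the abstract calls a "heartbeat".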

  1. A Scala DSL for RETE-Based Runtime Verification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus

    2013-01-01

    Runtime verification (RV) consists in part of checking execution traces against formalized specifications. Several systems have emerged, most of which support specification notations based on state machines, regular expressions, temporal logic, or grammars. The field of Artificial Intelligence (AI) has for an even longer period of time studied rule-based production systems, which at a closer look appear to be relevant for RV, although seemingly focused on slightly different application domains, such as for example business processes and expert systems. The core algorithm in many of these systems is the Rete algorithm. We have implemented a Rete-based runtime verification system, named LogFire (originally intended for offline log analysis but also applicable to online analysis), as an internal DSL in the Scala programming language, using Scala's support for defining DSLs. This combination appears attractive from a practical point of view. Our contribution is in part conceptual in arguing that such rule-based frameworks originating from AI may be suited for RV.

  2. Space transportation system payload interface verification

    NASA Technical Reports Server (NTRS)

    Everline, R. T.

    1977-01-01

    The paper considers STS payload-interface verification requirements and the capability provided by STS to support verification. The intent is to standardize as many interfaces as possible, not only through the design, development, test and evaluation (DDT and E) phase of the major payload carriers but also into the operational phase. The verification process is discussed in terms of its various elements, such as the Space Shuttle DDT and E (including the orbital flight test program) and the major payload carriers DDT and E (including the first flights). Five tools derived from the Space Shuttle DDT and E are available to support the verification process: mathematical (structural and thermal) models, the Shuttle Avionics Integration Laboratory, the Shuttle Manipulator Development Facility, and interface-verification equipment (cargo-integration test equipment).

  3. Hierarchical Design and Verification for VLSI

    NASA Technical Reports Server (NTRS)

    Shostak, R. E.; Elliott, W. D.; Levitt, K. N.

    1983-01-01

The specification and verification work is described in detail, and some of the problems and issues to be resolved in their application to Very Large Scale Integration (VLSI) systems are examined. The hierarchical design methodologies enable a system architect or design team to decompose a complex design into a formal hierarchy of levels of abstraction. The first step in program verification is tree formation. The next step is the generation, from the trees, of the verification conditions themselves. The approach taken here is similar in spirit to the corresponding step in program verification but requires modeling of the semantics of circuit elements rather than program statements. The last step is that of proving the verification conditions using a mechanical theorem-prover.

  4. Online Nucleosynthesis

    NASA Astrophysics Data System (ADS)

    Meyer Jordan, Bradley, IV; The, Lih-Sin; Robbins, Stuart

    2004-05-01

Nuclear-reaction network codes are important to astronomers seeking to explore nucleosynthetic implications of astrophysical models and to nuclear physicists seeking to understand the role of nuclear properties or reaction rates in element formation. However, many users do not have the time or inclination to download and compile the codes, to manage the requisite input files, or to explore the often complex output with their own graphics programs. To help make nucleosynthesis calculations more readily available, we have placed the Clemson Nucleosynthesis code on the world-wide web at http://www.ces.clemson.edu/physics/nucleo/nuclearNetwork. At this web site, any Internet user may set his or her own reaction network, nuclear properties and reaction rates, and thermodynamic trajectories. The user then submits the nucleosynthesis calculation, which runs on a dedicated server professionally maintained at Clemson University. Once the calculation is completed, the user may explore the results through dynamically produced and downloadable tables and graphs. Online help guides the user through the necessary steps. We hope this web site will prove a user-friendly and helpful tool for professional scientists as well as for students seeking to explore element formation.

  5. Block truncation signature coding for hyperspectral analysis

    NASA Astrophysics Data System (ADS)

    Chakravarty, Sumit; Chang, Chein-I.

    2008-08-01

This paper introduces a new signature coding scheme designed based on the well-known Block Truncation Coding (BTC). It comprises bit-maps of the signature blocks generated by different threshold criteria. Two new BTC-based algorithms are developed for signature coding, called Block Truncation Signature Coding (BTSC) and 2-level BTSC (2BTSC). In order to compare the developed BTC-based algorithms with current binary signature coding schemes, such as the Spectral Program Analysis Manager (SPAM) developed by Mazer et al. and Spectral Feature-based Binary Coding (SFBC) by Qian et al., three different thresholding functions (local block mean, local block gradient, and local block correlation) are derived to improve BTSC performance, where the combined bit-maps generated by these thresholds can provide better spectral signature characterization. Experimental results reveal that the new BTC-based signature coding performs more effectively in characterizing spectral variations than currently available binary signature coding methods.
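
The core block-truncation idea can be sketched briefly: split the spectral signature into blocks and, within each block, emit a 1 for samples at or above the local threshold and a 0 otherwise. This is a minimal sketch using only the local-block-mean criterion named above; the block size, function name, and example signature are illustrative, not from the paper.

```python
# Minimal sketch of the block-truncation idea behind BTSC, using the local
# block mean as the threshold (one of the three criteria the abstract names).
# Block size and signature values below are illustrative.

def btc_bitmap(signature, block_size=4):
    """Encode a 1-D spectral signature as per-block bit-maps:
    bit = 1 where a sample is >= its block's mean, else 0."""
    bitmap = []
    for start in range(0, len(signature), block_size):
        block = signature[start:start + block_size]
        mean = sum(block) / len(block)
        bitmap.extend(1 if v >= mean else 0 for v in block)
    return bitmap

# Example: an 8-band signature split into two 4-band blocks.
sig = [0.10, 0.12, 0.30, 0.32, 0.50, 0.20, 0.22, 0.48]
print(btc_bitmap(sig))  # → [0, 0, 1, 1, 1, 0, 0, 1]
```

Swapping the mean for a local-gradient or local-correlation threshold would change only the per-block threshold computation, which is presumably how the 2-level variant combines multiple bit-maps.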

  6. Fleet-Wide Prognostic and Health Management Suite: Asset Fault Signature Database

    SciTech Connect

    Vivek Agarwal; Nancy J. Lybeck; Randall Bickford; Richard Rusaw

    2015-06-01

    Proactive online monitoring in the nuclear industry is being explored using the Electric Power Research Institute’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software. The FW-PHM Suite is a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. The FW-PHM Suite has four main modules: (1) Diagnostic Advisor, (2) Asset Fault Signature (AFS) Database, (3) Remaining Useful Life Advisor, and (4) Remaining Useful Life Database. The paper focuses on the AFS Database of the FW-PHM Suite, which is used to catalog asset fault signatures. A fault signature is a structured representation of the information that an expert would use to first detect and then verify the occurrence of a specific type of fault. The fault signatures developed to assess the health status of generator step-up transformers are described in the paper. The developed fault signatures capture this knowledge and implement it in a standardized approach, thereby streamlining the diagnostic and prognostic process. This will support the automation of proactive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
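
The abstract defines a fault signature as a structured representation of what an expert would use to detect and then verify a specific fault. A hypothetical sketch of such a record, with field names and example values invented for illustration (this is not the FW-PHM Suite's actual schema):

```python
# Hypothetical sketch of a cataloged fault signature: symptoms suggest the
# fault (detection), verifications confirm it. All names/values illustrative.
from dataclasses import dataclass, field

@dataclass
class FaultSignature:
    asset_type: str                                     # equipment class
    fault_type: str                                     # specific failure mode
    symptoms: list = field(default_factory=list)        # observations that suggest the fault
    verifications: list = field(default_factory=list)   # checks that confirm it

gsu_overheating = FaultSignature(
    asset_type="generator step-up transformer",
    fault_type="winding overheating",
    symptoms=["rising top-oil temperature", "dissolved-gas increase"],
    verifications=["dissolved-gas analysis trend", "infrared thermography scan"],
)
```

Cataloging signatures in this structured form is what lets a diagnostic advisor match observed symptoms against known faults automatically.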

  7. University Student Online Plagiarism

    ERIC Educational Resources Information Center

    Wang, Yu-mei

    2008-01-01

    This article reports a study investigating university student online plagiarism. The following questions are investigated: (a) What is the incidence of student online plagiarism? (b) What are student perceptions regarding online plagiarism? (c) Are there any differences in terms of student perceptions of online plagiarism and print plagiarism? (d)…

  8. Online Organic Chemistry

    ERIC Educational Resources Information Center

    Janowicz, Philip A.

    2010-01-01

    This is a comprehensive study of the many facets of an entirely online organic chemistry course. Online homework with structure-drawing capabilities was found to be more effective than written homework. Online lecture was found to be just as effective as in-person lecture, and students prefer an online lecture format with shorter Webcasts. Online…

  9. Partially Blind Signatures Based on Quantum Cryptography

    NASA Astrophysics Data System (ADS)

    Cai, Xiao-Qiu; Niu, Hui-Fang

    2012-12-01

    In a partially blind signature scheme, the signer explicitly includes pre-agreed common information in the blind signature, which can improve the availability and performance. We present a new partially blind signature scheme based on fundamental properties of quantum mechanics. In addition, we analyze the security of this scheme, and show it is not possible to forge valid partially blind signatures. Moreover, the comparisons between this scheme and those based on public-key cryptography are also discussed.

  10. 48 CFR 4.102 - Contractor's signature.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 48 Federal Acquisition Regulations System 1 2011-10-01 2011-10-01 false Contractor's signature. 4... ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with an... be signed by that individual, and the signature shall be followed by the individual's typed,...

  11. Attitudes toward buying online.

    PubMed

    Yang, Bijou; Lester, David

    2004-02-01

    A survey of 11 positive features and 10 discouraging features of online shopping was carried out on 180 students and identified certain behavioral patterns for online shoppers versus non-shoppers. It was found that online shoppers have consistently stronger positive feelings about online shopping than do non-shoppers. On the other hand, non-shoppers have more negative feelings about online shopping than do shoppers, but not consistently so. Online shoppers are aware of some of the discouraging features of online shopping, but these features do not deter them from shopping online. The implication for marketers is that they should focus on making the experience of online shopping more accommodating and more user-friendly since the positive features of online shopping ("convenience" and "efficiency") appear to be more important than the negative features ("effort/impersonality").

  12. DETECTORS AND EXPERIMENTAL METHODS: Online measurement of the BEPC II background using RadFET dosimeters

    NASA Astrophysics Data System (ADS)

    Gong, Hui; Li, Jin; Gong, Guang-Hua; Li, Yu-Xiong; Hou, Lei; Shao, Bei-Bei

    2009-09-01

To monitor the integral dose deposited in the BESIII electromagnetic calorimeter, whose performance degrades due to exposure to the BEPC II background, a 400 nm IMPL RadFET dosimeter-based integral-dose online monitoring system was built. After calibration with a 60Co source and verification with TLDs in pulsed radiation fields, an experiment was arranged to measure the BEPC II background online. The results are presented.

  13. 75 FR 42575 - Electronic Signature and Storage of Form I-9, Employment Eligibility Verification

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-07-22

    ... commenters raised concerns about the employers' ability to implement new systems as technology changes and... interim final rule and this final rule do not specify any technology based system, but provide only for a...) (authorizing use of ``reasonable data compression or formatting technologies''). Several commenters...

  14. Analyzing Online Behaviors, Roles, and Learning Communities via Online Discussions

    ERIC Educational Resources Information Center

    Yeh, Yu-Chu

    2010-01-01

    Online learning communities are an important means of sharing and creating knowledge. Online behaviors and online roles can reveal how online learning communities function. However, no study has elucidated the relationships among online behaviors, online roles, and online learning communities. In this study, 32 preservice teachers participated in…

  15. Signatures of topological Josephson junctions

    NASA Astrophysics Data System (ADS)

    Peng, Yang; Pientka, Falko; Berg, Erez; Oreg, Yuval; von Oppen, Felix

    2016-08-01

Quasiparticle poisoning and diabatic transitions may significantly narrow the window for the experimental observation of the 4π-periodic dc Josephson effect predicted for topological Josephson junctions. Here, we show that switching-current measurements provide accessible and robust signatures for topological superconductivity which persist in the presence of quasiparticle poisoning processes. Such measurements provide access to the phase-dependent subgap spectrum and Josephson currents of the topological junction when incorporating it into an asymmetric SQUID together with a conventional Josephson junction with large critical current. We also argue that pump-probe experiments with multiple current pulses can be used to measure the quasiparticle poisoning rates of the topological junction. The proposed signatures are particularly robust, even in the presence of Zeeman fields and spin-orbit coupling, when focusing on short Josephson junctions. Finally, we also consider microwave excitations of short topological Josephson junctions which may complement switching-current measurements.

  16. Polarization signatures of airborne particulates

    NASA Astrophysics Data System (ADS)

    Raman, Prashant; Fuller, Kirk A.; Gregory, Don A.

    2013-07-01

    Exploratory research has been conducted with the aim of completely determining the polarization signatures of selected particulates as a function of wavelength. This may lead to a better understanding of the interaction between electromagnetic radiation and such materials, perhaps leading to the point detection of bio-aerosols present in the atmosphere. To this end, a polarimeter capable of measuring the complete Mueller matrix of highly scattering samples in transmission and reflection (with good spectral resolution from 300 to 1100 nm) has been developed. The polarization properties of Bacillus subtilis (surrogate for anthrax spore) are compared to ambient particulate matter species such as pollen, dust, and soot. Differentiating features in the polarization signatures of these samples have been identified, thus demonstrating the potential applicability of this technique for the detection of bio-aerosol in the ambient atmosphere.

  17. Signatures of a shadow biosphere.

    PubMed

    Davies, Paul C W; Benner, Steven A; Cleland, Carol E; Lineweaver, Charles H; McKay, Christopher P; Wolfe-Simon, Felisa

    2009-03-01

    Astrobiologists are aware that extraterrestrial life might differ from known life, and considerable thought has been given to possible signatures associated with weird forms of life on other planets. So far, however, very little attention has been paid to the possibility that our own planet might also host communities of weird life. If life arises readily in Earth-like conditions, as many astrobiologists contend, then it may well have formed many times on Earth itself, which raises the question whether one or more shadow biospheres have existed in the past or still exist today. In this paper, we discuss possible signatures of weird life and outline some simple strategies for seeking evidence of a shadow biosphere.

  18. KAT-7 Science Verification Highlights

    NASA Astrophysics Data System (ADS)

Lucero, Danielle M.; Carignan, Claude; KAT-7 Science Data Processing Team; KAT-7 Science Commissioning Team

    2015-01-01

    KAT-7 is a pathfinder of the Square Kilometer Array precursor MeerKAT, which is under construction. Its short baselines and low system temperature make it sensitive to large scale, low surface brightness emission. This makes it an ideal instrument to use in searches for faint extended radio emission and low surface density extraplanar gas. We present an update on the progress of several such ongoing KAT-7 science verification projects. These include a large scale radio continuum and polarization survey of the Galactic Center, deep HI observations (100+ hours) of nearby disk galaxies (e.g. NGC253 and NGC3109), and targeted searches for HI tidal tails in galaxy groups (e.g. IC1459). A brief status update for MeerKAT will also be presented if time permits.

  19. MFTF sensor verification computer program

    SciTech Connect

    Chow, H.K.

    1984-11-09

The design, requirements document, and implementation of the MFE Sensor Verification System were accomplished by the Measurement Engineering Section (MES), a group which provides instrumentation for the MFTF magnet diagnostics. The sensors, installed on and around the magnets and solenoids housed in a vacuum chamber, will supply information about the temperature, strain, pressure, liquid-helium level, and magnet voltage to the facility operator for evaluation. As the sensors are installed, records must be maintained as to their initial resistance values. Also, as the work progresses, monthly checks will be made to ensure continued sensor health. Finally, after the MFTF-B demonstration, yearly checks will be performed, as well as checks of sensors as problems develop. The software to acquire and store the data was written by Harry Chow, Computations Department. The acquired data will be transferred to the MFE database computer system.

  20. Verification of NASA Emergent Systems

    NASA Technical Reports Server (NTRS)

    Rouff, Christopher; Vanderbilt, Amy K. C. S.; Truszkowski, Walt; Rash, James; Hinchey, Mike

    2004-01-01

    NASA is studying advanced technologies for a future robotic exploration mission to the asteroid belt. This mission, the prospective ANTS (Autonomous Nano Technology Swarm) mission, will comprise 1,000 autonomous robotic agents designed to cooperate in asteroid exploration. The emergent properties of swarm-type missions make them powerful, but at the same time more difficult to design and to assure that the proper behaviors will emerge. We are currently investigating formal methods and techniques for the verification and validation of future swarm-based missions. The advantage of formal methods is their ability to mathematically assure the behavior of a swarm, emergent or otherwise. The ANTS mission is being used as an example and case study with which to experiment with and test current formal methods on intelligent swarms. Using the ANTS mission, we have evaluated multiple formal methods to determine their effectiveness in modeling and assuring swarm behavior.

  1. Verification of FANTASTIC integrated code

    NASA Technical Reports Server (NTRS)

    Chauhan, Rajinder Singh

    1987-01-01

    FANTASTIC is an acronym for Failure Analysis Nonlinear Thermal and Structural Integrated Code. This program was developed by Failure Analysis Associates, Palo Alto, Calif., for MSFC to improve the accuracy of solid rocket motor nozzle analysis. FANTASTIC has three modules: FACT - thermochemical analysis; FAHT - heat transfer analysis; and FAST - structural analysis. All modules have keywords for data input. Work is in progress on the verification of the FAHT module, which is done by using data for various problems with known solutions as inputs to the FAHT module. The information obtained is used to identify problem areas of the code and is passed on to the developer for debugging. Failure Analysis Associates has revised the first version of the FANTASTIC code, and a new, improved version has been released to the Thermal Systems Branch.

  2. Nonlinear analysis of dynamic signature

    NASA Astrophysics Data System (ADS)

    Rashidi, S.; Fallah, A.; Towhidkhah, F.

    2013-12-01

    A signature is a highly trained motor skill that combines segments such as strokes and loops; it is a physical manifestation of complex motor processes. The general problem is how relative simplicity in behavior emerges from the considerable complexity of the perception-action system that produces behavior within an infinitely variable biomechanical and environmental context. To address this problem, we present evidence that the motor control dynamics of the signing process are chaotic. This chaotic dynamic may explain the rich array of time-series behavior in the motor skill of signing. Nonlinear analysis is a powerful approach and a suitable tool for characterizing dynamical systems through concepts such as fractal dimension and the Lyapunov exponent. Accordingly, time series of position and velocity can be analyzed in both the horizontal and vertical directions. We observed that noninteger values of the correlation dimension indicate low-dimensional deterministic dynamics; this result was confirmed by surrogate data tests. We also used the time series to calculate the largest Lyapunov exponent and obtained a positive value. These results constitute significant evidence that signature data are the outcome of chaos in a nonlinear dynamical system of motor control.
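
    The largest-Lyapunov-exponent computation described above can be illustrated on a system whose exponent is known in closed form. The sketch below is not from the paper; the logistic map stands in for the signature time series. It averages log|f'(x)| along an orbit, and a positive result is the signature of chaos:

```python
import math

def largest_lyapunov_logistic(r=4.0, x0=0.2, n_transient=1000, n_iter=100_000):
    """Estimate the largest Lyapunov exponent of the logistic map
    x_{n+1} = r*x*(1-x) by averaging log|f'(x)| = log|r(1-2x)| along an orbit."""
    x = x0
    for _ in range(n_transient):          # discard the transient
        x = r * x * (1.0 - x)
    total = 0.0
    for _ in range(n_iter):
        total += math.log(abs(r * (1.0 - 2.0 * x)))
        x = r * x * (1.0 - x)
    return total / n_iter

lam = largest_lyapunov_logistic()
print(lam)  # close to ln 2 ~ 0.693 for r = 4: positive, hence chaotic
```

    For measured data such as pen position and velocity, where f'(x) is unknown, algorithms like Rosenstein's instead track the divergence of nearby trajectories in a reconstructed phase space, but the quantity being estimated is the same.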

  3. Formal specification and verification of Ada software

    NASA Technical Reports Server (NTRS)

    Hird, Geoffrey R.

    1991-01-01

    The use of formal methods in software development achieves levels of quality assurance unobtainable by other means. The Larch approach to specification is described, and the specification of avionics software designed to implement the logic of a flight control system is given as an example. Penelope, an Ada verification environment, is also described. The Penelope user inputs mathematical definitions, Larch-style specifications, and Ada code, and performs machine-assisted proofs that the code obeys its specifications. As an example, the verification of a binary search function is considered. Emphasis is given to techniques that assist in reusing a verification effort on modified code.
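
    To make the proof obligations concrete, here is a hedged sketch of the binary search example: in Python rather than Ada, with the precondition and loop invariant stated as runtime assertions instead of being discharged statically, as a tool like Penelope would do:

```python
def binary_search(a, key):
    """Return an index i with a[i] == key, or -1 if key is absent.
    Precondition: a is sorted ascending. The assertions below state the
    Hoare-style obligations a verifier would prove once and for all."""
    assert all(a[i] <= a[i + 1] for i in range(len(a) - 1)), "precondition: sorted"
    lo, hi = 0, len(a) - 1
    while lo <= hi:
        # Loop invariant: if key occurs in a, its index lies in [lo, hi].
        assert all(a[i] != key for i in range(0, lo))
        assert all(a[i] != key for i in range(hi + 1, len(a)))
        mid = (lo + hi) // 2
        if a[mid] == key:
            return mid
        elif a[mid] < key:
            lo = mid + 1   # a[0..mid] < key, so key cannot be to the left
        else:
            hi = mid - 1   # a[mid..] > key, so key cannot be to the right
    return -1              # invariant + empty interval imply key is absent
```

    A static verifier turns each invariant into a verification condition (invariant holds on entry; each branch preserves it; invariant plus exit condition imply the postcondition) and proves them without running the code.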

  4. Liquefied Natural Gas (LNG) dispenser verification device

    NASA Astrophysics Data System (ADS)

    Xiong, Maotao; Yang, Jie-bin; Zhao, Pu-jun; Yu, Bo; Deng, Wan-quan

    2013-01-01

    The working principle and calibration status of LNG (Liquefied Natural Gas) dispensers in China are introduced. To address the shortcomings of the weighing method for calibrating LNG dispensers, an LNG dispenser verification device has been developed. The device is based on the master meter method and verifies LNG dispensers in the field. Experimental results indicate that the device has stable performance, a high accuracy level, and a flexible construction, reaching an internationally advanced level. The verification device should thus promote the development of the LNG dispenser industry in China and improve the technical level of LNG dispenser manufacture.

  5. Design for Verification: Enabling Verification of High Dependability Software-Intensive Systems

    NASA Technical Reports Server (NTRS)

    Mehlitz, Peter C.; Penix, John; Markosian, Lawrence Z.; Koga, Dennis (Technical Monitor)

    2003-01-01

    Strategies to achieve confidence that high-dependability applications are correctly implemented include testing and automated verification. Testing deals mainly with a limited number of expected execution paths. Verification usually attempts to deal with a larger number of possible execution paths. While the impact of architecture design on testing is well known, its impact on most verification methods is not as well understood. The Design for Verification (D4V) approach considers verification from the application development perspective, in which the system architecture is designed explicitly according to the application's key properties. The D4V hypothesis is that the same general architecture and design principles that lead to good modularity, extensibility, and complexity/functionality ratio can be adapted to overcome some of the constraints on verification tools, such as the production of hand-crafted models and the limits on dynamic and static analysis caused by state-space explosion.

  6. EXAMINING THE ROLE AND RESEARCH CHALLENGES OF SOCIAL MEDIA AS A TOOL FOR NONPROLIFERATION AND ARMS CONTROL TREATY VERIFICATION

    SciTech Connect

    Henry, Michael J.; Cramer, Nicholas O.; Benz, Jacob M.; Gastelum, Zoe N.; Kreyling, Sean J.; West, Curtis L.

    2014-05-13

    Traditional arms control treaty verification activities typically involve a combination of technical measurements via physical and chemical sensors, state declarations, political agreements, and on-site inspections involving international subject matter experts. However, the ubiquity of the internet, and the electronic sharing of data that it enables, has made available a wealth of open source information with the potential to benefit verification efforts. Open source information is already being used by organizations such as the International Atomic Energy Agency to support the verification of state-declared information, prepare inspectors for in-field activities, and maintain situational awareness. The recent explosion in social media use has opened new doors to exploring the attitudes, moods, and activities around a given topic. Social media platforms, such as Twitter, Facebook, and YouTube, offer an opportunity for individuals, as well as institutions, to participate in a global conversation at minimal cost. Social media data can also provide a more data-rich environment, with text data being augmented with images, videos, and location data. The research described in this paper investigates the utility of applying social media signatures as potential arms control and nonproliferation treaty verification tools and technologies, as determined through a series of case studies. The treaty-relevant events that these case studies touch upon include detection of undeclared facilities or activities, determination of unknown events recorded by the International Monitoring System (IMS), and the global media response to the occurrence of an Indian missile launch. The case studies examine how social media can be used to fill an information gap and provide additional confidence to a verification activity.
The case studies represent, either directly or through a proxy, instances where social media information may be available that could potentially augment the evaluation

  7. Space Station automated systems testing/verification and the Galileo Orbiter fault protection design/verification

    NASA Technical Reports Server (NTRS)

    Landano, M. R.; Easter, R. W.

    1984-01-01

    Aspects of Space Station automated systems testing and verification are discussed, taking into account several program requirements. It is found that these requirements lead to a number of issues and uncertainties that require study and resolution during the Space Station definition phase. Most, if not all, of the considered uncertainties have implications for the overall testing and verification strategy adopted by the Space Station Program. A description is given of the Galileo Orbiter fault protection design/verification approach. Attention is given to a mission description, an Orbiter description, the design approach and process, the fault protection design verification approach/process, and problems of 'stress' testing.

  8. Brain oscillatory signatures of motor tasks.

    PubMed

    Ramos-Murguialday, Ander; Birbaumer, Niels

    2015-06-01

    Noninvasive brain-computer-interfaces (BCI) coupled with prosthetic devices were recently introduced in the rehabilitation of chronic stroke and other disorders of the motor system. These BCI systems and motor rehabilitation in general involve several motor tasks for training. This study investigates the neurophysiological bases of an EEG-oscillation-driven BCI combined with a neuroprosthetic device to define the specific oscillatory signature of the BCI task. Controlling movements of a hand robotic orthosis with motor imagery of the same movement generates sensorimotor rhythm oscillation changes and involves three elements of tasks also used in stroke motor rehabilitation: passive and active movement, motor imagery, and motor intention. We recorded EEG while nine healthy participants performed five different motor tasks consisting of closing and opening of the hand as follows: 1) motor imagery without any external feedback and without overt hand movement, 2) motor imagery that moves the orthosis proportional to the produced brain oscillation change with online proprioceptive and visual feedback of the hand moving through a neuroprosthetic device (BCI condition), 3) passive and 4) active movement of the hand with feedback (seeing and feeling the hand moving), and 5) rest. During the BCI condition, participants received contingent online feedback of the decrease of power of the sensorimotor rhythm, which induced orthosis movement and therefore proprioceptive and visual information from the moving hand. We analyzed brain activity during the five conditions using time-frequency domain bootstrap-based statistical comparisons and Morlet transforms. Activity during rest was used as a reference. 
Significant contralateral and ipsilateral event-related desynchronization of sensorimotor rhythm was present during all motor tasks, largest in contralateral-postcentral, medio-central, and ipsilateral-precentral areas identifying the ipsilateral precentral cortex as an integral
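
    Event-related desynchronization (ERD) as used here is the percentage drop in band power during a motor task relative to rest. The sketch below uses synthetic signals: the 10 Hz "mu" component, the 250 Hz sampling rate, and the attenuation factor are illustrative assumptions, not values from the study:

```python
import numpy as np

def band_power(x, fs, f_lo, f_hi):
    """Power in the [f_lo, f_hi] Hz band from the one-sided FFT spectrum."""
    freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
    psd = np.abs(np.fft.rfft(x)) ** 2
    band = (freqs >= f_lo) & (freqs <= f_hi)
    return psd[band].sum()

rng = np.random.default_rng(0)
fs = 250
t = np.arange(0, 4, 1 / fs)                    # 4 s of simulated EEG
mu = np.sin(2 * np.pi * 10 * t)                # 10 Hz sensorimotor (mu) rhythm
rest = mu + 0.3 * rng.standard_normal(t.size)
task = 0.4 * mu + 0.3 * rng.standard_normal(t.size)  # attenuated mu during imagery

# ERD: percent change of 8-13 Hz power during the task relative to rest
erd = 100 * (band_power(task, fs, 8, 13) - band_power(rest, fs, 8, 13)) \
          / band_power(rest, fs, 8, 13)
print(erd)  # strongly negative: a power decrease, i.e. desynchronization
```

    In the study the comparison is done per time-frequency bin with Morlet wavelets and bootstrap statistics; this fixed-window FFT version only illustrates the baseline-relative power measure.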

  9. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: STORMWATER TECHNOLOGIES

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techn...

  10. ENVIRONMENTAL TECHNOLOGY VERIFICATION (ETV) PROGRAM: FUEL CELLS

    EPA Science Inventory

    The U.S. Environmental Protection Agency (EPA) Environmental Technology Verification (ETV) Program evaluates the performance of innovative air, water, pollution prevention and monitoring technologies that have the potential to improve human health and the environment. This techno...

  11. 78 FR 58492 - Generator Verification Reliability Standards

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-09-24

    ... Energy Regulatory Commission 18 CFR Part 40 Generator Verification Reliability Standards AGENCY: Federal... approve the following Reliability Standards that were submitted to the Commission for approval by the North American Electric Reliability Corporation, the Commission-certified Electric...

  12. Environmental Technology Verification Program (ETV) Policy Compendium

    EPA Science Inventory

    The Policy Compendium summarizes operational decisions made to date by participants in the U.S. Environmental Protection Agency's (EPA's) Environmental Technology Verification Program (ETV) to encourage consistency among the ETV centers. The policies contained herein evolved fro...

  13. HDM/PASCAL Verification System User's Manual

    NASA Technical Reports Server (NTRS)

    Hare, D.

    1983-01-01

    The HDM/Pascal verification system is a tool for proving the correctness of programs written in PASCAL and specified in the Hierarchical Development Methodology (HDM). This document assumes an understanding of PASCAL, HDM, program verification, and the STP system. The steps toward verification which this tool provides are parsing programs and specifications, checking the static semantics, and generating verification conditions. Some support functions are provided, such as maintaining a database, status management, and editing. The system runs under the TOPS-20 and TENEX operating systems and is written in INTERLISP; however, no knowledge of these operating systems or of INTERLISP is assumed. The system requires three executable files: HDMVCG, PARSE, and STP. Optionally, the editor EMACS should be on the system for the editing functions to work. The file HDMVCG is invoked to run the system. The files PARSE and STP are used as lower forks to perform the functions of parsing and proving.

  14. Engineering drawing field verification program. Revision 3

    SciTech Connect

    Ulk, P.F.

    1994-10-12

    Safe, efficient operation of waste tank farm facilities depends in part upon the availability of accurate, up-to-date plant drawings. Accurate plant drawings are also required in support of facility upgrades and future engineering remediation projects. This supporting document establishes the procedure for performing a visual field verification of engineering drawings, specifying the degree of visual observation performed, and documenting the results. A copy of the drawing attesting to the degree of visual observation will be paginated into the released Engineering Change Notice (ECN) documenting the field verification, for future retrieval and reference. All waste tank farm essential and support drawings within the scope of this program will be converted from manual to computer-aided drafting (CAD) drawings. A permanent reference to the field verification status will be placed along the right border of the CAD-converted drawing, referencing the revision level at which the visual verification was performed and documented.

  15. The PASCAL-HDM Verification System

    NASA Technical Reports Server (NTRS)

    1983-01-01

    The PASCAL-HDM verification system is described. This system supports the mechanical generation of verification conditions from PASCAL programs and HDM-SPECIAL specifications using the Floyd-Hoare axiomatic method. Tools are provided to parse programs and specifications, check their static semantics, generate verification conditions from Hoare rules, and translate the verification conditions appropriately for proof using the Shostak Theorem Prover. The differences between standard PASCAL and the language handled by this system are explained; these consist mostly of restrictions to the standard language definition, the only extensions or modifications being the addition of specifications to the code and the requirement that references to a function of no arguments carry empty parentheses.

  16. MAMA Software Features: Quantification Verification Documentation-1

    SciTech Connect

    Ruggiero, Christy E.; Porter, Reid B.

    2014-05-21

    This document reviews the verification of the basic shape quantification attributes in the MAMA software against hand calculations in order to show that the calculations are implemented mathematically correctly and give the expected quantification results.

  17. Verification of the Calore thermal analysis code.

    SciTech Connect

    Dowding, Kevin J.; Blackwell, Bennie Francis

    2004-07-01

    Calore is the ASC code developed to model steady and transient thermal diffusion with chemistry and dynamic enclosure radiation. An integral part of the software development process is code verification, which addresses the question: 'Are we correctly solving the model equations?' This process aids the developers by identifying potential software bugs, and it gives the thermal analyst confidence that a properly prepared input will produce satisfactory output. Grid refinement studies have been performed on problems for which we have analytical solutions. In this talk, the code verification process is overviewed and recent results are presented. Recent verification studies have focused on transient nonlinear heat conduction and on verifying algorithms associated with (tied) contact and adaptive mesh refinement. In addition, an approach to measuring the coverage of the verification test suite relative to intended code applications is discussed.
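
    The grid-refinement idea behind code verification can be sketched in a few lines: solve a problem with a known analytical solution on two grids and check that the error shrinks at the scheme's formal order. This is a generic illustration on a 1-D Poisson problem, not a Calore calculation:

```python
import numpy as np

def solve_poisson(n):
    """Second-order central differences for -u'' = pi^2 sin(pi x) on [0,1]
    with u(0) = u(1) = 0, whose exact solution is u = sin(pi x).
    Returns the maximum nodal error on a grid of n intervals."""
    h = 1.0 / n
    x = np.linspace(0.0, 1.0, n + 1)
    # Tridiagonal system A u = h^2 f for the interior nodes
    A = (np.diag(2.0 * np.ones(n - 1))
         - np.diag(np.ones(n - 2), 1)
         - np.diag(np.ones(n - 2), -1))
    f = (np.pi ** 2) * np.sin(np.pi * x[1:-1]) * h ** 2
    u = np.linalg.solve(A, f)
    return np.max(np.abs(u - np.sin(np.pi * x[1:-1])))

# Halve the grid spacing and measure the observed order of accuracy
e1, e2 = solve_poisson(32), solve_poisson(64)
p = np.log(e1 / e2) / np.log(2.0)
print(p)  # observed order ~2, matching the scheme's formal second order
```

    If a code change broke the discretization, the observed order would drop below the formal order, which is exactly the signal a grid refinement study is designed to catch.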

  18. 9 CFR 417.8 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... ANALYSIS AND CRITICAL CONTROL POINT (HACCP) SYSTEMS § 417.8 Agency verification. FSIS will verify the... deviation occurs; (d) Reviewing the critical limits; (e) Reviewing other records pertaining to the...

  19. 77 FR 40612 - Notice to All Interested Parties of the Termination of the Receivership of 10375, Signature Bank...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2012-07-10

    ... From the Federal Register Online via the Government Publishing Office FEDERAL DEPOSIT INSURANCE CORPORATION Notice to All Interested Parties of the Termination of the Receivership of 10375, Signature Bank, Windsor, CO Notice is hereby given that the Federal Deposit Insurance Corporation (``FDIC'') as...

  20. Curricular Innovation and Digitisation at a Mega University in the Developing World--The UNISA "Signature Course" Project

    ERIC Educational Resources Information Center

    Baijnath, Narend

    2014-01-01

    As part of the endeavor to reposition itself in the open distance and e-learning arena, the University of South Africa (UNISA) has designed and developed six modular courses (one module per College) referred to as "Signature Courses". The focus of these modules is on a student-centred online teaching and learning approach; extensive…

  1. Automated UF6 Cylinder Enrichment Assay: Status of the Hybrid Enrichment Verification Array (HEVA) Project: POTAS Phase II

    SciTech Connect

    Jordan, David V.; Orton, Christopher R.; Mace, Emily K.; McDonald, Benjamin S.; Kulisek, Jonathan A.; Smith, Leon E.

    2012-06-01

    Pacific Northwest National Laboratory (PNNL) intends to automate the UF6 cylinder nondestructive assay (NDA) verification currently performed by the International Atomic Energy Agency (IAEA) at enrichment plants. PNNL is proposing the installation of a portal monitor at a key measurement point to positively identify each cylinder, measure its mass and enrichment, store the data along with operator inputs in a secure database, and maintain continuity of knowledge on measured cylinders until inspector arrival. This report summarizes the status of the research and development of an enrichment assay methodology supporting the cylinder verification concept. The enrichment assay approach exploits a hybrid of two passively-detected ionizing-radiation signatures: the traditional enrichment meter signature (186-keV photon peak area) and a non-traditional signature, manifested in the high-energy (3 to 8 MeV) gamma-ray continuum, generated by neutron emission from UF6. PNNL has designed, fabricated, and field-tested several prototype assay sensor packages in an effort to demonstrate proof-of-principle for the hybrid assay approach, quantify the expected assay precision for various categories of cylinder contents, and assess the potential for unsupervised deployment of the technology in a portal-monitor form factor. We refer to recent sensor-package prototypes as the Hybrid Enrichment Verification Array (HEVA). The report provides an overview of the assay signatures and summarizes the results of several HEVA field measurement campaigns on populations of Type 30B UF6 cylinders containing low-enriched uranium (LEU), natural uranium (NU), and depleted uranium (DU). Approaches to performance optimization of the assay technique via radiation transport modeling are briefly described, as are spectroscopic and data-analysis algorithms.

  2. The NPARC Alliance Verification and Validation Archive

    NASA Technical Reports Server (NTRS)

    Slater, John W.; Dudek, Julianne C.; Tatum, Kenneth E.

    2000-01-01

    The NPARC Alliance (National Project for Applications oriented Research in CFD) maintains a publicly-available, web-based verification and validation archive as part of the development and support of the WIND CFD code. The verification and validation methods used for the cases attempt to follow the policies and guidelines of the ASME and AIAA. The emphasis is on air-breathing propulsion flow fields with Mach numbers ranging from low-subsonic to hypersonic.

  3. Transmutation Fuel Performance Code Thermal Model Verification

    SciTech Connect

    Gregory K. Miller; Pavel G. Medvedev

    2007-09-01

    The FRAPCON fuel performance code is being modified to model the performance of nuclear fuels of interest to the Global Nuclear Energy Partnership (GNEP). The present report documents the effort to verify the FRAPCON thermal model. It was found that, with minor modifications, the FRAPCON thermal model's temperature calculation agrees with that of the commercial software ABAQUS (Version 6.4-4). This report outlines the methodology of the verification, the code input, and the calculation results.

  4. Dynamic testing for shuttle design verification

    NASA Technical Reports Server (NTRS)

    Green, C. E.; Leadbetter, S. A.; Rheinfurth, M. H.

    1972-01-01

    Space shuttle design verification requires dynamic data from full scale structural component and assembly tests. Wind tunnel and other scaled model tests are also required early in the development program to support the analytical models used in design verification. Presented is a design philosophy based on mathematical modeling of the structural system strongly supported by a comprehensive test program; some of the types of required tests are outlined.

  5. A verification library for multibody simulation software

    NASA Technical Reports Server (NTRS)

    Kim, Sung-Soo; Haug, Edward J.; Frisch, Harold P.

    1989-01-01

    A multibody dynamics verification library that maintains and manages test and validation data is proposed, based on RRC Robot arm and CASE backhoe validation and on a comparative study of DADS, DISCOS, and CONTOPS, which are existing public-domain and commercial multibody dynamics simulation programs. Using simple representative problems, simulation results from each program are cross-checked, and the validation results are presented. Functionalities of the verification library are defined in order to automate the validation procedure.

  6. Specificity in ROS Signaling and Transcript Signatures

    PubMed Central

    Vaahtera, Lauri; Brosché, Mikael; Wrzaczek, Michael

    2014-01-01

    Abstract Significance: Reactive oxygen species (ROS), important signaling molecules in plants, are involved in developmental control and stress adaptation. ROS production can trigger broad transcriptional changes; however, it is not clear how specificity in transcriptional regulation is achieved. Recent Advances: A large collection of public transcriptome data from the model plant Arabidopsis thaliana is available for analysis. These data can be used for the analysis of biological processes that are associated with ROS signaling and for the identification of suitable transcriptional indicators. Several online tools, such as Genevestigator and Expression Angler, have simplified the task to analyze, interpret, and visualize this wealth of data. Critical Issues: The analysis of the exact transcriptional responses to ROS requires the production of specific ROS in distinct subcellular compartments with precise timing, which is experimentally difficult. Analyses are further complicated by the effect of ROS production in one subcellular location on the ROS accumulation in other compartments. In addition, even subtle differences in the method of ROS production or treatment can lead to significantly different outcomes when various stimuli are compared. Future Directions: Due to the difficulty of inducing ROS production specifically with regard to ROS type, subcellular localization, and timing, we propose that the concept of a “ROS marker gene” should be re-evaluated. We suggest guidelines for the analysis of transcriptional data in ROS signaling. The use of “ROS signatures,” which consist of a set of genes that together can show characteristic and indicative responses, should be preferred over the use of individual marker genes. Antioxid. Redox Signal. 21, 1422–1441. PMID:24180661

  7. Physical description of nuclear materials identification system (NMIS) signatures

    NASA Astrophysics Data System (ADS)

    Mihalczo, J. T.; Mullens, J. A.; Mattingly, J. K.; Valentine, T. E.

    2000-08-01

    This paper describes all time and frequency analysis parameters measured with a new correlation processor (capable of up to 1 GHz sampling rates and up to five input data channels) for three input channels: (1) the 252Cf source ionization chamber; (2) a detection channel; and (3) a second detection channel. An intuitive and physical description of the various measured quantities is given, as well as a brief mathematical description and a brief description of how the data are acquired. If the full five-channel capability is used, the measured quantities increase in number but not in type. The parameters provided by this new processor can be divided into two general classes: time analysis signatures and their related frequency analysis signatures. The time analysis signatures include the number of times m pulses occur in a time interval that is triggered randomly, upon a detection event, or upon a source fission event. From the number of pulses in a time interval, the moments, factorial moments, and Feynman variance can be obtained. Recent implementations of third- and fourth-order time and frequency analysis signatures in this processor are also briefly described. Thus, this processor used with a timed source of input neutrons contains all of the information from a pulsed neutron measurement, one- and two-detector Rossi-α measurements, multiplicity measurements, and third- and fourth-order correlation functions. This processor, although originally designed for active measurements with a 252Cf interrogating source, has been successfully used passively (without the 252Cf source) for systems with inherent neutron sources, such as fissile systems of plutonium. Data from active measurements with an 18.75 kg highly enriched uranium (93.2 wt% 235U) metal casting in storage are presented to illustrate some of the various time and frequency analysis parameters. This processor, which is a five-channel time correlation analyzer with time channel widths as
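
    The Feynman variance mentioned above reduces, for gate counts, to the excess variance-to-mean ratio Y = σ²/μ − 1, which vanishes for an uncorrelated (Poisson) source and grows positive when fission chains inject correlated multiplets. A minimal synthetic sketch (simulated gate counts, not NMIS data):

```python
import numpy as np

def feynman_y(counts):
    """Feynman-Y statistic: variance-to-mean ratio of gate counts, minus one.
    Zero for Poisson (uncorrelated) counts; positive when correlated
    multiplets from fission chains widen the count distribution."""
    counts = np.asarray(counts, dtype=float)
    return counts.var() / counts.mean() - 1.0

rng = np.random.default_rng(0)

# Uncorrelated source: independent Poisson counts per gate -> Y near 0
poisson_gates = rng.poisson(lam=5.0, size=20000)
y_poisson = feynman_y(poisson_gates)

# Correlated source: every emission deposits a pair of counts in the same
# gate (a crude stand-in for a fission multiplet) -> Y near 1
burst_gates = 2 * rng.poisson(lam=2.5, size=20000)
y_burst = feynman_y(burst_gates)

print(y_poisson, y_burst)
```

    Both simulated sources have the same mean count rate; only the correlation structure differs, which is precisely what the Feynman variance isolates.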

  8. Forensic handwriting examiners' expertise for signature comparison.

    PubMed

    Sita, Jodi; Found, Bryan; Rogers, Douglas K

    2002-09-01

    This paper reports on the performance of forensic document examiners (FDEs) in a signature comparison task that was designed to address the issue of expertise. The opinions of FDEs regarding 150 genuine and simulated questioned signatures were compared with a control group of non-examiners' opinions. On the question of expertise, results showed that FDEs were statistically better than the control group at accurately determining the genuineness or non-genuineness of questioned signatures. The FDE group made errors (by calling a genuine signature simulated or by calling a simulated signature genuine) in 3.4% of their opinions while 19.3% of the control group's opinions were erroneous. The FDE group gave significantly more inconclusive opinions than the control group. Analysis of FDEs' responses showed that more correct opinions were expressed regarding simulated signatures and more inconclusive opinions were made on genuine signatures. Further, when the complexity of a signature was taken into account, FDEs made more correct opinions on high complexity signatures than on signatures of lower complexity. There was a wide range of skill amongst FDEs and no significant relationship was found between the number of years FDEs had been practicing and their correct, inconclusive and error rates.
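
    The reported error rates (3.4% for FDEs vs. 19.3% for the control group) can be compared with a standard two-proportion z-test. The sketch below uses hypothetical opinion counts: the paper reports rates, and the denominators of 1000 here are illustrative assumptions, not the study's sample sizes:

```python
from math import sqrt, erf

def two_proportion_z(p1, n1, p2, n2):
    """Two-proportion z-test with a pooled standard error.
    Returns (z, one-sided p-value) for the hypothesis that rate p1
    is lower than rate p2."""
    pooled = (p1 * n1 + p2 * n2) / (n1 + n2)
    se = sqrt(pooled * (1 - pooled) * (1 / n1 + 1 / n2))
    z = (p1 - p2) / se
    p_value = 0.5 * (1 + erf(z / sqrt(2)))   # P(Z <= z); tiny when p1 << p2
    return z, p_value

# Hypothetical denominators of 1000 opinions per group (illustrative only)
z, p = two_proportion_z(0.034, 1000, 0.193, 1000)
print(z, p)  # strongly negative z, tiny p: the FDE error rate is lower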

  9. Genetic signatures of heroin addiction.

    PubMed

    Chen, Shaw-Ji; Liao, Ding-Lieh; Shen, Tsu-Wang; Yang, Hsin-Chou; Chen, Kuang-Chi; Chen, Chia-Hsiang

    2016-08-01

    Heroin addiction is a complex psychiatric disorder with a chronic course and a high relapse rate, which results from the interaction between genetic and environmental factors. Heroin addiction has a substantial heritability in its etiology; hence, identification of individuals with a high genetic propensity to heroin addiction may help prevent the occurrence and relapse of heroin addiction and its complications. The study aimed to identify a small set of genetic signatures that may reliably predict the individuals with a high genetic propensity to heroin addiction. We first measured the transcript level of 13 genes (RASA1, PRKCB, PDK1, JUN, CEBPG, CD74, CEBPB, AUTS2, ENO2, IMPDH2, HAT1, MBD1, and RGS3) in lymphoblastoid cell lines in a sample of 124 male heroin addicts and 124 male control subjects using real-time quantitative PCR. Seven genes (PRKCB, PDK1, JUN, CEBPG, CEBPB, ENO2, and HAT1) showed significant differential expression between the 2 groups. Further analysis using 3 statistical methods including logistic regression analysis, support vector machine learning analysis, and a computer software BIASLESS revealed that a set of 4 genes (JUN, CEBPB, PRKCB, ENO2, or CEBPG) could predict the diagnosis of heroin addiction with the accuracy rate around 85% in our dataset. Our findings support the idea that it is possible to identify genetic signatures of heroin addiction using a small set of expressed genes. However, the study can only be considered as a proof-of-concept study. As the establishment of lymphoblastoid cell line is a laborious and lengthy process, it would be more practical in clinical settings to identify genetic signatures for heroin addiction directly from peripheral blood cells in the future study.
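
    A proof-of-concept classifier like the paper's 4-gene panel can be sketched with plain logistic regression. Everything below is synthetic: the simulated expression levels and effect sizes are assumptions, and only the group sizes of 124 echo the study. The point is the fitting procedure, not reproducing the reported ~85% accuracy:

```python
import numpy as np

rng = np.random.default_rng(42)
n = 124  # cases and controls per group, matching the study's sample sizes
# Hypothetical standardized expression levels for a 4-gene panel;
# cases are shifted upward by one (assumed) standard deviation per gene.
controls = rng.normal(0.0, 1.0, size=(n, 4))
cases = rng.normal(1.0, 1.0, size=(n, 4))
X = np.vstack([controls, cases])
y = np.array([0] * n + [1] * n)

# Logistic regression fitted by batch gradient descent (no external libraries)
w, b = np.zeros(4), 0.0
for _ in range(2000):
    prob = 1.0 / (1.0 + np.exp(-(X @ w + b)))   # sigmoid of the linear score
    w -= 0.5 * (X.T @ (prob - y)) / len(y)      # gradient of the log-loss
    b -= 0.5 * np.mean(prob - y)

acc = np.mean(((X @ w + b) > 0) == (y == 1))
print(acc)  # in-sample accuracy of the synthetic 4-gene classifier
```

    A real analysis would, as the paper does, compare several methods (logistic regression, support vector machines) and validate out of sample, since in-sample accuracy on the training data is optimistic.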

  10. Infrared signatures for remote sensing

    SciTech Connect

    McDowell, R.S.; Sharpe, S.W.; Kelly, J.F.

    1994-04-01

    PNL's capabilities for infrared and near-infrared spectroscopy include tunable-diode-laser (TDL) systems covering 300–3,000 cm⁻¹ at <10-MHz bandwidth; a Bruker Fourier-transform infrared (FTIR) spectrometer for the near- to far-infrared at 50-MHz resolution; and a stable, line-tunable, 12-W CW CO₂ laser. PNL also has a beam expansion source with a 12-cm slit, which provides a 3-m effective path for gases at ~10 K, giving a Doppler width of typically 10 MHz; and long-path static gas cells (to 100 m). In applying this equipment to signatures work, the authors emphasize the importance of high spectral resolution for detecting and identifying atmospheric interferences; for identifying the optimum analytical frequencies; for deriving, by spectroscopic analysis, the molecular parameters needed for modeling; and for obtaining data on species and/or bands that are not in existing databases. As an example of such spectroscopy, the authors have assigned and analyzed the C-Cl stretching region of CCl₄ at 770–800 cm⁻¹. This is an important potential signature species whose IR absorption has remained puzzling because of the natural isotopic mix, extensive hot-band structure, and a Fermi resonance involving a nearby combination band. Instrument development projects include the IR sniffer, a small high-sensitivity, high-discrimination (Doppler-limited) device for fence-line or downwind monitoring that is effective even in regions of atmospheric absorption; preliminary work has achieved sensitivities at the low-ppb level. Other work covers trace species detection with TDLs, and FM-modulated CO₂ laser LIDAR. The authors are planning a field experiment to interrogate the Hanford tank farm for signature species from Rattlesnake Mountain, a standoff of ca. 15 km, to be accompanied by simultaneous ground-truthing at the tanks.

  11. Signature of anisotropic bubble collisions

    SciTech Connect

    Salem, Michael P.

    2010-09-15

    Our universe may have formed via bubble nucleation in an eternally inflating background. Furthermore, the background may have a compact dimension--the modulus of which tunnels out of a metastable minimum during bubble nucleation--which subsequently grows to become one of our three large spatial dimensions. When in this scenario our bubble universe collides with other ones like it, the collision geometry is constrained by the reduced symmetry of the tunneling instanton. While the regions affected by such bubble collisions still appear (to leading order) as disks in an observer's sky, the centers of these disks all lie on a single great circle, providing a distinct signature of anisotropic bubble nucleation.

  12. Spectroscopic signature for ferroelectric ice

    NASA Astrophysics Data System (ADS)

    Wójcik, Marek J.; Gług, Maciej; Boczar, Marek; Boda, Łukasz

    2014-09-01

    Various forms of ice exist within our galaxy. A particularly intriguing type, 'ferroelectric ice', was discovered experimentally and is stable at temperatures below 72 K. This form of ice can generate enormous electric fields and may play an important role in planetary formation. In this letter we present Car-Parrinello simulations of the infrared spectra of ferroelectric ice and compare them with the spectra of hexagonal ice. The librational region of the spectra can be treated as a spectroscopic signature of ice XI and can help to identify ferroelectric ice in the Universe.

  13. Satellite signatures in SLR observations

    NASA Technical Reports Server (NTRS)

    Appleby, G. M.

    1993-01-01

    We examine the evidence for the detection of satellite-dependent signatures in the laser range observations obtained by the UK single-photon Satellite Laser Ranging (SLR) system. Models of the expected observation distributions from Ajisai and Lageos are developed from the published satellite spread functions and from the characteristics of the SLR system, and are compared with the observations. The effects of varying return strengths are discussed using the models and by experimental observations of Ajisai, during which a range of return levels from single to multiple photons is achieved. The implications of these results for system-dependent center-of-mass corrections are discussed.

  14. Observational Signatures of Magnetic Reconnection

    NASA Technical Reports Server (NTRS)

    Savage, Sabrina

    2014-01-01

    Magnetic reconnection is often referred to as the primary source of energy release during solar flares. Directly observing reconnection occurring in the solar atmosphere, however, is not trivial, considering that the scale size of the diffusion region is orders of magnitude smaller than the observational capabilities of current instrumentation, and coronal magnetic field measurements are not currently sufficient to capture the process. Therefore, predicting and studying observationally feasible signatures of the precursors and consequences of reconnection is necessary for guiding and verifying the simulations that dominate our understanding. I will present a set of such observations, particularly in connection with long-duration solar events, and compare them with recent simulations and theoretical predictions.

  15. Gut microbiota signatures of longevity.

    PubMed

    Kong, Fanli; Hua, Yutong; Zeng, Bo; Ning, Ruihong; Li, Ying; Zhao, Jiangchao

    2016-09-26

    An aging global population poses substantial challenges to society [1]. Centenarians are a model for healthy aging because they have reached the extreme limit of life by escaping, surviving, or delaying chronic diseases [2]. The genetics of centenarians have been extensively examined [3], but less is known about their gut microbiotas. Recently, Biagi et al. [4] characterized the gut microbiota in Italian centenarians and semi-supercentenarians. Here, we compare the gut microbiota of Chinese long-living people with younger age groups, and with the results from the Italian population [4], to identify gut-microbial signatures of healthy aging.

  16. The Online Underworld.

    ERIC Educational Resources Information Center

    Scrogan, Len

    1988-01-01

    Discusses some of the misuses of telecommunication using school computers, including online piracy, hacking, phreaking, online crime, and destruction boards. Suggests ways that schools can deal with these problems. (TW)

  17. Online Data Collection.

    ERIC Educational Resources Information Center

    Topp, Neal W.; Pawloski, Bob

    2002-01-01

    Describes the eventful history of online data collection and presents a review of current literature, followed by a list of pros and cons to be considered when stepping into online surveying. (Contains 14 references.) (Author/YDS)

  18. National Verification System of National Meteorological Center , China

    NASA Astrophysics Data System (ADS)

    Zhang, Jinyan; Wei, Qing; Qi, Dan

    2016-04-01

    The Product Quality Verification Division for official weather forecasting of China was founded in April 2011. It is affiliated with the Forecast System Laboratory (FSL), National Meteorological Center (NMC), China. The division has three employees; I am one of them and am in charge of it. After five years of construction, an integrated real-time National Verification System of NMC, China has been established. At present, its primary roles are: 1) to verify the official weather forecasting quality of NMC, China; 2) to verify the official city weather forecasting quality of the Provincial Meteorological Bureaus; and 3) to evaluate the forecasting quality of each forecaster in NMC, China. To verify the official weather forecasting quality of NMC, China, we have developed:
    • Grid QPF verification module (including upscaling)
    • Grid temperature, humidity and wind forecast verification module
    • Severe convective weather forecast verification module
    • Typhoon forecast verification module
    • Disaster forecast verification module
    • Disaster warning verification module
    • Medium- and extended-period forecast verification module
    • Objective elements forecast verification module
    • Ensemble precipitation probabilistic forecast verification module
    To verify the official city weather forecasting quality of the Provincial Meteorological Bureaus, we have developed:
    • City elements forecast verification module
    • Public heavy rain forecast verification module
    • City air quality forecast verification module
    To evaluate the forecasting quality of each forecaster in NMC, China, we have developed:
    • Off-duty forecaster QPF practice evaluation module
    • QPF evaluation module for forecasters
    • Severe convective weather forecast evaluation module
    • Typhoon track forecast evaluation module for forecasters
    • Disaster warning evaluation module for forecasters
    • Medium- and extended-period forecast evaluation module
    The further
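    As a small illustration of the kind of score such a QPF verification module computes, the sketch below evaluates the standard threat score (critical success index) over a gridded forecast/observation pair. The grids and the 1 mm threshold are invented; the abstract does not specify which metrics the NMC system actually uses.

```python
import numpy as np

def threat_score(forecast, observed, threshold=1.0):
    """Threat score (CSI) = hits / (hits + misses + false alarms)
    for exceedance of `threshold` on matching grids."""
    f = forecast >= threshold
    o = observed >= threshold
    hits = np.sum(f & o)
    misses = np.sum(~f & o)
    false_alarms = np.sum(f & ~o)
    denom = hits + misses + false_alarms
    return hits / denom if denom else np.nan

# Tiny invented 2x2 precipitation grids (mm).
fcst = np.array([[0.0, 2.0], [3.0, 0.5]])
obs = np.array([[0.0, 1.5], [0.2, 0.8]])
print(threat_score(fcst, obs))  # 1 hit, 0 misses, 1 false alarm -> 0.5
```

    Upscaled grid verification, as mentioned for the QPF module, would apply the same score after averaging both grids to a coarser resolution.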

  19. Quantum broadcasting multiple blind signature with constant size

    NASA Astrophysics Data System (ADS)

    Xiao, Min; Li, Zhenli

    2016-09-01

    Using quantum homomorphic signatures in a quantum network, we propose a quantum broadcasting multiple blind signature scheme. Unlike classical signatures and existing quantum signature schemes, the multi-signature proposed in our scheme is not generated by simply putting the individual signatures together, but by aggregating the individual signatures based on the homomorphic property. Therefore, the size of the multi-signature is constant. Furthermore, based on a wide-ranging investigation of the security of existing quantum signature protocols, our protocol is designed to resist possible forgery attacks against signature and message from various attack sources, as well as disavowal attacks from participants.

  20. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Niton XLt 700 Series (XLt) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XLt analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XLt analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy
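    As an illustration of the accuracy and precision figures such an evaluation reports, the sketch below computes recovery against a certified concentration and the relative standard deviation (RSD) over replicate readings. The readings and certified value are invented, not data from the SITE demonstration.

```python
import statistics

# Invented replicate XRF readings for one element (ppm) and the certified
# reference value they are compared against.
readings_ppm = [198.0, 205.0, 201.0, 199.0]
certified_ppm = 200.0

mean = statistics.mean(readings_ppm)
recovery_pct = 100.0 * mean / certified_ppm           # accuracy measure
rsd_pct = 100.0 * statistics.stdev(readings_ppm) / mean  # precision measure

print(f"recovery: {recovery_pct:.1f}%  RSD: {rsd_pct:.1f}%")
```

    Recovery near 100% indicates good accuracy against the certified value, while a low RSD indicates good replicate precision; matrix effects would show up as systematic recovery shifts across sample types.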

  1. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Elvatech, Ltd. ElvaX (ElvaX) x-ray fluorescence (XRF) analyzer distributed in the United States by Xcalibur XRF Services (Xcalibur), was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ElvaX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ElvaX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as s

  2. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Oxford ED2000 x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ED2000 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ED2000 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by com

  3. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rigaku ZSX Mini II (ZSX Mini II) XRF Services x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the ZSX Mini II analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the ZSX Mini II analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element con

  4. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Innov-X XT400 Series (XT400) x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the XT400 analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the XT400 analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was as

  5. INNOVATIVE TECHNOLOGY VERIFICATION REPORT XRF ...

    EPA Pesticide Factsheets

    The Rontec PicoTAX x-ray fluorescence (XRF) analyzer was demonstrated under the U.S. Environmental Protection Agency (EPA) Superfund Innovative Technology Evaluation (SITE) Program. The field portion of the demonstration was conducted in January 2005 at the Kennedy Athletic, Recreational and Social Park (KARS) at Kennedy Space Center on Merritt Island, Florida. The demonstration was designed to collect reliable performance and cost data for the PicoTAX analyzer and seven other commercially available XRF instruments for measuring trace elements in soil and sediment. The performance and cost data were evaluated to document the relative performance of each XRF instrument. This innovative technology verification report describes the objectives and the results of that evaluation and serves to verify the performance and cost of the PicoTAX analyzer. Separate reports have been prepared for the other XRF instruments that were evaluated as part of the demonstration. The objectives of the evaluation included determining each XRF instrument’s accuracy, precision, sample throughput, and tendency for matrix effects. To fulfill these objectives, the field demonstration incorporated the analysis of 326 prepared samples of soil and sediment that contained 13 target elements. The prepared samples included blends of environmental samples from nine different sample collection sites as well as spiked samples with certified element concentrations. Accuracy was assessed by c

  6. Learning Assumptions for Compositional Verification

    NASA Technical Reports Server (NTRS)

    Cobleigh, Jamieson M.; Giannakopoulou, Dimitra; Pasareanu, Corina; Clancy, Daniel (Technical Monitor)

    2002-01-01

    Compositional verification is a promising approach to addressing the state explosion problem associated with model checking. One compositional technique advocates proving properties of a system by checking properties of its components in an assume-guarantee style. However, the application of this technique is difficult because it involves non-trivial human input. This paper presents a novel framework for performing assume-guarantee reasoning in an incremental and fully automated fashion. To check a component against a property, our approach generates assumptions that the environment needs to satisfy for the property to hold. These assumptions are then discharged on the rest of the system. Assumptions are computed by a learning algorithm. They are initially approximate, but become gradually more precise by means of counterexamples obtained by model checking the component and its environment, alternately. This iterative process may at any stage conclude that the property is either true or false in the system. We have implemented our approach in the LTSA tool and applied it to the analysis of a NASA system.
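    The assume-guarantee rule the paper automates can be illustrated with a toy sketch in which components are finite sets of action traces: to show that M1 composed with M2 satisfies P, find an assumption A such that M1 restricted to A satisfies P, and M2 satisfies A. The actions, property, and candidate assumption below are invented; the paper's actual L*-based assumption learning and LTSA implementation are not reproduced.

```python
# Components modeled as finite sets of action traces (invented names).
M1 = {("req", "ack"), ("req", "err")}  # component under analysis
M2 = {("req", "ack")}                  # its real environment never errs

def satisfies(component, prop):
    """A component satisfies a property if every trace does."""
    return all(prop(trace) for trace in component)

P = lambda trace: "err" not in trace   # the property: no error action

# Candidate assumption about the environment: only the "req, ack" behavior.
A = lambda trace: trace == ("req", "ack")

M1_under_A = {t for t in M1 if A(t)}   # restrict M1 to A-consistent traces
premise1 = satisfies(M1_under_A, P)    # M1 || A |= P
premise2 = satisfies(M2, A)            # M2 |= A
print(premise1 and premise2)           # rule concludes M1 || M2 |= P
```

    The paper's contribution is generating A automatically: a learning algorithm proposes candidate assumptions, and counterexamples from model checking either premise refine the candidate until the rule applies or the property is shown false.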

  7. Visual inspection for CTBT verification

    SciTech Connect

    Hawkins, W.; Wohletz, K.

    1997-03-01

    On-site visual inspection will play an essential role in future Comprehensive Test Ban Treaty (CTBT) verification. Although seismic and remote sensing techniques are the best understood and most developed methods for detection of evasive testing of nuclear weapons, visual inspection can greatly augment the certainty and detail of understanding provided by these more traditional methods. Not only can visual inspection offer "ground truth" in cases of suspected nuclear testing, but it also can provide accurate source location and testing media properties necessary for detailed analysis of seismic records. For testing in violation of the CTBT, an offending party may attempt to conceal the test, which most likely will be achieved by underground burial. While such concealment may not prevent seismic detection, evidence of test deployment, location, and yield can be disguised. In this light, if a suspicious event is detected by seismic or other remote methods, visual inspection of the event area is necessary to document any evidence that might support a claim of nuclear testing and provide data needed to further interpret seismic records and guide further investigations. However, the methods for visual inspection are neither widely known nor appreciated, and experience is presently limited. Visual inspection can be achieved by simple, non-intrusive means, primarily geological in nature, and it is the purpose of this report to describe the considerations, procedures, and equipment required to field such an inspection.

  8. ALMA Band 5 Science Verification

    NASA Astrophysics Data System (ADS)

    Humphreys, L.; Biggs, A.; Immer, K.; Laing, R.; Liu, H. B.; Marconi, G.; Mroczkowski, T.; Testi, L.; Yagoubov, P.

    2017-03-01

    ALMA Band 5 (163–211 GHz) was recently commissioned and Science Verification (SV) observations were obtained in the latter half of 2016. A primary scientific focus of this band is the H₂O line at 183.3 GHz, which can be observed around 15% of the time when the precipitable water vapour is sufficiently low (< 0.5 mm). Many more lines are covered in Band 5 and can be observed for over 70% of the time on Chajnantor, requiring similar restrictions to those for ALMA Bands 4 and 6. Examples include the H₂¹⁸O line at 203 GHz, some of the bright (3–2) lines of singly and doubly deuterated forms of formaldehyde, and the (2–1) lines of HCO⁺, HCN, HNC, N₂H⁺ and several of their isotopologues. A young star-forming region near the centre of the Milky Way, an evolved star also in our Galaxy, and a nearby ultraluminous infrared galaxy (ULIRG) were observed as part of the SV process and the data are briefly described. The reduced data, along with imaged data products, are now public and demonstrate the power of ALMA for high-resolution studies of H₂O and other molecules in a variety of astronomical targets.

  9. Online Education Interim Report

    ERIC Educational Resources Information Center

    Protopsaltis, Spiros, Ed.

    2007-01-01

    This interim report produced by the Colorado State Board of Education Online Education Task Force examines key issues related to online education. Task force members agree that: (1) online education has become a viable element of Colorado's public education system; (2) the role of technology in educating our children will continue to grow; and (3)…

  10. Library Online Systems.

    ERIC Educational Resources Information Center

    Folda, Linda; And Others

    1989-01-01

    Issues related to library online systems are discussed in six articles. Topics covered include staff education through vendor demonstrations, evaluation of online public access catalogs, the impact of integrated online systems on cataloging operations, the merits of smart and dumb barcodes, and points to consider in planning for the next online…

  11. Online Learning. Symposium.

    ERIC Educational Resources Information Center

    2002

    This document contains three papers from a symposium on online learning that was conducted as part of a conference on human resource development (HRD). "An Instructional Strategy Framework for Online Learning Environments" (Scott D. Johnson, Steven R. Aragon) discusses the pitfalls of modeling online courses after traditional instruction…

  12. Assessing Online Learning

    ERIC Educational Resources Information Center

    Comeaux, Patricia, Ed.

    2004-01-01

    Students in traditional as well as online classrooms need more than grades from their instructors--they also need meaningful feedback to help bridge their academic knowledge and skills with their daily lives. With the increasing number of online learning classrooms, the question of how to consistently assess online learning has become increasingly…

  13. Effective Online Teachers

    ERIC Educational Resources Information Center

    Muirhead, Brent

    2006-01-01

    Effective online teaching is a popular topic in today's educational technology journals due to the vital role that educators play in the teaching and learning process. The author will provide insights into effective online teachers and highlight training and mentoring practices for online instructors at the University of Phoenix.

  14. Developing Online Doctoral Programmes

    ERIC Educational Resources Information Center

    Chipere, Ngoni

    2015-01-01

    The objectives of the study were to identify best practices in online doctoral programming and to synthesise these practices into a framework for developing online doctoral programmes. The field of online doctoral studies is nascent and presents challenges for conventional forms of literature review. The literature was therefore reviewed using a…

  15. Implementing Online Physical Education

    ERIC Educational Resources Information Center

    Mohnsen, Bonnie

    2012-01-01

    Online physical education, although seemingly an oxymoron, appears to be the wave of the future at least for some students. The purpose of this article is to explore research and options for online learning in physical education and to examine a curriculum, assessment, and instructional model for online learning. The article examines how physical…

  16. Reflectors as Online Extraverts?

    ERIC Educational Resources Information Center

    Downing, Kevin; Chim, Tat Mei

    2004-01-01

    Increasingly, online learning is perceived as an effective method of instruction. Much recent educational research has focused on examining the purposes and situations for which online education is best suited. In this paper, students enrolled in two online courses are compared with their peers enrolled in equivalent classroom-based courses to…

  17. Online Training in Australia

    ERIC Educational Resources Information Center

    Kuzic, Joze

    2013-01-01

    On-line training is becoming an interesting phenomenon in Australia and has attracted a lot of interest across many industries and businesses (Chan and Ngai, 2007). The research reported here looks at the use of online training in corporations in Australia. It focuses on two aspects of online training, the factors that "warrant" its…

  18. Use of motor current signature analysis at the EPRI M D Center

    SciTech Connect

    Haynes, H.D.; Kryter, R.C.; Stewart, B.K.

    1990-01-01

    Motor current signature analysis (MCSA), a machinery monitoring technology developed by the Oak Ridge National Laboratory (ORNL), has been used to monitor a variety of electric-motor-driven devices at the Philadelphia Electric Company's Eddystone Generating Station as part of a program conducted by the EPRI Monitoring and Diagnostics Center. The purpose of this project is to demonstrate the ability of MCSA to monitor the occurrence of degradation in aging power plant equipment. An important aspect of the work has been the development and demonstration of an on-line, automated motor current data acquisition system for monitoring the performance of eight motor-operated valves (MOVs) located in the Unit 2 turbine steam extraction system. Improvements continue to be made in the on-line monitoring system, including development of an automated data analysis program that significantly reduces the time required to extract diagnostic information from the MOV motor current signatures. Using portable MCSA equipment, motor current data were acquired for additional MOVs, pumps, fans, compressors, and mills. These tests have provided important baseline signatures against which subsequent test data may be compared. This paper provides descriptions of the tested equipment, the MCSA techniques employed, samples of test data acquired from both the on-line and portable data acquisition systems, and plans for future work in this ongoing effort. 8 refs., 22 figs., 3 tabs.
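    The core idea of MCSA, that mechanical degradation modulates the motor current and shows up as sideband peaks around the supply frequency in the current spectrum, can be sketched as follows. The 60 Hz supply, 5 Hz modulation, and amplitudes below are invented for illustration and do not correspond to ORNL's instrumentation or data.

```python
import numpy as np

fs = 1000.0                      # sample rate (Hz)
t = np.arange(0, 2.0, 1.0 / fs)  # 2 s of simulated motor current

# Supply-frequency carrier plus small fault-induced sidebands at 60 +/- 5 Hz.
supply = np.sin(2 * np.pi * 60 * t)
fault = 0.05 * np.sin(2 * np.pi * 55 * t) + 0.05 * np.sin(2 * np.pi * 65 * t)
current = supply + fault

# Amplitude spectrum of the current signal.
spectrum = np.abs(np.fft.rfft(current)) / len(t)
freqs = np.fft.rfftfreq(len(t), 1.0 / fs)

def amplitude_at(f_hz):
    """Spectrum amplitude at the bin nearest f_hz."""
    return spectrum[np.argmin(np.abs(freqs - f_hz))]

# The sidebands stand out clearly against the empty spectrum nearby.
print(amplitude_at(55.0), amplitude_at(60.0), amplitude_at(65.0))
```

    In practice the diagnostic information lies in trending such sideband amplitudes over time against baseline signatures, which is what the automated analysis program described above speeds up.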

  19. Visual signatures in video visualization.

    PubMed

    Chen, Min; Botchen, Ralf P; Hashim, Rudy R; Weiskopf, Daniel; Ertl, Thomas; Thornton, Ian M

    2006-01-01

    Video visualization is a computation process that extracts meaningful information from original video data sets and conveys the extracted information to users in appropriate visual representations. This paper presents a broad treatment of the subject, following a typical research pipeline involving concept formulation, system development, a path-finding user study, and a field trial with real application data. In particular, we have conducted a fundamental study on the visualization of motion events in videos. We have, for the first time, deployed flow visualization techniques in video visualization. We have compared the effectiveness of different abstract visual representations of videos. We have conducted a user study to examine whether users are able to learn to recognize visual signatures of motions, and to assist in the evaluation of different visualization techniques. We have applied our understanding and the developed techniques to a set of application video clips. Our study has demonstrated that video visualization is both technically feasible and cost-effective. It has provided the first set of evidence confirming that ordinary users can be accustomed to the visual features depicted in video visualizations, and can learn to recognize visual signatures of a variety of motion events.

  20. Towards Trustable Digital Evidence with PKIDEV: PKI Based Digital Evidence Verification Model

    NASA Astrophysics Data System (ADS)

    Uzunay, Yusuf; Incebacak, Davut; Bicakci, Kemal

    How can digital evidence be captured and preserved securely? For the investigation and prosecution of criminal activities that involve computers, digital evidence collected at the crime scene has vital importance. On the one hand, it is a very challenging task for forensics professionals to collect it without any loss or damage. On the other, there is the second problem of ensuring its integrity and authenticity in order to achieve legal acceptance in a court of law. By conceiving of digital evidence simply as one instance of digital data, it is evident that modern cryptography offers elegant solutions to this second problem. However, to our knowledge, there is no previous work proposing a systematic model with a holistic view that addresses all the related security problems in this particular case of digital evidence verification. In this paper, we present PKIDEV (Public Key Infrastructure based Digital Evidence Verification model) as an integrated solution to provide security for the process of capturing and preserving digital evidence. PKIDEV employs, inter alia, cryptographic techniques like digital signatures and secure time-stamping as well as technologies such as GPS and EDGE. In our study, we also identify the problems public-key cryptography brings when it is applied to the verification of digital evidence.
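    The integrity and authenticity step that such a model addresses can be sketched as hashing the evidence and binding the hash to a time-stamp under a key. A real PKI deployment would use certificate-based digital signatures and a trusted time-stamping authority rather than a shared key; the HMAC seal, key, timestamp, and byte strings below are stand-ins invented for illustration.

```python
import hashlib
import hmac
import json

def seal_evidence(evidence: bytes, timestamp: str, key: bytes) -> dict:
    """Hash the evidence and bind the hash to a timestamp under `key`."""
    digest = hashlib.sha256(evidence).hexdigest()
    record = {"sha256": digest, "timestamp": timestamp}
    payload = json.dumps(record, sort_keys=True).encode()
    record["seal"] = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return record

def verify_evidence(evidence: bytes, record: dict, key: bytes) -> bool:
    """Check both the seal over the record and the evidence hash itself."""
    payload = json.dumps({"sha256": record["sha256"],
                          "timestamp": record["timestamp"]},
                         sort_keys=True).encode()
    expected = hmac.new(key, payload, hashlib.sha256).hexdigest()
    return (hmac.compare_digest(record["seal"], expected)
            and hashlib.sha256(evidence).hexdigest() == record["sha256"])

key = b"examiner-key"  # stand-in for a private signing key held by the examiner
rec = seal_evidence(b"disk image bytes", "2008-01-01T00:00:00Z", key)
print(verify_evidence(b"disk image bytes", rec, key))      # intact evidence
print(verify_evidence(b"tampered image bytes", rec, key))  # tampering detected
```

    Swapping the HMAC for an asymmetric signature lets anyone holding the public certificate verify the record without being able to forge new seals, which is the property a court-admissible chain of custody needs.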

  1. Secure Obfuscation for Encrypted Group Signatures

    PubMed Central

    Fan, Hongfei; Liu, Qin

    2015-01-01

    In recent years, group signature techniques have been widely used in constructing privacy-preserving security schemes for various information systems. However, conventional techniques keep the schemes secure only in normal black-box attack contexts. In other words, these schemes assume that (the implementation of) the group signature generation algorithm is running on a platform that is perfectly protected from various intrusions and attacks. As a complement to existing studies, this paper studies how to generate group signatures securely in a more austere security context, such as a white-box attack context. We use obfuscation as an approach to acquire a higher level of security. Concretely, we introduce a special group signature functionality, an encrypted group signature, and then provide an obfuscator for the proposed functionality. A series of new security notions for both the functionality and its obfuscator is introduced. The most important one is the average-case secure virtual black-box property w.r.t. dependent oracles and restricted dependent oracles, which captures the requirement of protecting the output of the proposed obfuscator against collusion attacks by group members. The security notions also fit many other specialized obfuscators, such as obfuscators for identity-based signatures, threshold signatures and key-insulated signatures. Finally, the correctness and security of the proposed obfuscator are proven. Thereby, the obfuscated encrypted group signature functionality can be applied to variants of privacy-preserving security schemes and enhance the security level of these schemes. PMID:26167686

  2. Verification of Functional Fault Models and the Use of Resource Efficient Verification Tools

    NASA Technical Reports Server (NTRS)

    Bis, Rachael; Maul, William A.

    2015-01-01

    Functional fault models (FFMs) are a directed graph representation of the failure effect propagation paths within a system's physical architecture and are used to support development and real-time diagnostics of complex systems. Verification of these models is required to confirm that the FFMs are correctly built and accurately represent the underlying physical system. However, a manual, comprehensive verification process applied to the FFMs was found to be error prone due to the intensive and customized process necessary to verify each individual component model and to require a burdensome level of resources. To address this problem, automated verification tools have been developed and utilized to mitigate these key pitfalls. This paper discusses the verification of the FFMs and presents the tools that were developed to make the verification process more efficient and effective.

  3. Monitoring and verification R&D

    SciTech Connect

    Pilat, Joseph F; Budlong - Sylvester, Kory W; Fearey, Bryan L

    2011-01-01

    The 2010 Nuclear Posture Review (NPR) report outlined the Administration's approach to promoting the agenda put forward by President Obama in Prague on April 5, 2009. The NPR calls for a national monitoring and verification R&D program to meet future challenges arising from the Administration's nonproliferation, arms control and disarmament agenda. Verification of a follow-on to New START could have to address warheads and possibly components along with delivery capabilities. Deeper cuts and disarmament would need to address all of these elements along with nuclear weapon testing, nuclear material and weapon production facilities, virtual capabilities from old weapon and existing energy programs and undeclared capabilities. We only know how to address some elements of these challenges today, and the requirements may be more rigorous in the context of deeper cuts as well as disarmament. Moreover, there is a critical need for multiple options for sensitive problems and for addressing other challenges. There will be other verification challenges in a world of deeper cuts and disarmament, some of which we are already facing. At some point, if the reductions process is progressing, uncertainties about past nuclear materials and weapons production will have to be addressed. IAEA safeguards will need to continue to evolve to meet current and future challenges, and to take advantage of new technologies and approaches. Transparency/verification of nuclear and dual-use exports will also have to be addressed, and there will be a need to make nonproliferation measures more watertight and transparent. In this context, and recognizing we will face all of these challenges even if disarmament is not achieved, this paper will explore possible agreements and arrangements; verification challenges; gaps in monitoring and verification technologies and approaches; and the R&D required to address these gaps and other monitoring and verification challenges.

  4. On Direct Verification of Warped Hierarchy-and-FlavorModels

    SciTech Connect

    Davoudiasl, Hooman; Rizzo, Thomas G.; Soni, Amarjit; /Brookhaven

    2007-10-15

    We consider direct experimental verification of warped models, based on the Randall-Sundrum (RS) scenario, that explain gauge and flavor hierarchies, assuming that the gauge fields and fermions of the Standard Model (SM) propagate in the 5D bulk. Most studies have focused on the bosonic Kaluza-Klein (KK) signatures and indicate that discovering gauge KK modes is likely possible, yet challenging, while graviton KK modes are unlikely to be accessible at the LHC, even with a luminosity upgrade. We show that direct evidence for bulk SM fermions, i.e. their KK modes, is likely also beyond the reach of a luminosity-upgraded LHC. Thus, neither the spin-2 KK graviton, the most distinct RS signal, nor the KK SM fermions, direct evidence for bulk flavor, seem to be within the reach of the LHC. We then consider hadron colliders with √s = 21, 28, and 60 TeV. We find that discovering the first KK modes of SM fermions and the graviton typically requires the Next Hadron Collider (NHC) with √s ≈ 60 TeV and O(1) ab⁻¹ of integrated luminosity. If the LHC yields hints of these warped models, establishing that Nature is described by them, or their 4D CFT duals, requires an NHC-class machine in the post-LHC experimental program.

  5. Direct verification of warped hierarchy-and-flavor models

    SciTech Connect

    Davoudiasl, Hooman; Soni, Amarjit; Rizzo, Thomas G.

    2008-02-01

    We consider direct experimental verification of warped models, based on the Randall-Sundrum (RS) scenario, that explain gauge and flavor hierarchies, assuming that the gauge fields and fermions of the standard model (SM) propagate in the 5D bulk. Most studies have focused on the bosonic Kaluza-Klein (KK) signatures and indicate that discovering gauge KK modes is likely possible, yet challenging, while graviton KK modes are unlikely to be accessible at the CERN LHC, even with a luminosity upgrade. We show that direct evidence for bulk SM fermions, i.e. their KK modes, is likely also beyond the reach of a luminosity-upgraded LHC. Thus, neither the spin-2 KK graviton, the most distinct RS signal, nor the KK SM fermions, direct evidence for bulk flavor, seem to be within the reach of the LHC. We then consider hadron colliders with √s = 21, 28, and 60 TeV. We find that discovering the first KK modes of SM fermions and the graviton typically requires the Next Hadron Collider (NHC) with √s ≈ 60 TeV and O(1) ab⁻¹ of integrated luminosity. If the LHC yields hints of these warped models, establishing that nature is described by them, or their 4D conformal field theory duals, requires an NHC-class machine in the post-LHC experimental program.

  6. INF verification: a guide for the perplexed

    SciTech Connect

    Mendelsohn, J.

    1987-09-01

    The administration has dug itself some deep holes on the verification issue. It will have to conclude an arms control treaty without having resolved earlier (but highly questionable) compliance issues on which it has placed great emphasis. It will probably have to abandon its more sweeping (and unnecessary) on-site inspection (OSI) proposals because of adverse security and political implications for the United States and its allies. And, finally, it will probably have to present to the Congress an INF treaty that will provide for a considerably less-stringent (but nonetheless adequate) verification regime than it had originally demanded. It is difficult to dispel the impression that, when the likelihood of concluding an INF treaty seemed remote, the administration indulged its penchant for intrusive and non-negotiable verification measures. As the possibility of, and eagerness for, a treaty increased, and as the Soviet Union shifted its policy from one of resistance to OSI to one of indicating that on-site verification involved reciprocal obligations, the administration was forced to scale back its OSI rhetoric. This re-evaluation of OSI by the administration does not make the INF treaty any less verifiable; from the outset the Reagan administration was asking for a far more extensive verification package than was necessary, practicable, acceptable, or negotiable.

  7. Neighborhood Repulsed Metric Learning for Kinship Verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2013-07-16

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without kinship relations) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with kinship relations) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Lastly, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.

  8. Neighborhood repulsed metric learning for kinship verification.

    PubMed

    Lu, Jiwen; Zhou, Xiuzhuang; Tan, Yap-Pen; Shang, Yuanyuan; Zhou, Jie

    2014-02-01

    Kinship verification from facial images is an interesting and challenging problem in computer vision, and there are very limited attempts to tackle this problem in the literature. In this paper, we propose a new neighborhood repulsed metric learning (NRML) method for kinship verification. Motivated by the fact that interclass samples (without a kinship relation) with higher similarity usually lie in a neighborhood and are more easily misclassified than those with lower similarity, we aim to learn a distance metric under which the intraclass samples (with a kinship relation) are pulled as close as possible and interclass samples lying in a neighborhood are repulsed and pushed away as far as possible, simultaneously, such that more discriminative information can be exploited for verification. To make better use of multiple feature descriptors to extract complementary information, we further propose a multiview NRML (MNRML) method to seek a common distance metric to perform multiple feature fusion to improve the kinship verification performance. Experimental results are presented to demonstrate the efficacy of our proposed methods. Finally, we also test human ability in kinship verification from facial images and our experimental results show that our methods are comparable to that of human observers.
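The pull-together/push-apart idea behind metric learning of this kind can be sketched with a toy gradient procedure. The function names, the simple squared-distance objective, and the PSD projection step below are illustrative assumptions, not the authors' NRML formulation.

```python
import numpy as np

def nrml_metric(intra_pairs, inter_pairs, dim, lr=0.01, epochs=100):
    """Toy neighborhood-repulsed-style metric learning.

    Learns a PSD matrix M so that d_M(x, y) = (x-y)^T M (x-y) shrinks for
    intraclass (kin) pairs and grows for interclass (non-kin) neighbor pairs.
    """
    M = np.eye(dim)
    for _ in range(epochs):
        grad = np.zeros((dim, dim))
        for x, y in intra_pairs:          # pull kin pairs together
            d = (x - y)[:, None]
            grad += d @ d.T
        for x, y in inter_pairs:          # repulse non-kin neighbors
            d = (x - y)[:, None]
            grad -= d @ d.T
        M -= lr * grad
        # Project back onto the PSD cone so d_M stays a valid (pseudo)metric.
        w, V = np.linalg.eigh(M)
        M = V @ np.diag(np.clip(w, 0, None)) @ V.T
    return M

def metric_dist(M, x, y):
    d = x - y
    return float(d @ M @ d)
```

After training on pairs whose kin/non-kin differences lie along different feature directions, the learned metric assigns larger distances to the non-kin pairs than to the kin pairs.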

  9. Hybrid Deep Learning for Face Verification.

    PubMed

    Sun, Yi; Wang, Xiaogang; Tang, Xiaoou

    2016-10-01

    This paper proposes a hybrid convolutional network (ConvNet)-Restricted Boltzmann Machine (RBM) model for face verification. A key contribution of this work is to learn high-level relational visual features with rich identity similarity information. The deep ConvNets in our model start by extracting local relational visual features from two face images in comparison, which are further processed through multiple layers to extract high-level and global relational features. To keep enough discriminative information, we use the last hidden layer neuron activations of the ConvNet as features for face verification instead of those of the output layer. To characterize face similarities from different aspects, we concatenate the features extracted from different face region pairs by different deep ConvNets. The resulting high-dimensional relational features are classified by an RBM for face verification. After pre-training each ConvNet and the RBM separately, the entire hybrid network is jointly optimized to further improve the accuracy. Various aspects of the ConvNet structures, relational features, and face verification classifiers are investigated. Our model achieves the state-of-the-art face verification performance on the challenging LFW dataset under both the unrestricted protocol and the setting when outside data is allowed to be used for training.
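The final stage of the pipeline described above, per-region relational features concatenated and fed to one classifier, can be sketched minimally. This is a sketch under stated assumptions: a single logistic unit stands in for the paper's RBM, and all names and values are hypothetical.

```python
import numpy as np

def verify_pair(region_features, w, b):
    """Concatenate relational features extracted from different face-region
    pairs and score the combined vector with a logistic unit (standing in
    for the RBM classifier in the paper). Returns (same_identity, score)."""
    x = np.concatenate(region_features)
    score = 1.0 / (1.0 + np.exp(-(w @ x + b)))
    return bool(score > 0.5), float(score)
```

In the actual model the weights come from joint fine-tuning of the whole hybrid network; here they are simply given.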

  10. Signature spectrale des grains interstellaires.

    NASA Astrophysics Data System (ADS)

    Léger, A.

    Our knowledge of the nature of interstellar grains rested on a very small number of spectral signatures in the extinction curve of the interstellar medium. Considerable information is contained in the 40 diffuse interstellar bands in the visible, but it remains unexploited. The recent interpretation of the five IR emission bands in terms of polycyclic aromatic hydrocarbon molecules is developed here. It permits the use of spectroscopic information comparable, on its own, to everything on which our knowledge of condensed interstellar matter had been based until now. Various implications of this finding are proposed.

  11. GHG MITIGATION TECHNOLOGY PERFORMANCE EVALUATIONS UNDERWAY AT THE GHG TECHNOLOGY VERIFICATION CENTER

    EPA Science Inventory

    The paper outlines the verification approach and activities of the Greenhouse Gas (GHG) Technology Verification Center, one of 12 independent verification entities operating under the U.S. EPA-sponsored Environmental Technology Verification (ETV) program. (NOTE: The ETV program...

  12. Metabolic Signatures of Bacterial Vaginosis

    PubMed Central

    Morgan, Martin T.; Fiedler, Tina L.; Djukovic, Danijel; Hoffman, Noah G.; Raftery, Daniel; Marrazzo, Jeanne M.

    2015-01-01

    Bacterial vaginosis (BV) is characterized by shifts in the vaginal microbiota from Lactobacillus dominant to a microbiota with diverse anaerobic bacteria. Few studies have linked specific metabolites with bacteria found in the human vagina. Here, we report dramatic differences in metabolite compositions and concentrations associated with BV using a global metabolomics approach. We further validated important metabolites using samples from a second cohort of women and a different platform to measure metabolites. In the primary study, we compared metabolite profiles in cervicovaginal lavage fluid from 40 women with BV and 20 women without BV. Vaginal bacterial representation was determined using broad-range PCR with pyrosequencing and concentrations of bacteria by quantitative PCR. We detected 279 named biochemicals; levels of 62% of metabolites were significantly different in women with BV. Unsupervised clustering of metabolites separated women with and without BV. Women with BV have metabolite profiles marked by lower concentrations of amino acids and dipeptides, concomitant with higher levels of amino acid catabolites and polyamines. Higher levels of the signaling eicosanoid 12-hydroxyeicosatetraenoic acid (12-HETE), a biomarker for inflammation, were noted in BV. Lactobacillus crispatus and Lactobacillus jensenii exhibited similar metabolite correlation patterns, which were distinct from correlation patterns exhibited by BV-associated bacteria. Several metabolites were significantly associated with clinical signs and symptoms (Amsel criteria) used to diagnose BV, and no metabolite was associated with all four clinical criteria. BV has strong metabolic signatures across multiple metabolic pathways, and these signatures are associated with the presence and concentrations of particular bacteria. PMID:25873373

  13. Irma multisensor predictive signature model

    NASA Astrophysics Data System (ADS)

    Watson, John S.; Flynn, David S.; Wellfare, Michael R.; Richards, Mike; Prestwood, Lee

    1995-06-01

    The Irma synthetic signature model was one of the first high resolution synthetic infrared (IR) target and background signature models to be developed for tactical air-to-surface weapon scenarios. Originally developed in 1980 by the Armament Directorate of the Air Force Wright Laboratory (WL/MN), the Irma model was used exclusively to generate IR scenes for smart weapons research and development. In 1988, a number of significant upgrades to Irma were initiated including the addition of a laser channel. This two channel version, Irma 3.0, was released to the user community in 1990. In 1992, an improved scene generator was incorporated into the Irma model which supported correlated frame-to-frame imagery. This and other improvements were released in Irma 2.2. Recently, Irma 3.2, a passive IR/millimeter wave (MMW) code, was completed. Currently, upgrades are underway to include an active MMW channel. Designated Irma 4.0, this code will serve as a cornerstone of sensor fusion research in the laboratory from 6.1 concept development to 6.3 technology demonstration programs for precision guided munitions. Several significant milestones have been reached in this development process and are demonstrated. The Irma 4.0 software design has been developed and interim results are available. Irma is being developed to facilitate multi-sensor smart weapons research and development. It is currently in distribution to over 80 agencies within the U.S. Air Force, U.S. Army, U.S. Navy, ARPA, NASA, Department of Transportation, academia, and industry.

  14. A Nucleotide Signature for the Identification of Angelicae Sinensis Radix (Danggui) and Its Products

    PubMed Central

    Wang, Xiaoyue; Liu, Yang; Wang, Lili; Han, Jianping; Chen, Shilin

    2016-01-01

    It is very difficult to identify Angelicae sinensis radix (Danggui) when it is processed into Chinese patent medicines. The proposed internal transcribed spacer 2 (ITS2) is not sufficient to resolve heavily processed materials. Therefore, a short barcode for the identification of processed materials is urgently needed. In this study, 265 samples of Angelicae sinensis radix and adulterants were collected. The ITS2 region was sequenced, and based on one single nucleotide polymorphism (SNP) site unique to Angelica sinensis, a nucleotide signature consisting of 37 bp (5′-aatccgcgtc atcttagtga gctcaaggac ccttagg-3′) was developed. It is highly conserved and specific within Angelica sinensis while divergent among other species. We then designed primers (DG01F/DG01R) to amplify the nucleotide signature region from processed materials. Fifteen samples procured online were analysed; by searching for the signature, we found that seven of them were counterfeits. Twenty-eight batches of Chinese patent medicines containing Danggui were amplified; nineteen of them were found to contain the signature, and adulterants such as Ligusticum sinense, Notopterygium incisum, Angelica decursiva and Angelica gigas were detected in the other batches. Thus, this nucleotide signature, at only 37 bp, will broaden the application of DNA barcoding to identify the components in decoctions, Chinese patent medicines and other products with degraded DNA. PMID:27713564
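Screening amplified sequences for a short signature like this reduces to a strand-aware substring search. The 37-bp string below is taken from the abstract (spacing removed); the function names are illustrative, not the authors' tooling.

```python
# The 37-bp Danggui signature reported in the abstract, whitespace removed.
SIGNATURE = "aatccgcgtcatcttagtgagctcaaggacccttagg"

def revcomp(seq: str) -> str:
    """Reverse complement of a DNA sequence (a, c, g, t only)."""
    comp = {"a": "t", "t": "a", "g": "c", "c": "g"}
    return "".join(comp[b] for b in reversed(seq.lower()))

def contains_signature(read: str) -> bool:
    """True if a sequencing read carries the signature on either strand."""
    read = read.lower()
    return SIGNATURE in read or revcomp(SIGNATURE) in read
```

Real identification would also tolerate sequencing errors (e.g. via alignment), which an exact substring match does not.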

  15. Online Monitoring of Plant Assets in the Nuclear Industry

    SciTech Connect

    Nancy Lybeck; Vivek Agarwal; Binh Pham; Richard Rusaw; Randy Bickford

    2013-10-01

    Today’s online monitoring technologies provide opportunities to perform predictive and proactive health management of assets within many different industries, in particular the defense and aerospace industries. The nuclear industry can leverage these technologies to enhance safety, productivity, and reliability of the aging fleet of existing nuclear power plants. The U.S. Department of Energy’s Light Water Reactor Sustainability Program is collaborating with the Electric Power Research Institute’s (EPRI’s) Long-Term Operations program to implement online monitoring in existing nuclear power plants. Proactive online monitoring in the nuclear industry is being explored using EPRI’s Fleet-Wide Prognostic and Health Management (FW-PHM) Suite software, a set of web-based diagnostic and prognostic tools and databases that serves as an integrated health monitoring architecture. This paper focuses on development of asset fault signatures used to assess the health status of generator step-up transformers and emergency diesel generators in nuclear power plants. Asset fault signatures describe the distinctive features based on technical examinations that can be used to detect a specific fault type. Fault signatures are developed based on the results of detailed technical research and on the knowledge and experience of technical experts. The Diagnostic Advisor of the FW-PHM Suite software matches developed fault signatures with operational data to provide early identification of critical faults and troubleshooting advice that could be used to distinguish between faults with similar symptoms. This research is important as it will support the automation of predictive online monitoring techniques in nuclear power plants to diagnose incipient faults, perform proactive maintenance, and estimate the remaining useful life of assets.
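The matching of developed fault signatures against operational data can be illustrated with a toy ranking routine. The fault names, symptom sets, and Jaccard scoring below are invented for illustration; they are a stand-in for, not a description of, the FW-PHM Diagnostic Advisor.

```python
# Hypothetical fault-signature library: each fault type maps to the set of
# distinctive symptoms (examination results) that characterize it.
FAULT_SIGNATURES = {
    "transformer_winding_fault": {"high_dissolved_gas", "hot_spot", "acoustic_pd"},
    "transformer_bushing_fault": {"high_power_factor", "hot_spot"},
    "edg_fuel_injector_fault": {"low_cylinder_pressure", "exhaust_temp_deviation"},
}

def rank_faults(observed, signatures=FAULT_SIGNATURES):
    """Rank candidate faults by Jaccard overlap between the observed symptom
    set and each stored signature, helping distinguish faults with similar
    symptoms (illustrative Diagnostic-Advisor-style matching)."""
    scores = {
        fault: len(observed & sig) / len(observed | sig)
        for fault, sig in signatures.items()
    }
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)
```

With observed symptoms {"hot_spot", "high_dissolved_gas"}, the winding fault outranks the bushing fault because it explains both symptoms rather than one.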

  16. Rhythmic TMS Causes Local Entrainment of Natural Oscillatory Signatures

    PubMed Central

    Thut, Gregor; Veniero, Domenica; Romei, Vincenzo; Miniussi, Carlo; Schyns, Philippe; Gross, Joachim

    2011-01-01

    Background: Neuronal elements underlying perception, cognition, and action exhibit distinct oscillatory phenomena, measured in humans by electro- or magnetoencephalography (EEG/MEG). So far, the correlative or causal nature of the link between brain oscillations and functions has remained elusive. A compelling demonstration of causality would primarily generate oscillatory signatures that are known to correlate with particular cognitive functions and then assess the behavioral consequences. Here, we provide the first direct evidence for causal entrainment of brain oscillations by transcranial magnetic stimulation (TMS) using concurrent EEG. Results: We used rhythmic TMS bursts to directly interact with an MEG-identified parietal α-oscillator, activated by attention and linked to perception. With TMS bursts tuned to its preferred α-frequency (α-TMS), we confirmed the three main predictions of entrainment of a natural oscillator: (1) that α-oscillations are induced during α-TMS (reproducing an oscillatory signature of the stimulated parietal cortex), (2) that there is progressive enhancement of this α-activity (synchronizing the targeted α-generator to the α-TMS train), and (3) that this depends on the pre-TMS phase of the background α-rhythm (entrainment of natural, ongoing α-oscillations). Control conditions testing different TMS burst profiles and TMS-EEG in a phantom head confirmed specificity of α-boosting to the case of synchronization between TMS train and neural oscillator. Conclusions: The periodic electromagnetic force that is generated during rhythmic TMS can cause local entrainment of natural brain oscillations, emulating oscillatory signatures activated by cognitive tasks. This reveals a new mechanism of online TMS action on brain activity and can account for frequency-specific behavioral TMS effects at the level of biologically relevant rhythms. PMID:21723129

  17. The Pedagogic Signature of the Teaching Profession

    ERIC Educational Resources Information Center

    Kiel, Ewald; Lerche, Thomas; Kollmannsberger, Markus; Oubaid, Viktor; Weiss, Sabine

    2016-01-01

    Lee S. Shulman deplores that the field of education as a profession does not have a pedagogic signature, which he characterizes as a synthesis of cognitive, practical and moral apprenticeship. In this context, the following study has three goals: 1) In the first theoretical part, the basic problems of constructing a pedagogic signature are…

  18. 21 CFR 11.50 - Signature manifestations.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES GENERAL ELECTRONIC RECORDS; ELECTRONIC SIGNATURES Electronic Records § 11.50 Signature manifestations. (a) Signed electronic... the same controls as for electronic records and shall be included as part of any human readable...

  19. 48 CFR 4.102 - Contractor's signature.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 48 Federal Acquisition Regulations System 1 2010-10-01 2010-10-01 false Contractor's signature. 4.102 Section 4.102 Federal Acquisition Regulations System FEDERAL ACQUISITION REGULATION GENERAL ADMINISTRATIVE MATTERS Contract Execution 4.102 Contractor's signature. (a) Individuals. A contract with...

  20. A Real Quantum Designated Verifier Signature Scheme

    NASA Astrophysics Data System (ADS)

    Shi, Wei-Min; Zhou, Yi-Hua; Yang, Yu-Guang

    2015-09-01

    The effectiveness of most quantum signature schemes reported in the literature can be verified by a designated person; however, those schemes are not true designated verifier signature schemes in the traditional sense, because the designated person lacks the capability to efficiently simulate a signature indistinguishable from the signer's, a requirement in special environments such as e-voting, calls for tenders and software licensing. To solve this problem, a real quantum designated verifier signature scheme is proposed in this paper. Based on the properties of unitary transformations and a quantum one-way function, only a verifier designated by the signer can verify the validity of a signature, and the designated verifier cannot prove to a third party that the signature was produced by the signer or by himself, thanks to a transcript simulation algorithm. Moreover, quantum key distribution and a quantum encryption algorithm guarantee the unconditional security of the scheme. Analysis results show that this new scheme satisfies the main security requirements of designated verifier signature schemes and resists the major attack strategies.

  1. Does Social Work Have a Signature Pedagogy?

    ERIC Educational Resources Information Center

    Earls Larrison, Tara; Korr, Wynne S.

    2013-01-01

    This article contributes to discourse on signature pedagogy by reconceptualizing how our pedagogies are understood and defined for social work education. We critique the view that field education is social work's signature pedagogy and consider what pedagogies are distinct about the teaching and learning of social work. Using Shulman's…

  2. 5 CFR 850.106 - Electronic signatures.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... to any provisions prescribed by the Director under § 850.104— (1) An electronic communication may be... signature of an electronic communication may be deemed to satisfy any statutory or regulatory requirement... section, an electronic signature is a method of signing an electronic communication, including...

  3. PBL Verification with Radiosonde and Aircraft Data

    NASA Astrophysics Data System (ADS)

    Tsidulko, M.; McQueen, J.; Dimego, G.; Ek, M.

    2008-12-01

    Boundary layer depth is an important characteristic in weather forecasting and a key parameter in air quality modeling, determining the extent of turbulence and dispersion for pollutants. Real-time PBL depths from the NAM (WRF/NMM) model are verified against different types of observations. PBL depth verification is incorporated into the NCEP verification system, including the ability to provide a range of statistical characteristics for the boundary layer heights. For the model, several types of boundary layer definitions are used: PBL height from the TKE scheme and the critical Ri number approach, as well as mixed layer depth, are compared with observations. Observed PBL depths are determined by applying the Ri number approach to radiosonde profiles. A preliminary study of using ACARS data for PBL verification is also conducted.
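The Ri number approach applied to a radiosonde profile can be sketched as follows: compute the bulk Richardson number at each level relative to the surface and take the lowest level where it exceeds a critical value. This is a minimal sketch assuming dry potential temperature and a common critical value of 0.25; operational definitions (virtual temperature, interpolation between levels) vary.

```python
import numpy as np

G = 9.81        # gravitational acceleration, m s^-2
RI_CRIT = 0.25  # a commonly used critical bulk Richardson number

def pbl_height(z, theta, u, v):
    """Estimate PBL depth from a sounding as the lowest level where the
    bulk Richardson number
        Ri_b(z) = (g / theta_s) * (theta(z) - theta_s) * (z - z_s)
                  / (u(z)^2 + v(z)^2)
    exceeds RI_CRIT. Inputs are 1-D arrays from the surface upward."""
    ri = G * (theta - theta[0]) * (z - z[0]) / (theta[0] * (u**2 + v**2 + 1e-6))
    above = np.nonzero(ri > RI_CRIT)[0]
    return float(z[above[0]]) if above.size else float(z[-1])
```

For a well-mixed layer capped by an inversion, Ri_b stays near zero through the mixed layer and jumps above the critical value at the inversion, which is where the diagnosed depth lands.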

  4. Active alignment/contact verification system

    DOEpatents

    Greenbaum, William M.

    2000-01-01

    A system involving an active (i.e., electrical) technique for the verification of: 1) close-tolerance mechanical alignment between two components, and 2) electrical contact between mating parts through an elastomeric interface. For example, the two components may be an alumina carrier and a printed circuit board, two extremely small, high-density mating parts that require alignment within a fraction of a mil, as well as a specified interface point of engagement between the parts. The system comprises pairs of conductive structures defined in the surface layers of the alumina carrier and the printed circuit board, for example. The first pair of conductive structures relates to item (1) above and permits alignment verification between mating parts. The second pair of conductive structures relates to item (2) above and permits verification of electrical contact between mating parts.

  5. Packaged low-level waste verification system

    SciTech Connect

    Tuite, K.T.; Winberg, M.; Flores, A.Y.; Killian, E.W.; McIsaac, C.V.

    1996-08-01

    Currently, states and low-level radioactive waste (LLW) disposal site operators have no method of independently verifying the radionuclide content of packaged LLW that arrives at disposal sites for disposal. At this time, disposal sites rely on LLW generator shipping manifests and accompanying records to ensure that LLW received meets the waste acceptance criteria. An independent verification system would provide a method of checking generator LLW characterization methods and help ensure that LLW disposed of at disposal facilities meets requirements. The Mobile Low-Level Waste Verification System (MLLWVS) provides the equipment, software, and methods to enable the independent verification of LLW shipping records to ensure that disposal site waste acceptance criteria are being met. The MLLWVS system was developed under a cost-share subcontract between WMG, Inc., and Lockheed Martin Idaho Technologies through the Department of Energy's National Low-Level Waste Management Program at the Idaho National Engineering Laboratory (INEL).

  6. Land Ice Verification and Validation Kit

    SciTech Connect

    2015-07-15

    To address a pressing need to better understand the behavior and complex interaction of ice sheets within the global Earth system, significant development of continental-scale, dynamical ice-sheet models is underway. The associated verification and validation process of these models is being coordinated through a new, robust, Python-based extensible software package, the Land Ice Verification and Validation toolkit (LIVV). This release provides robust and automated verification and a performance evaluation on LCF platforms. The performance V&V involves a comprehensive comparison of model performance relative to expected behavior on a given computing platform. LIVV operates on a set of benchmark and test data, and provides comparisons for a suite of community-prioritized tests, including configuration and parameter variations, bit-for-bit evaluation, and plots of tests where differences occur.

  7. Critical Surface Cleaning and Verification Alternatives

    NASA Technical Reports Server (NTRS)

    Melton, Donald M.; McCool, A. (Technical Monitor)

    2000-01-01

    As a result of federal and state requirements, historical critical cleaning and verification solvents such as Freon 113, Freon TMC, and Trichloroethylene (TCE) are either highly regulated or no longer available. Interim replacements such as HCFC 225 have been qualified; however, toxicity and future phase-out regulations necessitate long-term solutions. The scope of this project was to qualify a safe and environmentally compliant LOX surface verification alternative to Freon 113, TCE and HCFC 225. The main effort was focused on initiating the evaluation and qualification of HCFC 225G as an alternate LOX verification solvent. The project was scoped in FY 99/00 to perform LOX compatibility, cleaning efficiency and qualification on flight hardware.

  8. Applying Causal Discovery to the Output of Climate Models - What Can We Learn from the Causal Signatures?

    NASA Astrophysics Data System (ADS)

    Ebert-Uphoff, I.; Hammerling, D.; Samarasinghe, S.; Baker, A. H.

    2015-12-01

    The framework of causal discovery provides algorithms that seek to identify potential cause-effect relationships from observational data. The output of such algorithms is a graph structure that indicates the potential causal connections between the observed variables. Originally developed for applications in the social sciences and economics, causal discovery has been used with great success in bioinformatics and, most recently, in climate science, primarily to identify interaction patterns between compound climate variables and to track pathways of interactions between different locations around the globe. Here we apply causal discovery to the output data of climate models to learn so-called causal signatures from the data that indicate interactions between the different atmospheric variables. These causal signatures can act like fingerprints for the underlying dynamics and thus serve a variety of diagnostic purposes. We study the use of the causal signatures for three applications: 1) For climate model software verification we suggest to use causal signatures as a means of detecting statistical differences between model runs, thus identifying potential errors and supplementing the Community Earth System Model Ensemble Consistency Testing (CESM-ECT) tool recently developed at NCAR for CESM verification. 2) In the context of data compression of model runs, we will test how much the causal signatures of the model outputs change after different compression algorithms have been applied. This may result in additional means to determine which type and amount of compression is acceptable. 3) This is the first study applying causal discovery simultaneously to a large number of different atmospheric variables, and in the process of studying the resulting interaction patterns for the two aforementioned applications, we expect to gain some new insights into their relationships from this approach. We will present first results obtained for Applications 1 and 2 above.
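    Full causal discovery (e.g. PC-style algorithms) is beyond a short sketch, but the idea of comparing graph "signatures" between two model runs can be illustrated with a crude partial-correlation stand-in. Everything below (threshold, synthetic data, variable count) is a hypothetical simplification, far simpler than the methods the abstract describes:

```python
import numpy as np

def partial_corr_graph(data, thresh=0.2):
    """Crude stand-in for causal discovery: connect variables whose partial
    correlation (derived from the inverse covariance) exceeds a threshold."""
    prec = np.linalg.pinv(np.cov(data, rowvar=False))
    d = np.sqrt(np.diag(prec))
    pcorr = -prec / np.outer(d, d)
    np.fill_diagonal(pcorr, 0.0)
    return np.abs(pcorr) > thresh

def signature_distance(g1, g2):
    """Edge-wise disagreement between two graph 'signatures'."""
    return int(np.sum(g1 != g2) // 2)

rng = np.random.default_rng(3)
run_a = rng.normal(size=(500, 4))
run_a[:, 1] += run_a[:, 0]                           # variable 0 drives variable 1
run_b = run_a + 1e-6 * rng.normal(size=run_a.shape)  # near-identical second run
print(signature_distance(partial_corr_graph(run_a), partial_corr_graph(run_b)))
```

Two consistent runs should produce identical graphs (distance 0), while a run perturbed by, say, lossy compression could flip edges and raise the distance.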

  9. Signatures of mutational processes in human cancer

    PubMed Central

    Alexandrov, Ludmil B.; Nik-Zainal, Serena; Wedge, David C.; Aparicio, Samuel A.J.R.; Behjati, Sam; Biankin, Andrew V.; Bignell, Graham R.; Bolli, Niccolo; Borg, Ake; Børresen-Dale, Anne-Lise; Boyault, Sandrine; Burkhardt, Birgit; Butler, Adam P.; Caldas, Carlos; Davies, Helen R.; Desmedt, Christine; Eils, Roland; Eyfjörd, Jórunn Erla; Foekens, John A.; Greaves, Mel; Hosoda, Fumie; Hutter, Barbara; Ilicic, Tomislav; Imbeaud, Sandrine; Imielinsk, Marcin; Jäger, Natalie; Jones, David T.W.; Jones, David; Knappskog, Stian; Kool, Marcel; Lakhani, Sunil R.; López-Otín, Carlos; Martin, Sancha; Munshi, Nikhil C.; Nakamura, Hiromi; Northcott, Paul A.; Pajic, Marina; Papaemmanuil, Elli; Paradiso, Angelo; Pearson, John V.; Puente, Xose S.; Raine, Keiran; Ramakrishna, Manasa; Richardson, Andrea L.; Richter, Julia; Rosenstiel, Philip; Schlesner, Matthias; Schumacher, Ton N.; Span, Paul N.; Teague, Jon W.; Totoki, Yasushi; Tutt, Andrew N.J.; Valdés-Mas, Rafael; van Buuren, Marit M.; van ’t Veer, Laura; Vincent-Salomon, Anne; Waddell, Nicola; Yates, Lucy R.; Zucman-Rossi, Jessica; Futreal, P. Andrew; McDermott, Ultan; Lichter, Peter; Meyerson, Matthew; Grimmond, Sean M.; Siebert, Reiner; Campo, Elías; Shibata, Tatsuhiro; Pfister, Stefan M.; Campbell, Peter J.; Stratton, Michael R.

    2013-01-01

    All cancers are caused by somatic mutations. However, understanding of the biological processes generating these mutations is limited. The catalogue of somatic mutations from a cancer genome bears the signatures of the mutational processes that have been operative. Here, we analysed 4,938,362 mutations from 7,042 cancers and extracted more than 20 distinct mutational signatures. Some are present in many cancer types, notably a signature attributed to the APOBEC family of cytidine deaminases, whereas others are confined to a single class. Certain signatures are associated with age of the patient at cancer diagnosis, known mutagenic exposures or defects in DNA maintenance, but many are of cryptic origin. In addition to these genome-wide mutational signatures, hypermutation localized to small genomic regions, kataegis, is found in many cancer types. The results reveal the diversity of mutational processes underlying the development of cancer with potential implications for understanding of cancer etiology, prevention and therapy. PMID:23945592
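    Signature extraction of this kind is commonly formulated as non-negative matrix factorization of a mutation-count matrix into signatures and per-genome exposures. A toy sketch using scikit-learn's NMF on synthetic counts; the dimensions and data are illustrative only (the actual analysis covered 4.9 million mutations across 7,042 cancers):

```python
import numpy as np
from sklearn.decomposition import NMF

rng = np.random.default_rng(2)
# Toy mutation-count matrix: 96 trinucleotide mutation classes x 40 genomes,
# synthesised as a mixture of 3 hidden mutational "processes".
true_sigs = rng.dirichlet(np.ones(96), size=3).T   # 96 x 3 signature profiles
exposures = rng.gamma(2.0, 50.0, size=(3, 40))     # 3 x 40 process activities
counts = rng.poisson(true_sigs @ exposures)

# Factorise counts ~= signatures @ activities under non-negativity constraints.
model = NMF(n_components=3, init="nndsvda", max_iter=500)
signatures = model.fit_transform(counts)           # 96 x 3
activities = model.components_                     # 3 x 40

# Each recovered column is a mutational signature: a distribution over the
# 96 mutation classes once columns are normalised.
signatures /= signatures.sum(axis=0, keepdims=True)
```

Choosing the number of components (here fixed at 3) is itself a hard model-selection problem in real signature analyses.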

  10. Real time gamma-ray signature identifier

    DOEpatents

    Rowland, Mark [Alamo, CA; Gosnell, Tom B [Moraga, CA; Ham, Cheryl [Livermore, CA; Perkins, Dwight [Livermore, CA; Wong, James [Dublin, CA

    2012-05-15

    A real time gamma-ray signature/source identification method and system using principal components analysis (PCA) for transforming and substantially reducing one or more comprehensive spectral libraries of nuclear materials types and configurations into a corresponding concise representation/signature(s) representing and indexing each individual predetermined spectrum in principal component (PC) space, wherein an unknown gamma-ray signature may be compared against the representative signature to find a match or at least characterize the unknown signature from among all the entries in the library with a single regression or simple projection into the PC space, so as to substantially reduce processing time and computing resources and enable real-time characterization and/or identification.
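    The pipeline described above can be approximated in a few lines: fit PCA on a spectral library, project an unknown spectrum into PC space, and take the nearest library entry. The library, dimensions, and nearest-neighbor matching rule below are illustrative assumptions, not the patent's actual implementation:

```python
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
# Hypothetical spectral library: 50 reference gamma-ray spectra, 256 channels each.
library = rng.random((50, 256))

# Compress the library into a concise principal-component representation.
pca = PCA(n_components=10)
library_pc = pca.fit_transform(library)

def identify(spectrum):
    """Project an unknown spectrum into PC space; return the nearest library index."""
    query_pc = pca.transform(spectrum.reshape(1, -1))
    distances = np.linalg.norm(library_pc - query_pc, axis=1)
    return int(np.argmin(distances))

# A noisy copy of library entry 7 should still match entry 7.
unknown = library[7] + 0.001 * rng.random(256)
print(identify(unknown))
```

The speed-up the patent claims comes from this same structure: one projection plus a comparison in a low-dimensional space, instead of matching against every full-resolution spectrum.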

  11. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2017-01-01

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this paper, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical kinship verification via representation learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU kinship database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields the state-of-the-art kinship verification accuracy on the WVU kinship database and on four existing benchmark data sets. Furthermore, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.
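    The fusion step mentioned at the end (product of likelihood ratios) can be sketched as follows. The Gaussian score models are hypothetical stand-ins for distributions that would be estimated from training data, not the paper's fitted values:

```python
from scipy.stats import norm

# Hypothetical genuine/impostor score distributions for two modalities,
# as would be estimated from a labelled training set.
face_gen, face_imp = norm(0.8, 0.1), norm(0.4, 0.15)
kin_gen, kin_imp = norm(0.7, 0.12), norm(0.45, 0.15)

def fused_likelihood_ratio(face_score, kin_score):
    """Product-of-likelihood-ratios fusion: multiply the per-modality LRs."""
    lr_face = face_gen.pdf(face_score) / face_imp.pdf(face_score)
    lr_kin = kin_gen.pdf(kin_score) / kin_imp.pdf(kin_score)
    return lr_face * lr_kin

# Accept when the fused ratio exceeds a threshold (here 1.0).
print(fused_likelihood_ratio(0.82, 0.72) > 1.0)  # high scores: accept
print(fused_likelihood_ratio(0.40, 0.45) > 1.0)  # low scores: reject
```

The product form assumes the two modality scores are conditionally independent given the identity hypothesis, which is why a discriminative combiner (the paper's SVM variant) is a common alternative.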

  12. Hierarchical Representation Learning for Kinship Verification.

    PubMed

    Kohli, Naman; Vatsa, Mayank; Singh, Richa; Noore, Afzel; Majumdar, Angshul

    2016-09-14

    Kinship verification has a number of applications such as organizing large collections of images and recognizing resemblances among humans. In this research, first, a human study is conducted to understand the capabilities of the human mind and to identify the discriminatory areas of a face that facilitate kinship-cues. The visual stimuli presented to the participants determine their ability to recognize kin relationships using the whole face as well as specific facial regions. The effect of participant gender and age and kin-relation pair of the stimulus is analyzed using quantitative measures such as accuracy, discriminability index d', and perceptual information entropy. Utilizing the information obtained from the human study, a hierarchical Kinship Verification via Representation Learning (KVRL) framework is utilized to learn the representation of different face regions in an unsupervised manner. We propose a novel approach for feature representation termed as filtered contractive deep belief networks (fcDBN). The proposed feature representation encodes relational information present in images using filters and contractive regularization penalty. A compact representation of facial images of kin is extracted as an output from the learned model and a multi-layer neural network is utilized to verify the kin accurately. A new WVU Kinship Database is created, which consists of multiple images per subject to facilitate kinship verification. The results show that the proposed deep learning framework (KVRL-fcDBN) yields state-of-the-art kinship verification accuracy on the WVU Kinship database and on four existing benchmark datasets. Further, kinship information is used as a soft biometric modality to boost the performance of face verification via product of likelihood ratio and support vector machine based approaches. Using the proposed KVRL-fcDBN framework, an improvement of over 20% is observed in the performance of face verification.

  13. 37 CFR 380.6 - Verification of royalty payments.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... TRANSMISSIONS, NEW SUBSCRIPTION SERVICES AND THE MAKING OF EPHEMERAL REPRODUCTIONS § 380.6 Verification of... purpose of the audit. The Collective shall retain the report of the verification for a period of not...

  14. Generic Protocol for the Verification of Ballast Water Treatment Technology

    EPA Science Inventory

    In anticipation of the need to address performance verification and subsequent approval of new and innovative ballast water treatment technologies for shipboard installation, the U.S Coast Guard and the Environmental Protection Agency's Environmental Technology Verification Progr...

  15. Jet Propulsion Laboratory Environmental Verification Processes and Test Effectiveness

    NASA Technical Reports Server (NTRS)

    Hoffman, Alan R.; Green, Nelson W.

    2006-01-01

    Viewgraphs on the JPL processes for environmental verification and testing of aerospace systems are presented. The topics include: 1) Processes: a) JPL Design Principles b) JPL Flight Project Practices; 2) Environmental Verification; and 3) Test Effectiveness Assessment: Inflight Anomaly Trends.

  16. Fuel Retrieval System Design Verification Report

    SciTech Connect

    GROTH, B.D.

    2000-04-11

    The Fuel Retrieval Subproject was established as part of the Spent Nuclear Fuel Project (SNF Project) to retrieve and repackage the SNF located in the K Basins. The Fuel Retrieval System (FRS) construction work is complete in the KW Basin, and start-up testing is underway. Design modifications and construction planning are also underway for the KE Basin. An independent review of the design verification process as applied to the K Basin projects was initiated in support of preparation for the SNF Project operational readiness review (ORR). A Design Verification Status Questionnaire, Table 1, is included which addresses Corrective Action SNF-EG-MA-EG-20000060, Item No.9 (Miller 2000).

  17. The formal verification of generic interpreters

    NASA Technical Reports Server (NTRS)

    Windley, P.; Levitt, K.; Cohen, G. C.

    1991-01-01

    Task assignment 3 of the design and validation of digital flight control systems suitable for fly-by-wire applications is studied. Task 3 is associated with formal verification of embedded systems. In particular, results are presented that provide a methodological approach to microprocessor verification. A hierarchical decomposition strategy for specifying microprocessors is also presented. A theory of generic interpreters is presented that can be used to model microprocessor behavior. The generic interpreter theory abstracts away the details of instruction functionality, leaving a general model of what an interpreter does.

  18. Proton Therapy Verification with PET Imaging

    PubMed Central

    Zhu, Xuping; Fakhri, Georges El

    2013-01-01

    Proton therapy is very sensitive to uncertainties introduced during treatment planning and dose delivery. PET imaging of proton induced positron emitter distributions is the only practical approach for in vivo, in situ verification of proton therapy. This article reviews the current status of proton therapy verification with PET imaging. The different data detecting systems (in-beam, in-room and off-line PET), calculation methods for the prediction of proton induced PET activity distributions, and approaches for data evaluation are discussed. PMID:24312147

  19. Verification of Plan Models Using UPPAAL

    NASA Technical Reports Server (NTRS)

    Khatib, Lina; Muscettola, Nicola; Haveland, Klaus; Lau, Sonic (Technical Monitor)

    2001-01-01

    This paper describes work on the verification of HSTS, the planner and scheduler of the Remote Agent autonomous control system deployed in Deep Space 1 (DS1). The verification is done using UPPAAL, a real time model checking tool. We start by motivating our work in the introduction. Then we give a brief description of HSTS and UPPAAL. After that, we give a mapping of HSTS models into UPPAAL and we present samples of plan model properties one may want to verify. Finally, we conclude with a summary.

  20. Challenges in High-Assurance Runtime Verification

    NASA Technical Reports Server (NTRS)

    Goodloe, Alwyn E.

    2016-01-01

    Safety-critical systems are growing more complex and becoming increasingly autonomous. Runtime Verification (RV) has the potential to provide protections when a system cannot be assured by conventional means, but only if the RV itself can be trusted. In this paper, we proffer a number of challenges to realizing high-assurance RV and illustrate how we have addressed them in our research. We argue that high-assurance RV provides a rich target for automated verification tools in hope of fostering closer collaboration among the communities.
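    A minimal flavour of what a runtime-verification monitor does, checking a request/grant property over a finite event trace. This is a sketch only; high-assurance RV frameworks synthesise such monitors from temporal-logic specifications, and all names here are illustrative:

```python
def monitor_response(trace, trigger="request", response="grant"):
    """Tiny RV monitor for the property 'every request is eventually
    followed by a grant', evaluated over a finite trace."""
    pending = 0
    for event in trace:
        if event == trigger:
            pending += 1
        elif event == response and pending > 0:
            pending -= 1
    return pending == 0  # True iff no request was left unanswered

print(monitor_response(["request", "tick", "grant"]))             # True
print(monitor_response(["request", "tick", "request", "grant"]))  # False
```

The high-assurance challenge the paper raises is exactly that this monitor itself is code: it must be verified (or synthesised correctly by construction) before its verdicts can be trusted.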

  1. Verification Of Tooling For Robotic Welding

    NASA Technical Reports Server (NTRS)

    Osterloh, Mark R.; Sliwinski, Karen E.; Anderson, Ronald R.

    1991-01-01

    Computer simulations, robotic inspections, and visual inspections performed to detect discrepancies. Method for verification of tooling for robotic welding involves combination of computer simulations and visual inspections. Verification process ensures accuracy of mathematical model representing tooling in off-line programming system that numerically simulates operation of robotic welding system. Process helps prevent damaging collisions between welding equipment and workpiece, ensures tooling positioned and oriented properly with respect to workpiece, and/or determines whether tooling to be modified or adjusted to achieve foregoing objectives.

  2. Electric power system test and verification program

    NASA Technical Reports Server (NTRS)

    Rylicki, Daniel S.; Robinson, Frank, Jr.

    1994-01-01

    Space Station Freedom's (SSF's) electric power system (EPS) hardware and software verification is performed at all levels of integration, from components to assembly and system level tests. Careful planning is essential to ensure the EPS is tested properly on the ground prior to launch. The results of the test performed on breadboard model hardware and analyses completed to date have been evaluated and used to plan for design qualification and flight acceptance test phases. These results and plans indicate the verification program for SSF's 75-kW EPS would have been successful and completed in time to support the scheduled first element launch.

  3. On Backward-Style Anonymity Verification

    NASA Astrophysics Data System (ADS)

    Kawabe, Yoshinobu; Mano, Ken; Sakurada, Hideki; Tsukada, Yasuyuki

    Many Internet services and protocols should guarantee anonymity; for example, an electronic voting system should guarantee to prevent the disclosure of who voted for which candidate. To prove trace anonymity, which is an extension of the formulation of anonymity by Schneider and Sidiropoulos, this paper presents an inductive method based on backward anonymous simulations. We show that the existence of an image-finite backward anonymous simulation implies trace anonymity. We also demonstrate the anonymity verification of an e-voting protocol (the FOO protocol) with our backward anonymous simulation technique. When proving the trace anonymity, this paper employs a computer-assisted verification tool based on a theorem prover.

  4. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    PubMed Central

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-01-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in the clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to profile plasma metabolites from 57 HAPE and 57 control subjects. 14 differential plasma metabolites responsible for the discrimination between the two groups from the discovery set (35 HAPE subjects and 35 healthy controls) were identified. Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE was evaluated, where the area under the receiver operating characteristic curve (AUC) was 0.981 and 0.942 in the discovery set and the validation set (22 HAPE subjects and 22 healthy controls), respectively. Taken together, these results suggested that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples. PMID:26459926
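    Combining several biomarkers into one composite score and evaluating it with ROC AUC can be sketched as below. The data are synthetic and the logistic-regression combiner is an assumption for illustration, not necessarily the paper's method:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(1)
# Hypothetical levels of 3 metabolites (standing in for C8-ceramide,
# sphingosine, glutamine) for 35 HAPE subjects and 35 controls, with a
# shifted mean in the HAPE group.
controls = rng.normal(0.0, 1.0, size=(35, 3))
hape = rng.normal(1.5, 1.0, size=(35, 3))
X = np.vstack([controls, hape])
y = np.array([0] * 35 + [1] * 35)

# Combine the three markers into one composite score, then score with ROC AUC.
model = LogisticRegression().fit(X, y)
scores = model.predict_proba(X)[:, 1]
auc = roc_auc_score(y, scores)
```

An honest estimate would compute the AUC on a held-out validation set, as the paper does with its 22 + 22 validation cohort, since training-set AUC is optimistically biased.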

  5. Three plasma metabolite signatures for diagnosing high altitude pulmonary edema

    NASA Astrophysics Data System (ADS)

    Guo, Li; Tan, Guangguo; Liu, Ping; Li, Huijie; Tang, Lulu; Huang, Lan; Ren, Qian

    2015-10-01

    High-altitude pulmonary edema (HAPE) is a potentially fatal condition, occurring at altitudes greater than 3,000 m and affecting rapidly ascending, non-acclimatized healthy individuals. However, the lack of biomarkers for this disease still constitutes a bottleneck in the clinical diagnosis. Here, ultra-high performance liquid chromatography coupled with Q-TOF mass spectrometry was applied to profile plasma metabolites from 57 HAPE and 57 control subjects. 14 differential plasma metabolites responsible for the discrimination between the two groups from the discovery set (35 HAPE subjects and 35 healthy controls) were identified. Furthermore, 3 of the 14 metabolites (C8-ceramide, sphingosine and glutamine) were selected as candidate diagnostic biomarkers for HAPE using metabolic pathway impact analysis. The feasibility of using the combination of these three biomarkers for HAPE was evaluated, where the area under the receiver operating characteristic curve (AUC) was 0.981 and 0.942 in the discovery set and the validation set (22 HAPE subjects and 22 healthy controls), respectively. Taken together, these results suggested that this composite plasma metabolite signature may be used in HAPE diagnosis, especially after further investigation and verification with larger samples.

  6. Dynamic characteristics of signatures: effects of writer style on genuine and simulated signatures.

    PubMed

    Mohammed, Linton; Found, Bryan; Caligiuri, Michael; Rogers, Doug

    2015-01-01

    The aims of this study were to determine if computer-measured dynamic features (duration, size, velocity, jerk, and pen pressure) differ between genuine and simulated signatures. Sixty subjects (3 equal groups of 3 signature styles) each provided 10 naturally written (genuine) signatures. Each of these subjects then provided 15 simulations of each of three model signatures. The genuine (N = 600) and simulated (N = 2700) signatures were collected using a digitizing tablet. MovAlyzeR(®) software was used to estimate kinematic parameters for each pen stroke. Stroke duration, velocity, and pen pressure were found to discriminate between genuine and simulated signatures regardless of the simulator's own style of signature or the style of signature being simulated. However, there was a significant interaction between style and condition for size and jerk (a measure of smoothness). The results of this study, based on quantitative analysis and dynamic handwriting features, indicate that the style of the simulator's own signature and the style of signature being simulated can impact the characteristics of handwriting movements for simulations. Writer style characteristics might therefore need to be taken into consideration as potentially significant when evaluating signature features with a view to forming opinions regarding authenticity.
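    Kinematic features like those analysed here can be estimated from a sampled pen trajectory with finite differences. A simplified stand-in (not MovAlyzeR's algorithm), with made-up sampling parameters and synthetic strokes:

```python
import numpy as np

def stroke_kinematics(x, y, fs=200.0):
    """Estimate kinematic features from one pen stroke sampled at fs Hz.

    Returns duration (s), mean speed, and mean absolute jerk, computed by
    finite differences -- a crude stand-in for dedicated handwriting software.
    """
    dt = 1.0 / fs
    vx, vy = np.gradient(x, dt), np.gradient(y, dt)
    speed = np.hypot(vx, vy)
    ax, ay = np.gradient(vx, dt), np.gradient(vy, dt)
    jx, jy = np.gradient(ax, dt), np.gradient(ay, dt)
    jerk = np.hypot(jx, jy)
    return len(x) * dt, speed.mean(), jerk.mean()

# A smooth, fluent stroke vs. a slow tremulous one: simulated signatures
# tend to look like the latter (higher jerk, i.e. less smooth).
t = np.linspace(0, 1, 200)
smooth = stroke_kinematics(np.cos(2 * np.pi * t), np.sin(2 * np.pi * t))
tremor = stroke_kinematics(np.cos(2 * np.pi * t) + 0.02 * np.sin(40 * np.pi * t),
                           np.sin(2 * np.pi * t))
print(smooth[2] < tremor[2])  # True: the tremulous stroke has higher mean jerk
```

Pen pressure, the other discriminating feature in the study, comes directly from the digitizing tablet rather than from the trajectory.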

  7. DNA Methylation Signature of Childhood Chronic Physical Aggression in T Cells of Both Men and Women

    PubMed Central

    Guillemin, Claire; Provençal, Nadine; Suderman, Matthew; Côté, Sylvana M.; Vitaro, Frank; Hallett, Michael; Tremblay, Richard E.; Szyf, Moshe

    2014-01-01

    Background High frequency of physical aggression is the central feature of severe conduct disorder and is associated with a wide range of social, mental and physical health problems. We have previously tested the hypothesis that differential DNA methylation signatures in peripheral T cells are associated with a chronic aggression trajectory in males. Despite the fact that sex differences appear to play a pivotal role in determining the development, magnitude and frequency of aggression, most of previous studies focused on males, so little is known about female chronic physical aggression. We therefore tested here whether or not there is a signature of physical aggression in female DNA methylation and, if there is, how it relates to the signature observed in males. Methodology/Principal Findings Methylation profiles were created using the method of methylated DNA immunoprecipitation (MeDIP) followed by microarray hybridization and statistical and bioinformatic analyses on T cell DNA obtained from adult women who were found to be on a chronic physical aggression trajectory (CPA) between 6 and 12 years of age compared to women who followed a normal physical aggression trajectory. We confirmed the existence of a well-defined, genome-wide signature of DNA methylation associated with chronic physical aggression in the peripheral T cells of adult females that includes many of the genes similarly associated with physical aggression in the same cell types of adult males. Conclusions This study in a small number of women presents preliminary evidence for a genome-wide variation in promoter DNA methylation that associates with CPA in women that warrant larger studies for further verification. A significant proportion of these associations were previously observed in men with CPA supporting the hypothesis that the epigenetic signature of early life aggression in females is composed of a component specific to females and another common to both males and females. PMID:24475181

  8. On the Privacy Protection of Biometric Traits: Palmprint, Face, and Signature

    NASA Astrophysics Data System (ADS)

    Panigrahy, Saroj Kumar; Jena, Debasish; Korra, Sathya Babu; Jena, Sanjay Kumar

    Biometrics are expected to add a new level of security to applications, as a person attempting access must prove who he or she really is by presenting a biometric to the system. The recent developments in the biometrics area have led to smaller, faster and cheaper systems, which in turn has increased the number of possible application areas for biometric identity verification. The biometric data, being derived from human bodies (and especially when used to identify or verify those bodies) is considered personally identifiable information (PII). The collection, use and disclosure of biometric data — image or template, invokes rights on the part of an individual and obligations on the part of an organization. As biometric uses and databases grow, so do concerns that the personal data collected will not be used in reasonable and accountable ways. Privacy concerns arise when biometric data are used for secondary purposes, invoking function creep, data matching, aggregation, surveillance and profiling. Biometric data transmitted across networks and stored in various databases by others can also be stolen, copied, or otherwise misused in ways that can materially affect the individual involved. As biometric systems are vulnerable to replay, database and brute-force attacks, such potential attacks must be analysed before they are massively deployed in security systems. Along with security, the privacy of users is also an important factor, as the line structures in palmprints contain personal characteristics, a person can be recognised from face images, and fake signatures can be practised by carefully watching the signature images available in the database. We propose a cryptographic approach to encrypt the images of palmprints, faces, and signatures by an advanced Hill cipher technique for hiding the information in the images. It also provides security to these images from being attacked by above mentioned attacks. So, during the feature extraction, the
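    The abstract does not specify its "advanced Hill cipher technique", but the basic Hill cipher over Z_256 (the range of 8-bit pixel values) that such variants build on can be sketched as follows; the key and the toy pixel data are illustrative:

```python
import numpy as np

# Hypothetical 2x2 key for a Hill cipher over Z_256. The determinant must be
# odd (coprime with 256) so the key matrix is invertible mod 256.
KEY = np.array([[3, 3], [2, 5]])

def hill_encrypt(pixels, key=KEY, m=256):
    """Encrypt a flat pixel array (length divisible by 2) block by block."""
    blocks = pixels.reshape(-1, 2)
    return (blocks @ key.T % m).reshape(-1)

def hill_decrypt(cipher, key=KEY, m=256):
    """Invert the key matrix mod m (2x2 adjugate formula), then decrypt."""
    det = int(round(np.linalg.det(key))) % m
    adj = np.array([[key[1, 1], -key[0, 1]], [-key[1, 0], key[0, 0]]])
    key_inv = (pow(det, -1, m) * adj) % m
    blocks = cipher.reshape(-1, 2)
    return (blocks @ key_inv.T % m).reshape(-1)

img = np.array([12, 200, 255, 0, 7, 99], dtype=np.int64)  # toy "image" row
enc = hill_encrypt(img)
assert np.array_equal(hill_decrypt(enc), img)  # round-trip recovers the pixels
```

The plain Hill cipher is linear and thus weak against known-plaintext attacks, which is presumably what the "advanced" variant addresses.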

  9. On "A new quantum blind signature with unlinkability"

    NASA Astrophysics Data System (ADS)

    Luo, Yi-Ping; Tsai, Shang-Lun; Hwang, Tzonelih; Kao, Shih-Hung

    2017-04-01

    This article points out a security loophole in Shi et al.'s quantum blind signature scheme. By using the modification attack, a message owner can cheat a signature receiver with a fake message-signature pair without being detected.

  10. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... signature components and controls. (a) Electronic signatures that are not based upon biometrics shall: (1... signatures based upon biometrics shall be designed to ensure that they cannot be used by anyone other...

  11. 76 FR 411 - Regulatory Guidance Concerning Electronic Signatures and Documents

    Federal Register 2010, 2011, 2012, 2013, 2014

    2011-01-04

    ... Federal Motor Carrier Safety Administration Regulatory Guidance Concerning Electronic Signatures and... guidance. SUMMARY: FMCSA issues regulatory guidance concerning the use of electronic signatures and... information regarding FMCSA's acceptance of electronic signature on documents required by the Federal...

  12. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... signature components and controls. (a) Electronic signatures that are not based upon biometrics shall: (1... signatures based upon biometrics shall be designed to ensure that they cannot be used by anyone other...

  13. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... signature components and controls. (a) Electronic signatures that are not based upon biometrics shall: (1... signatures based upon biometrics shall be designed to ensure that they cannot be used by anyone other...

  14. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... signature components and controls. (a) Electronic signatures that are not based upon biometrics shall: (1... signatures based upon biometrics shall be designed to ensure that they cannot be used by anyone other...

  15. 21 CFR 11.200 - Electronic signature components and controls.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... signature components and controls. (a) Electronic signatures that are not based upon biometrics shall: (1... signatures based upon biometrics shall be designed to ensure that they cannot be used by anyone other...

  16. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  17. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  18. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  19. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  20. 24 CFR 5.512 - Verification of eligible immigration status.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... immigration status. 5.512 Section 5.512 Housing and Urban Development Office of the Secretary, Department of... Noncitizens § 5.512 Verification of eligible immigration status. (a) General. Except as described in paragraph...) Primary verification—(1) Automated verification system. Primary verification of the immigration status...

  1. 40 CFR 1066.240 - Torque transducer verification.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Torque transducer verification. 1066... POLLUTION CONTROLS VEHICLE-TESTING PROCEDURES Dynamometer Specifications § 1066.240 Torque transducer verification. Verify torque-measurement systems by performing the verifications described in §§ 1066.270...

  2. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... 40 Protection of Environment 34 2012-07-01 2012-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  3. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2011 CFR

    2011-07-01

    ... 40 Protection of Environment 33 2011-07-01 2011-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  4. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2013 CFR

    2013-07-01

    ... 40 Protection of Environment 34 2013-07-01 2013-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  5. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2010 CFR

    2010-07-01

    ... 40 Protection of Environment 32 2010-07-01 2010-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  6. 40 CFR 1065.395 - Inertial PM balance verifications.

    Code of Federal Regulations, 2014 CFR

    2014-07-01

    ... 40 Protection of Environment 33 2014-07-01 2014-07-01 false Inertial PM balance verifications... Inertial PM balance verifications. This section describes how to verify the performance of an inertial PM balance. (a) Independent verification. Have the balance manufacturer (or a representative approved by...

  7. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 15 Commerce and Foreign Trade 2 2011-01-01 2011-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  8. 15 CFR 748.13 - Delivery Verification (DV).

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... 15 Commerce and Foreign Trade 2 2010-01-01 2010-01-01 false Delivery Verification (DV). 748.13... (CLASSIFICATION, ADVISORY, AND LICENSE) AND DOCUMENTATION § 748.13 Delivery Verification (DV). (a) Scope. (1) BIS may request the licensee to obtain verifications of delivery on a selective basis. A...

  9. 21 CFR 120.11 - Verification and validation.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 21 Food and Drugs 2 2014-04-01 2014-04-01 false Verification and validation. 120.11 Section 120.11 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD... § 120.11 Verification and validation. (a) Verification. Each processor shall verify that the...

  10. Sterilization of compounded parenteral preparations: verification of autoclaves.

    PubMed

    Rahe, Hank

    2013-01-01

This article discusses the basic principles for verification of a sterilization process and provides a recommended approach to assure that autoclaves deliver the sterility assurance levels required for patient safety. Included is a summary of the protocol and verification (validation) results of a previously published case study involving autoclaves. To assure the sterility of compounded preparations, a verification procedure must be in place.

  11. 45 CFR 95.626 - Independent Verification and Validation.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 45 Public Welfare 1 2011-10-01 2011-10-01 false Independent Verification and Validation. 95.626... (FFP) Specific Conditions for Ffp § 95.626 Independent Verification and Validation. (a) An assessment for independent verification and validation (IV&V) analysis of a State's system development effort...

  12. Online shopping hesitation.

    PubMed

    Cho, Chang-Hoan; Kang, Jaewon; Cheon, Hongsik John

    2006-06-01

This study was designed to understand which factors influence consumer hesitation or delay in online product purchases. The study examined four groups of variables (i.e., consumer characteristics, contextual factors, perceived uncertainty factors, and medium/channel innovation factors) that predict three types of online shopping hesitation (i.e., overall hesitation, shopping cart abandonment, and hesitation at the final payment stage). We found that different sets of delay factors are related to different aspects of online shopping hesitation. The study concludes with suggestions for various delay-reduction devices to help consumers resolve their online decision hesitation.

  13. A feasibility study of treatment verification using EPID cine images for hypofractionated lung radiotherapy

    NASA Astrophysics Data System (ADS)

    Tang, Xiaoli; Lin, Tong; Jiang, Steve

    2009-09-01

We propose a novel approach for potential online treatment verification using cine EPID (electronic portal imaging device) images for hypofractionated lung radiotherapy, based on a machine learning algorithm. Hypofractionated radiotherapy requires high precision, so it is essential to monitor the target effectively and ensure that the tumor remains within the beam aperture. We modeled treatment verification as a two-class classification problem and applied an artificial neural network (ANN) to classify the cine EPID images acquired during treatment into two classes: tumor inside or outside the beam aperture. Training samples for the ANN were generated from digitally reconstructed radiographs (DRRs) with artificially added shifts in the tumor location, simulating cine EPID images with different tumor positions. Principal component analysis (PCA) was used to reduce the dimensionality of both the training samples and the cine EPID images acquired during treatment. The proposed treatment verification algorithm was tested retrospectively on five hypofractionated lung cancer patients. On average, the algorithm achieved a 98.0% classification accuracy, a 97.6% recall rate, and a 99.7% precision rate. This work was first presented at the Seventh International Conference on Machine Learning and Applications, San Diego, CA, USA, 11-13 December 2008.

  14. Quantification, Prediction, and the Online Impact of Sentence Truth-Value: Evidence from Event-Related Potentials

    ERIC Educational Resources Information Center

    Nieuwland, Mante S.

    2016-01-01

    Do negative quantifiers like "few" reduce people's ability to rapidly evaluate incoming language with respect to world knowledge? Previous research has addressed this question by examining whether online measures of quantifier comprehension match the "final" interpretation reflected in verification judgments. However, these…

  15. What Signatures Dominantly Associate with Gene Age?

    PubMed Central

    Yin, Hongyan; Wang, Guangyu; Ma, Lina; Yi, Soojin V.; Zhang, Zhang

    2016-01-01

    As genes originate at different evolutionary times, they harbor distinctive genomic signatures of evolutionary ages. Although previous studies have investigated different gene age-related signatures, what signatures dominantly associate with gene age remains unresolved. Here we address this question via a combined approach of comprehensive assignment of gene ages, gene family identification, and multivariate analyses. We first provide a comprehensive and improved gene age assignment by combining homolog clustering with phylogeny inference and categorize human genes into 26 age classes spanning the whole tree of life. We then explore the dominant age-related signatures based on a collection of 10 potential signatures (including gene composition, gene length, selection pressure, expression level, connectivity in protein–protein interaction network and DNA methylation). Our results show that GC content and connectivity in protein–protein interaction network (PPIN) associate dominantly with gene age. Furthermore, we investigate the heterogeneity of dominant signatures in duplicates and singletons. We find that GC content is a consistent primary factor of gene age in duplicates and singletons, whereas PPIN is more strongly associated with gene age in singletons than in duplicates. Taken together, GC content and PPIN are two dominant signatures in close association with gene age, exhibiting heterogeneity in duplicates and singletons and presumably reflecting complex differential interplays between natural selection and mutation. PMID:27609935

  16. Going Online the MI Way.

    ERIC Educational Resources Information Center

    Feldt, Jill

    This booklet describes online searching using Materials Information, a metallurgy and metals science information service of the Institute of Metals in London and ASM International in Cleveland, Ohio, which is available through the major online vendors. Described in detail are online searching, online databases, costs, online hosts or vendors,…

  17. Molecular signatures of major depression.

    PubMed

    Cai, Na; Chang, Simon; Li, Yihan; Li, Qibin; Hu, Jingchu; Liang, Jieqin; Song, Li; Kretzschmar, Warren; Gan, Xiangchao; Nicod, Jerome; Rivera, Margarita; Deng, Hong; Du, Bo; Li, Keqing; Sang, Wenhu; Gao, Jingfang; Gao, Shugui; Ha, Baowei; Ho, Hung-Yao; Hu, Chunmei; Hu, Jian; Hu, Zhenfei; Huang, Guoping; Jiang, Guoqing; Jiang, Tao; Jin, Wei; Li, Gongying; Li, Kan; Li, Yi; Li, Yingrui; Li, Youhui; Lin, Yu-Ting; Liu, Lanfen; Liu, Tiebang; Liu, Ying; Liu, Yuan; Lu, Yao; Lv, Luxian; Meng, Huaqing; Qian, Puyi; Sang, Hong; Shen, Jianhua; Shi, Jianguo; Sun, Jing; Tao, Ming; Wang, Gang; Wang, Guangbiao; Wang, Jian; Wang, Linmao; Wang, Xueyi; Wang, Xumei; Yang, Huanming; Yang, Lijun; Yin, Ye; Zhang, Jinbei; Zhang, Kerang; Sun, Ning; Zhang, Wei; Zhang, Xiuqing; Zhang, Zhen; Zhong, Hui; Breen, Gerome; Wang, Jun; Marchini, Jonathan; Chen, Yiping; Xu, Qi; Xu, Xun; Mott, Richard; Huang, Guo-Jen; Kendler, Kenneth; Flint, Jonathan

    2015-05-04

Adversity, particularly in early life, can cause illness. Clues to the responsible mechanisms may lie with the discovery of molecular signatures of stress, some of which include alterations to an individual's somatic genome. Here, using genome sequences from 11,670 women, we observed a highly significant association between a stress-related disease, major depression, and the amount of mtDNA (p = 9.00 × 10⁻⁴², odds ratio 1.33 [95% confidence interval (CI) = 1.29-1.37]) and telomere length (p = 2.84 × 10⁻¹⁴, odds ratio 0.85 [95% CI = 0.81-0.89]). While both telomere length and mtDNA amount were associated with adverse life events, conditional regression analyses showed the molecular changes were contingent on the depressed state. We tested this hypothesis with experiments in mice, demonstrating that stress causes both molecular changes, which are partly reversible and can be elicited by the administration of corticosterone. Together, these results demonstrate that changes in the amount of mtDNA and telomere length are consequences of stress and entering a depressed state. These findings identify increased amounts of mtDNA as a molecular marker of MD and have important implications for understanding how stress causes the disease.

  18. SIRUS spectral signature analysis code

    NASA Astrophysics Data System (ADS)

    Bishop, Gary J.; Caola, Mike J.; Geatches, Rachel M.; Roberts, Nick C.

    2003-09-01

The Advanced Technology Centre (ATC) is responsible for developing IR signature prediction capabilities for its parent body, BAE SYSTEMS. To achieve this, the SIRUS code has been developed and used on a variety of projects for well over a decade. SIRUS is capable of providing accurate IR predictions for air-breathing and rocket-motor-propelled vehicles. SIRUS models various physical components to derive its predictions. A key component is the radiance reflected from the surface of the modeled vehicle. This is modeled by fitting parameters to the measured bidirectional reflectance distribution function (BRDF) of the surface material(s). The ATC has successfully implemented a parameterization scheme based on the published OPTASM model, and this is described. However, inconsistencies between reflectance measurements and values calculated from the parameterized fit have led to an elliptical parameter enhancement. The implementation of this is also described. Finally, an end-to-end measurement-parameterization capability is described, based on measurements taken with SOC600 instrumentation.

  19. Molecular signatures of vaccine adjuvants.

    PubMed

    Olafsdottir, Thorunn; Lindqvist, Madelene; Harandi, Ali M

    2015-09-29

Mass vaccination has saved millions of human lives and improved the quality of life in both developing and developed countries. The emergence of new pathogens and the inadequate protection conferred by some existing vaccines, such as those for tuberculosis, influenza, and pertussis, especially in certain age groups, have resulted in a move from empirically developed vaccines toward more pathogen-tailored and rationally engineered vaccines. A deeper understanding of the interaction of innate and adaptive immunity at the molecular level enables the development of vaccines that selectively target certain types of immune responses without excessive reactogenicity. Adjuvants constitute an essential element of modern vaccines. Although a variety of candidate adjuvants have been evaluated in the past few decades, only a limited number of vaccine adjuvants are currently available for human use. A better understanding of the mode of action of adjuvants is pivotal to harnessing the potential of existing and new adjuvants in shaping a desired immune response. Recent advances in systems biology, powered by emerging cutting-edge omics technology, have led to the identification of molecular signatures rapidly induced after vaccination in the blood that correlate with, and predict, a later protective immune response or vaccine safety. This can pave the way to prospectively determining the potency and safety of vaccines and adjuvants. This review is intended to highlight the importance of big-data analysis in advancing our understanding of the mechanisms of action of adjuvants to inform the rational development of future human vaccines.

  20. Transcriptional Signatures in Huntington's Disease

    PubMed Central

    2007-01-01

    While selective neuronal death has been an influential theme in Huntington's disease (HD), there is now a preponderance of evidence that significant neuronal dysfunction precedes frank neuronal death. The best evidence for neuronal dysfunction is the observation that gene expression is altered in HD brain, suggesting that transcriptional dysregulation is a central mechanism. Studies of altered gene expression began with careful observations of post-mortem human HD brain and subsequently were accelerated by the development of transgenic mouse models. The application of DNA microarray technology has spurred tremendous progress with respect to the altered transcriptional processes that occur in HD, through gene expression studies of both transgenic mouse models as well as cellular models of HD. Gene expression profiles are remarkably comparable across these models, bolstering the idea that transcriptional signatures reflect an essential feature of disease pathogenesis. Finally, gene expression studies have been applied to human HD, thus not only validating the approach of using model systems, but also solidifying the idea that altered transcription is a key mechanism in HD pathogenesis. In the future, gene expression profiling will be used as a readout in clinical trials aimed at correcting transcriptional dysregulation in Huntington's disease. PMID:17467140

  1. Infrared signatures from bomb detonations

    NASA Astrophysics Data System (ADS)

    Orson, Jay A.; Bagby, William F.; Perram, Glen P.

    2003-04-01

Remote observations of the temporal and spectral characteristics of the infrared emissions from bomb detonations have been correlated with explosion conditions. A Fourier transform interferometer was used to record spectra in the 1.6-20 μm range at spectral resolutions of 4-16 cm⁻¹ and temporal resolutions of 0.047-0.123 s. Field observations of 56 detonation events included a set of aircraft-delivered ordnance and a series of static ground detonations for a variety of bomb sizes, types, and environmental conditions. The emission is well represented by a gray body with continuously decreasing temperature and characteristic decay times of 1-4 s, providing only limited variability with detonation conditions. However, the fireball size times the emissivity as a function of time can be determined from the spectra without imaging and provides a more sensitive signature. The degree of temporal overlap as a function of frequency for a pair of detonation events provides a very sensitive discriminator for explosion conditions. The temporal overlap decreases with increasing emission frequency for all the observed events, indicating more information content at higher frequencies.
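The gray-body model with a continuously decreasing temperature can be illustrated numerically. The emissivity, peak temperature, and 2 s decay time below are hypothetical values chosen within the ranges reported in the abstract, not measured data:

```python
import numpy as np

# Physical constants (SI units)
h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23

def graybody_radiance(wavelength_m, temp_K, emissivity=0.9):
    """Spectral radiance of a gray body: emissivity times Planck's law."""
    a = 2.0 * h * c**2 / wavelength_m**5
    b = h * c / (wavelength_m * k_B * temp_K)
    return emissivity * a / np.expm1(b)

# Fireball temperature decaying exponentially toward ambient with a
# characteristic time of 2 s (within the reported 1-4 s range).
t = np.linspace(0.0, 5.0, 50)
tau_true = 2.0
T = 300.0 + 1500.0 * np.exp(-t / tau_true)

# Recover the decay time from a log-linear fit to the temperature excess,
# log(T - 300) = log(1500) - t / tau.
slope, _ = np.polyfit(t, np.log(T - 300.0), 1)
tau_est = -1.0 / slope

# Mid-IR radiance at 4 um drops sharply as the fireball cools.
L_start = graybody_radiance(4e-6, T[0])
L_end = graybody_radiance(4e-6, T[-1])
```

On noiseless synthetic data the fit recovers the decay time exactly; with real interferometer spectra the temperature itself would first have to be estimated from the gray-body fit at each time step.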

  2. Molecular Signatures of Major Depression

    PubMed Central

    Cai, Na; Chang, Simon; Li, Yihan; Li, Qibin; Hu, Jingchu; Liang, Jieqin; Song, Li; Kretzschmar, Warren; Gan, Xiangchao; Nicod, Jerome; Rivera, Margarita; Deng, Hong; Du, Bo; Li, Keqing; Sang, Wenhu; Gao, Jingfang; Gao, Shugui; Ha, Baowei; Ho, Hung-Yao; Hu, Chunmei; Hu, Jian; Hu, Zhenfei; Huang, Guoping; Jiang, Guoqing; Jiang, Tao; Jin, Wei; Li, Gongying; Li, Kan; Li, Yi; Li, Yingrui; Li, Youhui; Lin, Yu-Ting; Liu, Lanfen; Liu, Tiebang; Liu, Ying; Liu, Yuan; Lu, Yao; Lv, Luxian; Meng, Huaqing; Qian, Puyi; Sang, Hong; Shen, Jianhua; Shi, Jianguo; Sun, Jing; Tao, Ming; Wang, Gang; Wang, Guangbiao; Wang, Jian; Wang, Linmao; Wang, Xueyi; Wang, Xumei; Yang, Huanming; Yang, Lijun; Yin, Ye; Zhang, Jinbei; Zhang, Kerang; Sun, Ning; Zhang, Wei; Zhang, Xiuqing; Zhang, Zhen; Zhong, Hui; Breen, Gerome; Wang, Jun; Marchini, Jonathan; Chen, Yiping; Xu, Qi; Xu, Xun; Mott, Richard; Huang, Guo-Jen; Kendler, Kenneth; Flint, Jonathan

    2015-01-01

Adversity, particularly in early life, can cause illness. Clues to the responsible mechanisms may lie with the discovery of molecular signatures of stress, some of which include alterations to an individual’s somatic genome. Here, using genome sequences from 11,670 women, we observed a highly significant association between a stress-related disease, major depression, and the amount of mtDNA (p = 9.00 × 10−42, odds ratio 1.33 [95% confidence interval (CI) = 1.29–1.37]) and telomere length (p = 2.84 × 10−14, odds ratio 0.85 [95% CI = 0.81–0.89]). While both telomere length and mtDNA amount were associated with adverse life events, conditional regression analyses showed the molecular changes were contingent on the depressed state. We tested this hypothesis with experiments in mice, demonstrating that stress causes both molecular changes, which are partly reversible and can be elicited by the administration of corticosterone. Together, these results demonstrate that changes in the amount of mtDNA and telomere length are consequences of stress and entering a depressed state. These findings identify increased amounts of mtDNA as a molecular marker of MD and have important implications for understanding how stress causes the disease. PMID:25913401

  3. (abstract) Topographic Signatures in Geology

    NASA Technical Reports Server (NTRS)

    Farr, Tom G.; Evans, Diane L.

    1996-01-01

Topographic information is required for many Earth science investigations. For example, topography is an important element in regional and global geomorphic studies because it reflects the interplay between the climate-driven processes of erosion and the tectonic processes of uplift. A number of techniques have been developed to analyze digital topographic data, including Fourier texture analysis. A Fourier transform of the topography of an area allows its spatial frequency content to be analyzed. Band-pass filtering of the transform produces images representing the amplitude of different spatial wavelengths. These are then used in a multi-band classification to map units based on their spatial frequency content. The results using a radar image instead of digital topography showed good correspondence to a geologic map; however, brightness variations in the image unrelated to topography caused errors. An additional benefit of using Fourier band-pass images for the classification is that the textural signatures of the units are quantitative measures of their spatial characteristics, which may be used to map similar units in similar environments.
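The band-pass texture idea can be sketched as follows. The band radii, tile size, and synthetic terrain patterns are illustrative assumptions, not the study's actual parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

def bandpass_amplitude(dem, r_lo, r_hi):
    """Amplitude image for one spatial-frequency band of a DEM tile."""
    F = np.fft.fftshift(np.fft.fft2(dem))
    n = dem.shape[0]
    ky, kx = np.mgrid[-n // 2:n // 2, -n // 2:n // 2]
    r = np.hypot(kx, ky)                      # radial spatial frequency
    mask = (r >= r_lo) & (r < r_hi)           # annular band-pass filter
    return np.abs(np.fft.ifft2(np.fft.ifftshift(F * mask)))

def band_features(dem, bands=((1, 5), (5, 15), (15, 32))):
    """Mean band-pass amplitude per band: a simple textural signature."""
    return np.array([bandpass_amplitude(dem, lo, hi).mean() for lo, hi in bands])

# Two synthetic terrain types standing in for real DEM tiles:
# smooth low-frequency hills vs. rough high-frequency ridges.
n = 64
x = np.linspace(0, 2 * np.pi, n)
smooth = np.sin(x)[None, :] * np.sin(x)[:, None] + 0.05 * rng.normal(size=(n, n))
rough = 0.3 * np.sin(10 * x)[None, :] * np.sin(10 * x)[:, None] + 0.05 * rng.normal(size=(n, n))

f_smooth, f_rough = band_features(smooth), band_features(rough)
```

The smooth tile concentrates its energy in the lowest band and the rough tile in the middle band, so the feature vectors separate the two terrain types; a multi-band classifier would operate on such per-band amplitude images pixel by pixel.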

  4. 21 CFR 123.8 - Verification.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 21 Food and Drugs 2 2012-04-01 2012-04-01 false Verification. 123.8 Section 123.8 Food and Drugs FOOD AND DRUG ADMINISTRATION, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) FOOD FOR HUMAN... processor shall verify that the HACCP plan is adequate to control food safety hazards that are...

  5. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  6. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  7. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  8. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  9. 20 CFR 325.6 - Verification procedures.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... Employees' Benefits RAILROAD RETIREMENT BOARD REGULATIONS UNDER THE RAILROAD UNEMPLOYMENT INSURANCE ACT REGISTRATION FOR RAILROAD UNEMPLOYMENT BENEFITS § 325.6 Verification procedures. The Board's procedures for adjudicating and processing applications and claims for unemployment benefits filed pursuant to this part...

  10. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2014 CFR

    2014-10-01

    ... 42 Public Health 4 2014-10-01 2014-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  11. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2012 CFR

    2012-10-01

    ... 42 Public Health 4 2012-10-01 2012-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  12. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... 9 Animals and Animal Products 2 2011-01-01 2011-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  13. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... 9 Animals and Animal Products 2 2012-01-01 2012-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  14. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... 9 Animals and Animal Products 2 2014-01-01 2014-01-01 false Agency verification. 416.17 Section 416.17 Animals and Animal Products FOOD SAFETY AND INSPECTION SERVICE, DEPARTMENT OF AGRICULTURE... (d) Direct observation or testing to assess the sanitary conditions in the establishment....

  15. Learner Verification: A Publisher's Case Study.

    ERIC Educational Resources Information Center

    Wilson, George

    Learner verification, a process by which publishers monitor the effectiveness of their products and strive to improve their services to schools, is a practice that most companies take seriously. The quality of educational materials may be ensured in many ways: by analysis of sales, through firsthand investigation, and by employing a system of…

  16. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2014 CFR

    2014-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  17. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  18. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  19. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2012 CFR

    2012-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  20. 16 CFR 315.5 - Prescriber verification.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... Commercial Practices FEDERAL TRADE COMMISSION REGULATIONS UNDER SPECIFIC ACTS OF CONGRESS CONTACT LENS RULE § 315.5 Prescriber verification. (a) Prescription requirement. A seller may sell contact lenses only in accordance with a contact lens prescription for the patient that is: (1) Presented to the seller by...

  1. 2017 EPA Protocol Gas Verification Program Participants

    EPA Pesticide Factsheets

A list of participants in the 2016 EPA Protocol Gas Verification Program (PGVP) for stationary source monitoring. The list also includes vendor IDs, which are production-site-specific and are the same ones used in the PGVP for ambient air monitoring.

  2. Environmental Technology Verification (ETV) Quality Program (Poster)

    EPA Science Inventory

    This is a poster created for the ETV Quality Program. The EPA Environmental Technology Verification Program (ETV) develops test protocols and verifies the performance of innovative technologies that have the potential to improve protection of human health and the environment. The...

  3. Distilling the Verification Process for Prognostics Algorithms

    NASA Technical Reports Server (NTRS)

    Roychoudhury, Indranil; Saxena, Abhinav; Celaya, Jose R.; Goebel, Kai

    2013-01-01

The goal of prognostics and health management (PHM) systems is to ensure system safety and reduce downtime and maintenance costs. It is important that a PHM system be verified and validated before it can be successfully deployed. Prognostics algorithms are integral parts of PHM systems. This paper investigates a systematic process for verifying such prognostics algorithms. To this end, the paper first distinguishes between technology maturation and product development. It then describes the verification process for a prognostics algorithm as it moves up to higher maturity levels. This process is shown to be iterative, with verification activities interleaved with validation activities at each maturation level. In this work, we adopt the concept of technology readiness levels (TRLs) to represent the different maturity levels of a prognostics algorithm. It is shown that at each TRL, the verification of a prognostics algorithm depends on verifying the different components of the algorithm according to the requirements laid out by the PHM system that adopts it. Finally, using simplified examples, the systematic process for verifying a prognostics algorithm is demonstrated as the algorithm moves up TRLs.

  4. The Verification Guide, 1998-99.

    ERIC Educational Resources Information Center

    Office of Postsecondary Education, Washington DC. Student Financial Assistance Programs.

    This guide is intended to assist financial aid administrators at postsecondary education institutions in completing verification, the process of checking the accuracy of the information students provide when they apply for financial aid under student financial assistance (SFA) programs administered by the U.S. Department of Education. The first…

  5. Gender, Legitimation, and Identity Verification in Groups

    ERIC Educational Resources Information Center

    Burke, Peter J.; Stets, Jan E.; Cerven, Christine

    2007-01-01

    Drawing upon identity theory, expectation states theory, and legitimation theory, we examine how the task leader identity in task-oriented groups is more likely to be verified for persons with high status characteristics. We hypothesize that identity verification will be accomplished more readily for male group members and legitimated task leaders…

  6. Formal hardware verification of digital circuits

    NASA Technical Reports Server (NTRS)

    Joyce, J.; Seger, C.-J.

    1991-01-01

    The use of formal methods to verify the correctness of digital circuits is less constrained by the growing complexity of digital circuits than conventional methods based on exhaustive simulation. This paper briefly outlines three main approaches to formal hardware verification: symbolic simulation, state machine analysis, and theorem-proving.

  7. PROMOTING AIR QUALITY THROUGH ENVIRONMENTAL TECHNOLOGY VERIFICATION

    EPA Science Inventory

    The paper discusses the promotion of improved air quality through environmental technology verifications (ETVs). In 1995, the U.S. EPA's Office of Research and Development began the ETV Program in response to President Clinton's "Bridge to a Sustainable Future" and Vice Presiden...

  8. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2013 CFR

    2013-01-01

    ... SOP's and the procedures specified therein by determining that they meet the requirements of this part. Such verification may include: (a) Reviewing the Sanitation SOP's; (b) Reviewing the daily records documenting the implementation of the Sanitation SOP's and the procedures specified therein and any...

  9. 9 CFR 416.17 - Agency verification.

    Code of Federal Regulations, 2010 CFR

    2010-01-01

    ... SOP's and the procedures specified therein by determining that they meet the requirements of this part. Such verification may include: (a) Reviewing the Sanitation SOP's; (b) Reviewing the daily records documenting the implementation of the Sanitation SOP's and the procedures specified therein and any...

  10. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... 18 Conservation of Power and Water Resources 1 2014-04-01 2014-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  11. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2012 CFR

    2012-04-01

    ... 18 Conservation of Power and Water Resources 1 2012-04-01 2012-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  12. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2010 CFR

    2010-04-01

    ... 18 Conservation of Power and Water Resources 1 2010-04-01 2010-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  13. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... 18 Conservation of Power and Water Resources 1 2013-04-01 2013-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  14. 18 CFR 12.13 - Verification form.

    Code of Federal Regulations, 2011 CFR

    2011-04-01

    ... 18 Conservation of Power and Water Resources 1 2011-04-01 2011-04-01 false Verification form. 12.13 Section 12.13 Conservation of Power and Water Resources FEDERAL ENERGY REGULATORY COMMISSION, DEPARTMENT OF ENERGY REGULATIONS UNDER THE FEDERAL POWER ACT SAFETY OF WATER POWER PROJECTS AND PROJECT...

  15. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ... 42 Public Health 4 2010-10-01 2010-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  16. 42 CFR 457.380 - Eligibility verification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ... 42 Public Health 4 2011-10-01 2011-10-01 false Eligibility verification. 457.380 Section 457.380 Public Health CENTERS FOR MEDICARE & MEDICAID SERVICES, DEPARTMENT OF HEALTH AND HUMAN SERVICES (CONTINUED) STATE CHILDREN'S HEALTH INSURANCE PROGRAMS (SCHIPs) ALLOTMENTS AND GRANTS TO STATES State...

  17. Verification of method performance for clinical laboratories.

    PubMed

    Nichols, James H

    2009-01-01

    Method verification, a one-time process to determine performance characteristics before a test system is utilized for patient testing, is often confused with method validation, establishing the performance of a new diagnostic tool such as an internally developed or modified method. A number of international quality standards (International Organization for Standardization (ISO) and Clinical Laboratory Standards Institute (CLSI)), accreditation agency guidelines (College of American Pathologists (CAP), Joint Commission, U.K. Clinical Pathology Accreditation (CPA)), and regional laws (Clinical Laboratory Improvement Amendments of 1988 (CLIA'88)) exist describing the requirements for method verification and validation. Consumers of marketed test kits should verify method accuracy, precision, analytic measurement range, and the appropriateness of reference intervals to the institution's patient population. More extensive validation may be required for new methods and those manufacturer methods that have been modified by the laboratory, including analytic sensitivity and specificity. This manuscript compares the various recommendations for method verification and discusses the CLSI evaluation protocols (EP) that are available to guide laboratories in performing method verification experiments.
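
    The accuracy and precision checks described above can be sketched numerically. The following is an illustrative computation in the spirit of the CLSI EP-style verification experiments, not any specific protocol; the function name, acceptance limits, and data are all hypothetical.

    ```python
    from statistics import mean, stdev

    def verify_precision_and_bias(replicates, target, cv_limit_pct, bias_limit_pct):
        """Summarize replicate measurements of a control material: mean,
        coefficient of variation (%), and bias (%) versus the assigned
        target value, with a pass/fail flag against user-supplied limits."""
        m = mean(replicates)
        cv_pct = 100.0 * stdev(replicates) / m
        bias_pct = 100.0 * (m - target) / target
        ok = cv_pct <= cv_limit_pct and abs(bias_pct) <= bias_limit_pct
        return m, cv_pct, bias_pct, ok

    # Hypothetical example: a glucose control with an assigned value of
    # 100 mg/dL, assayed five times; limits of 5% CV and 3% bias.
    m, cv, bias, ok = verify_precision_and_bias(
        [98.0, 101.0, 99.5, 100.5, 99.0], target=100.0,
        cv_limit_pct=5.0, bias_limit_pct=3.0)
    ```

    In a real verification study the acceptance limits would come from the laboratory's quality goals or the manufacturer's performance claims, and the experiment would span multiple days and concentration levels.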

  18. Formal System Verification for Trustworthy Embedded Systems

    DTIC Science & Technology

    2011-04-19

    Koscher, Alexei Czeskis, Franziska Roesner, Shwetak Patel, Tadayoshi Kohno, Stephen Checkoway, Damon McCoy, Brian Kantor, Danny Anderson, Hovav... security analysis of a commercial real-time operating system kernel. In David S. Hardin, editor, Design and Verification of Microprocessor Systems for High...

  19. ENVIRONMENTAL TECHNOLOGY VERIFICATION OF BAGHOUSE FILTRATION PRODUCTS

    EPA Science Inventory

    The Environmental Technology Verification Program (ETV) was started by EPA in 1995 to generate independent credible data on the performance of innovative technologies that have potential to improve protection of public health and the environment. ETV does not approve or certify p...

  20. Verification of Autonomous Systems for Space Applications

    NASA Technical Reports Server (NTRS)

    Brat, G.; Denney, E.; Giannakopoulou, D.; Frank, J.; Jonsson, A.

    2006-01-01

    Autonomous software, especially if it is model-based, can play an important role in future space applications. For example, it can help streamline ground operations, assist in autonomous rendezvous and docking operations, or even help recover from problems (e.g., planners can be used to explore the space of recovery actions for a power subsystem and implement a solution without, or with minimal, human intervention). In general, the exploration capabilities of model-based systems give them great flexibility. Unfortunately, this also makes them unpredictable to our human eyes, both in terms of their execution and their verification. Traditional verification techniques are inadequate for these systems since they are mostly based on testing, which implies a very limited exploration of the behavioral space. In our work, we explore how advanced V&V techniques, such as static analysis, model checking, and compositional verification, can be used to gain trust in model-based systems. We also describe how synthesis can be used in the context of system reconfiguration and in the context of verification.

  1. Optical Verification Laboratory Demonstration System for High Security Identification Cards

    NASA Technical Reports Server (NTRS)

    Javidi, Bahram

    1997-01-01

    Document fraud, including unauthorized duplication of identification cards and credit cards, is a serious problem facing the government, banks, businesses, and consumers. In addition, counterfeit products such as computer chips and compact discs are arriving on our shores in great numbers. With the rapid advances in computers, CCD technology, image processing hardware and software, printers, scanners, and copiers, it is becoming increasingly easy to reproduce pictures, logos, symbols, paper currency, or patterns. These problems have stimulated an interest in research, development, and publications in security technology. Some ID cards, credit cards, and passports currently use holograms as a security measure to thwart copying. The holograms are inspected by the human eye. In theory, the hologram cannot be reproduced by an unauthorized person using commercially available optical components; in practice, however, technology has advanced to the point where the holographic image can be acquired from a credit card (photographed or captured by a CCD camera) and a new hologram synthesized using commercially available optical components or hologram-producing equipment. Therefore, a pattern that can be read by a conventional light source and a CCD camera can be reproduced. An optical security and anti-copying device that provides significant security improvements over existing security technology was demonstrated. The system can be applied for security verification of credit cards, passports, and other IDs so that they cannot easily be reproduced. We have used a new scheme of complex phase/amplitude patterns that cannot be seen and cannot be copied by an intensity-sensitive detector such as a CCD camera. A random phase mask is bonded to a primary identification pattern, which could also be phase encoded. The pattern could be a fingerprint, a picture of a face, or a signature.
The proposed optical processing device is designed to identify both the random phase mask and the
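
    The intensity-only limitation exploited above can be illustrated with a toy numerical sketch (not the authors' optical system; array sizes and values are illustrative): bonding a random phase mask to an amplitude pattern leaves the intensity image, which is all a CCD records, unchanged.

    ```python
    import numpy as np

    rng = np.random.default_rng(0)

    # Toy primary pattern: a real-valued amplitude image standing in for a
    # fingerprint, face picture, or signature (values illustrative).
    amplitude = rng.random((64, 64))

    # Bond a random phase mask to it: each pixel acquires a uniformly
    # random phase in [0, 2*pi).
    phase_mask = np.exp(1j * 2 * np.pi * rng.random((64, 64)))
    encoded = amplitude * phase_mask

    # An intensity-sensitive detector such as a CCD records only |field|^2,
    # which is identical with or without the phase mask, so the mask cannot
    # be photographed or copied by such a detector.
    assert np.allclose(np.abs(encoded) ** 2, amplitude ** 2)
    ```

    Reading the phase information back out for verification requires phase-sensitive optics (e.g., an optical correlator), which this sketch does not attempt to model.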

  2. Recognizing impactor signatures in the planetary record

    NASA Technical Reports Server (NTRS)

    Schultz, Peter H.; Gault, Donald E.

    1992-01-01

    Crater size reflects the target response to the combined effects of impactor size, density, and velocity. The individual effects of each variable are generally considered masked, if not lost, in the cratering record during late stages of crater modification (e.g., floor uplift and rim collapse), making them difficult to isolate. Important clues, however, come from the distinctive signatures of the impactor created by oblique impacts. In summary, oblique impacts allow the identification of distinctive signatures of the impactor created during early penetration. Such signatures may further allow first-order testing of scaling relations for late crater excavation from the planetary surface record. Other aspects of this study are discussed.

  3. Timing signatures of large scale solar eruptions

    NASA Astrophysics Data System (ADS)

    Balasubramaniam, K. S.; Hock-Mysliwiec, Rachel; Henry, Timothy; Kirk, Michael S.

    2016-05-01

    We examine the timing signatures of large solar eruptions resulting in flares, CMEs, and solar energetic particle events. We probe solar active regions from the chromosphere through the corona, using data from space- and ground-based observations, including ISOON, SDO, GONG, and GOES. Our studies include a number of flares and CMEs, mostly of the M- and X-strengths as categorized by GOES. We find that the chromospheric signatures of these large eruptions occur 5-30 minutes in advance of coronal high-temperature signatures. These timing measurements are then used as inputs to models to reconstruct the eruptive nature of these systems and to explore their utility in forecasting.

  4. Remote Sensing of Body Signs and Signatures

    DTIC Science & Technology

    1985-10-01

    REMOTE SENSING OF BODY SIGNS AND SIGNATURES. Prepared for Naval Medical Research and Development Command, National Naval Medical Center, Bethesda... By James C. Lin and Karen H. Chan, Department of Bioengineering, University of Illinois at Chicago, Chicago, IL 60680. Abstract...

  5. Runtime verification of embedded real-time systems.

    PubMed

    Reinbacher, Thomas; Függer, Matthias; Brauer, Jörg

    We present a runtime verification framework that allows online monitoring of past-time Metric Temporal Logic (ptMTL) specifications in a discrete-time setting. We design observer algorithms for the time-bounded modalities of ptMTL that take advantage of the highly parallel nature of hardware designs. The algorithms can be translated into efficient hardware blocks, which are designed for reconfigurability and thus facilitate applications of the framework in both the prototyping and the post-deployment phases of embedded real-time systems. We provide formal correctness proofs for all presented observer algorithms and analyze their time and space complexity. For example, for the most general operator considered, the time-bounded Since operator, we obtain a time complexity that is doubly logarithmic both in the point in time at which the operator is executed and in the operator's time bounds. This result is promising with respect to a self-contained, non-interfering monitoring approach that evaluates real-time specifications in parallel to the system under test. We implement our framework on a Field Programmable Gate Array platform and use extensive simulation and logic synthesis runs to assess the benefits of the approach in terms of resource usage and operating frequency.
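
    As a simplified software sketch of such an observer (the paper targets synthesizable hardware blocks and the more general time-bounded Since operator; this illustrates only the simpler bounded Once modality, and the class name is hypothetical), the formula Once_[0,n] q can be monitored online with constant memory:

    ```python
    class BoundedOnceMonitor:
        """Online observer for the ptMTL formula Once_[0,n] q: at step t the
        formula holds iff q was true at some step in [t - n, t]. Constant
        memory: only the most recent step at which q held is stored."""

        def __init__(self, n):
            self.n = n
            self.last_q = None  # most recent step where q was observed true
            self.t = -1

        def step(self, q):
            """Consume one sample of q and return the verdict at this step."""
            self.t += 1
            if q:
                self.last_q = self.t
            return self.last_q is not None and self.t - self.last_q <= self.n

    monitor = BoundedOnceMonitor(2)
    verdicts = [monitor.step(q) for q in [False, True, False, False, False]]
    # q held only at step 1, so Once_[0,2] q holds at steps 1-3 and fails at step 4
    ```

    The hardware observers in the paper follow the same sample-per-clock-tick discipline, which is what makes them suitable for non-interfering monitoring alongside the system under test.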

  6. Verification of National Weather Service spot forecasts using surface observations

    NASA Astrophysics Data System (ADS)

    Lammers, Matthew Robert

    Software has been developed to evaluate National Weather Service spot forecasts issued to support prescribed burns and early-stage wildfires. Fire management officials request spot forecasts from National Weather Service Weather Forecast Offices to provide detailed guidance as to atmospheric conditions in the vicinity of planned prescribed burns as well as wildfires that do not have incident meteorologists on site. This open source software with online display capabilities is used to examine an extensive set of spot forecasts of maximum temperature, minimum relative humidity, and maximum wind speed from April 2009 through November 2013 nationwide. The forecast values are compared to the closest available surface observations at stations installed primarily for fire weather and aviation applications. The accuracy of the spot forecasts is compared to those available from the National Digital Forecast Database (NDFD). Spot forecasts for selected prescribed burns and wildfires are used to illustrate issues associated with the verification procedures. Cumulative statistics for National Weather Service County Warning Areas and for the nation are presented. Basic error and accuracy metrics for all available spot forecasts and the entire nation indicate that the skill of the spot forecasts is higher than that available from the NDFD, with the greatest improvement for maximum temperature and the least improvement for maximum wind speed.
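
    The basic error and accuracy metrics mentioned above can be sketched as follows. This is a minimal illustration of forecast verification against paired observations, not the study's software; all station values are hypothetical.

    ```python
    def mae(forecasts, observations):
        """Mean absolute error of paired forecast/observation values."""
        return sum(abs(f - o) for f, o in zip(forecasts, observations)) / len(forecasts)

    def bias(forecasts, observations):
        """Mean error (forecast minus observation); positive = over-forecast."""
        return sum(f - o for f, o in zip(forecasts, observations)) / len(forecasts)

    # Hypothetical maximum-temperature values (deg C) at one station
    obs  = [31.0, 28.5, 35.0, 30.0]   # closest surface observations
    spot = [30.0, 29.0, 34.0, 31.0]   # spot forecasts
    ndfd = [28.0, 31.0, 32.5, 33.0]   # NDFD gridded forecasts

    # A lower MAE for the spot forecasts would mirror the skill comparison
    spot_mae, ndfd_mae = mae(spot, obs), mae(ndfd, obs)
    ```

    The same pairing-then-scoring pattern extends directly to minimum relative humidity and maximum wind speed, aggregated over County Warning Areas.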

  7. On the role of code comparisons in verification and validation.

    SciTech Connect

    Oberkampf, William Louis; Trucano, Timothy Guy; Pilch, Martin M.

    2003-08-01

    This report presents a perspective on the role of code comparison activities in verification and validation. We formally define the act of code comparison as the Code Comparison Principle (CCP) and investigate its application in both verification and validation. One of our primary conclusions is that the use of code comparisons for validation is improper and dangerous. We also conclude that while code comparisons may be argued to provide a beneficial component in code verification activities, there are higher quality code verification tasks that should take precedence. Finally, we provide a process for application of the CCP that we believe is minimal for achieving benefit in verification processes.

  8. Online Videoconferencing Products: Update

    ERIC Educational Resources Information Center

    Burton, Douglas; Kitchen, Tim

    2011-01-01

    Software allowing real-time online video connectivity is rapidly evolving. The ability to connect students, staff, and guest speakers instantaneously carries great benefits for the online distance education classroom. This evaluation report compares four software applications at opposite ends of the cost spectrum: "DimDim", "Elluminate VCS",…

  9. Online Higher Education Commodity

    ERIC Educational Resources Information Center

    Chau, Paule

    2010-01-01

    This article analyzes the current trend towards online education. It examines some of the reasons for the trend and the ramifications it may have on students, faculty and institutions of higher learning. The success and profitability of online programs and institutions such as the University of Phoenix has helped to make the move towards online…

  10. Children's Online Privacy.

    ERIC Educational Resources Information Center

    Aidman, Amy

    2000-01-01

    The first federal Internet privacy law (the Children's Online Privacy Protection Act) provides safeguards for children by regulating collection of their personal information. Unfortunately, teens are not protected. Legislation is pending to protect children from online marketers such as ZapMe! Interactive technologies require constant vigilance.…

  11. Online Database Searching Workbook.

    ERIC Educational Resources Information Center

    Littlejohn, Alice C.; Parker, Joan M.

    Designed primarily for use by first-time searchers, this workbook provides an overview of online searching. Following a brief introduction which defines online searching, databases, and database producers, five steps in carrying out a successful search are described: (1) identifying the main concepts of the search statement; (2) selecting a…

  12. Nonbibliographic Databases Online.

    ERIC Educational Resources Information Center

    Online Review, 1978

    1978-01-01

    A directory of 246 nonbibliographic data bases, which are also known as data banks and numeric data bases. Entries are arranged alphabetically by name of data base, followed by name of data base producer, subject content, and online vendor. This directory updates the listing published in Vol. 1, No. 4 of On-Line Review. (JPF)

  13. Online Knowledge Communities.

    ERIC Educational Resources Information Center

    de Vries, Sjoerd; Bloemen, Paul; Roossink, Lonneke

    This paper describes the concept of online knowledge communities. The concept is defined, and six qualities of online communities are identified: members (user roles are clearly defined); mission (generally accepted goal-statement, ideas, beliefs, etc.); commitment (members give their loyalty to the mission); social interaction (frequent…

  14. Authoritative Online Editions

    ERIC Educational Resources Information Center

    Benton, Thomas H.

    2007-01-01

    In this article, the author discusses how it is now very easy for anyone to find formerly hard-to-find books such as the works of Walt Whitman with the help of online booksellers. The author also describes the efforts made by various institutions to produce online editions of the works of major writers. One such prominent project is the archive…

  15. Assessing Online Learning.

    ERIC Educational Resources Information Center

    Wagner, June G.

    2001-01-01

    This document contains three articles devoted to assessing online learning. "Online Learning: A Digital Revolution" profiles innovative World Wide Web-based programs at high schools and colleges in the United States and worldwide and discusses the following topics: new demographic realities; the need for continuous (lifelong) learning; and…

  16. Serving the Online Learner

    ERIC Educational Resources Information Center

    Boettcher, Judith V.

    2007-01-01

    Systems and services for recruiting, advising, and support of online students have seldom been at the top of the list when planning online and distance learning programs. That is now changing: Forces pushing advising and support services into the foreground include recognition of the student learner as "customer" and the increasing…

  17. Online Learning for Teachers.

    ERIC Educational Resources Information Center

    Kesner, Rebecca J., Ed.

    2001-01-01

    This newsletter contains two articles on teacher use of educational technology. The first article, "Online Learning for Teachers," (Stephen G. Barkley) explains that online learning has the ability to multiply both the effectiveness and efficiency of traditional onsite training by eliminating the need for travel. It describes the five components…

  18. Taking Information Literacy Online.

    ERIC Educational Resources Information Center

    Levesque, Carla

    2003-01-01

    Explores the process of designing, teaching, and revising an online information literacy course at St. Petersburg College (SPC) (Florida). Shares methods for encouraging participation in online courses and ways of tracking students' progress. Reports that basic computer information and literacy is now a graduation requirement at SPC. Contains…

  19. Online Collaboration: Curriculum Unbound!

    ERIC Educational Resources Information Center

    Waters, John K.

    2007-01-01

    Freed from the nuisances of paper-based methods, districts are making creative use of digital tools to move their curricular documents online, where educators can collaborate on course development and lesson planning. Back in 2003, Amarillo Independent School District (Texas) had begun using the Blackboard Content System to provide lessons online.…

  20. Shifting gears in hippocampus: temporal dissociation between familiarity and novelty signatures in a single event.

    PubMed

    Ben-Yakov, Aya; Rubinson, Mica; Dudai, Yadin

    2014-09-24

    The hippocampus is known to be involved in encoding and retrieval of episodes. However, real-life experiences are expected to involve both encoding and retrieval, and it is unclear how the human hippocampus subserves both functions in the course of a single event. We presented participants with brief movie clips multiple times and examined the effect of familiarity on the hippocampal response at event onset versus event offset. Increased familiarity resulted in a decreased offset response, indicating that the offset response is a novelty-related signature. The magnitude of this offset response was correlated, across hippocampal voxels, with an independent measure of successful encoding, based on nonrepeated clips. This suggests that the attenuated offset response to familiar clips reflects reduced encoding. In addition, the posterior hippocampus exhibited an increased onset response to familiar events, switching from an online familiarity signal to an offline novelty signal during a single event. Moreover, participants with stronger memory exhibited increased reactivation of online activity during familiar events, in line with a retrieval signature. Our results reveal a spatiotemporal dissociation between novelty/encoding and familiarity/retrieval signatures, assumed to reflect different computational modes, in response to the same stimulus.

  1. Online Advertising in Social Networks

    NASA Astrophysics Data System (ADS)

    Bagherjeiran, Abraham; Bhatt, Rushi P.; Parekh, Rajesh; Chaoji, Vineet

    Online social networks offer opportunities to analyze user behavior and social connectivity and leverage resulting insights for effective online advertising. This chapter focuses on the role of social network information in online display advertising.

  2. Teaching Astronomy Online

    NASA Astrophysics Data System (ADS)

    Radnofsky, Mary L.; Bobrowsky, Matthew

    This article is intended to provide an overview of the practical, pedagogical, and philosophical considerations in designing a Web-based astronomy course, and to demonstrate the educational benefits that such online courses can afford students. Because online students need to take more responsibility for their learning, faculty must make course expectations extremely clear. Online education allows for increased student participation and equal access to college by such groups as the military, the handicapped, full-time employees, and rural and senior citizens. Teaching the sciences online--especially astronomy--gives students more time to think critically about new information. This article also includes tools, checklists, and resources helpful for introducing faculty to online course development in astronomy.

  3. Blind Quantum Signature with Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Guo, Ying

    2017-04-01

    Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computing protocol while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme with a laconic structure is designed. Unlike traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. The inputs of the blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information over imperfect channels, while the receiver can verify the validity of the signature using the quantum matching algorithm. Security is guaranteed by the entanglement of the quantum system used for blind quantum computation. The scheme offers a potential practical application for e-commerce in cloud computing and first-generation quantum computation.

  4. Blind Quantum Signature with Blind Quantum Computation

    NASA Astrophysics Data System (ADS)

    Li, Wei; Shi, Ronghua; Guo, Ying

    2016-12-01

    Blind quantum computation allows a client without quantum abilities to interact with a quantum server to perform an unconditionally secure computing protocol while protecting the client's privacy. Motivated by the confidentiality of blind quantum computation, a blind quantum signature scheme with a laconic structure is designed. Unlike traditional signature schemes, the signing and verifying operations are performed through measurement-based quantum computation. The inputs of the blind quantum computation are securely controlled with multi-qubit entangled states. The unique signature of the transmitted message is generated by the signer without leaking information over imperfect channels, while the receiver can verify the validity of the signature using the quantum matching algorithm. Security is guaranteed by the entanglement of the quantum system used for blind quantum computation. The scheme offers a potential practical application for e-commerce in cloud computing and first-generation quantum computation.

  5. 15 CFR 908.16 - Signature.

    Code of Federal Regulations, 2011 CFR

    2011-01-01

    ... SUBMITTING REPORTS ON WEATHER MODIFICATION ACTIVITIES § 908.16 Signature. All reports filed with the National... or intending to conduct the weather modification activities referred to therein by such...

  6. Analysis of multispectral signatures of the shot

    NASA Astrophysics Data System (ADS)

    Kastek, Mariusz; Dulski, Rafał; Piątkowski, Tadeusz; Madura, Henryk; Bareła, Jarosław; Polakowski, Henryk

    2011-06-01

    The paper presents some practical aspects of sniper IR signature measurements. Particular signatures of sniper shots in typical scenarios are described, considering sniper activities in open areas as well as in urban environments. The measurements were made at a field test ground, and high-precision laboratory measurements were also performed. Several infrared cameras were used during the measurements to cover all measurement assumptions; some of them are measurement-class devices with high accuracy and high frame rates. Registrations were made simultaneously in the UV, NWIR, SWIR, and LWIR spectral bands. The infrared cameras can be fitted with optical filters for multispectral measurements, and an ultra-fast visual camera was also used for registration in the visible spectrum. Exemplary sniper IR signatures for typical situations are presented. The LWIR imaging spectroradiometer HyperCam was also used during the laboratory measurements and field experiments; the signatures it collected were useful for determining the spectral characteristics of the shot.

  7. Secure quantum signatures using insecure quantum channels

    NASA Astrophysics Data System (ADS)

    Amiri, Ryan; Wallden, Petros; Kent, Adrian; Andersson, Erika

    2016-03-01

    Digital signatures are widely used in modern communication to guarantee authenticity and transferability of messages. The security of currently used classical schemes relies on computational assumptions. We present a quantum signature scheme that does not require trusted quantum channels. We prove that it is unconditionally secure against the most general coherent attacks, and show that it requires the transmission of significantly fewer quantum states than previous schemes. We also show that the quantum channel noise threshold for our scheme is less strict than for distilling a secure key using quantum key distribution. This shows that "direct" quantum signature schemes can be preferable to signature schemes relying on secret shared keys generated using quantum key distribution.

  8. Isotopic signatures by bulk analyses

    SciTech Connect

    Efurd, D.W.; Rokop, D.J.

    1997-12-01

    Los Alamos National Laboratory has developed a series of measurement techniques for identification of nuclear signatures by analyzing bulk samples. Two specific applications of isotopic fingerprinting to identify the origin of anthropogenic radioactivity in bulk samples are presented. The first example is the analysis of environmental samples collected in the US Arctic to determine the impact of dumping of radionuclides in this polar region. Analyses of sediment and biota samples indicate that, for the areas sampled, the anthropogenic radionuclide content of sediments was predominantly the result of the deposition of global fallout. The anthropogenic radionuclide concentrations in fish, birds, and mammals were very low. It can be surmised that marine food chains are presently not significantly affected. The second example is isotopic fingerprinting of water and sediment samples from the Rocky Flats Plant (RFP). The largest source of anthropogenic radioactivity presently affecting surface waters at RFP is the sediments currently residing in the holding ponds. One gram of sediment from a holding pond contains approximately 50 times more plutonium than 1 liter of water from the pond. Essentially 100% of the uranium in Ponds A-1 and A-2 originated as depleted uranium. The largest source of radioactivity in the terminal Ponds A-4, B-5, and C-2 was naturally occurring uranium and its decay product radium. The uranium concentrations in the waters collected from the terminal ponds contained 0.05% or less of the interim standard calculated derived concentration guide for uranium in waters available to the public. All of the radioactivity observed in soil, sediment, and water samples collected at RFP was naturally occurring, the result of processes at RFP, or the result of global fallout. No extraneous anthropogenic alpha, beta, or gamma activities were detected. The plutonium concentrations in Pond C-2 appear to vary seasonally.

  9. ACCRETING CIRCUMPLANETARY DISKS: OBSERVATIONAL SIGNATURES

    SciTech Connect

    Zhu, Zhaohuan

    2015-01-20

    I calculate the spectral energy distributions of accreting circumplanetary disks using atmospheric radiative transfer models. Circumplanetary disks accreting at only 10⁻¹⁰ M☉ yr⁻¹ around a 1 M_J planet can be brighter than the planet itself. A moderately accreting circumplanetary disk (Ṁ ∼ 10⁻⁸ M☉ yr⁻¹; enough to form a 10 M_J planet within 1 Myr) around a 1 M_J planet has a maximum temperature of ∼2000 K, and at near-infrared wavelengths (J, H, K bands) this disk is as bright as a late-M-type brown dwarf or a 10 M_J planet with a ''hot start''. To use direct imaging to find the accretion disks around low-mass planets (e.g., 1 M_J) and distinguish them from brown dwarfs or hot high-mass planets, it is crucial to obtain photometry at mid-infrared bands (L′, M, N bands) because the emission from circumplanetary disks falls off more slowly toward longer wavelengths than that of brown dwarfs or planets. If young planets have strong magnetic fields (≳100 G), the fields may truncate slowly accreting circumplanetary disks (Ṁ ≲ 10⁻⁹ M☉ yr⁻¹) and lead to magnetospheric accretion, which can provide additional accretion signatures, such as UV/optical excess from the accretion shock and line emission.

  10. Irma multisensor predictive signature model

    NASA Astrophysics Data System (ADS)

    Watson, John S.; Wellfare, Michael R.; Chenault, David B.; Talele, Sunjay E.; Blume, Bradley T.; Richards, Mike; Prestwood, Lee

    1997-06-01

    Development of target acquisition and target recognition algorithms in highly cluttered backgrounds in a variety of battlefield conditions demands a flexible, high-fidelity capability for synthetic image generation. Cost-effective smart weapons research and testing also requires extensive scene generation capability. The Irma software package addresses this need through a first-principles, phenomenology-based scene generator that enhances research into new algorithms, novel sensors, and sensor fusion approaches. Irma was one of the first high-resolution synthetic infrared target and background signature models developed for tactical air-to-surface weapon scenarios. Originally developed in 1980 by the Armament Directorate of the Air Force Wright Laboratory, the Irma model was used exclusively to generate IR scenes for smart weapons research and development. In 1987, Nichols Research Corporation took over the maintenance of Irma and has since added substantial capabilities. The development of Irma has culminated in a program that includes not only passive visible, IR, and millimeter wave (MMW) channels but also active MMW and ladar channels. Each of these channels is co-registered, providing the capability to develop algorithms for multi-band sensor fusion concepts and associated algorithms. In this paper, the capabilities of the latest release of Irma, Irma 4.0, will be described. A brief description of the elements of the software that are common to all channels will be provided. Each channel will be described briefly, including a summary of the phenomenological effects and the sensor effects modeled in the software. Examples of Irma multi-channel imagery will be presented.

  11. The postprocessing of quantum digital signatures

    NASA Astrophysics Data System (ADS)

    Wang, Tian-Yin; Ma, Jian-Feng; Cai, Xiao-Qiu

    2017-01-01

    Many novel quantum digital signature proposals have been put forward, which can effectively guarantee the information-theoretic security of the signature for a single bit against forging and denying. Using the current basic building blocks of signing a single bit, we give a new proposal for constructing an entire protocol for signing a long message. Compared with previous work, it improves efficiency by at least 33.33%.
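
    The bitwise construction described above can be sketched in a few lines. The sketch below is a classical stand-in, not an actual quantum scheme: `sign_bit` and `verify_bit` are hypothetical placeholders for a single-bit signing primitive (here simulated with hashed secret keys), and each message bit is signed with an independent key instance, as in bitwise extension schemes.

```python
import hashlib
import os

def sign_bit(key: bytes, bit: int) -> bytes:
    # Toy stand-in for a single-bit signature primitive.
    return hashlib.sha256(key + bytes([bit])).digest()

def verify_bit(key: bytes, bit: int, sig: bytes) -> bool:
    return sign_bit(key, bit) == sig

def sign_message(keys, message_bits):
    # One independent key per message bit.
    return [sign_bit(k, b) for k, b in zip(keys, message_bits)]

def verify_message(keys, message_bits, sigs):
    # The message verifies only if every per-bit signature verifies.
    return all(verify_bit(k, b, s) for k, b, s in zip(keys, message_bits, sigs))

bits = [1, 0, 1, 1]
keys = [os.urandom(16) for _ in bits]
sigs = sign_message(keys, bits)
assert verify_message(keys, bits, sigs)
assert not verify_message(keys, [1, 1, 1, 1], sigs)  # a flipped bit is rejected
```

The per-bit key usage is what makes the cost of such constructions scale with message length, which is exactly the overhead that the efficiency improvements above target.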

  12. Research Plan for Fire Signatures and Detection

    NASA Technical Reports Server (NTRS)

    2004-01-01

    Viewgraphs on the prevention, suppression, and detection of fires aboard a spacecraft are presented. The topics include: 1) Fire Prevention, Detection, and Suppression Sub-Element Products; 2) FPDS Organizing Questions; 3) FPDS Organizing Questions; 4) Signatures, Sensors, and Simulations; 5) Quantification of Fire and Pre-Fire Signatures; 6) Smoke; 7) DAFT Hardware; 8) Additional Benefits of DAFT; 9) Development and Characterization of Sensors; 10) Simulation of the Transport of Smoke and Fire Precursors; and 11) FPDS Organizing Questions.

  13. Talking Online: Reflecting on Online Communication Tools

    ERIC Educational Resources Information Center

    Greener, Susan

    2009-01-01

    Purpose: The purpose of this paper is to reflect on the value and constraints of varied online communication tools from web 2.0 to e-mail in a higher education (HE) teaching and learning context, where these tools are used to support or be the main focus of learning. Design/methodology/approach: A structured reflection is produced with the aid of…

  14. UTEX modeling of xenon signature sensitivity to geology and explosion cavity characteristics following an underground nuclear explosion

    NASA Astrophysics Data System (ADS)

    Lowrey, J. D.; Haas, D.

    2013-12-01

    Underground nuclear explosions (UNEs) produce anthropogenic isotopes that can potentially be used in the verification component of the Comprehensive Nuclear-Test-Ban Treaty. Several isotopes of radioactive xenon gas have been identified as radionuclides of interest within the International Monitoring System (IMS) and in an On-Site Inspection (OSI). Substantial research has previously been undertaken to characterize the geologic and atmospheric mechanisms that can drive the movement of radionuclide gas from a well-contained UNE, considering both the sensitivity of gas arrival times and the variability of xenon signatures due to the nature of subsurface transport. This work further considers the sensitivity of radioxenon gas arrival times and signatures to large variability in geologic stratification and generalized explosion-cavity characteristics, and compares this influence to variability in the shallow surface.

  15. Kinematics of Signature Writing in Healthy Aging*

    PubMed Central

    Caligiuri, Michael P.; Kim, Chi; Landy, Kelly M.

    2014-01-01

    Forensic document examiners (FDE) called upon to distinguish a genuine from a forged signature of an elderly person are often required to consider the question of age-related deterioration and whether the available exemplars reliably capture the natural effects of aging of the original writer. An understanding of the statistical relationship between advanced age and handwriting movements can reduce the uncertainty that may exist in an examiner’s approach to questioned signatures formed by elderly writers. The primary purpose of this study was to systematically examine age-related changes in signature kinematics in healthy writers. Forty-two healthy subjects between the ages of 60 and 91 years participated in this study. Signatures were recorded using a digitizing tablet, and commercial software was used to examine the temporal and spatial stroke kinematics and pen pressure. Results indicated that vertical stroke duration and dysfluency increased with age, whereas vertical stroke amplitude and velocity decreased with age. Pen pressure decreased with age. We found that a linear model characterized the best-fit relationship between advanced age and handwriting movement parameters for signature formation. Male writers exhibited stronger age effects than female writers, especially for pen pressure and stroke dysfluency. The present study contributes to an understanding of how advanced age alters signature formation in otherwise healthy adults. PMID:24673648
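
    As an illustration of the best-fit linear model the study describes, the sketch below fits age against a stroke-velocity measure. All numbers here are synthetic and invented for illustration; only the modeling step, an ordinary least-squares line fit relating age to a kinematic parameter, mirrors the abstract.

```python
import numpy as np

rng = np.random.default_rng(0)
age = rng.uniform(60, 91, size=42)  # 42 writers aged 60-91, as in the study design
# Synthetic velocity data with an assumed age-related decline plus noise
velocity = 12.0 - 0.08 * age + rng.normal(0.0, 0.3, size=42)

# Fit the linear model velocity = slope * age + intercept
slope, intercept = np.polyfit(age, velocity, 1)
print(f"velocity ≈ {slope:.3f} * age + {intercept:.2f}")
```

A negative fitted slope would correspond to the reported decrease of vertical stroke velocity with advancing age; the same one-parameter-per-measure fit could be repeated for duration, amplitude, or pen pressure.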

  16. Chemical and Physical Signatures for Microbial Forensics

    SciTech Connect

    Cliff, John B.; Kreuzer, Helen W.; Ehrhardt, Christopher J.; Wunschel, David S.

    2012-01-03

    Chemical and Physical Signatures for Microbial Forensics, John Cliff and Helen Kreuzer-Martin, eds., Humana Press.
    Chapter 1. Introduction: Review of history and statement of need (Randy Murch, Virginia Tech)
    Chapter 2. The Microbe: Structure, morphology, and physiology of the microbe as they relate to potential signatures of growth conditions (Joany Jackman, Johns Hopkins University)
    Chapter 3. Science for Forensics: Special considerations for the forensic arena, such as quality control and sample integrity (Mark Wilson, retired FBI; Western Carolina University)
    Chapter 4. Physical signatures: Light and electron microscopy, atomic force microscopy, gravimetry, etc. (Joseph Michael, Sandia National Laboratory)
    Chapter 5. Lipids: FAME, PLFA, steroids, LPS, etc. (James Robertson, Federal Bureau of Investigation)
    Chapter 6. Carbohydrates: Cell wall components, cytoplasm components, methods (Alvin Fox, University of South Carolina School of Medicine; David Wunschel, Pacific Northwest National Laboratory)
    Chapter 7. Peptides: Peptides, proteins, lipoproteins (David Wunschel, Pacific Northwest National Laboratory)
    Chapter 8. Elemental content: CNOHPS (treated in passing), metals, prospective cell types (John Cliff, International Atomic Energy Agency)
    Chapter 9. Isotopic signatures: Stable isotopes C, N, H, O, S; 14C dating; potential for heavy elements (Helen Kreuzer-Martin, Pacific Northwest National Laboratory; Michaele Kashgarian, Lawrence Livermore National Laboratory)
    Chapter 10. Extracellular signatures: Cellular debris, heme, agar, headspace, spent media, etc. (Karen Wahl, Pacific Northwest National Laboratory)
    Chapter 11. Data Reduction and Integrated Microbial Forensics: Statistical concepts, parametric and multivariate statistics, integrating signatures (Kristin Jarman, Pacific Northwest National Laboratory)

  17. Assessing the Quality of Bioforensic Signatures

    SciTech Connect

    Sego, Landon H.; Holmes, Aimee E.; Gosink, Luke J.; Webb-Robertson, Bobbie-Jo M.; Kreuzer, Helen W.; Anderson, Richard M.; Brothers, Alan J.; Corley, Courtney D.; Tardiff, Mark F.

    2013-06-04

    We present a mathematical framework for assessing the quality of signature systems in terms of fidelity, cost, risk, and utility, a method we refer to as Signature Quality Metrics (SQM). We demonstrate the SQM approach by assessing the quality of a signature system designed to predict the culture medium used to grow a microorganism. The system consists of four chemical assays designed to identify various ingredients that could be used to produce the culture medium. The analytical measurements resulting from any combination of these four assays can be used in a Bayesian network to predict the probabilities that the microorganism was grown using one of eleven culture media. We evaluated fifteen combinations of the signature system by removing one or more of the assays from the Bayes network. We demonstrated that SQM can be used to distinguish between the various combinations in terms of attributes of interest. The approach assisted in clearly identifying assays that were least informative, in large part because they could only discriminate between very few culture media, and in particular, culture media that are rarely used. There are limitations associated with the data that were used to train and test the signature system. Consequently, our intent is not to draw formal conclusions regarding this particular bioforensic system, but rather to illustrate an analytical approach that could be useful in comparing one signature system to another.
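
    The Bayes-network prediction step described above can be illustrated with a toy posterior update. The media names, assays, priors, and likelihoods below are invented placeholders, not the actual SQM system; the sketch only shows how binary assay results re-rank candidate culture media under Bayes' rule with conditionally independent assays.

```python
# Hypothetical prior probabilities over candidate culture media
priors = {"LB": 0.5, "TSB": 0.3, "blood_agar": 0.2}

# Hypothetical P(assay positive | medium) for two ingredient assays
likelihood = {
    "yeast_extract": {"LB": 0.95, "TSB": 0.10, "blood_agar": 0.05},
    "heme":          {"LB": 0.02, "TSB": 0.05, "blood_agar": 0.90},
}

def posterior(results):
    """Update the prior with binary assay results (assay -> True/False)."""
    post = dict(priors)
    for assay, positive in results.items():
        for medium in post:
            p = likelihood[assay][medium]
            post[medium] *= p if positive else (1.0 - p)
    total = sum(post.values())  # renormalize to a probability distribution
    return {m: v / total for m, v in post.items()}

print(posterior({"yeast_extract": True, "heme": False}))
```

Removing an assay from the `results` dictionary corresponds to dropping it from the network, which is the operation the SQM evaluation above performs for its fifteen assay combinations.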

  18. Signature extension through the application of cluster matching algorithms to determine appropriate signature transformations

    NASA Technical Reports Server (NTRS)

    Lambeck, P. F.; Rice, D. P.

    1976-01-01

    Signature extension is intended to increase the space-time range over which a set of training statistics can be used to classify data without significant loss of recognition accuracy. A first cluster-matching algorithm, MASC (Multiplicative and Additive Signature Correction), was developed at the Environmental Research Institute of Michigan to test the concept of using associations between training and recognition area cluster statistics to define an average signature transformation. A more recent signature extension module, CROP-A (Cluster Regression Ordered on Principal Axis), has shown evidence of making significant associations between training and recognition area cluster statistics, with the clusters to be matched being selected automatically by the algorithm.
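
    The multiplicative-and-additive idea behind MASC can be sketched as estimating a per-band gain and offset that maps training-area cluster statistics onto their matched recognition-area cluster statistics. The cluster means below are synthetic, and the least-squares fit is only an illustration of the concept under that assumption, not the original algorithm.

```python
import numpy as np

# Mean values of matched clusters (one spectral band):
# rows pair a training-area cluster with its recognition-area counterpart.
train_means = np.array([10.0, 25.0, 40.0, 55.0])
recog_means = np.array([13.1, 31.0, 49.2, 66.9])

# Solve recog ≈ gain * train + offset in the least-squares sense.
A = np.vstack([train_means, np.ones_like(train_means)]).T
gain, offset = np.linalg.lstsq(A, recog_means, rcond=None)[0]

# Apply the multiplicative-and-additive correction to the training signatures.
corrected = gain * train_means + offset
print(f"gain={gain:.3f}, offset={offset:.3f}")
```

In a full multispectral setting one such gain/offset pair would be estimated per band, turning the cluster associations into the "average signature transformation" the abstract describes.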

  19. Online Sellers’ Website Quality Influencing Online Buyers’ Purchase Intention

    NASA Astrophysics Data System (ADS)

    Shea Lee, Tan; Ariff, Mohd Shoki Md; Zakuan, Norhayati; Sulaiman, Zuraidah; Zameri Mat Saman, Muhamad

    2016-05-01

    The increasing adoption of the Internet among young users in Malaysia provides high prospects for online sellers. Young users aged between 18 and 25 years old are important to online sellers because they are actively involved in online purchasing, and this group of online buyers is expected to dominate the future online market. Therefore, examining online sellers’ website quality and online buyers’ purchase intention is crucial. Based on the Theory of Planned Behavior (TPB), a conceptual model of online sellers’ website quality and the purchase intention of online buyers was developed. The E-tailQ instrument was adapted in this study, composed of website design, reliability/fulfillment, security, privacy & trust, and customer service. Using an online questionnaire and a convenience sampling procedure, primary data were obtained from 240 online buyers aged between 18 and 25 years old. It was discovered that website design, website reliability/fulfillment, website security, privacy & trust, and website customer service positively and significantly influence the intention of online buyers to continue purchasing via online channels. This study concludes that online sellers’ website quality is important in predicting online buyers’ purchase intention. Recommendations and implications of this study are discussed, focusing on how online sellers should improve their website quality to stay competitive in online business.

  20. Verification of exceptional points in the collapse dynamics of Bose-Einstein condensates

    NASA Astrophysics Data System (ADS)

    Brinker, Jonas; Fuchs, Jacob; Main, Jörg; Wunner, Günter; Cartarius, Holger

    2015-01-01

    In Bose-Einstein condensates with an attractive contact interaction, the stable ground state and an unstable excited state emerge in a tangent bifurcation at a critical value of the scattering length. At the bifurcation point both the energies and the wave functions of the two states coalesce, which is the characteristic of an exceptional point. In numerical simulations, signatures of the exceptional point can be observed by encircling the bifurcation point in the complex extended space of the scattering length; however, this method cannot be applied in an experiment. Here we show how the exceptional point affects the collapse dynamics of the Bose-Einstein condensate. The harmonic inversion analysis of the time signal given by the spatial extension of the collapsing condensate wave function can provide clear evidence for the existence of an exceptional point. This method can be used for an experimental verification of exceptional points in Bose-Einstein condensates.